Tips & Tricks

LinkedPipes ETL to be used in the Slovak National Open Data Catalog
2022-09-04

LinkedPipes ETL will be used as a core technology in the updated version of the Slovak National Open Data Catalog, inspired by its existing use in the Czech National Open Data Catalog.

LinkedPipes ETL to be used in EU CEF Telecom project STIRData
2020-10-30

LinkedPipes ETL will be used as a core technology in the newly started EU CEF Telecom project STIRData to promote open data interoperability through Linked Data and standardization!

New tutorial about loading to Wikibase available!
2019-11-15

LinkedPipes ETL now allows you to load data into Wikibase instances such as Wikidata. We have prepared a tutorial to help you with the process.

Older tips & tricks

LinkedPipes ETL featured @ Wikimania 2019!
2019-07-12

LinkedPipes ETL will be featured as a solution for repeatable loading of data into Wikibases and Wikidata at Wikimania 2019! See you in August in Stockholm, Sweden!

LinkedPipes ETL featured @ ISWC 2018 Demo Session!
2018-07-04

LinkedPipes ETL will be featured as part of the LinkedPipes DCAT-AP Viewer demo at ISWC 2018! See you in October in Monterey, California, USA!

LinkedPipes ETL @ iiWAS 2017!
2017-10-30

LinkedPipes ETL will be presented at iiWAS 2017! See you in December in Salzburg, Austria!

LinkedPipes ETL @ ODBASE 2017!
2017-10-16

Data chunking in LinkedPipes ETL will be presented at ODBASE 2017! See you in Rhodes, Greece!

A set of simple how-tos added
2017-08-17

We added an initial set of simple how-tos to help you get started with LP-ETL. More how-tos are on their way.

Geocoding with Nominatim tutorial available
2017-07-31

Another tutorial showing geocoding with Nominatim in LP-ETL by @jindrichmynarz is available!

Tabular data to RDF tutorial available
2017-06-30

We have created a tutorial showing conversion of tabular data to RDF, including all transformations and metadata necessary to publish a LOD dataset. The tutorial is based on real-world data of the Local Administrative Units (LAU) code list.

Feedback welcome
2017-05-05

Current users, please fill out this short usability questionnaire. It should not take you more than 2 minutes. Thank you.

JSON-LD formatting
2017-03-15

JSON-LD files can now be created from ordinary JSON files by adding a JSON-LD context using the JSON to JSON-LD component. Output JSON-LD files can be formatted as compacted, flattened, or expanded JSON-LD using the Format JSON-LD component.
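
For a concrete picture of the three forms, here is a minimal sketch using the pyld library; pyld and the document below are only our illustration, not part of LP-ETL.

```python
# A minimal sketch of the three JSON-LD forms using pyld
# (the document and context are made up for illustration).
from pyld import jsonld

context = {
    "name": "http://schema.org/name",
    "homepage": {"@id": "http://schema.org/url", "@type": "@id"},
}
# Adding a context to plain JSON is what the JSON to JSON-LD component does.
doc = {
    "@context": context,
    "name": "LinkedPipes ETL",
    "homepage": "https://etl.linkedpipes.com",
}

expanded = jsonld.expand(doc)             # full IRIs, context removed
compacted = jsonld.compact(doc, context)  # short terms, context attached
flattened = jsonld.flatten(doc)           # all nodes collected at the top level
print(compacted)
```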

VoID support
2017-02-20

VoID metadata can now be created via the VoID Dataset component. It can enrich the output of the DCAT-AP Distribution component so that the DCAT-AP to CKAN loader creates CKAN resources for the VoID example resources and VoID SPARQL endpoint properties.
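
To make the mentioned properties concrete, here is a hedged rdflib sketch of the kind of VoID triples involved; the actual output of the VoID Dataset component may differ, and all IRIs below are made up.

```python
# A sketch of VoID dataset metadata with rdflib (illustrative only).
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import RDF

VOID = Namespace("http://rdfs.org/ns/void#")

g = Graph()
dataset = URIRef("https://example.com/dataset/budgets")  # hypothetical IRI
g.add((dataset, RDF.type, VOID.Dataset))
g.add((dataset, VOID.sparqlEndpoint, URIRef("https://example.com/sparql")))
g.add((dataset, VOID.exampleResource, URIRef("https://example.com/resource/1")))

print(g.serialize(format="turtle"))
```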

Breaking change in develop branch
2017-02-19

When updating the develop branch, please see the update notes; some directories need to be cleaned before updating.

Multithreading in XSLT and Virtuoso loader
2017-02-01

XSLT transformations can now process files using multiple threads, and the OpenLink Virtuoso loader can tell Virtuoso to load files using multiple threads (one file per thread).
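
For illustration, here is a small Python sketch of the one-file-per-thread pattern using lxml and a thread pool; LP-ETL's XSLT component is its own implementation, so this only shows the idea, and the file names are made up.

```python
# A sketch of parallel XSLT, one file per thread (illustrative only).
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path
from lxml import etree

def transform(xslt_file: str, source: Path, target: Path) -> None:
    # Each thread builds its own XSLT object; lxml transform objects
    # should not be shared across threads.
    xslt = etree.XSLT(etree.parse(xslt_file))
    result = xslt(etree.parse(str(source)))
    target.write_bytes(etree.tostring(result, pretty_print=True))

sources = sorted(Path("input").glob("*.xml"))
with ThreadPoolExecutor(max_workers=4) as pool:
    for src in sources:
        pool.submit(transform, "transform.xsl", src, Path("output") / src.name)
```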

Stream compression
2017-01-26

Now you can compress files using gzip and bzip2 stream compression. This can be used to produce smaller dumps that are still directly loadable into, e.g., OpenLink Virtuoso.
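
As a minimal illustration of stream compression, here is a sketch using Python's standard library (bz2.open works the same way; file names are made up).

```python
# Stream-compress a dump with gzip; memory use stays flat.
import gzip
import shutil

with open("dump.ttl", "rb") as src, gzip.open("dump.ttl.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)  # copies in fixed-size chunks
```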

CKAN catalog loader
2017-01-17

Now you can load the DCAT-AP v1.1-compatible metadata created by the DCAT-AP Dataset and DCAT-AP Distribution components into a CKAN catalog using the new DCAT-AP to CKAN loader. DCAT-AP v1.1 is the current recommendation for data catalogs in the European Union.
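
For context, creating a dataset through CKAN's Action API looks roughly like the sketch below; the catalog URL, API key, and metadata values are made up, and the DCAT-AP to CKAN loader performs the metadata mapping for you.

```python
# A hedged illustration of dataset creation via CKAN's Action API.
import requests

resp = requests.post(
    "https://catalog.example.com/api/3/action/package_create",
    headers={"Authorization": "YOUR-CKAN-API-KEY"},  # hypothetical key
    json={"name": "my-dataset", "title": "My Dataset", "notes": "Example."},
)
resp.raise_for_status()
print(resp.json()["result"]["id"])
```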

Component documentation update
2017-01-05

There are a number of new components in LinkedPipes ETL. Most of them are now properly documented and ready to use. In addition, each component documentation page now has an “Edit this page” button on top, so if you spot a mistake or want to extend the documentation, please do not hesitate to submit a pull request.

Components for chunked RDF data processing
2016-12-24

Recently we developed a number of components for chunked RDF data processing. They can considerably speed up processing and lower memory requirements for many use cases. They are suitable for data that can be split into independent entities and processed separately, e.g. lists of inspections or lists of regions. What they have in common is the new RDF chunked data unit, which contains RDF data split into smaller chunks. Check out the chunked versions of the original components, such as Tabular, SPARQL Endpoint extractor, SPARQL Endpoint loader, SPARQL Construct, and Files to RDF, as well as the new components which use chunks natively: GeoTools and Bing Translator.
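
To illustrate the chunking idea itself (not LP-ETL's implementation), here is a hedged rdflib sketch that splits data into independent per-entity chunks; the input file and entity type are made up.

```python
# Split a graph into per-entity chunks and process each separately.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF

SCHEMA = Namespace("http://schema.org/")

def process(chunk: Graph) -> None:
    # Placeholder for per-chunk work, e.g. a SPARQL CONSTRUCT.
    print(f"processing a chunk with {len(chunk)} triples")

g = Graph()
g.parse("inspections.ttl")  # hypothetical input file

for entity in g.subjects(RDF.type, SCHEMA.Event):
    chunk = Graph()
    for triple in g.triples((entity, None, None)):
        chunk.add(triple)
    process(chunk)  # each chunk is small and independent of the rest
```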

Component templates!
2016-10-31

Ever needed to share component configuration among pipelines? Did your target triple store's access method change, forcing you to update multiple pipelines at once? Component templates add support for configuration reuse among pipelines with per-item granularity. See the component templates documentation!

Support for StatDCAT-AP draft 4
2016-07-25

We added support for the newest StatDCAT-AP draft 4 to our metadata components for datasets and distributions. Now you can annotate your statistical datasets a bit better.

Support for DCAT-AP v1.1
2016-07-14

We added components to support production of dataset and distribution metadata according to the newest DCAT-AP v1.1 specification and implementation guidelines by the ISA Programme.

Components in pipelines can be disabled for debugging
2016-06-22

You can now disable individual components in pipelines. This is useful when debugging, as you no longer need to copy the whole pipeline and delete components to debug only part of it. Disabled components behave as if they were not in the pipeline; however, they can be re-enabled at any time without having to delete and recreate them.

Pipelines can take inputs from HTTP POST requests
2016-06-10

Our new Pipeline input component allows you to have a fixed pipeline and pass various inputs to it using a simple HTTP POST request. This is useful for larger repeated data processing workflows.
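
Such a request could look like the hedged sketch below, using the Python requests library; the endpoint URL and form field name are hypothetical, so consult your LP-ETL instance's API documentation for the actual execution endpoint.

```python
# A hedged sketch of posting an input file to a fixed pipeline.
import requests

with open("input.csv", "rb") as data:
    resp = requests.post(
        "http://localhost:8080/resources/executions",  # hypothetical endpoint
        params={"pipeline": "http://localhost:8080/resources/pipelines/1"},
        files={"input": data},                         # hypothetical field name
    )
resp.raise_for_status()
print(resp.text)
```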

3-minute screencast
2016-06-08

Watch our 3-minute screencast!

Graphical pipeline progress view
2016-05-20

Now when you run a pipeline, you can watch it execute in the new graphical pipeline progress view. It looks the same as the pipeline editor, which makes it easier to view the contents of a particular data unit than it was in the previous list view, especially for more complex pipelines. You can access the intermediary debug data by clicking on a data unit (the circle on a component) and view the component execution messages and other details by clicking on the component. When you need to adjust and rerun the pipeline, you can switch to edit mode and continue as before.

Documentation updated
2016-05-04

We overhauled the documentation of components. Each component now has its own page, where detailed documentation as well as sample usage in an importable pipeline fragment will start to appear. An example can be seen in our {{ mustache }} component documentation.

Import from UnifiedViews
2016-05-03

Do you have pipelines in UnifiedViews? Now you can import them into LinkedPipes ETL and enjoy all of our new features. All you need to do is use our uv2etl tool. The conversion needs to be implemented for each UV DPU separately, but we have the core RDF DPUs covered, and you can always request a DPU to be added via an issue, or contribute the conversion via a pull request.

LinkedPipes ETL @ ESWC 2016 Demo Track!
2016-04-11

LinkedPipes ETL will be presented at the ESWC 2016 Demo Track! See you in Crete, Greece! Also check out the demo of LinkedPipes Visualization, presented there as well!

Insert pipeline fragment
2016-04-10

It is now possible to insert a fragment of a pipeline into an existing pipeline. When you click on an empty space in the pipeline editor, you now have two options: place a component or insert a pipeline. Pipelines can be inserted from URLs (you can point to pipelines in the same instance or in other instances, provided their IRIs are dereferenceable), or you can input a JSON-LD file containing a pipeline. Such a file can be obtained by downloading a pipeline.

Download pipeline without credentials
2016-04-07

When you want to share a pipeline with others, e.g. on GitHub, but you don’t want to share your server credentials, you can now click on “Download without credentials”. Your login, password, API keys, server addresses, and ports will be removed from the downloaded pipeline.

Import pipelines from web
2016-03-10

When importing a pipeline from the web, you no longer have to download it and then upload it to LP-ETL. Just copy its URL and use it directly in the import pipeline dialog. In addition, in pipeline details, you can see the URL from which the current pipeline can be downloaded as RDF. You can share this URL, and someone else can use it to import the pipeline directly from your instance.

Smart component suggestions
2016-03-01

We have improved the way a pipeline can be designed, which will save you time and frustration with our component list. Check out the workflow description.

Components can have descriptions
2016-02-28

Components can now have descriptions. This means that besides the label, which you can change, you can add more text describing the component further, distinguishing it from others in the pipeline.

Advanced debugging capabilities
2016-02-27

We have greatly improved the debugging capabilities. We now have a graphical overview of how the execution went. When your pipeline fails, you can simply fix what was wrong and resume from the point of failure. And when you are developing your query and need to rerun it over and over again until it is perfect, you can do that now too. Check out the debugging support description.