DBpedia and the live extraction of structured data from Wikipedia

Mohamed Morsey (Department of Computer Science, University of Leipzig, Leipzig, Germany)
Jens Lehmann (Department of Computer Science, University of Leipzig, Leipzig, Germany)
Sören Auer (Department of Computer Science, University of Leipzig, Leipzig, Germany)
Claus Stadler (Department of Computer Science, University of Leipzig, Leipzig, Germany)
Sebastian Hellmann (Department of Computer Science, University of Leipzig, Leipzig, Germany)

Program: electronic library and information systems

ISSN: 0033-0337

Publication date: 20 April 2012

Abstract

Purpose

DBpedia extracts structured information from Wikipedia, interlinks it with other knowledge bases and freely publishes the results on the web using Linked Data and SPARQL. However, the DBpedia release process is heavyweight, and releases are sometimes based on data that is several months old. DBpedia‐Live solves this problem by providing a live synchronization method based on the update stream of Wikipedia. This paper seeks to address these issues.

Design/methodology/approach

Wikipedia provides DBpedia with a continuous stream of updates, i.e. a stream of recently updated articles. DBpedia‐Live processes that stream on the fly to obtain RDF data and stores the extracted data back to DBpedia. DBpedia‐Live publishes the newly added/deleted triples in files, in order to enable synchronization between the DBpedia endpoint and other DBpedia mirrors.
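The changeset publication described above could be sketched as follows. This is a hypothetical illustration, not the actual DBpedia‐Live implementation: the function name `publish_changeset`, the sequential file numbering, and the `.added.nt`/`.removed.nt` naming are all assumptions for the sake of the example.

```python
import os

def publish_changeset(changeset_id, added_triples, removed_triples, out_dir="changesets"):
    """Write added and deleted triples to N-Triples files so that
    DBpedia mirrors can replay them in order. (Illustrative sketch;
    the file layout here is assumed, not taken from the paper.)"""
    os.makedirs(out_dir, exist_ok=True)
    added_path = os.path.join(out_dir, f"{changeset_id:06d}.added.nt")
    removed_path = os.path.join(out_dir, f"{changeset_id:06d}.removed.nt")
    with open(added_path, "w", encoding="utf-8") as f:
        f.writelines(triple + " .\n" for triple in added_triples)
    with open(removed_path, "w", encoding="utf-8") as f:
        f.writelines(triple + " .\n" for triple in removed_triples)
    return added_path, removed_path

# Example: publish one changeset containing a single added triple.
added_file, removed_file = publish_changeset(
    1,
    ['<http://dbpedia.org/resource/Berlin> '
     '<http://dbpedia.org/ontology/populationTotal> "3500000"'],
    [],
)
```

A mirror keeping itself in sync would then fetch each changeset file in sequence, inserting the added triples into its store and deleting the removed ones.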

Findings

During the realization of DBpedia‐Live the authors learned that it is crucial to process Wikipedia updates in a priority queue. Recently‐updated Wikipedia articles should have the highest priority, over mapping‐changes and unmodified pages. An overall finding is that there are plenty of opportunities arising from the emerging Web of Data for librarians.
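The prioritization described above, with live article edits processed before mapping changes and unmodified pages, can be sketched with a standard min-heap. This is a minimal illustration under assumed names (`UpdateQueue`, the three priority constants); the paper's abstract does not specify the framework's internals.

```python
import heapq

# Assumed priority levels: lower numbers are dequeued first.
PRIORITY_LIVE_UPDATE = 0     # recently updated Wikipedia articles
PRIORITY_MAPPING_CHANGE = 1  # pages affected by a mapping change
PRIORITY_UNMODIFIED = 2      # periodic re-extraction of unmodified pages

class UpdateQueue:
    """Orders pending extraction jobs so live edits are processed first."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker keeps FIFO order within a priority

    def push(self, priority, page_title):
        heapq.heappush(self._heap, (priority, self._counter, page_title))
        self._counter += 1

    def pop(self):
        _, _, page_title = heapq.heappop(self._heap)
        return page_title

# Usage: a live edit jumps ahead of lower-priority work already queued.
queue = UpdateQueue()
queue.push(PRIORITY_UNMODIFIED, "Leipzig")
queue.push(PRIORITY_LIVE_UPDATE, "Berlin")
queue.push(PRIORITY_MAPPING_CHANGE, "Hamburg")
```

With this ordering, "Berlin" (a live edit) is extracted first, then "Hamburg" (mapping change), then "Leipzig" (unmodified page), regardless of arrival order.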

Practical implications

DBpedia had and has a great effect on the Web of Data and became a crystallization point for it. Many companies and researchers use DBpedia and its public services to improve their applications and research approaches. The DBpedia‐Live framework improves DBpedia further by timely synchronizing it with Wikipedia, which is relevant for many use cases requiring up‐to‐date information.

Originality/value

The new DBpedia‐Live framework extends its predecessor with new features, e.g. abstract extraction, ontology changes, and changeset publication.

Citation

Morsey, M., Lehmann, J., Auer, S., Stadler, C. and Hellmann, S. (2012), "DBpedia and the live extraction of structured data from Wikipedia", Program: electronic library and information systems, Vol. 46 No. 2, pp. 157-181. https://doi.org/10.1108/00330331211221828

Publisher: Emerald Group Publishing Limited

Copyright © 2012, Emerald Group Publishing Limited
