QLever's view of Wikimedia's concern with regard to the size of Wikidata #2678
alexander-winkler started this conversation in General
Replies: 1 comment 1 reply
-
@alexander-winkler QLever can indeed handle Wikidata with ease and, in fact, any realistically foreseeable growth. Here is a demo of QLever handling one trillion triples on a single commodity PC: https://qlever.dev/one-trillion . That is more than fifty times larger than the current Wikidata. The current Wikidata RDF (around 18 billion triples) can be loaded into QLever in about three hours, while it takes 1-2 weeks with Blazegraph. Unfortunately, Wikidata also involves other systems, like Wikibase, which are very inefficient (and also quite old, I think). For example, it currently takes almost a week to produce a complete dump of Wikidata.
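For anyone who wants to reproduce the loading step, here is a minimal sketch using the QLever command-line tool (installable via pip). The subcommand names below are from memory and may differ between versions, so treat this as an outline rather than a recipe:

```bash
# Install the QLever control script (assumption: the PyPI package is named "qlever").
pip install qlever

# Create a working directory and fetch a preconfigured Qleverfile for Wikidata.
mkdir wikidata-qlever && cd wikidata-qlever
qlever setup-config wikidata

# Download the official Wikidata RDF dump (around 18 billion triples).
qlever get-data

# Build the QLever index; the "about three hours" mentioned above is dominated
# by this step, on a machine with a fast SSD and plenty of RAM.
qlever index

# Start the SPARQL server on the freshly built index.
qlever start
```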
-
Wikimedia is worried about the size of Wikidata (cf. this RfC). I'd be interested to know what QLever makes of the described limits concerning the overall size of Wikidata and the number of edits and queries.
From a user perspective, QLever handles the current amount of data with relative ease. But what would be the limits of QLever? That is, how much could Wikidata grow without QLever getting into trouble?
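(As an illustration of "relative ease": here is a hedged sketch of asking the public QLever Wikidata endpoint for its current size over the standard SPARQL protocol. The endpoint URL is the one I know of and may change.)

```bash
# Count all triples in the Wikidata dataset loaded into QLever
# (around 18 billion at the time of writing).
curl -s https://qlever.cs.uni-freiburg.de/api/wikidata \
  -H "Accept: application/sparql-results+json" \
  --data-urlencode "query=SELECT (COUNT(*) AS ?numTriples) WHERE { ?s ?p ?o }"
```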
I'd be happy to hear what you think!
Thanks!