#qlever 420 ms, #wikidata #query service timeout after 60 sec., so a factor of at least 143. No idea what the technical reasons are, but for productive work with Wikidata, #Blazegraph is an enormously limiting factor.
It's settled: #Wikidata will get multiple graphs. Scholarly articles will be stored in a separate graph, but this changes nothing about the Wikidata data itself. What remains is finding an alternative to #Blazegraph, which is at end of life. #GraphSplit https://wikimania.eventyay.com/2024/talk/SYV8AP/
It updates every 24 hours, as it is configured to. A backend process submits the query to a private, load-balanced query service, and makes it visible through that frontend. If you try to submit a query through that frontend, it won't work.
The next step is setting up the app that lets you enter queries into the database.
#wikidata #wikibase #sparql #rdf #query #blazegraph #graphdatabase
Cathedrals have an artwork rescue plan that determines which works to save first. #Wikidata has a document listing what to delete in the event of a complete #Blazegraph crash. It will be the scholarly articles 😃 https://m.wikidata.org/wiki/Wikidata:SPARQL_query_service/WDQS_backend_update/Blazegraph_failure_playbook
However, when #Wikidata emerged, I wanted to connect my database to it. Two years ago, I tried storing SMW in #Blazegraph, but had a bad experience with it: when the power went off, it caused a fatal error and the database failed to start.
Two months ago, I made a second attempt, this time with #Virtuoso. It runs without problems.
Now I can run #sparql queries, but it is quite complicated, e.g. querying properties and pages by longer string names.
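To illustrate the kind of label-string querying I mean, here is a minimal sketch (the "number of" search string and the LIMIT are arbitrary examples; prefixes like wikibase: and rdfs: are assumed to be predeclared, as they are on most Wikidata-style endpoints):

```sparql
# Illustrative example: find properties whose English label starts with a given string.
SELECT ?prop ?propLabel WHERE {
  ?prop a wikibase:Property ;
        rdfs:label ?propLabel .
  FILTER(LANG(?propLabel) = "en")
  FILTER(STRSTARTS(STR(?propLabel), "number of"))
}
LIMIT 10
```

String filters like this scan label literals, which is part of why such queries feel heavyweight compared to ID-based lookups.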
Now available for download, the 2023-04-23 #Blazegraph dump for #Wikidata Query Service. Use to bootstrap your own local copy of WDQS. https://datasets.scatter.red/orb/
#Wikidata Query Service #Blazegraph dump now available to download, free of charge (344 GB compressed; over 1 TB decompressed).
Wrapping up this thread with a new #blog post: "Deploying #Wikidata to different graph databases and what works best" #blazegraph #sparql #wikibase
https://harej.co/posts/2023/01/loading-wikidata-into-different-graph-databases-blazegraph-qlever/
The challenge here isn't going to be initial setup, as far as I can tell; the index build is in progress. It is going to be taking the updater tech built for #Blazegraph and adapting it to #QLever. I suspect that is possible, since both speak #SPARQL, but I haven't actually tried it yet.
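A minimal sketch of why adapting the updater seems plausible: if both backends accept SPARQL 1.1 Update over HTTP, the updater's job reduces to building update strings and POSTing them. Everything below (the helper names, the triple, the /update endpoint name) is my assumption for illustration, not the actual WDQS updater code:

```python
# Sketch: build a SPARQL 1.1 "INSERT DATA" update that a generic updater
# could POST to any endpoint speaking SPARQL Update (names are hypothetical).
from urllib.parse import urlencode

def build_insert(triples):
    """Build an INSERT DATA update from (subject, predicate, object) IRI triples."""
    body = " .\n  ".join(f"<{s}> <{p}> <{o}>" for s, p, o in triples)
    return "INSERT DATA {\n  " + body + " .\n}"

def build_request_body(update):
    """Encode the update as an application/x-www-form-urlencoded POST body."""
    return urlencode({"update": update})

update = build_insert([
    ("http://www.wikidata.org/entity/Q42",
     "http://www.wikidata.org/prop/direct/P31",
     "http://www.wikidata.org/entity/Q5"),
])
# POST build_request_body(update) to the backend's update URL.
```

The backend-specific part would then be limited to endpoint URLs, authentication, and any non-standard update extensions.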
After successfully experimenting with and deploying #Blazegraph for #Wikidata querying I am now experimenting with #QLever which is like a breath of fresh air.
Eventually I would like to offer QLever as an experimental service alongside Blazegraph, and in time retire Blazegraph.
Rebuilding #Wikidata in #Blazegraph was successful, as was uploading the 1 TB database file to cloud storage.
Now I am syncing the database with present day. Unlike last time, I am pretty sure I am actually pulling from Wikidata this time.
Rebuilding #Wikidata on #Blazegraph ...again... after a misconfigured updater caused data corruption. In parallel I am working on re-attempting Wikidata in #QEndpoint
Recommended system configuration to run a #Blazegraph instance loaded with #Wikidata –
* 2x 1 TB NVMe SSDs, formatted together as a single 2 TB volume
* 128 GB of RAM
* If your CPU can address 128 GB of RAM it's probably good enough
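The two-SSDs-as-one-volume step above could be done with a striped software RAID, for example (device names, mount point, and filesystem choice are all assumptions; adjust for your hardware, and note striping doubles the loss risk, fine for a rebuildable index):

```shell
# Hypothetical sketch: combine two 1 TB NVMe drives into a single striped 2 TB volume.
sudo mdadm --create /dev/md0 --level=0 --raid-devices=2 /dev/nvme0n1 /dev/nvme1n1
sudo mkfs.xfs /dev/md0
sudo mkdir -p /srv/wdqs
sudo mount /dev/md0 /srv/wdqs
```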
Successfully rebuilt #Blazegraph data.jnl from dump; now backing up my effort. (I may make such dumps available for download, since building #Wikidata from dump takes multiple days even on good hardware.)
[tool] couchdb-blazegraph-sync: keep a #CouchDB database and a #BlazeGraph namespace in sync https://github.com/maxlath/couchdb-blazegraph-sync