#qlever

2025-06-23

Service link for today's @digiSberlin event on the occasion of #Digitaltag2025 (cf. openbiblio.social/@digiSberlin)

The following query lists the online catalogues/online collections of Berlin's #GLAM institutions: qlever.cs.uni-freiburg.de/wiki (query powered by #qlever; the two unbounded property paths bring Blazegraph to its knees ...)

Slides for the event will follow ...
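The linked query is only a short URL above; a minimal sketch of a query in this spirit could look as follows. The class and property choices (museum/library/archive classes, wdt:P856 as a stand-in for the catalogue link) are assumptions, not the original query; the two unbounded property paths are wdt:P31/wdt:P279* and wdt:P131*:

PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?institution ?institutionLabel ?website WHERE {
  VALUES ?class { wd:Q33506 wd:Q7075 wd:Q166118 }   # museum, library, archive
  ?institution wdt:P31/wdt:P279* ?class .           # unbounded property path no. 1
  ?institution wdt:P131* wd:Q64 .                   # unbounded property path no. 2: located in Berlin
  OPTIONAL { ?institution wdt:P856 ?website . }     # official website, assumed stand-in for the catalogue link
  ?institution rdfs:label ?institutionLabel .
  FILTER ( LANG(?institutionLabel) = "de" )
}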

2025-05-09

Now at the latest, with the #loc also in the crosshairs of Trumpism, it is probably worth thinking about which LOC services/projects one would like to keep available in the future. Here are the things that #wikidata knows about:

w.wiki/E3yD

(Alternative via #qlever, in case the #wdqs runs into a timeout: qlever.cs.uni-freiburg.de/wiki)
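The short link hides the actual query; purely as a hypothetical sketch of what such a query could look like, assuming LOC services are attached via P137 "operator" or P127 "owned by" to the Library of Congress item (wd:Q131454), which may differ from what w.wiki/E3yD really does:

PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT DISTINCT ?item ?itemLabel WHERE {
  { ?item wdt:P137 wd:Q131454 . }   # operator: Library of Congress
  UNION
  { ?item wdt:P127 wd:Q131454 . }   # owned by: Library of Congress
  ?item rdfs:label ?itemLabel .
  FILTER ( LANG(?itemLabel) = "en" )
}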

Katharina Brunner (@cutterkom)
2025-03-24
OpenStreetMap as central tool
Coordinates plus OSM = Context
So Data becomes Information
SPARQL query in QLever
OpenHistoricalMap (@ohm@mapstodon.space)
2025-03-06

Want the raw data? Here it is using #QLever and #OverpassUltra, exportable as #GeoJSON and in the #PublicDomain:

overpass-ultra.us/#m=7.52/40.0

OpenHistoricalMap (@ohm@mapstodon.space)
2025-03-06

Want to create a time series animation like the one @bmacs001 posted, but for your favorite region? The #OSMWiki has the rudiments of a guide to creating one with {#OverpassTurbo or #OverpassUltra or #QLever} + #QGIS + #FFmpeg:

wiki.openstreetmap.org/wiki/Op

2025-03-02

Dear @dblp, we are very happy to see that you are deploying a QLever-based SPARQL endpoint. Would you perhaps mind increasing your cache size, as this simple 2-hop query doesn't work:
SELECT ?p ?o ?pp ?oo WHERE {
<https://dblp.org/rec/books/acm/19/X19> ?p ?o .
OPTIONAL { ?o ?pp ?oo . }
}
sparql.dblp.org/n9kCi4
Thank you very much!! :)

#semanticweb #sparql #knowledgegraph #bibliography #qlever

Screenshot of the DBLP SPARQL endpoint running the query:
SELECT ?p ?o ?pp ?oo WHERE { 
  <https://dblp.org/rec/books/acm/19/X19> ?p ?o .
  OPTIONAL { ?o ?pp ?oo . }
}
with the following result:

Error processing query
Tried to allocate 10.6 GB, but only 8.7 GB were available. Clear the cache or allow more memory for QLever during startup
Your query was:
SELECT ?p ?o ?pp ?oo WHERE { 
  <https://dblp.org/rec/books/acm/19/X19> ?p ?o .
  OPTIONAL { ?o ?pp ?oo . }
}
OpenHistoricalMap (@ohm@mapstodon.space)
2025-02-13

If you’ve been querying our data using #QLever, updates have resumed with some breaking changes and new features:

forum.openhistoricalmap.org/t/

Lozana Rossenova (@lozross@post.lurk.org)
2025-02-10

And just last week, we presented at the Workshop on New Media Art Archiving, which took place at ZKM | Center for Art and Media Karlsruhe. #WNMAA2025

A blog post on the @tibhannover blog outlines the key topics discussed at the workshop and how current developments in #RDM at national and international level can deliver the workshop's goal of interconnecting media art archives globally. Specifically, services such as #KGI4NFDI and #TS4NFDI from #Base4NFDI, in addition to tools such as #Qlever and #ORKGask, provide relevant points of reference for the further development of a decentralised media art archive infrastructure.

🔗🔖 blog.tib.eu/2025/02/10/connect

2025-01-23

#qlever #wikidata backend has been struggling recently, right?

2025-01-22

@JensB @VdAKluttig The most frequent properties used in references to holdings: qlever.cs.uni-freiburg.de/wiki (at the moment the #qlever Wikidata backend seems to be acting up ...)
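For context, the general shape of a "most frequent properties in references" query would be something like the following sketch; the actual query behind the short link additionally restricts which statements are considered, which is not reproduced here:

PREFIX prov: <http://www.w3.org/ns/prov#>
SELECT ?refProperty (COUNT(*) AS ?uses) WHERE {
  ?statement prov:wasDerivedFrom ?reference .   # statement node -> reference node
  ?reference ?refProperty ?value .              # properties used inside the reference
}
GROUP BY ?refProperty
ORDER BY DESC(?uses)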

2025-01-17

Linking metadata in the electronic lab notebook #OpenBIS with metadata in the #bioimage resource #Omero: how many images are present in Omero for a given sample in OpenBIS? At #mpievolbio, to answer this and similar questions, we map both resources to #RDF #KnowledgeGraphs using the #ontop framework, serve the materialized #KGs via #qLever (qlever.cs.uni-freiburg.de) and run a #SPARQL query. Answered in no time! @nfdi4bioimage #LinkedOpenData #FAIR #MicrobialPopulationBiology @ontop

Screenshot: SPARQL query and response against a knowledge graph for microbial population biology
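The internal OpenBIS/Omero schema behind that query is not shown in the post; purely as an illustration of the counting pattern, with invented placeholder predicates (ex:imageOf, ex:identifier):

PREFIX ex: <http://example.org/schema/>
SELECT ?sample ?sampleId (COUNT(?image) AS ?imageCount) WHERE {
  ?image ex:imageOf ?sample .         # Omero side: image linked to a sample (placeholder predicate)
  ?sample ex:identifier ?sampleId .   # OpenBIS side: the sample's identifier (placeholder predicate)
}
GROUP BY ?sample ?sampleId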
2025-01-02

#federation with #qlever is really fantastic! You can easily supply missing information from other sources. This query gets street addresses of cultural heritage institutions from #openstreetmap #osm where there is no corresponding statement on #wikidata: qlever.cs.uni-freiburg.de/wiki
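The linked query is not reproduced here; a rough sketch of the federation pattern it describes, assuming qlever.cs.uni-freiburg.de/api/osm-planet as the OSM endpoint and with placeholder predicates (ex:…) standing in for the actual OSM vocabulary:

PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
PREFIX ex: <http://example.org/osm/>
SELECT ?inst ?osmStreet WHERE {
  ?inst wdt:P31 wd:Q33506 .                          # e.g. museums, as one kind of cultural heritage institution
  FILTER NOT EXISTS { ?inst wdt:P6375 ?address . }   # no street address statement on Wikidata
  SERVICE <https://qlever.cs.uni-freiburg.de/api/osm-planet> {
    ?osmObject ex:wikidata ?inst .         # placeholder: OSM object tagged with the Wikidata item
    ?osmObject ex:addr_street ?osmStreet . # placeholder: the addr:street tag
  }
}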

2024-12-31

And here are some more works from @DNB_Aktuelles ... in case the scanners are not fully occupied between Christmas and New Year, here are a few titles that would be candidates for digitization: qlever.cs.uni-freiburg.de/dnb/

(the query was technically instructive for me because of the federation with #qlever)

2024-12-17

It took me quite a while to understand that the ominous `wikibase:directClaim` in #Wikidata is basically just string replacement, so "?dc wikibase:directClaim ?prop ." stands for
FILTER ( STRSTARTS ( STR(?prop), "http://www.wikidata.org/prop/direct/" ) )
BIND ( IRI ( REPLACE ( STR(?prop), "prop/direct/", "entity/" ) ) AS ?dc )
in plain #SPARQL (w.wiki/CSc4).

Is there any documentation on these Wikidata-specific predicates? They are important when queries must be adapted to #qlever.

2024-12-15
Lozana Rossenova (@lozross@post.lurk.org)
2024-12-05

Also Qlever is pure magic and performance is ace.

Example federating across 3 sources: one of the #Wikibase instances from @tibosl + #Wikidata + #FactGrid - works like a charm in #Qlever, too:
qlever.cs.uni-freiburg.de/wiki

*We do need to sort out the literals business, to minimize the bindings, but that's homework!
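As a rough illustration of the three-source pattern (the endpoints, the FactGrid entity prefix, the P8168 mapping and the ex: predicate are assumptions, not the query behind the short link): run on the local Wikibase, then pull in Wikidata and FactGrid via nested SERVICE clauses, joining on shared identifiers.

PREFIX wdt: <http://www.wikidata.org/prop/direct/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX ex: <http://example.org/local/>
SELECT ?localItem ?wikidataItem ?factgridLabel WHERE {
  ?localItem ex:wikidataMapping ?wikidataItem .   # placeholder link from the local Wikibase to Wikidata
  SERVICE <https://qlever.cs.uni-freiburg.de/api/wikidata> {
    ?wikidataItem wdt:P8168 ?factgridId .         # P8168: FactGrid item ID (a string like "Q123")
  }
  BIND ( IRI(CONCAT("https://database.factgrid.de/entity/", ?factgridId)) AS ?factgridItem )
  SERVICE <https://database.factgrid.de/sparql> {
    ?factgridItem rdfs:label ?factgridLabel .
    FILTER ( LANG(?factgridLabel) = "en" )
  }
}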

2024-12-05

#qlever 420 ms, #wikidata #query service timeout after 60 seconds, i.e. at least a factor of 143. No idea what the technical reasons are; for productive work with Wikidata, #Blazegraph is an enormously limiting factor.

Egon Willigh☮gen (@egonw)
2024-11-03

okay, the final thing I had to learn is that the new Apache uses transactions. That makes sense, but TDB did not have that, so that required more updates. That the number of triples is reported shows that I seem to have solved most of that.

Oh, a minor thing: the Human Protein Atlas nanopublications have a small error that fails on:

:HumanProteinAtlas pav:versionNumber "23.0"^^xsd:integer .

But that I solved in the Qleverfile :) 7/7

Qlever in action. Screenshot of the query results for a SPARQL query asking for all unique predicates. Importantly, the BFO_0000066 predicate shows up, which is 'occurs in', used by HPA to link genes/proteins to places where they are expressed.
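The query described in that screenshot is essentially the standard "list all distinct predicates" pattern, something along these lines:

SELECT DISTINCT ?p WHERE {
  ?s ?p ?o .
}
ORDER BY ?p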
