I have been thinking about how to combine RDF models when one of them is too large to load into memory.
The answer will involve copying a connected subgraph into a temporary model, a "view" if you will. The view is centred on a start node and grown by some specified number of breadth-first search steps.
The problem is that if the search hits a node with a huge number of links, such as a class type, the process will crash just getting a count of those links (if that were even possible).
I think the answer involves setting a boundary fence: a set of nodes, or tests, that if matched prevent the search from expanding that node in subsequent steps.
Some thoughts:
https://ocratato-sassy.sourceforge.io/pdf/views.pdf
#SASSY #RDF