whoop whoop full provenance-preserving roundtrip serialization to JSON in #numpydantic 1.6.0.
sure, you could serialize a complex, nested data model with a ton of arrays in a precious high-performance data backend as a bunch of random JSON numbers and tempt the parsers of fate — or you could serialize them as relative paths plus the interface class that can load them, in a JSON/YAML file, and distribute them without losing all the nice chunking and whatnot you've gone and done. Whenever they review my PR, if you use numpydantic with @linkml, then you get all the rich metadata and modeling power of linked data with arrays in a way that makes sense, with arbitrary array framework backends rather than some godforsaken rest/first tree or treating arrays as if they were the same as scalar values — and now complete with a 1:1 JSON/YAML serializable/deserializable storage format.
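The gist, sketched in plain Python — this is not numpydantic's actual implementation, and `ArrayRef`, `H5Interface`, and the field names are made-up stand-ins for illustration: rather than dumping the array values themselves into JSON, you serialize a relative path to the backing file plus the name of the interface class that knows how to reload it.

```python
import json
from dataclasses import dataclass

@dataclass
class ArrayRef:
    """Hypothetical stand-in for a serialized array field:
    where the data lives, and which interface can load it back."""
    file: str        # relative path to the array's native container
    path: str        # location within the file (e.g. an HDF5 dataset path)
    interface: str   # loader class name, e.g. "H5Interface" (illustrative)

    def to_json(self) -> str:
        # Serialize the *reference*, not the data — the backend file
        # keeps its chunking, compression, etc. untouched.
        return json.dumps(
            {"file": self.file, "path": self.path, "interface": self.interface}
        )

    @classmethod
    def from_json(cls, s: str) -> "ArrayRef":
        # Round-trip: reconstruct the reference, then hand it to the
        # named interface class to actually open the array.
        return cls(**json.loads(s))


ref = ArrayRef(file="data/recording.h5", path="/frames", interface="H5Interface")
assert ArrayRef.from_json(ref.to_json()) == ref  # lossless round trip
```

In the real library the serialization is driven by pydantic's dump machinery, so the reference travels inside whatever nested model contains the array field; see the linked docs for the actual format.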
Another day closer to linked data for real data with tools that feel good to use. Another day closer to p2p linked data.
Next up: including hashes, multiple sources, and support for more types of constraints (now that I'm actually getting feature requests for this thing, which is weird to me).
https://numpydantic.readthedocs.io/en/latest/serialization.html