OWLIM Inference Storage Engine


OWLIM is a high-performance semantic repository, packaged as a Storage and Inference Layer (SAIL) for the Sesame RDF database. OWLIM uses the TRREE engine to perform OWL DLP reasoning. The reasoning and query evaluation are performed in-memory, while a reliable persistence strategy ensures data preservation, consistency and integrity.


– From the OWLIM Page.
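
Because OWLIM is packaged as a SAIL, it can be wired up through Sesame's standard SailRepository API. The sketch below shows that wiring under stated assumptions: the SailRepository, RepositoryConnection, initialize, and setDataDir calls are the standard Sesame 2.x interfaces, while the OwlimSchemaRepository class and the setParameter keys ("storage-folder", "ruleset") are taken from OWLIM's configuration conventions and should be treated as assumptions rather than verified API.

import org.openrdf.repository.Repository;
import org.openrdf.repository.RepositoryConnection;
import org.openrdf.repository.sail.SailRepository;

import com.ontotext.trree.OwlimSchemaRepository;

public class OwlimSailExample {
    public static void main(String[] args) throws Exception {
        // OWLIM is exposed as a Sesame SAIL; configuration is passed as
        // string parameters. The parameter names below are assumptions
        // based on OWLIM's documented configuration keys.
        OwlimSchemaRepository sail = new OwlimSchemaRepository();
        sail.setDataDir(new java.io.File("./owlim-storage"));
        sail.setParameter("storage-folder", "storage");
        sail.setParameter("ruleset", "owl-horst");

        // Wrap the SAIL in a standard Sesame repository and initialize it.
        Repository repository = new SailRepository(sail);
        repository.initialize();

        RepositoryConnection con = repository.getConnection();
        try {
            // Statements added through this connection are reasoned over
            // in memory, while OWLIM persists them to the storage folder.
            // e.g. con.add(new java.io.File("data.ttl"), null, RDFFormat.TURTLE);
        } finally {
            con.close();
            repository.shutDown();
        }
    }
}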
