Import > Import RDF File

This feature from the Import Tab can be used to add statements from an uploaded RDF file to the current asset collection.

Choose the RDF file and select its Format, noting that the file may be compressed.
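
If you are unsure which Format to select, a quick local check outside of EDG can confirm that the file parses in the expected serialization. This is a minimal sketch using rdflib; the file name is a placeholder.

  from rdflib import Graph

  g = Graph()
  # Raises a parse error if the file is not valid Turtle; swap the format
  # string for other serializations (e.g. "xml", "nt", "json-ld").
  g.parse("upload.ttl", format="turtle")
  print(len(g), "triples parsed")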

Note

The compression formats ZIP (.zip), gzip (e.g. .ttl.gz) and bzip2 (e.g. .ttl.bz2) are supported, but only the first file in an archive will be imported.
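
For example, a Turtle file can be gzip-compressed before upload using only the Python standard library. This is a minimal sketch; the file names are placeholders.

  import gzip
  import shutil

  # Produce data.ttl.gz, which matches the gzip (.ttl.gz) format noted above.
  with open("data.ttl", "rb") as src, gzip.open("data.ttl.gz", "wb") as dst:
      shutil.copyfileobj(src, dst)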

Then, if applicable select the following options:

  • Record each new triple in change history (use with care for large RDF files!). If importing into a workflow, history will always be recorded and this option is greyed out.

  • Direct streaming import into production copy, available only for users with at least Managers permission. Direct streaming is not available for import into workflows.

  • Perform constraint validation only. Validates the RDF file content combined with the existing collection data. This is necessary because some violations only become apparent in the combined data.
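
The following sketch illustrates why validation has to run against the combined data: a violation such as an exceeded sh:maxCount may only surface once the uploaded triples are merged with what is already in the collection. It uses rdflib and pySHACL outside of EDG purely as an illustration; the file names are assumptions.

  from rdflib import Graph
  from pyshacl import validate

  combined = Graph()
  combined.parse("existing_collection.ttl", format="turtle")  # assumed export of the collection
  combined.parse("upload.ttl", format="turtle")                # the file to be imported

  conforms, _report_graph, report_text = validate(
      combined,
      shacl_graph="shapes.ttl",  # assumed file containing the collection's SHACL shapes
      inference="rdfs",
  )
  print("Conforms:", conforms)
  print(report_text)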

Click Finish to complete the operation. A message will indicate whether the import was successful. For large imports, this process may take several minutes.

Please check the status under Reports > File Import Reports.

Note

  • If an RDF file contains any “schema” definitions such as classes, properties, or shapes, then it can only be imported into an Ontologies collection.

  • If an RDF file contains both “instances” and “schema”, either split the file before import (a sketch of one way to do this follows this note) or follow the instructions in Transform > Copy or Move Instances from Other Asset Collection.
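
One rough way to split a mixed file before import is to route every subject that is typed as a class, property, or shape into a “schema” file and everything else into an “instances” file. The sketch below uses rdflib; the file names and the set of schema types are assumptions, and the heuristic will not cover every modelling style.

  from rdflib import Graph, RDF, RDFS, OWL
  from rdflib.namespace import SH

  mixed = Graph()
  mixed.parse("mixed.ttl", format="turtle")

  schema_types = {
      OWL.Class, RDFS.Class, RDF.Property,
      OWL.ObjectProperty, OWL.DatatypeProperty,
      SH.NodeShape, SH.PropertyShape,
  }

  # Subjects typed as a class, property, or shape are treated as schema.
  schema_subjects = {s for s, o in mixed.subject_objects(RDF.type) if o in schema_types}

  schema, instances = Graph(), Graph()
  for triple in mixed:
      (schema if triple[0] in schema_subjects else instances).add(triple)

  schema.serialize("schema.ttl", format="turtle")
  instances.serialize("instances.ttl", format="turtle")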

When importing RDF into a workflow, the addition of each triple will be recorded as an entry in the change history, where it will be available to all the relevant reports. When importing into a Production Copy, the Record each new triple in change history checkbox gives you the option of adding these entries to the change history.

Caution

This is not recommended when importing large amounts of data.

The Direct streaming import into production copy option imports the content much more quickly and uses less memory. It should only be used for large imports when the user is confident that no validation or clean-up of the data will be needed. It is best to perform a backup of the collection (e.g. Export RDF File) prior to importing with direct streaming, or to use a workflow, so that reverting is possible should anything go wrong.

When importing RDF files into an Ontologies or a Taxonomies collection, TopBraid performs some transformations (unless the streaming import is chosen); a sketch approximating them follows the list:

  • For Ontologies, “subclass of Thing” statements will be added for classes that have no parents. This is done to ensure that these classes are visible in the Class Hierarchy.

  • For Taxonomies, “narrower concept” relationships will be used to generate inverse “broader concept” relationships. This is done to ensure that such concepts are visible in the Concept Hierarchy.
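
The rdflib sketch below approximates what these two transformations do; it is an illustration only, not EDG's internal implementation, and the input file name is a placeholder.

  from rdflib import Graph, RDF, RDFS, OWL
  from rdflib.namespace import SKOS

  g = Graph()
  g.parse("import.ttl", format="turtle")

  # Ontologies: classes without any parent become subclasses of owl:Thing,
  # so that they show up in the Class Hierarchy.
  for cls in g.subjects(RDF.type, OWL.Class):
      if (cls, RDFS.subClassOf, None) not in g:
          g.add((cls, RDFS.subClassOf, OWL.Thing))

  # Taxonomies: each "narrower concept" statement gets an inverse "broader concept",
  # so that the concepts show up in the Concept Hierarchy.
  for parent, child in g.subject_objects(SKOS.narrower):
      g.add((child, SKOS.broader, parent))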

See Also

Further Reading on TopBraid