Fork of https://github.com/oxigraph/oxigraph.git for the purposes of the NextGraph project


Oxigraph


Oxigraph is a graph database implementing the SPARQL standard.

Its goal is to provide a compliant, safe, and fast graph database based on the RocksDB and Sled key-value stores. It is written in Rust. It also provides a set of utility functions for reading, writing, and processing RDF files.

Oxigraph is in heavy development and SPARQL query evaluation has not been optimized yet.

It is split into multiple parts:

  • The lib directory contains the database written as a Rust library.
  • The python directory contains Pyoxigraph, which allows using Oxigraph from Python. See the Pyoxigraph website for its documentation.
  • The js directory contains bindings to use Oxigraph in JavaScript with the help of WebAssembly. See its README for the JS bindings documentation.
  • The server directory contains a stand-alone binary of a web server implementing the SPARQL 1.1 Protocol. It uses the RocksDB key-value store.
  • The wikibase directory contains a stand-alone binary of a web server able to synchronize with a Wikibase instance.

Oxigraph implements the following specifications: SPARQL 1.1 Query, SPARQL 1.1 Update, and the SPARQL 1.1 Protocol, as well as the Turtle, TriG, N-Triples, N-Quads, and RDF/XML serialization formats.

A preliminary benchmark is provided in the bench directory.

Run the Web server

Installation

You need to have a recent stable version of Rust and Cargo installed. You also need clang to build RocksDB.

To download, build and install the latest released version, run cargo install oxigraph_server. There is no need to clone this git repository.

To compile the server from source, clone this git repository and execute cargo build --release in the server directory. After downloading its dependencies, it will create a fat binary in target/release/oxigraph_server.

Usage

Run oxigraph_server -f my_data_storage_directory to start the server, where my_data_storage_directory is the directory where you want Oxigraph data to be stored. It listens by default on localhost:7878.

The server provides an HTML UI with a form to execute SPARQL requests.

It provides the following REST actions:

  • / allows POSTing data to the server. For example, curl -f -X POST -H 'Content-Type:application/n-triples' --data-binary "@MY_FILE.nt" http://localhost:7878/ adds the N-Triples file MY_FILE.nt to the server repository. Turtle, TriG, N-Triples, N-Quads and RDF/XML are supported.
  • /query allows evaluating SPARQL queries against the server repository following the SPARQL 1.1 Protocol. For example, curl -X POST -H 'Content-Type:application/sparql-query' --data 'SELECT * WHERE { ?s ?p ?o } LIMIT 10' http://localhost:7878/query. This action supports content negotiation and can return Turtle, N-Triples, RDF/XML, the SPARQL Query Results XML Format and the SPARQL Query Results JSON Format.
  • /update allows executing SPARQL updates against the server repository following the SPARQL 1.1 Protocol. For example, curl -X POST -H 'Content-Type: application/sparql-update' --data 'DELETE WHERE { <http://example.com/s> ?p ?o }' http://localhost:7878/update.

Use oxigraph_server --help to see the possible options when starting the server.
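The /query endpoint can also be used from code. As a sketch (the endpoint URL and the localhost:7878 address are taken from the examples above; the helper names are illustrative, not part of Oxigraph), here is a small Python client that sends a SPARQL SELECT and decodes the SPARQL Query Results JSON Format:

```python
import json
import urllib.request


def sparql_select(endpoint, query):
    """POST a SPARQL SELECT query and return the decoded JSON results."""
    request = urllib.request.Request(
        endpoint,
        data=query.encode("utf-8"),
        headers={
            "Content-Type": "application/sparql-query",
            "Accept": "application/sparql-results+json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())


def bindings(results):
    """Flatten a SPARQL Query Results JSON document into a list of dicts
    mapping each variable name to its bound value."""
    return [
        {var: binding[var]["value"] for var in binding}
        for binding in results["results"]["bindings"]
    ]
```

For instance, bindings(sparql_select("http://localhost:7878/query", "SELECT * WHERE { ?s ?p ?o } LIMIT 10")) returns one dict per result row.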

Using a Docker image

Display the help menu

docker run --rm oxigraph/oxigraph --help

Run the Web server

Expose the server on port 7878 of the host machine, and save data in the local ./data folder

docker run --init --rm -v $PWD/data:/data -p 7878:7878 oxigraph/oxigraph -b 0.0.0.0:7878 -f /data

You can then access it from your machine on port 7878:

# Open the GUI in a browser
firefox http://localhost:7878

# Post some data
curl http://localhost:7878 -H 'Content-Type: application/x-turtle' -d@./data.ttl

# Make a query
curl -X POST -H 'Accept: application/sparql-results+json' -H 'Content-Type: application/sparql-query' --data 'SELECT * WHERE { ?s ?p ?o } LIMIT 10' http://localhost:7878/query

# Make an UPDATE
curl -X POST -H 'Content-Type: application/sparql-update' --data 'DELETE WHERE { <http://example.com/s> ?p ?o }' http://localhost:7878/update
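The same update can be issued from code. As a sketch (the endpoint URL is taken from the example above; the helper names are illustrative, not part of Oxigraph), this Python snippet builds and sends the POST request for a SPARQL update:

```python
import urllib.request


def update_request(endpoint, update):
    """Build (but do not send) the POST request for a SPARQL update."""
    return urllib.request.Request(
        endpoint,
        data=update.encode("utf-8"),
        headers={"Content-Type": "application/sparql-update"},
        method="POST",
    )


def run_update(endpoint, update):
    """Send a SPARQL update and return the HTTP status code."""
    with urllib.request.urlopen(update_request(endpoint, update)) as response:
        return response.status
```

For instance, run_update("http://localhost:7878/update", "DELETE WHERE { <http://example.com/s> ?p ?o }") performs the same deletion as the curl command above.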

You can easily build your own Docker image by running docker build -t oxigraph -f server/Dockerfile . from the root directory.

Run the Wikibase server

Installation

You need to have a recent stable version of Rust and Cargo installed. You also need clang to build RocksDB.

To download, build and install the latest released version run cargo install oxigraph_wikibase. There is no need to clone this git repository.

To compile the server from source, clone this git repository and execute cargo build --release in the wikibase directory. After downloading its dependencies, it will create a fat binary in target/release/oxigraph_wikibase.

Usage

To start a server that is synchronized with test.wikidata.org, run:

./oxigraph_wikibase --mediawiki-api https://test.wikidata.org/w/api.php --mediawiki-base-url https://test.wikidata.org/wiki/ --namespaces 0,120 --file test.wikidata

It creates a SPARQL endpoint listening on localhost:7878/query that can be queried just like Blazegraph.

The configuration parameters are:

  • --mediawiki-api: URL of the MediaWiki API to use.
  • --mediawiki-base-url: base URL of MediaWiki pages, like https://test.wikidata.org/wiki/ for test.wikidata.org or http://localhost/w/index.php?title= for "vanilla" installations.
  • --namespaces: the ids of the Wikibase namespaces to synchronize with, separated by ,.
  • --file: path where Oxigraph should store its data.
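To make the flag spelling concrete, the launch command above can be assembled programmatically. A minimal sketch (the values are the test.wikidata.org example from above; the function is illustrative, not part of Oxigraph):

```python
def wikibase_command(api, base_url, namespaces, file):
    """Assemble the oxigraph_wikibase command line from its
    configuration parameters, using the flags documented above."""
    return [
        "./oxigraph_wikibase",
        "--mediawiki-api", api,
        "--mediawiki-base-url", base_url,
        "--namespaces", ",".join(str(n) for n in namespaces),
        "--file", file,
    ]
```

For instance, wikibase_command("https://test.wikidata.org/w/api.php", "https://test.wikidata.org/wiki/", [0, 120], "test.wikidata") reproduces the command shown earlier.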

You can then access it from your machine on port 7878. No GUI is provided.

# Make a query
curl -X POST -H 'Accept: application/sparql-results+json' -H 'Content-Type: application/sparql-query' --data 'SELECT * WHERE { ?s ?p ?o } LIMIT 10' http://localhost:7878/query

Using a Docker image

Display the help menu

docker run --rm oxigraph/oxigraph-wikibase --help

Run the Web server

Expose the server on port 7878 of the host machine, and save data in the local ./wikibase_data folder

docker run --init --rm -v $PWD/wikibase_data:/wikibase_data -p 7878:7878 oxigraph/oxigraph-wikibase -b 0.0.0.0:7878 -f /wikibase_data --mediawiki-api http://some.wikibase.instance/w/api.php --mediawiki-base-url http://some.wikibase.instance/wiki/

Warning: the Wikibase instance needs to be accessible from within the container. A clean way to do that is to run both your Wikibase and oxigraph_wikibase in the same docker-compose.yml.

You can easily build your own Docker image by running docker build -t oxigraph-wikibase -f wikibase/Dockerfile . from the root directory.

License

This project is licensed under either of

  • Apache License, Version 2.0 (LICENSE-APACHE)
  • MIT license (LICENSE-MIT)

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in Oxigraph by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.