# Oxigraph
Oxigraph is a work-in-progress graph database implementing the SPARQL standard.
There is no released version yet.
Its goal is to provide a compliant, safe and fast graph database based on the RocksDB key-value store. It is written in Rust.
It is split into multiple parts:

- The `lib` directory contains the database written as a Rust library.
- The `python` directory contains bindings to use Oxigraph in Python. See its README for the Python bindings documentation.
- The `js` directory contains bindings to use Oxigraph in JavaScript with the help of WebAssembly. See its README for the JS bindings documentation.
- The `server` directory contains a stand-alone binary of a web server implementing the SPARQL 1.1 Protocol.
- The `wikibase` directory contains a stand-alone binary of a web server able to synchronize with a Wikibase instance.
The following features are currently implemented:

- SPARQL 1.1 Query, except `FROM` and `FROM NAMED`.
- The Turtle, TriG, N-Triples, N-Quads and RDF/XML serialization formats for both data ingestion and retrieval, using the Rio library.
- The SPARQL Query Results XML Format and the SPARQL Query Results JSON Format.
A preliminary benchmark is provided.
## Run the web server
### Build
You need a recent stable version of Rust and Cargo installed. You also need clang to build RocksDB.

Once these are installed, executing `cargo build --release` in the root directory of this repository should download the dependencies and compile the full server.
It will create a fat binary in `target/release/oxigraph_server`.
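For example, a full build from a fresh checkout could look like the following sketch (the repository URL is assumed to be https://github.com/oxigraph/oxigraph):

```sh
# Assumed repository location; adapt to your own fork or checkout
git clone https://github.com/oxigraph/oxigraph.git
cd oxigraph

# Compile the server and its dependencies
cargo build --release

# Check that the resulting fat binary works
./target/release/oxigraph_server --help
```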
### Usage
Run `./oxigraph_server` to start the server. It listens by default on `localhost:7878`.
The server provides an HTML UI with a form to execute SPARQL requests.
It provides the following REST actions:

- `/` allows you to `POST` data to the server. For example, `curl -f -X POST -H 'Content-Type:application/n-triples' --data-binary "@MY_FILE.nt" http://localhost:7878/` will add the N-Triples file `MY_FILE.nt` to the server repository. Turtle, TriG, N-Triples, N-Quads and RDF/XML are supported.
- `/query` allows you to evaluate SPARQL queries against the server repository following the SPARQL 1.1 Protocol. For example, `curl -X POST -H 'Content-Type:application/sparql-query' --data 'SELECT * WHERE { ?s ?p ?o } LIMIT 10' http://localhost:7878/query`. This action supports content negotiation and can return Turtle, N-Triples, RDF/XML, the SPARQL Query Results XML Format and the SPARQL Query Results JSON Format.
Use `oxigraph_server --help` to see the possible options when starting the server.
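Putting these together, a minimal example session might look like this (`MY_FILE.nt` is a placeholder for your own N-Triples file):

```sh
# Start the server (listens on localhost:7878 by default)
./oxigraph_server &

# Load an N-Triples file into the repository
curl -f -X POST -H 'Content-Type: application/n-triples' \
     --data-binary "@MY_FILE.nt" http://localhost:7878/

# Run a SPARQL query and ask for JSON results via content negotiation
curl -X POST -H 'Content-Type: application/sparql-query' \
     -H 'Accept: application/sparql-results+json' \
     --data 'SELECT * WHERE { ?s ?p ?o } LIMIT 10' \
     http://localhost:7878/query
```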
### Using a Docker image
Display the help menu:

```sh
docker run --rm oxigraph/oxigraph --help
```
Run the web server, exposing it on port `7878` of the host machine and saving data in the local `./data` folder:

```sh
docker run --init --rm -v $PWD/data:/data -p 7878:7878 oxigraph/oxigraph -b 0.0.0.0:7878 -f /data
```
You can then access it from your machine on port `7878`:

```sh
# Open the GUI in a browser
firefox http://localhost:7878

# Post some data
curl http://localhost:7878 -H 'Content-Type: application/x-turtle' -d@./data.ttl

# Make a query
curl -H 'Accept: application/sparql-results+json' 'http://localhost:7878/query?query=SELECT%20*%20%7B%20%3Fs%20%3Fp%20%3Fo%20%7D%20LIMIT%2010'
```
You could easily build your own Docker image by running `docker build -t oxigraph-server -f server/Dockerfile .` from the root directory.
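A locally built image can then be run with the same options as the published one (the `oxigraph-server` tag is just an example name):

```sh
# Build a local image from the repository root (the tag name is arbitrary)
docker build -t oxigraph-server -f server/Dockerfile .

# Run it exactly like the published image
docker run --init --rm -v $PWD/data:/data -p 7878:7878 oxigraph-server -b 0.0.0.0:7878 -f /data
```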
## Run the web server for a Wikibase instance

### Build
You need a recent stable version of Rust and Cargo installed.

Once these are installed, executing `cargo build --release` in the root directory of this repository should download the dependencies and compile the full server.
It will create a fat binary in `target/release/oxigraph_wikibase`.
### Usage
To start a server that is synchronized with test.wikidata.org you should run:

```sh
./oxigraph_wikibase --mediawiki-api=https://test.wikidata.org/w/api.php --mediawiki-base-url=https://test.wikidata.org/wiki/ --namespaces=0,120 --file=test.wikidata
```
It creates a SPARQL endpoint listening on `localhost:7878/query` that can be queried just like Blazegraph.
The configuration parameters are:

- `mediawiki_api`: URL of the MediaWiki API to use.
- `mediawiki_base_url`: base URL of MediaWiki pages, like `https://test.wikidata.org/wiki/` for test.wikidata.org or `http://localhost/w/index.php?title=` for "vanilla" installations.
- `namespaces`: the ids of the Wikibase namespaces to synchronize with, separated by `,`.
- `file`: path of where Oxigraph should store its data.
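Once the synchronization is running, the endpoint can be queried like the regular server shown above; for example (assuming the same request forms are accepted, with a purely illustrative query):

```sh
# Ask for ten arbitrary triples from the synchronized data, as JSON results
curl -H 'Accept: application/sparql-results+json' \
     'http://localhost:7878/query?query=SELECT%20*%20%7B%20%3Fs%20%3Fp%20%3Fo%20%7D%20LIMIT%2010'
```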
### Using a Docker image
Display the help menu:

```sh
docker run --rm oxigraph/oxigraph-wikibase --help
```
Run the web server, exposing it on port `7878` of the host machine and saving data in the local `./wikibase_data` folder:

```sh
docker run --init --rm -v $PWD/wikibase_data:/wikibase_data -p 7878:7878 oxigraph/oxigraph-wikibase -b 0.0.0.0:7878 -f /wikibase_data --mediawiki-api http://some.wikibase.instance/w/api.php --mediawiki-base-url http://some.wikibase.instance/wiki/
```
Warning: the Wikibase instance needs to be accessible from within the container.
A clean way to do that is to have both your Wikibase and `oxigraph_wikibase` services in the same `docker-compose.yml`.
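A minimal, hypothetical sketch of such a `docker-compose.yml`; the `wikibase` service, its image and the URLs are placeholders to adapt to your own Wikibase setup:

```yaml
version: "3"
services:
  wikibase:
    image: wikibase/wikibase   # placeholder: use your own Wikibase setup here
    ports:
      - "80:80"
  oxigraph:
    image: oxigraph/oxigraph-wikibase
    # The Wikibase service is reachable from this container by its service name
    command: >
      -b 0.0.0.0:7878 -f /wikibase_data
      --mediawiki-api http://wikibase/w/api.php
      --mediawiki-base-url http://wikibase/wiki/
    volumes:
      - ./wikibase_data:/wikibase_data
    ports:
      - "7878:7878"
    depends_on:
      - wikibase
```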
You could easily build your own Docker image by running `docker build -t oxigraph-wikibase -f wikibase/Dockerfile .` from the root directory.
## License
This project is licensed under either of
- Apache License, Version 2.0, (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.
## Contribution
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in Oxigraph by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.