What is known about HIV using Bio2RDF's SPARQL endpoints?
Created: 2009-02-05 06:13:53
Last updated: 2009-02-05 06:15:22
To answer this question, the Bio2RDF Atlas SPARQL endpoint for the mouse and human genomes, available at http://atlas.bio2rdf.org/sparql, is searched. The selected URIs are then loaded into your local Virtuoso triplestore at http://localhost:8890/sparql. You must enable insert mode on the graph.
Once the mashup is built, two SPARQL queries analyze the obtained graph. Finally, you can submit your own queries about HIV to the RDF mashup. Enjoy.
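Here, "insert mode" presumably means that the local Virtuoso SPARQL account has been granted the right to update graphs through the /sparql endpoint. As a minimal illustration of what loading triples into the local graph amounts to (this is not the workflow's own code; the graph name follows the Count_triples constant further down, the example triple is purely illustrative, and SPARQL 1.1 INSERT DATA is assumed to be accepted once update rights are granted):

// Hypothetical sketch: push one triple into the local graph over Virtuoso's
// /sparql endpoint. Assumes the SPARQL account may perform updates.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLEncoder;

public class InsertIntoLocalGraph {
    public static void main(String[] args) throws Exception {
        String endpoint = "http://localhost:8890/sparql";   // local Virtuoso, as in the description
        String graph = "http://localhost:8890/hiv";         // graph name used by the Count_triples constant
        String triple = "<http://bio2rdf.org/geneid:920> "
                      + "<http://www.w3.org/2000/01/rdf-schema#label> \"CD4\" .";  // illustrative triple only
        String update = "INSERT DATA { GRAPH <" + graph + "> { " + triple + " } }";
        URL url = new URL(endpoint + "?query=" + URLEncoder.encode(update, "UTF-8"));
        BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
        for (String line; (line = in.readLine()) != null; ) {
            System.out.println(line);   // Virtuoso returns a short result document on success
        }
        in.close();
    }
}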
These are the queries present in this workflow:
CONSTRUCT {
?s ?p ?o .
}
FROM <geneid,uniprot,omim,pubmed,go>
WHERE {
?s ?p ?o .
?o bif:contains "hiv" .
}
then
SELECT *
WHERE {
?s ?p ?o
}
then
SELECT ?type, count(*)
WHERE {
?s <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> ?type .
}
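Two Virtuoso-specific details are worth noting: bif:contains is Virtuoso's built-in full-text predicate, and the final aggregate query relies on Virtuoso accepting an aggregate without an explicit GROUP BY. Purely as an illustration (not the workflow's own code), the first query could be submitted directly to the Atlas endpoint like this; the LIMIT mirrors the workflow's Result_limit constant of 5:

// Illustrative sketch: send the CONSTRUCT query above to the Bio2RDF Atlas
// endpoint and print whatever RDF serialization the server returns by default.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLEncoder;

public class SearchAtlas {
    public static void main(String[] args) throws Exception {
        String query = "CONSTRUCT { ?s ?p ?o . } "
                     + "FROM <geneid,uniprot,omim,pubmed,go> "
                     + "WHERE { ?s ?p ?o . ?o bif:contains \"hiv\" . } "
                     + "LIMIT 5";                        // mirrors the Result_limit constant
        URL url = new URL("http://atlas.bio2rdf.org/sparql?query="
                + URLEncoder.encode(query, "UTF-8"));
        BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
        for (String line; (line = in.readLine()) != null; ) {
            System.out.println(line);
        }
        in.close();
    }
}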
Workflow Components
Authors (1)
Titles (2)
What is known about HIV |
What is known about HIV ? |
Descriptions (6)
To answer this question, the Bio2RDF Atlas SPARQL endpoint for the mouse and human genomes, available at http://atlas.bio2rdf.org/sparql, is searched. The selected URIs are then loaded into your local Virtuoso triplestore at http://localhost:8890/sparql. You must enable insert mode on the graph.
Once the mashup is built, two SPARQL queries analyze the obtained graph. Finally, you can submit your own queries about HIV to the RDF mashup.
These are the queries present in this workflow:
CONSTRUCT {
?s ?p ?o .
}
FROM <geneid,uniprot,omim,pubmed,go>
WHERE {
?s ?p ?o .
?o bif:contains "hiv" .
}
then
SELECT * WHERE {?s ?p ?o}
then
SELECT ?type, count(*)
WHERE {
?s <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> ?type .
} |
To answer this question, the Bio2RDF Atlas for the mouse and human genomes is searched. The selected URIs are then loaded into the local Virtuoso triplestore at http://localhost:8890/sparql. You must enable insert mode on the graph.
Once the mashup is built, two SPARQL queries analyze the obtained graph.
These are the corresponding queries:
CONSTRUCT {
?s ?p ?o .
}
FROM <geneid,uniprot,omim,pubmed,go>
WHERE {
?s ?p ?o .
?o bif:contains "hiv" .
}
then
SELECT * WHERE {?s ?p ?o}
then
SELECT ?type, count(*)
WHERE {
?s <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> ?type .
} |
SELECT *
FROM <geneid,uniprot,omim,pubmed,go>
WHERE {
?s ?p ?o .
?o bif:contains "hiv" .
}
then
SELECT ?type, count(*)
WHERE {
?s <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> ?type .
}
then
|
SELECT ?type, count(*)
WHERE {
?s <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> ?type .
} |
To answer this question, the Bio2RDF Atlas SPARQL endpoint for the mouse and human genomes, available at http://atlas.bio2rdf.org/sparql, is searched. The selected URIs are then loaded into your local Virtuoso triplestore at http://localhost:8890/sparql. You must enable insert mode on the graph.
Once the mashup is built, two SPARQL queries analyze the obtained graph. Finally, you can submit your own queries about HIV to the RDF mashup.
These are the corresponding queries:
CONSTRUCT {
?s ?p ?o .
}
FROM <geneid,uniprot,omim,pubmed,go>
WHERE {
?s ?p ?o .
?o bif:contains "hiv" .
}
then
SELECT * WHERE {?s ?p ?o}
then
SELECT ?type, count(*)
WHERE {
?s <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> ?type .
} |
SELECT *
FROM <geneid,uniprot,omim,pubmed,go>
WHERE {
?s ?p ?o .
?o bif:contains "hiv" .
}
then
SELECT * WHERE {?s ?p ?o}
then
SELECT ?type, count(*)
WHERE {
?s <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> ?type .
}
then
|
Dependencies (0)
Processors (12)
Name |
Type |
Description |
Search_Bio2RDF_atlas |
workflow |
|
Namespace_list_to_search |
stringconstant |
Value: geneid,uniprot,omim,pubmed,go |
Searched_term |
stringconstant |
Value: hiv |
Split_string_into_list |
localworker |
Script:
// Split the input string into a list, using the supplied regex as separator
// (defaults to a comma, which matches the namespace list above).
List split = new ArrayList();
if (!string.equals("")) {
    String regexString = ",";
    if (regex != void) {
        regexString = regex;
    }
    String[] result = string.split(regexString);
    for (int i = 0; i < result.length; i++) {
        split.add(result[i]);
    }
}
|
Result_limit |
stringconstant |
Value: 5 |
Insert_triples_in_local_graph |
workflow |
|
Virtuoso_server |
stringconstant |
Value: http://localhost:8890/ |
Concatenate_two_strings |
localworker |
Script: output = string1 + string2; |
Count_triples |
stringconstant |
Value: sparql?default-graph-uri=http%3A%2F%2Flocalhost%3A8890%2Fhiv&should-sponge=&query=SELECT+count(*)+WHERE+{%3Fs+%3Fp+%3Fo}&format=text%2Fhtml&debug=on | (decoded in the sketch after this table)
SPARQL_query |
localworker |
Script:
// Fetch the given URL (resolved against the 'base' input when provided) and
// return the response body through the 'contents' output.
URL inputURL = null;
if (base != void) {
inputURL = new URL(new URL(base), url);
}
else {
inputURL = new URL(url);
}
URLConnection con = inputURL.openConnection();
InputStream in = con.getInputStream();
InputStreamReader isr = new InputStreamReader(in);
Reader inReader = new BufferedReader(isr);
StringBuffer buf = new StringBuffer();
int ch;
while ((ch = inReader.read()) > -1) {
buf.append((char)ch);
}
inReader.close();
contents = buf.toString();
//String NEWLINE = System.getProperty("line.separator");
//
//URL inputURL = null;
//if (base != void) {
// inputURL = new URL(new URL(base), url);
//} else {
// inputURL = new URL(url);
//}
//StringBuffer result = new StringBuffer();
//BufferedReader reader = new BufferedReader(new InputStreamReader(inputURL.openStream()));
//String line = null;
//while ((line = reader.readLine()) != null) {
// result.append(line);
// result.append(NEWLINE);
//}
//
//contents = result.toString();
|
SPARQL_query_2 |
localworker |
Script:
// Fetch the given URL (resolved against the 'base' input when provided) and
// return the response body through the 'contents' output.
URL inputURL = null;
if (base != void) {
inputURL = new URL(new URL(base), url);
}
else {
inputURL = new URL(url);
}
URLConnection con = inputURL.openConnection();
InputStream in = con.getInputStream();
InputStreamReader isr = new InputStreamReader(in);
Reader inReader = new BufferedReader(isr);
StringBuffer buf = new StringBuffer();
int ch;
while ((ch = inReader.read()) > -1) {
buf.append((char)ch);
}
inReader.close();
contents = buf.toString();
//String NEWLINE = System.getProperty("line.separator");
//
//URL inputURL = null;
//if (base != void) {
// inputURL = new URL(new URL(base), url);
//} else {
// inputURL = new URL(url);
//}
//StringBuffer result = new StringBuffer();
//BufferedReader reader = new BufferedReader(new InputStreamReader(inputURL.openStream()));
//String line = null;
//while ((line = reader.readLine()) != null) {
// result.append(line);
// result.append(NEWLINE);
//}
//
//contents = result.toString();
|
Count_type |
stringconstant |
Value: sparql?default-graph-uri=http%3A%2F%2Flocalhost%3A8890%2Fhiv&should-sponge=&query=SELECT+%3Ftype%2C+count(*)+%0D%0AWHERE+{%0D%0A%3Fs+<http://www.w3.org/1999/02/22-rdf-syntax-ns#type>+%3Ftype+.%0D%0A}&format=text%2Fhtml&debug=on |
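The Count_triples and Count_type constants above are URL-encoded request fragments; SPARQL_query and SPARQL_query_2 resolve them against the Virtuoso_server base (http://localhost:8890/) and fetch the resulting URLs. Decoded, they correspond to the count and type-count queries shown at the top of this page, restricted to the graph http://localhost:8890/hiv. A minimal sketch of rebuilding the Count_triples fragment from its plain-text parts (URLEncoder is only a stand-in for however the constant was produced, so the escaping of braces may differ slightly, but the server decodes both forms the same way):

// Illustrative sketch: rebuild the Count_triples request fragment. Parameter
// names follow the constant shown in the table above.
import java.net.URLEncoder;

public class BuildCountRequest {
    public static void main(String[] args) throws Exception {
        String graph = "http://localhost:8890/hiv";
        String query = "SELECT count(*) WHERE { ?s ?p ?o }";
        String fragment = "sparql?default-graph-uri=" + URLEncoder.encode(graph, "UTF-8")
                        + "&should-sponge="
                        + "&query=" + URLEncoder.encode(query, "UTF-8")
                        + "&format=" + URLEncoder.encode("text/html", "UTF-8")
                        + "&debug=on";
        // Resolved against the Virtuoso_server value, this is the URL that the
        // SPARQL_query localworker fetches.
        System.out.println(fragment);
    }
}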
Beanshells (4)
Name |
Description |
Inputs |
Outputs |
Search_Atlas |
|
sparql_endpoint
term
namespace
limit
|
sparql_xml
|
insert_n3_into_graph |
|
host
port
n3
graph
|
html
|
describe_uri |
|
sparql_endpoint
bmuri
|
n3
|
triples_inserted |
|
string
bmuri
|
result
|
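The bodies of these four beanshells are not shown on this page. Purely as an illustration of the shape of the describe_uri step (inputs sparql_endpoint and bmuri, output n3), here is a sketch that builds a DESCRIBE request; it assumes Virtuoso's format parameter accepts text/rdf+n3, and fetching the URL can be done exactly as in the SPARQL_query localworker script above. This is not the author's code:

// Hypothetical sketch of a describe_uri-style step: build the request for an N3
// description of one Bio2RDF URI from a SPARQL endpoint.
import java.net.URLEncoder;

public class DescribeUriRequest {
    static String buildRequest(String sparqlEndpoint, String bmuri) throws Exception {
        String query = "DESCRIBE <" + bmuri + ">";
        return sparqlEndpoint
                + "?query=" + URLEncoder.encode(query, "UTF-8")
                + "&format=" + URLEncoder.encode("text/rdf+n3", "UTF-8");  // assumed Virtuoso format value
    }

    public static void main(String[] args) throws Exception {
        // Example inputs only: the Atlas endpoint from the description and an arbitrary Bio2RDF URI.
        System.out.println(buildRequest("http://atlas.bio2rdf.org/sparql",
                "http://bio2rdf.org/geneid:920"));
    }
}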
Outputs (4)
Name |
Description |
bmuris |
|
inserted |
|
count |
|
type_count |
|
Datalinks (16)
Source |
Sink |
Split_string_into_list:split |
Search_Bio2RDF_atlas:namespace |
Searched_term:value |
Search_Bio2RDF_atlas:term |
Result_limit:value |
Search_Bio2RDF_atlas:limit |
Namespace_list_to_search:value |
Split_string_into_list:string |
Search_Bio2RDF_atlas:bmuris |
Insert_triples_in_local_graph:bmuri |
Concatenate_two_strings:output |
Insert_triples_in_local_graph:graph |
Searched_term:value |
Concatenate_two_strings:string2 |
Virtuoso_server:value |
Concatenate_two_strings:string1 |
Virtuoso_server:value |
SPARQL_query:base |
Count_triples:value |
SPARQL_query:url |
Count_type:value |
SPARQL_query_2:url |
Virtuoso_server:value |
SPARQL_query_2:base |
Search_Bio2RDF_atlas:bmuris |
bmuris |
Insert_triples_in_local_graph:triples_inserted |
inserted |
SPARQL_query:contents |
count |
SPARQL_query_2:contents |
type_count |
Coordinations (2)
Controller |
Target |
Insert_triples_in_local_graph |
SPARQL_query |
SPARQL_query |
SPARQL_query_2 |
Uploader
License
All versions of this Workflow are licensed under:
Version 1 (earliest)
(of 2)
Credits (1)
(People/Groups)
Attributions (0)
(Workflows/Files)
None
Shared with Groups (0)
None
Featured In Packs (0)
None
Attributed By (0)
(Workflows/Files)
None
Favourited By (1)
Statistics
Other workflows that use similar services (0)
There are no workflows in myExperiment that use similar services to this Workflow.
Comments (2)
Dear Francois,
Firstly, congratulations on the great job! I have been looking forward to this for a long time, a binding between Bio2RDF and Taverna.
I have downloaded your workflow and tried to run it using Taverna 2.0. However, no results were ever returned to me. I dug into the workflow and found that the value given to the string processor "virtuoso_server" is http://localhost:8890. Is this the reason why I got no results back? What should I fix to make the workflow run on my local desktop?
Thanks,
Jun
Jun,
I think the workflow assumes that you have a triple-store running locally. Basically it goes and gets the data for you, sticks it into your server, and then lets you query it.
Anyone have this working?