Workflows

Showing 2916 results.

Workflow Keyword Extraction (1)

Perform keyword extraction using AlchemyAPI. See here for more information: http://www.alchemyapi.com/api/keyword/

Created: 2012-10-05 | Last updated: 2014-04-09

Credits: User cneudecker

Workflow OpenCalais (1)

Perform various enrichments using the OpenCalais API. See here for more information: http://www.opencalais.com/about

Created: 2012-10-04

Credits: User cneudecker

Workflow Concept Tagging (1)

Perform concept tagging using AlchemyAPI. See here for more information: http://www.alchemyapi.com/api/relation/

Created: 2012-10-04

Credits: User cneudecker

Workflow Entity Extraction (1)

Perform entity extraction using AlchemyAPI. See here for more information: http://www.alchemyapi.com/api/entity/

Created: 2012-10-04

Credits: User cneudecker

Workflow Sentiment Analysis (1)

Perform sentiment analysis using AlchemyAPI. See here for more information: http://www.alchemyapi.com/api/sentiment/

Created: 2012-10-04

Credits: User cneudecker

Workflow Relation Extraction (1)

Perform relation extraction using AlchemyAPI. See here for more information: http://www.alchemyapi.com/api/relation/

Created: 2012-10-04 | Last updated: 2014-04-09

Credits: User cneudecker

Workflow Matchbox Evaluation (1)

Matchbox evaluation against ground truth. The evaluation process first creates the matchbox output and ground truth lists. It then counts each page tuple from the matchbox output that is in the ground truth as a correctly identified tuple (true positive). Those that are not in the ground truth are counted as incorrectly identified tuples (false positives), and finally, those that are in the ground truth but not in the matchbox output are counted as missed tuples (false negatives). The precision...
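
For readers who want to reproduce the counting described above outside Taverna, here is a minimal Python sketch; the set-based representation and the variable names are assumptions for illustration, not the workflow's actual implementation:

```python
# Illustrative sketch only: the real workflow computes these counts inside Taverna.
def evaluate(matchbox_pairs, ground_truth_pairs):
    """Compare two collections of page tuples and return precision, recall and F1."""
    matchbox = set(matchbox_pairs)
    truth = set(ground_truth_pairs)

    true_positives = len(matchbox & truth)    # in the output and in the ground truth
    false_positives = len(matchbox - truth)   # in the output but not in the ground truth
    false_negatives = len(truth - matchbox)   # in the ground truth but missed

    precision = true_positives / ((true_positives + false_positives) or 1)
    recall = true_positives / ((true_positives + false_negatives) or 1)
    f1 = 2 * precision * recall / ((precision + recall) or 1)
    return precision, recall, f1

found = [("p001", "p002"), ("p003", "p004"), ("p005", "p009")]
truth = [("p001", "p002"), ("p003", "p004"), ("p007", "p008")]
print(evaluate(found, truth))  # (0.666..., 0.666..., 0.666...)
```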

Created: 2012-10-02 | Last updated: 2012-10-02

Credits: User Sven

Workflow Choosing the best k value for the k-NN cla... (1)

The process determines the best value for the parameter k for the k-NN classification of the Breast Cancer Wisconsin (Diagnostic) data set available in the UCI Machine Learning Repository. The optimal k is computed by using 10-fold cross-validation. (To get better results each cross-validation is repeated 10 times and the averages of the runs are considered.) Finally, a k-NN classifier is built and evaluated on the entire data set using the optimal k. During the process the resulting average ...
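
A rough scikit-learn equivalent of the procedure; the original is a RapidMiner process, so this is only an approximation of the same idea, not the workflow itself:

```python
# Sketch of choosing k for k-NN with repeated 10-fold cross-validation,
# using scikit-learn's copy of the Breast Cancer Wisconsin (Diagnostic) data.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)

# 10-fold cross-validation repeated 10 times, averaged, as in the description above.
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=0)

scores = {}
for k in range(1, 31):
    model = KNeighborsClassifier(n_neighbors=k)
    scores[k] = cross_val_score(model, X, y, cv=cv).mean()

best_k = max(scores, key=scores.get)
print("best k:", best_k, "mean CV accuracy:", round(scores[best_k], 4))

# Finally, build and evaluate a k-NN classifier on the entire data set with the optimal k.
final = KNeighborsClassifier(n_neighbors=best_k).fit(X, y)
print("training-set accuracy:", round(final.score(X, y), 4))
```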

Created: 2012-09-28 | Last updated: 2012-09-28

Workflow A simple process that demonstrates how to ... (1)

This simple RapidMiner process demonstrates how to use the Open File operator introduced in RapidMiner 5.2. In this example we use the operator to consume a data feed from the web.

Created: 2012-09-27

Workflow Querying SDSS DR8 to get magnitude properties (2)

This workflow gets a VOTable with the RA and DEC, among other values, of a list of galaxies. The workflow queries the SDSS DR8 VO cone search service to extract the objID, specObjID, ra, dec, u, g, r, i and z values.
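
A minimal pyvo sketch of this kind of cone search; the service URL below is a placeholder rather than the DR8 endpoint the workflow actually calls, and the returned column names may differ from those listed above:

```python
# Rough sketch of a VO cone search against SDSS DR8.
import pyvo

SDSS_DR8_SCS = "http://example.org/sdss-dr8/conesearch"  # placeholder endpoint
service = pyvo.dal.SCSService(SDSS_DR8_SCS)

# One (ra, dec) pair per galaxy in the input list; radius in degrees.
galaxies = [(185.728, 15.824), (49.950, 41.512)]
wanted = ["objID", "specObjID", "ra", "dec", "u", "g", "r", "i", "z"]

for ra, dec in galaxies:
    result = service.search(pos=(ra, dec), radius=0.02)
    table = result.to_table()  # astropy Table backed by the returned VOTable
    print(table[[c for c in wanted if c in table.colnames]])
```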

Created: 2012-09-26 | Last updated: 2013-03-08

Credits: User Susana

Workflow Searching for near galaxies in NED service (2)

This workflow takes as input an ASCII table with a list of galaxies. The table contains the PGC name of the galaxy, the CIG number, the search radius, Z1 and Z2. The workflow uses the radius, Z1, Z2 and the PGC name of the galaxy to query the NED service and get the HTML page with the list of companions. This HTML is parsed and a VOTable is built with the RA, DEC, magnitude, redshift, separation and velocity of each companion.
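
A rough sketch of the same query using astroquery's NED interface instead of parsing the NED HTML as the workflow does; the column names are those astroquery currently returns and may not match the workflow's VOTable exactly, and the example PGC name and limits are made up:

```python
import astropy.units as u
from astroquery.ipac.ned import Ned

def companions(pgc_name, radius_arcmin, z1, z2):
    """Objects within radius_arcmin of pgc_name whose redshift lies in [z1, z2]."""
    table = Ned.query_region(pgc_name, radius=radius_arcmin * u.arcmin)
    keep = (table["Redshift"] >= z1) & (table["Redshift"] <= z2)
    if hasattr(keep, "filled"):        # objects without a catalogued redshift are dropped
        keep = keep.filled(False)
    return table[keep]["Object Name", "RA", "DEC", "Magnitude and Filter",
                       "Redshift", "Separation", "Velocity"]

print(companions("PGC 035538", radius_arcmin=30, z1=0.001, z2=0.01))
```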

Created: 2012-09-26 | Last updated: 2012-10-09

Credits: User Susana

Workflow The title (1)

The description. This could be quite long.

Created: 2012-09-26

Credits: User Stian Soiland-Reyes

Workflow Merkel Government Statement Analysis (1)

Calculates the 100 most important words of each governmental statement by Chancellor Merkel, based on their log-likelihood values (compared to all of her governmental statements). Check out the pack at http://www.myexperiment.org/packs/333.html to get the workflow with the required text data.
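
The ranking can be illustrated with a small Python sketch of Dunning's log-likelihood statistic; this only mirrors the idea described above, not the workflow's actual components:

```python
# Rank the words of one statement by log-likelihood against the whole corpus.
import math
from collections import Counter

def log_likelihood(a, b, c, d):
    """G2 for a word seen a times in c tokens versus b times in d reference tokens."""
    e1 = c * (a + b) / (c + d)
    e2 = d * (a + b) / (c + d)
    ll = 0.0
    if a > 0:
        ll += a * math.log(a / e1)
    if b > 0:
        ll += b * math.log(b / e2)
    return 2 * ll

def top_keywords(statement_tokens, all_tokens, n=100):
    """The n words of one statement with the highest log-likelihood versus all statements.

    all_tokens must contain statement_tokens (the statement is part of the corpus)."""
    doc, ref = Counter(statement_tokens), Counter(all_tokens)
    c, d = sum(doc.values()), sum(ref.values())
    scores = {w: log_likelihood(doc[w], ref[w] - doc[w], c, d - c) for w in doc}
    return sorted(scores, key=scores.get, reverse=True)[:n]
```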

Created: 2012-09-19 | Last updated: 2012-09-19

Credits: User jhermes

Workflow STUDY OF QUANTIFICATION OF IMPURITIES AND ... (1)

A bulk drug, during its production process and after its scale-up, must be analysed for the presence of any impurities or related substances. This is to ensure that the impurities and related substances are within their limits as per ICH guidelines. Required brief study: the primary objective of the study is to develop an HPLC method and validate it for the detection and quantification of impurities and related substances in the manufactu...

Created: 2012-09-16

Credits: User Drkrishnasarmapathy

Workflow Filter concepts with profiles (4)

Purpose: Filter a list of concept id(s) by returning only those with a concept profile in the database.
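
A minimal sketch of the filtering step, assuming a hypothetical local table concept_profiles(concept_id); the real workflow queries the BioSemantics concept-profile database through its own services:

```python
# Keep only the concept ids that have a concept profile in the database.
import sqlite3

def filter_concepts_with_profiles(concept_ids, db_path="profiles.db"):
    """Return only those concept ids that have a concept profile in the database."""
    if not concept_ids:
        return []
    with sqlite3.connect(db_path) as conn:
        placeholders = ",".join("?" * len(concept_ids))
        rows = conn.execute(
            f"SELECT DISTINCT concept_id FROM concept_profiles "
            f"WHERE concept_id IN ({placeholders})",
            list(concept_ids),
        ).fetchall()
    found = {row[0] for row in rows}
    return [cid for cid in concept_ids if cid in found]
```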

Created: 2012-09-14 | Last updated: 2014-07-14

Credits: User Kristina Hettne, User Reinout van Schouwen, User Martijn Schuemie, Network-member BioSemantics

Workflow Cloud Parallel Processing of Tandem Mass S... (1)

An advanced scientific workflow for searching LC−MS data using SpectraST on the cloud. Uploading the libraries is optimized to achieve better performance, which makes this workflow more suitable for processing mzXML spectra files from human samples, as the corresponding NIST library needed by SpectraST is larger than 2 GB. Here we connect 3 nested workflows, in which the first 2, i.e., decomposeMzxml and uploadToCloud, run in parallel, while the third nested workflow, i.e. runSpectrastO...
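
The orchestration described above can be sketched as follows; the three functions are placeholders standing in for the nested workflows, not their actual implementations:

```python
# decomposeMzxml and uploadToCloud run in parallel; runSpectrast starts once both finish.
from concurrent.futures import ThreadPoolExecutor

def decompose_mzxml(mzxml_path):          # placeholder for the decomposeMzxml nested workflow
    return ["chunk_000.mzXML", "chunk_001.mzXML"]

def upload_to_cloud(library_path):        # placeholder for the uploadToCloud nested workflow
    return "s3://bucket/nist_human_library.splib"

def run_spectrast(chunks, library_uri):   # placeholder for the runSpectrast nested workflow
    return [f"{c}.pep.xml" for c in chunks]

with ThreadPoolExecutor(max_workers=2) as pool:
    chunks_future = pool.submit(decompose_mzxml, "sample.mzXML")
    library_future = pool.submit(upload_to_cloud, "nist_human_library.splib")
    results = run_spectrast(chunks_future.result(), library_future.result())

print(results)
```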

Created: 2012-09-10 | Last updated: 2015-08-19

Credits: User Yassene

Workflow Cloud Parallel Processing of Tandem Mass S... (1)

A workflow for searching LC−MS/MS data using SpectraST on the cloud. The processors mzxmlDecomposer, pepxmlUnzip and pepxmlComposer are identical to the ones in the X!Tandem workflow (Figure 2). The only difference is that the Xtandem processor is exchanged with the Spectrast processor and the constant inputs are adjusted to SpectraST. This approach is also possible for other search engines, as described in Data Decomposition and Recomposition Algorithms. More details can be found here:...

Created: 2012-09-10

Credits: User Yassene

Workflow Cloud Parallel Processing of Tandem Mass S... (1)

A workflow for searching LC−MS/MS mass spectrometry data using X!Tandem on the cloud. The workflow consists of 5 processors. The objectLogic processor prepares all inputs in the right format, i.e. keeping strings or converting them into file objects, according to what the following processor expects. The mzxmlDecomposer and pepxmlComposer run the decomposing/recomposing algorithms. objectLogic, mzxmlDecomposer and pepxmlComposer are Beanshell processors and run locally. Xtandem runs X!Tandem on a remo...
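
A schematic Python sketch of the decompose/search/recompose pattern described above; all function bodies are placeholders for the Beanshell and remote processors, not the workflow's actual code:

```python
# Split the mzXML input, search each chunk, then merge the per-chunk pepXML results.

def mzxml_decomposer(mzxml_path, n_chunks=4):
    """Split one mzXML file into smaller chunks that can be searched in parallel."""
    return [f"{mzxml_path}.part{i}" for i in range(n_chunks)]    # placeholder split

def xtandem(chunk_path, params):
    """Run X!Tandem on one chunk (on a remote cloud node in the original workflow)."""
    return f"{chunk_path}.pep.xml"                               # placeholder search result

def pepxml_composer(pepxml_parts):
    """Merge the per-chunk pepXML results back into a single result file."""
    return "merged.pep.xml"                                      # placeholder merge

chunks = mzxml_decomposer("sample.mzXML")
part_results = [xtandem(c, params={"database": "human.fasta"}) for c in chunks]
final_result = pepxml_composer(part_results)
print(final_result)
```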

Created: 2012-09-10

Credits: User Yassene

Workflow Detect ellipse failures and get votable wi... (1)

It detects whether ellipse has failed by checking if the resulting data file exists. It adds a column (validellipse) that contains 1 if ellipse didn't fail and 0 if it failed. It returns a VOTable that contains all the data with the new column, and a VOTable that contains only the rows where ellipse didn't fail. It uses the AstroTaverna plugin (http://wf4ever.github.com/astrotaverna/).
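
A rough astropy sketch of the failure check; the per-row output file naming and the name column are assumptions, and the real workflow performs these steps with AstroTaverna components:

```python
# Flag rows whose ellipse output file is missing and split the table accordingly.
import os
from astropy.table import Table

def flag_ellipse_failures(votable_path, result_file_for):
    """Add a validellipse column (1 = output file exists, 0 = ellipse failed)."""
    table = Table.read(votable_path, format="votable")
    table["validellipse"] = [
        1 if os.path.exists(result_file_for(row)) else 0 for row in table
    ]
    good_only = table[table["validellipse"] == 1]
    return table, good_only

# Example: assume ellipse writes one "<galaxy>.ellipse.dat" file per row,
# named after a hypothetical "name" column in the input VOTable.
all_rows, valid_rows = flag_ellipse_failures(
    "galaxies.vot", lambda row: f"{row['name']}.ellipse.dat"
)
all_rows.write("galaxies_with_flag.vot", format="votable", overwrite=True)
valid_rows.write("galaxies_valid.vot", format="votable", overwrite=True)
```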

Created: 2012-09-07

Credits: User Julian Garrido

Workflow xlmPath (2)

xml path

Created: 2012-09-06 | Last updated: 2012-09-06
