Environment
Created: 2013-01-04 20:41:31
Last updated: 2013-01-14 09:39:40
This workflow takes as input the path of the tabular *.pckl Python pickle dataset created in the previous workflow, together with the database connection settings and several criteria for filtering the potential companions of the target galaxies. It produces a file with the SDSS identifier of each target galaxy in the sample, its environmental estimators, and the radius at which the 10th companion was found. The workflow searches for potential companions within radii ranging from 3 Mpc to 11 Mpc, in steps of 1 Mpc. The user may modify these numbers at the input stage, as well as several limits and ranges needed in the filtering process. As in the previous workflow, the other inputs are the working path of the digital experiment and the database connection settings: hostname, login and password.
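The radius scan described above can be sketched as follows; the function name and its defaults are illustrative only, not taken from the workflow's scripts:

```python
def companion_search_radii(r_min=3.0, r_max=11.0, step=1.0):
    """Radii (in Mpc) at which potential companions are searched,
    matching the 3-11 Mpc range with 1 Mpc steps described above."""
    radii = []
    r = r_min
    while r <= r_max + 1e-9:  # small tolerance guards float drift
        radii.append(r)
        r += step
    return radii

# With the defaults this yields [3.0, 4.0, ..., 11.0]: nine radii.
```

These are exactly the values the user may override at the input stage, together with the other filtering limits.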
Execution environment
The first requirement to run the workflows provided by both ROs is Taverna Workbench 2.4 or higher. AstroTaverna (a Taverna plugin) is also needed, providing the functionality for querying Virtual Observatory web services and for managing standard VOTable data formats. In general, the execution environment is a Linux distribution with Python 2.x and a bash shell, plus the psycopg and numpy Python packages. Access to a PostgreSQL database storing the physical parameters provided by SDSS is also needed; a dump file of the database may be downloaded from the AMIGA web server and deployed so that it is accessible from the local execution environment.
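A minimal pre-flight check of such an environment might look like this. The import names `numpy` and `psycopg2` are assumptions (the usual import names for the packages the page calls "numpy" and "psycopg"); the check only reports availability rather than failing hard:

```python
import importlib.util
import sys

def check_environment(packages=("numpy", "psycopg2")):
    """Report the interpreter version and whether each required
    package can be found, without importing anything heavyweight."""
    report = {"python": sys.version_info[:2]}
    for name in packages:
        report[name] = importlib.util.find_spec(name) is not None
    return report
```

Running `check_environment()` before launching the workflow gives a quick view of whether the Python side of the environment matches the description above.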
Workflow Components
Authors (2)
Juan de Dios Santander Vela
José Sabater Montes
Titles (1)
Environment
Descriptions (1)
Dependencies (0)
Inputs (8)
Name | Description
dbuser |
dbpasswd |
dbhost |
dbport |
candidatescube_pickle |
csv_out_filename |
csv_refined_filename |
working_directory |
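The four db* inputs together specify the PostgreSQL connection. A hypothetical helper assembling them into a libpq-style connection string might look like this; the database name "amiga" is an assumption, since the page does not name the database:

```python
def build_dsn(dbhost, dbport, dbuser, dbpasswd, dbname="amiga"):
    """Assemble a libpq-style connection string from the workflow's
    dbhost, dbport, dbuser and dbpasswd inputs. The default dbname
    is a placeholder, not taken from the workflow."""
    return (f"host={dbhost} port={dbport} "
            f"user={dbuser} password={dbpasswd} dbname={dbname}")

# e.g. build_dsn("localhost", 5432, "amiga_user", "secret")
```

Such a string could then be handed to the psycopg connection call inside 8run1.py and 9run2.py, which receive these same inputs through the datalinks listed below.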
Processors (6)
Name | Type | Description
aux_dens.py | externaltool | aux_dens.py is imported by environment.py; because it indirectly uses candidatescube_pickle, it has also been linked to it.
sample_total.py | externaltool |
environment.py | externaltool |
8run1.py | externaltool |
9run2.py | externaltool |
get_last_line_of_STDIN | externaltool |
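Judging by its name and its datalinks, get_last_line_of_STDIN simply emits the final line of whatever it reads; by the datalinks below, that final line of 9run2.py's output becomes the csv_filepath result. A sketch of that behaviour:

```python
import io

def get_last_line(stream):
    """Return the final line of a text stream, stripped of its
    trailing newline -- the behaviour the processor's name implies."""
    last = ""
    for line in stream:
        last = line
    return last.rstrip("\n")

demo = io.StringIO("log line 1\nlog line 2\nresult.csv\n")
print(get_last_line(demo))  # -> result.csv
```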
Outputs (1)
Name | Description
csv_filepath |
Datalinks (20)
Source | Sink
sample_total.py:STDOUT | aux_dens.py:STDIN
working_directory | sample_total.py:STDIN
aux_dens.py:STDOUT | environment.py:STDIN
dbuser | 8run1.py:dbuser
dbpasswd | 8run1.py:dbpasswd
dbhost | 8run1.py:host
candidatescube_pickle | 8run1.py:pickle
csv_out_filename | 8run1.py:first_run_csv
working_directory | 8run1.py:working_directory
environment.py:STDOUT | 8run1.py:STDIN
dbuser | 9run2.py:dbuser
dbpasswd | 9run2.py:dbpasswd
dbhost | 9run2.py:host
dbport | 9run2.py:tcpport
csv_out_filename | 9run2.py:first_run_csv
csv_refined_filename | 9run2.py:second_run_csv
working_directory | 9run2.py:working_directory
8run1.py:STDOUT | 9run2.py:STDIN
9run2.py:STDOUT | get_last_line_of_STDIN:STDIN
get_last_line_of_STDIN:STDOUT | csv_filepath
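The STDOUT-to-STDIN datalinks above form a shell-style pipeline between the scripts. The pattern can be sketched with subprocess, using two stand-in stages in place of the real workflow scripts (which are not reproduced here):

```python
import subprocess
import sys

def chain(stages, initial_input=""):
    """Run a list of command lines in sequence, piping each stage's
    STDOUT into the next stage's STDIN, as the datalinks above do."""
    data = initial_input
    for cmd in stages:
        proc = subprocess.run(cmd, input=data, capture_output=True,
                              text=True, check=True)
        data = proc.stdout
    return data

# Two hypothetical stages standing in for e.g.
# sample_total.py -> aux_dens.py:
stages = [
    [sys.executable, "-c", "import sys; print(sys.stdin.read().upper())"],
    [sys.executable, "-c", "import sys; print(len(sys.stdin.read().strip()))"],
]
result = chain(stages, "abc")
# result == "3\n": stage 1 upper-cases, stage 2 counts the characters
```

Taverna manages these links itself at run time; the sketch only illustrates the dataflow the table describes.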
License
All versions of this Workflow are licensed under the BSD License.
Version 1 (earliest) of 2
Credits (2)
Shared with Groups (2)
Featured In Packs (1)