= Network Inference Pipeline
This project uses the https://github.com/drostlab/network-inference-toolbox[Network Inference Toolbox], a collection of Singularity containers that package various pre-existing tools for Gene Regulatory Network Inference, to infer and evaluate networks for specific datasets.
== Dependencies
* https://www.python.org/[Python] v3.9 with https://python-poetry.org/[Poetry] set up
* https://sylabs.io/[Singularity] (tested with v3.7)
* `git` (optional)
== Overview
This project is a https://dvc.org/[DVC] pipeline that glues data preprocessing, tools from `toolbox/bin` and network evaluation together. After setup as described below, you should be able to `dvc repro` the results.
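Once everything is set up, the standard DVC CLI drives the pipeline; for example, from the repository root:

[source,sh]
----
dvc dag        # visualize the stage dependency graph
dvc status     # show which stages are out of date
dvc repro      # (re)run all stages needed to produce the results
----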
NOTE: To package up materials used for the https://doi.org/10.1101/2021.01.17.427026[`noisyR` paper] (Moutsopoulos et al.), run `dvc repro artifacts/noisyR-paper/dvc.yaml`, which will also implicitly generate its dependencies. The resulting tarball will be placed in `artifacts/noisyR-paper`.
== Setup
Clone (or download and extract) this repository and enter its directory. You then need to set up both the Python dependencies and the previously mentioned Network Inference Toolbox.
=== Python Dependencies
The recommended (but optional) way to set up DVC is to install it into a virtual environment: run `poetry install`, then call `poetry shell` and always run `dvc` from within that shell.
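Concretely, the Python setup can look like this (assuming a working Poetry installation; `poetry shell` spawns a subshell with the virtual environment activated):

[source,sh]
----
poetry install   # install DVC and the other Python dependencies
poetry shell     # enter the virtual environment
dvc --version    # verify that DVC is available
----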
=== Network Inference Toolbox
The recommended way to set up the Toolbox is to pull the Git submodule. In the repository root, execute:
[source,sh]
----
git submodule init
git submodule update
----
After setting up the submodule, you can build the containers by entering the `toolbox` directory and then running `make` (or, if the containers were already built somewhere else, just copy them to `toolbox/bin`).
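Spelled out (assuming the toolbox `Makefile` builds the containers into `bin/`, as described above):

[source,sh]
----
cd toolbox
make           # build the Singularity containers into bin/
cd ..
----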
The only parts of the Network Inference Toolbox that are strictly necessary to run the pipeline are the assembled Singularity containers (i.e. `.sif` files), which the pipeline expects to find in `toolbox/bin`. If you only want to _run_ the pipeline (and not change it), you can skip the submodule setup described above and instead copy the containers into `toolbox/bin`, or symlink `toolbox/bin` to another location where the containers are available.
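Before running `dvc repro`, it can help to check that the containers are actually in place. A minimal sketch (the `check_sif_dir` helper and the `TOOLBOX_BIN` override are hypothetical, not part of the pipeline):

[source,sh]
----
#!/bin/sh
# Hypothetical helper: count the .sif containers in a directory and
# fail if none are present.
check_sif_dir() {
    dir="$1"
    count=$(find "$dir" -maxdepth 1 -name '*.sif' 2>/dev/null | wc -l | tr -d ' ')
    if [ "$count" -eq 0 ]; then
        echo "no .sif containers found in $dir" >&2
        return 1
    fi
    echo "found $count container(s) in $dir"
}

check_sif_dir "${TOOLBOX_BIN:-toolbox/bin}" || echo "build or copy the containers first" >&2
----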