Nextflow: Pipeline Lift
This is a collection of scripts to help develop Nextflow pipelines that will run successfully on ICA. Some syntax bugs may get introduced into your Nextflow code during the liftover. One suggestion is to run the steps as described below and then open the resulting files in Visual Studio Code with the Nextflow plugin installed. You may also need to run smoke tests on your code to identify syntax errors you might not catch at first glance.
This is not an official Illumina product, but is intended to make your Nextflow experience in ICA more fruitful.
Some examples of Nextflow pipelines that have been lifted over with this repo can be found here.
Some additional examples of ICA-ported Nextflow pipelines are available here.
Some additional repos that can help with your ICA experience can be found below:
Relaunch a pipeline analysis
Monitor your analysis run in ICA and troubleshoot
Wrap a WDL-based workflow for use in ICA
Wrap a Nextflow-based workflow for use in ICA. This will allow you to test your main.nf script. If you have a Nextflow pipeline that is more nf-core-like (i.e. one with several subworkflow and module files), this may be more appropriate. Any and all comments are welcome.
What these scripts do:
Parse the configuration files and Nextflow scripts (main.nf, workflows, subworkflows, modules) of a pipeline, and update the pipeline's configuration with pod directives that tell ICA what compute instance to run each process on
Strip out parameters that ICA utilizes for workflow orchestration
Migrate the manifest closure to a conf/base.ica.config file
Ensure that docker is enabled
Add workflow.onError handlers to the Nextflow scripts (main.nf, workflows, subworkflows, modules) to aid troubleshooting (see the sketch after this list)
Modify the processes that reference scripts and tools in the bin/ directory of a pipeline's projectDir, so that when ICA orchestrates your Nextflow pipeline it can find and properly execute your pipeline process
Make additional edits to ensure your pipeline runs more smoothly on ICA
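For reference, here is a minimal sketch of the kind of workflow.onError handler these scripts add. The exact handler the scripts emit may differ; the ICA-specific copy-back behavior is only paraphrased in the comments.

```nextflow
// Illustrative only: a simple onError hook appended to main.nf so that
// failure details survive the run. The real scripts also arrange for the
// process work directory to be copied back to ICA when an analysis fails.
workflow.onError {
    println "Pipeline failed: ${workflow.errorMessage}"
    println "Error report: ${workflow.errorReport}"
}
```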
Nextflow workflows on ICA are orchestrated by Kubernetes and require a parameters XML file containing data inputs (i.e. files and folders) and other string-based options for all configurable parameters, so that these can be properly passed from ICA to your Nextflow workflows.
Nextflow processes will need to contain a reference to a container, i.e. a Docker image that will run that specific process.
Nextflow processes will need a pod annotation specified for ICA to know what instance type to run the process on (see the sketch below).
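A minimal sketch of a process that satisfies the last two requirements. The annotation key and presetSize value shown here follow ICA's documented pod directive convention; check the Compute Types table for valid sizes.

```nextflow
process EXAMPLE_TASK {
    // Docker image that runs this process (required by ICA)
    container 'ubuntu:22.04'
    // pod annotation telling ICA which compute instance type to use
    pod annotation: 'scheduler.illumina.com/presetSize', value: 'standard-small'

    input:
    path input_file

    output:
    path 'out.txt'

    script:
    """
    wc -l ${input_file} > out.txt
    """
}
```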
The scripts mentioned below can be run in the Docker image keng404/nextflow-to-icav2-config:0.0.3.
This image has:
nf-core installed
All R scripts in this repo, with the relevant R libraries installed
The ICA CLI installed, to allow for pipeline creation and for CLI templates to request pipeline runs after the pipeline is created in ICA
You'll likely need to run the image with a docker command like the one below so that you can run git commands within the container:
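A minimal sketch of such a command, assuming you want your current directory mounted at the same path inside the container (adjust the mounts to your environment):

```bash
docker run -it --rm \
    -v "$(pwd)":"$(pwd)" \
    -w "$(pwd)" \
    keng404/nextflow-to-icav2-config:0.0.3 /bin/bash
```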
where pwd is your $HOME directory.
If you already have a specific pipeline from GitHub in mind, you can skip the step below.
You'll first need to download the python module from nf-core via a pip install nf-core command. Then you can use nf-core list --json to return a JSON metadata file containing the current pipelines in the nf-core repository. You can choose which pipelines to git clone, but as a convenience, the wrapper nf-core.conversion_wrapper.R will perform a git pull, parse the nextflow_schema.json files and generate parameter XML files, and then read the configuration and Nextflow scripts and make some initial modifications for ICA development. Lastly, these pipelines are created in an ICA project of your choosing, so you will need to generate and download an API key from your ICA domain.
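Putting those steps together, a sketch of an end-to-end run. The first two commands come straight from this guide; the wrapper's flag names are assumptions made for illustration, so check the script's own usage message for the real interface:

```bash
pip install nf-core
nf-core list --json > pipeline_metadata.json

# Hypothetical flags; consult the script's help output for actual names.
Rscript nf-core.conversion_wrapper.R \
    --pipeline-json pipeline_metadata.json \
    --api-key-file /path/to/ica_api_key.txt \
    --ica-project my_ica_project
```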
The Project view should be the default view after logging into your private domain (https://my_domain.login.illumina.com) and clicking on your ICA 'card' (this will redirect you to https://illumina.ica.com/ica).
GIT_HUB_URL can be specified to grab pipeline code from GitHub. If you intend to lift over anything in the master branch, your GIT_HUB_URL might look like https://github.com/keng404/my_pipeline. If there is a specific release tag you intend to use, you can use the convention https://github.com/keng404/my_pipeline:my_tag.
Alternatively, if you have a local copy/version of a Nextflow pipeline you'd like to convert and use in ICA, you can use the --pipeline-dirs argument to specify this.
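For example, under the assumption that the wrapper exposes the GitHub URL as a flag (the flag spelling below is a guess; --pipeline-dirs is taken from this guide):

```bash
# Lift over a tagged release from GitHub (flag name is illustrative)
Rscript nf-core.conversion_wrapper.R --git-hub-url https://github.com/keng404/my_pipeline:my_tag

# Or lift over a local checkout instead
Rscript nf-core.conversion_wrapper.R --pipeline-dirs /path/to/my_local_pipeline
```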
In summary, you will need the following prerequisites, either to run the wrapper referenced above or to carry out individual steps below.
git clone the nf-core pipelines of interest
Install the python module nf-core and create a JSON file using the command line nf-core list --json > {PIPELINE_JSON_FILE}
What nf-core.conversion_wrapper.R does for each Nextflow pipeline:
A Nextflow schema JSON is generated by nf-core's python library nf-core, which can be installed via a pip install nf-core command.
The nextflow.config and a base config file are updated so that the pipeline is compatible with ICA. This script will update your configuration files so that they integrate better with ICA. The flag --is-simple-config will create a base config file from a template. This flag will also be active if no arguments are supplied to --base-config-files.
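To give a sense of the result, a trimmed sketch of what an ICA-oriented base config can contain. The annotation key follows ICA's pod directive convention; the specific values are assumptions, not the scripts' exact output:

```nextflow
// Sketch of conf/base.ica.config contents (illustrative values)
docker.enabled = true

process {
    // default ICA compute instance for processes without their own directive
    pod = [annotation: 'scheduler.illumina.com/presetSize', value: 'standard-small']
    errorStrategy = 'retry'
    maxRetries = 3
}
```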
This step adds some updates to your module scripts to allow for easier troubleshooting (i.e. copying the work directory back to ICA if an analysis fails). It also allows ICA's orchestration of your Nextflow pipeline to properly handle any script/binary in the bin/ directory of your pipeline's $projectDir.
You may have to edit your {PARAMETERS_XML} file if these edits are unnecessary.
Currently ICA supports Nextflow versions nextflow/nextflow:22.04.3 and nextflow/nextflow:20.10.0 (with 20.10.0 to be deprecated soon).
nf-core.create_ica_pipeline.R
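A hypothetical invocation is sketched below; apart from --developer-mode, which is described next, every flag name here is an assumption made for illustration rather than the script's verified interface:

```bash
# Hypothetical flags; check the script's usage message for actual names.
Rscript nf-core.create_ica_pipeline.R \
    --nextflow-script /path/to/my_pipeline/main.nf \
    --parameters-xml /path/to/my_pipeline.pipeline.xml \
    --ica-project-name my_ica_project \
    --pipeline-name my_pipeline \
    --api-key-file /path/to/ica_api_key.txt
```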
Add the flag --developer-mode to the command line above if you have custom groovy libraries or module files referenced in your workflow. When this flag is specified, the script will upload these files and directories to ICA and update the parameters XML file to allow you to specify directories under the parameter project_dir and files under input_files. This will ensure that these files and directories are placed in the $workflow.launchDir when the pipeline is invoked.
As a convenience, you can also get a templated CLI command to help run a pipeline (i.e. submit a pipeline request) in ICA via the following:
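The script name below is a placeholder (the exact helper script isn't shown in this guide); --output-json and --template-json are the parameters described next:

```bash
# Hypothetical helper invocation: generate a templated CLI request,
# saving the configurable values to a JSON file.
Rscript <templating_script>.R --pipeline-name my_pipeline --output-json my_pipeline.ICAv2_CLI_template.json

# After editing that JSON, feed it back to produce a ready-to-run CLI command.
Rscript <templating_script>.R --template-json my_pipeline.ICAv2_CLI_template.json
```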
There will be a corresponding JSON file (i.e. a file with the extension *ICAv2_CLI_template.json) that saves these values; you can modify and configure it to build out templates or launch the specific pipeline run you desire. You can specify the name of this JSON file with the parameter --output-json.
Once you modify this file, you can use --template-json to specify it and create the CLI command you can use to launch your pipeline.
If you have a previously successful analysis with your pipeline, you may find this approach more useful.
Where possible, these scripts search for config files that refer to a test (i.e. test.config, test_full.config, test*config) and create a boolean parameter params.ica_smoke_test that can be toggled on/off as a sanity check that the pipeline works as intended. By default, this parameter is set to false. When set to true, these test config files are loaded in your main nextflow.config.
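A minimal sketch of how that toggle can look in the generated nextflow.config (the file name is an assumption; your pipeline's test configs may differ):

```nextflow
params.ica_smoke_test = false

if (params.ica_smoke_test) {
    // load the pipeline's test profile as a quick sanity check on ICA
    includeConfig 'conf/test.config'
}
```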
Generates a parameter XML file based on nextflow_schema.json, nextflow.config, and conf/
Take a look at the generated XML to understand a bit more of what's done with it, as you may want to make further edits to this file for better usability.
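For orientation only, a heavily trimmed sketch of the shape such a parameter XML can take. The element names follow ICA's pipeline-definition schema, but treat the details as assumptions and compare against a file the scripts actually generate:

```xml
<pd:pipeline xmlns:pd="xsd://www.illumina.com/ica/cpe/pipeline/definitions" code="" version="1.0">
    <pd:dataInputs>
        <!-- file and folder inputs (e.g. samplesheets, project_dir) go here -->
        <pd:dataInput code="input" format="UNKNOWN" type="FILE" required="true" multiValue="false">
            <pd:label>input</pd:label>
            <pd:description>Input samplesheet</pd:description>
        </pd:dataInput>
    </pd:dataInputs>
    <pd:steps>
        <pd:step execution="MANDATORY" code="General">
            <pd:label>General</pd:label>
            <pd:description>String-based options</pd:description>
            <pd:tool code="general">
                <pd:parameter code="genome" minValues="0" maxValues="1" classification="USER">
                    <pd:label>genome</pd:label>
                    <pd:description>Reference genome name</pd:description>
                    <pd:stringType/>
                    <pd:value>GRCh38</pd:value>
                </pd:parameter>
            </pd:tool>
        </pd:step>
    </pd:steps>
</pd:pipeline>
```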
A table of instance types and their associated CPU and memory specs can be found in the ICA documentation under a table named Compute Types.
These scripts have been made to be compatible with nf-core workflows, so you may find the concepts from the nf-core documentation a better starting point.
Next, you'll need an API key file for ICA, which can be generated using the instructions in the ICA documentation.
Finally, you'll need to create a project in ICA. You can do this via the CLI and API, but you should be able to follow the ICA documentation to create a project via the ICA GUI.
Install the ICA CLI by following these instructions.
A table of all CLI releases for mac, linux, and windows is also available.