Data Transfer Options
ICA Connector
The platform provides Connectors to facilitate automation of data operations (i.e., upload, download, linking). The Connectors are helpful when you want to sync data between ICA and your local computer, or link data between projects in ICA.
ICA CLI
The ICA CLI upload/download is useful when handling large files and folders, especially when you are working on a remote server that you connect to from your local computer. You can use icav2 projects enter <project-name/id> to set the project context that the CLI uses for subsequent commands. If the project context is not set, you can supply the additional parameter --project-id <project-id> to specify the project for a command.
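For example, a minimal sketch of the two approaches; the project name/ID is a placeholder, and icav2 projectdata list is used only as an illustration of a command that runs against a project:

```bash
# Set the project context once; later commands run against this project
icav2 projects enter <project-name-or-id>

# Or skip the context and name the project explicitly on each command
icav2 projectdata list --project-id <project-id>
```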
Upload Data
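A minimal upload sketch, assuming a local file and a destination folder path (both placeholders) and the usual local-path-then-remote-path argument order of icav2 projectdata upload:

```bash
# Upload a local file (or folder) to a folder in the current project context
icav2 projectdata upload ./data.fastq.gz /uploads/
```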
Download Data
Note: Because of how S3 manages storage, it does not have folders in the traditional sense. As a result, if you provide the ID of an empty "folder", nothing will be downloaded.
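A minimal download sketch; the remote path and local target directory are placeholders, and a file ID (fil.…) can generally be used in place of the path:

```bash
# Download a file from the current project context to a local directory
icav2 projectdata download /uploads/data.fastq.gz ./downloads/
```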
ICA API
Another option for uploading data to ICA is the ICA API. This option is helpful when data needs to be transferred by automated scripts. You can use the following two endpoints to upload a file to ICA.
Post - /api/projects/{projectId}/data with the following body, which will create a partial file at the desired location and return a dataId for the file to be uploaded. {projectId} is the project ID of the destination project. You can find the projectId on your project's details page (Project > Details > URN > urn:ilmn:ica:project:projectId#MyProject).
Post - /api/projects/{projectId}/data/{dataId}:createUploadUrl where dataId is the dataId from the response of the previous call. This call will generate the URL that you can use to upload the file.
Example
Create data in the project by making the API call below. If you don't already have an API key, refer to the instructions on the support page for guidance on generating one.
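A curl sketch of the create-data call, using the project ID, folder ID, and file name from the example described below. The base URL, the X-API-Key header, and the exact body fields (name, folderId, dataType) are assumptions about the request shape, so check the API reference for your deployment:

```bash
# Create a partial file entry in the project; the response includes its dataId
curl -X POST \
  "https://ica.illumina.com/ica/rest/api/projects/41d3643a-5fd2-4ae3-b7cf-b89b892228be/data" \
  -H "X-API-Key: <your-api-key>" \
  -H "Content-Type: application/vnd.illumina.v3+json" \
  -d '{"name": "tempFile.txt", "folderId": "fol.579eda846f1b4f6e2d1e08db91408069", "dataType": "FILE"}'
```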
In the example above, we're generating a partial file named 'tempFile.txt' within a project identified by the project ID '41d3643a-5fd2-4ae3-b7cf-b89b892228be', situated inside a folder with the folder ID 'fol.579eda846f1b4f6e2d1e08db91408069'. You can access project, file, or folder IDs either by logging into the ICA web interface or through the use of the ICA CLI.
The response will look like this:
Retrieve the data/file ID from the response (for instance: fil.b13c782a67e24d364e0f08db9f537987) and employ the following format for the Post request - /api/projects/{projectId}/data/{dataId}:createUploadUrl:
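A curl sketch of the createUploadUrl call, reusing the IDs from the example and the same base-URL and header assumptions as above:

```bash
# Request a pre-signed upload URL for the file entry created earlier
curl -X POST \
  "https://ica.illumina.com/ica/rest/api/projects/41d3643a-5fd2-4ae3-b7cf-b89b892228be/data/fil.b13c782a67e24d364e0f08db9f537987:createUploadUrl" \
  -H "X-API-Key: <your-api-key>"
```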
The response will look like this:
Use the URL from the response to upload a file (tempFile.txt) as follows:
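The returned URL is a pre-signed URL, so the upload itself is a plain HTTP PUT of the file contents; a sketch with the URL left as a placeholder:

```bash
# PUT the file body to the pre-signed URL returned by createUploadUrl
curl -X PUT --upload-file tempFile.txt "<url-from-the-createUploadUrl-response>"
```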
AWS CLI
ICA allows you to upload/download data directly using the AWS CLI. This is especially helpful when dealing with an unstable internet connection while transferring a large amount of data: if the transfer gets interrupted midway, you can use the sync command to resume it from the point where it stopped.
To connect to ICA storage, you must first download and install the AWS CLI on your local system. You will need temporary credentials to configure the AWS CLI for access to ICA storage. You can generate temporary credentials through the ICA CLI and use them to authenticate the AWS CLI against ICA. The temporary credentials can also be obtained using this ICA API endpoint:
Generate temporary credentials
Example: generating temporary credentials with the ICA CLI. If you are uploading data to the /cli-upload/ folder, you can get temporary credentials for that folder using icav2 projectdata temporarycredentials /cli-upload/. The command produces output containing the accessKey, secretKey, and sessionToken that you will need to configure the AWS CLI to access this folder.
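A sketch of the command and a trimmed view of the fields you will need; the values are placeholders, and additional fields in the real output are omitted:

```bash
icav2 projectdata temporarycredentials /cli-upload/

# Trimmed output (placeholder values):
#   awsTempCredentials.accessKey     <access-key>
#   awsTempCredentials.secretKey     <secret-key>
#   awsTempCredentials.sessionToken  <session-token>
#   awsTempCredentials.bucket        <bucket-name>
#   awsTempCredentials.objectPrefix  <object-prefix>
```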
Copy the awsTempCredentials.accessKey, awsTempCredentials.secretKey, and awsTempCredentials.sessionToken values into the credentials file ~/.aws/credentials. It should look something like the following.
Example format for credentials file:
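A sketch of the credentials file with the three values substituted in; using the default profile is an assumption, and you can instead create a named profile and pass --profile to the AWS CLI:

```ini
[default]
aws_access_key_id     = <awsTempCredentials.accessKey>
aws_secret_access_key = <awsTempCredentials.secretKey>
aws_session_token     = <awsTempCredentials.sessionToken>
```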
The temporary credentials expire after 36 hours. If they expire before the copy is complete, you can generate new credentials and use the AWS sync command to resume from where it left off.
The following are a few AWS commands that demonstrate this usage. The remote path in the commands below is constructed from the output of the temporarycredentials command, in this format: s3://<awsTempCredentials.bucket>/<awsTempCredentials.objectPrefix>
Example AWS commands
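A few sketches; the local paths are placeholders, and the remote path follows the s3://<awsTempCredentials.bucket>/<awsTempCredentials.objectPrefix> format described above:

```bash
# List the contents of the ICA folder
aws s3 ls s3://<awsTempCredentials.bucket>/<awsTempCredentials.objectPrefix>

# Upload a single local file
aws s3 cp ./data.fastq.gz s3://<awsTempCredentials.bucket>/<awsTempCredentials.objectPrefix>

# Sync a local folder; re-running the same command resumes an interrupted transfer
aws s3 sync ./local-folder s3://<awsTempCredentials.bucket>/<awsTempCredentials.objectPrefix>
```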
You can also write scripts to monitor the progress of your copy operation and regenerate and refresh the temporary credentials before they expire.
rclone
You may also use rclone for data transfer if you prefer. The steps to generate temporary credentials are the same as above. You can run rclone config to enter the keys and tokens from the temporary credentials; you will need to select the advanced edit option in order to enter the session token. After completing the configuration, your config file (~/.config/rclone/rclone.conf) should look like this:
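A sketch of the resulting remote definition; the remote name ica and the region value are assumptions, and the three credential values come from the temporarycredentials output:

```ini
[ica]
type = s3
provider = AWS
access_key_id = <awsTempCredentials.accessKey>
secret_access_key = <awsTempCredentials.secretKey>
session_token = <awsTempCredentials.sessionToken>
region = <region-for-your-tenant>
```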
Example Rclone commands
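A few command sketches, assuming the remote above is named ica and reusing the same bucket/objectPrefix placeholders:

```bash
# List files under the ICA folder
rclone ls ica:<awsTempCredentials.bucket>/<awsTempCredentials.objectPrefix>

# Upload a local folder; rclone skips files that already match on the remote
rclone copy ./local-folder ica:<awsTempCredentials.bucket>/<awsTempCredentials.objectPrefix> --progress

# Download in the other direction
rclone copy ica:<awsTempCredentials.bucket>/<awsTempCredentials.objectPrefix> ./local-folder --progress
```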