Scripts That Help Automate Steps

  • Route Artifacts Based Off a Template File

  • Invoking bcl2fastq from BCL Conversion and Demultiplexing Step

  • Finishing the Current Step and Starting the Next

Advancing/Completing a Protocol Step via the API

This topic forms a natural partner to the Starting a Protocol Step via the API application example. When protocol steps are being initiated programmatically, we must know how to advance the step through the various states to completion.

Solution

Advancing a step is a simple task: it requires little more than the steps/advance API endpoint.

Let us consider a partially completed step with ID 24-1234. To advance the step to the next state, the following is required:

  • Perform a GET to the resource .../api/v2/steps/24-1234, saving the XML response.

  • POST the XML from step 1 to .../api/v2/steps/24-1234/advance, and monitor the returned XML for success.

  • If successful, the protocol step advances to its next state, just as if the lab scientist had advanced it via the Clarity LIMS interface.

  • Advancing a protocol step that is in its final state completes the step.

  • The Python advanceStep(STEP_URI) method shown below advances a step through its various states. The URI of the step to be advanced/completed is passed to the method.

    Assumptions and Notes

    • The glsapiutil.py file is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You will find the latest glsapiutil (and glsapiutil3) Python libraries on our GitHub page.

    • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    def advanceStep( STEP_URI ):

        response = False

        # GET the step's current state XML ...
        stXML = API.getResourceByURI( STEP_URI )
        # ... and POST it straight back to the step's /advance resource
        rXML = API.createObject( stXML, STEP_URI + "/advance" )
        rDOM = parseString( rXML )
        # a <configuration> node in the response indicates the advance succeeded
        nodes = rDOM.getElementsByTagName( "configuration" )
        if len( nodes ) > 0:
            response = True

        return response

    Email Notifications

    Stakeholders are interested in the progress of samples as they move through a workflow. E-mail alerts of events can provide them with real-time notifications.

    Some possible uses of notifications include the following:

    • Completion of a workflow, for the billing department

    • Manager review requests

    • Notice of new files added via the LabLink Collaborations Interface

    • Updates on samples that are not following a standard path through a workflow

    Clarity LIMS provides a simple way of accomplishing this using a combination of the Clarity LIMS API, EPP / automation triggers, and Simple Mail Transfer Protocol (SMTP).

    Solution

    The send_email() method uses the Python smtplib module to create a Simple Mail Transfer Protocol (SMTP) object to build and send the email. The attached script does the following:

    1. Gathers relevant data from the Clarity LIMS API endpoints.

    2. Generates an email body according to a template.

    3. Calls the send_email() function.

    Mail Server Configuration

    Connect to Clarity SMTP with:

    • host='localhost', port=25

    Because of server restrictions, the script can send emails from only:

    • [email protected]
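The connection details above can be sketched as follows. This is a minimal sketch, not the attached script: build_message() is a hypothetical helper, and the sender address is a placeholder for the single permitted address shown above.

```python
import smtplib
from email.mime.text import MIMEText

SMTP_HOST = "localhost"  # Clarity SMTP relay, per the configuration above
SMTP_PORT = 25
SENDER = "[email protected]"  # placeholder: only the permitted sender address works

def build_message(recipients, subject, body):
    """Assemble a plain-text email message."""
    msg = MIMEText(body)
    msg["Subject"] = subject
    msg["From"] = SENDER
    msg["To"] = ", ".join(recipients)
    return msg

def send_email(recipients, subject, body):
    """Send the message through the local SMTP relay."""
    msg = build_message(recipients, subject, body)
    smtp = smtplib.SMTP(SMTP_HOST, SMTP_PORT)
    try:
        smtp.sendmail(SENDER, recipients, msg.as_string())
    finally:
        smtp.quit()
```

Because the relay is local to the server, no authentication or TLS setup is needed.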

    Automation Parameters

    The automation / EPP command is configured to pass the following parameters:

    -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -s

    The URI of the step that launches the script (Required)

    Example command line:

    python /opt/gls/clarity/customextensions/emails_from_clarity.py -u {username} -p {password} -s {stepURI}

    User Interaction

    • The script can be executed using a Clarity LIMS automation / EPP command, and triggered by one of the following methods:

      • Manually, via a button on the Record Details screen.

      • Automatically, at a step milestone (on entry to or exit from a screen).

    • The script can also be triggered outside of a Clarity LIMS workflow, using a time-based job scheduler such as cron.

    Assumptions and Notes

    • You are running a version of Python that is supported by Clarity LIMS, as documented in the Clarity LIMS Technical Requirements.

    • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    Attachments

    emails_from_clarity.py (4 KB)

    emails_attachmentPOC.py (18 KB)

    Automatic Sample Placement into Existing Containers

    There are often cases where empty containers are received and added into Clarity LIMS before being used in a protocol. This application example describes how to use the API to place samples into existing containers automatically. The application uses a CSV file that describes the mapping between the sample and its destination container.

    Furthermore, the API allows accessioning into multiple container categories, something that is not possible through the web interface.

    Prerequisites

    • If you use Python version 2.6.x, you must install the argparse package. Python 2.7 and later include this package by default.

    • Also make sure that you have the latest glsapiutil.py Python API library on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from our GitHub page.

    • Check the list of allowed containers for the step and make sure that all expected container categories are present. The API cannot place samples into containers that are not allowed for the step!

    Structure of the Input File

    The suggested input format is a four-column CSV with the following columns:

    Sample Name, Container Category, Container Name, Well Position

    The sample name should match the name as shown in the Ice Bucket/Queue screen.
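Parsing this four-column file can be sketched as follows. parse_placement_file() is a hypothetical helper name; the attached script's internals may differ.

```python
import csv
import io

def parse_placement_file(text):
    """Map sample name -> (container category, container name, well position)."""
    mapping = {}
    for row in csv.reader(io.StringIO(text)):
        if len(row) != 4:
            continue  # skip blank or malformed lines
        sample, category, container, well = (field.strip() for field in row)
        mapping[sample] = (category, container, well)
    return mapping
```

The sample-name key is then matched against the names shown in the Ice Bucket/Queue screen.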

    Step Setup Screen Configuration

    First, make sure that the Step Setup screen has been activated and is able to accept a file for upload:

    EPP / Automation command line

    Assuming the file is `compoundOutputFileLuid0`, the EPP / automation command line would be structured as follows:

    The automation should be configured to trigger automatically when the Placement screen is entered.

    Clarity LIMS v6.x automation trigger configuration

    The Script

    NOTE: The attached Python script uses the prerelease API endpoint (instead of v2), which allows placement of samples into existing containers.

    The script performs the following operations:

    1. Parses the file and creates an internal map (Python dict) between sample name and container details:

      • Key: sample name

      • Value: (container name, well position) tuple

    2. Retrieves the URI of each container.

    3. Accesses the step's 'placements' XML using a GET request.

    4. Performs the following modifications to the XML:

      • Populates the <selected-containers> node with child nodes for each retrieved container.

      • Populates each <output> artifact with a <location> node containing the container details and well position.

    5. PUTs the placement XML back to Clarity LIMS.

    After the script runs, the Placement screen should show the placements, assuming there were no problems executing the script.

    The attached script also contains some minimal bulletproofing for the following cases:

    • Container was not found.

    • Container is not empty.

    • Well position is invalid.

    • Sample in the ice bucket does not have a corresponding entry in the uploaded file.

    • Sample in the uploaded file is not in the ice bucket.

    In all cases, the script reports an error and does not allow the user to proceed.

    Attachments

    placeSamplesIntoExistingContainers.py (8 KB)

    Publishing Files to LabLink

    Some steps produce data that you would like your collaborators to have access to.

    This example provides an alternative method and uses a script to publish the files programmatically via the API.

    Solution

    In this example, suppose we have a protocol step, based upon a Sanger/capillary sequencing workflow, that produces up to two files per sample (a .seq and a .ab1 file).

    Our example script runs at the end of the protocol step. The script publishes the output files so that they are available to collaborators in the LabLink Collaborations Interface.

    Parameters

    The EPP / automation command is configured to pass the following parameters:

    -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -s

    The URI of the protocol step that launches the script - the {stepURI:v2:http} token (Required)

    An example of the full syntax used to invoke the script is as follows:

    User Interaction

    After the script has completed its execution, collaborators are able to view and download the files from the LabLink Collaborations Interface.


    About the code

    The main method used in the script is publishFiles(). The method in turn carries out several operations:

    1. The limsids of the step's artifacts are gathered, and the artifacts are retrieved in a single transaction using the 'batch' method.

    2. Each artifact is investigated. If there is an associated file resource, its limsid is stored.

    3. The file resources are retrieved in a single transaction using the 'batch' method.

    4. For each file resource, the value of the <is-published> node is set to 'true'.

    5. The file resources are saved back to Clarity LIMS in a single transaction using the 'batch' method.

    Assumptions and Notes

    • The attached file is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The Python API Library (glsapiutil.py) is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from our GitHub page.

    • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    Attachments

    publishFilesToLabLink.py (3 KB)

    publishFilesToLabLink_v2.py (3 KB)

    Automatic Placement of Samples Based on Input Plate Map

    The Clarity LIMS interface offers tremendous flexibility when placing the outputs of a protocol step into new containers. However, if your protocol step always places samples according to the plate map of the input plate, it makes sense to automate sample placement.

    This example provides a script that allows sample 'autoplacement' to occur. It also describes how the script can be triggered, and what the lab scientist sees when the script is running.

    For an example script that automates sample placement using multiple plates, see the Automatic Placement of Samples Based on Input Plate Map (Multiple Plates) example.

    Solution

    In this example, samples are placed according to the following logic:

    • The step produces one output sample for every input sample.

    • The output samples are placed on a 96 well plate.

    • Each occupied well on the source 96 well plate populates the corresponding well on the destination 96 well plate.
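Because each output inherits its input's well, the placement logic reduces to converting the LIMS 'A:1'-style well notation to and from plate coordinates. A sketch, with hypothetical helper names:

```python
ROWS = "ABCDEFGH"  # a 96-well plate has rows A-H and columns 1-12

def well_to_indices(well):
    """'B:3' -> (1, 2): zero-based (row, column) indices."""
    row, col = well.split(":")
    return ROWS.index(row), int(col) - 1

def indices_to_well(row, col):
    """(1, 2) -> 'B:3'."""
    return "%s:%d" % (ROWS[row], col + 1)
```

With these helpers, placing an output is simply a matter of copying the input's well string into the output's <location> node.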

    Step Configuration

    In this example, the step is configured to invoke the script on entry to the Sample Placement screen.

    Parameters

    The EPP / automation command is configured to pass the following parameters:

    An example of the full syntax to invoke the script is as follows:

    User Interaction

    When the lab scientist enters the Sample Placement screen, a message similar to the following appears:

    When the script has completed, the rightmost Placed Samples area of the Placement screen will display the container of auto-placed samples:

    At this point the lab scientist can review the constituents of the container, and complete the step as normal.

    About the Code

    The main method in the script is autoPlace(). This method in turn carries out several operations:

    1. A call to createContainer() prompts the creation of the destination 96 well plate.

    2. The method harvests just enough information so that the subsequent code can retrieve the required objects using the 'batch' API operations. Additional code builds and manages the cache of objects retrieved in the batch operations, namely:

      • cacheArtifact()

    Assumptions and Notes

    • The attached file is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The Python API Library (glsapiutil.py) is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from our GitHub page.

    • The HOSTNAME global variable must be updated so that it points to your Clarity LIMS server.

    Attachments

    autoplaceSamplesDefault.py:

    Routing Output Artifacts to Specific Workflows/Stages

    Samples progressing through workflows can branch off, and must be directed to different workflows or stages within a workflow.

    Example: It may not be known at the initiation of a workflow whether a sample is to be sequenced on a HiSeq or a MiSeq. In this case, rerouting the derived samples may be necessary.

    Solution

    This example provides the user with the opportunity to route samples individually to the HiSeq, MiSeq, or both stages from the Record Details screen.

    Step Configuration

    The step is configured to display two checkbox analyte UDFs / derived sample custom fields. The fields are used to select the destination workflow/stages for each derived sample. You can choose to queue the sample for HiSeq, MiSeq, or both.

    In this example, you select the following:

    • Two samples to be queued for HiSeq

    • Two samples for MiSeq

    • Two that are not routed

    • Two samples for both HiSeq and MiSeq

    Parameters

    The script accepts the following parameters:

    An example of the full syntax to invoke the script is as follows:

    User Interaction

    On the Record Details screen, you use the analyte UDF / derived sample custom field checkboxes to decide which workflow/stage combination to send each derived sample.

    About the Code

    The first important piece of information required is the URI of the destination stage.

    A stage URI can change across LIMS instances (such as when switching from Dev to Prod), even when the workflow configuration is identical. Therefore, the script derives the stage URI from the workflow and stage names.

    The main method in the script is routeAnalytes. This method in turn carries out several operations:

    • Gathers the information for the process / master step that triggered the script, including output analytes.

    • For each analyte, evaluates which UDFs / custom fields have been set, and adds the analyte to a list of analytes to route.

    • Creates the XML message for each stage.
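Building the routing message can be sketched as below. The element and namespace names follow the route/artifacts resource; build_routing_xml() is a hypothetical helper, and the URIs in the usage note are illustrative.

```python
def build_routing_xml(assignments):
    """assignments: {stage URI: [artifact URI, ...]} -> rt:routing XML string."""
    parts = ['<rt:routing xmlns:rt="http://genologics.com/ri/routing">']
    for stage_uri, artifact_uris in assignments.items():
        # one <assign> block per destination stage
        parts.append('  <assign stage-uri="%s">' % stage_uri)
        for uri in artifact_uris:
            parts.append('    <artifact uri="%s"/>' % uri)
        parts.append('  </assign>')
    parts.append('</rt:routing>')
    return "\n".join(parts)
```

The resulting XML is then POSTed to the .../api/v2/route/artifacts resource.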

    Modifications

    This script can be modified to look for a process-level UDF (master step custom field), in which case all outputs from the step would be routed to the same destination.

    This example also serves to demonstrate a more general concept. Routing artifacts is valuable in any situation where a sample needs to be queued for a stage outside of the usual order of a workflow - or even routing newly submitted samples to the first stage in a workflow.

    For more information about how to use the route/artifacts endpoint, see the REST API documentation.

    Assumptions and Notes

    • You are running a version of Python that is supported by Clarity LIMS, as documented in the Clarity LIMS Technical Requirements.

    • The attached file is placed on the LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The Python API Library (glsapiutil.py) is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from our GitHub page.

    Attachments

    Route_to_HiSeq_MiSeq.py:

    Completing a Step Programmatically

    NOTE: This method supersedes the use of the processes API endpoint.

    The capacity for completing a step programmatically, without having to open the BaseSpace Clarity LIMS web interface, allows for rapid validation of protocols. This method results in streamlined workflows for highly structured lab environments dealing with high throughput.

    This example uses the /api/v2/steps endpoint, which allows for more controlled execution of steps. In contrast, a process can be executed using the api/v2/processes endpoint with only one POST. This ability is demonstrated in the Process Execution with EPP/Automation Support example.

    Solution

    The Clarity LIMS API allows each aspect of a step to be completed programmatically. Combining the capabilities of the API into one script allows for the completion of a step with one click.

    Step Configuration

    This example was created for non-pooling, non-indexing process types.

    Parameters

    The script accepts the following parameters:

    An example of the full syntax to invoke the script is as follows:

    User Interaction

    The script contains several hard-coded variables, as shown in the following example.

    step_config_uri is the stage that is automatically completed. Because this script starts the step, no step limsid is needed as an input parameter to the script. After the script begins the step, it gathers the step limsid from the API's response to the step-creation POST.

    About the Code

    The main() method in the script carries out the following operations:

    • startStep()

    • addLot()

    • addAction()

    • addPlacement()

    Each of these functions creates an XML payload and interacts with the Clarity LIMS API to complete an activity that a lab user would be doing in the Clarity LIMS interface.

    startStep()

    This function creates a 'stp:step-creation' payload.

    As written, the script includes all the analytes in the queue for the specified stage.

    addLot()

    This function creates a 'stp:lots' payload. This may be skipped if the process does not require reagent lot selection.

    addAction()

    This function creates a 'stp:actions' payload. As written, all output analytes are assigned to the same 'next-action'. To see the options available as next actions, see the REST API documentation: Type action-type:

    NOTE: This example only supports the following next-actions: 'nextstep', 'remove', 'repeat'.

    addPlacement()

    This function creates a 'stp:placements' payload.

    In this example, it is not important where the artifacts are placed, so the analytes are assigned randomly to a well location.

    This function relies on the createContainer() function, since a step producing replicate analytes may not create enough on-the-fly containers to place all of the output artifacts.

    advanceStep()

    This function advances the current-state for a step. The current-state is an attribute found at the /api/v2/steps/{limsid} endpoint. It is a representation of the page that you see in the user interface. For more information, see the REST API documentation relating to the /{version}/steps/{limsid}/advance endpoint.

    POSTing the step XML to steps/{limsid}/advance is the API equivalent of moving to the next page of the GUI, with the final advance POST completing the step.
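The repeated advance can be sketched as a loop that inspects current-state after each POST. This sketch assumes a glsapiutil-style helper object and that the final state reports 'Completed'; the function names are illustrative, not the attached script's.

```python
from xml.dom.minidom import parseString

def get_current_state(step_xml):
    """Read the current-state attribute from a steps/{limsid} XML response."""
    return parseString(step_xml).documentElement.getAttribute("current-state")

def advance_until_complete(api, step_uri, max_advances=10):
    """GET the step and POST it to /advance until it reports completion."""
    for _ in range(max_advances):
        step_xml = api.getResourceByURI(step_uri)
        if get_current_state(step_xml).lower() == "completed":
            return True
        api.createObject(step_xml, step_uri + "/advance")
    return False  # safety limit reached without completing the step
```

The safety limit prevents an endless loop if the step stalls in one state (for example, because a required screen cannot be advanced programmatically).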

    Known Issues

    • There is a known bug with the advance endpoint that prevents a complete end-to-end programmatic progression through a pooling step.

    Assumptions and Notes

    • You are running a version of Python that is supported by Clarity LIMS, as documented in the Clarity LIMS Technical Requirements.

    • The attached file is placed on the LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The Python API Library (glsapiutil.py) is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from our .

    Attachments

    autocomplete-wholestep.py:

    Invoking bcl2fastq from BCL Conversion and Demultiplexing Step

    Illumina sequencing protocols include a BCL Conversion and Demultiplexing step. This step allows you to select the command options for running bcl2fastq2. The bcl2fastq2 software must be initiated through a command-line call on the BCL server.

    This example allows you to initiate the bcl2fastq2 conversion software by the selection of a button in BaseSpace Clarity LIMS.

    Step Configuration

    The "out of the box" step is configured to include the following UDFs / custom fields. You can select these options on the Record Details screen. You can also configure additional custom options.

    About the Code

    The main method in the script is convertData(). This method performs several operations:

    1. The script determines the run folder. The link to the run folder is attached as a result file to the sequencing step.

      • The script searches for the appropriate sequencing step and downloads the result file containing the link.

      • The script changes directories into the run folder.
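Assembling and launching the command can be sketched as follows. The install path and flags shown are assumptions for illustration; the attached script reads the real bcl2fastq path from line 17.

```python
import subprocess

BCL2FASTQ = "/usr/local/bin/bcl2fastq"  # assumed install path; adjust to your server

def build_command(options):
    """options: {flag: value or None} -> argv list; None means a bare flag."""
    cmd = [BCL2FASTQ]
    for flag, value in options.items():
        cmd.append(flag)
        if value is not None:
            cmd.append(str(value))
    return cmd

def run_conversion(run_folder, options):
    """Invoke bcl2fastq2 with the run folder as the working directory."""
    return subprocess.call(build_command(options), cwd=run_folder)
```

Using cwd= instead of os.chdir() keeps the automation worker's own working directory untouched between jobs.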

    Script Configuration

    This script must be copied to the BCL server, because the script is executed on the remote Automated Informatics (AI) / Automation Worker (AW) node on the BCL server.

    By default, the remote AI / AW node does not come with a custom extensions folder. Therefore, if this script is the first script on the server, you can create a customextensions folder in /opt/gls/.

    NOTE: It is not recommended to keep the customextensions folder in the remoteai folder, as the remoteai folder can get overwritten.

    When uploading the script, ensure the following:

    • The path to the bcl2fastq application is correct (line 17)

    • The sequencing process type matches exactly the name of the process type / master step the artifact went through (the -d parameter)

    • The customextensions folder contains both the glsapiutil.py and glsfileutil.py modules.

    Parameters

    The script accepts the following parameters:

    An example of the full syntax to invoke the script is as follows:

    Assumptions and Notes

    • You are running a version of Python supported by Clarity LIMS, as documented in the Clarity LIMS Technical Requirements.

    • The attached files are placed on the LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The Python API Library (glsapiutil.py) is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from our GitHub page.

    Attachments

    kickoff_bcl2fastq2.py:

    glsfileutil.py:

    Setting a Default Next Action

    Often, a protocol step will have just a single 'next action' that is required to continue the workflow. In such cases, it can be desirable to automatically set the default action of the next step.

    This example script shows how this task can be achieved programmatically.

    Solution

    Step Configuration

    The step is configured to invoke a script that sets the default next action when you exit the Record Details screen of the step.

    Parameters

    The automation command is configured to pass the following parameters to the script:

    An example of the full syntax to invoke the script is as follows:

    User Interaction

    When the lab scientist exits the Record Details screen, the script is invoked and the following message displays:

    If the script completes successfully, the LIMS displays the default next action. If the current step is the final step of the protocol, it is instead marked as complete:

    Manual Intervention

    At this point the lab scientist is able to manually intervene and reroute failed samples accordingly.

    NOTE: It is not possible to prevent the user from selecting a particular next-step action. However, it is possible to add a validation script that checks the next actions selected by the user against a list of valid choices.

    If the selected next step is not a valid choice, you can configure the script such that it takes one of the following actions:

    • Replaces the next steps with steps that suit your business rules.

    • Issues a warning, and prevents the step from completing until the user has changed the next step according to your business rules.

    About the Code

    The main method in the script is routeAnalytes(). The method in turn carries out several operations:

    1. The actions resource of the current protocol step is investigated.

    2. The possible next step(s) are identified.

    3. If there is no next step, the current step is set to 'Mark protocol as complete.'
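The operations above can be sketched by editing the actions XML directly. set_default_actions() is a hypothetical helper; the 'nextstep' and 'complete' values follow the action-type enumeration in the REST API documentation.

```python
from xml.dom.minidom import parseString

def set_default_actions(actions_xml, next_step_uri):
    """Point every next-action at next_step_uri, or mark the protocol complete."""
    dom = parseString(actions_xml)
    for node in dom.getElementsByTagName("next-action"):
        if next_step_uri:
            node.setAttribute("action", "nextstep")
            node.setAttribute("step-uri", next_step_uri)
        else:
            # no next step in the protocol: 'Mark protocol as complete'
            node.setAttribute("action", "complete")
    return dom.toxml()
```

The modified XML is then PUT back to the step's actions resource.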

    Assumptions and Notes

    • The attached file is placed on the LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The Python API Library (glsapiutil.py) is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from our GitHub page.

    • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    Attachments

    setDefaultNextAction.py:

    Automatic Pooling Based on a Sample UDF/Custom Field

    In some facilities, when samples are initially submitted into BaseSpace Clarity LIMS, it has already been determined which samples will be combined to give pooled libraries. In such cases, it is desirable to automate the pooling of samples within the protocol step. Doing so means that the lab scientist does not have to pool the samples manually in the Clarity LIMS interface, which saves time and effort and reduces errors.

    Solution

    This example provides a script that allows 'autopooling' to occur. It also describes how the script can be triggered, and what the lab scientist sees when the script is running.

    The attached script relies upon a user-defined field (UDF) / custom field, named Pooling Group, at the analyte (sample) level.

    This UDF is used to determine the constitution of each pool: samples that share a common Pooling Group value are combined to create a pool, and the pool is named after that value.

    For example, consider the Operations Interface (LIMS v4.x and earlier) Samples list shown below. The highlighted samples have a common Pooling Group value. Therefore, we can expect that they will be combined to create a pool named 210131122-pg1.
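The grouping logic can be sketched as a dictionary keyed on the Pooling Group value. The helper name is hypothetical; the attached script's internals may differ.

```python
def group_by_pooling_group(artifacts):
    """artifacts: [(artifact URI, Pooling Group value), ...] -> {pool name: [URIs]}"""
    pools = {}
    for uri, group in artifacts:
        # every artifact sharing a Pooling Group value joins the same pool,
        # and the pool takes its name from that value
        pools.setdefault(group, []).append(uri)
    return pools
```

Each resulting key becomes a pool name, and its list of URIs becomes the pool's inputs in the pooling XML.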

    Step Configuration

    In this example, the Pooling protocol step is configured to invoke the script as soon as the user enters the step's Pooling screen.

    Parameters

    The EPP / automation command is configured to pass the following parameters:

    An example of the full syntax to invoke the script is as follows:

    User Interaction

    When the lab scientist enters the Pooling screen, a message similar to the following displays:

    When the script has completed, the rightmost Placed Samples area of the Placement screen displays the auto-created pools:

    At this point, the lab scientist can review the constituents of each pool, and then complete the protocol step as normal.

    About the Code

    The main methods of interest are autoPool() and getPoolingGroup().

    1. The autoPool() method harvests just enough information so that the subsequent code can retrieve the required objects using the 'batch' API operations. This involves additional code to build and manage the cache of objects retrieved in the batch operations, namely:

      • cacheArtifact()

      • prepareCache()

    Assumptions and Notes

    • The attached file is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The Python API Library (glsapiutil.py) is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from our GitHub page.

    • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    Attachments

    autopoolSamples.py:

    Adding Downstream Samples to Additional Workflows

    When processing samples, there can be circumstances in which you must add downstream samples to additional workflows. This is not easy to achieve using the Clarity LIMS interfaces, but is easy to do via the API.

    Solution

    This example provides a Python script that can be used to add samples to an additional workflow step. The example also includes information on the key API interactions involved.

    • It is the outputs, not the inputs, of the process that are added to the workflow step. (If you would like to add the inputs, changing this step is simple.)

    • This example is an add function, not a move. If you would like to remove the samples from the current workflow, you can arrange to do so by building an <unassign> element.

    • The process is configured to produce analyte outputs, and not result files.

    Parameters

    The script accepts the following parameters:

    About the Code

    The step URI (-s) parameter is used to report a meaningful message and status back to the user. These reports depend upon the outcome of the script.

    Once the parameters have been gathered, and the helper object defined in glsapiutil.py has been instantiated and initialized, its methods can be called to take care of the RESTful GET/PUT/POST functionality, leaving the script to call the following functions:

    • getStageURI()

    • routeAnalytes().

    hashtag
    getStageURI() function

    The getStageURI() function converts the workflowname and stagename parameters into a URI that is used in the <assign> element, for example:

    hashtag
    routeAnalytes() function

    The routeAnalytes() function gathers the outputs of the process and harvests their URIs to populate the <artifact> elements. This function also uses the reporting mechanism based upon the step URI parameter.

    hashtag
    API resource

    The crucial resource in this example is the route/artifacts resource. This API endpoint can only be POSTed to, and accepts XML of the following form:

    For more information, refer to the route/artifacts REST API documentation. Also useful are the configuration/workflow resources, both single and list.
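A minimal sketch of building and POSTing such a payload, assuming a single destination stage; the function names and credentials are illustrative.

```python
# Illustrative sketch only. The credentials are placeholders.
import base64
import urllib.request

AUTH = "Basic " + base64.b64encode(b"apiuser:apipassword").decode()  # assumption

def build_routing_xml(stage_uri, artifact_uris):
    """Build a rt:routing payload assigning artifacts to one stage."""
    xml = '<rt:routing xmlns:rt="http://genologics.com/ri/routing">'
    xml += '<assign stage-uri="%s">' % stage_uri
    xml += "".join('<artifact uri="%s"/>' % u for u in artifact_uris)
    xml += "</assign></rt:routing>"
    return xml

def post_routing(base_uri, routing_xml):
    """POST the payload to the route/artifacts resource."""
    req = urllib.request.Request(
        base_uri + "route/artifacts", data=routing_xml.encode(),
        headers={"Authorization": AUTH,
                 "Content-Type": "application/xml"})
    return urllib.request.urlopen(req).read().decode()
```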

    hashtag
    Assumptions and Notes

    • The attached file is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The Python API Library (glsapiutil.py) is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from our .

    • You must update the HOSTNAME global variable such that it points to your Clarity LIMS server.

    hashtag
    Attachments

    addToStep.py:

    Finishing the Current Step and Starting the Next

    Many facilities have linear, high-throughput workflows. Finishing the current step, finding the samples that have been queued for the next step, putting those samples into the virtual ice bucket, and then actually starting the next step can all be seen as unnecessary overhead.

    This solution provides a methodology that allows a script to finish the current step, and start the next.

    We have already illustrated how a step can be run in its entirety via the API. Finishing one step and starting the next would seem a straightforward exercise, but, as most groups have discovered, there is a catch.

    Consider the step to be autocompleted as Step A, with Step B being the next step to be autostarted.

    • The script to complete Step A and start Step B will be triggered from Step A (in the solution defined below, it will be triggered manually by clicking a button on the Record Details screen).

    • As Step A invokes a script:

      • The step itself cannot be completed, because the script has not yet completed successfully, and

      • The script cannot complete successfully unless the step has been completed.

    To break this circle, the script initiated from Step A must invoke a second script. The first script can then complete successfully, and the second script is responsible for finishing Step A and starting Step B.

    hashtag
    Solution

    hashtag
    Invoking script: runLater.py

    The runLater.py script launches the subsequent script (finishStep.py) by handing it off to the Linux 'at' command (see ). This effectively 'disconnects' the execution of the finishStep.py script from runLater.py, allowing runLater.py to complete successfully, and then Step A to be completed.
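The hand-off can be sketched as follows. This is illustrative only, not the attached runLater.py, and it assumes the 'at' command is available on the server.

```python
# Illustrative sketch only -- not the attached runLater.py.
# Assumes the Linux 'at' command is installed.
import subprocess

def at_invocation(command):
    """Return the argv for 'at' and the stdin payload carrying the command."""
    return ["at", "now"], (command + "\n").encode()

def run_later(command):
    """Hand the command off to 'at now' so it runs detached from this script,
    letting this script (and therefore Step A's automation) finish first."""
    argv, payload = at_invocation(command)
    proc = subprocess.Popen(argv, stdin=subprocess.PIPE)
    proc.communicate(payload)

# When run as the EPP / automation command, the -c parameter value would be
# parsed (e.g. with optparse) and passed to run_later(); that wiring is
# omitted here.
```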

    Parameters

    The script accepts the following parameter:

    Although this script accepts just a single parameter, the value of this parameter is quite complex, since it is the full command line and parameters that need to be passed to the finishStep.py script.

    Following is an example of a complete EPP / automation command-line string that will invoke runLater.py, and in turn finishStep.py:

    hashtag
    Invoking script: finishStep.py

    The finishStep.py script completes and starts Steps A and B respectively.

    Parameters

    The script accepts the following parameters:

    hashtag
    About the code

    The first thing this script does is sleep for five seconds before the main method is called. This delay allows the application to detect that the invoking script has completed successfully, which in turn allows this script to work. The duration of the delay can be adjusted to allow for your server load, etc.

    The central method (finishStep()) assumes that the current step (Step A) is on the Record Details screen. The method calls advanceStep() to move the step onto the screen on which the next step is selected. By default, the routeAnalytes() method selects the first available next step in the protocol / workflow for each analyte. The advanceStep() method is then called again, which completes Step A.
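The GET-then-POST advance pattern at the heart of these calls can be sketched as below; the session object stands in for whatever HTTP helper you use (for example glsapiutil), so its get/post methods are assumptions.

```python
# Illustrative sketch of the advance pattern. The `session` object is a
# stand-in for your HTTP helper; its get/post methods are assumptions.
def advanceStep(step_uri, session):
    """GET the current step XML and POST it unchanged to the step's advance
    resource, moving the step to its next state. Returns True unless the
    server response reports an exception."""
    current_xml = session.get(step_uri)
    response = session.post(step_uri + "/advance", current_xml)
    return "exception" not in response
```

Calling this once moves the step from Record Details to the next-steps screen; calling it again (after the next actions have been set) completes the step.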

    If the script has been passed a value of 'START' for the -a parameter, the next step (Step B) will now be started. The startNextStep() method handles this process, carrying out the following high-level actions:

    • It determines the default output container type for Step B.

    • It invokes an instance of Step B on the analytes routed to it via the routeAnalytes() method.

    • If the Step B instance was invoked successfully, it gathers the LUID of the step.

    At this point, if Step B was started successfully, it is owned by the apiuser user. This situation is not ideal: we would like the step to show up in the Work in Progress section of the GUI for the user who initiated Step A.

    This script makes a note of the current user and then updates the new instance of Step B with the current user. This update can fail if Step B has mandatory fields that must be populated on the Record Details screen (see ). It can also fail if the step has mandatory reagent kits that must be populated on the Record Details screen.

    If the update does fail, the step has been started, but the apiuser user still owns it. It can be continued as normal.

    hashtag
    Assumptions

    • The attached files are placed on the Clarity LIMS server, in the following location: /opt/gls/clarity/customextensions

    • The attached files are readable by the glsai user.

    • The HOSTNAME global variable must be updated so that it points to your Clarity LIMS server.

    hashtag
    Notes

    1. The Linux 'at' command might not be installed on your Clarity LIMS server. For installation instructions, refer to the Linux documentation.

    2. For a solution to this issue, see .

    hashtag
    Attachments

    runLater.py:

    clarityHelpers.py:

    finishStep.py:

    Automatic Placement of Samples Based on Input Plate Map (Multiple Plates)

    The BaseSpace Clarity LIMS interface offers tremendous flexibility when placing the outputs of a step into new containers. However, if your protocol step always places samples according to the plate map of the input plate, it makes sense to automate sample placement.

    This example provides a script that allows sample 'autoplacement' to occur, and describes how the script can be triggered.

    This script is similar in concept to the example; the difference is that this updated example works with multiple input plates, which can be of different types.

    hashtag
    Solution

    Creating Multiple Containers / Types for Placement

    The Clarity LIMS interface offers tremendous flexibility in placing the outputs of a protocol step into new containers.

    Sometimes it is necessary that a step produces multiple containers of differing types (for example a 96-well plate and a 384-well plate). Such an interaction is not possible without using the Clarity LIMS API.

    This example provides a script that creates a 96-well plate and a 384-well plate in preparation for subsequent manual placement of samples.

    hashtag
    Solution

    Parsing Metadata into UDFs (BCL Conversion and Demultiplexing)

    This example provides a script that can be used to parse lanebarcode.html files from demultiplexing. The script is written to be easily used with the out-of-the-box Bcl Conversion & Demultiplexing (HiSeq 3000/4000) protocol.

    • Result values are associated with a barcode sequence as well as lane.

    • Values are attached to the result file output in Clarity LIMS, with matching barcode sequence (index on derived sample input) and lane (container placement of derived sample input).

    bash -l -c "/usr/bin/env python /opt/gls/clarity/customextensions/placeSamplesIntoExistingContainers.py -u {username} -p {password} -s {stepURI:https} -f {compoundOutputFileLuid0}"
    bash -l -c "/usr/bin/python /opt/gls/clarity/customextensions/publishFilesToLabLink.py -u admin -p securepassword -s https://demo-5-1.claritylims.com/api/v2/steps/24-7953"
    Does a POST to the REST API in order to add the analytes to the queue in Clarity LIMS.

    The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

  • Samples can be inadvertently duplicated in the next step. This duplication occurs if:

    • The sample is being routed at the last step of a protocol and;

    • If the action of next steps is Mark Protocol as Complete.

    This duplication is due to:

    • The next step is routing the artifact to its default destination and;

    • The script is routing the same artifact.

    The solution here is to set the default next steps action to Remove from Workflow instead. This solution can be automated using Lab Logic Toolkit or the API.
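One way to automate this via the API is to rewrite the step's next actions, as sketched below. The helper is illustrative: it operates on the XML of the ../steps/<stepID>/actions resource, and assumes "remove" is the action value corresponding to Remove from Workflow.

```python
# Illustrative sketch only. Operates on the XML returned by GET on the
# ../steps/<stepID>/actions resource; the modified XML would then be PUT
# back to the same resource.
from xml.dom.minidom import parseString

def set_all_actions_to_remove(actions_xml):
    """Set every <next-action> to action="remove" (assumed to correspond to
    Remove from Workflow) and return the modified XML."""
    dom = parseString(actions_xml)
    for na in dom.getElementsByTagName("next-action"):
        na.setAttribute("action", "remove")
    return dom.toxml()
```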

  • -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -s

    The URI of the step that launches the script - the {stepURI:v2} token (Required)

    REST
    GitHub pagearrow-up-right
    file-download
    5KB
    Route_to_HiSeq_MiSeq.py
    arrow-up-right-from-squareOpen

    advanceStep()

    The HOSTNAME global variable must be updated so that it points to your Clarity LIMS server.

  • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.Attachments

  • -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -s

    The URI of the step that launches the script - the {stepURI:v2:http} token (Required)

    API Portal
    GitHub pagearrow-up-right
    file-download
    8KB
    autocomplete-wholestep.py
    arrow-up-right-from-squareOpen
    The script gathers all the step level UDFs / custom fields from the BCL Conversion and Demultiplexing step.
  • Using the information gathered, the script builds the command that is executed on the BCL server. The command consists of two parts:

    • cd (changing directory) into the run folder.

    • Executing the bcl2fastq command with the selected options.
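The two-part command assembly can be sketched as a small helper; the option names shown are illustrative, not the actual UDFs of the step.

```python
# Illustrative sketch only. `options` stands in for the option values
# harvested from the step's UDFs / custom fields; the keys are hypothetical.
def build_bcl2fastq_command(run_folder, options):
    """Compose the two-part command: cd into the run folder, then execute
    bcl2fastq with the selected options."""
    opts = " ".join("--%s %s" % (k, v) for k, v in sorted(options.items()))
    return "cd %s && bcl2fastq %s" % (run_folder, opts)
```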

  • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -s

    The URI of the step that launches the script - the {stepURI:v2} token (Required)

    -d

    The display name of the sequencing step (Required)

    GitHubarrow-up-right
    file-download
    5KB
    kickoff_bcl2fastq2.py
    arrow-up-right-from-squareOpen
    file-download
    2KB
    glsfileutil.py
    arrow-up-right-from-squareOpen
    Assumptions and Notes

    The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -l

    The limsid of the process invoking the script (Required)

    -s

    The URI of the step that launches the script (Required)

    -w

    The name of the destination workflow (Required)

    -g

    The name of the desired stage within the workflow (Required)

    GitHub pagearrow-up-right
    file-download
    3KB
    addToStep.py
    arrow-up-right-from-squareOpen
    The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.
  • The scripts use rudimentary logging. After the scripts are installed and validated, these logs are of limited value, and their creation can likely be removed from the scripts.

  • -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -s

    The URI of the step that launches the script (Required)

    -a

    The action, or mode that the script runs under (Required). Accepted values are START (which completes the current step and starts the next) or COMPLETE (completes the current step)

    Validating Process/Step Level UDFs
    file-download
    692B
    runLater.py
    arrow-up-right-from-squareOpen
    file-download
    5KB
    clarityHelpers.py
    arrow-up-right-from-squareOpen
    file-download
    11KB
    finishStep.py
    arrow-up-right-from-squareOpen
    Notes
    Notes

    In this example, samples are placed according to the following logic:

    • The process type / master step is configured to produce just one output analyte (derived sample) for every input analyte.

    • The output analytes are placed on the same type of container as the corresponding source plate.

    • Each occupied well on the source plate populates the corresponding well on the destination plate.

    • The destination plate is named such that it has the text 'DEST-' prepended to the name of its corresponding source plate.
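The placement logic above can be sketched as pure helper functions; the data shapes are illustrative, not those of the attached script.

```python
# Illustrative sketch of the placement logic. The tuple shapes are
# hypothetical; the attached script works with API XML instead.
def destination_name(source_container_name):
    """Destination plates get 'DEST-' prepended to the source plate name."""
    return "DEST-" + source_container_name

def build_placements(analytes):
    """analytes: list of (output_uri, source_container_name, well) tuples.
    Each output keeps its source well, on the matching destination plate."""
    return [(uri, destination_name(container), well)
            for uri, container, well in analytes]
```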

    hashtag
    Step Configuration

    In this example, the step is configured to invoke the script on entry to the Sample Placement screen.

    hashtag
    Parameters

    The EPP / automation command is configured to pass the following parameters:

    -l

    The limsid of the process invoking the script (Required)

    -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -s

    The URI of the protocol step that launches the script - the {stepURI:v2:http} token (Required)

    An example of the full syntax to invoke the script is as follows:

    hashtag
    About the Code

    The main method in the script is autoPlace(). This method executes several operations:

    1. The creation of the destination plates is effected by calls to createContainer().

    2. It harvests just enough information so that the objects required by the subsequent code can retrieve the required objects using the batch API operations. This involves using some additional code to build and manage the cache of objects retrieved in the batch operations, namely:

      • cacheArtifact()

      • prepareCache()

      • getArtifact()

    3. The cached analytes are then accessed. After the source well to which the analyte maps has been determined, the output placement can be set. This information is presented in XML that can be POSTed back to the server in the format required for the placements resource.

    4. After all the analytes have been processed, the placements XML is further supplemented with required information, and POSTed to the ../steps/<stepID>/placements API resource.

    5. Finally, a meaningful message is reported back to the user via the ../steps/<stepID>/programstatus API resource.

    hashtag
    Assumptions and Notes

    • The attached file is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The Python API Library (glsapiutil.py) is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from our GitHub pagearrow-up-right.

    • The HOSTNAME global variable must be updated so that it points to your Clarity LIMS server.

    • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    hashtag
    Attachments

    autoplaceSamplesDefaultMulti.py:

    Automatic Placement of Samples Based on Input Plate Map
    file-download
    7KB
    autoplaceSamplesDefaultMulti.py
    arrow-up-right-from-squareOpen

    Script modifications may be needed to match the format of the index in Clarity LIMS to the index in the HTML result file.

    hashtag
    Parameters

    The script accepts the following parameters:

    -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -o

    The limsid of the result file artifact with the attached lanebarcode.html file (Required)

    -s

    The LIMS IDs of the individual result files. (Required)

    An example of the full syntax to invoke the script is as follows:

    hashtag
    Configuration

    hashtag
    Defining the UDFs / Custom Fields

    All user defined fields (UDFs) / custom fields must first be defined in the script. Within the UDF / custom field dictionary, the name of the field as it appears in Clarity LIMS (the key) must be associated with the field from the result file (the value).

    The fields should be preconfigured in Clarity LIMS for result file outputs.

    hashtag
    Modifying individual UDFs / Custom Fields

    The UDF / custom field values can be modified before being brought into Clarity LIMS. In the following example, the value in megabases is modified to gigabases.

    hashtag
    Checking for matching flow cell ID

    The script currently checks the flow cell ID for the projects in Clarity LIMS against the flow cell ID in the result file.

    NOTE: The script will still complete and attach UDF / custom field values. You may wish to modify the script to not attach the field values if the flow cell ID does not match.

    hashtag
    Assumptions and Notes

    • Your configuration conforms with the script's requirements, as documented in the Configuration section of this document.

    • You are running a version of Python that is supported by Clarity LIMS, as documented in the Clarity LIMS Technical Requirements.

    • The attached Python file is placed on the LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The glsapiutil file is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    hashtag
    Attachments

    demux_stats_parser.py:

    demux_stats_parser_4.py:

    file-download
    8KB
    demux_stats_parser.py
    arrow-up-right-from-squareOpen
    file-download
    10KB
    demux_stats_parser_4.py
    arrow-up-right-from-squareOpen
    bash -c "/usr/bin/python /opt/gls/clarity/customextensions/Route_to_HiSeq_MiSeq.py -u {username} -p {password} -s {stepURI:v2}"
    /usr/bin/python /opt/gls/clarity/customextensions/autocomplete-wholestep.py -p apipassword -u apiuser -s https://demo-4-1.claritylims.com/api/v2/
    step_config_uri = "https://demo-4-1.claritylims.com/api/v2/configuration/protocols/551/steps/1003"
    step_config = "simple step"
    queue = '1003'
    
    containerType = "96 well plate"
    reagentCat = ""
    replicates = 40
    
    nextAction = 'nextstep'
    nextStepURI = 'https://demo-4-1.claritylims.com/api/v2/configuration/protocols/551/steps/1004'
    
    reagent_lot_limsid = ""
    queue = '1003'
    /usr/bin/python /opt/gls/customextensions/ kickoff_bcl2fastq2.py -u {username} -p {password} -s {stepURI:v2} -d 'Illumina Sequencing (Illumina SBS) 5.0'
    <assign stage-uri="http://localhost:8090/api/v2/configuration/workflows/7/stages/4">  
    <rt:routing xmlns:rt="http://genologics.com/ri/routing">
      <assign stage-uri="http://localhost:8090/api/v2/configuration/workflows/7/stages/4">
        <artifact uri="http://localhost:8090/api/v2/artifacts/5"/>
        <artifact uri="http://localhost:8090/api/v2/artifacts/6"/>
      </assign>
      <assign workflow-uri="http://localhost:8090/api/v2/configuration/workflows/7">
        <artifact uri="http://localhost:8090/api/v2/artifacts/8"/>
        <artifact uri="http://localhost:8090/api/v2/artifacts/9"/>
      </assign>
      <unassign workflow-uri="http://localhost:8090/api/v2/configuration/workflows/10">
        <artifact uri="http://localhost:8090/api/v2/artifacts/11"/>
        <artifact uri="http://localhost:8090/api/v2/artifacts/12"/>
      </unassign>
    </rt:routing>
    -c   The command to be passed to the at command (Required)
    /usr/bin/python /opt/gls/clarity/customextensions/runLater.py -c "/usr/bin/python /opt/gls/clarity/customextensions/finishStep.py -u {username} -p {password} -s {stepURI:v2:http} -a START"
    /usr/bin/python /opt/gls/clarity/customextensions/autoplaceSamplesDefaultMulti.py -l 122-7953 -u admin -p securepassword -s http://192.168.9.123:8080/api/v2/steps/122-5601
    bash -l -c "/usr/bin/python /opt/gls/clarity/customextensions/demux_stats_parser.py -s {stepURI:v2} -o {compoundOutputFileLuid0} -u {username} -p {password}" 
    udfs_in_clarity = {"Yield PF (Gb)":"Yield (Mbases)",
        "%PF":"% PF Clusters",
        "% One Mismatch Reads (Index)":"% One mismatch barcode", 
        "% Bases >=Q30":"% &gt;= Q30 bases",
        "Ave Q Score":"Mean Quality Score",
        "% Perfect Index Read":"% Perfect barcode",
        "# Reads":"PF Clusters",
        "% of Raw Clusters Per Lane":"% of the lane"}
    if clarity_udf == 'Yield PF (Gb)':
        yieldmb = udf_value
        yieldmb = yieldmb.replace(",","")
        yieldgb = float(yieldmb)*.001
        udf_value = yieldgb

    prepareCache()

  • getArtifact()

  • The cached analytes are accessed and the source well to which each analyte maps is determined.

  • Output placement can then be set. This information is presented in XML that can be POSTed back to the server in the correct format required for the placements resource.

  • After the analytes have been processed, the placements XML is further supplemented with required information, and POSTed to the ../steps/<stepID>/placements API resource.

  • Finally, a meaningful message is reported back to the user via the ../steps/<stepID>/programstatus API resource.

  • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    -l

    The luid of the process invoking the script (Required)

    -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -s

    The URI of the step that launches the script - the {stepURI:v2:http} token (Required)

    GitHub pagearrow-up-right
    file-download
    5KB
    autoplaceSamplesDefault.py
    arrow-up-right-from-squareOpen
    AutoPlace_based_on_inputs_samples_placed.png
    If there are multiple next steps, the first is used to set the next action.

    -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -s

    The URI of the step that launches the script (Required)

    The {stepURI} token

    -a

    Sets the next action to a fixed value. (Optional)

    This is for advanced use, for example, when you would like to set the next action to a fixed value — 'repeat step', 'remove from workflow', and so on.

    GitHub pagearrow-up-right
    file-download
    3KB
    setDefaultNextAction.py
    arrow-up-right-from-squareOpen
    Setting_default_next_action_message.png
    Setting_default_next_action_Complete.png

    getArtifact()

  • After the cache of objects has been built, each artifact is linked to its submitted sample. The getPoolingGroup function harvests the Pooling Group UDF value of the corresponding submitted sample.

  • The script now understands which artifacts are to be grouped to produce the requested pools. An appropriate XML payload is constructed and then POSTed to the ../steps/<stepID>/placements API resource.
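The grouping step can be sketched as a small helper; the input mapping is illustrative, built from the harvested Pooling Group values.

```python
# Illustrative sketch only. `artifact_groups` is a hypothetical mapping of
# artifact URI -> Pooling Group UDF value harvested from submitted samples.
def group_by_pooling_group(artifact_groups):
    """Return pool name -> list of artifact URIs, one pool per group."""
    pools = {}
    for uri, group in sorted(artifact_groups.items()):
        pools.setdefault(group, []).append(uri)
    return pools
```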

  • -l

    The limsid of the process invoking the script (Required)

    -u

    The username of the current user (Required)

    -p

    The password of the current user (Required)

    -s

    The URI of the step that launches the script - the {stepURI:v2:http} token (Required)

    GitHub pagearrow-up-right
    file-download
    4KB
    autopoolSamples.py
    arrow-up-right-from-squareOpen
    Auto_pooling_Op_Interface_samples_list.png
    Auto_pooling_Clarity_Interface_executing_custom_program.png
    Auto_pooling_Clarity_Interface_pools_created.png
    In this example, containers are created according to the following logic:
    • A 96 well plate is produced along with a 384 well plate.

    • Neither plate has a user-specified name. The LIMS names them using the LIMS ID of the plates.

    hashtag
    Step Configuration

    The step is configured:

    • To allow samples to be placed in 96 or 384 well plates.

    • To invoke the script as soon as the user enters the step's Sample Placement screen.

    hashtag
    Parameters

    The EPP / automation command is configured to pass the following parameters:

    -l

    The limsid of the process invoking the code (Required)

    The {processLuid} token

    -u

    The username of the current user (Required)

    The {username} token

    -p

    The password of the current user (Required)

    The {password} token

    -s

    The URI of the step that launches the script (Required)

    The {stepURI:v2:http} token

    An example of the full syntax to invoke the script is as follows:

    hashtag
    User Interaction

    When the lab scientist enters the Sample Placement screen, the rightmost Placed Samples area displays the first container created (1 of 2). By selecting the second container, the display shows the second container (2 of 2).

    Creating_multiple_container_types_for_placement_2.png

    The lab scientist manually places the samples into the containers and completes the protocol step as normal.

    hashtag
    About the Code

    1. Two calls to createContainer() create the destination 96-well and 384-well plates.

      • To create custom containers, supplement the createContainer() method with the configuration details that apply to your instance of Clarity LIMS.

      • To give the container a specific name, pass a non-empty string as the second argument to the createContainer() method.

    2. An XML payload is created containing only the details of the created containers, ready for the user to record the actual placements in the Clarity LIMS user interface.

    3. This information is POSTed back to the server, in the format required for the placements resource.

    4. A meaningful message is reported back to the user via the ../steps/<stepID>/programstatus API resource.
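The container-creation payload that createContainer() builds can be sketched as follows; the helper name is hypothetical and the XML is a minimal form of what would be POSTed to the ../containers resource.

```python
# Illustrative sketch only. The payload form is minimal; your instance may
# require additional elements. It would be POSTed to .../api/v2/containers.
def container_xml(type_uri, name=""):
    """Build a con:container payload. An empty name lets the LIMS name the
    container using its LIMS ID."""
    return ('<con:container xmlns:con="http://genologics.com/ri/container">'
            '<name>%s</name><type uri="%s"/></con:container>') % (name, type_uri)
```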

    hashtag
    Assumptions and Notes

    • The attached file is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The Python API Library (glsapiutil.py) is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from our GitHub pagearrow-up-right.

    • Update the HOSTNAME global variable so that it points to your Clarity LIMS server.

    • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    hashtag
    Attachments

    createMultipleContainerTypes.py:

    file-download
    3KB
    createMultipleContainerTypes.py
    arrow-up-right-from-squareOpen

    Route Artifacts Based Off a Template File

    Workflows do not always have a linear configuration. There are situations when samples progressing through a workflow must branch off and be directed to different stages, or even different workflows. You can add samples to a queue using the /api/{version}/route/artifacts endpoint.

    In the lab, decisions are made dynamically. At the initiation of this example workflow, it is not known whether the sample is destined to be sequenced on a HiSeq or MiSeq instrument. As a result, the derived samples must be routed to a different workflow stage mid-workflow.

    This programmatic approach to queuing samples can be used many times throughout a workflow. This approach eliminates the need to write multiple complex scripts - each of which must be maintained over time.

    This example describes a Python script that takes instruction from a .csv template file. Removing any hard-coded reference to specific UDFs / custom fields or workflow stages from the script allows for easy configuration and support. All business logic can be implemented solely through generation of a template and EPP / automation command.

    The template contains UDF / custom field names and values. If samples in the active step have fields with matching values, they are queued for the specified workflow stage.

    hashtag
    Step Configuration

    The step is configured to display the two checkbox analyte UDFs / derived sample custom fields. These fields are used to select the destination workflow stages for each derived sample / analyte. You can queue the sample for HiSeq, MiSeq, or both.

    In the preceding example of the Sample Details screen, the user selected:

    • Two samples to be queued for HiSeq

    • Two samples for MiSeq

    • Two samples where routing is not selected

    In the preceding example of the Step Details screen, the user selected:

    • A step level UDF / custom field to assign all samples to a given step. All output samples are routed to the associated step.

    Samples can be duplicated in the first step of the following protocol if:

    • Samples are routed as the last step of a protocol and;

    • The action of the next step is Mark Protocol as Complete.

    The duplication is due to the Next Steps action queuing the artifact for its default destination in addition to the script routing the same artifact.

    The solution here is to set the default Next Steps action to Remove from Workflow instead. This action can be automated using the Lab Logic Toolkit or the API.

    hashtag
    LIMS v6.x

    On the protocol configuration screen, make sure that Start Next Step is set to Automatic.

    hashtag
    Template Configuration

    Each UDF name and value pair corresponds to a workflow and stage combination, which is the destination to which the artifact is routed.

    • The UDF name in the template could be configured as a step or analyte level UDF. If a step level UDF has a value that is specified in the template, all analytes in the step are routed.

    • The UDFs can be of any type (Numeric, Text, Checkbox). If a checkbox UDF is used, the available values are true or false.

    • UDF values in the template are not case-sensitive.

    The template requires four columns:

    • UDF_NAME

    • UDF_VALUE

    • WORKFLOW_NAME

    • STAGE_NAME

    For this example, the template values would be:

    • UDF_NAME, UDF_VALUE, WORKFLOW_NAME, STAGE_NAME

    • Route all samples to: Step A, Workflow A, Stage A

    • Route all samples to: Step B, Workflow B, Stage B

    This script might be used numerous times in different EPP / automation scripts, with each referencing a different template.

    hashtag
    Using a Template String Instead of a Template File

    Due to restrictions on file server access, this script accepts routing template instructions using a string EPP parameter with lines separated by a newline character ('\n'). The following example shows how this parameter string would be added to represent the previous template:
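Template parsing can be sketched as a small helper that accepts either the file contents or the --template_string value; the function name is illustrative.

```python
# Illustrative sketch only. Accepts the template file contents or the
# --template_string value (lines separated by '\n').
import csv

def parse_template(text):
    """Parse template rows into routing rules. Because UDF value matching is
    not case-sensitive, values are lowercased here."""
    rules = []
    rows = csv.reader(text.strip().split("\n"))
    next(rows)  # skip header: UDF_NAME, UDF_VALUE, WORKFLOW_NAME, STAGE_NAME
    for row in rows:
        if len(row) == 4:
            name, value, workflow, stage = [c.strip() for c in row]
            rules.append((name, value.lower(), workflow, stage))
    return rules
```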

    hashtag
    Required Parameters

    The EPP/automation that calls the script must contain the following parameters:

    *Either --template or --template_string is required. If both are provided, --template_string is used.

    hashtag
    Optional Parameter

    When the --input parameter is used, the script routes input artifacts instead of the default output artifacts. The UDF values of the input artifacts (instead of the outputs) are then checked against the template file.

    An example of the full syntax to invoke the script is as follows:

    Or, if you wish to route the inputs instead of outputs:

    hashtag
    User Interaction

    When the Record Details screen is entered, the UDF / custom field checkboxes or drop-down options specify to which workflow/stage combination each derived sample is sent.

    hashtag
    About the Code

    The first important piece of information required is the URI of the destination stage. There is a unique stage URI assigned to each workflow and stage combination.

    A stage URI can change across LIMS instances (such as switching from Dev to Prod). Therefore, the script gathers the stage URI from the workflow and stage name. This process occurs even when the workflows are identical.
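A minimal sketch of that lookup, using Python's standard library against a trimmed workflow configuration response (the XML below is illustrative; real responses look like the Nextera XT example shown later in this page):

```python
import xml.etree.ElementTree as ET

def find_stage_uri(workflow_xml, stage_name):
    """Return the URI of the named stage in a workflow configuration
    response, or None if no such stage exists."""
    root = ET.fromstring(workflow_xml)
    for stage in root.iter("stage"):
        if stage.get("name") == stage_name:
            return stage.get("uri")
    return None

# Trimmed, illustrative configuration/workflows/<id> response
workflow_xml = """<wkfcnf:workflow xmlns:wkfcnf="http://genologics.com/ri/workflowconfiguration"
    status="ACTIVE" name="Workflow A">
  <stages>
    <stage uri="http://example.com/api/v2/configuration/workflows/309/stages/691" name="Stage A"/>
  </stages>
</wkfcnf:workflow>"""

stage_uri = find_stage_uri(workflow_xml, "Stage A")
```

Resolving by name, rather than hard-coding the URI, keeps the script portable between Dev and Prod instances.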

    The main method in the script is routeAnalytes() and it carries out several operations:

    1. Gathers the information for the process that triggered the script, including output (or input) analytes.

    2. For each analyte, evaluates which UDFs have been set, and adds the analyte to a list of analytes to route.

    3. Creates the XML message for each stage.

    4. Performs a POST to the REST API to add the analytes to the appropriate queues in Clarity LIMS.

    This example, while useful in itself, also demonstrates a more general concept. Routing artifacts is valuable in any situation where a sample must be queued for a stage outside of the usual order of a workflow, including routing newly submitted samples to the first stage of a workflow.

    For more information, see the artifact/route REST API documentation.
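The routing message itself groups artifact URIs under an assign element per destination stage. The element and namespace names below follow the artifacts/route pattern, but treat this as a sketch and verify them against the REST API documentation for your LIMS version:

```python
import xml.etree.ElementTree as ET

def build_routing_payload(assignments):
    """Build an artifacts/route message assigning artifacts to stages.

    assignments: {stage_uri: [artifact_uri, ...]}
    """
    ET.register_namespace("rt", "http://genologics.com/ri/routing")
    root = ET.Element("{http://genologics.com/ri/routing}routing")
    for stage_uri, artifact_uris in assignments.items():
        assign = ET.SubElement(root, "assign", {"stage-uri": stage_uri})
        for uri in artifact_uris:
            ET.SubElement(assign, "artifact", {"uri": uri})
    return ET.tostring(root, encoding="unicode")

payload = build_routing_payload(
    {"http://example.com/api/v2/configuration/workflows/309/stages/691":
     ["http://example.com/api/v2/artifacts/HES208A1PA1"]}
)
```

The resulting XML string would then be POSTed to .../api/v2/route/artifacts with the usual credentials.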

    hashtag
    Assumptions and Notes

    • You are running a version of Python supported by Clarity LIMS, as documented in the Clarity LIMS Technical Requirements.

    • The attached files are placed on the LIMS server, in the /opt/gls/clarity/customextensions folder.

    • The Python API Library (glsapiutil.py) is placed on the Clarity LIMS server, in the /opt/gls/clarity/customextensions folder. You can download the latest glsapiutil library from the linked GitHub page.

    hashtag
    Attachments

    routing_template.csv:

    route_by_template.py:

    For example, two samples could be routed to both HiSeq and MiSeq with the following template rows:

    • Route A, True, Workflow A, Stage A

    • Go to HiSeq, True, TruSeq Nano DNA for HiSeq 5.0, Library Normalization (Illumina SBS) 5.0

    • Go to MiSeq, True, TruSeq DNA PCR-Free for MiSeq 5.0, Sort MiSeq Samples (MiSeq) 5.0

    The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    • -p, --password: The password of the current user (Required)

    • -s, --stepURI: The URI of the step that launches the script; the {stepURI:v2} token (Required)

    • -l, --log: The path, or limsid, of the log file; the {compoundOutputFileLuidN} token (Required)

    • -t, --template: The path to the template file (Required*)

    • -r, --template_string: A string containing the template information (Required*)

    • --input: Uses the input artifacts instead of output artifacts. Default = False

    "Route all samples to:, Step A, Workflow A, Stage A\nRoute all samples to:, Step B, Workflow B, Stage B\nRoute A, True, Workflow A, Stage A\nGo to HiSeq, True, TruSeq Nano DNA for HiSeq 5.0, Library Normalization (Illumina SBS) 5.0\nGo to MiSeq, True, TruSeq DNA PCR-Free for MiSeq 5.0, Sort MiSeq Samples (MiSeq) 5.0"
    bash -c "/usr/bin/python /opt/gls/clarity/customextensions/Route_by_template.py -u {username} -p {password} -s {stepURI:v2} -t /opt/gls/clarity/customextensions/routing_template.csv -l {compoundOutputFileLuid0}"
    bash -c "/usr/bin/python /opt/gls/clarity/customextensions/Route_by_template.py --input -u {username} -p {password} -s {stepURI:v2} -t /opt/gls/clarity/customextensions/routing_template.csv -l {compoundOutputFileLuid0}"

    Assignment of Sample Next Steps Based On a UDF

    In the default configuration of Clarity LIMS, at the end of every step, the user is required to choose where the samples will go next - i.e. the 'next step'.

    If samples in the lab follow a logical flow based on business logic, this is an unnecessary manual task. This example shows how to automate this next step selection, to reduce error and user interaction.

    This example uses the Automatically Assign Next Protocol Step (Example) step, in the Automation Examples (API Cookbook) protocol. The example shows how to:

    • Automate the selection of a sample's Next Steps, as displayed on the Assign Next Steps screen of this step.

    • Use the Pooling sample UDF / custom field to determine the next step to which each sample is assigned.

    hashtag
    Solution

    hashtag
    Step Configuration

    The Automatically Assign Next Protocol Step (Example) step has two permitted Next Steps:

    • Confirmation of Low-plexity Pooling (Example)

    • Automated Workflow Assignment (Example)

    Depending on the value of a sample's Pooling UDF / custom field, the sample's Next Step will default to one of the permitted next steps:

    • If the value of the Pooling UDF / custom field is any case combination of No or None, the sample's next step will default to Automated Workflow Assignment (Example).

    • Otherwise, the sample's next step will default to Confirmation of Low-plexity Pooling (Example).

    Next step configuration (LIMS v4.x shown)

    Automation is configured as follows:

    • Behavior: Automatically initiated

    • Stage of Step: On Record Details screen

    • Timing: When screen is exited

    hashtag
    Parameters

    The script takes three basic parameters:

    An example command line is shown below.

    (Note: The location of groovy on your server may be different from the one shown in this example. If this is the case, modify the script accordingly.)

    hashtag
    User Interaction

    Assuming samples have been placed in the protocol and are ready to be processed, the user proceeds as normal:

    1. Upon reaching the transition from the Record Details screen to the Assign Next Steps screen, the script is run. A message box alerts the user that a custom script is in progress.

    2. Upon completion of the script, a custom success message is displayed.

    3. Once the success message is closed and the screen has transitioned, the default next steps display for the samples.

    hashtag
    About the Code

    Once the script has processed the input and ensured that all the required information is available, we can start to process the samples to determine their next steps.

    1. First, we retrieve the next actions list:

    2. This endpoint contains a list of the step's output analytes, and a link to its parent step configuration. In this case, we want to retrieve the step configuration so that we can collect the URIs of the expected next steps.

    3. Once we have retrieved the step configuration, we iterate over its possible next steps, gathering their URIs and storing them by name in a Map.

    hashtag
    Assumptions and Notes

    • You are running a version of Groovy that is supported by Clarity LIMS, as documented in the Clarity LIMS Technical Requirements.

    • The attached Groovy file is placed on the LIMS server, in the folder /opt/gls/clarity/customextensions

    • GLSRestApiUtils.groovy is placed in your Groovy lib folder.

    hashtag
    Attachments

    NextStepAutomation.groovy:

    Applying Indexing Patterns to Containers Automatically

    The indexing of samples is often performed in patterns, based upon the location of the samples in the container.

    This example shows how to automate the default placement of reagents on samples, based on their container position. This greatly reduces the time spent on the Add Labels screen (LIMS v6.x) and also reduces user error.

    In this example, reagent labels are assigned to samples in a predetermined pattern as the user enters the Add Reagents screen. This pattern is applied to all containers entering this stage.

    hashtag
    Solution

    Starting a Protocol Step via the API

    In some circumstances, it can be desirable to automate the initiation of a step in Clarity LIMS, so that the step executes without any user interaction, e.g. when a liquid-handling robot drives the step to completion. This example provides a solution that allows steps to be invoked automatically via the API.

    hashtag
    Solution


    Once we have collected the URIs of our destination steps, we can start analyzing each sample to determine what its default should be.

    • For each possible 'next-action', we retrieve the target artifact, which then enables us to retrieve that artifact's parent sample.

    • We then retrieve the value of the sample's Pooling UDF / custom field, if it exists. If it does not exist, a default value is used.

    • To set the next step, we set the step-uri attribute of the next-action node to the URI of the expected destination step.

      • We also increment counters, so that we can report to the user what actions were taken on the given samples.

      • Once this is done, we perform an httpPUT on the action list, adding the changes to the API and allowing our defaults to be set.

    • Finally, we define the successful output message to the user. This allows the user to check the results.

    A single-line text sample UDF named Pooling has been created.
  • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

  • -u: The username of the API user (Required); the {username} token

  • -p: The password of the API user (Required); the {password} token

  • -i: The URI of the step that launches the script (Required); the {stepURI:v2:http} token, in the form http://<Hostname>/api/v2/steps/<ProtocolStepLimsid>

    file-download
    6KB
    NextStepAutomation.groovy
    arrow-up-right-from-squareOpen
    The example AssignIndexPattern.groovy script is configured to run on the Adenylate Ends & Ligate Adapters (TruSeq DNA) 4.0 step.

    hashtag
    Parameters

    The script accepts the following parameters:

    • -i: The URI of the step that launches the script (Required); the {stepURI:v2:http} token, in the form http://<Hostname>/api/v2/steps/<ProtocolStepLimsid>

    • -u: The username of the API user (Required); the {username} token

    • -p: The password of the API user (Required); the {password} token

    An example command line is shown below:

    NOTE: The location of Groovy on your server may be different from the one shown in this example. If this is the case, modify the script accordingly.

    hashtag
    Step Configuration

    In the Clarity LIMS web interface, for the Adenylate Ends & Ligate Adapters (TruSeq DNA) 4.0 step (in the TruSeq DNA Sample Prep protocol), configure Automation as follows:

    Clarity LIMS v6.x

    • Trigger Location: Add Labels

    • Trigger Style: Automatic upon entry

    AppExample_ApplyingIndexPatternstoContainersAutomatically_config_v5.png

    hashtag
    User Interaction

    Assuming the user has added 96 samples and has reached the Adenylate Ends & Ligate Adapters (TruSeq DNA) 4.0 step:

    1. The user transfers all 96 samples to a new 96-well plate and proceeds with step.

    2. When the user enters the Add Labels screen, the script is initiated. A message box alerts the user that a custom script is in progress.

    3. Upon completion, the previously defined success message displays.

    4. When the success message is closed, the Add Labels screen loads, and the pattern shown below is applied to samples.

    Applying_Index_Patterns_to_Containers_Automatically_ReagentPattern.png

    hashtag
    About the code

    Once the script has processed the input and ensured that all the required information is available, we can start applying the reagents to our samples.

    1. To begin, we need to define the reagents and pattern to apply.

      • The reagents can be stored in a Map of reagent names keyed by their respective numbers, i.e. 'AD030' indexed at 30.

      • The pattern can be stored as a List of Lists, arranged as a visual representation of the pattern to be applied.

      • Once we have our reagents and pattern defined, we can start processing the samples:

      • We start by retrieving the reagent setup node from the step's reagents endpoint. We use this node as a base for subsequent commands.

      • We then gather the unique output artifact URIs and retrieve the output artifacts using batchGET:

      • Next, we iterate through our list of output artifacts.

      • For each artifact, we determine its position and use its components to index our pattern. This allows us to determine which reagent should be placed on which sample.

      • Once we determine the reagent's name, we create a reagent-label node with a name attribute equal to the desired reagent name.

      • In the list of output-reagents in the reagent setup node, we find the output that corresponds to the output artifact that we are processing and add our reagent-label node to it. NOTE: We must strip off the state from our artifact's URI. The URIs stored in the step setup node are stateless and will not match the URI returned from our output artifact.


      • Once we have processed all of our output artifacts, we POST our modified setup node to the reagentSetup endpoint. This updates the default placement in the API.

      • We then define our success message to display to the user upon the script's completion.
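The parsePlacement helper referenced above is not shown in this excerpt; a hypothetical Python equivalent illustrates the idea of turning a well position such as 'B:3' into the two indices used against REAGENT_PATTERN:

```python
def parse_placement(position):
    """Convert a well position like 'B:3' into zero-based
    (row, column) indices suitable for indexing the pattern."""
    row, column = position.split(":")
    return ord(row.upper()) - ord("A"), int(column) - 1

# 'A:1' is the top-left well; 'H:12' is the bottom-right of a 96-well plate
indices = parse_placement("B:3")
```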

    hashtag
    Assumptions and Notes

    • Your configuration conforms with the script requirements documented in Solution.

    • You are running a version of Groovy that is supported by Clarity LIMS, as documented in the Clarity LIMS Technical Requirements.

    • The attached Groovy file is placed on the LIMS server, in the following location: /opt/gls/clarity/customextensions

    • GLSRestApiUtils.groovy is placed in your Groovy lib folder.

    • You have imported the attached Reagent XML file into your system using the Config Slicer tool.

    • The example code is provided for illustrative purposes only. It does not contain sufficient exception handling for use 'as is' in a production environment.

    hashtag
    Attachments

    Single Indexing ReagentTypes.xml:

    AssignIndexPattern.groovy:

    file-download
    6KB
    Single Indexing ReagentTypes.xml
    arrow-up-right-from-squareOpen
    file-download
    7KB
    AssignIndexPattern.groovy
    arrow-up-right-from-squareOpen
    Querying the Queues Endpoint

    Before we can invoke a step, we must first employ the queues endpoint.

    Every step displayed in the Clarity LIMS web interface has an associated queue, the contents of which can be queried. The following image shows the samples queued for each step in the Nextera XT Library Prep protocol.

    Starting_a_protocol_step_via_API_1.png

    For this example, we investigate the queue for the Step 1 - Tagment DNA (Nextera XT DNA) step.

    Step 1: Find the Stage ID

    First, we must find the stage associated with the step. We query the configuration/workflows resource and home in on the Nextera XT for MiSeq workflow:

    From the XML returned, we can see that the Tagment DNA (Nextera XT DNA) step has an associated stage, with an ID of 691:

    Step 2: Find the Step ID

    If we now query this stage, we see something similar to the following:

    We now have the piece of information we need: the step ID 567.

    Step 3: Query the Queues Resource

    We can use this ID to query the queues resource, which provides us with something similar to the following:

    This result matches the information displayed in the Clarity LIMS web interface. In the next image, we can see the derived samples awaiting the step.

    Starting_a_protocol_step_via_API_2.png
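Pulling the queued artifact URIs out of such a response is straightforward. The sketch below runs against a trimmed queues/<step-id> response rather than a live API call:

```python
import xml.etree.ElementTree as ET

def queued_artifact_uris(queue_xml):
    """Extract the URIs of all artifacts waiting in a queue
    from a queues/<step-id> response."""
    root = ET.fromstring(queue_xml)
    return [a.get("uri") for a in root.iter("artifact")]

# Trimmed queues/567 response
queue_xml = """<que:queue xmlns:que="http://genologics.com/ri/queue" name="Tagment DNA (Nextera XT DNA)">
  <artifacts>
    <artifact limsid="HES208A1PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A1PA1"/>
    <artifact limsid="HES208A2PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A2PA1"/>
  </artifacts>
</que:queue>"""

uris = queued_artifact_uris(queue_xml)
```

These URIs are exactly what the step-creation payload in the next section needs as its inputs.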

    hashtag
    Initiating the Step

    Now that we have the contents of the queue, starting the step programmatically is quite simple.

    All that is required is a POST to the steps API endpoint. The XML input payload to the POST request will take the following form:
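Assembling that payload can be done with the standard library. The tmp:step-creation element names below match the ri/step namespace used in this example's XML, but treat the helper function itself as an illustrative sketch:

```python
import xml.etree.ElementTree as ET

def build_step_creation(step_config_uri, artifact_uris,
                        container_type=None, replicates=1):
    """Build the step-creation XML POSTed to .../api/v2/steps."""
    ET.register_namespace("tmp", "http://genologics.com/ri/step")
    root = ET.Element("{http://genologics.com/ri/step}step-creation")
    ET.SubElement(root, "configuration", {"uri": step_config_uri})
    if container_type:
        ET.SubElement(root, "container-type").text = container_type
    inputs = ET.SubElement(root, "inputs")
    for uri in artifact_uris:
        ET.SubElement(inputs, "input",
                      {"uri": uri, "replicates": str(replicates)})
    return ET.tostring(root, encoding="unicode")

payload = build_step_creation(
    "http://192.168.8.10:8080/api/v2/configuration/protocols/302/steps/567",
    ["http://192.168.8.10:8080/api/v2/artifacts/HES208A1PA1"],
    container_type="96 well plate",
)
```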

    If the POST operation was successful, the API will return XML of the following form (for details, see About the Code section):

    hashtag
    User Interaction

    In the Clarity LIMS web interface, two pieces of evidence indicate that the step has been initiated:

    • The partially completed step is displayed in the Work in Progress area.

    • The Recent Activities area shows that the protocol step was started.

    Starting_a_protocol_step_via_API_3.png

    hashtag
    About the Code

    The XML payload POSTed to the steps resource is quite simple in nature. In fact, there are only three variables within the payload:

    1. The step to be initiated:

    2. The type of output container to be used (if appropriate):

    3. The URI(s) of the artifact(s) on which the step should be run, along with the number of replicates the step needs to create (if appropriate):

    // For each output analyte, set its corresponding next step according to the value of the UDF 'Pooling'
    nextActionsList.'next-actions'.'next-action'.each {
        Node artifact = GLSRestApiUtils.httpGET(it.@'artifact-uri', username, password)
        Node sample = GLSRestApiUtils.httpGET(artifact.'sample'[0].@uri, username, password)
        String poolingValue = sample.'udf:field'.find { UDF_NAME == it.@name } ? sample.'udf:field'.find { UDF_NAME == it.@name }.value()[0] : 'Default'
     
        // If Pooling is a variation of No or None, set to workflowAssignment, otherwise set to pooling step
        if(!NO_VALUES.contains(poolingValue.toLowerCase())) {
                it.@action = NEXT_STEP_ACTION
                it.@'step-uri' = NEXT_STEPS[POOLING_STEP]
                poolingSamples++
        } else {
                it.@action = NEXT_STEP_ACTION
                it.@'step-uri' = NEXT_STEPS[WORKFLOW_STEP]
                workflowAssignmentSamples++
        }
    }
    bash -c "/opt/gls/groovy/current/bin/groovy -cp /opt/groovy/lib /opt/gls/clarity/customextensions/NextStepAutomation.groovy -u {username} -p {password} -i {stepURI:v2:http}" 
    // Retrieve the current protocol step
    String nextActionsURI = stepURI + '/actions'
    Node nextActionsList = GLSRestApiUtils.httpGET(nextActionsURI, username, password)
    String currentProtocolStepURI = nextActionsList.'configuration'[0].@uri
    Node currentProtocolStep = GLSRestApiUtils.httpGET(currentProtocolStepURI, username, password)
    // Determine the uris of the possible next steps
    currentProtocolStep.'transitions'.'transition'.each {
        if(NEXT_STEPS.containsKey(it.@name)) {
            NEXT_STEPS[it.@name] = it.@'next-step-uri'
        }
    }
    // Reagent Labels
    public static final def REAGENT_MAP = [
     (27):'AD027 (ATTCCT)', (23):'AD023 (GAGTGG)', (20):'AD020 (GTGGCC)',
     (15):'AD015 (ATGTCA)'
    ]
    // Reagent Pattern
    public static final def REAGENT_PATTERN = [
     [27,27,27,27,27,15,27,27,27,27,27,27],
     [27,27,27,27,15,15,15,27,27,27,27,27],
     [27,27,27,15,15,15,15,15,20,20,20,20],
     [27,27,23,23,15,15,15,20,20,20,20,27],
     [27,23,23,23,23,15,20,20,20,20,27,27],
     [23,23,23,23,23,23,23,23,23,27,27,27],
     [27,27,27,27,27,23,23,23,27,27,27,27],
     [27,27,27,27,27,27,23,27,27,27,27,27]
    ]
    bash -c "/opt/gls/groovy/current/bin/groovy -cp /opt/groovy/lib /opt/gls/clarity/customextensions/AssignIndexPattern.groovy -i {stepURI:v2:http} -u {username} -p {password}"
    <wkfcnf:workflow status="ACTIVE" uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309" name="Nextera XT for MiSeq">
    <protocols>
        <protocol uri="http://192.168.8.10:8080/api/v2/configuration/protocols/3" name="DNA Initial QC"/>
        <protocol uri="http://192.168.8.10:8080/api/v2/configuration/protocols/302" name="Nextera XT Library Prep"/>
        <protocol uri="http://192.168.8.10:8080/api/v2/configuration/protocols/10" name="Illumina SBS (MiSeq)"/>
    </protocols>
    <stages>
        <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/690" name="DNA Initial QC"/>
        <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/691" name="Tagment DNA (Nextera XT DNA)"/>
        <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/692" name="PCR Amplification (Nextera DNA) 4.0"/>
        <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/693" name="PCR Clean-up (Nextera DNA) 4.0"/>
        <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/694" name="Bead Based Library Normalization"/>
        <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/695" name="Library Pooling (Nextera XT)"/>
        <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/696" name="Sort MiSeq Samples (MiSeq) 4.0"/>
        <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/697" name="Library Normalization (MiSeq) 4.0"/>
        <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/698" name="Library Pooling (MiSeq) 4.0"/>
        <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/699" name="Denature, Dilute and Load Sample (MiSeq) 4.0"/>
        <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/700" name="MiSeq Run (MiSeq) 4.0"/>
    </stages>
    </wkfcnf:workflow>
    <stage uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/691" name="Tagment DNA (Nextera XT DNA)"/>
    <stg:stage index="0" name="Tagment DNA (Nextera XT DNA)" uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309/stages/691">
        <workflow uri="http://192.168.8.10:8080/api/v2/configuration/workflows/309"/>
        <protocol uri="http://192.168.8.10:8080/api/v2/configuration/protocols/302"/>
        <step uri="http://192.168.8.10:8080/api/v2/configuration/protocols/302/steps/567"/>
    </stg:stage>
    <que:queue name="Tagment DNA (Nextera XT DNA)" protocol-step-uri="http://192.168.8.10:8080/api/v2/configuration/protocols/302/steps/567" uri="http://192.168.8.10:8080/api/v2/queues/567">
        <artifacts>
            <artifact limsid="HES208A1PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A1PA1">
                <queue-time>2013-04-07T17:01:00.636-07:00</queue-time>
                <location>
                    <container uri="http://192.168.8.10:8080/api/v2/containers/27-2654" limsid="27-2654"/>
                    <value>1:1</value>
                </location>
            </artifact>
            <artifact limsid="HES208A2PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A2PA1"></artifact>
            <artifact limsid="HES208A3PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A3PA1"></artifact>
            <artifact limsid="HES208A4PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A4PA1"></artifact>
            <artifact limsid="HES208A5PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A5PA1"></artifact>
            <artifact limsid="HES208A6PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A6PA1"></artifact>
            <artifact limsid="HES208A7PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A7PA1"></artifact>
            <artifact limsid="HES208A8PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A8PA1"></artifact>
            <artifact limsid="HES208A9PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A9PA1"></artifact>
            <artifact limsid="HES208A10PA1" uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A10PA1"></artifact>
        </artifacts>
    </que:queue>
    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <tmp:step-creation xmlns:tmp="http://genologics.com/ri/step">
        <configuration uri="http://192.168.8.10:8080/api/v2/configuration/protocols/302/steps/567"/>
        <container-type>96 well plate</container-type>
        <inputs>
            <input uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A1PA1" replicates="1"/>
        </inputs>
    </tmp:step-creation>
    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <stp:step xmlns:stp="http://genologics.com/ri/step" current-state="Placement" limsid="24-19301" uri="http://192.168.8.10:8080/api/v2/steps/24-19301">
        <configuration uri="http://192.168.8.10:8080/api/v2/configuration/protocols/302/steps/567">Tagment DNA (Nextera XT DNA)</configuration>
        <actions uri="http://192.168.8.10:8080/api/v2/steps/24-19301/actions"/>
        <placements uri="http://192.168.8.10:8080/api/v2/steps/24-19301/placements"/>
        <details uri="http://192.168.8.10:8080/api/v2/steps/24-19301/details"/>
        <available-programs/>
    </stp:step>
    <configuration uri="http://192.168.8.10:8080/api/v2/configuration/protocols/302/steps/567"/>
    <container-type>96 well plate</container-type>
    <input uri="http://192.168.8.10:8080/api/v2/artifacts/HES208A1PA1" replicates="1"/>
    // Update the next steps in the API
    GLSRestApiUtils.httpPUT(nextActionsList, nextActionsURI, username, password)
     
    // Define the success message to the user
    outputMessage = "Script has completed successfully.${LINE_TERMINATOR}" +
                "Next steps for ${poolingSamples + workflowAssignmentSamples} samples have been set:${LINE_TERMINATOR}" +
                "${poolingSamples} samples set to '${POOLING_STEP}'.${LINE_TERMINATOR}" +
                "${workflowAssignmentSamples} samples set to '${WORKFLOW_STEP}'."
    // Retrieve the reagent setup
    Node reagentSetup = GLSRestApiUtils.httpGET(stepURI + '/reagents', username, password)
             
    // Collect the artifact URIs and retrieve the artifacts
    def artifactURIs = reagentSetup.'output-reagents'.'output'.collect { it.@uri }.unique()
    def artifacts = GLSRestApiUtils.batchGET(artifactURIs, username, password)
    // For each artifact, determine its position and set its reagent label accordingly
    artifacts.each { artifact ->
        // Split the position into its two components
        def positionIndices = parsePlacement(artifact.'location'[0].'value'[0].text())
     
        // Using our relationship maps, determine which reagent should be placed at that position
        String reagentName = REAGENT_MAP[((REAGENT_PATTERN[positionIndices[0]])[positionIndices[1]])]
        // Create and attach the reagent-label node to our setup
        Node reagentNode = NodeBuilder.newInstance().'reagent-label'(name:reagentName)
        reagentSetup.'output-reagents'.'output'.find { it.@uri == GLSRestApiUtils.stripQuery(artifact.@uri) }.append(reagentNode)
    }
    // Set the reagent setup in the API
    GLSRestApiUtils.httpPOST(reagentSetup, reagentSetup.@uri, username, password)
     
    // Define the success message to the user
    outputMessage = "Script has completed successfully.${LINE_TERMINATOR}" +
            "Clarity LIMS reagent pattern has been applied to all containers."

    Setting Quality Control Flags

    A key reason to track samples is to monitor their quality. In Clarity LIMS, samples are flagged with a check mark to indicate good quality (QC Pass) or an X to indicate poor quality (QC Fail).

    There are many ways to determine quality, including concentration and DNA 260/280 light absorbance measurements. This example uses a tab-separated value (TSV) results file from a Thermo NanoDrop Spectrophotometer to:

    • Record concentration and 260/280 measurements, and;

    • Set quality flags in the LIMS.

    In this example, once the script is installed, the user simply runs and records a step and imports the results file. The EPP / automation script does the rest of the work, by reading the file, capturing the measurements in Clarity LIMS, and setting the QC flags.
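The pass/fail decision the script makes can be reduced to a small function. The threshold names and default values below are illustrative placeholders, not values taken from the attached script:

```python
def qc_flag(concentration, ratio_260_280,
            min_concentration=50.0, ratio_low=1.6, ratio_high=2.0):
    """Return 'PASSED' or 'FAILED' by comparing a sample's NanoDrop
    measurements against the thresholds entered on the step."""
    ok = (concentration >= min_concentration
          and ratio_low <= ratio_260_280 <= ratio_high)
    return "PASSED" if ok else "FAILED"

# One tab-separated NanoDrop line: Sample ID, concentration (ng/uL), 260/280
line = "QCExamplePlate_A1\t62.4\t1.85"
sample_id, conc, ratio = line.split("\t")
flag = qc_flag(float(conc), float(ratio))
```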

    QC file formats

    As spectrophotometers are used for many measurements, within many lab protocols, file formats can vary depending on the instrument software settings. Check your instruments for specific file formats and use the example below to get familiar with the QC example scripts.

    hashtag
    User Interaction

    1. The user selects samples and runs the QC Example step.

    2. In the Operations Interface (LIMS v4 & earlier), the user sets the required minimum concentration and/or 260/280 lower and upper bounds.

    3. The QC Example process creates an output artifact (shared ResultsFile) called QC Data File. This file is shown in the Sample Genealogy and Outputs panes. The "!" icon indicates that this entry is a placeholder for a file.

    hashtag
    Example

    1. A TSV file is created by the NanoDrop spectrophotometer. The specific file used in this example is shown below.

    2. After the TSV file is imported and attached to the QC Example process in Clarity LIMS, the file attachment event triggers a second EPP script, which is part of a second process called QC Example (file handling). You can see the process in the Sample Genealogy pane:

    3. The example uses the TSV file's Sample ID value to locate the plate and well location to which the QC flag is to be applied. In the example file shown in step 1, the container is QCExamplePlate.

    Automation/EPP can be used to process files when they are attached to the LIMS. Using file attachment triggers in this way is sometimes called data analysis pipelining: a series of analysis steps across a chain of processes is triggered from a single file attachment.

    hashtag
    Installation

    1. Download the zip file to the server; on a non-production server use the gls user account.

    2. Unzip the file to the following directory: /opt/gls/clarity/Applications. The contents of the zip file will be installed within that directory, to CookBook/NanoDropQC/.

    3. Next, unzip the config-slicer-<version>-deployment-bundle.zip in /opt/gls/clarity/Applications/CookBook/NanoDropQC/. Replace <version> with the version number of the included config-slicer.

    hashtag
    Installation validation

    To confirm that the example is correctly installed, follow the steps below to simulate the recording of QC information by a user in the lab:

    1. Start the Clarity LIMS Operations Interface client.

    2. Create a project and submit a 96-well plate named QCExamplePlate full of samples.

    3. Select the samples and run the QC Example process.

    hashtag
    Example modifications

    You can modify the example script to suit your lab's QC experimental methods and calculations. For example, you may want to consider phenotypic information or extra sample data recorded in the LIMS. Two modifications to the example are described below.

    hashtag
    Recording QC measurements on sample inputs instead of file outputs

    The example script writes the measurements into user-defined fields (UDFs) associated with outputs of the process. This allows multiple measurements to be recorded for one sample, by running the process multiple times. Each time the process is run on an input sample, a new process with new output results is recorded in the LIMS.

    You may instead want to write the measurements into UDFs associated with the input samples. For example, you may want to keep the data records simple: the greater the number of outputs recorded in the LIMS, the more confusing it becomes for the user to upload files and navigate results. Setting the fields on the inputs provides a single 'golden' value.

    To change the configuration and script to set QC flags and field values on inputs:

    1. Change the code in NanoDrop.groovy so that UDFs are set on inputs instead of outputs. That is, replace this line:

      with the following:

      Since you are no longer changing the outputs, you can comment or delete the line where the output is saved:

    2. Run the QC Example (preparation) process that was included in the configuration package you imported into your system.

    hashtag
    Using additional information stored in sample fields to set the QC flags

    Most labs use multiple factors to determine sample QC flags. These factors might be associated with the submitted sample, multiple instrument measurements, or even the type of project or sample.

    To demonstrate how easy it is to aggregate multiple factors into the QC flag logic, a boolean field called Human is added to the sample configuration. The script logic is modified to only set flags for human samples.

    To change the configuration and script to check for human samples:

    1. Change the code in NanoDrop.groovy (where we loop through the input/output pairs adjusting QC flags/updating UDFs), so that we first ensure we are dealing with a Human sample.

      To do this, change the loop at the end of the script from this:

      to this:

    2. Configure a checkbox UDF on Sample (this was done for you when you imported the configuration package provided in this application example).

    hashtag
    Assumptions & notes

    • Clarity LIMS v1 or later (API v2 r14)

    • Groovy 1.7.4 or later (expected location: /opt/gls/groovy/current/)

    All prerequisites are preloaded if you install on a non-production server.

    hashtag
    Attachments

    file-qc-2.0-bundle.zip:

  • The user loads samples onto the spectrophotometer and follows the instrument's protocol for QC measurement.

  • After the measurements are complete, the user exports the TSV results file created by the spectrophotometer, using the NanoDrop software.

  • The user imports the TSV file into the LIMS. As Clarity LIMS parses the file, the measurements are captured and stored as output user-defined fields (UDFs). The QC Pass/Fail flags are then set on the process inputs, according to whether the measurements meet the concentration and/or 260/280 bounds specified in Step 2.

    • The location A01 maps to the sample on the first well of the container named QCExamplePlate.

    • The data contained in the TSV file is captured in Clarity LIMS and can be viewed on the Details tab.

    Notice that the user ran only one process (QC Example), but two processes were recorded. The first EPP script created the second process, QC Example (file handling), using the REST API. Using REST to create a process is described in the Running a Process Cookbook example.

  • The NanoDrop QC algorithm in the script compares the concentration and the 260/280 ratio for each sample in the imported TSV file. The values are entered into the process UDFs.

    • A QC Fail flag is applied:

      • If the concentration of the sample is less than specified, or;

      • If its 260/280 ratio is outside the bounds given when running the process.

    • A QC Pass flag is applied when the sample has values inside the parameters provided.

    • Samples with no associated values in the TSV file are unaffected. In this example, the minimum concentration is set to 60. A QC Fail flag is applied to sample-2 because its concentration level does not meet the minimum value specified.
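The flag logic above can be summarized in a short sketch. The Python below is illustrative only (the example script itself is written in Groovy), and the 260/280 bounds of 1.8-2.0 are hypothetical values a user might enter when running the process; only the minimum concentration of 60 comes from the example.

```python
def qc_flag(concentration, ratio_260_280, min_conc, ratio_low, ratio_high):
    """Apply the NanoDrop QC rules: fail on low concentration or an out-of-range ratio."""
    if concentration is None or ratio_260_280 is None:
        return None  # no measurement in the TSV file: leave the flag untouched
    if concentration < min_conc or not (ratio_low <= ratio_260_280 <= ratio_high):
        return "FAILED"
    return "PASSED"

# With the minimum concentration set to 60, DNAIsolate-2 (49.50 ng/uL) fails:
print(qc_flag(49.50, 1.99, 60, 1.8, 2.0))   # FAILED
print(qc_flag(62.49, 1.83, 60, 1.8, 2.0))   # PASSED
```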

  • With Clarity LIMS running, run the following server command-line call to import the required configuration into the server (i.e., the process, sample, and fields used by the scripts):

    Click the Next and Done buttons to complete the wizard.
  • When the process completes, in the process summary tab's Input/Output Explorer, you'll see the shared output file placeholder (QC Data File) in the Outputs pane and in the Sample Genealogy. Right-click this placeholder and click Import.

  • Import the example TSV nanodrop-qc-example.tsv file provided in the zip file.

  • Wait for the QC flags to become visible on a subset of the samples used as inputs to the process (those located from A:1 to A:9).

  • The process wizard provides the option for you to either generate a new plate (container) or select a preexisting plate to hold the process outputs.
    • Note: The results of this process will be placed into a plate that is different from the one in which you originally placed the samples.

  • Run the QC Process on the plate created in the previous step.

  • Edit the nanodrop-qc-example.tsv file to reflect the name of this plate (remember that the Sample ID column in the NanoDrop QC data file depends on the plate name):

    • To do this, for each row, replace QCExamplePlate with the name of the plate that now holds the outputs of the process. (Process-generated plate names, for example, are in a format similar to "27-124".)

  • Import the modified nanodrop-qc-example.tsv into the result file placeholder generated by the QC Example process.

  • Wait for the QC flags to update on the inputs for which the nanodrop-qc-example.tsv file has measurements.

  • This time, instead of the measurements appearing in the outputs of the QC Example process, they are collected as UDF values on the inputs of that process. To see these measurements, select the outputs of the parent process, QC Example (preparation), and view the output Details tab.

  • Submit a new batch of samples on a 96-well plate.

  • Edit samples on wells from A:1 to A:3 so that the Human check box field is selected.

  • Run the QC Example process on all 96 samples in the plate.

  • In the nanodrop-qc-example.tsv file, update the Sample ID column with the correct plate name.

  • Import the modified nanodrop-qc-example.tsv into the result file placeholder generated by the QC Example process.

  • Wait for the QC flags to update on the inputs for which the NanoDrop QC data file has measurements.

    Note that only the inputs A:1, A:2 and A:3 will have QC flags assigned. The other inputs, including those for which the NanoDrop QC data file has data, will be left untouched because of the selection criteria implemented.
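Editing the plate name in the Sample ID column by hand is error-prone for a full plate. The substitution described in the steps above could be scripted along these lines; this is a hedged Python sketch (the function name and file paths are hypothetical, and it assumes Sample ID is the first tab-separated column):

```python
import csv

def retarget_tsv(in_path, out_path, old_plate, new_plate):
    """Rewrite the Sample ID column so it references the new output plate."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.reader(src, delimiter="\t")
        writer = csv.writer(dst, delimiter="\t")
        writer.writerow(next(reader))  # copy the header row unchanged
        for row in reader:
            # Sample ID looks like QCExamplePlate_A01_DNAIsolate-1
            if row and row[0].startswith(old_plate + "_"):
                row[0] = new_plate + row[0][len(old_plate):]
            writer.writerow(row)
```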

  • State on input and output URIs

    To ensure you are working with the latest version of an artifact, do not specify a state when retrieving artifacts via the REST API. In the example NanoDrop.groovy script provided, the stripStateQuery closure strips the state from the input and output URIs as it iterates through the input-output-map.
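In the Groovy script this is done by the stripStateQuery closure. The following Python sketch shows the equivalent idea of dropping the state parameter from an artifact URI; it is an illustration only, and the example hostname is hypothetical.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_state(uri):
    """Remove the 'state' query parameter so a GET returns the latest artifact version."""
    parts = urlsplit(uri)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != "state"]
    return urlunsplit(parts._replace(query=urlencode(query)))

# strip_state("https://lims.example.com/api/v2/artifacts/2-1234?state=5678")
# → "https://lims.example.com/api/v2/artifacts/2-1234"
```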

    java -jar config-slicer-<version>.jar \
        -u <username> -p <password> -host <hostname> \
        -o import -k file-qc-package.xml
    Default
    1/6/2011  5:27 PM
    Sample ID	ng/uL	A260	260/280	260/230	Constant
    QCExamplePlate_A01_DNAIsolate-1	62.49	1.450	1.83	1.86	50
    QCExamplePlate_A02_DNAIsolate-2	49.50	1.413	1.99	1.88	50
    QCExamplePlate_A03_DNAIsolate-3	70.00	1.198	1.84	2.03	50
    QCExamplePlate_A04_DNAIsolate-4	62.49	1.450	1.83	1.86	50
    QCExamplePlate_A05_DNAIsolate-5	49.50	1.413	1.99	1.88	50
    QCExamplePlate_A06_DNAIsolate-6	70.00	1.198	1.84	2.03	50
    QCExamplePlate_A07_DNAIsolate-7	62.49	1.450	1.83	1.86	50
    QCExamplePlate_A08_DNAIsolate-8	49.50	1.413	1.99	1.88	50
    QCExamplePlate_A09_DNAIsolate-9	70.00	1.198	1.84	2.03	50
    unzip -d /opt/gls/clarity/Applications file-qc-2.0-bundle.zip
    cd /opt/gls/clarity/Applications/CookBook/NanoDropQC/
     
    unzip config-slicer-<version>-deployment-bundle.zip
    setUDF(output, <udf-name>, <value>)
    setUDF(input, <udf-name>, <value>)
    // Save changes
    client.httpPUT(input.@uri, input)
    // client.httpPUT(output.@uri, output)
    // all inputs we have NanoDrop results for exist, we can now assign QC status 
    nanodropResults.each {
        def (input, output) = identifiedIOPairs[it.key]
     
        // Collect measurements as UDF values
        ...
        // Determine QC flags
        ...
        // Save changes
        ...
    }
    // All inputs we have NanoDrop results for exist, we can now assign QC status 
    nanodropResults.each {
        def (input, output) = identifiedIOPairs[it.key]
     
        // Skip non-human samples
        def sample = client.httpGET(input.'sample'[0].@uri)
        def isHuman = getUDF(sample, 'Human', 'false').toBoolean()
        if (!isHuman) {
            return
        }
     
        // Collect measurements as UDF values
        ...
        // Determine QC flags
        ...
        // Save changes
        ...
    }