Tools, Data, and Workflows for tutorials

Overview
Questions:
  • How can we define the technical infrastructure for a tutorial?

  • How to define the tools needed for a tutorial?

  • How to add the needed data directly in an instance?

  • How to add the workflows related to a tutorial?

  • How can we check the technical infrastructure is working?

  • How can we make an existing Galaxy instance able to run a tutorial?

Objectives:
  • Extracting the technical description for a tutorial

  • Populating an existing instance with the needed tools, data and workflows for a tutorial

  • Creating a Galaxy Docker flavor with the needed tools, data and workflows for a tutorial

  • Testing the Galaxy Docker flavor of a tutorial

Time estimation: 30 minutes
Last modification: Oct 18, 2022
License: Tutorial Content is licensed under Creative Commons Attribution 4.0 International License The GTN Framework is licensed under MIT

Building a Galaxy instance specifically for your training

To be able to run the tutorial, we need a Galaxy instance where all of the required tools and data are available. We therefore need to describe this technical infrastructure.

The files we define in this tutorial will be used to automatically build a Galaxy Docker flavor, and also to test whether a public Galaxy instance is able to run the tutorial.

In this tutorial, you will learn how to create a virtualized Galaxy instance, based on Docker, to run your training - either on normal computers or cloud environments.

Agenda

In this tutorial, we will deal with:

  1. Building a Galaxy instance specifically for your training
  2. Extracting workflows
    1. Testing the workflow (recommended)
  3. Creating the data-library.yaml (recommended)
  4. Creating the data-manager.yaml (optional)
  5. Creating the Galaxy Interactive Tour (optional)
  6. Testing the technical infrastructure
  7. Conclusion

Extracting workflows

Once the tutorial is ready, we need to develop one or more workflows that represent the steps taken in the tutorial, extract them from Galaxy, and add them to the workflows directory of the tutorial. Additionally, we will need to add some explanation about the workflow(s) in a README.md file.
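
As a rough orientation, the technical files discussed in this tutorial typically end up in the tutorial folder roughly like this (the layout is a sketch; my-workflow.ga is just a placeholder name):

topics/<topic>/tutorials/<tutorial>/
├── tutorial.md
├── data-library.yaml          # input datasets (recommended)
├── data-manager.yaml          # reference data setup (optional)
├── tours/                     # Galaxy Interactive Tour(s) (optional)
└── workflows/
    ├── index.md
    ├── my-workflow.ga         # the extracted workflow
    └── my-workflow-test.yml   # the corresponding workflow test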

Hands-on: Extract the workflow
  1. Add the topic name as a Tag and the tutorial title as Annotation/Notes to the workflow using the workflow editor.
  2. Download the workflow for the tutorial
  3. Save it in the workflows directory of the tutorial
  4. Check that your workflows directory has an index.md with the contents:

    ---
    layout: workflow-list
    ---
    

Testing the workflow (recommended)

Workflow testing is a great way to get feedback that your tutorial can be run successfully on a given server. When you're giving a training, this provides peace of mind: not only are the tools installed (as indicated by the badges we provide), but they also work.

Given the workflow you created above and have included in the tutorial folder, you’ll need to create a corresponding -test.yml file.

Hands-on: Creating the `-test.yml` file for your workflow
  1. Find the correct name for the file: if your workflow is unicycler.ga, then your test file should be unicycler-test.yml; they need to share the same prefix.

  2. Create the following structure:

    ---
    - doc: Test sample data for the workflow
      job:
        an_input_file:
          class: File
          location: https://....
          filetype: fasta
      outputs:
        ffn:
          asserts:
            - that: has_text
              text: ">A"
            - that: has_text
              text: ">B"
    

You’ll need to edit the job and outputs sections according to your workflow’s inputs and outputs. Additionally you will need to edit the steps of your workflow .ga file appropriately.

Inputs

Your workflow must use “Data Inputs” for each input dataset. For each of these input steps in the .ga file, you’ll need to do the following:

  1. Edit the label
  2. Edit the name
  3. Edit the inputs[0].name
  4. Edit the tool_state

In a normal workflow you have exported from Galaxy, you’ll see something like

{
    "id": 0,
    "input_connections": {},
    "inputs": [
        {
            "description": "",
            "name": "patient1_ChIP_ER_good_outcome.bam"
        }
    ],
    "label": null,
    "name": "Input dataset",
    "outputs": [],
    "position": {
        "left": 10,
        "top": 10
    },
    "tool_id": null,
    "tool_state": "{\"name\": \"patient1_ChIP_ER_good_outcome.bam\"}",
    "tool_version": null
}

You should synchronize the aforementioned fields so it looks like this:

{
    "id": 0,
    "input_connections": {},
    "inputs": [
        {
            "description": "",
            "name": "good_outcome"
        }
    ],
    "label": "good_outcome",
    "name": "good_outcome",
    "outputs": [],
    "position": {
        "left": 10,
        "top": 10
    },
    "tool_id": null,
    "tool_state": "{\"name\": \"good_outcome\"}",
    "tool_version": null
}

This will allow you to specify good_outcome in your job to load a file:

- doc: ...
  job:
    good_outcome:
      class: File
      location: ...
      filetype: ...

The filetype should be the Galaxy datatype of your file, for example fastqsanger, tabular, bam.

Outputs

For the outputs the process is somewhat simpler:

  1. Identify a step, the outputs of which you would like to test
  2. Convert the relevant outputs to workflow_outputs

    In a normal workflow you see

    {
        "outputs": [
            {
                "type": "txt",
                "name": "ofile"
            },
            {
                "type": "txt",
                "name": "ofile2"
            }
        ],
        "workflow_outputs": []
    }
    

    If you want to test the contents of ofile, you should change it to

    {
        "outputs": [
            {
                "type": "txt",
                "name": "ofile"
            },
            {
                "type": "txt",
                "name": "ofile2"
            }
        ],
        "workflow_outputs": [
            {"output_name": "ofile", "label": "my_output"}
        ]
    }
    
  3. You can now use the label you chose (here my_output) in your test case:

    - doc:
      job: ...
      outputs:
        my_output:
          asserts:
            has_text:
              text: 'some-string'
    

Running the Tests

You can test the file you’ve written with the following command and a recent version (>=0.56.0) of planemo:

planemo test \
	--galaxy_url "$GALAXY_URL" \
	--galaxy_user_key "$GALAXY_USER_KEY" \
	--no_shed_install \
	--engine external_galaxy \
	workflow.ga

Planemo will autodetect the workflow-test.yml file and load it for the testing.
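
If you also want to keep a record of the test run, planemo can write HTML and JSON reports in addition to the console output; a minimal sketch (the report file names below are just examples):

planemo test \
	--galaxy_url "$GALAXY_URL" \
	--galaxy_user_key "$GALAXY_USER_KEY" \
	--no_shed_install \
	--engine external_galaxy \
	--test_output workflow-test-report.html \
	--test_output_json workflow-test-report.json \
	workflow.ga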

Creating the data-library.yaml (recommended)

The datasets needed for a tutorial can also be integrated into the Galaxy instance inside data libraries. These allow the datasets to be easily shared with all users of a Galaxy instance, and they save every trainee from having to re-download the input data.

These datasets are described in the data-library.yaml files:

---
destination:
  type: library
  name: GTN - Material
  description: Galaxy Training Network Material
  synopsis: Galaxy Training Network Material. See https://training.galaxyproject.org
items:
- name: Title of the topic
  description: Summary of the topic
  items:
  - name: Title of the tutorial
    items:
    - name: 'DOI: 10.5281/zenodo....'
      description: latest
      items:
      - info: https://doi.org/10.5281/zenodo....
        url: https://zenodo.org/api/files/URL/to/the/input/file
        ext: galaxy-datatype
        src: url

Hands-on: Creating the `data-library.yaml`
  1. Copy the Zenodo link
  2. Generate the data-library.yaml file and update the tutorial metadata with the link:

    $ planemo training_fill_data_library \
             --topic_name "my-topic" \
             --tutorial_name "my-new-tutorial" \
             --zenodo_link "URL to the Zenodo record"
    
  3. Check that the data-library.yaml has been generated (or updated)
  4. Check that the Zenodo link is in the metadata at the top of the tutorial.md
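
To load such a data-library.yaml into a Galaxy instance you administer, the ephemeris package provides a setup-data-libraries command; a minimal sketch, assuming ephemeris is installed and you have an admin API key (the URL and key below are placeholders):

setup-data-libraries \
    -g "https://your-galaxy.example.org" \
    -a "$ADMIN_API_KEY" \
    -i data-library.yaml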

Creating the data-manager.yaml (optional)

Some tools may require specific databases that have been prepared especially for them. In this case, such Galaxy tools often come with “data managers” to simplify this process.

If you need such data managers for your training, then you should describe how to run them in the data-manager.yaml file:

data_managers:
    - id: url to data manager on ToolShed
      params:
        - 'param1': ''
        - 'param2': 'value'
      # Items is the list of variables for which you want to run this data manager. You can use them inside the params fields with {{ item }}.
      # For genomes, for example, you can run this data manager with multiple genomes, or you could give multiple URLs.
      items:
        - item1
        - item2
      # Names of the data tables you want to reload after your data managers have finished. This can be important for subsequent data managers.
      data_table_reload:
        - all_fasta
        - __dbkeys__
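
Such a data-manager.yaml is typically executed with the run-data-managers command from the ephemeris package; a minimal sketch, assuming ephemeris is installed and an admin API key is available (the URL and key below are placeholders):

run-data-managers \
    --config data-manager.yaml \
    -g "https://your-galaxy.example.org" \
    -a "$ADMIN_API_KEY"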

Creating the Galaxy Interactive Tour (optional)

A Galaxy Interactive Tour is a way to go through an entire analysis, step by step inside Galaxy in an interactive and explorative way. It is a great way to help users run the tutorial directly inside Galaxy. To learn more about creating a Galaxy tour please have a look at our dedicated tour training.

Testing the technical infrastructure

Once we have defined all the requirements for running the tutorial, we can test these requirements, either in a locally running Galaxy or in a Docker container. Please see our tutorial about Setting up Galaxy for Training about how to test your tutorial requirements.
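
For the Docker route, once the flavor image for your training has been built, starting it locally usually comes down to a single command; a minimal sketch (galaxy-my-training is a purely hypothetical image name):

docker run -d -p 8080:80 galaxy-my-training

Galaxy should then be reachable at http://localhost:8080.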

Conclusion

Key points
  • Tools, data and workflows can be easily integrated into a Docker flavor to provide useful technical support for a tutorial

  • A Galaxy Docker flavor is a great support for training

  • A Galaxy Docker flavor can be deployed ‘anywhere’ and is scalable

Frequently Asked Questions

Have questions about this tutorial? Check out the tutorial FAQ page or the FAQ page for the Contributing to the Galaxy Training Material topic to see if your question is listed there. If not, please ask your question on the GTN Gitter Channel or the Galaxy Help Forum.

Feedback

Did you use this material as an instructor? Feel free to give us feedback on how it went.
Did you use this material as a learner or student? Feel free to leave feedback as well.


Citing this Tutorial

  1. Bérénice Batut, Björn Grüning, Saskia Hiltemann, Helena Rasche, 2022 Tools, Data, and Workflows for tutorials (Galaxy Training Materials). https://training.galaxyproject.org/training-material/topics/contributing/tutorials/create-new-tutorial-technical/tutorial.html Online; accessed TODAY
  2. Batut et al., 2018 Community-Driven Data Analysis Training for Biology Cell Systems 10.1016/j.cels.2018.05.012


@misc{contributing-create-new-tutorial-technical,
author = "Bérénice Batut and Björn Grüning and Saskia Hiltemann and Helena Rasche",
title = "Tools, Data, and Workflows for tutorials (Galaxy Training Materials)",
year = "2022",
month = "10",
day = "18"
url = "\url{https://training.galaxyproject.org/training-material/topics/contributing/tutorials/create-new-tutorial-technical/tutorial.html}",
note = "[Online; accessed TODAY]"
}
@article{Batut_2018,
    doi = {10.1016/j.cels.2018.05.012},
    url = {https://doi.org/10.1016%2Fj.cels.2018.05.012},
    year = 2018,
    month = {jun},
    publisher = {Elsevier {BV}},
    volume = {6},
    number = {6},
    pages = {752--758.e1},
    author = {B{\'{e}}r{\'{e}}nice Batut and Saskia Hiltemann and Andrea Bagnacani and Dannon Baker and Vivek Bhardwaj and Clemens Blank and Anthony Bretaudeau and Loraine Brillet-Gu{\'{e}}guen and Martin {\v{C}}ech and John Chilton and Dave Clements and Olivia Doppelt-Azeroual and Anika Erxleben and Mallory Ann Freeberg and Simon Gladman and Youri Hoogstrate and Hans-Rudolf Hotz and Torsten Houwaart and Pratik Jagtap and Delphine Larivi{\`{e}}re and Gildas Le Corguill{\'{e}} and Thomas Manke and Fabien Mareuil and Fidel Ram{\'{\i}}rez and Devon Ryan and Florian Christoph Sigloch and Nicola Soranzo and Joachim Wolff and Pavankumar Videm and Markus Wolfien and Aisanjiang Wubuli and Dilmurat Yusuf and James Taylor and Rolf Backofen and Anton Nekrutenko and Björn Grüning},
    title = {Community-Driven Data Analysis Training for Biology},
    journal = {Cell Systems}
}
                   

Congratulations on successfully completing this tutorial!
Developing GTN training material
This tutorial is part of a series to develop GTN training material, feel free to also look at:
  1. Overview of the Galaxy Training Material
  2. Adding auto-generated video to your slides
  3. Adding Quizzes to your Tutorial
  4. Contributing with GitHub via command-line
  5. Contributing with GitHub via its interface
  6. Creating a new tutorial
  7. Creating content in Markdown
  8. Creating Interactive Galaxy Tours
  9. Creating Slides
  10. Design and plan session, course, materials
  11. Generating PDF artefacts of the website
  12. GTN Metadata
  13. Including a new topic
  14. Principles of learning and how they apply to training and teaching
  15. Running the GTN website locally
  16. Running the GTN website online using GitPod
  17. Teaching Python
  18. Tools, Data, and Workflows for tutorials
  19. Updating diffs in admin training