Demo

Toy Example

Let’s discuss how to interact with the DARE Platform!

Before trying this toy example, we recommend reading the features and API documentation sections to get an idea of the components that make up the DARE Platform.

The DARE platform exposes a RESTful API for workflow execution to research developers; in other words, it provides workflow execution as a service. Let’s walk through the steps our users need to follow.
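As a sketch of what “workflow execution as a service” means in practice, the snippet below shows how a client might assemble such a REST call. The endpoint path and payload field names are purely illustrative assumptions, not the actual DARE API schema; the `helper_manager` script introduced below wraps the real calls for you.

```python
import json

BASE_URL = "https://platform.dare.scai.fraunhofer.de/"


def build_exec_request(base_url, workflow_name, nodes, processes):
    """Assemble a hypothetical workflow-execution request.

    The endpoint path and field names here are illustrative only;
    the real DARE API calls are wrapped by helper_manager.py.
    """
    url = base_url.rstrip("/") + "/exec-api/run"  # illustrative endpoint
    body = json.dumps({
        "workflow": workflow_name,
        "nodes": nodes,
        "processes": processes,
    })
    return url, body


url, body = build_exec_request(BASE_URL, "mySplitMerge", 6, 6)
# a real client would now POST this, e.g.:
# requests.post(url, data=body, headers={"Authorization": "Bearer <token>"})
```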

First of all, users should prepare a Python script containing their dispel4py workflow, which should also make use of provenance! Before the official execution, the DARE platform offers a testing environment for workflow development: research developers can test their code directly inside the platform and make the necessary adjustments using our playground module.

Once the workflow is ready, research developers can execute it via the official execution API. To simplify interaction with the DARE platform, we provide a script named helper_manager.py, which wraps the necessary API calls. The example code below uses this script to authenticate a user, create a workspace, register a workflow, execute it, and monitor the execution.

The necessary steps to execute a workflow are listed below.

  1. Authentication
import json
from os import getcwd
from os.path import join, exists

import requests

BASE_URL = "https://platform.dare.scai.fraunhofer.de/"

# Download the DARE platform client - helper function library
hf_scripts = requests.get("https://gitlab.com/project-dare/exec-api/-/raw/master/client/helper_manager.py")
if hf_scripts.status_code == 200:
    with open("helper_manager.py", "w") as f:
        f.write(hf_scripts.text)
from helper_manager import DareManager

# Running notebook locally

credentials_file = "credentials.yaml"
if not exists(credentials_file):
    credentials_yaml = requests.get("https://gitlab.com/project-dare/exec-api/-/raw/master/client/example_credentials.yaml")

    if credentials_yaml.status_code == 200:
        with open(credentials_file, "w") as f:
            f.write(credentials_yaml.text)

# if you are working locally and not in the JupyterHub, pass the parameter:
# config_file="credentials.yaml"
dm = DareManager(dare_platform_url=BASE_URL, config_file=credentials_file)
# only if you are working locally
print(dm.token)

# Running notebook in DARE platform's JupyterHub
dm = DareManager(dare_platform_url=BASE_URL)
print(dm.token)

  2. D4p Information Registry: create a workspace and register your workflow
code = requests.get('https://gitlab.com/project-dare/exec-api/-/raw/master/examples/mySplitMerge/scripts/mySplitMerge_prov.py')
code = str(code.text)

# TODO provide a name for your workflow
name = "mySplitMerge"
workspace_id, impl_id = dm.register_d4p_workflow(name=name, code=code)
print("Your workspace ID is: {}".format(workspace_id))
print("Your PE ID is: {}".format(impl_id))
  3. Use the execution API

a) Execute a workflow

dm.exec_d4p(nodes=6, no_processes=6, iterations=1, 
            reqs='https://gitlab.com/project-dare/exec-api/-/raw/master/examples/mySplitMerge/scripts/reqs.txt')

b) Monitor the execution

dm.monitor_job()

c) Upload files in the platform

remote_path = "d4p-input"
filename = "input.json"
dm.upload_file(remote_path=remote_path, filename=filename)

d) List your folders & files

dm.list_workspace()
dm.list_exec_folder()

e) Download a file from the platform

dm.download_file(filename="logs.txt")

f) Share files in B2DROP

# Case 1 file
kind = "file"
dare_path_kind = "run"  # you can also use "upload" to take a file from the uploads directory
# use the dare_directory parameter if you want a file from a run directory other than the one stored in the session
filename = ""
remote_dir_name = None
dm.b2drop_share(kind=kind, filename=filename, dare_path_kind=dare_path_kind, remote_dir=remote_dir_name)

# Case 2 directory
kind = "directory"
dare_path_kind = "run"
remote_dir_name = None
# Again, use dare_directory if you want to upload a different run directory than the one stored in the session
dm.b2drop_share(kind=kind, dare_path_kind=dare_path_kind, remote_dir=remote_dir_name)
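Putting the pieces together, a full session follows the sequence below. This is a recap sketch using the helper calls shown above; it needs a valid DARE account and network access, so it is wrapped in a function rather than run at import time.

```python
def run_demo(dm, code, reqs_url):
    """End-to-end recap of the steps above, using the DareManager client.

    `dm` is an authenticated DareManager instance; `code` is the dispel4py
    workflow source and `reqs_url` a requirements file, as in the examples.
    """
    # 1. register the workflow in the D4p Information Registry
    workspace_id, impl_id = dm.register_d4p_workflow(name="mySplitMerge", code=code)
    # 2. launch the execution
    dm.exec_d4p(nodes=6, no_processes=6, iterations=1, reqs=reqs_url)
    # 3. follow the run and fetch the logs afterwards
    dm.monitor_job()
    dm.download_file(filename="logs.txt")
    return workspace_id, impl_id
```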

For hands-on practice with the DARE platform, covering both dispel4py and CWL workflows, we provide a tutorial Jupyter notebook to help you get familiar with the platform. It includes the examples above as well as additional material, and can be found in our GitLab repository. You can download the tutorial folder and use the Jupyter notebook to interact with the platform.

We provide client-side helper functions to make interaction with the platform easier. You can find the relevant documentation here.

Contact our team to request an account for our JupyterLab and access our demos and tutorials directly in the platform!