Toy Example

Let’s discuss how to interact with the DARE Platform!

Before trying this toy example, we recommend reading the Features and API Documentation sections to get an idea of the components that make up the DARE Platform.

The DARE platform exposes a RESTful API for workflow execution to research developers; in other words, it provides workflow execution as a service. Let’s walk through the steps our users need to follow.

First of all, users should prepare a Python script containing their dispel4py workflow, which also makes use of provenance. Before the official execution, the DARE platform provides an environment for workflow development and testing: research developers can test their code directly inside the platform and make the necessary adjustments using our playground module.
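As a rough illustration of the pattern behind the mySplitMerge workflow used below, here is the split/process/merge data flow in plain Python. This is a conceptual sketch only, with function names of our own; it is not the dispel4py API, in which each stage would be a processing element (PE) in a workflow graph.

```python
def split(data, n_parts):
    """Divide the input into n_parts roughly equal chunks."""
    k, m = divmod(len(data), n_parts)
    return [data[i * k + min(i, m):(i + 1) * k + min(i + 1, m)]
            for i in range(n_parts)]

def process(chunk):
    """The per-chunk work a parallel PE would perform (here: doubling)."""
    return [x * 2 for x in chunk]

def merge(chunks):
    """Combine the partial results back into a single output."""
    return [x for chunk in chunks for x in chunk]

result = merge(process(c) for c in split(list(range(10)), 3))
print(result)  # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

In dispel4py the same shape is expressed declaratively as a graph of PEs, which is what lets the platform parallelise the processing stage across nodes.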

Once the workflow is ready for execution, research developers can run it via the official execution API. To interact with the DARE platform, we provide a script named helper_functions, which wraps the necessary API calls. In the example code below, we use this script to authenticate a user, create a workspace, register workflows, execute a workflow, monitor the execution, and more.

The necessary steps to execute a workflow are listed below.

  1. Authentication

# Imports
import json, os
import requests
import helper_functions as hf

# LOGIN_HOSTNAME (and the other *_HOSTNAME constants below) are the
# endpoint URLs of your DARE deployment
credentials = hf.read_credentials()
token = hf.login(LOGIN_HOSTNAME, credentials["username"], credentials["password"],
                 credentials["issuer"])["access_token"]
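For readers curious what a wrapper like hf.login does, conceptually it exchanges the credentials for an access token over HTTP. The sketch below is only an assumption about the shape of such a wrapper: the endpoint path, payload fields, and function names are ours, not the DARE API’s actual contract.

```python
import requests

def build_token_request(hostname, username, password, issuer):
    """Assemble a hypothetical token request (path and fields are assumptions)."""
    return {
        "url": hostname.rstrip("/") + "/auth/login",
        "json": {"username": username, "password": password, "issuer": issuer},
    }

def login(hostname, username, password, issuer):
    """POST the credentials and return the parsed token response."""
    req = build_token_request(hostname, username, password, issuer)
    resp = requests.post(req["url"], json=req["json"])
    resp.raise_for_status()
    return resp.json()  # expected to carry an "access_token" field
```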
  2. D4p Information Registry: create a workspace and register your workflow

# Delete the workspace if it already exists, ignoring errors on first run
try:
    hf.delete_workspace('mySplitWorkspace', D4P_REGISTRY_HOSTNAME, token)
except Exception:
    pass

# Register a workspace
workspace_url, workspace_id = hf.create_workspace("", "mySplitWorkspace", "", D4P_REGISTRY_HOSTNAME, token)
workspace_id = int(workspace_id)
print('Workspace URL: ' + str(workspace_url))
print('Workspace ID: ' + str(workspace_id))

# Register ProcessingElementSignature 
pe_url = hf.create_pe(desc="", name="mySplitMerge", conn=[], pckg="mysplitmerge_pckg", workspace=workspace_url,
                       clone="", peimpls=[], hostname=D4P_REGISTRY_HOSTNAME, token=token)
print('PESig URL: ' + str(pe_url))

# Register ProcessingElementImplementation

# Fetch the workflow implementation code online (URL omitted here)
code = requests.get('')

impl_id = hf.create_peimpl(desc="", code=str(code.text), parent_sig=pe_url, pckg="mysplitmerge_pckg",
                           name="mySplitMerge", workspace=workspace_url, clone="", hostname=D4P_REGISTRY_HOSTNAME,
                           token=token)
print('PE Implementation ID: ' + str(impl_id))
  3. Use the execution API

a) Execute a workflow

hf.submit_d4p(impl_id=impl_id, pckg="mysplitmerge_pckg", workspace_id=workspace_id, pe_name="mySplitMerge",
              token=token, hostname=EXEC_API_HOSTNAME, n_nodes=6, no_processes=6, iterations=1)
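In essence, submit_d4p sends a job-submission request to the Execution API. The helper below sketches how such a request might be assembled; the endpoint path and field names are our assumptions for illustration, not the documented API contract.

```python
def build_submit_request(hostname, impl_id, pckg, workspace_id, pe_name,
                         n_nodes, no_processes, iterations):
    """Assemble a hypothetical d4p submission request (path/fields assumed)."""
    return {
        "url": hostname.rstrip("/") + "/run-d4p",
        "json": {
            "impl_id": impl_id,
            "pckg": pckg,
            "workspace_id": workspace_id,
            "pe_name": pe_name,
            "n_nodes": n_nodes,
            "no_processes": no_processes,
            "iterations": iterations,
        },
    }
```

Note how the call above lets you size the run: n_nodes controls how many cluster nodes are requested and no_processes how many dispel4py processes the workflow is mapped onto.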

b) Monitor the execution

from IPython.display import display, clear_output
import time

while True:
    resp = hf.my_pods(token=token, hostname=EXEC_API_HOSTNAME)
    if not json.loads(resp):
        break  # no pods left running: the execution has finished
    clear_output(wait=True)
    display(resp)
    time.sleep(5)
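The monitoring loop above blocks until no pods are left. If you prefer a bounded wait, a small generic helper can wrap the same check; wait_until below is our own name and not part of helper_functions.

```python
import time

def wait_until(condition, timeout=600, interval=5):
    """Poll `condition` every `interval` seconds; return True as soon as it
    holds, or False once `timeout` seconds have elapsed."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    return False

# Usage sketch: the run is finished when the pod list comes back empty
# finished = wait_until(
#     lambda: not json.loads(hf.my_pods(token=token, hostname=EXEC_API_HOSTNAME)))
```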

c) Upload files in the platform

# Compress the input before uploading (the archive name here is illustrative)
os.system('zip -r input.zip input.json')
hf.upload(token=token, path='d4p-input', local_path='', hostname=EXEC_API_HOSTNAME)

d) List your folders & files

resp = hf.myfiles(token=token, hostname=EXEC_API_HOSTNAME)

resp = hf._list(token=token, path="/path/to/file", hostname=EXEC_API_HOSTNAME)

e) Download a file from the platform

# The wrapper's name is missing from the original snippet; `download` is assumed
hf.download(token=token, path="/path/to/file", hostname=EXEC_API_HOSTNAME, local_path='logs.txt')

f) Share files in B2DROP

hf.send2drop(token=token, hostname=EXEC_API_HOSTNAME, path="/path/to/file/in/platform")

The above examples are available in our GitLab repository. You can download the mySplitMerge folder and use the Jupyter Notebook to interact with the platform.