Getting Started with Medium One

Welcome to the Medium One IoT Platform! Let's get you started with a tutorial.

In this tutorial, you will:

  1. Set up your account and configure users

  2. Send data

  3. Build a real-time workflow

  4. Inspect results

Here's what you need to get started:

  1. Medium One Prototyping Sandbox Account

Step 1. Create an API Key

An API Key determines a user's privileges and visibility within the Medium One platform.

  1. On the Setup → Manage API Keys page, click Add New API Key. Enter a brief description and make sure Enabled is checked.

Note: Save your newly created API Key; you will need it to send events in a later step.

Step 2. Create an API Basic User

Now you will need to create an API Basic User to send data into the cloud. An API Basic User can create/retrieve/update/delete event data, retrieve/update their own user information, and fetch processed events.

In this tutorial, you will create the API Basic User through our web portal. Users can also be created via the REST API; for details, see our REST API docs.

  1. On the Setup → Manage Users page, click Add New User and fill out the username and password info.

  2. After creating a new user, you should now see it appear on the Manage Users page. You may have to refresh to see this change.

  3. Save the credentials for the API Basic User as we will be using them in the next step.

Step 3. Send a Sample Event

Let's use cURL commands in the terminal to send data created on behalf of the API Basic User.

If you are using a Windows computer, you will need to download cURL with SSL here and the cacert.pem here. Rename cacert.pem to 'curl-ca-bundle.crt' and place it in the same folder as curl.exe.

First, we will need to log in using cURL before sending the event.

  1. In the cURL command below, replace < API Basic User > with the username you created in the previous step. Be sure to keep the quotes around the username.

  2. Replace < password > with the password for that API Basic User. Be sure to keep the quotes around the password.

  3. Replace < API Key > with the API Key that you created in Step 1. Be sure to keep the quotes around the API Key.

Mac OS / Linux

curl -c cookies.txt -X POST -k -H 'Content-Type: application/json' -H 'Accept: application/json' -i '' -d '{"login_id": "< API Basic User >", "password": "< password >", "api_key": "< API Key >"}'


Windows

In the command below, keep all of the backslashes ( \ ) before the quotes.

curl -c cookies.txt -X POST -k -H "Content-Type: application/json" -H "Accept: application/json" -i "" -d "{\"login_id\": \"< API Basic User >\", \"password\": \"< password >\", \"api_key\": \"< API Key >\"}"

If your login was successful, you will see a 200 message:

HTTP/1.1 200 OK
Server: nginx/1.4.6 (Ubuntu)
Date: Tue, 01 Mar 2016 18:45:49 GMT
Content-Type: application/json; charset=UTF-8
Content-Length: 4
Connection: keep-alive

Note: Using cURL, you will only need to log in once every 24 hours.
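For reference, the same login flow can be sketched in Python. This is a minimal sketch, not an official client: the endpoint URL below is a placeholder you would replace with your Medium One login URL, and the cookie jar plays the role of curl's -c/-b cookies.txt, so the session cookie from a successful login is reused on later calls.

```python
import json
import http.cookiejar
import urllib.request

# Placeholder -- substitute your actual Medium One login endpoint.
LOGIN_URL = "https://api.example.com/v2/login"

def build_login_request(login_id, password, api_key, url=LOGIN_URL):
    """Build a POST request mirroring the cURL login call above."""
    payload = json.dumps({
        "login_id": login_id,
        "password": password,
        "api_key": api_key,
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json",
                 "Accept": "application/json"},
        method="POST",
    )

# A cookie-aware opener stands in for curl's cookies.txt: any session
# cookie returned by the login response is sent on subsequent requests.
cookie_jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(
    urllib.request.HTTPCookieProcessor(cookie_jar))
# opener.open(build_login_request(...)) would perform the actual login.
```

Opening the request through the same `opener` for later event posts keeps the session cookie attached automatically, just as `-b cookies.txt` does for cURL.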

After logging in, you can now send an event to the cloud using that API Basic User.

  1. In this tutorial, we will be sending "force_strength":67 and "level":"apprentice" in the same event. You can find the cURL commands for this below.

Mac OS / Linux

curl -b cookies.txt -X POST -k -H 'Content-Type: application/json' -i '<API Basic User>' -d '{"observed_at":"2015-03-04T12:00:00.0-0700", "event_data": {"force_strength":67, "level":"apprentice"}}'


Windows

In the command below, keep all of the backslashes ( \ ) before the quotes.

curl -b cookies.txt -X POST -k -H "Content-Type: application/json" -i "<API Basic User>" -d "{\"observed_at\":\"2015-03-04T12:00:00.0-0700\", \"event_data\": {\"force_strength\":67, \"level\":\"apprentice\"}}"

You should see a 200 success message. 
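The event body from the cURL command can also be assembled programmatically. The sketch below uses a hypothetical helper, `build_event` (not part of the platform), to produce the same JSON, defaulting "observed_at" to the current UTC time in the ISO-8601 style shown in the tutorial.

```python
import json
from datetime import datetime, timezone

def build_event(force_strength, level, observed_at=None):
    """Assemble the event body used in the cURL example above."""
    if observed_at is None:
        # ISO-8601 timestamp with a UTC offset, matching the
        # "observed_at" format in the tutorial's sample event.
        observed_at = datetime.now(timezone.utc).strftime(
            "%Y-%m-%dT%H:%M:%S.%f%z")
    return json.dumps({
        "observed_at": observed_at,
        "event_data": {"force_strength": force_strength,
                       "level": level},
    })

body = build_event(67, "apprentice",
                   observed_at="2015-03-04T12:00:00.0-0700")
```

The resulting string is what you would pass as the request body in place of the `-d` argument.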

Now that you've sent in an event, we need to activate the tags to use them in workflows.

Step 4. Activate Tags

Tags are what we use to classify the "keys" of your event data; each tag is sent with a "value".

In the event you just sent:

  • The tags are "force_strength" and "level"

  • The values are 67 and "apprentice"

  1. On the Config → Data Streams page, click Edit for the "raw" Data Stream.

  2. Select the Active checkbox for the two listed tags (force_strength and level), and then Save Data Stream.

Step 5. Create a Workflow

Workflows are used to process data from events to generate a new event. These new events created from workflows are called processed events.

First, we'll set up the workflow triggers, modules, and outputs.

  1. On the Workflow Studio page, click Create to create a new Workflow.

  2. Then click the Tags & Triggers pane on the right toolbar, and under the "raw" dropdown, drag and drop force_strength and level tags onto the main canvas. These are the triggers that signify what causes the Workflow to run.

  3. Next, from the Modules pane, click on the "Foundation" dropdown then drag and drop the Base Python module onto the canvas. Base Python is a programmatic module that you can code in Python to process your data.

  4. From the Outputs panel, drag and drop the Processed Stream - Single module onto the canvas. This output module signifies that your Workflow will generate a processed event every time it is triggered.

  5. Double click on the Base Python module box, expand the IONodes section, and click Add Input. A new row should appear for input "in2". Click Save.

  6. With two inputs, you can connect your "force_strength" and "level" triggers to the Base Python module. Connect the input and output connectors (click and drag) to make the Workflow look something like this:

Now we'll add some Python code to our Workflow.

  1. Double click the Base Python module again, and replace the default script with the following in the script text area.

new_strength = IONode.get_input('in1')['event_data']['value'] + 28 
new_level = "jedi"
output = IONode.set_output('out1', {'new_strength': new_strength, 'new_level': new_level})

Save the code in the Base Python module.

This script outputs processed events with tags "new_strength" and "new_level" every time the Workflow runs:

  • "new_strength" is created by taking input 1 (in this case it is the "force_strength" tag) and adding 28 to that value.

  • "new_level" is set as "jedi"
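To see what the script computes before activating anything, you can exercise the same logic locally with a stub standing in for the platform's IONode object. The stub below is an assumption for illustration only; the input shape (a dict carrying the tag value under "event_data") mirrors the script's `get_input` access, not the platform's actual internals.

```python
# A local stand-in for the platform's IONode, just enough to run
# the workflow script's logic outside the Workflow Studio.
class IONode:
    _inputs = {"in1": {"event_data": {"value": 67}}}  # the force_strength trigger
    _outputs = {}

    @classmethod
    def get_input(cls, name):
        return cls._inputs[name]

    @classmethod
    def set_output(cls, name, data):
        cls._outputs[name] = data
        return data

# The same three lines as the workflow script:
new_strength = IONode.get_input('in1')['event_data']['value'] + 28
new_level = "jedi"
output = IONode.set_output('out1', {'new_strength': new_strength,
                                    'new_level': new_level})
# With force_strength 67, new_strength is 67 + 28 = 95.
```

Running this shows the processed event the workflow would emit for the sample event sent earlier: new_strength 95 and new_level "jedi".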

Lastly, you want to Activate the Workflow to make it go live.

  1. Go to the Revisions panel, and Activate the Workflow by clicking on the check icon under the most recent revision.

  2. A green "activated" icon should appear next to the revision name.

Your workflow is now live and will be generating an output whenever it is triggered.

If this doesn't work, you may need to double-click the Processed Stream output and save first.


  • The star icon means this revision is favorited and it cannot be deleted. Activated revisions must always be starred and cannot be unstarred until they are deactivated.

  • In this example, the “in2” input is actually not used, and we chose to use output tag names that are different from the inputs.

Step 6. Trigger the Workflow to Generate a Processed Event

Now that you've created your Workflow, it's time to trigger it to generate a processed event.

  1. Using cURL, send in the same exact event that you sent in Step 3 again.

Sending this new event will automatically trigger your newly created Workflow and will generate a new processed event.

Step 7. View Tags of Processed Stream

Next, you will verify that the processed event is created and sent to the processed data stream.

  1. Similarly to Step 4, on the Config → Data Streams page, click Edit for Processed Events.

  2. You will find the two new tags (new_strength and new_level) already listed and marked Active, so no changes are necessary.

Note: Processed event tags are always automatically selected as Active.

Step 8. Visualize Processed Data on Dashboard

Now you can view your newly processed data on the Dashboard.

  1. Click on the Dashboard on the navigation sidebar.

  2. Select the Single User Table widget from the widget selector. 

  3. From the Select User dropdown, select the username you gave to the API Basic User.

  4. Click the configuration gear icon and select the 4 tags (force_strength, level, new_strength, and new_level). If you don't see all 4 tags, you may need to refresh your browser.

  5. You should now see both the raw and processed events in the single user table.

Congratulations, you have now created a Workflow and processed a simple data event. Now you are ready to connect your data sources and get intelligent about your data.

Want to learn more about what Medium One can do?

Check out our documentation and tutorials & get started on your next IoT project!
