
Thursday, November 16, 2017

Integrating Flic with Microsoft Flow and Azure

Overview

Recently at an internal conference, I got my hands on Flic, a pretty cool tiny button (go read about it at flic.io). Apart from all the cool stuff you can do with Flic's extensive integration support, my interest was in how Flic can integrate with Microsoft Flow. In this post I want to clarify how you can do this, and point out a few things Flic could do to make the integration experience better.

What I have blogged here is simple: capture the click, double click, and hold events from the Flic button, trigger Microsoft Flow from the Flic mobile app, and, from Microsoft Flow, capture each of these events and log the trigger message as an Azure Queue message.

Once the message gets into the Azure world, you have many opportunities, such as picking up the incoming message through any of the Azure event-driven models such as ….

Setting Up your Flic

I am assuming you have one or more Flic buttons. Go ahead and set up your Flic. This is a very simple and straightforward step if you follow the guidance on the flic.io site.

As I was setting up the single Flic I had with the iOS app (this equally applies to Android), I named my Flic as shown below. Make sure to give a unique name to each of your Flics.

image

Microsoft Flow App Install and Setup

On your mobile device, install the Microsoft Flow app from your app store. Once the Flow app is installed, sign in to it with the same account as your Azure account. The email account for the Flow app and the Flic app above don't need to be the same, but remember which email you are using in each app to ensure the connectivity between them.

Setting Up Flic to do Flow tasks

Now let's set up your Flic to call the Flow. Click on your Flic in the mobile app (not the physical Flic button). For each event (Click, Double Click, and Hold), assign Microsoft Flow from the Advanced category; your assignment should look like the below. (Something Flic could do better: allow this trigger integration to be named, as it will be listed simply as "Flic" in Microsoft Flow later.)

image

Azure Queue

From your Azure account, log in to the Azure portal and create a general-purpose storage account; you can dedicate a separate resource group to it.

image

In this storage account, under Queues, create a new queue. I used a name such as single-clicks, since I would write only single-click events to this queue.

image

image

Likewise, I created a separate queue for each remaining type of event: double-click and hold.

image

Building Microsoft Flow

Now run Microsoft Flow from the mobile app, or browse to flow.microsoft.com on a PC; you can build the flow in either place with the same consistent interface. Below I will show the mobile app for simplicity.

From your Microsoft Flow app, choose to create a new flow, then search for 'flic'; you will see the two triggers below. Choose When a Flic is pressed.

image

Choose any; this step just establishes the body of the incoming message from the Flic.

image

Next, add a Parse JSON action to parse the incoming message as a JSON object, and add the body as the Content. Then provide a sample payload such as {“body”:”text”} to auto-generate the schema.
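For reference, pasting a sample payload of {“body”:”text”} into the "Use sample payload to generate schema" option should produce a schema along these lines (a sketch; the generator's exact output may differ slightly):

```json
{
    "type": "object",
    "properties": {
        "body": {
            "type": "string"
        }
    }
}
```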

image

image

image

Next, let's add a Switch so that we can handle each click type and write to the respective Azure queue:

image

Next, choose Click type as the On condition of the Switch.

image

Choose the action Put message to queue.

image

image

Choose the single-click queue to write the single-click event message.

image

image

Likewise, add additional switch cases to write the double-click event to the double-click queue and the hold event to the hold queue.
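To make the switch logic concrete outside of Flow, here is a small Python sketch of the same routing decision. It assumes the parsed trigger body carries the click type as a string (the exact values Flic sends may differ from the ones used here); the queue names match the ones created earlier. This is illustrative only and does not call any Azure API:

```python
# Map each Flic click type to the Azure queue it should be written to.
# Queue names match the ones created earlier; the click-type strings are
# assumptions for illustration, not Flic's documented values.
QUEUE_FOR_CLICK_TYPE = {
    "click": "single-clicks",
    "double_click": "double-click",
    "hold": "hold",
}

def route_flic_event(payload: dict) -> str:
    """Given a parsed Flic trigger payload, return the target queue name."""
    click_type = payload["body"]
    try:
        return QUEUE_FOR_CLICK_TYPE[click_type]
    except KeyError:
        raise ValueError(f"Unknown click type: {click_type!r}")
```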

image

image

image

image

Now let's save the flow and test the process with the following clicks, observing the corresponding queue in Azure.

click

image

double click

image

hold

image

If you do not see the expected message, check that your flow is working properly.

Saturday, September 9, 2017

Understanding and Deploying Azure IoT Hub Device Provision Service-Part-I

Overview

Let's get some context around what this blog is about, and who should care and why. (This is in reference to the Azure IoT Hub Device Provisioning Service.)

Suppose you are a manufacturer of devices, machines, appliances, or anything else that needs to be connected and that your customers or consumers buy; that is, you are producing hundreds, thousands, or millions of these products. When the end consumer gets your product, you want it to connect to your IoT listening endpoint in the cloud (such as Azure IoT Hub), in a safe and secure way, to the nearest geographical IoT endpoint, and you want all of this to happen seamlessly. Seamlessly meaning without you encoding, at manufacturing time, a device ID/key provided by your IoT cloud endpoint into your product. If so, the Azure IoT Hub Device Provisioning Service (DPS) is something you should look at seriously, and likewise all my global manufacturing customers are! This is what we call, semantically, "zero-touch provisioning".

How would DPS work for you?

You build your products with TPM/HSM support (see IoT Hub Device Provisioning Service security concepts). You ship these products to your customers across the globe. After receiving the product, your customer turns it on for the first time. The product calls home (to the cloud IoT Hub Device Provisioning Service). This is where the magic happens. The Device Provisioning endpoint validates your product by its TPM/HSM ID/keys. Based on the geo location and on the enrollment options you have pre-configured in DPS, DPS figures out which IoT Hub is the best fit, from among the IoT Hubs you have already created across the geo locations where you support your customers.

How can I test DPS quickly without doing anything to a product?

At the time of writing this blog, the DPS service was in preview. I am going to reference these two articles in sequence, share my experience of an actual build-out, provide clarity on the steps, and offer some insights on the results. What we are going to build, as the 2nd article suggests, is a simulation of what your product would do when communicating with DPS.

  1. Set up the IoT Hub Device Provisioning Service (preview) with the Azure portal 
  2. Create and provision a simulated device using IoT Hub Device Provisioning Service 

I won't repeat what has already been documented in the above referenced articles, so I suggest you read the articles first and reference this blog for clarity on the steps. Continue with the steps in each article and compare with this blog to make sure you get your steps correct. I have outlined each major section to give you context on which section my guidance covers.

The steps described in article 1, Set up the IoT Hub Device Provisioning Service (preview) with the Azure portal, are straightforward, but here are the names I used, for reference in the future steps:

image

image

Steps described in article 2, Create and provision a simulated device using IoT Hub Device Provisioning Service

Prepare the development environment

In this section, under step 7 where you are going to run Simulator.exe, open another Git Bash command prompt; here is my actual run with the corrected path (at the time of preview).

image

The run will prompt for port opening in the Windows Firewall. Click on Allow access.

image

Create a device enrollment entry in the Device Provisioning Service

Step 1 opening the solution:

image

image

image

image

In the build output you will notice 1 failed; this is because one of the projects is marked not to build, so your results are good.

Step 2: Right-click the tpm_device_provision project and select Set as Startup Project.

clip_image001

clip_image002

Run the solution

clip_image003

The output window displays the Endorsement Key and the Registration ID needed for device enrollment.

image

Note down these values

image

Press Enter to continue; the runtime command window closes.

Simulate first boot sequence for the device

Step 2. In Visual Studio on your machine, select the sample project named dps_client_sample and open the file dps_client_sample.c

clip_image001[4]

clip_image002[4]

Step 3. Assign the ID Scope value to the dps_scope_id variable. Notice that the dps_uri variable has the same value as the Global device endpoint.

clip_image003[4]

Step 4. Right-click the dps_client_sample project and select Set as Startup Project. Run the sample

clip_image004

clip_image005

Notice the messages that simulate the device booting and connecting to the Device Provisioning Service to get your IoT hub information.

Now, this is what your actual product would do when it calls home for the first time, when the customer turns it on. Let's review this sequence of operations and understand the handshake between your device and DPS.

In the first communication, your device establishes a connection with the global DPS service. This global DPS endpoint is itself geo-distributed, with a single globally unique name, making your device provisioning code consistent.

 image

Next, your device sends your TPM/HSM attributes: in this case, since we chose to simulate a TPM, a registrationId, the protocol name “tpm”, and the associated TPM endorsementKey.

image

Next, you will notice that DPS sends a 401 status requesting authorization, along with a new DPS-provided unique authorizationKey.

image

Next, your device follows the instructions and sends the SAS key.

image

At this point DPS responds, acknowledging the authorization with a 202 status code, and sends a status indicating that DPS is now “assigning” the device to an IoT Hub.

image

Once DPS has successfully allocated an IoT Hub, your device receives a status code of 200. DPS registers/adds your device in the corresponding IoT Hub (“assignedHub”), thereby acquiring a new device ID (an auto-generated name; you should be able to provide a specific one) and all the other registration signatures. Next you should see the device start communicating directly with the allocated IoT Hub.
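To summarize the handshake, here is a small Python sketch that models the sequence of status codes above as a toy state machine. This is purely illustrative: the real exchange is performed by the DPS client SDK over the wire, and the FakeDps class below just replays canned responses shaped like the captured ones (field names such as assignedHub come from the traces above; the concrete values are made up):

```python
# Toy model of the DPS first-boot handshake observed above.
# Not real SDK code; it only mirrors the sequence of status codes.

def provision(device, dps):
    # 1. Device presents its TPM attributes (registration ID, endorsement key).
    resp = dps.register(device["registrationId"], device["endorsementKey"])
    assert resp["status"] == 401          # DPS challenges for authorization
    # 2. Device answers the challenge with a SAS token derived from the key.
    resp = dps.authorize(resp["authorizationKey"])
    assert resp["status"] == 202          # DPS accepts and starts "assigning"
    # 3. Device polls until an IoT Hub has been allocated (status 200).
    while True:
        resp = dps.poll()
        if resp["status"] == 200:
            return resp["assignedHub"], resp["deviceId"]

class FakeDps:
    """Canned responses reproducing the captured handshake (values made up)."""
    def __init__(self):
        self._polls = 0
    def register(self, reg_id, endorsement_key):
        return {"status": 401, "authorizationKey": "fake-key"}
    def authorize(self, key):
        return {"status": 202, "operationStatus": "assigning"}
    def poll(self):
        self._polls += 1
        if self._polls < 2:
            return {"status": 202, "operationStatus": "assigning"}
        return {"status": 200, "assignedHub": "myhub.azure-devices.net",
                "deviceId": "auto-generated-id"}
```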

image

Let's go to that corresponding IoT Hub in the Azure portal. Voila! You should see your device registered.

image

This completes all the steps in the 2nd article.

Conclusion

We just simulated provisioning a single device and walked through the end-to-end flow of communication between your device and the cloud. Your next question might be that we didn't actually have more than one device, nor more than one IoT Hub. Sure; read the next article, Provision devices across load-balanced IoT hubs, and I will follow up with a blog on Experimenting and Extending DPS with multiple simulated devices and multiple IoT Hubs, Part II (coming soon…).

Thursday, August 31, 2017

Dissecting and Experimenting to Extend the IoT Edge Gateway-Part II

Overview

This blog is an extension of Developing Azure IoT Edge, Part I. Picking up from the Conclusion section of Part I, the goals for this blog are to explore the following objectives:

  • Exploring SensorTag capabilities
  • Providing clarity on the telemetry data by dissecting the IoT Edge code base.
  • Connecting this telemetry output to do Hot Path/Warm Path/Cold Path analytics.
  • Connecting additional SensorTags to the same Raspberry Pi IoT Edge Gateway

Exploring SensorTag Capabilities

Before we dive into the telemetry data, let's first take a look at the sensor device itself to better understand what we are reading from it. My SensorTag model number is CC2650. This sensor packs quite a punch with the list of sensors it has!

It has 10 low-power MEMS sensors. Imagine all the interesting IoT features you could implement with this cost-effective tiny sensor!

  1. light
  2. digital microphone
  3. magnetic sensor
  4. humidity
  5. pressure
  6. accelerometer
  7. gyroscope
  8. magnetometer
  9. object temperature (Make note of this)
  10. ambient temperature (Make note of this)

clip_image001

Understanding the Telemetry Data from IoT Edge

In the output of running IoT Edge in the Pi terminal, you see the output below under the column Telemetry Data. Let's understand how this is configured, where this data comes from, and what it means.

Let's explore the following C code base, which is responsible for the above display of the telemetry: https://github.com/Azure/iot-edge/blob/d2c251d76a231eff3af4ffc4b854031dc9cacdde/samples/ble_gateway/ble_printer/src/ble_printer.c

For each of the GATT UUIDs in your configuration file, the gateway is configured to read the first 7 characteristics once, and to read the temperature every 1 second, which is the telemetry output you see.
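As a sketch, the relevant instructions block in gateway_sample.json looks roughly like this (the characteristic UUIDs and exact key names are from my recollection of the sample, so check your own file for the precise values): the read_once entries cover the static device-information characteristics, and the read_periodic entry polls the IR temperature characteristic every 1000 ms:

```json
"instructions": [
    { "type": "read_once",     "characteristic_uuid": "00002A24-0000-1000-8000-00805F9B34FB" },
    { "type": "read_periodic", "characteristic_uuid": "F000AA01-0451-4000-B000-000000000000",
      "interval_in_ms": 1000 }
]
```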

  • Configuration: from gateway_sample.json
  • GATT Characteristic IDs: from ble_printer.c
  • Telemetry Data: from the IoT Edge run

image image

clip_image001[9]

The temperature telemetry displays the raw ambient temperature read from the IR sensor, and the raw object temperature.
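If you want to turn those raw values into degrees, the CC2650's IR temperature characteristic packs the object and ambient readings as two 16-bit little-endian values; per TI's SensorTag documentation, dropping the two low status bits and scaling by 0.03125 yields degrees Celsius. A quick Python sketch (double-check the conversion against TI's user guide for your firmware version):

```python
import struct

def decode_ir_temp(raw: bytes):
    """Decode a 4-byte CC2650 IR temperature reading.

    Returns (object_temp_c, ambient_temp_c). Assumes the TI-documented
    format: two unsigned 16-bit little-endian values, each shifted right
    by 2 and scaled by 0.03125 to get degrees Celsius.
    """
    obj_raw, amb_raw = struct.unpack("<HH", raw)
    scale = 0.03125
    return (obj_raw >> 2) * scale, (amb_raw >> 2) * scale
```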

Understanding the Telemetry Data from monitor-events

Below is the iothub-explorer tool running in monitor-events mode, displaying each message received from the connected devices (these are referred to as D2C messages). What you see in each D2C message is the BLE device index, the timestamp, the UUID, and the source, but no telemetry data!

From the above telemetry configuration, consider the following:

image

In the above, the --- properties --- section is the actual raw telemetry sent by the SensorTag to the Pi. This is raw data, not JSON, and there is no actual temperature value; the UUID is the temperature signature. More on this later if I am able to discover, from the code base, how to fix this to send well-formed JSON. I will update the Conclusion section with the solution in a later blog.

Connecting this telemetry output to do Hot Path/Warm Path/Cold Path analytics

Let me quickly define each of these IoT Analytics approaches:

  • Hot Path: As messages come into IoT Hub, you want to perform your analytics as quickly as possible, since these are time-sensitive scenarios. That means higher-cost resources to get the quickest results are acceptable.
  • Warm Path: As messages come into IoT Hub, you want to perform your analytics soon enough that a business-defined delay or lag in determining the results is acceptable. That means moderate-cost resources getting results within that business-defined time lag are acceptable.
  • Cold Path: You want to perform your analytics on historical data to derive insights. That means the lowest-cost resources to derive the insights are acceptable.

My narrative here is to distinguish the analytics approaches in the IoT pipeline. Your definitions may differ.

Let's review how we can begin to approach Hot Path analytics so that we can visualize the data in real time in a visualization tool. Following the guidance of an existing article, Visualize real-time sensor data from Azure IoT Hub using Power BI, let's create the necessary resources; likewise, I am sharing my results under each of the sections from the article.

Add a consumer group to your IoT hub

clip_image001[1]

Create a Stream Analytics job

image

Add an input to the Stream Analytics job

clip_image001[3]

clip_image001[5]

Add an output to the Stream Analytics job

clip_image001[7]

clip_image002

clip_image003

clip_image004

clip_image005

clip_image006
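Between the input and output, the Stream Analytics job also needs a query. A minimal pass-through query, assuming the input and output aliases you chose were IoTHubInput and PowerBIOutput (substitute your own alias names), would look like this:

```sql
SELECT
    *
INTO
    [PowerBIOutput]
FROM
    [IoTHubInput]
```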

Run the Stream Analytics job

clip_image007

Now, as you will notice, there is a warning sign on the input that we created; let's explore further by clicking on the input.

clip_image008

The warning clearly states the issue. As mentioned in the Part I blog, the messages that are sent to IoT Hub are not in JSON format.

clip_image009

Let's take this issue to the Conclusion section and see how we can resolve it. Again, I will update the Conclusion section with the solution in a later blog.

Connecting additional SensorTags to the same Raspberry Pi IoT Edge Gateway

The fundamental idea of a gateway in IoT scenarios is that the device or thing playing the gateway role can connect to many things and act as a gatekeeper for talking further with the cloud. In Part I we connected a single SensorTag. Here, let's explore how we can connect a second SensorTag device.

I had a bunch of old SensorTag CC2541 devices; these are 1st generation and no longer sold, but for the purpose of this experiment they work fine.

Office Lens 20170830-110933

First, ensure that you are not running IoT Edge; if it is running, stop it.

From the Pi SSH session, let's start a Bluetooth shell, try to connect to this 2nd SensorTag, and find out its MAC address:

image

Let's power on the 2nd SensorTag.

image

image

Note the MAC: 78:A5:04:8C:13:98.

Enter scan off to stop.

image

Now let's connect to this SensorTag in our Bluetooth shell.

image

Let's disconnect from the device using the disconnect command and then exit the Bluetooth shell using the quit command:

image

image

Now let's add the 2nd SensorTag device to the IoT Hub. Navigate to the Azure portal, go to your IoT Hub's Device Explorer, add the new SensorTag, and make note of the new Primary Key for SensorTag_02.

image

Now let's take the original SensorTag sample configuration file, rename it to gateway_Sensor1.json, and test to ensure the configuration works.

 image

Copy gateway_Sensor1.json to gateway_Sensor2.json, then update the MAC address for sensor two and save.

image

image

Now run the gateway for sensor two to test and ensure it works; make sure you have sensor two powered on before the run:

image

Now let's create a 3rd configuration file that has both sensors. Copy the Sensor1 file to gateway_Extend.json, then add the 2nd SensorTag device to the configuration file via the nano editor from your SSH session:

image

Locate the mapping section. Notice that under args, where we had our first SensorTag entered, I have added our second SensorTag, comma-separated, with its respective MAC address and its Primary Key as the deviceKey.

image

Next, locate the section below.

image

Copy and paste an existing “controller_index” section along with its instructions, separated by a comma (like the cursor's green highlight below), then update the controller_index to 1 and replace the respective MAC address.

image

image
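Putting the two edits together, the extended controller section of gateway_Extend.json should look roughly like this. Treat this as a sketch of the shape only: the first MAC address is a placeholder for your first SensorTag's address, the key names are from my recollection of the sample, and the instructions arrays are elided.

```json
[
    {
        "controller_index": 0,
        "device_mac_address": "AA:BB:CC:DD:EE:FF",
        "instructions": [ ... ]
    },
    {
        "controller_index": 1,
        "device_mac_address": "78:A5:04:8C:13:98",
        "instructions": [ ... ]
    }
]
```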

Save the configuration with Ctrl+O and exit with Ctrl+X. Let's copy the contents of the file and test to ensure the JSON is well formed. I used https://jsonformatter.curiousconcept.com/ to test.

Now run the IoT Edge gateway from the Pi. From the error message below, it looks like even though the configuration is well-formed JSON, the JSON parsing modules are not able to read the configuration sections where we extended it with the second BLE controller.

image

Again, I will update the Conclusion section with the solution in a later blog.

Conclusion

In conclusion, I expect to gain more clarity on the SDK so that I can perform the above tasks: correcting the telemetry message so that it is well-formed JSON, and adding more than one SensorTag while running IoT Edge successfully.

As I discover more, I will update here with a next set of blogs to shed light on these topics.