Azure Logic Apps Preview Refresh: Moving a v1 app to v2

The Microsoft Integration team (BizTalk/Logic Apps) recently hit an important milestone: the launch of the Logic Apps Preview Refresh. You can find the original press release here.

The purpose of this post is to highlight some of the recent investments and then take an existing v1 Logic App and rebuild it in v2.

New Designer

Probably the biggest feature released was the new designer experience. As opposed to the left-to-right workflow we saw in v1 of Logic Apps, we now find ourselves moving top-down. I think for most BizTalk people, moving top-down is more natural.


In addition to top-down, we also have a search/IntelliSense-type experience when looking for new shapes (connectors). It is a neat way to approach the problem, as a canvas that contains pages and pages of connectors isn’t the best user experience.


Intelligent Output

Another benefit is that we are not as dependent on the Logic Apps Workflow Language. It is still there, but Microsoft has abstracted much of it away from us. If we want to use a value from a previous step in our workflow, we can just select it. A great addition!


This is very much welcomed over the v1 equivalent: A new table has been created @{body('wearsy.servicenow.apiapp').TableName}


Another investment has been in the way you add an action or a condition. No longer is a condition ‘hidden’ at the card/connector level. It is now very explicit, easy to read and natural.


Moving the furniture

With v1 of Logic Apps, once you had a card on the canvas you were committed. Rearranging the cards was only possible by editing the JSON in the code-behind, which at times was painful, so this is a great addition. You can now move cards back and forth as long as you are not breaking dependencies.



Native Webhook support

Logic Apps now supports webhooks, which allow developers to build ‘callback’ interfaces over HTTP. Many services support webhooks as a way to extend their offering, including Visual Studio Online, GitHub, Stripe and PayPal, to name a few. An example the Logic Apps team likes to use is calling a Logic App from Visual Studio Online when code is committed to your master branch, as a way to kick off other downstream processes.

Managed APIs

In v1 of Logic Apps, there was always a provisioning exercise when you wanted to use a new connector/API App. This added delays to your dev experience and also forced you to create multiple instances in the event you had different connection strings. Now, Microsoft has provisioned a set of Managed API connections, which means there is zero delay when using out-of-the-box API connections.

This does have an impact on existing v1 API Apps that you have created, and that is one of the main reasons for this post. Since a custom v1 API App is not in this Managed API list, it will not be immediately discoverable within Logic Apps. (This is bound to change, as we are still in preview and Microsoft is working on it.) But since v1 API Apps were decorated with Swagger metadata, we can take advantage of the Http + Swagger connector in order to consume our existing Swagger “contract”.
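
For context, the Swagger metadata on a v1 API App is typically produced by Swashbuckle in the underlying ASP.NET Web API project. A minimal sketch of that configuration is below; the title is illustrative and not my actual connector’s:

using System.Web.Http;
using Swashbuckle.Application;

public static class SwaggerConfig
{
    public static void Register()
    {
        // Expose the Swagger document (by default at /swagger/docs/v1) plus a browsable UI
        GlobalConfiguration.Configuration
            .EnableSwagger(c => c.SingleApiVersion("v1", "ServiceNow Connector"))
            .EnableSwaggerUi();
    }
}

It is that Swagger document URL that gets pasted into the Http + Swagger connector later in this post.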

Note: There are a few actions that you need to perform in order for your v1 API App to be discoverable. In order to avoid duplication, I will refer you to Jeff Hollan’s (Microsoft) post and Daniel Probert’s post.

As a starting point, here is my v1 Logic App. I documented the use case here, so I won’t get into a lot of detail about what it does. In summary, on a regular basis I want this Logic App to initiate, create a table in ServiceNow using my custom ServiceNow connector, and send me a text message when it is complete.


With my configuration set as described in those previous blog posts I am going to create a new logic app.

  • The first step that I want to perform is to add the Recurrence card to my Logic App. I will then set a frequency. In this case it is set to one minute, but I will revert it back to one week after I am done testing. It is interesting that Microsoft has added a time zone field and a start time field.


  • Next, I will add my Http + Swagger connector. When you think about it, this is actually a very powerful feature. Swagger (recently renamed OpenAPI) is the leading API description format, used by IBM, Amazon, Apigee and many others. What this means is that any API that has Swagger metadata can now be “plugged” into Logic Apps. This has already provided many benefits. For example, the Microsoft Cortana Analytics team has been including Swagger metadata in their APIs, so Logic Apps can now plug them in and developers do not have to worry about connectivity or write their own wrappers. This is a big win!


  • With my Swagger URL in my clipboard, I can paste it into the Endpoint URL text box and click the Next button.


  • Logic Apps will now discover all of the different operations that are available for me to use in my Logic App.


  • For the purpose of this post, I will use the Table_PostByValue operation. There are four attributes that need to be set. Since I want my table name to be dynamic, I will use the Logic App Workflow Language functions. I did run into a bit of a bug with this one: if you need to use the @utcnow() function, set it in the code-behind. The Product Team is aware and will be addressing this issue.


  • After configuring my ServiceNow API App, I now want to send a text message using the v2 Twilio API App. I can easily add it by starting to type the word “Twilio”.


  • I previously had a Twilio account so I needed to log into their website to obtain my credentials.


  • Next, I need to provide my Twilio phone number and the phone number where I would like to receive a text. I also want to provide some text in the message and include the name of the table that was just created. Because of Swagger, Logic Apps knows that the ServiceNow API App will return the name of the table that was created, and as a result I can simply click on that item and have it added to my message body.


  • After all my configuration is complete, this is what my Logic App looks like.  Pretty clean!!!


  • The end result when testing is that I have a new table created and receive a text message. Note that my ServiceNow instance is in the Pacific region, which accounts for the one-hour timestamp delta.




While Logic Apps is still in preview, we can see that the Microsoft team is working hard and is focused on providing continuous value to customers. While there are some features I would like to see included, this was a good release and a step in the right direction.

Protecting your Azure Event Hub using Azure API Management

We are currently embarking on an Event Hub project where we will be processing device “reads” on a frequent basis. You can consider the use case to be in the Industrial IoT space, but the difference is we do not have to manage the devices. Without getting into a lot of detail, we have aggregators responsible for that. For this reason we decided not to pursue the Azure IoT Suite, as we really just need Event Hub capabilities.

We will not have a lot of publishers, but the question did come up: how can we protect, or restrict, where traffic is coming from, and how do we manage keys to our service? Event Hubs currently does not have the ability to whitelist a set of IP addresses. This implies that someone could take your SAS key and potentially ‘pollute’ your Event Hub from a location that has no business publishing to it.

Another option is to issue SAS tokens, which do have expiration timestamps attached to them. This does not control the location from which a publisher pushes events to your Event Hub, but it does ensure that if a key leaks, its TTL (time to live) reduces the risk. I am going to continue to explore this path in a later post or talk.

But for the purpose of this post, I am going to focus on API Management (APIM) + Event Hubs to see what we can do. In a previous post, I spoke about using Azure API Management policies to limit where an API can be called from. We will leverage that post in the interest of keeping this one short(er).

The first thing we need to understand is the mechanics of calling an Event Hub using HTTP. The Service Bus team does prefer AMQP, and for good reason (see the Publishing an event section), but coming back to the whitelisting requirement, we are left with HTTP. In order to call an Event Hub via HTTP, there are a few things we need to set up and collect:

  • Create a Service Bus namespace
  • Create an Event Hub
  • Create a Shared Access Policy

    In this case we will create a Send Access Policy. We do not want to provide unnecessary privileges to our publisher, such as the ability to consume from our Event Hub.


  • Generate a SAS token. To accomplish this feat, I used a tool by Sandrino Di Mattia which is available here. When you run the tool, you will be prompted for some of your Service Bus information, including the namespace, Event Hub name, an arbitrary publisher name and details from the SAS policy you just created. Once you provide this information and click on the Generate button, you will have a SAS token based upon the TTL that you specified. Copy this token, as we will need it in future steps. (A sketch of what the tool does under the hood follows this list.)
  • We can now assemble the URL for our Event Hub, which takes the form https://{NAMESPACE}.servicebus.windows.net/{EVENTHUB-NAME}/publishers/{PUBLISHER-NAME}/messages, based upon the following information:

NAMESPACE: your Service Bus namespace
EVENTHUB-NAME: the name of your Event Hub
PUBLISHER-NAME: this is somewhat arbitrary. What I used here was the name of my Shared Access Policy. Make sure to use the same value that you used to generate the SAS token.
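
For those curious, here is a rough C# sketch of what such a tool does under the hood to generate a Service Bus SAS token. The resource URI, policy name and key in the usage comment are placeholders:

using System;
using System.Net;
using System.Security.Cryptography;
using System.Text;

public static class SasTokenHelper
{
    public static string CreateSasToken(string resourceUri, string keyName, string key, TimeSpan ttl)
    {
        // Expiry is expressed as seconds since the Unix epoch
        long expiry = Convert.ToInt64((DateTime.UtcNow - new DateTime(1970, 1, 1)).TotalSeconds + ttl.TotalSeconds);

        // Sign the URL-encoded resource URI plus the expiry with the Shared Access Policy key
        string stringToSign = WebUtility.UrlEncode(resourceUri) + "\n" + expiry;
        using (var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(key)))
        {
            string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
            return string.Format("SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
                WebUtility.UrlEncode(resourceUri), WebUtility.UrlEncode(signature), expiry, keyName);
        }
    }
}

// Placeholder usage:
// string token = SasTokenHelper.CreateSasToken(
//     "https://mynamespace.servicebus.windows.net/myeventhub/publishers/mypublisher",
//     "SendPolicy", "<Shared Access Policy key>", TimeSpan.FromDays(7));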


We can now test our ability to send a message to our Event Hub over HTTP using Postman. To do so, we will need the SAS token that we just generated and our URL.

To call the Event Hub endpoint, we need to create an Authorization header and populate it with our SAS token.


Following that, we can provide our URL and the message body that we want to send to Event Hubs. After clicking the Send button, we will see that we get back an HTTP 201 (Created) status code.
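
If you prefer code over Postman, a minimal C# equivalent of this test might look like the following; the namespace, Event Hub, publisher name, SAS token and payload are all placeholders:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public class SendToEventHub
{
    public static async Task Main()
    {
        // Placeholder values – substitute your own namespace, Event Hub, publisher and SAS token
        string url = "https://mynamespace.servicebus.windows.net/myeventhub/publishers/mypublisher/messages";
        string sasToken = "SharedAccessSignature sr=...&sig=...&se=...&skn=SendPolicy";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.TryAddWithoutValidation("Authorization", sasToken);

            var body = new StringContent("{ \"DeviceId\": \"dev-01\", \"Reading\": 42 }", Encoding.UTF8, "application/json");
            HttpResponseMessage response = await client.PostAsync(url, body);

            // Event Hubs returns 201 (Created) when the event is accepted
            Console.WriteLine((int)response.StatusCode);
        }
    }
}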


Azure API Management

We now know we can use HTTP to populate our Event Hub. With this done, let’s configure our API Management instance. I am not going to go into a lot of detail on how to set up Azure API Management; I will refer you to a previous presentation video and tutorial on the subject.

Within Azure API Management we want to do the following:

  • Create an API. In order to do this we will need the Event Hub URL that we used in Postman.


  • Create an Operation (and metadata). In this case there is no URL rewriting that needs to be performed; our requests will simply be passed through.


  • Apply our policies to the Operation. In this case we will add two policies:
    • IP Filter where I will include my IP Address and set action = “allow”
    • Set Header where I will provide my SAS Token that will allow my API Management instance to talk to Event Hubs.


  • Create Product


    • Add API to Product


    • Publish Product
    • Assign Access to Product



Since we are using IP whitelisting, I will make another call using Postman, but this time I will send it through the API Management proxy.

In order to call our endpoint through API Management, we will need our API key, which is available from the Azure portal. We can access this information from the Users menu, then click on the Show link and copy our key.


In Postman, we will want to add a request header called Ocp-Apim-Subscription-Key and provide the API key that we just copied.


Next, we can provide our URL and message body, and click the Send button.


      As you can see we have a successful publish through APIM and receive an HTTP 201 back.
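
From code, the only difference when going through APIM is which header carries the credential: the subscription key instead of the SAS token. A minimal sketch follows; the gateway URL and key are placeholders:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public class SendViaApim
{
    public static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // The SAS token is now injected by the set-header policy inside APIM,
            // so the caller only needs its APIM subscription key (placeholder values below)
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<your APIM subscription key>");

            var body = new StringContent("{ \"DeviceId\": \"dev-01\", \"Reading\": 42 }", Encoding.UTF8, "application/json");
            HttpResponseMessage response = await client.PostAsync("https://myapim.azure-api.net/events/messages", body);

            // Expect 201 when the call is allowed, 403 Forbidden when the caller's IP is not whitelisted
            Console.WriteLine((int)response.StatusCode);
        }
    }
}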


To ensure our IP whitelist is working, let’s try to call our service from the API Management Developer’s Console. As expected, we get back an HTTP status code of 403 Forbidden.


      Azure Stream Analytics–Querying JSON Arrays

I have been learning Stream Analytics recently and ran across a problem that I couldn’t find any good examples of how to solve, so I figured I would post my solution.

I fully expect to expand on this scenario in the coming months, but for now I will keep it light. What I am doing is getting device reads off of an Azure Event Hub. These reads are being aggregated on the publisher side and placed into a single message/event. Since the publisher is creating a message structure that contains many device reads for a specific interval, I wanted to ensure I can process each element in the array within my Stream Analytics query.

      My message payload looks like this:

{
  "interchangeID": "94759e00-b7cf-4036-a2a5-827686caace2",
  "processType": "RT",
  "tagDetails": [
    { "tagName": "TAG1", "tagTimestamp": "15-Jan-2016 12:47:30", "tagValue": 2.756858951, "tagQuality": "524481" },
    { "tagName": "TAG2", "tagTimestamp": "15-Jan-2016 12:47:30", "tagValue": 2.756858952, "tagQuality": "524482" },
    { "tagName": "TAG3", "tagTimestamp": "15-Jan-2016 12:47:30", "tagValue": 2.7568589533, "tagQuality": "524483" },
    { "tagName": "TAG4", "tagTimestamp": "15-Jan-2016 12:47:30", "tagValue": 2.7568589534, "tagQuality": "524484" }
  ]
}


      From a Stream Analytics perspective, here is what my query looks like:

      SELECT tagDetails.ArrayValue AS tag
      FROM inputeventhub AS e
      CROSS APPLY GetArrayElements(e.tagDetails) AS tagDetails

      When I execute my query, my result looks like this:


The key to making this query work is the CROSS APPLY operator. MSDN describes the APPLY operators as follows: “The APPLY operator allows you to invoke a table-valued function for each row returned by an outer table expression of a query. The table-valued function acts as the right input and the outer table expression acts as the left input. The right input is evaluated for each row from the left input and the rows produced are combined for the final output. The list of columns produced by the APPLY operator is the set of columns in the left input followed by the list of columns returned by the right input.

      There are two forms of APPLY: CROSS APPLY and OUTER APPLY. CROSS APPLY returns only rows from the outer table that produce a result set from the table-valued function. OUTER APPLY returns both rows that produce a result set, and rows that do not, with NULL values in the columns produced by the table-valued function.”

Then, as part of the SELECT statement, we are able to iterate through each value in the array, which gives us our columns.

      Stay tuned for more info on Event Hubs and Stream Analytics as I continue to get more familiar with them.  One thing I have learned early is that they are very powerful together.

      Also, as an additional resource, I encourage you to visit the MSDN Complex Data Types post which will give you some examples of other more complex scenarios.

      Azure Logic Apps Automation

Lately I have been building my own SaaS connector for a well-known SaaS vendor. This vendor (which will remain nameless for this post) provides free dev instances for developers to play with. There is a limitation, though: if you do not create an object in their system within 10 days, your instance will be revoked. My interest in the platform has been very much related to calling their APIs, and unfortunately calling their APIs does not count as ‘usage’. To work around this, I have been manually logging into the service and creating a dummy table, which resets the counter.

      I have been meaning to automate this function and a recent hackathon gave me the opportunity to do so.  I extended my SaaS connector to include a Table entity and the ability to create a new Table based upon some parameters via their API.

Recently, I have been seeing more and more people automating simple tasks in Logic Apps, so I figured why not. I can use the Recurrence trigger in Logic Apps, generate some unique parameters using the expression language, and then create my table once a week using a unique name. This saves me from manually performing this same action.

      To add some ‘bells and whistles’ to my Logic App, I added the Twilio connector so that I can receive a text message whenever this new table is created.  Perhaps in the future I will also automate the clean-up of these tables.

Here is my Logic App – I call it Vakna (which means “awake” in Swedish – I was sitting beside my Swedish buddies when I built it).


      Azure API Management–As An Agility Layer

I have a POC that I recently built. The initial driver was building SaaS connectivity for ServiceNow. Currently there is not an out-of-the-box Azure connector for ServiceNow. This gave me the opportunity to get my hands dirty and build my own custom API App where I can expose ServiceNow operations.

      Solution Purpose

The purpose of the solution is that you may have a Plant Operations employee who needs to perform an ‘Operator Round’. An Operator Round has a few different purposes:

      • To get equipment meter reads that are not hooked up to the SCADA system
      • To perform an inspection on equipment in the plant

Some assets may be managed by a Plant Operations team whereas some assets (cyber assets) may be managed by an IT team. When a cyber asset requires maintenance, a ticket needs to be created in ServiceNow. But when we create this incident, we want to ensure that we are using the SAP master data for the asset, as SAP is the system of record for plant assets. We need to track this information from a regulatory standpoint, but we also want to make sure that the technician performing the work is working on the correct piece of equipment.

      Below is a high-level architecture diagram of the solution.


I started to build out a mobile solution that encompassed the Logic Apps and API Apps components, as that was the focus of my POC. I quickly realized there were some real benefits in leveraging Azure API Management. In this case, provisioning Azure API Management was ridiculously simple: with my MSDN account in hand I provisioned my own instance, and about 15 minutes later I was ready.

      Azure API Management Benefits

      The drivers for using Azure API Management include:

• Security – Even though this is a POC, I am using real data from an SAP DEV system and wanted an additional layer of security, as I am exposing the solution through a mobile app.
• Performance – Based on the nature of some of my data (master data), I knew I could improve the user experience by caching frequent requests. For example, if I want to provide a list of Assignment Groups from ServiceNow for a user to select from, that list is not going to change very frequently. Why would I hit ServiceNow every time I needed to select an Assignment Group? Let’s cache this data and provide the user with a 20 millisecond experience instead of a 2 second one.
• Analytics – It is always great to see where your app is performing well and not so well. This leads me to my next opportunity.

ServiceNow has a very modern and rich API. I give them a lot of credit; it is one of the best I have seen. Exposing my ServiceNow API operations through API Management was straightforward, and the caching worked great.

However, when it came to SAP it was a very different experience. SAP uses more of an RPC approach instead of a RESTful approach. The difference being that in ServiceNow I have a resource (for example, Assignment Groups) and I can perform actions against that resource such as GET, POST, PATCH and DELETE; to retrieve Assignment Groups I would perform a GET against that resource. For SAP it is very different: I need to actually send a message payload to the service in order to retrieve data. For example, getting a list of Equipment from SAP looks more like a POST than a GET.
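
To make the contrast concrete, here is a rough sketch of the two call styles; the URLs, table name and payload are illustrative placeholders rather than the actual endpoints I used:

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class CallStyles
{
    public static async Task CompareAsync(HttpClient client)
    {
        // ServiceNow (resource-oriented): a list of Assignment Groups is a GET against a resource
        HttpResponseMessage groups = await client.GetAsync(
            "https://myinstance.service-now.com/api/now/table/sys_user_group");

        // SAP (RPC-style): a list of Equipment requires POSTing a request payload to a function-like endpoint
        HttpResponseMessage equipment = await client.PostAsync(
            "https://sapgateway.example.com/EquipmentService/GetEquipmentList",
            new StringContent("<EquipmentRequest><Plant>1000</Plant></EquipmentRequest>", Encoding.UTF8, "application/xml"));
    }
}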

      You can’t cache POST Requests

By definition, a POST request implies you are creating data, which runs counter to the idea of caching a GET request. The idea behind caching a GET request is that you have static data and you do not want to hit the back-end system when there is a very high probability that the data has not changed. Since a POST implies you are creating new data, you shouldn’t be caching it, right? Here is where the problem lies: I didn’t want to be using a POST to get master data from SAP, but that is the way SAP has exposed the interface.

Since my SAP instance is buried deep within a data center, the performance was not great; there was a noticeable wait in the mobile app when it came to SAP. I could have replicated this master data in the cloud, but that seemed like a lot of synchronization and overhead, especially for a POC. What I really wanted to do was simply cache my SAP data at the API Management tier. However, since I was sending POST requests (against my will) this wasn’t possible – or was it?

      Update (Edit)

@darrel_miller reached out to me over Twitter after I posted this article, and we chatted about caching rules. It is worth adding to this post that “POST responses may be cacheable, but will only be served to a subsequent GET request. A POST request will never receive a cached response.” He has a great blog post that further describes caching rules here.


Recently, I watched an #IntegrationMonday presentation from the Azure API Management team. In this presentation, Miao Jiang talked about an upcoming feature called a Send Request policy (watch the video for more details). It was after watching this video that I got the idea to expose a GET request to the mobile application and then, within the API in Azure API Management, convert it to a POST. This way SAP continues to be happy, I expose a GET that I can cache in API Management, and my user experience improves. A win-win-win on all accounts.

So how did I do this? By simply configuring a few policies in Azure API Management.

      Click on the image for more details but in summary here is what I was able to do:

      Inbound Policies

      • Enabling Caching
      • Set my new Method to POST
      • Set a Message Body since this will now be a POST request
      • Insert a Query Parameter from Mobile App into payload so it is dynamic
      • Set my Auth Header (optional but recommended)
      • Use a Rewrite URL since I don’t want to pass a Query Parameter on a POST

      Outbound Policies

      • Convert my XML response to JSON so that my mobile app can easily digest this response.



I love that I can modify the behavior of my solution through simple configuration. You could have achieved some of these capabilities in an ESB or other middleware, but it would have been much more cumbersome. The end result is that I get a much better user experience with very little trade-off. The data that I am returning from SAP does not change frequently, so why fetch it every time if I don’t have to? In this case Azure API Management has really been an agility layer for me.

I also want to take this opportunity to thank Maxim, from the Azure API Management team, for the assistance. I started going down the path of the Send Request policy when there was a simpler way of achieving this.

      Azure Hybrid Integration Day coming to Calgary

Every year some of the brightest minds in Microsoft Integration descend upon Redmond, Washington for the Microsoft MVP Summit. This year three MVPs from Europe (Saravana Kumar, Steef-Jan Wiggers and Michael Stephenson) will be stopping by Calgary on their way to the Summit and will be giving some presentations. A local Microsoft employee, Darren King, and I will also be presenting.

      I have shared the stage with these MVPs before and can vouch that attendees are in for a unique experience as they discuss their experiences with Microsoft Azure and BizTalk Server.

      During this full day of sessions you will learn about how BizTalk and Microsoft Azure can address integration challenges. Session topics include SaaS connectivity, IoT, Hybrid SQL Server, BizTalk administration & operations and Two Speed IT using Microsoft Azure. Also bring your burning questions for our interactive Ask the Experts Q & A.

      The free event takes place on October 30th, 2015  at the Calgary Microsoft office.  You can find more details here.

      Azure Mobile Services–Update Entity results in 400 Bad Request

      While performing some local testing on an Azure Mobile Services (C# backend) app, I kept getting a 400 Bad Request whenever I tried to perform an update to an Entity:

      await MobileService.GetTable<MyEntity>().UpdateAsync(myEntity);

      I ensured that my entity object was populated correctly and that I was providing an Id as part of the request but was still getting the following error:

Microsoft.WindowsAzure.MobileServices.MobileServiceInvalidOperationException: The request could not be completed. (Bad Request)
   at Microsoft.WindowsAzure.MobileServices.MobileServiceHttpClient.<ThrowInvalidResponse>d__18.MoveNext()
— End of stack trace from previous location where exception was thrown —
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
   at Microsoft.WindowsAzure.MobileServices.MobileServiceHttpClient.<SendRequestAsync>d__1d.MoveNext()
— End of stack trace from previous location where exception was thrown —
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
   at Microsoft.WindowsAzure.MobileServices.MobileServiceHttpClient.<RequestAsync>d__4.MoveNext()
— End of stack trace from previous location where exception was thrown —
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
   at Microsoft.WindowsAzure.MobileServices.MobileServiceTable.<>c__DisplayClass21.<<UpdateAsync>b__20>d__23.MoveNext()
— End of stack trace from previous location where exception was thrown —
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
   at Microsoft.WindowsAzure.MobileServices.MobileServiceTable.<TransformHttpException>d__51.MoveNext()
— End of stack trace from previous

Not the most descriptive error, hence my decision to create this post. The end result is that within my C# backend project, I had declared an entity within my DataObjects class:

public class MyEntity : EntityData
{
    public string Id { get; set; }
}

      As soon as I removed the Id from this class my updates were successful.
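
For completeness, here is a sketch of the corrected DTO. EntityData already supplies Id (along with CreatedAt, UpdatedAt, Version and Deleted), so the class should only declare its own columns; the Name property below is just an illustrative example:

public class MyEntity : EntityData
{
    // Id is inherited from EntityData and must not be redeclared here
    public string Name { get; set; }
}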

In the end it was a silly error on my part. Within my mobile app’s model classes, you do need to declare this Id as part of the class, but that is not the case with your backend classes. I have several other entities in this project and knew that you should not declare the Id, but this one slipped through the cracks.

      It is also worth noting that during this time I could successfully insert records even though this Id was declared.