Azure Logic Apps Automation

Lately I have been building my own SaaS connector for a well known SaaS vendor.  This vendor (which will remain nameless for this post) provides free dev instances for developers to play with.  There is a limitation, though: if you do not create an object in their system within 10 days, your instance is revoked.  My interest in the platform has been very much related to calling their APIs, and unfortunately calling their APIs does not count as ‘usage’.  To work around this I have been manually logging into the service and creating a dummy table, which resets the counter.

I have been meaning to automate this function and a recent hackathon gave me the opportunity to do so.  I extended my SaaS connector to include a Table entity and the ability to create a new Table based upon some parameters via their API.

Recently, I have been seeing more and more people automating simple tasks in Logic Apps, so I figured why not.  I can use the Recurrence trigger in Logic Apps, generate some unique parameters using the expression language, and then create my table once a week using a unique name.  This saves me from manually performing the same action.
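As a sketch of what that expression might look like (the `vakna_` prefix, the timestamp format, and the `tableName` parameter name are my own choices, not the connector's actual contract), the unique name can be generated inline with Logic Apps expression functions:

```
"tableName": "@{concat('vakna_', formatDateTime(utcNow(), 'yyyyMMddHHmm'))}"
```

Because the expression is evaluated on each recurrence, every run produces a different table name without any external state.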

To add some ‘bells and whistles’ to my Logic App, I added the Twilio connector so that I can receive a text message whenever this new table is created.  Perhaps in the future I will also automate the clean-up of these tables.

Here is my Logic App – I call it Vakna, which means ‘awake’ in Swedish (I was sitting beside my Swedish buddies when I built it).


Azure API Management–As An Agility Layer

I recently built a POC.  The initial driver was building SaaS connectivity for ServiceNow.  Currently there is not an Azure ‘out-of-the-box’ connector for ServiceNow. This gave me the opportunity to get my hands dirty and build my own custom API App where I can expose ServiceNow operations.

Solution Purpose

The purpose of the solution is that you may have a Plant Operations employee who needs to perform an ‘Operator Round’.  An Operator Round has a few different purposes:

  • To get equipment meter reads that are not hooked up to the SCADA system
  • To perform an inspection on equipment in the plant

Some assets may be managed by a Plant Operations team whereas some assets (cyber assets) may be managed by an IT team. When a Cyber Asset requires maintenance, a ticket needs to be created in ServiceNow.  But when we create this incident, we want to ensure that we are using the SAP master data for this asset, as SAP is the system of record for plant assets.  We need to track this information from a regulatory standpoint, but we also want to make sure that the technician performing this work is working on the correct piece of equipment.

Below is a high-level architecture diagram of the solution.


I started to build out a mobile solution that encompassed the Logic Apps and API Apps components, as that was the focus of my POC.  I quickly realized there were some real benefits to leveraging Azure API Management.  In this case, provisioning Azure API Management was ridiculously simple: with my MSDN account in hand I provisioned my own instance, and about 15 minutes later I was ready.

Azure API Management Benefits

The drivers for using Azure API Management include:

  • Security – Even though this is a POC, I am using real data from an SAP DEV system and wanted an additional layer of security as I am exposing the solution through a Mobile App.
  • Performance – Based on the nature of some of my data (master data), I knew I could improve the user experience by caching frequent requests.  For example, if I want to provide a list of Assignment Groups from ServiceNow for a user to select from, that list is not going to change very frequently.  Why would I hit ServiceNow every time I need to select an Assignment Group?  Let’s cache this data and give the user a 20 millisecond experience instead of 2 seconds.
  • Analytics – It is always great to see where your app is performing well and not so well. This leads me to my next opportunity.
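For the Assignment Groups case, the caching amounts to a pair of policies on the operation.  Here is a minimal sketch; the one-day duration is my own assumption for slow-changing master data, not a recommendation from the product:

```xml
<inbound>
    <base />
    <!-- Serve a cached copy of the ServiceNow response when one exists -->
    <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" />
</inbound>
<outbound>
    <base />
    <!-- Cache the response for 24 hours -->
    <cache-store duration="86400" />
</outbound>
```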

ServiceNow has a very modern and rich API.  I give them a lot of credit – it is one of the best I have seen.  Exposing my ServiceNow API operations through API Management was straightforward and the caching worked great.

However, when it came to SAP it was a very different experience.  SAP uses more of an RPC approach than a RESTful one.  The difference is that in ServiceNow I have a resource (i.e. Assignment Groups) and can perform actions against that resource such as GET, POST, PATCH and DELETE.  To retrieve Assignment Groups I would perform a GET against this resource.  SAP is very different: I need to actually send a message payload to the service in order to retrieve data.  For example, getting a list of Equipment from SAP looks more like a POST than a GET.
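To make the contrast concrete: retrieving Assignment Groups from ServiceNow is a plain GET against the Table API, whereas the SAP lookup pushes the query details into the request body.  The SAP service path and payload below are illustrative only, not the actual interface:

```
# ServiceNow: resource-oriented; the URL names the data
GET /api/now/table/sys_user_group

# SAP: RPC-style; the payload describes what to fetch
POST /EquipmentListService
<EquipmentListRequest>
    <Plant>1000</Plant>
</EquipmentListRequest>
```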

You can’t cache POST Requests

By definition a POST request implies you are creating data, which runs counter to the rationale for caching a GET request.  The idea behind caching a GET request is that you have static data and you do not want to hit the back-end system when there is a very high probability that it has not changed.  Since a POST implies you are creating new data, you shouldn’t be caching it, right?  Here is the problem: I didn’t want to use a POST to get master data from SAP, but that is the way SAP has exposed the interface.

Since my SAP instance is buried deep within a data center the performance was not great.  There was a noticeable wait on the mobile app when it came to SAP. I could have replicated this master data in the cloud but that seemed like a lot of synchronization and overhead especially for a POC.  What I really wanted to do was to just cache my SAP data on the API Management tier.  However, since I was sending POST requests (against my will) this wasn’t possible, or was it?

Update (Edit)

@darrel_miller reached out to me over Twitter after I posted this article and we chatted about caching rules.  It is worth adding to this post that “POST responses may be cacheable, but will only be served to a subsequent GET request. A POST request will never receive a cached response.”  He has a great blog post that further describes caching rules here.


Recently, I watched an #IntegrationMonday presentation from the Azure API Management team in which Miao Jiang talked about an upcoming feature called a Send Request policy (watch the video for more details).  Watching this video gave me the idea to expose a GET request to the mobile application and then, within the API in Azure API Management, convert it to a POST.  This way SAP continues to be happy, I expose a GET that I can cache in API Management, and my user experience improves.  A win-win-win on all accounts.

So how did I do this?  By simply configuring a few policies in Azure API Management.

Click on the image for more details but in summary here is what I was able to do:

Inbound Policies

  • Enable Caching
  • Set my new Method to POST
  • Set a Message Body since this will now be a POST request
  • Insert a Query Parameter from the Mobile App into the payload so it is dynamic
  • Set my Auth Header (optional but recommended)
  • Use a Rewrite URL since I don’t want to pass a Query Parameter on a POST

Outbound Policies

  • Convert my XML response to JSON so that my mobile app can easily digest this response.
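Putting the inbound and outbound steps together, the policy looks roughly like the following.  This is a sketch rather than my exact configuration: the SAP path, the payload shape, the `plant` query parameter, and the `sap-basic-credentials` named value are all illustrative.

```xml
<policies>
    <inbound>
        <base />
        <!-- Cache lookup runs against the inbound GET, so repeat calls never reach SAP -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false">
            <vary-by-query-parameter>plant</vary-by-query-parameter>
        </cache-lookup>
        <!-- Convert the GET into the POST that SAP expects -->
        <set-method>POST</set-method>
        <set-header name="Authorization" exists-action="override">
            <value>Basic {{sap-basic-credentials}}</value>
        </set-header>
        <set-header name="Content-Type" exists-action="override">
            <value>text/xml</value>
        </set-header>
        <!-- Build the RPC payload, injecting the caller's query parameter -->
        <set-body>@{
            var plant = context.Request.Url.Query.GetValueOrDefault("plant", "1000");
            return "<EquipmentListRequest><Plant>" + plant + "</Plant></EquipmentListRequest>";
        }</set-body>
        <!-- Drop the query string since we don't want it on the POST -->
        <rewrite-uri template="/EquipmentListService" />
    </inbound>
    <outbound>
        <base />
        <!-- Convert SAP's XML response to JSON for the mobile app -->
        <xml-to-json kind="direct" apply="always" consider-accept-header="false" />
        <cache-store duration="3600" />
    </outbound>
</policies>
```

Note that the cache lookup fires before the method is rewritten, which is exactly why exposing a GET on the proxy side makes the response cacheable.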



I love that I can modify the behavior of my solution through simple configuration.  You could have achieved some of these capabilities in an ESB or other middleware, but it would have been much more cumbersome.  The end result is that I get a much better user experience with very little tradeoff.  The data that I am returning from SAP does not change frequently, so why fetch it every time if I don’t have to?  In this case Azure API Management has really been an agility layer for me.

I also want to take this opportunity to thank Maxim, from the Azure API Management team, for his assistance. I had started down the path of the Send Request policy when there was a simpler way of achieving this.

Azure Hybrid Integration Day coming to Calgary

Every year some of the brightest minds in Microsoft integration descend upon Redmond, Washington for the Microsoft MVP Summit. This year three MVPs from Europe (Saravana Kumar, Steef-Jan Wiggers and Michael Stephenson) will be stopping by Calgary on their way to the Summit to give some presentations.  A local Microsoft employee, Darren King, and I will also be presenting.

I have shared the stage with these MVPs before and can vouch that attendees are in for a unique experience as they discuss their experiences with Microsoft Azure and BizTalk Server.

During this full day of sessions you will learn about how BizTalk and Microsoft Azure can address integration challenges. Session topics include SaaS connectivity, IoT, Hybrid SQL Server, BizTalk administration & operations and Two Speed IT using Microsoft Azure. Also bring your burning questions for our interactive Ask the Experts Q & A.

The free event takes place on October 30th, 2015  at the Calgary Microsoft office.  You can find more details here.

Azure Mobile Services–Update Entity results in 400 Bad Request

While performing some local testing on an Azure Mobile Services (C# backend) app, I kept getting a 400 Bad Request whenever I tried to perform an update to an Entity:

await MobileService.GetTable<MyEntity>().UpdateAsync(myEntity);

I ensured that my entity object was populated correctly and that I was providing an Id as part of the request but was still getting the following error:

Microsoft.WindowsAzure.MobileServices.MobileServiceInvalidOperationException: The request could not be completed. (Bad Request)
   at Microsoft.WindowsAzure.MobileServices.MobileServiceHttpClient.<ThrowInvalidResponse>d__18.MoveNext()
   ...
   at Microsoft.WindowsAzure.MobileServices.MobileServiceHttpClient.<SendRequestAsync>d__1d.MoveNext()
   ...
   at Microsoft.WindowsAzure.MobileServices.MobileServiceHttpClient.<RequestAsync>d__4.MoveNext()
   ...
   at Microsoft.WindowsAzure.MobileServices.MobileServiceTable.<>c__DisplayClass21.<<UpdateAsync>b__20>d__23.MoveNext()
   ...
   at Microsoft.WindowsAzure.MobileServices.MobileServiceTable.<TransformHttpException>d__51.MoveNext()

Not the most descriptive error, hence my decision to create this post.  The root cause was an entity that I had declared within my C# backend project’s DataObjects class:

public class MyEntity : EntityData
{
    // This explicit Id is the problem: EntityData already supplies one
    public string Id { get; set; }
}

As soon as I removed the Id from this class my updates were successful.

In the end it was a silly error on my part: within your Mobile App model classes you need to declare this Id as part of the class, but the same is not true of your backend classes.  I have several other entities in this project and knew that you should not declare this Id, but it slipped through the cracks.
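A sketch of the distinction (property names other than Id are illustrative):

```csharp
// Backend DataObjects class: Id, CreatedAt, etc. are inherited
// from EntityData -- do not redeclare them.
public class MyEntity : EntityData
{
    public string Description { get; set; }
}

// Client-side model in the Mobile App: here you DO declare Id yourself.
public class MyEntity
{
    public string Id { get; set; }
    public string Description { get; set; }
}
```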

It is also worth noting that during this time I could successfully insert records even though this Id was declared. 

Using the Azure API Management–Management API

I recall the first time I saw this demoed at the MVP Summit last fall and wondered how, or if, I would use it.  My thought process at the time was that Microsoft has put together nice, functional Administration and Developer portals, so why would I want to call these APIs myself?

Fast-forward about 8 months to where I have a ‘Middleware’ support team and other internal stakeholders who are interested in some of this data.  While I could give all of these users access to the portals I just mentioned, it feels like it would be just another website that someone has to remember credentials for.  It would also be very disconnected from some of the other tooling that the Middleware team uses to support interfaces (BAM, Exception Portal, BizTalk360 etc.).  In addition to BizTalk, we also have other Azure services that participate in some API interactions, including Web APIs, SQL Azure and Service Bus topics.  With all of these disparate sites and services, it was time to build an “Integration Portal” where we could consume these external data sources (API Management, SQL Azure, Service Bus) and link to the other BizTalk tools.

Perhaps I will get deeper into this Integration Portal solution in the future, but for now let’s focus on the Azure API Management – Management API and how we can consume some of the Analytics that we would ordinarily see in our Admin/Developer portal.

Security and Authorization

By default, the Management API is disabled within your API Management tenant. Within the API Management Portal, click on the Security label and then the API Management REST API tab to enable it.  Within this tab:

  • Click on the Enable API Management REST API checkbox
  • Set an appropriate Expiry date for your token
  • Click Generate Token button
  • Copy your Authorization header token
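With the token copied, a quick smoke test from the command line looks something like the following.  The tenant name, token value, and api-version are placeholders you must substitute; note the generated token already includes the `SharedAccessSignature` scheme:

```
curl -H "Authorization: SharedAccessSignature <your_token>" \
     "https://<your_tenant_name>.management.azure-api.net/apis?api-version=<api_version>"
```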


Building the application

For the purposes of this Blog Post I am going to build a simple ASP.NET MVC Project where I will display my API Management Analytic data.  I am not going to go through setting up the MVC project in much detail, but here is a link to a good tutorial.

Within this MVC project I am going to create a new Controller called APIAnalytics and will use a view called Index to display my data.


Below is the code that I have included within my Index() method.  You will notice this is marked as an async method as we are making our API Call using a GetAsync method. I have added comments within the code to provide some additional context.

// GET: APIAnalytics
public async Task<ActionResult> Index()
{
    ViewBag.Message = String.Format("API Management calls for today: {0}", DateTime.Now.ToShortDateString());

    // Note that all Azure timestamps are in UTC. We use this value as a predicate
    // so that we only return data for the current day.
    string currentDate = DateTime.Today.ToUniversalTime().ToString("s");

    // Notice the OData filter where we restrict based upon timestamp.
    // The /reports/byProduct path and api-version are taken from the
    // API Management REST API documentation; substitute your own tenant name.
    string url = String.Format(
        "https://<your_tenant_name>.management.azure-api.net/reports/byProduct?api-version=2014-02-14-preview&$filter=timestamp ge datetime'{0}'",
        currentDate);

    using (System.Net.Http.HttpClient client = new System.Net.Http.HttpClient())
    {
        client.DefaultRequestHeaders.Add("Authorization", "<your_auth_token_from_api_management_portal>");
        client.DefaultRequestHeaders.Accept.Add(
            new System.Net.Http.Headers.MediaTypeWithQualityHeaderValue("application/json"));

        System.Net.Http.HttpResponseMessage response = await client.GetAsync(url);
        if (response.IsSuccessStatusCode)
        {
            var data = await response.Content.ReadAsStringAsync();

            // We want some typed data so that we can bind to our grid view control.
            // See the "APIStats" model class below.
            var table = Newtonsoft.Json.JsonConvert.DeserializeObject<APIStats>(data);

            // Within our view we will use a GridView control.
            System.Web.UI.WebControls.GridView gView = new System.Web.UI.WebControls.GridView();

            // Create a data binding event so we can manipulate column headers, etc.
            gView.RowDataBound += gView_RowDataBound;
            gView.DataSource = table.value;
            gView.AllowSorting = true;

            // Auto-generate columns from the APIDetail properties;
            // the headers are renamed in the RowDataBound handler.
            gView.AutoGenerateColumns = true;

            gView.AlternatingRowStyle.BackColor = System.Drawing.Color.Azure;
            gView.Width = System.Web.UI.WebControls.Unit.Percentage(90);
            gView.DataBind();

            // Render the bound GridView into a string the view can emit.
            using (System.IO.StringWriter sw = new System.IO.StringWriter())
            using (System.Web.UI.HtmlTextWriter htw = new System.Web.UI.HtmlTextWriter(sw))
            {
                gView.RenderControl(htw);
                ViewBag.ReturnedData = sw.ToString();
            }
        }
    }
    return View();
}


// Here is our data binding event
protected void gView_RowDataBound(object sender, GridViewRowEventArgs e)
{
    try
    {
        if (e.Row.RowType == DataControlRowType.Header)
        {
            e.Row.BackColor = System.Drawing.Color.LightGray;
            e.Row.Cells[0].Text = "API Product";
            e.Row.Cells[0].Width = 300;
            e.Row.Cells[1].Text = "Success Count";
            e.Row.Cells[2].Text = "Block Count";
            e.Row.Cells[3].Text = "Failed Count";
            e.Row.Cells[4].Text = "Other Count";
            e.Row.Cells[5].Text = "Total";
            e.Row.Cells[6].Text = "Bandwidth";
            e.Row.Cells[7].Text = "API Avg Time";
            e.Row.Cells[8].Text = "API Min Time";
            e.Row.Cells[9].Text = "API Max Time";
            e.Row.Cells[10].Text = "Svc Avg Time";
            e.Row.Cells[11].Text = "Svc Min Time";
            e.Row.Cells[12].Text = "Svc Max Time";
        }
        else if (e.Row.RowType == DataControlRowType.DataRow)
        {
            // Round the timing values so that we are not displaying so many decimal points.
            for (int i = 7; i <= 12; i++)
            {
                e.Row.Cells[i].Text = Math.Round(System.Convert.ToDouble(e.Row.Cells[i].Text), 1).ToString();
            }
        }
    }
    catch (Exception)
    {
        // Rethrow, preserving the stack trace.
        throw;
    }
}


// Here is our APIStats model class
public class APIDetail
{
    public string name { get; set; }
    public int callCountSuccess { get; set; }
    public int callCountBlocked { get; set; }
    public int callCountFailed { get; set; }
    public int callCountOther { get; set; }
    public int callCountTotal { get; set; }
    public int bandwidth { get; set; }
    public double apiTimeAvg { get; set; }
    public double apiTimeMin { get; set; }
    public double apiTimeMax { get; set; }
    public double serviceTimeAvg { get; set; }
    public double serviceTimeMin { get; set; }
    public double serviceTimeMax { get; set; }
}

public class APIStats
{
    public List<APIDetail> value { get; set; }
    public int count { get; set; }
    public object nextLink { get; set; }
}
Lastly, here is our Index.cshtml view:

@{
    ViewBag.Title = "Index";
}

<h2>@ViewBag.Message</h2>

@* Emit the GridView markup produced in the controller *@
@Html.Raw(ViewBag.ReturnedData)


Running Application

If we go ahead and run our application we will see the following web page.  These are all of the API Products that exist within my tenant.  As you can see, I didn’t have any calls to my Auto Insurance APIs and instead was just trying out my BizTalk Boot Camp products.

Within the unlimited tier you will see a very large API Max Time.  The reason for that is that I didn’t actually have the API backend up during that initial call. On my limited product you will see I have a Block Count of 3. The reason for this is that I have rate limiting enabled on that product and as a result I had some blocked calls.



The Azure API Management – Management API is quite rich and allows organizations to create custom dashboards or integrate this data with related data from other sources.

Within this blog post,  I have only demonstrated one type of call that is available within this API. For more details and to discover the different APIs, please refer to the following web page.

Azure API Management–IP Whitelisting

When implementing API Management solutions, it is a common practice to use IP whitelisting when interacting with certain trading partners.  The idea is that only traffic presented from a specific IP address (or range) can call your API proxy.  This usually isn’t used exclusively, but can be combined with other techniques to reduce your attack surface.

Trying to manage IP Whitelisting using Firewalls and Reverse Proxies can sometimes be a complex and messy endeavor but is pretty straightforward in Azure API Management.

In my demo, I added the Restrict caller IPs policy at the product level in the inbound section.


Once the policy has been added, you can add a specific IP Address or a range if the trading partner has multiple servers that you want to communicate with.  The IP Addresses below are obviously fictitious.
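Behind the portal UI, the resulting policy is a simple `ip-filter` element.  A sketch, using the documentation-reserved 203.0.113.0/24 range as stand-in addresses:

```xml
<inbound>
    <base />
    <!-- Only allow calls originating from the trading partner's addresses -->
    <ip-filter action="allow">
        <address>203.0.113.10</address>
        <address-range from="203.0.113.32" to="203.0.113.47" />
    </ip-filter>
</inbound>
```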


Once enabled, if you try to call the API from an IP Address that has not been whitelisted you will get the following error:



You can find more information about this and other Azure API Management policies here.

Logic Apps: Integrating Custom SharePoint Lists and Salesforce

I have posted a little tutorial video that describes consuming data from a SharePoint Server Custom List and using that data to create new Contacts in Salesforce. The SharePoint Connector takes advantage of the Hybrid Connection capability, which allows messages to flow between the Azure cloud and an on-premises system without requiring ports to be opened in the firewall.