Speaking at Microsoft Ignite

With everything now confirmed, I am happy to report that I will be speaking at Microsoft’s largest conference, Ignite, in September in Atlanta, Georgia.

I will be sharing the stage with John Taubensee from the Azure Service Bus team.  We will be talking about some real-world messaging scenarios.  Service Bus and Event Hubs are important pillars in our integration strategy at TransAlta, so I am looking forward to discussing how we use these technologies to provide mission-critical messaging for our business.

You can find more details on the Ignite website about our session.

Hybrid Connectivity with Logic Apps and API Apps

Note: This blog post was written in early July, 2016 against preview bits.  The tools and techniques used in this post are bound to change.

In a previous blog post (Using Azure AD Authentication between Logic Apps and Azure API Apps) I discussed a hybrid scenario where I wanted to expose on-premises data to an Azure Logic App via an Azure API App. In my scenario, the data that I want to expose comes via another middleware platform called Denodo.

What is Denodo?

Denodo is a Data Virtualization platform.  But what does that mean?  Well, I consider it to be a data integration tool.  On the surface, you may wonder whether it competes with other Integration Brokers/ESBs.  If used correctly, I don’t think it does.  If overused or abused, then sure, it can compete – much like many other tools.

Instead, Denodo competes more with ETL tools such as SSIS or SAP Data Services. It aims to eliminate, or at least reduce, the ETL (i.e. data copying) that pulls from data sources only to consolidate them upstream.  As soon as ETL is in the mix, you have batch processing and risk running into stale data or missed opportunities.

Instead, you introduce a “data abstraction” layer that sits in front of physical data sources and can project virtual database/table views.  These views can span tables and even multiple data sources. Data sources are not limited to databases either.  Denodo can virtualize files and scrape web pages.  Scraping a web page may seem like a very tedious thing to do – and it can be – but not if your tools are flexible enough that you can easily make a change when you need to.


One of the other benefits of Data Virtualization platforms, such as Denodo, is that they can take a regular ODBC/JDBC data source and project it as something else, such as a REST service.  In my scenario, this is part of the reason why I am using Denodo to begin with.

In the app that I am building from the previous blog post, I need to bring in multiple data sets. One data set is on-premises with an Oracle backend, another is a CSV file that gets published on a regulator’s website every hour, and the last is a cloud data source.

All of these data sources have been projected as RESTful services using HTTP and JSON. But now I have an issue: how do I connect a Logic App with these on-premises REST services?  Yes, Logic Apps has an HTTP connector, but can it see on-premises endpoints?  Denodo sits in our corporate intranet and I am not interested in opening up inbound firewalls.  Even if I could access Denodo from Logic Apps, Denodo does not expose any metadata, such as Swagger, so I would have to deal with message shapes in Logic Apps.

One way to solve this problem is to use an Azure API App.  By doing so, we can wrap our Denodo RESTful services in our own Web API, decorate it with Swagger and then publish it to Azure.  We can now enable a Hybrid Connection, since an API App really runs in the context of a Web App.  We can also plug into Logic Apps easily, since Swagger is a first-class citizen in Logic Apps.
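The wrapping itself is a simple pass-through. As a rough sketch of the pattern (the real solution is an ASP.NET Web API App; the Denodo base URL and view name below are made-up placeholders), the API App just forwards each call to the on-premises endpoint and returns the JSON:

```python
# Sketch of the pass-through pattern the API App implements: forward a
# request to the on-premises Denodo REST endpoint and return its result.
# The base URL and view name are hypothetical placeholders.

DENODO_BASE_URL = "http://denodo.corp.local:9090/server/rest"  # hypothetical

def get_view(view_name, http_get, base_url=DENODO_BASE_URL):
    """Proxy a Denodo virtual view through our own API surface.

    http_get is injected so the function can be exercised without a live
    Denodo server; in production it would be something like requests.get.
    """
    url = "{0}/{1}".format(base_url, view_name)
    return http_get(url)

def fake_http_get(url):
    # Stand-in for a real HTTP client: echo the URL so routing is visible.
    return {"requested": url, "rows": []}

result = get_view("power_prices", fake_http_get)
```

The Hybrid Connection makes the outbound call from `get_view` resolve to the on-premises host, which is why the API App never needs to know the tunnel exists.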

By adding an Azure API App, my architecture now looks like this:


In my previous post, I walked through deploying my API App so I am not going to get into more details here on that process.  At this point, I am assuming the API App has been deployed.

  • Within your App Service, navigate to the Networking blade and then click Configure your hybrid connection endpoints.


  • Click on Add


  • Add New hybrid connection


  • Provide a Name, Hostname, Port and then click on Configure required settings


  • Provide a Name, Pricing Tier, Resource Group, Subscription and Location


  • Click OK to continue
  • The Hybrid Connection will get created


  • The Hybrid Connection will take a couple of minutes to provision.  Once provisioned, we will see that it has a Not Connected status.  This is normal since we have not set up our on-premises agent yet.


  • Next, we need to download and install the agent.  Click on the name of your Hybrid Connection, which in my case is OnPremDenodo.
  • Next, click on Listener Setup and then Install and configure now.


  • An executable will show up in your browser.  Open it.


  • Click Run


  • Midway through the installation you will be prompted to provide a Hybrid Connection String.  You can get this from the Azure Portal in the blade that you downloaded the Hybrid Connection agent from.


  • Once your connection string has been entered, your installation should be complete.


  • In the Azure Portal, your Hybrid Connection should now have a status of Connected



  • From within your API App, the URL of the REST service that you are trying to hit will be your local REST service URL.  It will not be your Hybrid Connection.
  • In fact, you don’t need to reference the Hybrid Connection URL anywhere.  When your API App gets called and in turn calls your on-premises URL, your Hybrid Connection will kick in and you will access that resource through that tunnel.
  • Logic Apps will access your API App using the public Azure endpoint, hence the need to lock it down using Azure AD as I discussed in my previous post.


In this post we reviewed how a Logic App can access an on-premises REST service by using Hybrid Connectivity through a custom API App.  We could have also used a site-to-site VPN connection, but that would have involved more moving parts, approvals and people.

The Current State of iPaaS

In addition to my day job of being an Enterprise Architect at an Energy company in Calgary, I also write for InfoQ.  Writing for InfoQ allows me to explore many areas of Cloud and engage with many thought leaders in the business.  Recently, I had the opportunity to host a Virtual Panel on Integration Platform as a Service (iPaaS).

Participating in this panel were:

  • Dan Diephouse – Director of Product Management at MuleSoft where he is part of the team that launched MuleSoft’s iPaaS offering: CloudHub.
  • Darren Cunningham – Vice President of Marketing at SnapLogic where he focuses on product management and outbound product marketing.
  • Jim Harrer – Principal Group Program Manager in the Cloud & Enterprise division, at Microsoft, where his team is responsible for Program Management for BizTalk Server and Microsoft’s iPaaS offering: Azure Logic Apps.

Overall, I was very happy with the outcome of the article.  I think the panelists offered some great insight into the current state of iPaaS and where this paradigm is headed.

You can read the entire article here and feel free to add comments in the article’s comment section.

Using Azure AD Authentication between Logic Apps and Azure API Apps

NOTE: This blog post was written in June 2016 and is based upon a preview of Azure Logic Apps.  The functionality is bound to change in the future.  I have no additional information about when the new functionality may, or may not, be available.

Recently I have been working on a PoC where I have created an API App that needs to talk to an on-premises REST service hosted in a third-party platform called Denodo.  I will talk about those details in a future post.  But for the purpose of this post I want to discuss how I can secure my Denodo API App using Azure AD.  In the broader solution that I am working on, I know this API App will be called from a Logic App. As a result, I want to prove out that a Logic App can authenticate, with Azure AD, while calling my API App.

Some documentation that helped me on this journey was the following post from Stephen Siciliano.  In the comments, Jeff Hollan also provided some commentary that was helpful. There were still a few bumps in the road, so I figured I would document the exact steps that I followed, in order to provide a more streamlined experience.

Part 1 – Enabling Authentication on Endpoint

  • Build your API App in Visual Studio.  In my case I am wrapping an existing REST API that is provided by the Denodo platform.  One of the benefits of wrapping it using App Service is that I can add Swagger metadata, which will help me in Logic Apps.  Also, by using an App Service, I can bridge cloud and on-premises using either Hybrid Connections or a VPN connection.  More on the hybrid connectivity in the next post (promise).
  • Enable Swagger metadata by uncommenting the .EnableSwaggerUi call in SwaggerConfig.cs


  • Publish the App Service.  You will need to specify/create an Azure App Service Plan, Azure Subscription and Resource Group.


  • With Azure App Service Authentication set to Off, we will be able to access our API through a browser.


  • By navigating to our App Service URL and appending “/Swagger” we will see our API exposed.  We can interact with our API by clicking on Try It Out!


  • At this point we know our API is working and in my case it is calling an on-prem REST API.

NOTE: When you first deploy your API App, a Swagger file will be created.  You should download this file before you start locking down your endpoint.


  • At my organization we heavily leverage Azure AD.  Whenever we are doing something in the cloud, we try to plug into Azure AD, as we can manage it centrally and it generally plays nicely with the Microsoft ecosystem and other SaaS apps.
  • At first I thought I was going to have to create an Azure AD Application by going into the old portal, but you don’t have to do that anymore.  Within the new Azure Portal there is some slick integration going on.
  • While in your App Service, click on Authentication/Authorization, turn on App Service Authentication and select Log in with Azure Active Directory.


  • Click on Azure Active Directory under Authentication Providers
  • From Management mode select Express


  • Create New AD App and provide AD App Name.
  • Save your configuration
  • If you then navigate to your Web API/Swagger console you should now be challenged to authenticate against Azure AD.  Enter your credentials and you can interact with the Swagger console.
  • At this point your endpoint is secure, but how do you connect Logic Apps to use it? Keep reading.

Part 2 – Exposing Swagger Metadata

As of this writing (June 2016), you will have issues with Logic Apps being able to consume your Swagger metadata.  The reason for this is that Logic Apps (at least for now) requires that the Swagger metadata is available from a public source and over HTTPS. As soon as you lock down your endpoint, the Swagger metadata is no longer publicly available.

For now (as I fully expect that Microsoft is working on a cleaner solution), take your Swagger metadata and place it in Blob storage without any authentication around it. To do this, perform the following steps:

  • Create the storage account from the new portal


  • Use the Resource Manager deployment model; it can be General Purpose with Standard performance, using Locally-redundant storage (LRS).  Provide a Resource Group.


  • With your storage account created, you can then use a tool like Azure Storage Explorer to manage your storage instance.
  • While logged in with my Storage Account credentials (available from the Azure Portal), I created a new Blob Container and then set the access level to Public


  • Use the Upload button to upload your Swagger metadata.  You can then view the URI for your document by selecting it and clicking on View.
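The URI that Storage Explorer shows follows the standard public blob pattern, so you can also predict it yourself. A small sketch (the account, container and blob names below are placeholders, not my real ones):

```python
# Public (anonymous-read) blob URLs follow a predictable pattern once the
# container's access level is set to Public. Names here are placeholders.
def public_blob_url(account, container, blob_name):
    return "https://{0}.blob.core.windows.net/{1}/{2}".format(
        account, container, blob_name)

# e.g. the Swagger document uploaded in the step above
url = public_blob_url("mystorageacct", "swagger", "denodoapi.json")
```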


Unfortunately we are not done yet.  We now need to deal with CORS. There are ways to enable CORS for Blob storage, but I decided not to go down that path…at least for now.  As part of the original article that I referenced, Jeff Hollan from the Logic Apps team provided a workaround for CORS.  With your Swagger file in public Blob storage, you can take that URL and use his utility, which has a “CORS bypass” enabled.  This may not be a long-term solution and I can’t speak to how long Jeff will keep it alive, but for now it works for me.  If you want to enable CORS for your storage account, then check out this link.

To use Jeff’s workaround, I just added my Swagger/Blob Storage URL to his helper API:


  • I then need to update my API Definition in my API App to use this new URL.


Part 3 – Wiring Logic App to use AAD

To complete this part, we need some data from Azure AD, which we retrieve from the old portal at https://manage.windowsazure.com.

Ultimately, we need to construct a message that looks like the following so that we can provide it as an Authentication header in our API call.

Now the question is: where do you get these values?  You get them from your Azure AD instance and, more specifically, the Azure AD App that you created in Part 1 of this blog post.  In the following image I have outlined exactly where to get the required values.
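For reference, the authentication object the Logic App sends with the API call is an ActiveDirectoryOAuth object. A hedged sketch of its shape follows; every value shown is a placeholder you must replace with the values from your own Azure AD App:

```python
import json

# Sketch of the ActiveDirectoryOAuth authentication object used by the
# Logic App action. All tenant/audience/clientId/secret values below are
# placeholders, not real credentials.
auth = {
    "type": "ActiveDirectoryOAuth",
    "tenant": "yourtenant.onmicrosoft.com",              # Azure AD tenant
    "audience": "https://yourapiapp.azurewebsites.net",  # AD App's App ID URI
    "clientId": "00000000-0000-0000-0000-000000000000",  # AD App client ID
    "secret": "<client-secret-from-ad-app-keys>"
}

# Serialized form, as it would appear in the Authentication header field
header = json.dumps(auth)
```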


With this information in hand we can now create a new Logic App.  To keep things simple, I have created a Logic App and use a Recurrence Trigger, mainly so that I can trigger it on demand.

Next, I should be able to select Show APIs for App Services in the same region and my API should show up.


Note: If you have any issues with Swagger metadata, this is where you will see them. You may see the dreaded “Failed to fetch swagger. Ensure you have CORS enabled on the endpoint and are calling an HTTPS endpoint” error.  If so, you likely have one of two issues:

  • Swagger Metadata not being publicly accessible
  • CORS

For my API, I am only performing a GET and as a result do not have any query parameters.  The only data I need to send is in my Authentication Header which we covered in a previous step.


If you submit a message and get an error like the following, it means something hasn’t been set up correctly in your Authentication header.

{"code":"BadRequest","message":"Http request failed as there is an error getting AD OAuth token: 'AADSTS70001: Application with identifier '<bad_token>' was not found in the directory aff3442b-5f55-409c-be77-da97b366435a\r\nTrace ID: 54eb2e86-2e1b-46p9-8d14-983102278428\r\nCorrelation ID: 873a248f-900c-4f19-9684-447b5bfe6da4\r\nTimestamp: 2016-06-27 03:01:28Z'."}

I fully expect Microsoft to make this a simpler and more streamlined experience, but until that time, I think it is important that people are locking down their Azure resources.  Lately I have been doing a lot of PoCs with the business and other IT groups.  Naturally the question about security is going to come up, and I am not going to just say it is secure, or say it is possible for it to be secure – I want to ensure it is secure.  I also don’t think it is too much effort to get it working once you understand all of the mechanics involved.

Azure Logic Apps–Deleting Items From SharePoint (Online) List

I have a scenario at work where we need to provide some simple synchronization between a SQL Azure table and a SharePoint Online custom list. As a prerequisite, each morning before business users get into the office, we need to purge the contents of the SharePoint list and update it with today’s data plus a 6-day forecast of future data.

I have integrated BizTalk with custom SharePoint lists in the past, and even wrote about it in one of my books. It wasn’t a particularly good experience, so I was interested in evaluating how Logic Apps would deal with custom lists.

One difference between BizTalk and Logic Apps, in this case, is that BizTalk has a SharePoint Adapter, but it will only interface with SharePoint Document Libraries. If you want to integrate BizTalk with SharePoint custom lists, you are likely going to do so with the Lists.asmx web service.  While it is completely possible to use this custom web service approach, be prepared to spend a few hours (if you are lucky) getting everything working.

With Logic Apps, it is a very different experience.  From the Logic Apps canvas you need to add a trigger to kick off your workflow.  In my case, I used a Recurrence trigger that will run every day at 7:45 am.  I can also kick this trigger off manually in the Azure Portal if I wish.


Next, I want to add an action and then search for SharePoint from the Microsoft managed APIs dropdown.  After we do that, all available SharePoint Online operations will be displayed.

In my case, I want to purge all items, but there is no Delete List operation.  Instead, I need to get all items in my list first, so that I can use the ID from each record to delete that item.  In my scenario I expect 7 days * 24 hourly records (168 items) to be in my list at any given time, so this load is not a concern.
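The get-then-delete flow can be sketched in a few lines. This is an illustration of the pattern only, against an in-memory stand-in for the SharePoint list (the class and field names are hypothetical), not the SharePoint connector's actual API:

```python
# Sketch of what the Logic App does: fetch every item, then delete each
# one by its ID. A minimal in-memory stand-in for the SharePoint list
# lets the flow run without Office 365; names here are hypothetical.

class FakeSharePointList:
    def __init__(self, items):
        # Items keyed by ID, mirroring a SharePoint list's Id column
        self._items = {item["ID"]: item for item in items}

    def get_items(self):
        return list(self._items.values())

    def delete_item(self, item_id):
        del self._items[item_id]

def purge_list(sp_list):
    """Get Items, then Delete Item for each ID - the Logic App pattern."""
    for item in sp_list.get_items():
        sp_list.delete_item(item["ID"])
    return len(sp_list.get_items())

# 7 days * 24 hourly records = 168 items, as in the scenario above
forecast = FakeSharePointList([{"ID": i, "Price": 0.0} for i in range(168)])
remaining = purge_list(forecast)
```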

With this situation in-mind, I will select the Get Items  operation.


Once I have selected my Get Items operation, I need to establish my connection to my Office 365 subscription. (I will spare you from the password prompts)


With my connection established, I now need to provide the URL to my SharePoint List.  I can also specify optional parameters that control the number of items returned. 

I must say the experience in this dialog is a good one.  I can click on the Site URL dropdown and all of the sites that I have access to will be in that list.  Once I have selected my URL and then click on the List Name dropdown, I then see all the lists that I have access to on that site.


Next, I need to add another activity and this time I will select the Delete Item operation.


I have a similar experience in the Delete Item dialog that I had in the Get Items dialog.  With my connection already established, I need to provide the same Site URL  and the same List Name.  What is different this time is I need to provide an ID for the list item that I would like to delete.  In this case it will be an ID  that is coming from my Get Items response.

Delete Item

You might be asking yourself – well, how is that going to work for all of the items in my list?  Don’t you need a for loop to iterate through the Get Items collection? The answer is yes, but the Logic Apps team has made this very simple – they have done the heavy lifting for you.  If you go into Code View you can see it there:

"foreach": "@body('Get_items')['value']",
"inputs": {
    "host": {
        "api": {
            "runtimeUrl": "https://logic-apis-westus.azure-apim.net/apim/sharepointonline"
        },
        "connection": {
            "name": "@parameters('$connections')['sharepointonline']['connectionId']"
        }
    },
    "method": "delete",
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent(string('https://SharePointsharepoint.com/sites/HOPOC/Bighorn/SitePages/Home.aspx')))}/tables/@{encodeURIComponent(encodeURIComponent(string('c28b1ea2-e2a0-4faf-b7c2-3eerec21a8b')))}/items/@{encodeURIComponent(string(item()['ID']))}"
}

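Notice that the path applies encodeURIComponent twice to the site URL and list GUID, so they survive being embedded inside the connector's own URL path. The same double encoding can be reproduced outside Logic Apps (the SharePoint URL below is a made-up example):

```python
from urllib.parse import quote

# quote(..., safe='') mirrors JavaScript's encodeURIComponent; applying
# it twice matches the nested encodeURIComponent calls in the code view.
def double_encode(value):
    return quote(quote(value, safe=''), safe='')

# Hypothetical site URL, for illustration only
site = double_encode("https://contoso.sharepoint.com/sites/HOPOC")
```

After the first pass, `:` becomes `%3A` and `/` becomes `%2F`; the second pass escapes those `%` signs to `%25`, which is why the path is full of `%252F` sequences.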


The end result is a Logic App that looks like this:

Total Solution

I can now run my Logic App from the Azure Portal by clicking on Select Trigger and then recurrence.


I can follow my execution and dive into my Get Items and Delete Item calls.  By inspecting my inbound and outbound traces I can see the exact payloads and HTTP status codes from the underlying operations.  In this case I was able to delete 300 items in 23 seconds.  For each item in my collection, a Delete Item call is made.



It honestly took me about 10 minutes to figure this out.  Part of the reason why I am writing about this is that I know how long this would take with BizTalk. I anticipate it would take at least 3 hours to do this in BizTalk if it was your first time.  This isn’t a knock against BizTalk; Logic Apps has, IMO, been built for these lightweight, low-friction scenarios.

In the short term I think Microsoft Integration architects and developers will have many opportunities like this one where you can choose one tool or the other.  For me, developer productivity needs to be part of that equation.  Where the systems that you are integrating are hosted will also play a role.  In this case, I am connecting to SharePoint Online and SQL Azure so it also doesn’t make sense, IMO, to route all of this info back on-premises only to go back up to the cloud.

We may also see a hybrid approach where a BizTalk process calls out to a Logic App so you can take advantage of these new Logic App connectors.  This was something that Microsoft discussed at the 2016 Integrate event in London.

In a future blog post we will talk about the other half of this solution which is how to create items in SharePoint Online from SQL Azure.

Speaking at Integrate 2016

BizTalk360 has recently released more details on their annual conference in London.  This year there is a name change: instead of the BizTalk Summit, the event is being aligned with the “Integrate” brand.  In case you were not aware, BizTalk360 organized the last Integrate Summit in Redmond back in December 2014, so it makes sense to carry that name forward. BizTalk360 has been working closely with the Microsoft product groups to put on a great event.

This year the summit looks to be bigger and better than ever.  There are more than 20 speakers lined up over 3 days.  The speakers have a variety of backgrounds, including the Microsoft product group, consultants, customers, system integrators and MVPs. There is also an opportunity to hear from Microsoft’s leadership team and get insight into their plans as they pertain to Azure App Service and Integration.

The session abstracts look great! The topics cover a broad set of technologies that will appeal to integration focused professionals.  The topics include:

  • BizTalk Server 2016
  • Azure Logic Apps
  • Azure App Service
  • Azure Service Bus (Event Hubs and Messaging)
  • Internet of Things (IoT)
  • Azure API Management
  • Azure Stream Analytics
  • Power BI

My topic will focus on some of the recent learnings from an Industrial IoT project.  I will talk about tag data ingestion, complex event processing (calculations, reference data, out of bounds, absence of event) and visualization.  I will also throw in a little BizTalk and Logic Apps for good measure.

This will be my third time speaking in London at a BizTalk360 event.  I am once again looking forward to the experience as BizTalk360 always puts on a good show and it is a great opportunity to network with the excellent European Integration community.

For more details, please check out the event page.  There are early bird specials so pay close attention to the dates.

See you in London!



Be careful with Azure Stream Analytics, Power BI and Azure AD MFA

I ran into an interesting situation today.  I have some real-time Azure Stream Analytics (ASA) streams that feed a couple of Power BI dashboards.  I also took the plunge and enabled Azure Multi-Factor Authentication (MFA).  The MFA setup was fine, but when I logged back into the Azure Portal later I noticed that my ASA job had stopped.  When I looked in the ASA monitoring logs I found the following Send Error:


Followed by a User authentication error:


I was able to put 2 and 2 together, and the timelines aligned with when my MFA was turned on, but I was still perplexed.  I mean, this wasn’t a Windows service with an out-of-date password. My credentials were never associated with the ASA job. I did use MFA when logging into the portal, so that wasn’t it.  I tried to restart the ASA job multiple times, as the portal was giving me a “this is a transient error” message….it wasn’t.

Then it dawned on me that when you set Power BI outputs for Stream Analytics, you also need to authenticate using your credentials.  As soon as MFA was enabled, my credentials had expired and I needed to log back in, which did result in an MFA challenge.  I got through that, started my job, and the outstanding events streamed through.


This isn’t a production workload, so the damage was minimal.  It does beg the question, though: if you are publishing events from ASA to Power BI, what is the right way to authenticate with Power BI?  Should you be creating a “system” account with no MFA and no password resets?  I am open to recommendations if you have them.