Speaking at Global Integration Bootcamp

On a previous episode of #MiddlewareFriday I talked about a global integration event occurring on Saturday, March 25th 2017. I am happy to announce that I will be speaking at the New York meetup, which is being held at the Microsoft Technology Center near Times Square. This is my second opportunity to speak at the MTC and I am very much looking forward to it.

The FREE event will take place from 8:30 am to 5 pm and registration is open.

VNB Consulting is hosting the event and I would like to thank Amit and Howard for inviting me to speak. My topic will be on protecting Azure Logic Apps with Azure API Management.  I will be using the new Azure API Management Design Surface, so even if you are very familiar with Azure API Management, you will learn something new.

The event is also taking place simultaneously in 9 other locations worldwide: Australia, Belgium, Finland, India, New Zealand, the Netherlands, Norway, Portugal and Sweden. You can find more information on the bootcamp's website.



Speaking at Microsoft Ignite

With everything now confirmed, I am happy to report that I will be speaking at Microsoft’s largest conference, Ignite, in September in Atlanta, Georgia.

I will be sharing the stage with John Taubensee from the Azure Service Bus team. We will be talking about some real-world messaging scenarios. Service Bus and Event Hubs are important pillars in our integration strategy at TransAlta, so I am looking forward to discussing how we use these technologies to provide mission-critical messaging for our business.

You can find more details on the Ignite website about our session.

Hybrid Connectivity with Logic Apps and API Apps

Note: This blog post was written in early July 2016 against preview bits. The tools and techniques used in this post are bound to change.

In a previous blog post (Using Azure AD Authentication between Logic Apps and Azure API Apps) I discussed a hybrid scenario where I wanted to expose on-premises data to an Azure Logic App via an Azure API App. In my scenario, the data that I want to expose comes via another middleware platform called Denodo.

What is Denodo?

Denodo is a Data Virtualization Platform. But what does that mean? I consider it to be a data integration tool. On the surface, you may wonder whether this tool competes with other integration brokers/ESBs. If used correctly, I don't think so. If overused or abused, then sure, it can compete – much like many other tools.

Instead, Denodo competes more with ETL tools such as SSIS or SAP Data Services. What it aims to do is eliminate, or at least reduce, the amount of ETL (i.e. data copying) from data sources only to consolidate those data sources upstream. As soon as ETL is in the mix, you have batch processing and risk running into stale data or missed opportunities.

Instead, you introduce a "data abstraction" layer that sits in front of physical data sources and can project virtual database/table views. These views can span tables and even multiple data sources. Data sources are not limited to databases either: Denodo can virtualize files and scrape web pages. Scraping a web page sounds tedious – and it can be – but not if your tools are flexible enough that you can easily make a change when you need to.


One of the other benefits of data virtualization platforms, such as Denodo, is that they can take a regular ODBC/JDBC data source and project it as something else, such as a REST service. In my scenario, this is part of the reason why I am using Denodo to begin with.

In the app that I am building in the previous blog post, I need to bring in multiple data sets. One data set is on-premises with an Oracle backend, another is a CSV file that gets published on a regulator's website every hour, and the last is a cloud data source.

All of these data sources have been projected as RESTful services using HTTP and JSON. But now I have an issue: how do I connect a Logic App to these on-premises REST services? Yes, Logic Apps has an HTTP connector, but can it see on-premises endpoints? Denodo sits in our corporate intranet and I am not interested in opening up inbound firewall ports. Even if I could access Denodo from Logic Apps, Denodo does not expose any metadata, such as Swagger, so I would have to deal with message shapes in Logic Apps myself.

One way to solve this problem is to use an Azure API App. By doing so, we can wrap our Denodo RESTful services in our own Web API, decorate it with Swagger and then publish it to Azure. We can then enable a Hybrid Connection, since an API App really runs in the context of a Web Site. We can also plug into Logic Apps easily, since Swagger is a first-class citizen in Logic Apps.
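The actual wrapper in this post is an ASP.NET Web API, but the proxy idea is language-neutral. As a minimal sketch in Python (the Denodo host, port and path below are hypothetical placeholders, not my real environment), the wrapper does little more than forward a request to the on-premises URL and relay the JSON:

```python
import json
from urllib.parse import urljoin
from urllib.request import urlopen

# Hypothetical on-premises Denodo REST endpoint. The wrapper calls this
# local URL directly; the Hybrid Connection tunnels the traffic for it.
DENODO_BASE = "http://denodo.corp.local:9090/server/rest/"

def denodo_url(view: str) -> str:
    """Build the on-premises URL the wrapper will call for a given view."""
    return urljoin(DENODO_BASE, view)

def get_view(view: str) -> dict:
    """Fetch a Denodo view and return its JSON payload (sketch only)."""
    with urlopen(denodo_url(view)) as resp:
        return json.load(resp)
```

The wrapper's own public endpoint is what gets decorated with Swagger and published to Azure; the Denodo URL never leaves the corporate network.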

By adding an Azure API App, my architecture now looks like this:


In my previous post, I walked through deploying my API App, so I won't go into more detail on that process here. At this point, I am assuming the API App has been deployed.

  • Within your App Service, navigate to the Networking blade and then click Configure your hybrid connection endpoints.


  • Click on Add


  • Add New hybrid connection


  • Provide a Name, Hostname, Port and then click on Configure required settings


  • Provide a Name, Pricing Tier, Resource Group, Subscription and Location


  • Click OK to continue
  • The Hybrid Connection will get created


  • The Hybrid Connection will take a couple of minutes to provision. Once provisioned, we will see that it has a Not Connected status. This is normal, since we have not set up our on-premises agent yet.


  • Next, we need to download and install the agent. Click on the name of your Hybrid Connection, which in my case is OnPremDenodo.
  • Next, click on Listener Setup and then Install and configure now.


  • An executable will show up in your browser.  Open it.


  • Click Run


  • Midway through the installation you will be prompted to provide a Hybrid Connection String.  You can get this from the Azure Portal in the blade that you downloaded the Hybrid Connection agent from.


  • Once your connection string has been entered, your installation should be complete.


  • In the Azure Portal, your Hybrid Connection should now have a status of Connected.



  • From within your API App, the URL of the REST service that you are trying to hit will be your local REST service URL.  It will not be your Hybrid Connection.
  • In fact, you don't actually need to reference the Hybrid Connection URL anywhere. When your API App gets called and in turn calls your on-premises URL, your Hybrid Connection will kick in and you will access that resource through that tunnel.
  • Logic Apps will access your API App using the public Azure endpoint, hence the need to lock it down using Azure AD as I discussed in my previous post.


In this post we reviewed how a Logic App can access an on-premises REST service by using Hybrid Connectivity through a custom API App. We could have also used a site-to-site VPN connection, but that would have involved more moving parts, approvals and people.

Azure Logic Apps–Deleting Items From SharePoint (Online) List

I have a scenario at work where we need to provide some simple synchronization between a SQL Azure table and a SharePoint Online custom list. Each morning, before business users get into the office, we need to purge the contents of the SharePoint list and update it with today's data plus a 6-day forecast of future data.

I have integrated BizTalk with custom SharePoint lists in the past, and even wrote about it in one of my books. It wasn't a particularly good experience, so I was interested in evaluating how Logic Apps would deal with custom lists.

One difference between BizTalk and Logic Apps, in this case, is that BizTalk has a SharePoint adapter, but it will only interface with SharePoint document libraries. If you want to integrate BizTalk with SharePoint custom lists, you are likely going to do so with the Lists.asmx web service. While it is entirely possible to use this web service approach, be prepared to spend a few hours (if you are lucky) getting everything working.

With Logic Apps, it is a very different experience.  From the Logic Apps canvas you need to add a trigger to kick off your workflow.  In my case, I used a Recurrence trigger that will run every day at 7:45 am.  I can also kick this trigger off manually in the Azure Portal if I wish.


Next, I want to add an action and then search for SharePoint from the Microsoft managed APIs dropdown.  After we do that, all available SharePoint Online operations will be displayed.

In my case, I want to purge all items, but there is no Delete List operation.  Instead, I need to get all items in my list first, so that I can use the ID from each record to delete that item.  In my scenario I expect 7 days * 24 hourly records (168 items) to be in my list at any given time so this load is not a concern.

With this situation in mind, I will select the Get Items operation.


Once I have selected my Get Items operation, I need to establish my connection to my Office 365 subscription. (I will spare you from the password prompts)


With my connection established, I now need to provide the URL to my SharePoint List.  I can also specify optional parameters that control the number of items returned. 

I must say the experience in this dialog is a good one.  I can click on the Site URL dropdown and all of the sites that I have access to will be in that list.  Once I have selected my URL and then click on the List Name dropdown, I then see all the lists that I have access to on that site.


Next, I need to add another activity and this time I will select the Delete Item operation.


I have a similar experience in the Delete Item dialog to the one I had in the Get Items dialog. With my connection already established, I need to provide the same Site URL and the same List Name. What is different this time is that I need to provide an ID for the list item that I would like to delete. In this case it will be an ID coming from my Get Items response.

Delete Item

You might be asking yourself – how is that going to work for all of the items in my list? Don't you need a for loop to iterate through the Get Items collection? The answer is yes, but the Logic Apps team has made this very simple – they have done the heavy lifting for you. If you go into Code View you can see it there:

"foreach": "@body('Get_items')['value']",
"inputs": {
    "host": {
        "api": {
            "runtimeUrl": "https://logic-apis-westus.azure-apim.net/apim/sharepointonline"
        },
        "connection": {
            "name": "@parameters('$connections')['sharepointonline']['connectionId']"
        }
    },
    "method": "delete",
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent(string('https://SharePointsharepoint.com/sites/HOPOC/Bighorn/SitePages/Home.aspx')))}/tables/@{encodeURIComponent(encodeURIComponent(string('c28b1ea2-e2a0-4faf-b7c2-3eerec21a8b')))}/items/@{encodeURIComponent(string(item()['ID']))}"
}
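Outside of Logic Apps, the same get-then-delete pattern can be sketched in plain Python against the SharePoint REST API. This is only an illustration of the shape of the calls, not what the connector does under the hood; the site URL, list title and bearer token are hypothetical placeholders:

```python
from urllib.request import Request, urlopen

# Hypothetical values; substitute your own site, list title and token.
SITE = "https://contoso.sharepoint.com/sites/HOPOC"
LIST_API = f"{SITE}/_api/web/lists/getbytitle('Forecast')/items"
HEADERS = {"Accept": "application/json;odata=verbose",
           "Authorization": "Bearer <token>"}

def item_ids(get_items_body: dict) -> list:
    """Pull the item IDs out of a Get Items-style response body,
    mirroring @body('Get_items')['value'] in the workflow definition."""
    return [item["ID"] for item in get_items_body["value"]]

def purge(ids):
    """Issue one DELETE per item, as the Logic App foreach does (sketch)."""
    for item_id in ids:
        req = Request(f"{LIST_API}({item_id})", method="DELETE",
                      headers={**HEADERS, "IF-MATCH": "*"})
        urlopen(req)
```

The foreach in the generated code behind is doing exactly what `purge` does here: one delete call per ID in the Get Items response.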



The end result is a Logic App that looks like this:

Total Solution

I can now run my Logic App from the Azure Portal by clicking on Select Trigger and then Recurrence.


I can follow my execution and dive into my Get Items and Delete Item calls. By inspecting my inbound and outbound traces I can see the exact payloads and HTTP status codes from the underlying operations. In this case I was able to delete 300 items in 23 seconds. For each item in my collection, a Delete Item call is made.



It honestly took me about 10 minutes to figure this out. Part of the reason I am writing about this is that I know how long it would take with BizTalk: I anticipate at least 3 hours if it was your first time. This isn't a knock against BizTalk; Logic Apps has, IMO, been built for exactly these lightweight, low-friction scenarios.

In the short term, I think Microsoft integration architects and developers will have many opportunities like this one where you can choose one tool or the other. For me, developer productivity needs to be part of that equation. Where the systems you are integrating are hosted will also play a role. In this case, I am connecting to SharePoint Online and SQL Azure, so it also doesn't make sense, IMO, to route all of this information back on-premises only to go back up to the cloud.

We may also see a hybrid approach where a BizTalk process calls out to a Logic App to take advantage of these new Logic App connectors. This was something Microsoft discussed at the 2016 Integrate event in London.

In a future blog post we will talk about the other half of this solution which is how to create items in SharePoint Online from SQL Azure.

Speaking at Integrate 2016

BizTalk360 has recently released more details on their annual conference in London. This year there is a name change: instead of the BizTalk Summit, the name has been altered to align with the "Integrate" brand. In case you were not aware, BizTalk360 organized the last Integrate Summit in Redmond back in December 2014, so it makes sense to carry that name forward. BizTalk360 has been working closely with the Microsoft product groups to put on a great event.

This year the summit looks to be bigger and better than ever. There are more than 20 speakers lined up over 3 days. The speakers have a variety of backgrounds, including the Microsoft product group, consultants, customers, system integrators and MVPs. There is also an opportunity to hear from Microsoft's leadership team and get insight into their plans as they pertain to Azure App Service and integration.

The session abstracts look great! They cover a broad set of technologies that will appeal to integration-focused professionals, including:

  • BizTalk Server 2016
  • Azure Logic Apps
  • Azure App Service
  • Azure Service Bus (Event Hubs and Messaging)
  • Internet of Things (IoT)
  • Azure API Management
  • Azure Stream Analytics
  • Power BI

My topic will focus on some of the recent learnings from an industrial IoT project. I will talk about tag data ingestion, complex event processing (calculations, reference data, out of bounds, absence of event) and visualization. I will also throw in a little BizTalk and Logic Apps for good measure.

This will be my third time speaking in London at a BizTalk360 event.  I am once again looking forward to the experience as BizTalk360 always puts on a good show and it is a great opportunity to network with the excellent European Integration community.

For more details, please check out the event page.  There are early bird specials so pay close attention to the dates.

See you in London!



Azure Logic Apps Preview Refresh: Moving a v1 app to v2

The Microsoft Integration team (BizTalk/Logic Apps) recently hit an important milestone.  They have launched the Logic Apps Preview Refresh.  You can find the original press release here.

The purpose of this post is to highlight some of the recent investments and then take an existing v1 Logic App and rebuild it in v2.

New Designer

Probably the biggest feature released was the new designer experience. As opposed to the left-to-right workflow we saw in v1 of Logic Apps, we now find ourselves moving top-down. I think for most BizTalk people, moving top-down is more natural.


In addition to top-down, we also have a search/IntelliSense-type experience when looking for new shapes (connectors). It is a neat way to approach the problem, as a canvas that contains pages and pages of connectors isn't the best user experience.


Intelligent Output

Another benefit is that we are not as dependent on the Logic Apps Workflow Language. It is still there, but Microsoft has abstracted much of it away from us. So if we want to use a value from a previous step in our workflow, we can just select it. A great addition!


This is very much welcomed over the v1 equivalent: A new table has been created @{body('wearsy.servicenow.apiapp').TableName}


Another investment has been in the way you add an action or a condition. No longer is a condition 'hidden' at the card/connector level. It is very explicit, easy to read and natural.


Moving the furniture

With v1 of Logic Apps, once you had a card on the canvas you were committed. Rearranging the cards was only possible by editing the JSON in the code behind, which at times was painful, so this is a great addition. You can now move cards around as long as you are not breaking dependencies.



Native Webhook support

Logic Apps now supports webhooks, which allow developers to build 'callback' interfaces over HTTP. Many services support webhooks as a way to extend their offering, including Visual Studio Online, GitHub, Stripe and PayPal, to name a few. An example the Logic Apps team likes to use is that Visual Studio Online can call a Logic App when code is committed to your master branch, as a way to kick off other downstream processes.
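From the calling side, a webhook is just an HTTP POST to the Logic App's callback URL. As a rough sketch (the callback URL and event fields below are hypothetical placeholders, not a real Logic App endpoint or the Visual Studio Online payload format):

```python
import json
from urllib.request import Request, urlopen

# Hypothetical callback URL, copied from a Logic App's HTTP trigger.
CALLBACK_URL = "https://prod-00.westus.logic.azure.com/workflows/abc/triggers/manual/run"

def commit_payload(repo: str, branch: str, author: str) -> bytes:
    """Shape a minimal 'code committed' event body for the webhook call."""
    return json.dumps({"repo": repo, "branch": branch, "author": author}).encode()

def notify(repo: str, branch: str, author: str):
    """POST the event to the Logic App callback (sketch; not run here)."""
    req = Request(CALLBACK_URL, data=commit_payload(repo, branch, author),
                  headers={"Content-Type": "application/json"})
    return urlopen(req)
```

The Logic App's trigger then parses that JSON body and kicks off the downstream steps of the workflow.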

Managed APIs

In v1 of Logic Apps, there was always a provisioning exercise when you wanted to use a new connector/API App. This added some delay to your dev experience and also forced you to create multiple instances in the event you had different connection strings. Now, Microsoft has provisioned a set of managed API connections, which means there is zero delay when using out-of-the-box API connections.

This does have an impact on existing v1 API Apps that you have created, and that is one of the main reasons for this post. Since a custom v1 API App is not in this managed API list, it will not be immediately discoverable within Logic Apps. (This is bound to change, as we are still in preview and Microsoft is working on it.) But since v1 API Apps were decorated with Swagger metadata, we can take advantage of the Http + Swagger connector to consume our existing Swagger "contract".

Note: There are a few actions you need to perform in order for your v1 API App to be discoverable. To avoid duplication, I will refer you to Jeff Hollan's (Microsoft) post and Daniel Probert's post.

As a starting point, here is my v1 Logic App.  I documented the use case here so I won’t get into a lot of details about what it does.  In summary, on a regular basis I want this Logic App to initiate, create a table in Service Now using my custom Service Now connector and send me a text message when it is complete.


With my configuration set as described in those previous blog posts I am going to create a new logic app.

  • The first step is to add the Recurrence card to my Logic App. I will then set a frequency; in this case it is set to a minute, but I will revert it back to 1 week after I am done testing. It is interesting that Microsoft has added a timezone field and a start time field.


  • Next, I will add my Http + Swagger connector. When you think about it, this is actually a very powerful feature. Swagger (recently renamed OpenAPI) is the leading API markup language, used by IBM, Amazon, Apigee and many others. What this means is that any API that has Swagger metadata can now be "plugged" into Logic Apps. This has already provided many benefits. For example, the Microsoft Cortana Analytics team has been including Swagger metadata in their APIs, so now Logic Apps can plug them in and developers do not have to worry about connectivity or write their own wrappers. This is a big win!


  • With my Swagger URL in my clipboard, I can paste it into the Endpoint URL text box and click the Next button.


  • We will now discover all of the different operations that are available for me to use in my Logic App.


  • For the purpose of this post, I will use the Table_PostByValue operation. There are 4 attributes that need to be set. Since I want my table name to be dynamic, I will use the Logic App Workflow Language functions. I did run into a bit of a bug here: if you need to use the @utcnow() function, set it in the code behind. The product team is aware and will be addressing this issue.


  • After configuring my ServiceNow API App, I now want to send a text message using the v2 Twilio API App. I can easily add it by starting to type the word "Twilio".


  • I previously had a Twilio account so I needed to log into their website to obtain my credentials.


  • Next, I need to provide my Twilio phone number and the phone number where I would like to receive a text. I also want to provide some text for the message, including the name of the table that was just created. Thanks to Swagger, Logic Apps knows that the Service Now API App will return the name of the table that was created, and as a result I can simply click on that item to have it added to my message body.


  • After all my configuration is complete, this is what my Logic App looks like. Pretty clean!


  • The end result when testing is that I have a new table created and I receive a text message. Note that my Service Now instance is in the Pacific region, which accounts for the 1-hour timestamp delta.
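Stepping back for a moment: the Http + Swagger connector used above only needs a fairly small document to surface an operation. As a rough sketch (the host, path and body schema here are hypothetical, not my actual API App's contract), a minimal Swagger 2.0 document expressed as a Python dict looks like this:

```python
import json

# A minimal, hypothetical Swagger 2.0 document: one POST operation whose
# operationId is what surfaces as the action name in the Logic Apps designer.
swagger = {
    "swagger": "2.0",
    "info": {"title": "ServiceNow Wrapper", "version": "1.0"},
    "host": "myapiapp.azurewebsites.net",
    "paths": {
        "/api/tables": {
            "post": {
                "operationId": "Table_PostByValue",
                "consumes": ["application/json"],
                "produces": ["application/json"],
                "parameters": [{"name": "table", "in": "body",
                                "schema": {"type": "object"}}],
                "responses": {"200": {"description": "Table created"}},
            }
        }
    },
}

print(json.dumps(swagger, indent=2))
```

The response schema is also what lets the designer offer outputs (like the created table name) as clickable tokens in later steps.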




While the product is still in preview, we can see that the Microsoft team is working hard and is focused on providing continuous value to customers. There are some features I would still like to see included, but this was a good release and a step in the right direction.

Azure Logic Apps Automation

Lately I have been building my own SaaS connector for a well-known SaaS vendor. This vendor (which will remain nameless for this post) provides free dev instances for developers to play with. There is a limitation, though: if you do not create an object in their system within 10 days, your instance gets revoked. My interest in the platform has been very much about calling their APIs, and unfortunately calling their APIs does not count as 'usage'. To work around this, I have been manually logging into the service and creating a dummy table, which resets the counter.

I have been meaning to automate this, and a recent hackathon gave me the opportunity to do so. I extended my SaaS connector to include a Table entity and the ability to create a new table, based on some parameters, via their API.

Recently, I have been seeing more and more people automating simple tasks with Logic Apps, so I figured: why not? I can use the Recurrence trigger in Logic Apps, generate some unique parameters using the expression language, and then create my table once a week with a unique name. This saves me from manually performing the same action.
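The unique-name generation is the only interesting bit of logic. In the Logic App it is done with workflow-language expressions (concat, utcnow, guid); the Python equivalent below is just an illustration of the idea, and the "vakna" prefix is my own choice, not anything required by the platform:

```python
import uuid
from datetime import datetime, timezone

def table_name(prefix: str = "vakna") -> str:
    """Generate a unique weekly table name: the Python equivalent of
    chaining concat(), utcnow() and guid() in the workflow language."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d")
    return f"{prefix}_{stamp}_{uuid.uuid4().hex[:8]}"
```

Each weekly run produces a name like vakna_20160715_3f9a2c1b, so the create call never collides with a previous week's table.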

To add some 'bells and whistles' to my Logic App, I added the Twilio connector so that I receive a text message whenever a new table is created. Perhaps in the future I will also automate the clean-up of these tables.

Here is my Logic App – I call it Vakna (which means 'awake' in Swedish – I was sitting beside my Swedish buddies when I built it).