Speaking at Global Integration Bootcamp

On a previous episode of #MiddlewareFriday I talked about a global integration event occurring on Saturday, March 25th, 2017. I am happy to announce that I will be speaking at the New York meetup, which is being held at the Microsoft Technology Center near Times Square. This is my second opportunity to speak at the MTC and I am very much looking forward to it.

The FREE event will take place from 8:30 am to 5 pm and registration is open.

VNB Consulting is hosting the event and I would like to thank Amit and Howard for inviting me to speak. My topic will be on protecting Azure Logic Apps with Azure API Management.  I will be using the new Azure API Management Design Surface, so even if you are very familiar with Azure API Management, you will learn something new.

The event is also taking place simultaneously in nine other locations worldwide, including Australia, Belgium, Finland, India, New Zealand, The Netherlands, Norway, Portugal and Sweden. You can find more information at the bootcamp’s website.

image

#MiddlewareFriday

This post is about a side project that I have going on with Saravana Kumar and BizTalk360. The purpose of #MiddlewareFriday is to create a video blog of new and interesting developments going on in the industry. Each week we will publish a short video with some new content. The content may feature news and demos, and will also highlight other activities going on in the community. From time to time we will also bring on some guests to keep the content fresh and get some different perspectives.

For both Saravana and me, there is no direct commercial incentive in doing the show. It really comes down to participating in a community, learning by doing, improving communication skills and having some fun along the way.

I am going to keep this post updated with a running list of the shows – in part to aid search engine discovery.

Episode | Title | Date | Tags
1 | Protecting Azure Logic Apps with Azure API Management | January 6, 2017 | Azure API Management, Logic Apps, ServiceNow, API Apps
2 | Azure Logic Apps and Service Bus Peek-Lock | January 13, 2017 | Logic Apps, Service Bus, Patterns
3 | Logic Apps and Cognitive Services Face API – Part 1 | January 20, 2017 | Logic Apps, Cognitive Services, Face API, Steef-Jan Wiggers
4 | Microsoft PowerApps and Cognitive Services Face API – Part 2 | January 27, 2017 | PowerApps, Cognitive Services, Face API
5 | Serverless Integration with Steef-Jan Wiggers | February 3, 2017 | Logic Apps, Sentiment Analysis, Slack, Azure Functions, Steef-Jan Wiggers
6 | Azure Logic Apps and Power BI Real Time Data Sets | February 10, 2017 | Logic Apps, Power BI connector, Sandro’s Integration stencils, Quicklearn, Global Integration Bootcamp
7 | Azure Monitoring, Azure Logic Apps and Creating ServiceNow Incidents | February 17, 2017 | Logic Apps, Azure Monitoring, API Apps, ServiceNow, Glen Colpaert SAP, Webhook Notification BizTalk360
8 | Exploring ServiceBus360 Monitoring | February 24, 2017 | Service Bus, BizTalk360, Community Content: Team Flow + Luis, Exception handling for Logic App Web Services – Toon Vanhoutte
9 | SAP and Logic Apps (Part 1) | March 3, 2017 | Logic Apps, On-premises data gateway, SAP, Steef-Jan Wiggers, Michael Stephenson, Logic Apps Live review
10 | SAP and Logic Apps (Part 2) using Enterprise Integration Pack | March 10, 2017 | Integration Pack, SAP – Richard Seroter, Enterprise Architecture for cloud natives, BizTalk 2016 Task Schedule Adapter
11 | Logic Apps – Inbound Custom HTTP Paths/Methods | March 17, 2017 | HTTP Request, Custom Paths, GET, DELETE, Johan Hedberg, Continuous Integration
12 | Using Azure API Management to protect BizTalk Server Endpoints – with Steef-Jan Wiggers | March 24, 2017 | Azure API Management, BizTalk, Azure Service Bus Relay
13 | Global Integration Bootcamp – Highlights from New York City | March 31, 2017 | Global Integration Bootcamp, Stephen W Thomas, Howard Edidin, Mandi Ohlinger
14 | Azure Functions, Swagger and API Management (Coming Soon) | April 7, 2017 | –

2016 Year in Review, Looking Ahead to 2017

It is that time of year when I like to reflect on what the previous year has brought and set my bearings for the road ahead. If you are interested in reading my 2015 recap, you can find it here.

Personal

2016 was a milestone birthday for my twin brother and me. To celebrate, and to try to deny getting old for another year, we decided to run a marathon in New York City. The NYC Marathon is one of the six major marathons in the world, so it made a fantastic backdrop for our celebration. Never one to turn down an adventure, my good friend Steef-Jan Wiggers joined us for this event. As you may recall, Steef-Jan and I ran the Berlin Marathon (another major) back in 2013.

Marathon

The course was pretty tough. The long arching bridges created some challenges for me, but I fought through it and completed the race. We all finished within about 10 minutes of each other and had a great experience touring the city.

Before Race 

Kurt, Kent and Steef-Jan in the TCS tent before the race

AfterRace

At the finish line with the hardware.

After Party

Celebrating our victory at Tavern on the Green in Central Park.

Speaking

Traveling and speaking are things I really like to do, and the MVP program has given me many opportunities to scratch that itch. I also need to thank my previous boss and mentor Nipa Chakravarti for all of the support that she has provided, which made all of this possible.

In Q2, I once again had a chance to head to Europe to speak at BizTalk360’s Integrate event alongside the Microsoft Product Group. My topic was Industrial IoT and some of the project work that we had been doing. You can find a recording of this talk here.

Speaking

On stage….

Integrate

I really like this photo as it reminds me of the conversation I was having with Sandro. He was trying to sell me a copy of his book, and I was trying to convince him that if he gave me a free copy, I could help him sell more. Sandro has to be one of the hardest-working MVPs I know, and he is recognized as one of the top Microsoft Integration gurus. If you are ever having a problem in BizTalk, there is a good chance he has already solved it. You can find his book here in both digital and physical versions.

Integrate Group

BizTalk360 continues to be an integral part of the Microsoft Integration community. Their 2016 event had record attendance from more than 20 countries. Thank you, BizTalk360, for another great event and for building a great product. We use BizTalk360 every day to monitor our BizTalk and Azure services.

On a bit of a different note, this past year we had a new set of auditors come in for SOX compliance. For the first time in my experience, the auditors were really interested in how we were monitoring our interfaces and what our governance model was. We passed the audit with flying colours, and that was in large part down to having BizTalk360. Without it, our results would not have been what they were.

Q3

Things really started to heat up in Q3. The first of many trips was out to Toronto to speak at Microsoft Canada’s Annual General Meeting. I shared the stage with Microsoft Canada VP Chris Barry as we chatted about Digital Transformation and discussed our experiences with moving workloads to the cloud.

AGM

Next up was heading to the southeast United States to participate in the BizTalk Bootcamp. This was my third time presenting at the event. I really enjoy speaking at this event as it is very well run and has a very intimate setting. I have had the chance to meet some really passionate integration folks at this meetup, so it was great to catch up once again. Thank you, Mandi Ohlinger and the Microsoft Pro Integration team, for having me out to Charlotte once again.

BizTalkBootcamp

At the Bootcamp talking about Azure Stream Analytics Windowing.

The following week, I was off to Atlanta to speak at Microsoft Ignite.  Speaking at a Microsoft premier conference like Ignite (formerly TechEd) has been a bucket list item so this was a really great opportunity for me.  At Ignite, I was lucky enough to participate in two sessions.  The first session that I was involved in was a customer segment as part of the PowerApps session with Frank Weigel and Kees Hertogh.  During this session I had the opportunity to show off one of the apps my team has built using PowerApps.  This app was also featured as part of a case study here.

Powerapps

On stage with PowerApps team.

Next up was a presentation with John Taubensee of the Azure Messaging team. Once again my presentation focused on some Cloud Messaging work that we had completed earlier in the year. Working with the Service Bus team has been fantastic this year. The team has been very open to our feedback and has helped validate different use cases that we have. In addition to this presentation, I also had the opportunity to work on a customer case study with them. You can find that document here. Thanks to Dan Rosanova, John Taubensee, Clemens Vasters and Joe Sherman for all the support over the past year.

Ignite

Lastly, at the MVP Summit in November, I had the opportunity to record a segment in the Channel 9 studio.  Having watched countless videos on Channel 9, this is always a neat experience.  The segment is not public yet, but I will be sure to post when it is.  Once again, I had the opportunity to hang out with Sandro Pereira before our recordings.

Sandro

Channel9Studio

In the booth, recording.

Channel9StudioCouch

Prepping in the Channel 9 studio

Writing

I continue to write for InfoQ on Richard Seroter’s Cloud editorial team. It has been a great experience writing as part of this team. Not only do I get exposed to some really smart people, I get exposed to a lot of interesting topics, which only fuels my career growth. In total I wrote 46 articles, but here are my top 5 – the ones I either really enjoyed writing or learned a tremendous amount from.

  • Integration Platform as a Service (iPaaS) Virtual Panel – In this article, I had the opportunity to interview some thought leaders in the iPaaS space from some industry-leading organizations. Thank you, Jim Harrer (Microsoft), Dan Diephouse (MuleSoft) and Darren Cunningham (SnapLogic), for taking the time to contribute to this feature. I hope to run another panel in 2017 to gauge how far iPaaS has come.
  • Building Conversational and Text Interfaces Using Amazon Lex – After researching this topic, I immediately became interested in Bots and Deep Learning. It was really this article that acted as a catalyst for spending more time in this space and writing about Google and Microsoft’s offerings.
  • Azure Functions Reach General Availability – Something that I like to do, when possible, is to get a few sound bites from people involved in the piece of news that I am covering. I met Chris Anderson at the Integrate event earlier in the year, so it was great to get more of his perspective when writing this article.
  • Microsoft PowerApps Reaches General Availability – Another opportunity to interview someone directly involved in the news itself. This time it was Kees Hertogh, a Senior Director of Product Marketing at Microsoft.
  • Netflix Cloud Migration Complete – Everyone in the industry knows that Netflix is a very innovative company that has disrupted and captured markets from large incumbents. I found it interesting to get more insight into how they have accomplished this. Many people probably thought the journey was very short, but I found that wasn’t the case. It was a very methodical approach that actually took around 8 years to complete.

Another article that I enjoyed writing was for the Microsoft MVP blog called Technical Tuesday.  My topic focused on Extending Azure Logic Apps using Azure Functions. The article was well received and I will have another Technical Tuesday article published early in the new year.

Back to School

Blockchain

I left this topic off of the top 5 deliberately as I will talk about it here, but it absolutely belongs up there. Back in June, I covered a topic for InfoQ called Microsoft Introduces Project Bletchley: A Modular Blockchain Fabric. I really picked this topic up out of our Cloud queue as my boss at the time had asked me about Blockchain and I didn’t really have a good answer. After researching and writing about the topic, I had the opportunity to attend a Microsoft presentation in Toronto for financial organizations looking to understand Blockchain. At the Microsoft event (you can find a similar talk here), Alex Tapscott gave a presentation about Blockchain and where he saw it heading. ConsenSys, a Microsoft partner and Blockchain thought leader, was also there talking about the Brooklyn Microgrid. I remember walking out of the venue that day thinking everything was about to change. And it did. I needed to better understand blockchain.

For those that are not familiar with blockchain, simply put, it is a paradigm that focuses on using a distributed ledger for recording transactions and providing the ability to execute smart contracts against those transactions. An underlying principle of blockchain is to address the transfer of trust amongst different parties. Historically, this has been achieved through intermediaries that act as a “middleman” between trading partners. In return, the intermediary takes a cut of the transaction, but doesn’t really add a lot of value beyond collecting and disbursing funds. Trading parties are then left to deal with the terms that the intermediary sets. This model typically does not provide incentives for innovation; in fact, it typically does the opposite, stifling it through complacency and entitlement on the part of large incumbent organizations.

What you will quickly discover with blockchain is that it is more about business than technology.  While technology plays a very significant role in blockchain, if your conversation starts off with technology, you are headed in the wrong direction.  With this in mind, I read Blockchain Revolution by Alex and Don Tapscott which really focuses on the art of the possible and identifying some modern-day scenarios that can benefit from blockchain.  While some of the content is very aspirational, it does set the tone for what blockchain could become.

Having completed the book, I decided to continue down the learning path, this time focusing on the technical side. I am a firm believer that in order for me to truly understand something, I need to touch it. By taking the Blockchain Developer course from B9Lab I was able to get some hands-on experience with the technology. As a person that spends a lot of time in the Microsoft ecosystem, this was a good opportunity to get back into Linux and more of the open source community, as blockchain tools and platforms are pretty much all open source. Another technical course that I took was the following course on Udemy. The price point for this course is much lower, so it may be a good place to start without making a more significant financial investment in a longer course.

Next, I wanted to be able to apply some of my learnings. I found the Future Commerce certificate course from MIT. It was a three-month course, all delivered online. There were about 1000 students, worldwide, in the course and it was very structured and based upon a lot of group work. I had a great group that I worked with on an energy-based blockchain startup. We had to come up with a business plan, pitch deck, solution architecture and go-to-market strategy. Having never been involved in a start-up at this level (I did work for MuleSoft, but they were at more than 300 people at the time), it was a great experience to work through this under the tutelage of MIT instructors.

If you are interested in the future of finance, aka FinTech, I highly recommend this course. There is a great mix of finance, technology, entrepreneurship, risk and legal folks in this class, and you will learn a lot.

MIT

Gary Vaynerchuk

While some people feel that Twitter is losing its relevancy, I still get tremendous value out of the platform. The following is just one example. Someone I follow on Twitter is Dona Sarkar from Microsoft; I had the opportunity to see her speak at the Microsoft Worldwide Partner Conference and quickly became a fan. Back in October, she put out the following tweet, which required further investigation on my part.

Gary V

Dona’s talks, from the ones that I have seen, are very engaging and entertaining at the same time. If she was talking about “Gary Vee” in this manner, I figured there must be something here, so I started to digest some of his content and was very quickly impressed. What I like about Gary is that he has a bias for action. Unfortunately, I don’t see this too often in Enterprise IT shops; we try to boil the ocean and watch initiatives fail because people have added so much baggage that the solution is unachievable or people have become disenfranchised. I have also seen people rewarded for building “strategies” without a clue how to actually implement them. I find this really prevalent in Enterprise Architecture, where some take pride in not getting into the details. While you may not need to stay in the details for long, without understanding the mechanics a strategy is just a document. And a strategy that has not been, or cannot be, executed is useless.

If you have not spent time listening to Gary, here are some of his quotes that really resonated with me.

  • Bet on your strengths and don’t give a f&%# about what you are not good at.
  • Educate…then Execute
  • You didn’t grow up driving, but somehow you figured it out.
  • Results are results are results
  • I am just not built, to have it dictate my one at-bat at life.
  • Document, Don’t Create.
  • We will have people who are romantic and hold onto the old world who die and we will have people that execute and story tell on the new platform who emerge as leaders in the new world.
  • I am built to get punched in the mouth, I am going spit my front tooth out and look right back at you and be like now what bitch.

image

If this sounds interesting to you, check out a few of his content clips that I have really enjoyed:

Looking Forward

I find it is harder and harder to do this.  The world is changing so fast, why would anyone want to tie themselves down to an arbitrary list? Looking back on my recap from last year, you won’t find blockchain or bots anywhere in that post, yet those are two of the transformative topics that really interested me in 2016.  But, there are some constants that I don’t see changing.  I will continue to be involved in the Microsoft Integration community, developing content, really focused on iPaaS and API Management.  IoT continues to be really important for us at work so I am sure I will continue to watch that space closely.  In fact, I will be speaking about IoT at the next Azure Meetup in Calgary on January 10th.  More details here.

I will also be focusing on blockchain and bots/artificial intelligence as I see a lot of potential in these spaces.  One thing you can bet on is that I will be watching the markets closely and looking for opportunities where I see a technology disrupting or transforming incumbent business models.

Also, it looks like I will be running a marathon again in 2017. My training has begun and I am just awaiting confirmation of my entry into the race.

Speaking at Microsoft Ignite

With everything now confirmed, I am happy to report that I will be speaking at Microsoft’s largest conference, Ignite, in September in Atlanta, Georgia.

I will be sharing the stage with John Taubensee from the Azure Service Bus team.  We will be talking about some real world messaging scenarios.  Service Bus and Event Hubs are important pillars in our Integration strategy at TransAlta so I am looking forward to discussing how we use these technologies to provide mission critical messaging for our business.

You can find more details on the Ignite website about our session.

Hybrid Connectivity with Logic Apps and API Apps

Note: This blog post was written in early July, 2016 against preview bits.  The tools and techniques used in this post are bound to change.

In a previous blog post (Using Azure AD Authentication between Logic Apps and Azure API Apps) I discussed a hybrid scenario where I wanted to expose on-premises data to an Azure Logic App via an Azure API App. In my scenario, the data that I want to expose comes via another middleware platform called Denodo.

What is Denodo?

Denodo is a Data Virtualization Platform. But what does that mean? Well, I consider it to be a data integration tool. On the surface, you may wonder whether this tool competes with other integration brokers/ESBs. If used correctly, I don’t think so. If overused or abused, then sure, it can compete – much like many other tools.

Instead, Denodo competes more with ETL tools such as SSIS or SAP Data Services. What it aims to do is eliminate, or at least reduce, the amount of ETL (i.e. data copying) performed against data sources just to consolidate them upstream. As soon as ETL is in the mix, you have batch processing and risk running into stale data or missed opportunities.

Instead, you introduce a “data abstraction” layer that sits in front of physical data sources and can project virtual database/table views. These views can span tables and even multiple data sources. Data sources are not limited to databases either; Denodo can virtualize files and scrape web pages. Scraping a web page seems like a very tedious thing to do – and it can be, but less so if your tools are flexible enough that when you want to make a change you can easily do so.

image

One of the other benefits of Data Virtualization platforms, such as Denodo, is that they can take a regular ODBC/JDBC data source and project it as something else, such as a REST service. In my scenario, this is part of the reason why I am using Denodo to begin with.

In the app that I am building in the previous blog post, I need to bring in multiple data sets. One data set is on-premises with an Oracle backend, another is a CSV file that gets published on a regulator’s website every hour, and the last is a cloud data source.

All of these data sources have been projected as RESTful services using HTTP and JSON. But now I have an issue. How do I connect a Logic App with these on-premises REST services? Yes, Logic Apps has an HTTP connector, but can it see on-premises endpoints? Denodo sits on our corporate intranet and I am not interested in opening up inbound firewall ports. Even if I could access Denodo from Logic Apps, Denodo does not have any metadata exposed, such as Swagger, so I would have to deal with message shapes in Logic Apps.

One way to solve this problem is to use an Azure API App. By doing so, we can wrap our Denodo RESTful services in our own Web API, decorate it with Swagger and then publish it to Azure. This also lets us enable a Hybrid Connection, since an API App really runs in the context of a Web App. And we can plug into Logic Apps easily, since Swagger is a first-class citizen in Logic Apps.
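
To make the wrapping idea a little more concrete, here is a minimal sketch of what such a controller could look like. The controller name, on-premises hostname, port and view path are all made up for illustration, and real code would need proper error handling:

```csharp
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;
using Newtonsoft.Json.Linq;

namespace DenodoApiApp.Controllers
{
    // Wraps an on-premises Denodo REST view so that a Logic App can reach it
    // through the API App's public endpoint. Swashbuckle picks this controller
    // up automatically when it generates the Swagger metadata.
    public class MarketDataController : ApiController
    {
        // This is the *local* on-premises URL (hostname, port and path are
        // made up). The Hybrid Connection configured on the App Service
        // tunnels the call; its URL is never referenced in code.
        private const string DenodoViewUrl =
            "http://denodo01.corp.local:9090/denodo-restfulws/marketdata/views/prices";

        private static readonly HttpClient Client = new HttpClient();

        // GET api/MarketData
        public async Task<IHttpActionResult> Get()
        {
            HttpResponseMessage response = await Client.GetAsync(DenodoViewUrl);
            if (!response.IsSuccessStatusCode)
            {
                // Surface the downstream failure code to the caller.
                return StatusCode(response.StatusCode);
            }

            // Denodo returns JSON; parse and relay it so content negotiation
            // emits real JSON rather than an escaped string.
            string payload = await response.Content.ReadAsStringAsync();
            return Ok(JToken.Parse(payload));
        }
    }
}
```

Once this is published, Swashbuckle exposes the operation in the Swagger document, which is what Logic Apps will later use to work out the message shape.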

By adding an Azure API App, my architecture now looks like this:

image

In my previous post, I walked through deploying my API App so I am not going to get into more details here on that process.  At this point, I am assuming the API App has been deployed.

  • Within your App Service, navigate to the Networking blade and then click Configure your hybrid connection endpoints.

image

  • Click on Add

image

  • Add New hybrid connection

image

  • Provide a Name, Hostname, Port and then click on Configure required settings

image

  • Provide a Name, Pricing Tier, Resource Group, Subscription and Location

image

  • Click OK to continue
  • The Hybrid Connection will get created

image

  • The Hybrid Connection will take a couple of minutes to provision. Once provisioned, we will see that it has a Not Connected status. This is normal since we have not set up our on-premises agent yet.

image

  • Next, we need to download and install the agent. Click on the name of your Hybrid Connection, which in my case is OnPremDenodo.
  • Next, click on Listener Setup and then Install and configure now.

image

  • An executable will show up in your browser.  Open it.

image

  • Click Run

image

  • Midway through the installation you will be prompted to provide a Hybrid Connection String.  You can get this from the Azure Portal in the blade that you downloaded the Hybrid Connection agent from.

image

  • Once your connection string has been entered, your installation should be complete.

image

  • In the Azure Portal, your Hybrid Connection should now have a status of Connected

image

Notes:

  • From within your API App, the URL of the REST service that you are trying to hit will be your local REST service URL.  It will not be your Hybrid Connection.
  • In fact, you actually don’t need to reference the Hybrid Connection URL anywhere. When your API App gets called and in turn calls your on-premises URL, the Hybrid Connection will kick in and you will access that resource through that tunnel.
  • Logic Apps will access your API App using the public Azure endpoint, hence the need to lock it down using Azure AD as I discussed in my previous post.

Conclusion

In this post we reviewed how it is possible for a Logic App to access an on-premises REST service by using Hybrid Connectivity through a custom API App. We could have also used a site-to-site VPN connection, but that would have involved more moving parts, approvals and people.

The Current State of iPaaS

In addition to my day job of being an Enterprise Architect at an Energy company in Calgary, I also write for InfoQ.  Writing for InfoQ allows me to explore many areas of Cloud and engage with many thought leaders in the business.  Recently, I had the opportunity to host a Virtual Panel on Integration Platform as a Service (iPaaS).

Participating in this panel were:

  • Dan Diephouse – Director of Product Management at MuleSoft where he is part of the team that launched MuleSoft’s iPaaS offering: CloudHub.
  • Darren Cunningham – Vice President of Marketing at SnapLogic where he focuses on product management and outbound product marketing.
  • Jim Harrer – Principal Group Program Manager in the Cloud & Enterprise division, at Microsoft, where his team is responsible for Program Management for BizTalk Server and Microsoft’s iPaaS offering: Azure Logic Apps.

Overall, I was very happy with the outcome of the article.  I think the panelists offered some great insight into the current state of iPaaS and where this paradigm is headed.

You can read the entire article here and feel free to add comments in the article’s comment section.

Using Azure AD Authentication between Logic Apps and Azure API Apps

NOTE: This blog post was written in June 2016 and is based upon a preview of Azure Logic Apps.  The functionality is bound to change in the future.  I have no additional information about when the new functionality may, or may not, be available.

Recently I have been working on a PoC where I have created an API App that needs to talk to an on-premises REST service hosted in a third-party platform called Denodo. I will talk about those details in a future post. But for the purpose of this post I want to discuss how I can secure my Denodo API App using Azure AD. In the broader solution that I am working on, I know this API App will be called from a Logic App. As a result, I want to prove out that a Logic App can authenticate with Azure AD while calling my API App.

Some documentation that did help me out in this journey was the following post from Stephen Siciliano.  In the comments, Jeff Hollan also provided some commentary that was helpful. There were still a few bumps in the road, so  I figured I would document exactly the steps that I followed, in order to provide a more streamlined experience.

Part 1 – Enabling Authentication on Endpoint

  • Build your API App in Visual Studio. In my case I am wrapping an existing REST API that is provided by the Denodo platform. One of the benefits of wrapping it using App Service is that I can add Swagger metadata, which will help me in Logic Apps. Also, by using an App Service, I can bridge cloud and on-premises using either Hybrid Connections or a VPN connection. More on the hybrid connectivity in the next post (promise).
  • Enable Swagger metadata by uncommenting the .EnableSwaggerUI call in the SwaggerConfig.cs

image
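
For reference, after uncommenting it, the relevant portion of SwaggerConfig.cs (the Swashbuckle file generated with the project) looks roughly like the sketch below; the project namespace and title are placeholders:

```csharp
using System.Web.Http;
using Swashbuckle.Application;
using WebActivatorEx;

[assembly: PreApplicationStartMethod(typeof(DenodoApiApp.SwaggerConfig), "Register")]

namespace DenodoApiApp
{
    public class SwaggerConfig
    {
        public static void Register()
        {
            GlobalConfiguration.Configuration
                // Generates the Swagger document that describes the API.
                .EnableSwagger(c => c.SingleApiVersion("v1", "DenodoApiApp"))
                // This is the call that ships commented out; uncommenting it
                // enables the interactive Swagger UI at /swagger.
                .EnableSwaggerUi();
        }
    }
}
```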

  • Publish the App Service. You will need to specify or create an Azure App Service Plan, Azure Subscription and Resource Group.

image

  • With the Azure App Service Authentication set to off we will be able to access our API through a browser.

image

  • By navigating to our App Service URL and appending “/Swagger” we will see our API exposed. We can interact with our API by clicking on Try It Out!

image

  • At this point we know our API is working and in my case it is calling an on-prem REST API.

NOTE: When you first deploy your API App, a swagger file will be created.  You should download this file before you start locking down your endpoint.

image

  • At my organization we heavily leverage Azure AD. Whenever we are doing something with cloud, we try to plug into Azure AD, as we can manage it centrally and it generally plays nicely with the Microsoft ecosystem and other SaaS apps.
  • At first I thought I was going to have to create an Azure AD Application by going into the old portal, but you don’t have to do that any more.  Within the new Azure AD Portal there is some slick integration going on.
  • While in your App Service, click on Authentication/Authorization, turn on App Service Authentication and select Log in with Azure Active Directory.

image

  • Click on Azure Active Directory under Authentication Providers
  • From Management mode select Express

image

  • Create New AD App and provide AD App Name.
  • Save your configuration
  • If you then navigate to your Web API/Swagger console you should now be challenged to authenticate against Azure AD.  Enter your credentials and you can interact with Swagger console.
  • At this point your endpoint is secure, but how do you connect Logic Apps to use it? Keep reading.

Part 2 – Exposing Swagger Metadata

As of this writing (June 2016), you will have issues with Logic Apps being able to consume your Swagger metadata. The reason for this is that Logic Apps (at least for now) requires that the Swagger metadata is available from a public source and over HTTPS. As soon as you lock down your endpoint, the Swagger metadata is no longer publicly available.

For now (as I fully expect that Microsoft is working on a cleaner solution), take your Swagger metadata and place it in Blob storage without any authentication around it. To do this, perform the following steps:

  • Create the storage account from the new portal

image

  • Use the Resource Manager deployment model; it can be General Purpose with Standard performance, using Locally-redundant storage (LRS). Provide a Resource Group.

image

  • With your storage account created, you can then use a tool like Azure Storage Explorer to manage your storage instance.
  • While logged in with my Storage Account credentials (available from Azure Portal) I created a new Blob Container and then set the access level to Public

image

  • Use the Upload button to upload your Swagger metadata; you can then view the URI for your document by selecting it and clicking on View.

image
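
If you would rather script this step than click through Storage Explorer, a rough equivalent using the WindowsAzure.Storage NuGet package is shown below; the account name, key, container and file names are placeholders:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class UploadSwagger
{
    static void Main()
    {
        // Connection string comes from the storage account's Access keys
        // blade in the Azure Portal (placeholder values here).
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");

        var client = account.CreateCloudBlobClient();
        var container = client.GetContainerReference("swagger");
        container.CreateIfNotExists();

        // Make blobs in the container publicly readable so Logic Apps can
        // fetch the metadata anonymously over HTTPS.
        container.SetPermissions(new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });

        // Upload the swagger file that was downloaded before locking
        // down the API App endpoint.
        var blob = container.GetBlockBlobReference("denodoapi.swagger");
        blob.UploadFromFile(@"C:\temp\denodoapi.swagger");
    }
}
```

The key point is the container-level public access: Logic Apps only needs anonymous read access to the Swagger document over HTTPS, which the blob endpoint already provides.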

Unfortunately we are not done yet. We now need to deal with CORS. There are ways to enable CORS for Blob storage, but I decided not to go down that path…at least for now. As part of the original article that I referenced, Jeff Hollan from the Logic Apps team has provided a bit of a workaround for CORS. With your Swagger file in a public blob container, you can take that URL and use his utility, which has a “CORS bypass” enabled. This may not be a long-term solution and I can’t speak to how long Jeff will keep this alive, but for now it works for me. If you want to enable CORS for your storage account then check out this link.

To use Jeff’s workaround I just added my Swagger/Blob Storage URL to his helper api:

https://corspassthrough.azurewebsites.net/api/path?uri=https://<myblobnamespace>.blob.core.windows.net/<myblobcontainer>/<myswaggerfile>.swagger

  • I then need to update my API Definition in my API App to use this new URL.

image

Part 3 – Wiring Logic App to use AAD

To complete this part we are going to need some data from Azure AD, and we need to get it from the old portal at https://manage.windowsazure.com.

Ultimately, we need to construct a message that looks like the following so that we can provide it as an Authentication header in our API call:

{
  "audience": "<SignOnURL>",
  "clientId": "<ClientID>",
  "secret": "<Secret>",
  "tenant": "<TenantID>",
  "type": "ActiveDirectoryOAuth"
}

Now the question is where do you get these values?  You get them from your Azure AD instance and more specifically the Azure AD App that you created in Part 1 of this blog post.  In the following image I have outlined exactly  where to get the required values.

image
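
This is not part of the Logic Apps configuration itself, but before wiring up the designer I find it useful to sanity-check the AD application with a quick client-credentials token request. A rough ADAL sketch (Microsoft.IdentityModel.Clients.ActiveDirectory NuGet package, placeholder values throughout) is shown below; if this fails, the Logic App call will fail with a similar AADSTS error:

```csharp
using System;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

class TokenCheck
{
    static void Main()
    {
        // These map one-to-one onto the fields in the Logic Apps
        // ActiveDirectoryOAuth authentication object above.
        const string tenant   = "<TenantID>";
        const string clientId = "<ClientID>";
        const string secret   = "<Secret>";
        const string audience = "<SignOnURL>";   // the App ID URI of the AD app

        var context = new AuthenticationContext(
            "https://login.microsoftonline.com/" + tenant);

        AuthenticationResult result = context
            .AcquireTokenAsync(audience, new ClientCredential(clientId, secret))
            .GetAwaiter()
            .GetResult();

        // If this prints a token, the same values should work from the Logic App.
        Console.WriteLine(result.AccessToken);
    }
}
```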

With this information in hand we can now create a new Logic App. To keep things simple, I have created a Logic App that uses a Recurrence trigger, mainly so that I can trigger it on demand.

Next, I should be able to select Show APIs for App Services in the same region and my API should show up.

image

Note: If you have any issues with Swagger metadata, this is where you will see them. You may see the dreaded “Failed to fetch swagger. Ensure you have CORS enabled on the endpoint and are calling an HTTPS endpoint” error. If so, you likely have one of two issues:

  • Swagger Metadata not being publicly accessible
  • CORS

For my API, I am only performing a GET and as a result do not have any query parameters.  The only data I need to send is in my Authentication Header which we covered in a previous step.

image

If you submit a message and get an error like the following, it means something hasn’t been set up correctly with your Authentication header.

{"code":"BadRequest","message":"Http request failed as there is an error getting AD OAuth token: 'AADSTS70001: Application with identifier '<bad_token>' was not found in the directory aff3442b-5f55-409c-be77-da97b366435a\r\nTrace ID: 54eb2e86-2e1b-46p9-8d14-983102278428\r\nCorrelation ID: 873a248f-900c-4f19-9684-447b5bfe6da4\r\nTimestamp: 2016-06-27 03:01:28Z'."}

Conclusion
I fully expect Microsoft to make this a simpler and more streamlined experience, but until that time I think it is important that people are locking down their Azure resources. Lately I have been doing a lot of PoCs with the business and other IT groups. Naturally the question about security is going to come up, and I am not going to just say it is secure, or say it is possible for it to be secure – I want to ensure it is secure. Once you understand all of the mechanics involved, I also don’t think it is too much effort to get it working.