Note: This blog post was written in early July, 2016 against preview bits. The tools and techniques used in this post are bound to change.
In a previous blog post (Using Azure AD Authentication between Logic Apps and Azure API Apps) I discussed a hybrid scenario where I wanted to expose on-premises data to an Azure Logic App via an Azure API App. In my scenario, the data that I want to expose comes via another middleware platform called Denodo.
What is Denodo?
Denodo is a Data Virtualization Platform. But what does that mean? I consider it to be a data integration tool. On the surface, you may wonder whether it competes with other Integration Brokers/ESBs. If used correctly, I don’t think so. If overused or abused, then sure, it can compete, much like many other tools.
Instead, Denodo competes more with ETL tools such as SSIS or SAP Data Services. It aims to eliminate, or at least reduce, the ETL (aka data copying) that exists only to consolidate data sources upstream. As soon as ETL is in the mix, you have batch processing, and you risk stale data or missed opportunities.
Instead, you introduce a “data abstraction” layer that sits in front of your physical data sources and projects virtual database/table views. These views can span tables and even multiple data sources. Data sources are not limited to databases, either: Denodo can virtualize files and scrape web pages. Scraping a web page sounds very tedious, and it can be, but not if your tools are flexible enough that you can easily make a change when you need to.
One of the other benefits of Data Virtualization platforms, such as Denodo, is that they can take a regular ODBC/JDBC data source and project it as something else, such as a REST service. In my scenario, this is part of the reason why I am using Denodo to begin with.
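To make that idea concrete, here is a minimal sketch (in Python, purely for illustration) of what consuming such a projected REST view looks like from a client’s point of view. The URL and the "elements" wrapper around the rows are assumptions for the sketch, not values from my environment:

```python
import json
import urllib.request

# Hypothetical on-premises Denodo REST view URL -- host, port and path
# are placeholders, not real values from this post.
DENODO_VIEW_URL = "http://denodo.intranet.example.com:9090/server/market/views/prices"

def parse_view_response(payload):
    """Extract the rows from a Denodo-style JSON response.

    This assumes rows come wrapped in an "elements" array; if that key
    is absent, the payload itself is returned unchanged.
    """
    if isinstance(payload, dict) and "elements" in payload:
        return payload["elements"]
    return payload

def fetch_view(url=DENODO_VIEW_URL):
    """Call the published REST view over plain HTTP and return its rows."""
    with urllib.request.urlopen(url) as resp:
        return parse_view_response(json.load(resp))
```

The point is that the caller only sees HTTP and JSON; whether the view is backed by Oracle, a file, or a scraped web page is invisible.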
In the app that I am building from the previous blog post, I need to bring in multiple data sets: one is on-premises with an Oracle backend, another is a CSV file that gets published on a regulator’s website every hour, and the last is a cloud data source.
All of these data sources have been projected as RESTful services using HTTP and JSON. But now I have an issue: how do I connect a Logic App to these on-premises REST services? Yes, Logic Apps has an HTTP connector, but can it see on-premises endpoints? Denodo sits in our corporate intranet and I am not interested in opening up inbound firewall ports. Even if I could access Denodo from Logic Apps, Denodo does not expose any metadata, such as Swagger, so I would then have to deal with message shapes in Logic Apps myself.
One way to solve this problem is to use an Azure API App. By doing so, we can wrap our Denodo RESTful services in our own Web API, decorate it with Swagger and then publish it to Azure. We can then enable a Hybrid Connection, since an API App really runs in the context of a Web App. We also plug into Logic Apps easily, since Swagger is a first-class citizen there.
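For context, this is roughly what the Swagger 2.0 document that Logic Apps consumes might look like for such a wrapper API. Every name below (host, path, operation, fields) is a made-up placeholder, not taken from my actual API App:

```json
{
  "swagger": "2.0",
  "info": { "title": "DenodoWrapper", "version": "1.0" },
  "host": "mydenodoapi.azurewebsites.net",
  "schemes": [ "https" ],
  "paths": {
    "/api/prices": {
      "get": {
        "operationId": "GetPrices",
        "summary": "Returns rows from the on-premises Denodo view",
        "produces": [ "application/json" ],
        "responses": {
          "200": {
            "description": "Row set",
            "schema": {
              "type": "array",
              "items": {
                "type": "object",
                "properties": {
                  "symbol": { "type": "string" },
                  "price": { "type": "number" }
                }
              }
            }
          }
        }
      }
    }
  }
}
```

It is exactly this response schema that gives Logic Apps the message shapes that raw Denodo endpoints do not advertise.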
By adding an Azure API App, my architecture now looks like this:
In my previous post I walked through deploying my API App, so I am not going to repeat that process here. At this point, I am assuming the API App has been deployed.
- Within your App Service, navigate to the Networking blade and then click Configure your hybrid connection endpoints.
- Click on Add
- Add New hybrid connection
- Provide a Name, Hostname, Port and then click on Configure required settings
- Provide a Name, Pricing Tier, Resource Group, Subscription and Location
- Click OK to continue
- The Hybrid Connection will get created
- The Hybrid Connection will take a couple of minutes to provision. Once provisioned, it will show a Not Connected status. This is normal, since we have not set up our on-premises agent yet.
- Next, we need to download and install the agent. Click on the name of your Hybrid Connection, which in my case is OnPremDenodo.
- Next, click on Listener Setup and then Install and configure now.
- An executable will show up in your browser. Open it.
- Click Run
- Midway through the installation you will be prompted to provide a Hybrid Connection String. You can get this from the Azure Portal, in the blade from which you downloaded the Hybrid Connection agent.
- Once your connection string has been entered, your installation should be complete.
- In the Azure Portal, your Hybrid Connection should now have a status of Connected
- From within your API App, the URL of the REST service that you are trying to hit will be your local REST service URL; it will not be your Hybrid Connection endpoint.
- In fact, you don’t need to reference the Hybrid Connection URL anywhere. When your API App gets called and, in turn, calls your on-premises URL, your Hybrid Connection will kick in and you will access that resource through the tunnel.
- Logic Apps will access your API App using the public Azure endpoint, hence the need to lock it down using Azure AD as I discussed in my previous post.
In this post we reviewed how a Logic App can access an on-premises REST service through a custom API App using Hybrid Connectivity. We could have also used a site-to-site VPN connection, but that would have involved more moving parts, approvals and people.