Speaking at Integrate 2016

BizTalk360 has recently released more details about their annual conference in London. This year there is a name change: instead of the BizTalk Summit, the event has been renamed to align with the “Integrate” brand. In case you were not aware, BizTalk360 organized the last Integrate Summit in Redmond back in December 2014, so it makes sense to carry that name forward. BizTalk360 has been working closely with the Microsoft product groups to put on a great event.

This year the summit looks to be bigger and better than ever. There are more than 20 speakers lined up over 3 days. The speakers have a variety of backgrounds, including the Microsoft product group, consultants, customers, system integrators and MVPs. There is also an opportunity to hear from Microsoft’s leadership team and get insight into their plans as they pertain to Azure App Service and integration.

The session abstracts look great! The topics cover a broad set of technologies that will appeal to integration focused professionals.  The topics include:

  • BizTalk Server 2016
  • Azure Logic Apps
  • Azure App Service
  • Azure Service Bus (Event Hubs and Messaging)
  • Internet of Things (IoT)
  • Azure API Management
  • Azure Stream Analytics
  • Power BI

My topic will focus on some of the recent learnings from an Industrial IoT project. I will talk about tag data ingestion, complex event processing (calculations, reference data, out of bounds, absence of event) and visualization. I will also throw in a little BizTalk and Logic Apps for good measure.

This will be my third time speaking in London at a BizTalk360 event.  I am once again looking forward to the experience as BizTalk360 always puts on a good show and it is a great opportunity to network with the excellent European Integration community.

For more details, please check out the event page.  There are early bird specials so pay close attention to the dates.

See you in London!

http://www.biztalk360.com/integrate-2016/

speaker-badge

Azure Logic Apps Preview Refresh: Moving a v1 app to v2

The Microsoft Integration team (BizTalk/Logic Apps) recently hit an important milestone.  They have launched the Logic Apps Preview Refresh.  You can find the original press release here.

The purpose of this post is to highlight some of the recent investments and then take an existing v1 Logic App and rebuild it in v2.

New Designer

Probably the biggest feature released was the new designer experience. As opposed to the left-to-right workflow we saw in v1 of Logic Apps, we now find ourselves moving top-down. I think for most BizTalk people, moving top-down is more natural.

Search/Intellisense

In addition to top-down, we also have a search/intellisense type experience when looking for new shapes (connectors). It is a neat way to approach the problem, since a canvas that contains pages and pages of connectors isn’t the best user experience.

image

Intelligent Output

Another benefit is that we are not as dependent on the Logic Apps Workflow Language. It is still there, but Microsoft has abstracted much of it away from us. So if we want to use a value from a previous step in our workflow, we can just select it. A great addition!

image

This is very much welcomed over the v1 equivalent: A new table has been created @{body('wearsy.servicenow.apiapp').TableName}
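Under the covers, the designer still emits workflow-language expressions into the code view; a hedged sketch of what a v2 action body using that same token might look like (the action name, connector name and output property here are illustrative, not the actual generated JSON):

```json
{
  "Send_Message": {
    "type": "ApiConnection",
    "inputs": {
      "body": {
        "text": "A new table has been created @{body('servicenow')?['TableName']}"
      }
    }
  }
}
```

The difference is that in v2 the designer writes this for you when you click the output, instead of you typing the expression by hand.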

Decisions…Decisions

Another investment has been in the way you add an action or a condition. A condition is no longer ‘hidden’ at the card/connector level. It is very explicit, easy to read and natural.

image

Moving the furniture

With v1 of Logic Apps, once you had a card on the canvas you were committed. Rearranging the cards was only possible by editing the JSON in the code behind. At times it was painful, so this is a great addition. You can now move cards back and forth as long as you are not breaking dependencies.

image

image

Native Webhook support

Logic Apps now supports webhooks, which allow developers to build ‘callback’ interfaces over HTTP. Many services support webhooks as a way to extend their offering, including Visual Studio Online, GitHub, Stripe and PayPal, to name a few. An example the Logic Apps team likes to use is calling a Logic App from Visual Studio when code is committed to your master branch, as a way to kick off other downstream processes.
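For illustration, a webhook action in the workflow definition pairs a subscribe call (which hands the external service a callback URL) with an unsubscribe call. This is a minimal sketch with a hypothetical endpoint, not taken from any real service:

```json
{
  "Wait_For_Commit": {
    "type": "HttpWebhook",
    "inputs": {
      "subscribe": {
        "method": "POST",
        "uri": "https://example.com/api/hooks",
        "body": { "callbackUrl": "@{listCallbackUrl()}" }
      },
      "unsubscribe": {
        "method": "DELETE",
        "uri": "https://example.com/api/hooks/@{triggerBody()?['id']}"
      }
    }
  }
}
```

The Logic App then waits until the external service POSTs to the callback URL before continuing.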

Managed APIs

In v1 of Logic Apps, there was always a provisioning exercise when you wanted to use a new connector/API App. This added some delays to your dev experience and also forced you to create multiple instances in the event you had different connection strings. Now, Microsoft has provisioned a set of Managed API connections. This means that there is zero delay when using out of the box API connections.

This does have an impact on existing v1 API Apps that you have created, and that is one of the main reasons for this post. Since a custom v1 API App is not in this Managed API list, it will not be immediately discoverable within Logic Apps. (This is bound to change, as we are still in preview and Microsoft is working on this.) But since v1 API Apps were decorated with Swagger metadata, we can take advantage of the Http + Swagger connector in order to consume our existing Swagger “contract”.

Note: There are a few actions that you need to perform in order for your v1 API App to be discoverable.  In order to avoid duplications, I will refer you to Jeff Hollan’s (Microsoft) post and Daniel Probert’s post.

As a starting point, here is my v1 Logic App.  I documented the use case here so I won’t get into a lot of details about what it does.  In summary, on a regular basis I want this Logic App to initiate, create a table in Service Now using my custom Service Now connector and send me a text message when it is complete.

image

With my configuration set as described in those previous blog posts I am going to create a new logic app.

  • The first step that I want to perform is to add the Recurrence card to my Logic App. I will then set a frequency; in this case it is set to one minute, but I will revert it back to one week after I am done testing. It is interesting that Microsoft has added a time zone field and a start time field. 

image
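In the code view, that trigger boils down to a small recurrence block; a sketch of what it might look like with the new fields filled in (property names and values here are my best understanding of the preview schema, so treat them as an assumption):

```json
{
  "triggers": {
    "Recurrence": {
      "type": "Recurrence",
      "recurrence": {
        "frequency": "Minute",
        "interval": 1,
        "startTime": "2016-03-01T08:00:00Z",
        "timeZone": "Pacific Standard Time"
      }
    }
  }
}
```

Switching back to weekly after testing is just a matter of changing frequency to "Week".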

  • Next, I will add my Http + Swagger connector. When you think about it, this is actually a very powerful feature. Swagger (recently renamed OpenAPI) is the leading API description language, used by IBM, Amazon, Apigee and many others. What this means is that any API that has Swagger metadata can now be “plugged” into Logic Apps. This has already provided many benefits. For example, the Microsoft Cortana Analytics team has been including Swagger metadata in their APIs, so Logic Apps can plug them in and developers do not have to worry about connectivity or write their own wrappers. This is a big win!

image

  • With my Swagger URL in my clipboard, I can paste it into the Endpoint URL text box and click the Next button.

image

  • Logic Apps will now discover all of the different operations that are available for me to use in my Logic App.

image

  • For the purpose of this post, I will use the Table_PostByValue operation. There are 4 attributes that need to be set. Since I want my table to be dynamic, I will use the Logic App Workflow Language functions. I did run into a bit of a bug with this one: if you need to use the @utcnow() function, set it in the code behind. The product team is aware and will be addressing this issue.

image
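As a hedged sketch of the code-behind workaround, the dynamic value can be set directly in the action's inputs in the code view. The ServiceNow URI, attribute name and naming format below are placeholders I made up for illustration:

```json
{
  "Table_PostByValue": {
    "type": "Http",
    "inputs": {
      "method": "post",
      "uri": "https://example.service-now.com/api/now/table",
      "body": {
        "label": "@{concat('demo_', utcnow('yyyyMMddHHmm'))}"
      }
    }
  }
}
```

Once the bug is addressed, the same expression should be settable through the designer.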

  • After configuring my Service Now API App, I now want to send a text message using the v2 Twilio API App. I can easily add it by starting to type the word “Twilio”.

image

  • I previously had a Twilio account so I needed to log into their website to obtain my credentials.

image

  • Next, I need to provide my Twilio phone number and the phone number where I would like to receive a text. I also want to provide some text in the message and include the name of the table that was just created.  By using Swagger, Logic Apps knows that the Service Now API App will return the name of the Table that was created and as a result I can simply click on that item and have it added to my message body.

image

  • After all my configuration is complete, this is what my Logic App looks like.  Pretty clean!!!

image

  • The end result when testing is that I have a new table created and I receive a text message. Note that the Service Now instance is in the Pacific time zone, which accounts for the one-hour timestamp delta.

image

image

Conclusion

While Logic Apps is still in preview, we can see that the Microsoft team is working hard and is focused on providing continuous value to customers. While there are some features I would still like to see included, this was a good release and a step in the right direction.

Azure Logic Apps Automation

Lately I have been building my own SaaS connector for a well-known SaaS vendor. This vendor (which will remain nameless for this post) provides free dev instances for developers to play with. There is a limitation, though: if you do not create an object in their system within 10 days, your instance is revoked. My interest in the platform has been very much related to calling their APIs, and unfortunately calling their APIs does not count as ‘usage’. To work around this, I have been manually logging into the service and creating a dummy table, which resets the counter.

I have been meaning to automate this function and a recent hackathon gave me the opportunity to do so.  I extended my SaaS connector to include a Table entity and the ability to create a new Table based upon some parameters via their API.

Recently, I have been seeing more and more people automating simple tasks in Logic Apps, so I figured why not. I can use the Recurrence trigger in Logic Apps, generate some unique parameters using the expression language, and then create my table once a week with a unique name. This saves me from manually performing the same action.
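A unique weekly table name can be built with the workflow-language functions; an illustrative expression (the parameter name "name" is assumed, and the format string is just one option):

```json
{
  "name": "@{concat('Vakna_', utcnow('yyyy_MM_dd'))}"
}
```

Because the expression is evaluated on each run, every weekly recurrence produces a distinct table name without any manual input.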

To add some ‘bells and whistles’ to my Logic App, I added the Twilio connector so that I can receive a text message whenever this new table is created.  Perhaps in the future I will also automate the clean-up of these tables.

Here is my Logic App. I call it Vakna, which means ‘awake’ in Swedish (I was sitting beside my Swedish buddies when I built it).

image

Azure Hybrid Integration Day coming to Calgary

Every year some of the brightest minds in Microsoft Integration descend upon Redmond, Washington for the Microsoft MVP Summit. This year, three MVPs from Europe (Saravana Kumar, Steef-Jan Wiggers and Michael Stephenson) will be stopping by Calgary on their way to the Summit to give some presentations. A local Microsoft employee, Darren King, and I will also be presenting.

I have shared the stage with these MVPs before and can vouch that attendees are in for a unique experience as they discuss their experiences with Microsoft Azure and BizTalk Server.

During this full day of sessions you will learn how BizTalk and Microsoft Azure can address integration challenges. Session topics include SaaS connectivity, IoT, hybrid SQL Server, BizTalk administration & operations, and Two Speed IT using Microsoft Azure. Also bring your burning questions for our interactive Ask the Experts Q&A.

The free event takes place on October 30th, 2015  at the Calgary Microsoft office.  You can find more details here.

Logic Apps: Integrating Custom SharePoint Lists and Salesforce

I have posted a little tutorial video that describes consuming data from a SharePoint Server custom list and using that data to create new Contacts in Salesforce. The SharePoint Connector takes advantage of the Hybrid Connection capability, which allows messages to flow between the Azure cloud and an on-premises system without requiring ports to be opened in the firewall.

 

Azure App Service Demo Scenario–Part 1

While recording is a lot of work, I find that it is a very useful channel for learning. As I learn more about the new Azure App Service I will be posting some short demonstrations and walkthroughs.

The first post in this series is a quick Logic App demo that includes Twitter and Dropbox integration.  The inspiration for the demo comes from the App Service Documentation which can be found here.

Azure–Service Bus Queues in Worker Roles

 

Another nugget of information that I picked up at TechEd North America is a new template that ships as part of the Azure SDK 1.7 called Worker Role with Service Bus Queue. 

image

What got me interested in this feature is some of the work that I did last year with the AppFabric Applications (Composite Apps) CTP. I was a big fan of that CTP, as it allowed you to wire up different cloud services rather seamlessly. That CTP has been officially shut down, so I can only assume that this template was introduced to address some of the problems the AppFabric Applications CTP sought to solve.

 

The Worker Role with Service Bus Queue template works in both Visual Studio 2010 and Visual Studio 2012 RC. For this post I am going to use 2012 RC.

I am going to use a similar scenario to the one that I used in the AppFabric Applications CTP. I will build a simple web page that allows a “customer” to populate a power outage form. I will then submit this message to a Service Bus Queue and have a Worker Role dequeue it. For the purpose of this post I will simply write a trace event to prove that I am able to pull the message off of the queue.

 

Building the Application

  • Create a new project by clicking on File (or is it FILE) – New Project
  • Select the Cloud template and you will see a blank pane with no template available to select. The Azure SDK is currently built on top of .NET 4.0, not 4.5. With this in mind, we need to select .NET Framework 4

image

  • We now need to select the Cloud Services that will make up our solution.  In my scenario I am going to include an ASP.Net Web Role and a Worker Role with Service Bus Queue.

image

Note: we do have the opportunity to rename these artifacts by hovering over the label and then clicking on the pencil. This was a gap that existed in the old AppFabric Apps CTP. After renaming my artifacts, my solution looks like this:

image

  • I want to send and receive a strongly typed message so I am going to create a Class Library and call it CustomerEntity.

image

  • In this project I will simply have one class called Customer with the following properties

namespace CustomerEntity
{
    public class Customer
    {
        public string Address { get; set; }
        public string City { get; set; }
        public string State { get; set; }
    }
}

 

  • I will then add a reference in both the Web Project and Worker Role projects to this CustomerEntity project.

 

  • Within the PowerOutageWeb project clear out all of the default markup in the Default.aspx page and add the following controls.

<h3>Customer Information:</h3>
Address: <asp:TextBox ID="txtAddress" runat="server"></asp:TextBox><br />   
City: <asp:TextBox ID="txtCity" runat="server"></asp:TextBox> <br />
State: <asp:TextBox ID="txtState" runat="server"></asp:TextBox><br />
<asp:Button ID="btnSubmit" runat="server" Text="Submit"  OnClick="btnSubmit_Click" />
<asp:Label ID="lblResult" runat="server" Text=""></asp:Label>

 

  • Also within the PowerOutageWeb project we need to add references to the Service Bus assembly (Microsoft.ServiceBus.dll) and the Runtime Serialization assembly (System.Runtime.Serialization.dll).

 

  • We now need to provide the following using statements:

using CustomerEntity;
using Microsoft.WindowsAzure;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

 

  • Next we need to provide a click event for our submit button and include the following code:

protected void btnSubmit_Click(object sender, EventArgs e)
{
    Customer cs = new Customer();
    cs.Address = txtAddress.Text;
    cs.City = txtCity.Text;
    cs.State = txtState.Text;

    const string QueueName = "PowerOutageQueue";

    // Get the connection string from the configuration manager
    string connectionString = CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");

    var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

    // Check to see if the queue exists; if it doesn't, create it
    if (!namespaceManager.QueueExists(QueueName))
    {
        namespaceManager.CreateQueue(QueueName);
    }

    MessagingFactory factory = MessagingFactory.CreateFromConnectionString(connectionString);

    // Create the queue client
    QueueClient myQueueClient = factory.CreateQueueClient(QueueName);

    BrokeredMessage bm = new BrokeredMessage(cs);

    // Send the message
    myQueueClient.Send(bm);

    // Update the web page
    lblResult.Text = "Message sent to queue";
}

 

  • Another new feature in this SDK that you may have noticed is the CreateFromConnectionString method available on the MessagingFactory and NamespaceManager classes. This allows us to retrieve our configuration settings from our project properties page. To access the project properties, right-click on the particular role and select Properties. Next, click on Settings, where you will find key/value pairs. The key that we are interested in is Microsoft.ServiceBus.ConnectionString and our value is:

Endpoint=sb://[your namespace].servicebus.windows.net;SharedSecretIssuer=owner;SharedSecretValue=[your secret]

  • Since both our Web and Worker Roles will be accessing the queue, we will want to ensure that both configuration files have this entry included. This will allow our code to make a connection to the Service Bus namespace where our queue may be found. If we edit this property in these locations, then we do not need to modify the Cloud.cscfg and Local.cscfg configuration files, because Visual Studio will take care of this for us.

image
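For reference, a sketch of what the entry looks like inside a service configuration file; the role name and the placeholder values are illustrative (your actual file is maintained by Visual Studio as described above):

```xml
<!-- ServiceConfiguration.Cloud.cscfg (excerpt) -->
<Role name="PowerOutageWeb">
  <ConfigurationSettings>
    <Setting name="Microsoft.ServiceBus.ConnectionString"
             value="Endpoint=sb://[your namespace].servicebus.windows.net;SharedSecretIssuer=owner;SharedSecretValue=[your secret]" />
  </ConfigurationSettings>
</Role>
```

The same Setting element needs to appear under the Worker Role as well, since both roles open a connection to the queue.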

  • Next we want to shift focus to the Worker Role and edit the WorkerRole.cs file. Since we are going to be dequeuing our typed Customer message, we want to include a reference to this namespace:

    using CustomerEntity;

  • Something that you probably noticed when you opened the WorkerRole.cs file is that there is already some code written for us. We can leverage most of it, but we can delete the code that is highlighted in red below:

image

  • Where we deleted this code, we will want to add the following:

Customer cs = receivedMessage.GetBody<Customer>();
Trace.WriteLine(receivedMessage.SequenceNumber.ToString(), "Received Message");
Trace.WriteLine(cs.Address, "Address");
Trace.WriteLine(cs.City, "City");
Trace.WriteLine(cs.State, "State");
receivedMessage.Complete();

In this code we receive a typed Customer message and then simply write out its contents using the Trace utility. If we wanted to save this information to a database, this would be a good place to write that code.

  • We also want to make sure that we update the name of the queue so that it matches the name that we specified in the Web Role:

// The name of your queue
const string QueueName = "PowerOutageQueue";

 

Creating our Queue

We have a few options when it comes to creating the queue that is required for this solution to work. The code in our ASP.NET web page code-behind will take care of it for us, so for our solution to work that method is sufficient. But perhaps we want to use a design-time alternative to specify some more advanced features. In this case we do have a few options:

  • Using the http://www.windowsazure.com portal.
image
  • Similarly we can use the Service Bus Explorer tool that was written by Paolo Salvatori.  Steef-Jan Wiggers has provided an in-depth walk through of this tool so I won’t go into more details here. http://soa-thoughts.blogspot.ca/2012/06/visual-studio-service-bus-explorer.html
  • As part of the Azure 1.7 SDK release, a Visual Studio Service Bus Explorer is now included.  It is accessible from the Server Explorer view from within Visual Studio.  Using this tool we can perform some functions like:
    • Creating Queues/Topics
    • Set advanced properties: Queue Size, Time to Live, Lock Duration, etc.
    • Send and Receive Test Messages
    • Get the current Queue Depth
    • Get our Service Bus Connection string

image

 

Any of these methods will work.  As I mentioned earlier, if we do nothing, the code will take care of it for us.

Testing

We are just going to test our code locally, with the exception of our Service Bus queue, which will be created in Azure. To test our example:

  • Hit F5 and our website should be displayed

image

  • Since we have our Trace statements included in our Worker Role code, we need to launch the Compute Emulator. To do this, right-click on the Azure icon located in your taskbar and select Show Compute Emulator UI.

image

  • A “Console”-like window will appear. We will want to click on the ServiceBusWorker label

image

  • Switch back to our Web Application, provide some data and click the Submit button. We should see our results label updated to indicate “Message sent to queue”.
image
  • If we switch back to our Compute Emulator we should discover that our message has been dequeued.

image

 

Conclusion

While the experience is a little different from that of the AppFabric Applications CTP, it is effective, especially for developers who may not be all that familiar with “integration”. This template really provides a great starting point and allows them to wire up a Service Bus queue to their Web Application very quickly.