Azure Logic Apps Automation

Lately I have been building my own SaaS connector for a well-known SaaS vendor.  This vendor (which will remain nameless for this post) provides free dev instances for developers to play with.  There is a limitation though: if you do not create an object in their system within 10 days, your instance is revoked.  My interest in the platform has been very much related to calling their APIs, and unfortunately calling their APIs does not count as ‘usage’.  To work around this I have been manually logging into the service and creating a dummy table, which resets the counter.

I have been meaning to automate this function and a recent hackathon gave me the opportunity to do so.  I extended my SaaS connector to include a Table entity and the ability to create a new Table based upon some parameters via their API.

Recently, I have been seeing more and more people automating simple tasks in Logic Apps, so I figured why not.  I can use the Recurrence trigger in Logic Apps, generate some unique parameters using the expression language, and then create my table once a week with a unique name.  This saves me from manually performing this same action.

To add some ‘bells and whistles’ to my Logic App, I added the Twilio connector so that I can receive a text message whenever this new table is created.  Perhaps in the future I will also automate the clean-up of these tables.

Here is my Logic App – I call it Vakna, which means ‘awake’ in Swedish (I was sitting beside my Swedish buddies when I built it).
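For the curious, the core of the definition looks something like the sketch below.  The action name and connector inputs are illustrative (my connector’s API is not public), but the Recurrence trigger and the concat()/utcNow() expression for generating a unique table name are the genuine mechanics:

    {
      "triggers": {
        "WeeklyTick": {
          "type": "Recurrence",
          "recurrence": { "frequency": "Week", "interval": 1 }
        }
      },
      "actions": {
        "CreateDummyTable": {
          "type": "ApiApp",
          "inputs": {
            "tableName": "@concat('vakna_', utcNow('yyyyMMddHHmmss'))"
          }
        }
      }
    }

Each run produces a table name like vakna_20151019083000, so the ‘usage’ counter gets reset without any two runs colliding on a name.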


Azure Hybrid Integration Day coming to Calgary

Every year some of the brightest minds in Microsoft Integration descend upon Redmond, Washington for the Microsoft MVP Summit. This year three MVPs (Saravana Kumar, Steef-Jan Wiggers and Michael Stephenson) from Europe will be stopping by Calgary on their way to the Summit and will be giving some presentations.  A local Microsoft employee, Darren King, and I will also be presenting.

I have shared the stage with these MVPs before and can vouch that attendees are in for a unique experience as they discuss their experiences with Microsoft Azure and BizTalk Server.

During this full day of sessions you will learn about how BizTalk and Microsoft Azure can address integration challenges. Session topics include SaaS connectivity, IoT, Hybrid SQL Server, BizTalk administration & operations and Two Speed IT using Microsoft Azure. Also bring your burning questions for our interactive Ask the Experts Q & A.

The free event takes place on October 30th, 2015  at the Calgary Microsoft office.  You can find more details here.

Logic Apps: Integrating Custom SharePoint Lists and Salesforce

I have posted a little tutorial video that describes consuming data from a SharePoint Server Custom List and using that data to create new Contacts in Salesforce. The SharePoint Connector will take advantage of the Hybrid Connection capability, which allows messages to flow between the Azure cloud and an On-Premise system without requiring ports to be opened in the firewall.


Azure App Service Demo Scenario–Part 1

While recording videos is a lot of work, I find that video is a very useful channel for learning. As I learn more about the new Azure App Service I will be posting some short demonstrations and walkthroughs.

The first post in this series is a quick Logic App demo that includes Twitter and Dropbox integration.  The inspiration for the demo comes from the App Service Documentation which can be found here.

Azure–Service Bus Queues in Worker Roles


Another nugget of information that I picked up at TechEd North America is a new template that ships as part of the Azure SDK 1.7 called Worker Role with Service Bus Queue. 


What got me interested in this feature is some of the work that I did last year with the AppFabric Applications (Composite Apps) CTP.  I was a big fan of that CTP as it allowed you to wire up different cloud services rather seamlessly.  That CTP has been officially shut down, so I can only assume that this template was introduced to address some of the problems that the AppFabric Applications CTP sought to solve.


The Worker Role with Service Bus Queue template works in both Visual Studio 2010 and Visual Studio 2012 RC.  For this post I am going to be using 2012 RC.

I am going to use a similar scenario to the one that I used in the AppFabric Applications CTP.  I will build a simple Web Page that will allow a “customer” to populate a power outage form.  I will then submit this message to a Service Bus Queue and have a Worker Role dequeue it. For the purpose of this post I will simply write a trace event to prove that I am able to pull the message off of the queue.


Building the Application

  • Create a new project by clicking on File (or is it FILE) – New Project
  • Select the Cloud template category.  If .NET Framework 4.5 is selected you will see a blank pane with no template available, because the Azure SDK is currently built on top of .NET 4.0, not 4.5.  With this in mind, we need to select .NET Framework 4


  • We now need to select the Cloud Services that will make up our solution.  In my scenario I am going to include an ASP.Net Web Role and a Worker Role with Service Bus Queue.


Note: we do have the opportunity to rename these artifacts by hovering over the label and then clicking on the pencil. This was a gap that existed in the old AppFabric Apps CTP.  After renaming my artifacts my solution looks like this:


  • I want to send and receive a strongly typed message so I am going to create a Class Library and call it CustomerEntity.


  • In this project I will simply have one class called Customer with the following properties:

namespace CustomerEntity
{
    public class Customer
    {
        public string Address { get; set; }
        public string City { get; set; }
        public string State { get; set; }
    }
}


  • I will then add a reference in both the Web Project and Worker Role projects to this CustomerEntity project.


  • Within the PowerOutageWeb project clear out all of the default markup in the Default.aspx page and add the following controls.

<h3>Customer Information:</h3>
Address: <asp:TextBox ID="txtAddress" runat="server"></asp:TextBox><br />   
City: <asp:TextBox ID="txtCity" runat="server"></asp:TextBox> <br />
State: <asp:TextBox ID="txtState" runat="server"></asp:TextBox><br />
<asp:Button ID="btnSubmit" runat="server" Text="Submit"  OnClick="btnSubmit_Click" />
<asp:Label ID="lblResult" runat="server" Text=""></asp:Label>


  • Also within the PowerOutageWeb project we need to add references to the Service Bus Assembly:  Microsoft.ServiceBus.dll and Runtime Serialization Assembly: System.Runtime.Serialization.dll


  • We now need to provide the following include statements:

using CustomerEntity;
using Microsoft.WindowsAzure;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;


  • Next we need to provide a click event for our submit button and then include the following code:

    protected void btnSubmit_Click(object sender, EventArgs e)
    {
        Customer cs = new Customer();
        cs.Address = txtAddress.Text;
        cs.City = txtCity.Text;
        cs.State = txtState.Text;

        const string QueueName = "PowerOutageQueue";

        //Get the connection string from the Cloud Configuration Manager
        string connectionString = CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");

        var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

        //Check to see if the Queue exists; if it doesn't, create it
        if (!namespaceManager.QueueExists(QueueName))
        {
            namespaceManager.CreateQueue(QueueName);
        }

        MessagingFactory factory = MessagingFactory.CreateFromConnectionString(connectionString);

        //Create Queue Client
        QueueClient myQueueClient = factory.CreateQueueClient(QueueName);

        BrokeredMessage bm = new BrokeredMessage(cs);

        //Send Message
        myQueueClient.Send(bm);

        //Update Web Page
        lblResult.Text = "Message sent to queue";
    }


  • Another new feature in this SDK that you may have noticed is the CreateFromConnectionString method that is available to the MessagingFactory and NamespaceManager classes.  This allows us to retrieve our configuration settings from our project properties page.  To access the project properties right mouse click on the particular role and then select Properties.  Next, click on Settings where you will find Key/Value Pairings.  The name of the key that we are interested in is: Microsoft.ServiceBus.ConnectionString and our value is

Endpoint=sb://[your namespace];SharedSecretIssuer=owner;SharedSecretValue=[your secret]

  • Since both our Web and Worker Roles will be accessing the Queue, we want to ensure that both configuration files have this entry included.  This will allow our code to make a connection to our Service Bus Namespace where our Queue may be found.  If we edit this property in these locations, then we do not need to modify the Cloud.cscfg and Local.cscfg configuration files because Visual Studio will take care of this for us.
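For reference, the entry that ends up in each role’s configuration looks roughly like this (a sketch; the exact surrounding elements of the .cscfg are generated by Visual Studio, and the placeholders are the same ones shown above):

    <Role name="PowerOutageWeb">
      <ConfigurationSettings>
        <Setting name="Microsoft.ServiceBus.ConnectionString"
                 value="Endpoint=sb://[your namespace];SharedSecretIssuer=owner;SharedSecretValue=[your secret]" />
      </ConfigurationSettings>
    </Role>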


  • Next we want to shift focus to the Worker Role and edit the WorkerRole.cs file.  Since we are going to be dequeuing our typed Customer message, we want to include a reference to this namespace:

    using CustomerEntity;

  • Something that you probably noticed when you opened up the WorkerRole.cs file is that there is already some code written for us.  We can leverage most of it but can delete the code that is highlighted in red below:


  • Where we deleted this code, we will want to add the following:

Customer cs = receivedMessage.GetBody<Customer>();
Trace.WriteLine(receivedMessage.SequenceNumber.ToString(), "Received Message");
Trace.WriteLine(cs.Address, "Address");
Trace.WriteLine(cs.City, "City");
Trace.WriteLine(cs.State, "State");

In this code we are going to receive a typed Customer message and then simply write out the contents of this message using the Trace utility.  If we wanted to save this information to a database, this would be a good place to write that code.
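For context, here is roughly where that snippet fits inside the template’s generated Run method.  This is a sketch from memory rather than the template’s exact code; the Client field and IsStopped flag are part of the scaffolding the template generates for you:

    public override void Run()
    {
        // The template declares a QueueClient field named Client and
        // initializes it in OnStart(); this sketch assumes that scaffolding.
        while (!IsStopped)
        {
            try
            {
                // Blocks until a message arrives on the queue
                BrokeredMessage receivedMessage = Client.Receive();
                if (receivedMessage != null)
                {
                    // Our typed processing code from above goes here
                    Customer cs = receivedMessage.GetBody<Customer>();
                    Trace.WriteLine(cs.Address, "Address");

                    receivedMessage.Complete();
                }
            }
            catch (MessagingException e)
            {
                // Only swallow transient errors; back off briefly and retry
                if (!e.IsTransient) throw;
                Thread.Sleep(10000);
            }
        }
    }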

  • We also want to make sure that we update the name of the Queue so that it matches the name that we specified in the Web Role:

// The name of your queue
const string QueueName = "PowerOutageQueue";


Creating our Queue

We have a few options when it comes to creating the Queue that is required for this solution to work. The code in our ASP.NET Web page code-behind will take care of this for us, so for our solution to work that method is sufficient, but perhaps we want a design-time alternative in order to specify some more advanced features.  In this case we do have a few options:

  • Using the portal.
  • Similarly we can use the Service Bus Explorer tool that was written by Paolo Salvatori.  Steef-Jan Wiggers has provided an in-depth walk through of this tool so I won’t go into more details here.
  • As part of the Azure 1.7 SDK release, a Visual Studio Service Bus Explorer is now included.  It is accessible from the Server Explorer view within Visual Studio.  Using this tool we can perform functions like:
    • Creating Queues/Topics
    • Setting advanced properties: Queue Size, Time to Live, Lock Duration, etc.
    • Sending and receiving test messages
    • Getting the current Queue depth
    • Getting our Service Bus connection string



Any of these methods will work.  As I mentioned earlier, if we do nothing, the code will take care of it for us.


We are just going to test our code locally, with the exception of our Service Bus Queue; that is going to be created in Azure.  To test our example:

  • Hit F5 and our website should be displayed


  • Since we have our Trace statements included in our Worker Role code, we need to launch our Compute Emulator.  To do this right mouse click on the Azure icon that is located in your taskbar and select Show Compute Emulator UI.


  • A “Console”-like window will appear.  We will want to click on the ServiceBusWorker label


  • Switch back to our Web Application, provide some data and click the submit button.  We should see that our results label is updated to indicate that our message was sent to the queue.
  • If we switch back to our Compute Emulator we should discover that our message has been dequeued.




While the experience is a little different than that of the AppFabric Applications CTP, it is effective, especially for developers who may not be all that familiar with “integration”.  This template really provides a great starting point and allows them to wire up a Service Bus Queue to their Web Application very quickly.

Azure Service Bus–Don’t run as Root!


So I can’t take credit for the catchy tag line for this post.  It was inspired by a recent session that Clemens Vasters and Abhishek Lal gave at TechEd North America 2012.  You can watch the entire session here.

While watching the presentation, this particular segment on not running as root really resonated with me.  I have built some proof-of-concept mobile applications that use REST APIs to send messages to Service Bus Queues.  In these applications I use the default owner key to submit messages.  I knew at the time that this could not be a good practice, but since it was just a POC it was acceptable.  I was curious about how I could solve this problem in a better manner, and I think what Microsoft has done here is definitely a step in the right direction.

If you are familiar with the Azure Service Bus, and Azure in general, there are 3 fundamental ‘artifacts’ that are required whenever you try to provision or manipulate Azure Services.  These artifacts include:

  • Namespace
  • Default Issuer (username)
  • Default Key (password)



The Problem

In 99.99% of the demos and blog posts that exist on the interwebs, people are embedding their Default Issuer and Default Key in their solution.  This creates issues on a few different levels:

  • The account really is “root” when it comes to your namespace.  Using your “account” allows you to create services like Queues, Topics, ACS Trust relationships, provision Cache etc.  Do you see the problem with embedding these credentials in your app and then distributing it?
  • If your account does become compromised, it could be used to maliciously manipulate your solution.  If you compromised a solution that processed Customer Orders, wouldn’t it be nice to add your own subscriber to the Topic Subscription and receive a copy of each customer’s Credit Card number?

Like any other security principal, we want to ensure that people, or systems, have the minimum level of security rights that they need to perform a specific function.  A parallel example to this scenario is giving end users Domain Admin.  You wouldn’t do that, much like you shouldn’t embed your “owner” credentials in your application.

Enter SBAzTool

There is a tool included with the Windows Azure 1.7 SDK that can help us assign fine-grained authorization to “Default Name(s)” (usernames).  You can download the source code for this tool here.  In the rest of this post I will demonstrate how you can use this tool, and then show an example that demonstrates why this tool is beneficial and that creating authorization rules is not so hard.

Create Namespace in Portal

Even though I have a functional namespace, I am going to go ahead and create one from scratch so that we are all beginning at the same starting point.  If you have an existing namespace that you want to use, you can; you don’t have to go through these next few steps where I create the namespace.

    • Log in and then select the previous (old) portal, as Service Bus and ACS do not exist in the new portal…yet.
    • Next, click on the Service Bus label and then click on the {New} button


  • In this case I am selecting Access Control, Service Bus, a namespace and a Country/Region.  Since United States (West) is closest to my locale, I will use it.



  • So I now have a Namespace called DontRunAsRoot.  It may take a couple of minutes to create.  You will also notice that we have a tree-like structure where our Queues and Topics will be displayed.



  • For the purpose of this demo we are going to create a Queue called MyQueue by clicking on the New Queue button.


  • We will need to populate the Name text box and then can accept the defaults.



  • Voila, we have our newly created Namespace and Queue


  • We are going to need our “owner” key so we might as well get it while we are in the portal.  Click on the View button and then click the Copy to Clipboard button.



      Compiling SBAzTool

      Once you have downloaded the SBAzTool you can compile the solution, open a command prompt and launch the tool by running sbaztool.exe.  You will then see all of the command-line arguments.  You can also access this documentation from the download page.



      In order to set permissions for other users/issuers we need to be authenticated ourselves using the Key that we previously retrieved from the portal.  By using the storeoptions argument we can continue to execute commands without having to re-issue our key/password.  To do this we will want to execute the following command:

      sbaztool.exe storeoptions -n <namespace> -k <key>




      So let's now create a new “user” by specifying the makeid command.  In this case we can actually specify a “username” instead of the regular “owner” that we are so used to.

      sbaztool.exe makeid <username>


      As you can see, a Key has been provided for this user (which I have whited out).  We also have the ability to specify our own password by including a <key>, provided it is a 32-byte, base64-encoded value.


      Now that we have an Issuer/User created, we can actually assign this user permissions.  The command to do so is:

      grant <operation> <path> <name>

      The available operations that we have access to are:

      • Send
      • Listen
      • Manage

      Send and Listen are pretty self-explanatory, but Manage deserves further elaboration.  We can actually delegate the authority to manage resources to a particular user based upon the path.  So let's imagine we have a path that looks like this:


      If we wanted to allow someone to administrate the /engineering services we could do so using this command.
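Assuming we had first created an issuer for that team with makeid (the engineeringadmin name below is hypothetical), the command would look something like:

      sbaztool.exe grant Manage /engineering engineeringadmin

Because permissions apply to a path, that one grant covers everything beneath /engineering without touching the rest of the namespace.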

      For the purpose of this blog post, let's keep things simple and assign Send and Listen privileges to our QueueUser for our queue, which is called myqueue.
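Based on the grant syntax above, the two commands would look something like this (I am hedging on the exact path form here; depending on the tool build the leading slash may be optional):

      sbaztool.exe grant Send /myqueue QueueUser
      sbaztool.exe grant Listen /myqueue QueueUser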






      To verify that our permissions have been set correctly we can execute the Show command by providing the following:

      show <path>

      As you can see below, the QueueUser now has Send and Listen permissions on the queue called myqueue.  This is a much better solution than granting full rights to a namespace when you only need rights on a particular queue.



      Test permissions

      So in order to validate that this actually works and is not smoke and mirrors, I am going to create a very simple console application that will use these credentials to send and receive a message.  Once we have validated that this works, we will pull the Send permission and see what happens.

      The following code will create a QueueClient, send a message to the queue and then receive the message.

        const string QueueName = "myqueue";
        const string ServiceNamespace = "DontRunAsRoot";
        const string IssuerName = "QueueUser";
        const string IssuerKey = "<removed>";

        //Sending the message
        TokenProvider credentials = TokenProvider.CreateSharedSecretTokenProvider(IssuerName, IssuerKey);
        Uri serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", ServiceNamespace, string.Empty);

        MessagingFactory factory = null;

        try
        {
            factory = MessagingFactory.Create(serviceUri, credentials);

            //This code assumes that the queue has already been created since we have
            //only been provisioned Send and Listen access
            QueueClient myQueueClient = factory.CreateQueueClient(QueueName);

            Console.WriteLine("\nCreated Queue Client");

            //Create Brokered Message
            BrokeredMessage bm = new BrokeredMessage("I hope this works");

            Console.WriteLine("\nSending message to Queue…");
            myQueueClient.Send(bm);

            Console.WriteLine("\nMessage sent to Queue");
            Console.WriteLine("\nPress ENTER to receive message");
            Console.ReadLine();

            //Receiving the message
            bm = myQueueClient.Receive(TimeSpan.FromSeconds(5));

            if (bm != null)
            {
                Console.WriteLine(string.Format("Message received: Id = {0}, Body = {1}", bm.MessageId, bm.GetBody<string>()));
                // Further custom message processing could go here…
                bm.Complete();
            }

            Console.WriteLine("\nNo more messages to process");
            Console.WriteLine("\nPress ENTER to exit");
            Console.ReadLine();
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message);
        }

      When we run the application we will discover that it is executing correctly:




      Let’s now make this a little interesting.  Let’s remove the QueueUser’s ability to send messages to the Queue and see what happens.  To do this we will use the following command:

      revoke <operation> <path> <user>
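Applied to our example (and again assuming the same path form used when granting), that works out to something like:

      sbaztool.exe revoke Send /myqueue QueueUser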


      To validate that the revoke was successful, let's run the show command again.  As you can see, our revoke command was successful.



      Let’s now try to send a message and see what happens.


      As expected we get an authentication exception as we should.



      With the introduction of SBAzTool there is no excuse for using your “owner” credentials when building applications.  SBAzTool has a wide variety of commands that facilitate managing and even delegating permissions.  Since it is a command-line tool, you can even script these permissions as you provision your Service Bus artifacts.

      SAP meet Azure Service Bus – EAI/EDI December 2011 CTP


      The Azure Service Bus EAI/EDI December 2011 CTP has been out for about 2 weeks at the time of this blog post.  As soon as I saw the Service Bus Connect feature in the documentation I wanted to try and hook up the Service Bus to SAP.  The organization that I work for utilizes SAP to support many of its core business processes.  We are also heavily invested in BizTalk Server when integrating SAP with other Corporate Systems. For the past 5 years much of my work experience has involved integration with SAP. So much that I had the opportunity to write a couple chapters on BizTalk-SAP integration in the Microsoft BizTalk 2010 Line of Business Systems Integration book.

      Integrating with SAP is of great interest to me both personally and professionally.  I like the challenge of taking two different types of systems that would seemingly be impossible to integrate and finding a way to do it.  I also enjoy the expression on SAP consultants’ faces when you take a Microsoft product and successfully execute operations inside their system, like creating customer records or creating Work Orders.

      Using the Service Bus Connect feature is not the only way of bridging your On-Premise Line of Business Systems with external parties via cloud based messaging technologies.  Within the past year Microsoft also introduced a feature called BizTalk Server 2010 AppFabric Connect for Services.  This tool allows for BizTalk to expose an endpoint via a Service Bus Relay.  I have also used this mechanism to communicate with SAP via a Mobile Device and it does work. 

      There are a few differences between Service Bus Connect and AppFabric Connect for Services.  Some of these differences include:

      • Any message transformations that need to take place actually happen in the cloud instead of On-Premise.  When integrating with SAP, you never want to expose SAP schemas to calling clients; they are ugly to say the least. In this scenario we can expose a client-friendly, or canonical, schema and then transform this message into our SAP request in Azure.
      • AppFabric Connect for Services utilizes a full deployment of BizTalk in your environment, whereas Service Bus Connect only requires the BizTalk Adapter Pack when communicating with SAP. With AppFabric Connect for Services, all message transformation and orchestration takes place On-Premise and the cloud (Azure Service Bus) is basically used as a communication relay.

      When connecting to On-Premise Line of Business Systems, both methods require the BizTalk Adapter Pack to be installed On-Premise.  The BizTalk Adapter Pack is included in your BizTalk license.  Licensing details for Service Bus Connect have not been released at the time of this writing.

      The following walkthrough assumes you have some experience with the new Service Bus CTP.  If you haven’t looked at the CTP before I suggest that you visit a few of the following links to get more familiar with the tool:

      Also it is worth pointing out another blog post written by Steef-Jan Wiggers where he discusses Oracle integration with Service Bus Connect.

      Building our Application

      • The first thing we need to do is to create a new ServiceBus – Enterprise Application Integration project.  In my case I am calling it HelloSAP.



      • Since we know that we want to communicate with an On-Premise LOB system like SAP we need to Add a ServiceBus Connect Server.  We can do this by accessing Server Explorer, right mouse clicking on ServiceBus Connect Servers and then selecting Add Server.  When prompted we can provide a host name of localhost since this is a local environment.



      • We now can expand our Service Bus Connect Servers hierarchy.  Since we want to build an SAP interface we can right mouse click on SAP and select Add SAP Target



      • If you have ever used the BizTalk Adapter Pack before, you are now in familiar territory.  This is (almost) the same wizard that we use to generate schemas when connecting to SAP systems via BizTalk.  There is a subtle difference in the bottom-left corner called Configure Target Path, which we will discuss in a few moments.  If you are unfamiliar with this screen you are going to need some help from your SAP BASIS Admin to provide you with the connection details required to connect to SAP.  Also, if you are interested in further understanding everything that is going on in this screen, I recommend you pick up the BizTalk LOB book that I previously talked about, as I discuss the different aspects of this wizard in great detail.  (more shameless plugs)



      • We now want to select the type of interface that we want to interact with.  For the purpose of this blog post I am going to select a custom IDOC that is used when submitting timesheets from our field personnel.  In my case, the version of SAP that I am connecting to is 700 so that is why I am selecting the ZHR_CATS IDOC that corresponds to this version.  Once again, if you are unsure you will need to speak to your BASIS Admin.



      • Notice how we cannot click the OK button after establishing a connection to SAP and selecting an IDOC?  We now need to create a Target Path.  Creating a Target Path will provide the Bridge from the Azure Service Bus into SAP.  Click the Configure button to continue.



      • Assuming that we have not been through this exercise before, we need to select Add New LobRelay from the Select LOB Relay to host the LOB Target: dropdown list.



      • Another dialog box will appear. Within this dialog box we need to provide our CTP Labs namespace, a Relay path, Issuer name and key.  For Relay path we can really provide whatever we want here; it will essentially make up the latter portion of the URI for the Endpoint that is about to be created.



      • Now we are prompted to Enter LOB Target sub-path.  Once again this value can be whatever we choose.  Since the HR Timesheet module inside of SAP is often called CATS, I will go ahead and use this value here.


      • Now with our Target Path configured we are able to select the OK button to proceed.


      • Inside Server Explorer we now have an entry underneath SAP.  This represents our End Point that will bridge requests coming from the cloud to SAP.


      • At this point we haven’t added any artifacts to our Enterprise Application Integration project that we created earlier.  This is about to change.  We need to right mouse click on our SAP endpoint and then select Add schemas to HelloSAP


      • We will now get prompted for some additional information in order to re-establish a connection to SAP so that we can generate Schemas that will enable us to send a message to SAP in a format that it is expecting.  You may also notice that we aren’t being prompted for any SAP server information.  In the Properties grid you will notice that this information is already populated because we had previously specified it when using the Consume Adapter Service Wizard.


      • Inside our solution, we will now discover that we have our SAP schemas in a folder called LOB Schemas.


      • For the purpose of this blog post, I have created another folder called Schemas and saved a Custom Schema called CloudRequest.xsd there.  This is the message that our MessageSender application will be sending in once we test our solution.  (BTW: I find the Schema editor included in BizTalk much more intuitive and user-friendly than this one.)


      • We now need to create a Map, or Transform, to convert our request message into a request that SAP will understand.


      • Next we need to add a Bridge on to the surface of our Bridge Configuration.  Our Bridge will be responsible for executing our Map that we just created and then our message will get routed to our On-Premise end point so that our Timesheet can be sent to SAP.


      • We now need to set the message type that we expect will enter the bridge.  By double clicking on our TimeSheetBridge we can then use the Message Type picker to select our Custom message type: CloudRequest.



      • Once we have selected our message type, we can then select a transform by clicking on the Transform Xml Transform box and then selecting our map from the Maps Collection.



      • Before we drag our LOB Connection shape onto our canvas we need to set our Service Namespace.  This is the value that we created when we signed up for the CTP in the Azure Portal.  To set the Service Namespace we need to click on any open space, in the Bridge Configuration canvas, and then look in the Properties Page.  Place your Service Namespace here.



      • We are now at the point where we need to wire up our XML One-Way Bridge to our On-Premise LOB system.  In order to do so we need to drag our SAP instance onto the Bridge Configuration canvas.



      • Next, we need to drag a Connection shape onto the canvas to connect our Bridge to our LOB system.



      • The next action that needs to take place is setting up a Filter Condition between our LOB Shape and our Bridge.  You can think of this like creating a subscription.  If we wanted to filter messages by their content we would be able to do so here.  Since we are interested in all messages we will just Match All.  In order to set this property we need to select our Connection arrow then click on the Filter Condition ellipses.



      • If you have used the BizTalk Adapter Pack in the past you will be familiar with the SOAP Action headers that need to be set in your BizTalk Send Port.  Since we don’t have Send Ports per se, we need to set this action in the Route Action as part of the One-Way Connection shape.  In the Expression text box we want to put the name of our Operation, which is http://Microsoft.LobServices.Sap/2007/03/Idoc/3/ZHR_CATS//700/Send, wrapped in single quotes ‘ ’.  We can obtain this value by selecting our Service Bus Connect Server endpoint and then viewing the Properties page.  Within the Properties page there is an Operations arrow that can be expanded, and we will find this value there.  In the Destination (Write To) we want to set our Type to Soap and our Identifier to Action.
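Summarized, the Route Action entry for this bullet works out to something like the following (the operation URI is the one shown above; everything else is just where the designer puts those values):

      Expression:             'http://Microsoft.LobServices.Sap/2007/03/Idoc/3/ZHR_CATS//700/Send'
      Destination (Write To): Type = Soap, Identifier = Action

In BizTalk terms, this is the moral equivalent of setting the SOAP Action on a WCF Send Port, just expressed as a routing rule on the connection instead.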



      • There is one last configuration step that needs to take place before we enable and deploy our service.  We need to set our Security Type.  We can do so by selecting our SAP – ServiceBus Connect Server instance from Server Explorer.  Then, in the Properties Page, click on the SecurityType ellipses.  Determining which Security Type to use will depend upon how your SAP instance has been configured.  In my case, I am using ConfiguredUsername and I need to provide both a Username and Password.


      • With our configuration set, we can now enable our SAP – ServiceBus Connect Server instance by right mouse clicking on it and then selecting Enable.



      • We can now deploy our application to Azure by right mouse clicking on our Visual Studio solution and selecting Deploy.



      Testing Application

      • In order to test our application we can use the MessageSender tool that is provided with the CTP Samples/Tools.  It will simply allow us to submit EDI, or in this case XML, messages to an endpoint in the Azure Service Bus.  In order to successfully submit these messages we need to provide our Service Namespace, Issuer Name, Shared Secret, Service Bus endpoint address, a path to the XML file that we want to submit, and an indication that we are submitting an XML document.  Once we have provided this information we can hit the enter key and, provided we do not have errors, we will see a Message sent successfully message.

      Note: In the image below I have blocked my Shared Secret (in red) for privacy reasons.


      • If we launch our SAP GUI we should discover that it has received a message successfully.


      • We can then drill down into the message and discover the information that has been posted to our timesheet




      While testing, I ran into an exception.  In the CATSHOURS field I messed up the format of the field and sent in too much data.  The BizTalk Adapter Pack/SAP Adapter validated this incoming data against the SAP schema that is being used to send messages to SAP.  The result was that a message was returned back to the MessageSender application.  I thought that this was pretty interesting.  Why?  In my solution I am using a One-Way bridge and this exception is still being propagated to the calling application.  Cool, and very beneficial.



      Overall the experience of using Service Bus Connect was good.  There were a few times I had to forget how we do this in BizTalk and think about how Service Bus Connect does it.  An example of this is the SOAP Action headers that BizTalk developers are used to manipulating inside of Send Ports.  I am not saying one way is better than the other; they are just different.  Another example is the XML Schema editor.  I find the BizTalk editor to be much more user-friendly.

      While I am not convinced that the current state of Service Bus Connect is ready for primetime (they have CTPs for a reason), I am very impressed that the Microsoft team could build this type of functionality into a CTP.  For me personally, this type of functionality (connecting to On-Premise LOB Systems) is a MUST HAVE as organizations start evolving towards Cloud computing.