SAP meet Azure Service Bus – EAI/EDI December 2011 CTP

 

The Azure Service Bus EAI/EDI December 2011 CTP has been out for about two weeks at the time of this blog post.  As soon as I saw the Service Bus Connect feature in the documentation I wanted to try hooking the Service Bus up to SAP.  The organization that I work for uses SAP to support many of its core business processes.  We are also heavily invested in BizTalk Server when integrating SAP with other corporate systems.  For the past five years much of my work experience has involved integration with SAP; so much so that I had the opportunity to write a couple of chapters on BizTalk-SAP integration in the Microsoft BizTalk 2010 Line of Business Systems Integration book.

Integrating with SAP is of great interest to me both personally and professionally.  I like the challenge of taking two very different systems that would seemingly be impossible to integrate and finding a way to do it.  I also enjoy the expression on SAP consultants' faces when you take a Microsoft product and successfully execute operations inside their system, like creating customer records or Work Orders.

Using the Service Bus Connect feature is not the only way of bridging your on-premise Line of Business systems with external parties via cloud-based messaging technologies.  Within the past year Microsoft also introduced a feature called BizTalk Server 2010 AppFabric Connect for Services.  This tool allows BizTalk to expose an endpoint via a Service Bus Relay.  I have used this mechanism to communicate with SAP from a mobile device and it does work.

There are a few differences between Service Bus Connect and AppFabric Connect for Services.  Some of these differences include:

  • With Service Bus Connect, any message transformations that need to take place actually run in the cloud instead of on-premise.  When integrating with SAP, you never want to expose SAP schemas to calling clients; they are ugly to say the least.  In this scenario we can expose a client-friendly, or canonical, schema and then transform that message into our SAP request in Azure.
  • AppFabric Connect for Services requires a full deployment of BizTalk in your environment, whereas Service Bus Connect only requires the BizTalk Adapter Pack when communicating with SAP.  With AppFabric Connect for Services, all message transformation and orchestration takes place on-premise and the cloud (Azure Service Bus) is basically used as a communication relay.

When connecting to On-Premise Line of Business Systems, both methods require the BizTalk Adapter Pack to be installed On-Premise.  The BizTalk Adapter Pack is included in your BizTalk license.  Licensing details for Service Bus Connect have not been released at the time of this writing.

The following walkthrough assumes you have some experience with the new Service Bus CTP.  If you haven’t looked at the CTP before, I suggest getting familiar with the introductory material for the tool first.

It is also worth pointing out another blog post, written by Steef-Jan Wiggers, where he discusses Oracle integration with Service Bus Connect.

Building our Application

  • The first thing we need to do is to create a new ServiceBus – Enterprise Application Integration project.  In my case I am calling it HelloSAP.

image

 

  • Since we know that we want to communicate with an on-premise LOB system like SAP, we need to add a ServiceBus Connect Server.  We can do this by opening Server Explorer, right-clicking on ServiceBus Connect Servers and then selecting Add Server.  When prompted we can provide a host name of localhost since this is a local environment.

image

 

  • We can now expand our ServiceBus Connect Servers hierarchy.  Since we want to build an SAP interface, we right-click on SAP and select Add SAP Target.

image

 

  • If you have ever used the BizTalk Adapter Pack before, you are now in familiar territory.  This is (almost) the same wizard that we use to generate schemas when connecting to SAP systems via BizTalk.  There is a subtle difference in the bottom left corner called Configure Target Path, which we will discuss in a few moments.  If you are unfamiliar with this screen, you are going to need some help from your SAP BASIS Admin to provide the connection details required to connect to SAP.  Also, if you are interested in further understanding everything that is going on in this screen, I recommend you pick up the BizTalk LOB book that I previously mentioned, as I discuss the different aspects of this wizard in great detail.  (OK, no more shameless plugs.)

image

 

  • We now want to select the type of interface that we want to interact with.  For the purpose of this blog post I am going to select a custom IDOC that is used when submitting timesheets from our field personnel.  The version of SAP that I am connecting to is 700, which is why I am selecting the ZHR_CATS IDOC that corresponds to that version.  Once again, if you are unsure you will need to speak to your BASIS Admin.

image

 

  • Notice how we cannot click the OK button after establishing a connection to SAP and selecting an IDOC?  We now need to create a Target Path, which will provide the bridge from the Azure Service Bus into SAP.  Click the Configure button to continue.

image

 

  • Assuming that we have not been through this exercise before, we need to select Add New LobRelay from the Select LOB Relay to host the LOB Target: dropdown list.

image

 

  • Another dialog box will appear.  Within this dialog box we need to provide our CTP Labs namespace, a Relay path, and an Issuer name and key.  For Relay path: we can provide almost anything we want; it will essentially make up the latter portion of the URI for the endpoint that is about to be created.

image

 

  • Now we are prompted to Enter LOB Target sub-path.  Once again, this value can be whatever we choose.  Since the HR Timesheet module inside of SAP is often called CATS, I will go ahead and use that value here.
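To make the addressing concrete: the LOB Relay path and the LOB Target sub-path combine to form the address the bridge will use to reach our on-premise SAP adapter.  Assuming a relay path of "sap" and the sub-path "CATS", the endpoint URI ends up shaped roughly like this (the namespace is a placeholder, and the exact scheme depends on the relay binding used):

```text
sb://<your-namespace>.servicebus.appfabriclabs.com/sap/CATS
```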

image

  • Now with our Target Path configured we are able to select the OK button to proceed.

image

  • Inside Server Explorer we now have an entry underneath SAP.  This represents our End Point that will bridge requests coming from the cloud to SAP.

image

  • At this point we haven’t added any artifacts to the Enterprise Application Integration project that we created earlier.  This is about to change.  We need to right-click on our SAP endpoint and then select Add schemas to HelloSAP.

image

  • We will now get prompted for some additional information in order to re-establish a connection to SAP so that we can generate Schemas that will enable us to send a message to SAP in a format that it is expecting.  You may also notice that we aren’t being prompted for any SAP server information.  In the Properties grid you will notice that this information is already populated because we had previously specified it when using the Consume Adapter Service Wizard.

image

  • Inside our solution, we will now discover that we have our SAP schemas in a folder called LOB Schemas.

image

  • For the purpose of this blog post, I have created another folder called Schemas and saved a custom schema called CloudRequest.xsd there.  This is the message that our MessageSender application will be sending in once we test our solution.  (As an aside, I find the schema editor included in BizTalk much more intuitive and user-friendly than this one.)
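As a rough illustration, an instance of a client-friendly request like CloudRequest.xsd might look something like the following.  The element names and namespace here are entirely hypothetical; the actual schema in my project is not shown in this post:

```xml
<ns0:CloudRequest xmlns:ns0="http://HelloSAP/Schemas/CloudRequest">
  <EmployeeId>00001234</EmployeeId>
  <WorkDate>2011-12-15</WorkDate>
  <Hours>8.0</Hours>
  <CostCenter>4100</CostCenter>
</ns0:CloudRequest>
```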

image

  • We now need to create a Map, or Transform, to convert our request message into a request that SAP will understand.

image

  • Next we need to add a Bridge onto the surface of our Bridge Configuration.  Our Bridge will be responsible for executing the map we just created; our message will then get routed to our on-premise endpoint so that the timesheet can be sent to SAP.

image

  • We now need to set the message type that we expect will enter the bridge.  By double clicking on our TimeSheetBridge we can then use the Message Type picker to select our Custom message type: CloudRequest.

image

 

  • Once we have selected our message type, we can then select a transform by clicking on the Transform Xml Transform box and then selecting our map from the Maps Collection.

 image

 

  • Before we drag our LOB Connection shape onto our canvas we need to set our Service Namespace.  This is the value that we created when we signed up for the CTP in the Azure Portal.  To set the Service Namespace we need to click on any open space, in the Bridge Configuration canvas, and then look in the Properties Page.  Place your Service Namespace here.

image

 

  • We are now at the point where we need to wire up our XML One-Way Bridge to our on-premise LOB system.  In order to do so, we need to drag our SAP instance onto the Bridge Configuration canvas.

image

 

  • Next, we need to drag a Connection shape onto the canvas to connect our Bridge to our LOB system.

image

 

  • The next action is setting up a Filter Condition between our LOB shape and our Bridge.  You can think of this as creating a subscription.  If we wanted to filter messages by their content we could do so here; since we are interested in all messages we will just Match All.  In order to set this property, select the Connection arrow and then click on the Filter Condition ellipses.

image

 

  • If you have used the BizTalk Adapter Pack in the past, you will be familiar with the SOAP Action headers that need to be set in your BizTalk Send Port.  Since we don’t have Send Ports per se, we need to set this action in the Route Action as part of the One-Way Connection shape.  In the Expression text box we want to put the name of our operation, which is http://Microsoft.LobServices.Sap/2007/03/Idoc/3/ZHR_CATS//700/Send, wrapped in single quotes ‘ ’.  We can obtain this value by selecting our ServiceBus Connect Server endpoint and viewing the Properties page; within it there is an Operations node that can be expanded to find this value.  In the Destination (Write To) section we want to set our Type to Soap and our Identifier to Action.

image

 

  • There is one last configuration step that needs to take place before we enable and deploy our service: we need to set our Security Type.  We can do so by selecting our SAP – ServiceBus Connect Server instance from Server Explorer and then, in the Properties page, clicking on the SecurityType ellipses.  Which security type to use will depend upon how your SAP instance has been configured.  In my case, I am using ConfiguredUsername and I need to provide both a Username and Password.

image

  • With our configuration set, we can now enable our SAP – ServiceBus Connect Server instance by right-clicking on it and selecting Enable.

image

 

  • We can now deploy our application to Azure by right-clicking on our Visual Studio solution and selecting Deploy.

image

 

Testing Application

  • In order to test our application we can use the MessageSender tool that is provided with the CTP Samples/Tools.  It simply allows us to submit EDI, or in this case XML, messages to an endpoint in the Azure Service Bus.  To successfully submit these messages we need to provide our Service Namespace, Issuer Name, Shared Secret, Service Bus endpoint address, a path to the XML file that we want to submit, and an indication that we are submitting an xml document.  Once we have provided this information we can hit the Enter key and, provided we do not have errors, we will see a Message sent successfully message.

Note: In the image below I have blocked my Shared Secret (in red) for privacy reasons.

image

  • If we launch our SAP GUI we should discover that it has received a message successfully.

image

  • We can then drill down into the message and discover the information that has been posted to our timesheet.

image

 

Exceptions

While testing, I ran into an exception.  In the CATSHOURS field I messed up the format and sent in too much data.  The BizTalk Adapter Pack/SAP Adapter validated this incoming data against the SAP schema that is used to send messages to SAP, and as a result an error message was returned to the MessageSender application.  I thought that this was pretty interesting.  Why?  In my solution I am using a One-Way bridge, and this exception is still being propagated to the calling application.  Cool and very beneficial.

 image

Conclusion

Overall the experience of using Service Bus Connect was good.  There were a few times I had to forget how we do things in BizTalk and think about how Service Bus Connect does them.  An example of this is the SOAP Action headers that BizTalk developers are used to manipulating inside of Send Ports.  I am not saying one way is better than the other; they are just different.  Another example is the XML Schema editor; I find the BizTalk editor to be much more user-friendly.

While I am not convinced that the current state of Service Bus Connect is ready for primetime (they have CTPs for a reason), I am very impressed that the Microsoft team could build this type of functionality into a CTP.  For me personally, this type of functionality (connecting to on-premise LOB systems) is a must-have as organizations start evolving towards cloud computing.

Azure Service Bus EAI/EDI December 2011 CTP – New Mapper

In this blog post we are going to explore some of the new functoids that are available in the Azure Service Bus Mapper.

At first glance, the Mapper looks pretty similar to the BizTalk 2010 Mapper. Sure there are some different icons, perhaps some lipstick applied but conceptually we are dealing with the same thing right?

image

Wrong! It isn’t until we take a look at the toolbox that we discover this isn’t your Father’s mapper.

image

This post isn’t meant to be an exhaustive reference guide for the new mapper but here are some of the new functoids that stick out for me.

 

  • Loop Operations – MapEach Loop: This functoid will loop over a repeating record from the source document and evaluate an operation at each iteration of the loop.  If the criteria of the operation are met, a record in the target document will be created.
  • Expressions – Arithmetic Expression: No, we haven’t lost our addition or subtraction functionality!  Many functoids that you would have found in the BizTalk 2010 Mathematical Functoids section have been consolidated into the Arithmetic Expression operation.
  • Expressions – Logical Expression: Similar to the Arithmetic Expression, these have been consolidated and can be found within a single expression, including >, <, >=, <=, ==, !=, Logical Negation, Conditional AND and Conditional OR.
  • Expressions – If-Then-Else: A much anticipated operation!  BizTalk developers have wanted an If-Then-Else functoid for many years.  If a condition evaluates to True we can provide a particular value; otherwise we can provide a different value.
  • List Operations – All of them: Completely new functionality that gives us the ability to manipulate lists within a map.  Functionality includes creating a list, adding an item to it, selecting a unique group, selecting a value, selecting entries, getting items, and ordering the list.  Wow!  It will be interesting to see how this progresses.
  • Date/Time Operations – DateTime Reformat: This one should be useful.  I am constantly re-formatting dates when integrating with SAP.  Usually this formatting winds up in a helper assembly, which tends to be overkill for what really needs to take place.
  • Misc Operations – Data Lookup: This one is interesting.  It allows us to access data from SQL Azure within a transform at runtime.
  • Misc Operations – Generate ID: This functoid will generate a GUID.  It is the equivalent of calling the Guid.NewGuid method in .NET.
  • Misc Operations – Get Context Property: Another useful operation!  It allows us to retrieve a value from context, something that just isn’t possible in a BizTalk map.
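To see why a dedicated DateTime Reformat operation is welcome: SAP DATS fields expect a compact yyyyMMdd string, and the helper-assembly code this functoid replaces typically does nothing more than the following.  It is sketched in Python for brevity, since the functoid itself is configured graphically:

```python
from datetime import datetime

def to_sap_dats(value: str) -> str:
    """Reformat an ISO-style date (yyyy-mm-dd) into SAP's 8-character DATS format."""
    return datetime.strptime(value, "%Y-%m-%d").strftime("%Y%m%d")

print(to_sap_dats("2011-12-15"))  # 20111215
```

Having this as a drag-and-drop operation in the mapper means one less helper assembly to build, deploy and version.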
 
What’s Missing?

I can’t take credit for discovering this, but while chatting with Mikael Håkansson he asked, “hey – where is the scripting functoid?”  Perhaps this is just a limitation of the CTP, but it is definitely something that needs to be addressed for RTM.  It is always nice to be able to fall back on a .NET helper assembly or custom XSLT.

Conclusion

While this post was not intended to be comprehensive, I hope it has highlighted some new opportunities that warrant further investigation.  It is nice to see Microsoft evolving and maturing in this area of EAI.

Introduction to the Azure Service Bus EAI/EDI December 2011 CTP

The Azure team has recently reached a new milestone; delivering on Service Bus enhancements for the December 2011 CTP.  More specifically, this CTP provides Enterprise Application Integration (EAI) and Electronic Data Interchange (EDI) functionality.  Within the Microsoft stack, both EAI and EDI have historically been tackled by BizTalk.  With this CTP we are seeing an early glimpse into how Microsoft envisions these types of integration scenarios being addressed in a Platform as a Service (PaaS) based environment.

 What is included in this CTP?

There are 3 core areas to this release:

  1. Enterprise Application Integration: In Visual Studio, we will now have the ability to create Enterprise Application Integration and Transform projects. 
  2. Connectivity to On-premise LOB applications: Using this feature will allow us to bridge our on-premise world with the other trading partners using the Azure platform.  This is of particular interest to me.  My organization utilizes systems like SAP, Oracle and 3rd party work order management and GIS systems.  In our environment, these applications are not going anywhere near the cloud anytime soon.  But, we also do a lot of data exchange with external parties.  This creates an opportunity to bridge the gap using existing on-premise systems in conjunction with Azure.
  3. Electronic Data Interchange: We now have a Trading Partner Management portal that allows us to manage EDI message exchanges with our various business partners.  The Trading Partner Management Portal is available here.

What do I need to run the CTP?

  • The first thing you will need is the actual SDK (WindowsAzureServiceBusEAI-EDILabsSDK-x64.exe) which you can download here
  • A tool called Service Bus Connect (not to be confused with AppFabric Connect) enables us to bridge on-premise systems with Azure.  The setup MSI can be accessed from the same page as the SDK.  Within this setup we have the ability to install the Service Bus Connect SDK (which includes the BizTalk Adapter Pack), the Runtime and the Management tools.  In order for the runtime to be installed, we need to have Windows Server AppFabric 1.0 installed.
  • Once we have the SDKs installed, we will have the ability to create ServiceBus projects in Visual Studio.

image

  • Since our applications will require resources from the Azure cloud, we need to create an account in the Azure CTP Labs environment.  To create a namespace in this environment, just follow these simple instructions.
  • Next, we need to register our account with the Windows Azure EDI Portal.

image

 Digging into the samples

For the purpose of this blog post I will describe some of what is going on in the first Getting Started sample called Order Processing.

When we open up the project and launch the BridgeConfiguration.bcs file we will discover a canvas with a couple of shapes on it.  I have outlined a Bridge in black and highlighted what is called a Route Destination in green.  The scenario that has been built for us is one that will receive a typed XML message, transform it and then place it on a Service Bus Queue.

image

When we dig into the Xml One-Way Bridge shape, we will discover something that looks similar to a BizTalk Pipeline.  Within this “shape” we can add the various message types that are going to be involved in the bridge.  We can then provide the names of the maps that we want to execute.

image

Within our BridgeConfiguration.bcs “canvas” we need to provide our service namespace.  We can set this by clicking on the whitespace within the canvas and then populating the Service Namespace property.

image

We need to set this Service Namespace so that we know where to send the result of the Xml One-Way Bridge.  In this case we are sending it to a Service Bus Queue that resides within our namespace.

image 

With our application configured, we can build it as we normally would.  In order to deploy, we simply right-click on the solution, or project, and select Deploy.  We will once again be asked to provide our Service Bus credentials.

image

The deployment is quick, but then again we aren’t pushing too many artifacts to the Service Bus.

Within the SDK samples, Microsoft has provided us with MessageReceiver and MessageSender tools.  With the MessageReceiver tool, we can use it to create our Queue and receive a message.  The MessageSender tool is used to send the message to the Queue.

When we test our application we will be asked to provide an instance of our typed Xml message.  It will be sent to the Service Bus, transformed (twice) and then the result will be placed on a Queue.  We will then pull this message off of the queue and the result will be displayed in our Console.

 

image

Conclusion

So far it seems pretty cool, and it looks like this team is headed in the right direction.  Much like the AppFabric Apps CTP, there will be some gaps in the technology, and bugs, since this is still just a technical preview.  If you have any feedback for the Service Bus team or want some help troubleshooting a problem, a forum has been set up for this purpose.

I am definitely looking forward to digging into this technology further – especially in the area of connecting to on-premise LOB systems such as SAP.  Stay tuned for more posts on these topics.

Speaking at Prairie Dev Con West–March 2012

I have recently been informed that my session abstract has been accepted and I will be speaking at Prairie Dev Con West in March.  My session will focus on Microsoft’s cloud-based middleware platform: Azure AppFabric.  I will be discussing topics such as the Service Bus, AppFabric Queues/Subscriptions and bridging on-premise Line of Business systems with cloud-based integration.

You can read more about Prairie Dev Con West, including registration details, on the conference site.  Do note that there is some early-bird pricing in effect.

 

image

AppFabric Apps (June 2011 CTP)–Only 1 SQL Azure Instance at a time

I must have missed it in the release notes, but when you have access to the AppFabric Apps CTP you are only allowed one SQL Azure instance within the Labs environment.  When you first log into the Portal you will see a SQL Azure DB with the name LabsSqlDatabase.  The problem is that you can’t actually create another one here, and if you try to access this database you will not be able to connect to it using the database name LabsSqlDatabase.

image

What is a little confusing is that when you are working within your local Dev Fabric you can specify any name you want in your SQL Azure config for your DatabaseName.  The problem is that this name will not exist in the cloud when you try to access it after you have published your AppFabric Application.

image

The error message that you will receive will resemble the following:

image

So due to this current limitation (it is a CTP – I am OK with it), you need to use the database that was provisioned for you when you were granted access to the CTP Labs environment.  To do this you will need to get your connection string from the Labs Portal by clicking on LabsSqlDatabase and then clicking the View button below the Connection String label.  A dialog box will appear and you can copy your connection string by clicking the Copy to Clipboard button.

image

If we paste this value into Notepad we will discover the following connection string.  The Data Source represents the name of the server that our database will be running on.  Next we have an Initial Catalog value, which is the actual name of the database that has been provisioned for us – not LabsSqlDatabase.  In my case, and I would imagine others, my Initial Catalog value is the same as my user id.  I have blacked out portions of these values for privacy reasons.

image
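For reference, the copied connection string follows the familiar SQL Azure shape shown below.  Every value here is a placeholder (and the server suffix in the Labs environment may differ):

```text
Data Source=xxxxxxxxxx.database.windows.net;Initial Catalog=<your-user-id>;User ID=<user>@xxxxxxxxxx;Password=<password>;Encrypt=True
```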

If I now try to connect to this database, either through SQL Management Studio or the Database Manager portal, using this Initial Catalog value for my database name, I will have success.

image

At this point we are free to run any of our SQL scripts to generate tables, stored procedures or load data.

Conclusion

I hope this post saves others some time, as this did create some confusion for me.  Admittedly I sometimes dive right into things instead of thoroughly reviewing the documentation (what fun is that?).  From what I gather, it is possible to hook up an AppFabric Application to an existing SQL Azure database that exists in the non-Labs environment, so if you really need to create multiple SQL Azure databases that is also an option.  Since I am just playing around with this stuff, having a single SQL Azure instance works for me now that I understand the limitations of the Labs environment.

AppFabric Apps (June 2011 CTP) Accessing AppFabric Queue via REST

I recently watched an episode of AppFabric TV where they discussed the REST API for Azure AppFabric.  I thought the ability to access the AppFabric Service Bus, and therefore other Azure AppFabric services like Queues, Topics and Subscriptions, via REST was a pretty compelling scenario.  For example, if we take a look at the “Power Outage” scenario that I have been using to demonstrate some of the features of AppFabric Applications, it means that we can create a Windows Phone 7 (or Windows Phone 7.1/Mango Beta) application and drop messages into an AppFabric Queue securely via the AppFabric Service Bus.  Currently the Windows Phone SDK does not allow the managed Service Bus bindings to be loaded on the phone, so using the REST-based API over HTTP is a viable option.

Below is a diagram that illustrates the solution that we are about to build.  We will have a Windows Phone 7.1 app that will push messages to an Azure AppFabric Queue.  (You will soon discover that I am not a Windows Phone developer.  If you were expecting to see some whiz bang new Mango features then this post is not for you.) 

The purpose of the mobile application is to submit power outage information to an AppFabric Queue.  Before we can do this we need to retrieve a token from the Access Control Service and include this token in our AppFabric Queue message header.  Once the message is in the Queue, we will once again use a Code Service to retrieve messages that we will then insert into a SQL Azure table.
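At the wire level, the two-step exchange described above looks roughly like this.  The namespace, queue name, issuer credentials and token values are placeholders, and the endpoint paths reflect my reading of the CTP-era ACS/Service Bus conventions:

```text
POST https://<namespace>-sb.accesscontrol.appfabriclabs.com/WRAPv0.9/
Content-Type: application/x-www-form-urlencoded

wrap_scope=http%3A%2F%2F<namespace>.servicebus.appfabriclabs.com%2F&wrap_name=owner&wrap_password=<issuer-key>

(response body contains: wrap_access_token=<token>&wrap_access_token_expires_in=...)

POST https://<namespace>.servicebus.appfabriclabs.com/<queue>/Messages
Authorization: WRAP access_token="<token>"
Content-Type: text/plain

<message body>
```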

image

 

Building our Mobile Solution

One of the benefits of using AppFabric Queues is the loose coupling between publishers and subscribers.  With this in mind, we can proceed with building a Windows Phone 7 application in its own solution.  For the purpose of this blog post I am going to use the latest Mango Beta SDK bits.

image

Since I have downloaded the latest 7.1 bits, I am going to target this phone version.

image

 

On our WP7 Canvas we are going to add a few controls:

  • TextBlock called lblAddress that has a Text property of Address
  • TextBox called txtAddress that has an empty Text Property
  • Button called btnSubmit that has a Content property of Submit
  • TextBlock called lblStatus that has an empty Text Property

image

 

Within the btnSubmit_Click event we are going to place our code that will communicate with the Access Control Service.

private void btnSubmit_Click(object sender, RoutedEventArgs e)
{
    // Build ACS and Service Bus addresses.
    // Note: https, http, serviceNameSpace, acsSuffix and serviceBusSuffix are
    // string fields defined elsewhere on the page (not shown in this post).
    string acsAddress = https + serviceNameSpace + acsSuffix;
    string relyingPartyAddress = http + serviceNameSpace + serviceBusSuffix;
    string serviceAddress = https + serviceNameSpace + serviceBusSuffix;

    // Formulate the WRAP query string
    string postData = "wrap_scope=" + Uri.EscapeDataString(relyingPartyAddress) +
        "&wrap_name=" + Uri.EscapeDataString(issuerName) +
        "&wrap_password=" + Uri.EscapeDataString(issuerKey);

    WebClient acsWebClient = new WebClient();

    // Since Web/Http calls are all async in WP7, we need to register an event handler
    acsWebClient.UploadStringCompleted += new UploadStringCompletedEventHandler(acsWebClient_UploadStringCompleted);

    // Instantiate a Uri object with our ACS URL so that we can provide it in the remote method call
    Uri acsUri = new Uri(acsAddress);

    acsWebClient.UploadStringAsync(acsUri, "POST", postData);
}

 

Since we are making an asynchronous call to the ACS service, we need to implement the handling of the response from the ACS service.

private void acsWebClient_UploadStringCompleted(object sender, UploadStringCompletedEventArgs e)
{
    if (e.Error != null)
    {
        lblStatus.Text = "Error communicating with ACS";
    }
    else
    {
        // Store the response since we will need to pull the ACS token from it
        string response = e.Result;

        // Update the WP7 UI with a status update
        lblStatus.Text = "Received positive response from ACS";

        // Sleep just for visual purposes
        System.Threading.Thread.Sleep(250);

        // Parse the ACS token from the response
        string[] tokenVariables = response.Split('&');
        string[] tokenVariable = tokenVariables[0].Split('=');
        string authorizationToken = Uri.UnescapeDataString(tokenVariable[1]);

        // Create the Web client that we will use to populate the Queue
        WebClient queueClient = new WebClient();

        // Add our authorization token to the header
        queueClient.Headers["Authorization"] = "WRAP access_token=\"" + authorizationToken + "\"";
        queueClient.Headers[HttpRequestHeader.ContentType] = "text/plain";

        // Capture the textbox data
        string messageBody = txtAddress.Text;

        // Assemble our queue address, for example:
        // https://MyNameSpace.servicebus.appfabriclabs.com/MyQueueName/Messages
        string sendAddress = https + serviceNameSpace + serviceBusSuffix + queueName + messages;

        // Register the event handler
        queueClient.UploadStringCompleted += new UploadStringCompletedEventHandler(queueClient_UploadStringCompleted);

        Uri queueUri = new Uri(sendAddress);
        // Call the method to populate the queue
        queueClient.UploadStringAsync(queueUri, "POST", messageBody);
    }
}

So at this point we have made a successful request to ACS and received a response that included our token.  We then registered an event handler as we will call the AppFabric Service Bus Queue using an Asynchronous call.  Finally we made a call to our Service Bus Queue.

We now need to process the response coming back from the AppFabric Service Bus Queue.

private void queueClient_UploadStringCompleted(object sender, UploadStringCompletedEventArgs e)
{
    //Update status to user.
    if (e.Error != null)
    {
        lblStatus.Text = "Error sending message to Queue";
    }
    else
    {
         lblStatus.Text = "Message successfully sent to Queue";
    }
}

 

That concludes the code that is required to submit a message securely to the AppFabric Service Bus Queue using the Access Control Service to authenticate our request.

Building our Azure AppFabric Application

The first artifact that we are going to build is the AppFabric Service Bus Queue called QueueMobile.

image_thumb2[1]

Much like we have done in previous posts we need to provide an IssuerKey, IssuerName and Uri.

image_thumb4

The next artifact that we need to add is a SQL Azure Database.

image26_thumb

Adding this artifact is only the beginning.  We still need to create our local database in our SQL Express instance, so I have manually created a Database called PowerOutage and a Table called Outages.

image_thumb6

Within this table I have two very simple columns: ID and Address.

image_thumb10
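For reference, the local table can be created with a few lines of T-SQL.  This is just a sketch — the column types and sizes are my assumptions, since only the column names are specified above:

```sql
--Hypothetical DDL for the local SQL Express database; column sizes are assumptions
CREATE DATABASE PowerOutage;
GO
USE PowerOutage;
GO
CREATE TABLE Outages
(
    ID      NVARCHAR(50)  NOT NULL,  --will hold the BrokeredMessage MessageId
    Address NVARCHAR(255) NOT NULL   --will hold the address submitted from the phone
);
```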

So the next question is how do I connect to this database.  If we navigate to the AppFabric Applications Manager, found within the AppFabric Labs portal, we will see that a SQL Azure DB has been provisioned for us.

image_thumb12

Part of this configuration includes our connection string for this Database.  We can access this connection string by clicking on the View button that is part of the Connection String panel.

image_thumb13

I have covered up some of the core credential details that are part of my connection string for security reasons.  To make things a little more consistent, I have created a SQL Server account with these same credentials in my local SQL Express instance.  This way, when I provision to the cloud, I only need to change my Data Source.

image_thumb17

For the time being I am only interested in my local development fabric, so I need to update my connection string to use my local SQL Express version of the database.

 

image_thumb19
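To illustrate, the only piece that should need to change between environments is the Data Source.  The values below are hypothetical examples, not my real server name or credentials:

```
--Local SQL Express (development fabric) - hypothetical example
Data Source=.\SQLEXPRESS;Initial Catalog=PowerOutage;User ID=MyUser;Password=MyPassword;

--SQL Azure (when provisioning to the cloud) - hypothetical example
Data Source=myserver.database.windows.net;Initial Catalog=PowerOutage;User ID=MyUser;Password=MyPassword;
```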

With our Queue and Database now created and configured, we need to focus on our Code Service.  The purpose of this Code Service is to retrieve messages from our AppFabric Queue and insert them into our SQL Azure table.  We will call this Code Service CodeMobileQueue and then click the OK button to proceed.

image17_thumb

We now need to add references from our Code Service to both our AppFabric Queue and our SQL Azure Instance.  I always like to rename my references so that they have meaningful names.

image_thumb5

Inside our Code Service, it is now time to start focusing on the plumbing of our solution.  We need to be able to retrieve messages from the AppFabric Queue and insert them into our SQL Azure table.

public void Run(CancellationToken cancelToken)
{
    //Create reference to our Queue Client
    QueueClient qClient = ServiceReferences.CreateQueueMobile();

    //Create reference to our SQL Azure Connection
    SqlConnection sConn = ServiceReferences.CreateSqlQueueMobile();
    MessageReceiver mr = qClient.CreateReceiver();
    BrokeredMessage bm;

    Stream qStream;
    StreamReader sReader;
    string address;

    System.Diagnostics.Trace.TraceInformation("Entering Queue Retrieval " + System.DateTime.Now.ToString());

    //Open the Connection to the database once, outside the loop,
    //so we do not try to re-open an already open connection on the next iteration
    sConn.Open();

    while (!cancelToken.IsCancellationRequested)
    {
        while (mr.TryReceive(new TimeSpan(hours: 0, minutes: 30, seconds: 0), out bm))
        {
            try
            {
                //Note: we are using a Stream here instead of a String like in other examples.
                //The reason is that we did not put the message on the wire using a
                //BrokeredMessage (binary format) like in other examples; we just put on raw text.
                //The way around this is to use a Stream and then a StreamReader to pull the text out as a String.
                qStream = bm.GetBody<Stream>();
                sReader = new StreamReader(qStream);
                address = sReader.ReadToEnd();

                //Remove message from the Queue
                bm.Complete();

                System.Diagnostics.Trace.TraceInformation(string.Format("Message received: ID= {0}, Body= {1}", bm.MessageId, address));

                //Insert the message retrieved from the Queue into the database
                SqlCommand cmd = sConn.CreateCommand();
                cmd.Parameters.Add(new SqlParameter("@ID", SqlDbType.NVarChar));
                cmd.Parameters["@ID"].Value = bm.MessageId;
                cmd.Parameters.Add(new SqlParameter("@Address", SqlDbType.NVarChar));
                cmd.Parameters["@Address"].Value = address;
                cmd.CommandText = "Insert into Outages(ID,Address) Values (@ID,@Address)";
                cmd.CommandType = CommandType.Text;
                cmd.ExecuteNonQuery();
                System.Diagnostics.Trace.TraceInformation("Record inserted into Database");
            }
            catch (Exception ex)
            {
                System.Diagnostics.Trace.TraceError("Error occurred " + ex.ToString());
            }
        }

        Thread.Sleep(5 * 1000);
    }

    mr.Close();
    qClient.Close();
    sConn.Dispose();
}

Testing Application

We are done with all the coding and configuration for our solution.  Once again I am going to run this application in the local Dev Fabric, so I am going to go ahead and press CTRL + F5.  Once my Windows Azure Emulator has started and our application has been deployed, we can start our Windows Phone project.

For the purpose of this blog post we are going to run our Windows Mobile solution in the provided emulator.  However, I have verified the application can be side-loaded on a WP7 device and the application does work properly.

We are now going to populate our Address text box with a value.  In this case I am going to provide 1 Microsoft Way and click the Submit button.

image_thumb[1]

Once we click the Submit button we can expect our first status message update indicating that we have received a positive response from ACS.

image_thumb[5]

The next update we will have displayed is one that indicates our message has been successfully sent to our AppFabric Queue.

image_thumb[3]

As outlined previously, our WP7 app will publish a message to our AppFabric Queue; from there our Code Service will de-queue the message and insert the record into our SQL Azure table.  So if we check our Outages table we will discover that a record has been added to our Database.

 

image_thumb[7]

 

Conclusion

Overall I am pretty pleased with how this demo turned out.  I really like the ability to have a loosely coupled interface that a Mobile client can utilize.  What is also nice about using a RESTful interface is that we have a lot of flexibility when porting a solution like this over to other platforms.

Another aspect of this solution that I like is having a durable Queue in the cloud.  In this solution we had a code service de-queuing this message in the cloud.  However, I could also have some code written that is living on-premise that could retrieve these messages from the cloud and then send them to an LOB system like SAP.  Should we have a planned, or unplanned system outage on-premise, I know that all mobile clients can still talk to the Queue in the cloud.

AppFabric Apps (June 2011 CTP) Simple Service Bus Topics–Part 2

(…Continued from Part 1, if you have not read it please do so to understand the context of this post)

Solution

We want customers to be able to submit power outages to our fictitious power company located in the Redmond, Washington area. What is different this time around (from our Queue scenario) is that, due to growth in the Kirkland, Washington area, we have subcontracted power line maintenance to another company. So we want our Redmond system to receive messages for the Redmond area and this other company to receive messages for Kirkland. The Redmond company should not see the Kirkland messages and vice versa.

Configuring Core Artifacts

The user interface for this application will once again be an ASP.Net Web Application.  We will add this Web Application by clicking on the Add New Service label which is found on the AppFabric Design View Canvas.

image

Next, we will want to provide an appropriate name for this Web App and click on the OK button.

image

We now need to create a Service Bus Topic and can do this by once again clicking on the Add New Service label which is found on the AppFabric Design View Canvas.

image

Much like we have had to do with other Service Bus artifacts, we need to provide our Service Bus IssuerKey, IssuerName and Uri.  For my URI, I have provided the following value:

sb://<your_namespace>.servicebus.appfabriclabs.com/SimpleServiceBusTopic

image

Note: Notice the RequiresProvisioning property, which is set to True.  When we deploy our AppFabric application, the provisioning of our Topic will be taken care of without any additional work on our side. When we shut down the Azure Compute Emulator, this Topic will be removed.

In many of the May AppFabric CTP examples out there, people handle provisioning tasks by using the ServiceBusNamespaceClient class.  Nowhere in this code will we be directly using this class, though I would imagine that under the hood the provisioning code is using it.

Next, in our Web application we need to add a reference to this newly created Topic.

image

I like to rename my references to provide a more descriptive title than the default Import1 value.

image

 

With our Topic now set up we want to add two subscriptions.  The first subscription will be for our Redmond messages and the second will be for our Kirkland messages.

Redmond

image

 

Kirkland

image

 

For each of these Subscriptions we need to once again provide the IssuerKey, IssuerName and Uri properties.  The Uri property for subscriptions provides an interesting twist: the actual word “subscriptions” must be included between the name of your Topic and your Subscription. The convention for these URIs is:

sb://your_namespace.servicebus.appfabriclabs.com/TopicName/subscriptions/SubscriptionName

 

image

 

Much like in the Queue blog post, we are going to use a Code Service to retrieve messages from our Topic via a Subscription.  We once again need to click on the Add New Service label and select Code.

image

 

This Code Service will represent the Redmond Client and therefore we need to add a reference to the SimpleServiceBusSubscriptionRedmond Subscription.

image

 

The underlying core of our application should be set and we should be able to build our application.  If we look at the AppFabric Design View Canvas we should see the following:

image

Our Diagram View should look like this:

image

 

Update Web Application

At this point we have simply configured all of the artifacts that are required in our solution, but we have not written any code, so our application will not be very functional.  In the Source view of the Default.aspx page we want to add a few controls:

  • Text Box for our Address
  • Text Box for our City
  • Button used to submit our form

 

image

In the code behind for Default.aspx we will want to add an event handler for the Button clicked event.

protected void btnSubmit_Click(object sender, EventArgs e)
{
    TopicClient tc = ServiceReferences.CreateSimpleServiceBusTopic();

    MessageSender ms = tc.CreateSender();

    BrokeredMessage bm = BrokeredMessage.CreateMessage(txtAddress.Text);
    bm.Properties["OutageCity"] = txtCity.Text;

    ms.Send(bm);

    txtAddress.Text = "";
    txtCity.Text = "";
}

Some lines of significance include:

TopicClient tc = ServiceReferences.CreateSimpleServiceBusTopic(); where we are creating an instance of our TopicClient.

The BrokeredMessage instantiation line includes our message body which is our address that is coming from our Address text box.

BrokeredMessage bm = BrokeredMessage.CreateMessage(txtAddress.Text);

 

You may also notice where we assign our City value to a property called OutageCity.  This property is kinda like a BizTalk Context property.  We are going to be able to route our message based upon this value.  We will later use this property when creating a Subscription Filter.

bm.Properties["OutageCity"] = txtCity.Text;

Note: this code looks a little different than the May AppFabric CTP code that you may have seen.  In the May CTP we had to worry about our Service Bus configuration: our namespace, issuer name and key.  We also had to create an instance of a MessagingFactory object so that we could create a TopicClient.  I like the AppFabric Application approach better.  It is a little cleaner, and all of our configuration info is handled in our Application model.

 

Update Code Service

We now need to update our Run method that exists within our Code Service.  As discussed in the Queue blog post, this method will continue to run until a cancellation request is received.  The purpose of this method is to retrieve messages from our Topic via our Redmond Subscription.

For the purpose of this demonstration we are simply going to log messages that have been retrieved  in our Trace viewer.

public void Run(CancellationToken cancelToken)
{
    System.Diagnostics.Trace.TraceInformation("Entering Code Service Loop");

    while (!cancelToken.IsCancellationRequested)
    {
        SubscriptionClient subClient = ServiceReferences.CreateSimpleServiceBusSubscription();
        subClient.MessagingFactory.CreateSubscriptionClient("SimpleServiceBusTopic", "SimpleServiceBusSubscriptionRedmond");

        MessageReceiver mr = subClient.CreateReceiver();
        BrokeredMessage bm;

        while (mr.TryReceive(new TimeSpan(hours: 0, minutes: 0, seconds: 5), out bm))
        {
            System.Diagnostics.Trace.TraceInformation(string.Format("Message received: ID= {0}, Body= {1}", bm.MessageId, bm.GetBody<string>()));
            bm.Complete();
        }

        Thread.Sleep(5 * 1000);
    }
}

Some lines of significance include:

Our line where we create a subscription client that will include the name of our Topic and our Subscription.

subClient.MessagingFactory.CreateSubscriptionClient("SimpleServiceBusTopic", "SimpleServiceBusSubscriptionRedmond");

The next line is our while loop that will attempt to retrieve messages from this subscription.  If a message is retrieved it will be stored in a BrokeredMessage.

while (mr.TryReceive(new TimeSpan(hours: 0, minutes: 0, seconds: 5), out bm))

Creating a Kirkland client

Our Kirkland client is going to follow a different path than our Redmond client.  At this point everything that we have built exists in the Azure Cloud.  For our Kirkland client, it is going to reside on-premise.  Since our Topics and Subscriptions are provisioned in the cloud we can access them from within the cloud and on-premise.

For this example we will follow the path that many of the May CTP examples took.  It is a console application that is in a separate solution from this AppFabric Application.  In this application I have added references manually to the ServiceBus CTP assemblies.

 

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using Microsoft.ServiceBus.Description;

namespace SimpleTopicOnPrem
{
    class Program
    {
        static void Main(string[] args)
        {
            Uri sbURI = ServiceBusEnvironment.CreateServiceUri("sb", "<your_namespace>", string.Empty);
            string name = "owner";
            string key = "<your_key>";

            MessagingFactory factory = MessagingFactory.Create(sbURI, TransportClientCredentialBase.CreateSharedSecretCredential(name, key));

            SubscriptionClient subClient = factory.CreateSubscriptionClient("SimpleServiceBusTopic", "SimpleServiceBusSubscriptionKirkland");

            //Next we create the MessageReceiver and receive messages
            MessageReceiver mr = subClient.CreateReceiver();

            BrokeredMessage bm;

            while (mr.TryReceive(new TimeSpan(hours: 0, minutes: 5, seconds: 0), out bm))
            {
                Console.WriteLine(string.Format("Message received: ID= {0}, Body= {1}", bm.MessageId, bm.GetBody<string>()));
                bm.Complete();
            }

            Console.WriteLine("Done Reading From Queue");
            Console.Read();
        }
    }
}

 

This code looks pretty similar to our Code Service with the exception of how we create a SubscriptionClient and that we need to deal with our credentials and URI in the application.

 

Testing the application

For the purpose of this blog post we are going to keep our AppFabric application in our Development Fabric.  We can provision and deploy our application by pressing CTRL + F5.  We should see our web application launch.

Next we will want to start an instance of our Kirkland On Premise client by typing F5.  This means that both of our applications are up and running and ready to receive messages.

Since we have not added any filters on our Subscriptions both applications should receive a copy of any message that we submit. 

 

The first message that we will send will have an Address of Space Needle and a City of Seattle.

image

 

In our Trace logs and in our Console application we will discover that our message has been received by both clients.

image

 

At this point we have proved that our initial deployment has been successful and that we can broadcast this message to multiple clients.  But, this is not the end state that we desire.  Remember we want the Redmond client to only receive the Redmond messages and the Kirkland client to only receive the Kirkland messages.  In order to accomplish this behavior we need to add Subscription rules.

Unless I have missed something (and please let me know if I have), there is no way to specify our filter when we create our Subscription through the AppFabric Design Canvas.  The only way that I have seen to do this is through code, by using the ServiceBusNamespaceClient class.  So I could have written this code but opted for a different option.  The AppFabric CAT team recently released a tool called the Service Bus Explorer.  Within this tool, you can provide your credentials and then interrogate your Queues, Topics and Subscriptions.  You can read all about this tool and some more detailed information on their blog post.
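For completeness, creating the subscription with a filter in code would look roughly like the sketch below.  I have not tested this against the CTP, so treat the class and method names as assumptions based on the May/June CTP samples:

```csharp
//Untested sketch based on the June 2011 CTP samples; names and signatures are assumptions
ServiceBusNamespaceClient namespaceClient = new ServiceBusNamespaceClient(
    ServiceBusEnvironment.CreateServiceUri("sb", "<your_namespace>", string.Empty),
    TransportClientCredentialBase.CreateSharedSecretCredential("owner", "<your_key>"));

Topic topic = namespaceClient.GetTopic("SimpleServiceBusTopic");

//Create the Redmond subscription with a SQL filter
//instead of the default "match everything" rule
topic.AddSubscription("SimpleServiceBusSubscriptionRedmond",
    new SqlFilterExpression("OutageCity='Redmond'"));
```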

Once I have connected to my namespace using the Service Bus Explorer, I want to navigate to my Redmond subscription and then delete the existing rule.  Basically this default rule says that all messages should be retrieved using this subscription.

image

I then want to right-click on the Redmond Subscription and select Add Rule.  I now need to provide a name for this rule (it can be anything) and a SQL Filter Expression.  In this case I am providing OutageCity=’Redmond’.

image

With my Subscription rule established for my Redmond Subscription, let’s submit another message.  This time we will provide an Address of 1 Microsoft Way and a City of Redmond.

image

 

Notice that both clients received this message.  The message matched the Subscription rule (filter) for the Redmond client, and since we have not configured a Subscription rule (filter) for our Kirkland client, it is still configured to receive all messages.

image

To prove that the Redmond Subscription is working, let’s send in a message that will not match the Redmond rule.  If we make the city equal to Bellevue, only the Kirkland application should receive it.

image

 

We will discover that the Subscription rule is working.  Our Redmond client did not receive this Bellevue message, but our Kirkland client did, since its Subscription rule is wide open.

image

So let’s create a Subscription rule for our Kirkland client.  We will head back to the Service Bus Explorer tool to do this.

Once again we will delete the Default Rule, this time for the Kirkland Subscription.  We will add a new rule by right-clicking on the Kirkland Subscription and selecting Add Rule.  We will provide this rule with a Name of Kirkland and a SQL Filter Expression of OutageCity=’Kirkland’.

image

With our Subscription rule now in place, let’s send in a message that only our Kirkland client will receive.

 

image

Sure enough, our Redmond client did not receive a copy of this message since its Subscription rule didn’t match.  Our Kirkland client did receive this message since it did match our Subscription rule.

image

 

Closing Notes:

  • You will lose any of these Subscription rules between deployments.  When you shut down the Dev Fabric emulator, your Topic and Subscriptions will be removed.  When you redeploy your application to the local Dev Fabric, your Topic and Subscriptions will be recreated, but your rules/filters will not return unless you configure them again.
  • Topics will push messages to Subscriptions no matter the Rule that you have in place.  But, clients will only retrieve messages that match the Subscription rule(s).
  • You can have multiple Subscription rules per Subscription.
  • It would be nice if we can provide our Subscription rules in our AppFabric Design view canvas.  This way they could be deployed with our Topic and Subscription(s).
  • Publishers instantiate TopicClient instances; consumers instantiate SubscriptionClient instances.
  • Overall the technology is pretty cool.  Having true pub/sub in the cloud should open up many opportunities for organizations.