
Azure–Service Bus Queues in Worker Roles

June 23, 2012

 

Another nugget of information that I picked up at TechEd North America is a new template that ships as part of the Azure SDK 1.7 called Worker Role with Service Bus Queue. 

image

What got me interested in this feature is some of the work that I did last year with the AppFabric Applications (Composite Apps) CTP.  I was a big fan of that CTP as it allowed you to wire up different cloud services rather seamlessly.  That CTP has been officially shut down, so I can only assume that this template was introduced to address some of the problems that the AppFabric Applications CTP sought to solve.

 

The Worker Role with Service Bus Queue template works in both Visual Studio 2010 and Visual Studio 2012 RC.  For this post I am going to be using 2012 RC.

I am going to use a similar scenario to the one that I used in the App Fabric Applications CTP.  I will build a simple Web Page that will allow a “customer” to populate a power outage form.  I will then submit this message to a Service Bus Queue and will then have a Worker role dequeue this message. For the purpose of this post I will simply write a trace event to prove that I am able to pull the message off of the queue.

 

Building the Application

  • Create a new project by clicking on File (or is it FILE) – New Project
  • Select the Cloud template and you will see a blank pane with no template able to be selected.  This is because the Azure SDK is currently built on top of .NET 4.0, not 4.5.  With this in mind, we need to select .NET Framework 4

image

  • We now need to select the Cloud Services that will make up our solution.  In my scenario I am going to include an ASP.Net Web Role and a Worker Role with Service Bus Queue.

image

Note: we do have the opportunity to rename these artifacts by hovering over the label and then clicking on the pencil. This was a gap that existed in the old AppFabric Apps CTP.  After renaming my artifacts my solution looks like this:

image

  • I want to send and receive a strongly typed message so I am going to create a Class Library and call it CustomerEntity.

image

  • In this project I will simply have one class called Customer with the following properties:

namespace CustomerEntity
{
    public class Customer
    {
        public string Address { get; set; }
        public string City { get; set; }
        public string State { get; set; }
    }
}
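
One serialization note: passing an object to the BrokeredMessage constructor serializes it with the DataContractSerializer, so a plain class with public properties like the one above works as-is.  If you would rather make the wire contract explicit, the same class could optionally be decorated like this (the attributes are my own addition, not something the template requires):

```csharp
using System.Runtime.Serialization;

namespace CustomerEntity
{
    // Explicit data contract; functionally equivalent to the plain class above
    [DataContract]
    public class Customer
    {
        [DataMember]
        public string Address { get; set; }

        [DataMember]
        public string City { get; set; }

        [DataMember]
        public string State { get; set; }
    }
}
```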

 

  • I will then add a reference in both the Web Project and Worker Role projects to this CustomerEntity project.

 

  • Within the PowerOutageWeb project clear out all of the default markup in the Default.aspx page and add the following controls.

<h3>Customer Information:</h3>
Address: <asp:TextBox ID="txtAddress" runat="server"></asp:TextBox><br />   
City: <asp:TextBox ID="txtCity" runat="server"></asp:TextBox> <br />
State: <asp:TextBox ID="txtState" runat="server"></asp:TextBox><br />
<asp:Button ID="btnSubmit" runat="server" Text="Submit"  OnClick="btnSubmit_Click" />
<asp:Label ID="lblResult" runat="server" Text=""></asp:Label>

 

  • Also within the PowerOutageWeb project we need to add references to the Service Bus Assembly:  Microsoft.ServiceBus.dll and Runtime Serialization Assembly: System.Runtime.Serialization.dll

 

  • We now need to add the following using directives:

using CustomerEntity;
using Microsoft.WindowsAzure;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

 

  • Next we need to provide a click event for our submit button and then include the following code:

protected void btnSubmit_Click(object sender, EventArgs e)
{
    Customer cs = new Customer();
    cs.Address = txtAddress.Text;
    cs.City = txtCity.Text;
    cs.State = txtState.Text;

    const string QueueName = "PowerOutageQueue";

    // Get the connection string from the configuration
    string connectionString = CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");

    var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

    // Check to see if the Queue exists; if it doesn't, create it
    if (!namespaceManager.QueueExists(QueueName))
    {
        namespaceManager.CreateQueue(QueueName);
    }

    MessagingFactory factory = MessagingFactory.CreateFromConnectionString(connectionString);

    // Create Queue Client
    QueueClient myQueueClient = factory.CreateQueueClient(QueueName);

    BrokeredMessage bm = new BrokeredMessage(cs);

    // Send Message
    myQueueClient.Send(bm);

    // Update Web Page
    lblResult.Text = "Message sent to queue";
}

 

  • Another new feature in this SDK that you may have noticed is the CreateFromConnectionString method that is available on the MessagingFactory and NamespaceManager classes.  This allows us to retrieve our configuration settings from our project properties page.  To access the project properties, right mouse click on the particular role and then select Properties.  Next, click on Settings, where you will find Key/Value pairings.  The name of the key that we are interested in is Microsoft.ServiceBus.ConnectionString and our value is

Endpoint=sb://[your namespace].servicebus.windows.net;SharedSecretIssuer=owner;SharedSecretValue=[your secret]

  • Since both our Web and Worker Roles will be accessing the Queue, we will want to ensure that both configuration files have this entry included.  This will allow our code to make a connection to our Service Bus Namespace where our Queue may be found.  If we edit this property in these locations, then we do not need to modify the Cloud.cscfg and Local.cscfg configuration files manually because Visual Studio will take care of this for us.
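
For reference, the resulting entry in each role's section of the service configuration (.cscfg) files looks something like the fragment below; the role name and placeholder values are the ones from this walkthrough:

```xml
<Role name="PowerOutageWeb">
  <ConfigurationSettings>
    <Setting name="Microsoft.ServiceBus.ConnectionString"
             value="Endpoint=sb://[your namespace].servicebus.windows.net;SharedSecretIssuer=owner;SharedSecretValue=[your secret]" />
  </ConfigurationSettings>
</Role>
```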

image

  • Next we want to shift focus to the Worker Role and edit the WorkerRole.cs file.  Since we are going to be dequeuing our typed CustomerService message we want to include a reference to this namespace:

    using CustomerEntity;

  • Something that you probably noticed when you opened up the WorkerRole.cs file is that there is already some code written for us.  We can leverage most of it but can delete the code that is highlighted in red below:

image

  • Where we deleted this code, we will want to add the following:

Customer cs = receivedMessage.GetBody<Customer>();
Trace.WriteLine(receivedMessage.SequenceNumber.ToString(), "Received Message");
Trace.WriteLine(cs.Address, "Address");
Trace.WriteLine(cs.City, "City");
Trace.WriteLine(cs.State, "State");
receivedMessage.Complete();

In this code we are going to receive a typed Customer message and then simply write out the contents of this message using the Trace utility.  If we wanted to save this information to a database, this would be a good place to write that code.

  • We also want to make sure that we update the name of the Queue so that it matches the name that we specified in the Web Role:

// The name of your queue
const string QueueName = "PowerOutageQueue";
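
Putting the receive side together, the relevant portion of WorkerRole.cs ends up looking roughly like the sketch below.  The Client member and the overall loop come from the generated template (member names may differ slightly between SDK versions); only the body inside the try block is the code we added:

```csharp
public override void Run()
{
    Trace.WriteLine("Starting processing of messages");

    while (true)
    {
        // Receive blocks until a message arrives (or an internal timeout elapses)
        BrokeredMessage receivedMessage = Client.Receive();

        if (receivedMessage != null)
        {
            try
            {
                // Deserialize the strongly typed Customer message and trace its contents
                Customer cs = receivedMessage.GetBody<Customer>();
                Trace.WriteLine(receivedMessage.SequenceNumber.ToString(), "Received Message");
                Trace.WriteLine(cs.Address, "Address");
                Trace.WriteLine(cs.City, "City");
                Trace.WriteLine(cs.State, "State");

                // Remove the message from the queue
                receivedMessage.Complete();
            }
            catch (Exception)
            {
                // Return the message to the queue so it can be retried
                receivedMessage.Abandon();
            }
        }
    }
}
```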

 

Creating our Queue

We have a few options when it comes to creating the Queue that is required for this solution to work.  The code in our ASP.NET Web page code-behind will take care of this for us, so for our solution to work that method is sufficient, but perhaps we want to use a design-time alternative in order to specify some more advanced features.  In this case we do have a few options:

  • Using the http://www.windowsazure.com portal.
  • Similarly we can use the Service Bus Explorer tool that was written by Paolo Salvatori.  Steef-Jan Wiggers has provided an in-depth walk through of this tool so I won’t go into more details here. http://soa-thoughts.blogspot.ca/2012/06/visual-studio-service-bus-explorer.html
  • As part of the Azure 1.7 SDK release, a Visual Studio Service Bus Explorer is now included.  It is accessible from the Server Explorer view from within Visual Studio.  Using this tool we can perform some functions like:
    • Creating Queues/Topics
    • Set advanced properties: Queue Size, Time to Live, Lock Duration, etc.
    • Send and Receive Test Messages
    • Get the current Queue Depth
    • Get our Service Bus Connection string

image

 

Any of these methods will work.  As I mentioned earlier, if we do nothing, the code will take care of it for us.

Testing

We are just going to test our code locally, with the exception of our Service Bus Queue.  That is going to be created in Azure.  To test our example:

  • Press F5 and our website should be displayed

image

  • Since we have our Trace statements included in our Worker Role code, we need to launch our Compute Emulator.  To do this right mouse click on the Azure icon that is located in your taskbar and select Show Compute Emulator UI.

image

  • A “Console”-like window will appear.  We will want to click on the ServiceBusWorker label

image

  • Switch back to our Web Application, provide some data and click the Submit button.  We should see that our results label is updated to indicate that our message was sent to the queue.

image
  • If we switch back to our Compute Emulator we should discover that our message has been dequeued.

image

 

Conclusion

While the experience is a little different from that of the AppFabric Applications CTP, it is effective, especially for developers who may not be all that familiar with “integration”.  This template provides a great starting point and allows them to wire up a Service Bus Queue to their Web Application very quickly.

Categories: Azure, ServiceBus

Azure Service Bus–Don’t run as Root!

June 20, 2012

 

So I can’t take credit for the catchy tag line for this post.  It was inspired by a recent session that Clemens Vasters and Abhishek Lal delivered at TechEd North America 2012.  You can watch the entire session here.

While watching the presentation, this particular segment on not running as root really resonated with me.  I have built some proof-of-concept mobile applications that use REST APIs to send messages to Service Bus Queues.  In these applications I use the default owner key to submit messages.  I knew at the time that this was not a good practice, but since it was just a POC it was acceptable.  I was curious about how I could solve this problem in a better manner, and I think what Microsoft has done here is definitely a step in the right direction.

If you are familiar with the Azure Service Bus, and Azure in general, you know there are 3 fundamental ‘artifacts’ that are required whenever you try to provision or manipulate Azure Services.  These artifacts are:

  • Namespace
  • Default Issuer (username)
  • Default Key (password)

image

image

The Problem

In 99.99% of the demos and blog posts that exist on the interwebs, people are embedding their Default Issuer and Default Key in their solution.  This creates issues on a few different levels:

  • The account really is “root” when it comes to your namespace.  Using this account allows you to create services like Queues, Topics, ACS Trust relationships, provision Cache, etc.  Do you see the problem with embedding these credentials in your app and then distributing it?
  • If your account does become compromised, it could be used to maliciously manipulate your solution.  If you compromised a solution that processed Customer Orders, wouldn’t it be nice to add your own subscriber to the Topic Subscription and receive a copy of each customer’s Credit Card number?

Like any other security principal, we want to ensure that people, or systems, have the minimum level of security rights that they need to perform a specific function.  A parallel example to this scenario is giving end users Domain Admin.  You wouldn’t do that, much like you shouldn’t embed your “owner’s” credentials in your application.

Enter SBAzTool

There is a tool included in the Windows Azure 1.7 SDK that can help us assign fine-grained authorization to “Default Name(s)” (usernames).  You can download the source code for this tool here.  In the rest of this post I will demonstrate how you can use this tool and then show an example of why it is beneficial and that creating authorization rules is not so hard.

Create Namespace in Portal

Even though I have a functional namespace I am going to go ahead and create one from scratch so that we are all beginning at the same starting point.  If you have an existing namespace that you want to use, you can.  You don’t have to go through these next few steps where I create the namespace.

    • From http://www.windowsazure.com, log in and then select the previous (old) portal, as Service Bus and ACS do not exist in the new portal…yet.
    • Next, click on the Service Bus label and then click on the {New} button

    image  

  • In this case I am selecting Access Control, Service Bus, a namespace and a Country/Region.  Since United States (West) is closest to my locale, I will use it.

image

 

  • So I now have a Namespace called DontRunAsRoot.  It may take a couple minutes to create.  You will also notice that we have a tree like structure where our Queues and Topics will be displayed.

image

 

  • For the purpose of this demo we are going to create a Queue called MyQueue by clicking on the New Queue button.

image

 

  • We will need to populate the Name text box and then can accept the defaults.

image

 

  • Voila, we have our newly created Namespace and Queue

image

  • We are going to need our “owner” key so we might as well get it while we are in the portal.  Click on the View button and then click the Copy to Clipboard button.

 

    image

      Compiling SBAzTool

      Once you have downloaded the SBAzTool you can compile the solution, open a command prompt and launch the tool by running sbaztool.exe.  You will then see all of the command line arguments.  This documentation is also available on the download page.

       

      StoreOptions

      In order to set permissions for other users/issuers we need to be authenticated ourselves using the Key that we previously retrieved from the portal.  By using the storeoptions argument we can continue to execute commands without having to re-issue our key/password.  To do this we will want to execute the following command:

      sbaztool.exe storeoptions -n <namespace> -k <key>

      image

       

      MakeId

      So let’s now create a new “user” by specifying the makeid command.  In this case we can actually specify a “username” instead of the regular “owner” that we are so used to.

      sbaztool.exe makeid <username>

      image

      As you can see, a Key has been provided for this user (which I have whited out).  We also have the ability to specify our own password by including a <key>, provided it is a 32-byte, base64-encoded value.

      Grant

      Now that we have an Issuer/User created, we can assign this user permissions.  The command to do so is:

      grant <operation> <path> <name>

      The available operations that we have access to are:

      • Send
      • Listen
      • Manage

      Send and Listen are pretty self-explanatory, but Manage deserves further elaboration.  We can actually delegate the authority to manage resources to a particular user based upon a path.  So let’s imagine we have a path that looks like this:

      /organization/department/engineering

      If we wanted to allow someone to administrate the /engineering services we could do so using this command.

      For the purpose of this blog post, let’s keep things simple and assign Send and Listen privileges to our QueueUser for our queue, which is called myqueue.

      Send

      image

      Listen

      image
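
      For reference, the Send and Listen grants captured in the screenshots above follow the grant <operation> <path> <name> syntax shown earlier.  The exact invocations are hidden by the screenshots, so the lines below are a reconstruction using the QueueUser issuer and myqueue queue from this walkthrough; check the tool’s documentation for the exact path syntax:

      ```
      sbaztool.exe grant Send myqueue QueueUser
      sbaztool.exe grant Listen myqueue QueueUser
      ```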

      Show

      To verify that our permissions have been set correctly we can execute the Show command by providing the following:

      show <path>

      As you can see below, the QueueUser now has Send and Listen permissions on the queue called myqueue.  This is a much better solution than granting rights to the entire namespace when you only need rights on a particular queue.

      image

       

      Test permissions

      So in order to validate that this actually works and is not smoke and mirrors, I am going to create a very simple console application that will use these credentials to send and receive a message.  Once we have validated that this works, we will pull the Send permission and see what happens.

      The following code will create a QueueClient, send a message to the queue and then receive the message.

      using System;
      using Microsoft.ServiceBus;
      using Microsoft.ServiceBus.Messaging;

      class Program
      {
          static void Main()
          {
              const string QueueName = "myqueue";
              const string ServiceNamespace = "DontRunAsRoot";
              const string IssuerName = "QueueUser";
              const string IssuerKey = "<removed>";

              // Build credentials and the service URI for our namespace
              TokenProvider credentials = TokenProvider.CreateSharedSecretTokenProvider(IssuerName, IssuerKey);
              Uri serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", ServiceNamespace, string.Empty);

              MessagingFactory factory = null;

              try
              {
                  factory = MessagingFactory.Create(serviceUri, credentials);

                  // This code assumes that the queue has already been created, since we have
                  // only provisioned Send/Listen access
                  QueueClient myQueueClient = factory.CreateQueueClient(QueueName);

                  Console.WriteLine("\nCreated Queue Client");

                  // Create Brokered Message
                  BrokeredMessage bm = new BrokeredMessage("I hope this works");

                  Console.WriteLine("\nSending message to Queue…");

                  myQueueClient.Send(bm);

                  Console.WriteLine("\nMessage sent to Queue");
                  Console.WriteLine("\nPress ENTER to receive message");
                  Console.ReadLine();

                  bm = myQueueClient.Receive(TimeSpan.FromSeconds(5));

                  if (bm != null)
                  {
                      Console.WriteLine(string.Format("Message received: Id = {0}, Body = {1}", bm.MessageId, bm.GetBody<string>()));
                      // Further custom message processing could go here…
                      bm.Complete();
                  }

                  Console.WriteLine("\nNo more messages to process");
                  Console.WriteLine("\nPress ENTER to exit");
                  Console.ReadLine();
              }
              catch (Exception ex)
              {
                  Console.WriteLine(ex.ToString());
              }
          }
      }

      When we run the application we will discover that it is executing correctly:

      image

       

      Revoke

      Let’s now make this a little interesting.  Let’s remove the QueueUser’s ability to send messages to the Queue and see what happens.  To do this we will use the following command:

      revoke <operation> <path> <user>

      image

      To validate that the revoke was successful, let’s run the show command again.  As you can see, our revoke command was successful.

      image

       

      Let’s now try to send a message and see what happens.

      image

      As expected, we get an authentication exception.

       

      Conclusion

      With the introduction of SBAzTool there is no excuse for using your “owner” credentials when building applications.  SBAzTool has a wide variety of commands that facilitate managing and even delegating permissions.  Since this is a command line tool, you can even script these permissions as you provision your Service Bus artifacts.

      Categories: Azure, ServiceBus

      SAP meet Azure Service Bus – EAI/EDI December 2011 CTP

      December 29, 2011

       

      The Azure Service Bus EAI/EDI December 2011 CTP has been out for about 2 weeks at the time of this blog post.  As soon as I saw the Service Bus Connect feature in the documentation I wanted to try and hook up the Service Bus to SAP.  The organization that I work for utilizes SAP to support many of its core business processes.  We are also heavily invested in BizTalk Server when integrating SAP with other Corporate Systems. For the past 5 years much of my work experience has involved integration with SAP. So much that I had the opportunity to write a couple chapters on BizTalk-SAP integration in the Microsoft BizTalk 2010 Line of Business Systems Integration book.

      Integrating with SAP is of great interest to me both personally and professionally.  I like the challenge of taking two different types of systems that would seemingly be impossible to integrate and finding a way to do it.  I also enjoy the expression on SAP consultants’ faces when you take a Microsoft product and successfully execute operations inside their system, like creating customer records or creating Work Orders.

      Using the Service Bus Connect feature is not the only way of bridging your On-Premise Line of Business Systems with external parties via cloud-based messaging technologies.  Within the past year Microsoft also introduced a feature called BizTalk Server 2010 AppFabric Connect for Services.  This tool allows BizTalk to expose an endpoint via a Service Bus Relay.  I have also used this mechanism to communicate with SAP via a mobile device and it does work.

      There are a few differences between Service Bus Connect and AppFabric Connect for Services.  Some of these differences include:

      • Any message transformations that need to take place actually occur in the cloud instead of On-Premise.  When integrating with SAP, you never want to expose SAP schemas to calling clients; they are ugly to say the least.  In this scenario we can expose a client-friendly, or canonical, schema and then transform that message into our SAP request in Azure.
      • AppFabric Connect for Services utilizes a full deployment of BizTalk in your environment, whereas Service Bus Connect only requires the BizTalk Adapter Pack when communicating with SAP.  With AppFabric Connect for Services, all message transformation and orchestration takes place On-Premise and the cloud (Azure Service Bus) is basically used as a communication relay.

      When connecting to On-Premise Line of Business Systems, both methods require the BizTalk Adapter Pack to be installed On-Premise.  The BizTalk Adapter Pack is included in your BizTalk license.  Licensing details for Service Bus Connect have not been released at the time of this writing.

      The following walkthrough assumes you have some experience with the new Service Bus CTP.  If you haven’t looked at the CTP before I suggest that you visit a few of the following links to get more familiar with the tool:

      Also it is worth pointing out another blog post, written by Steef-Jan Wiggers, where he discusses Oracle integration with Service Bus Connect.

      Building our Application

      • The first thing we need to do is to create a new ServiceBus – Enterprise Application Integration project.  In my case I am calling it HelloSAP.

      image

       

      • Since we know that we want to communicate with an On-Premise LOB system like SAP we need to Add a ServiceBus Connect Server.  We can do this by accessing Server Explorer, right mouse clicking on ServiceBus Connect Servers and then selecting Add Server.  When prompted we can provide a host name of localhost since this is a local environment.

      image

       

      • We now can expand our Service Bus Connect Servers hierarchy.  Since we want to build an SAP interface we can right mouse click on SAP and select Add SAP Target

      image

       

      • If you have ever used the BizTalk Adapter Pack before, you are now in familiar territory.  This is (almost) the same wizard that we use to generate schemas when connecting to SAP systems via BizTalk.  There is a subtle difference in the bottom left corner called Configure Target Path which we will discuss in a few moments.  If you are unfamiliar with this screen you are going to need some help from your SAP BASIS Admin to provide you with the connection details required to connect to SAP.  Also if you are interested in further understanding everything that is going on in this screen I recommend you pick up the BizTalk LOB book that I previously talked about as I discuss the different aspects of this wizard in great detail.  (ok..no more shameless plugs)

      image

       

      • We now want to select the type of interface that we want to interact with.  For the purpose of this blog post I am going to select a custom IDOC that is used when submitting timesheets from our field personnel.  In my case, the version of SAP that I am connecting to is 700 so that is why I am selecting the ZHR_CATS IDOC that corresponds to this version.  Once again, if you are unsure you will need to speak to your BASIS Admin.

      image

       

      • Notice how we cannot click the OK button after establishing a connection to SAP and selecting an IDOC?  We now need to create a Target Path.  Creating a Target Path will provide the Bridge from the Azure Service Bus into SAP.  Click the Configure button to continue.

      image

       

      • Assuming that we have not been through this exercise before, we need to select Add New LobRelay from the Select LOB Relay to host the LOB Target: dropdown list.

      image

       

      • Another dialog box will appear.  Within this dialog box we need to provide our CTP Labs namespace, a Relay path, Issuer name and key.  For Relay path, we can really provide whatever we want here; it will essentially make up the latter portion of the URI for the Endpoint that is about to be created.

      image

       

      • Now we are prompted to Enter LOB Target sub-path.  Once again, this value can be whatever we choose.  Since the HR Timesheet module inside of SAP is often called CATS, I will go ahead and use that value here.

      image

      • Now with our Target Path configured we are able to select the OK button to proceed.

      image

      • Inside Server Explorer we now have an entry underneath SAP.  This represents our End Point that will bridge requests coming from the cloud to SAP.

      image

      • At this point we haven’t added any artifacts to our Enterprise Application Integration project that we created earlier.  This is about to change.  We need to right mouse click on our SAP endpoint and then select Add schemas to HelloSAP

      image

      • We will now get prompted for some additional information in order to re-establish a connection to SAP so that we can generate Schemas that will enable us to send a message to SAP in a format that it is expecting.  You may also notice that we aren’t being prompted for any SAP server information.  In the Properties grid you will notice that this information is already populated because we had previously specified it when using the Consume Adapter Service Wizard.

      image

      • Inside our solution, we will now discover that we have our SAP schemas in a folder called LOB Schemas.

      image

      • For the purpose of this blog post, I have created another folder called Schemas and saved a Custom Schema called CloudRequest.xsd there.  This is the message that our MessageSender application will be sending in once we test our solution.  (BTW: I find the Schema editor included with BizTalk to be much more intuitive and user friendly than this one; I am not a big fan of it.)

      image

      • We now need to create a Map, or Transform, to convert our request message into a request that SAP will understand.

      image

      • Next we need to add a Bridge on to the surface of our Bridge Configuration.  Our Bridge will be responsible for executing our Map that we just created and then our message will get routed to our On-Premise end point so that our Timesheet can be sent to SAP.

      image

      • We now need to set the message type that we expect will enter the bridge.  By double clicking on our TimeSheetBridge we can then use the Message Type picker to select our Custom message type: CloudRequest.

      image

       

      • Once we have selected our message type, we can then select a transform by clicking on the Transform Xml Transform box and then selecting our map from the Maps Collection.

       image

       

      • Before we drag our LOB Connection shape onto our canvas we need to set our Service Namespace.  This is the value that we created when we signed up for the CTP in the Azure Portal.  To set the Service Namespace we need to click on any open space, in the Bridge Configuration canvas, and then look in the Properties Page.  Place your Service Namespace here.

      image

       

      • We are now at the point where we need to wire up our XML One-Way Bridge to our On-Premise LOB system.  In order to do so, we need to drag our SAP instance onto the Bridge Configuration canvas.

      image

       

      • Next, we need to drag a Connection shape onto the canvas to connect our Bridge to our LOB system.

      image

       

      • The next action that needs to take place is setting up a Filter Condition between our LOB Shape and our Bridge.  You can think of this like creating a subscription.  If we wanted to filter messages by their content we would be able to do so here.  Since we are interested in all messages we will just Match All.  In order to set this property we need to select our Connection arrow then click on the Filter Condition ellipses.

      image

       

      • If you have used the BizTalk Adapter Pack in the past you will be familiar with SOAP Action headers that need to be set in your BizTalk Send Port.  Since we don’t have Send Ports per se, we need to set this action in the Route Action as part of the One-Way Connection shape.  In the Expression text box we want to put the name of our Operation, which is http://Microsoft.LobServices.Sap/2007/03/Idoc/3/ZHR_CATS//700/Send, wrapped in single quotes ‘ ’.  We can obtain this value by selecting our Service Bus Connect Server endpoint and then viewing the Properties page.  Within the Properties page there is an Operations node that can be expanded, and we will find this value there.  In the Destination (Write To) section we want to set our Type to Soap and our Identifier to Action.

      image

       

      • There is one last configuration step that needs to take place before we enable and deploy our service.  We need to set our Security Type.  We can do so by selecting our SAP – ServiceBus Connect Server instance from Server Explorer.  Then, in the Properties Page, click on the SecurityType ellipses.  Determining which Security Type to use will depend upon how your SAP instance has been configured.  In my case, I am using ConfiguredUsername and I need to provide both a Username and Password.

      image

      • With our configuration set, we can now enable our SAP – ServiceBus Connect Server instance by right mouse clicking on it and then selecting Enable.

      image

       

      • We can now deploy our application to Azure by right mouse clicking on our Visual Studio solution and selecting Deploy.

      image

       

      Testing Application

      • In order to test our application we can use the MessageSender tool that is provided with the CTP Samples/Tools.  It simply allows us to submit EDI, or in this case XML, messages to an endpoint in the Azure Service Bus.  In order to successfully submit these messages we need to provide our Service Namespace, Issuer Name, Shared Secret, Service Bus endpoint address, a path to the XML file that we want to submit, and an indication that we are submitting an xml document.  Once we have provided this information we can hit the Enter key and, provided there are no errors, we will see a Message sent successfully message.

      Note: In the image below I have blocked my Shared Secret (in red) for privacy reasons.

      image

      • If we launch our SAP GUI we should discover that it has received a message successfully.

      image

      • We can then drill down into the message and discover the information that has been posted to our timesheet.

      image

       

      Exceptions

      While testing, I ran into an exception.  I messed up the format of the CATSHOURS field and sent in too much data.  The BizTalk Adapter Pack/SAP Adapter validated this incoming data against the SAP schema that is used to send messages to SAP.  The result is that a message was returned back to the MessageSender application.  I thought that this was pretty interesting.  Why?  In my solution I am using a One-Way bridge and this exception is still being propagated to the calling application.  Cool and very beneficial.

       image

      Conclusion

      Overall the experience of using Service Bus Connect was good.  There were a few times I had to forget how we do this in BizTalk and think about how Service Bus Connect does it.  An example of this is the SOAP Action headers that BizTalk developers are used to manipulating inside of Send Ports.  I am not saying one way is better than the other; they are just different.  Another example is the XML Schema editor.  I find the BizTalk editor to be much more user friendly.

      While I am not convinced that the current state of Service Bus Connect is ready for primetime (they have CTPs for a reason), I am very impressed that the Microsoft team could build this type of functionality into a CTP.  For me personally, this type of functionality (connecting to on-premise LOB systems) is a MUST HAVE as organizations start evolving towards Cloud computing.

      Categories: Azure, EAI/EDI December 2011 CTP Tags:

      Azure Service Bus EAI/EDI December 2011 CTP – New Mapper

      December 17, 2011 11 comments

      In this blog post we are going to explore some of the new functoids that are available in the Azure Service Bus Mapper.

      At first glance, the Mapper looks pretty similar to the BizTalk 2010 Mapper. Sure, there are some different icons and perhaps some lipstick applied, but conceptually we are dealing with the same thing, right?

      image

      Wrong! It isn’t until we take a look at the toolbox that we discover this isn’t your Father’s mapper.

      image

      This post isn’t meant to be an exhaustive reference guide for the new mapper but here are some of the new functoids that stick out for me.

       

      • MapEach Loop (Loop Operations): This functoid will loop over a repeating record from the source document and evaluate an operation at each iteration of the loop.  If the criteria of the operation is met then a record in the target document will be created.
      • Arithmetic Expression (Expressions): No, we haven’t lost our Addition or Subtraction functionality!  Many functoids that you would have found in the BizTalk 2010 Mathematical Functoids section have been consolidated into the Arithmetic Expression operation.
      • Logical Expression (Expressions): Similar to the Arithmetic Expressions, these have been consolidated and can be found within a single Expression, including >, <, >=, <=, ==, !=, Logical Negation, Conditional AND and Conditional OR.
      • If-Then-Else (Expressions): A much anticipated operation! BizTalk developers have been wanting an If-Then-Else functoid for many years.  If a condition evaluates to True then we can provide a particular value; otherwise we can provide a different value.
      • All of them (List Operations): Completely new functionality that provides us with the ability to manipulate Lists within a map.  Functionality includes creating a list, adding an item to the list, selecting a unique group, selecting a value, selecting entries, getting items, and ordering the list.  Wow!  It will be interesting to see how this progresses.
      • DateTime Reformat (Date/Time Operations): This one should be useful.  I am constantly re-formatting dates when integrating with SAP.  Usually this formatting winds up in a helper assembly, which tends to be overkill for what really needs to take place.
      • Data Lookup (Misc Operations): This one is interesting.  It allows us to access data from SQL Azure within a transform at runtime.
      • Generate ID (Misc Operations): This functoid will generate a GUID.  It is the equivalent of calling the Guid.NewGuid() method in .NET.
      • Get Context Property (Misc Operations): Another useful operation!  This operation allows us to retrieve a value from context.  This is something that just isn’t possible in BizTalk.
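      For reference, a couple of these operations map directly onto plain .NET calls.  The following is a minimal sketch (assuming a console app; the yyyyMMdd format string is just an illustrative SAP-style date format):

```csharp
using System;

class FunctoidEquivalents
{
    static void Main()
    {
        // Generate ID: equivalent to calling Guid.NewGuid()
        Guid id = Guid.NewGuid();
        Console.WriteLine(id);

        // DateTime Reformat: e.g. turning a date into SAP's yyyyMMdd style
        DateTime dt = new DateTime(2011, 12, 17);
        Console.WriteLine(dt.ToString("yyyyMMdd")); // prints 20111217
    }
}
```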
       
      What’s Missing?

      I can’t take credit for discovering this, but while chatting with Mikael Håkansson he mentioned “hey – where is the scripting functoid?”  Perhaps this is just a limitation of the CTP but definitely something that needs to be addressed for RTM.  It is always nice to be able to fall back on a .Net helper assembly or custom XSLT.

      Conclusion
      While this post was not intended to be comprehensive, I hope it has highlighted some new opportunities that warrant some further investigation.  It is nice to see that Microsoft is evolving and maturing in this area of EAI.

      Introduction to the Azure Service Bus EAI/EDI December 2011 CTP

      December 17, 2011 3 comments

      The Azure team has recently reached a new milestone; delivering on Service Bus enhancements for the December 2011 CTP.  More specifically, this CTP provides Enterprise Application Integration (EAI) and Electronic Data Interchange (EDI) functionality.  Within the Microsoft stack, both EAI and EDI have historically been tackled by BizTalk.  With this CTP we are seeing an early glimpse into how Microsoft envisions these types of integration scenarios being addressed in a Platform as a Service (PaaS) based environment.

       What is included in this CTP?

      There are 3 core areas to this release:

      1. Enterprise Application Integration: In Visual Studio, we will now have the ability to create Enterprise Application Integration and Transform projects. 
      2. Connectivity to On-premise LOB applications: Using this feature will allow us to bridge our on-premise world with the other trading partners using the Azure platform.  This is of particular interest to me.  My organization utilizes systems like SAP, Oracle and 3rd party work order management and GIS systems.  In our environment, these applications are not going anywhere near the cloud anytime soon.  But, we also do a lot of data exchange with external parties.  This creates an opportunity to bridge the gap using existing on-premise systems in conjunction with Azure.
      3. Electronic Data Interchange: We now have a Trading Partner Management portal that allows us to manage EDI message exchanges with our various business partners.  The Trading Partner Management Portal is available here.

      What do I need to run the CTP?

      • The first thing you will need is the actual SDK (WindowsAzureServiceBusEAI-EDILabsSDK-x64.exe) which you can download here
      • A tool called Service Bus Connect (not to be confused with AppFabric Connect) enables us to bridge on-premise with Azure.  The setup msi can also be accessed from the same page as the SDK. Within this setup we have the ability to install the Service Bus Connect SDK (which includes the BizTalk Adapter Pack), Runtime and Management tools.  In order for the runtime to be installed, we need to have Windows Server AppFabric 1.0 installed.
      • Once we have the SDKs installed, we will have the ability to create ServiceBus projects in Visual Studio.

      image

      • Since our applications will require resources from the Azure cloud, we need to create an account in the Azure CTP Labs environment.  To create a namespace in this environment, just follow these simple instructions.
      • Next, we need to register our account with the Windows Azure EDI Portal.

      image

       Digging into the samples

      For the purpose of this blog post I will describe some of what is going on in the first Getting Started sample called Order Processing.

      When we open up the project and launch the BridgeConfiguration.bcs file we will discover a canvas with a couple of shapes on it.  I have outlined, in black, a Bridge and I have highlighted, in green, what is called a Route Destination.  So the scenario that has been built for us is one that will receive a typed XML message, transform it and then place it on a Service Bus Queue.

      image

      When we dig into the Xml One-Way Bridge shape, we will discover something that looks similar to a BizTalk Pipeline.  Within this “shape” we can add the various message types that are going to be involved in the bridge.  We can then provide the names of the maps that we want to execute.

      image

      Within our BridgeConfiguration.bcs “canvas” we need to provide our service namespace.  We can set this by clicking on the whitespace within the canvas and then populating the Service Namespace property.

      image

      We need to set this Service Namespace so that we know where to send the result of the Xml One-Way Bridge.  In this case we are sending it to a Service Bus Queue that resides within our namespace.

      image 

      With our application configured we can build our application as we normally would.  In order to deploy, we simply right mouse click on solution, or project, and select Deploy.  We will once again be asked to provide our Service Bus credentials.

      image

      The deployment is quick, but then again we aren’t pushing too many artifacts to the Service Bus.

      Within the SDK samples, Microsoft has provided us with MessageReceiver and MessageSender tools.  With the MessageReceiver tool, we can use it to create our Queue and receive a message.  The MessageSender tool is used to send the message to the Queue.
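      Under the hood, these tools are thin wrappers over the brokered messaging client.  As a rough, hypothetical sketch of what they do (names such as messagingFactory and "OrderQueue" are placeholders, and the exact CTP signatures may differ slightly):

```csharp
// Hypothetical sketch only - mirrors the QueueClient/MessageReceiver calls used
// elsewhere in this series; factory creation and queue name are placeholders.
QueueClient qClient = messagingFactory.CreateQueueClient("OrderQueue");

// What MessageSender does: push a message onto the queue
MessageSender sender = qClient.CreateSender();
sender.Send(BrokeredMessage.CreateMessage("<Order>...</Order>"));

// What MessageReceiver does: pull a message off, then delete it on success
MessageReceiver receiver = qClient.CreateReceiver();
BrokeredMessage msg;
if (receiver.TryReceive(TimeSpan.FromSeconds(30), out msg))
{
    Console.WriteLine(msg.MessageId);
    msg.Complete(); // removes the message from the queue
}
```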

      When we test our application we will be asked to provide an instance of our typed Xml message.  It will be sent to the Service Bus, transformed (twice) and then the result will be placed on a Queue.  We will then pull this message off of the queue and the result will be displayed in our Console.

       

      image

      Conclusion

      So far it seems pretty cool and it looks like this team is headed in the right direction.  Much like the AppFabric Apps CTP, there will be some gaps in the technology, and bugs, as this is still just a technical preview.  If you have any feedback for the Service Bus team or want some help troubleshooting a problem, a forum has been set up for this purpose.

      I am definitely looking forward to digging into this technology further – especially in the area of connecting to on-premise LOB systems such as SAP.  Stay tuned for more posts on these topics.

      Speaking at Prairie Dev Con West–March 2012

      December 2, 2011 1 comment

      I have recently been informed that my session abstract has been accepted and I will be speaking at Prairie Dev Con West in March.  My session will focus on Microsoft’s Cloud based Middleware platform: Azure AppFabric.  In my session I will be discussing topics such as Service Bus,  AppFabric Queues/Subscriptions and bridging on-premise Line of Business systems with cloud based integration.

      You can read more about the Prairie Dev Con West here and find more information about registration here.   Do note there is some early bird pricing in effect.

       

      image

      Categories: AppFabric, Azure, Queue

      AppFabric Apps (June 2011 CTP)–Only 1 SQL Azure Instance at a time

      September 10, 2011 1 comment

      I must have missed it in the release notes but when you have access to the AppFabric Apps CTP, you are only allowed one SQL Azure instance within the Labs environment.  When you first log into the Portal you will see a SQL Azure DB that has a name of LabsSqlDatabase.  The problem is that you can’t actually create another one here, and if you try to access this database you will not be able to connect to it using the Database Name LabsSqlDatabase.

      image

      What is a little confusing is that when you are working within your local Dev Fabric you can specify any name you want in your SQL Azure config for your DatabaseName.  The problem is this name will not exist in the cloud if you want to access it after you have published your AppFabric Application.

      image

      The error message that you will receive will resemble the following:

      image

      So due to this current limitation (it is a CTP – I am ok with it), you need to use the database that was provisioned for you when you were granted access to the CTP Labs environment.  To do this you will need to get your connection string from the Labs Portal by clicking on LabsSqlDatabase and then clicking the View button below the Connection String label.  A dialog box will appear and you can copy your connection string by clicking the Copy to Clipboard button.

      image

      If we paste this value into Notepad we will discover the following connection string.  The Data Source represents the name of the server(s) that our database will be running on.  Next we have an Initial Catalog value, which is the actual name of the database that has been provisioned for us….not LabsSqlDatabase.  In my case, and I would imagine others, my Initial Catalog value is the same as my user id.  I have blacked out portions of these values for privacy reasons.

      image

      If I now try to connect to this database either through SQL Management Studio or Database Manager Portal, using this initial catalog value for my Database Name,  I will have success.

      image

      At this point we are free to run any of our SQL scripts to generate tables, stored procedures or load data.

      Conclusion

      I hope this post will save some others some time as it did create some confusion for me.  Admittedly I sometimes dive right into things instead of thoroughly reviewing the documentation (what fun is that).  From what I gather, it is possible to hook up an AppFabric Application to an existing SQL Azure database that exists in the NON-LABS environment.  So if you really need to create multiple SQL Azure databases that is also an option.  Since I am just playing around with this stuff having a single SQL Azure instance works for me now that I understand the limitations of the Labs environment.

      Categories: AppFabric, Azure, SQL Azure

      AppFabric Apps (June 2011 CTP) Accessing AppFabric Queue via REST

      August 13, 2011 1 comment

      I recently watched an episode of AppFabric TV where they were discussing the REST API for Azure AppFabric.  I thought the ability to access the AppFabric Service Bus, and therefore other Azure AppFabric services like Queues, Topics and Subscriptions, via REST was a pretty compelling scenario.  For example, if we take a look at the “Power Outage” scenario that I have been using to demonstrate some of the features of AppFabric Applications, it means that we can create a Windows Phone 7 (or Windows Phone 7.1/Mango Beta) application and dump messages into an AppFabric Queue securely via the AppFabric Service Bus.  Currently, the Windows Phone SDK does not allow the managed Service Bus bindings to be loaded on the phone, so using the REST based API over HTTP is a viable option.

      Below is a diagram that illustrates the solution that we are about to build.  We will have a Windows Phone 7.1 app that will push messages to an Azure AppFabric Queue.  (You will soon discover that I am not a Windows Phone developer.  If you were expecting to see some whiz bang new Mango features then this post is not for you.) 

      The purpose of the mobile app is to submit power outage information to an AppFabric Queue.  But before we can do this we need to retrieve a token from the Access Control Service and include this token in our AppFabric Queue message header.  Once the message is in the Queue, we will once again use a Code Service to retrieve messages, which we will then insert into a SQL Azure table.

      image

       

      Building our Mobile Solution

      One of the benefits of using AppFabric queues is the loose coupling between publishers and subscribers.  With this in mind, we can proceed with building a Windows Phone 7 application in its own solution.  For the purpose of this blog post I am going to use the latest mango beta sdk bits which are available here.

      image

      Since I have downloaded the latest 7.1 bits, I am going to target this phone version.

      image

       

      On our WP7 Canvas we are going to add a few controls:

      • TextBlock called lblAddress that has a Text property of Address
      • TextBox called txtAddress that has an empty Text Property
      • Button called btnSubmit that has a Content property of Submit
      • TextBlock called lblStatus that has an empty Text Property

      image

       

      Within the btnSubmit_Click event we are going to place our code that will communicate with the Access Control Service.

      private void btnSubmit_Click(object sender, RoutedEventArgs e)
      {
          // https, http, serviceNameSpace, acsSuffix, serviceBusSuffix, issuerName
          // and issuerKey are class-level string constants (not shown here)

          // Build ACS and Service Bus addresses
          string acsAddress = https + serviceNameSpace + acsSuffix;
          string relyingPartyAddress = http + serviceNameSpace + serviceBusSuffix;
          string serviceAddress = https + serviceNameSpace + serviceBusSuffix;

          // Formulate the WRAP query string
          string postData = "wrap_scope=" + Uri.EscapeDataString(relyingPartyAddress) +
              "&wrap_name=" + Uri.EscapeDataString(issuerName) +
              "&wrap_password=" + Uri.EscapeDataString(issuerKey);

          WebClient acsWebClient = new WebClient();

          // Since Web/Http calls are all async in WP7, we need to register an event handler
          acsWebClient.UploadStringCompleted += new UploadStringCompletedEventHandler(acsWebClient_UploadStringCompleted);

          // Instantiate a Uri object with our ACS URL so that we can provide it in the remote method call
          Uri acsUri = new Uri(acsAddress);

          acsWebClient.UploadStringAsync(acsUri, "POST", postData);
      }

       

      Since we are making an Asynchronous call to the ACS service, we need to implement  the   handling of the response from the ACS Service.

      private void acsWebClient_UploadStringCompleted(object sender, UploadStringCompletedEventArgs e)
      {
          if (e.Error != null)
          {
              lblStatus.Text = "Error communicating with ACS";
          }
          else
          {
              // Store the response since we will need to pull the ACS token from it
              string response = e.Result;

              // Update the WP7 UI with a status update
              lblStatus.Text = "Received positive response from ACS";

              // Sleep just for visual purposes
              System.Threading.Thread.Sleep(250);

              // Parse the ACS token from the response
              string[] tokenVariables = response.Split('&');
              string[] tokenVariable = tokenVariables[0].Split('=');
              string authorizationToken = Uri.UnescapeDataString(tokenVariable[1]);

              // Create the Web client that we will use to populate the Queue
              WebClient queueClient = new WebClient();

              // Add our authorization token to the header
              queueClient.Headers["Authorization"] = "WRAP access_token=\"" + authorizationToken + "\"";
              queueClient.Headers[HttpRequestHeader.ContentType] = "text/plain";

              // Capture the textbox data
              string messageBody = txtAddress.Text;

              // Assemble our queue address, for example:
              // https://MyNameSpace.servicebus.appfabriclabs.com/MyQueueName/Messages
              string sendAddress = https + serviceNameSpace + serviceBusSuffix + queueName + messages;

              // Register an event handler for the async send
              queueClient.UploadStringCompleted += new UploadStringCompletedEventHandler(queueClient_UploadStringCompleted);

              Uri queueUri = new Uri(sendAddress);
              // Call the method to populate the queue
              queueClient.UploadStringAsync(queueUri, "POST", messageBody);
          }
      }

      So at this point we have made a successful request to ACS and received a response that included our token.  We then registered an event handler as we will call the AppFabric Service Bus Queue using an Asynchronous call.  Finally we made a call to our Service Bus Queue.
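      To make the token parsing concrete: the raw ACS response is a form-encoded string of name/value pairs, and the first pair holds the token.  A hypothetical (drastically shortened) response would be handled like this:

```csharp
// Illustrative only - a real wrap_access_token is a long, URL-encoded SWT token.
string response = "wrap_access_token=net.windows.servicebus.action%3dSend" +
                  "&wrap_access_token_expires_in=1199";

string[] tokenVariables = response.Split('&');          // first pair is the token
string[] tokenVariable = tokenVariables[0].Split('='); // ["wrap_access_token", "net.windows.servicebus.action%3dSend"]
string token = Uri.UnescapeDataString(tokenVariable[1]);

Console.WriteLine(token); // net.windows.servicebus.action=Send
```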

      We now need to process the response coming back from the AppFabric Service Bus Queue.

      private void queueClient_UploadStringCompleted(object sender, UploadStringCompletedEventArgs e)
      {
          //Update status to user.
          if (e.Error != null)
          {
              lblStatus.Text = "Error sending message to Queue";
          }
          else
          {
               lblStatus.Text = "Message successfully sent to Queue";
          }
      }

       

      That concludes the code that is required to submit a message securely to the AppFabric Service Bus Queue using the Access Control Service to authenticate our request.

      Building our Azure AppFabric Application

      The first artifact that we are going to build is the AppFabric Service Bus Queue called QueueMobile.

      image_thumb2[1]

      Much like we have done in previous posts we need to provide an IssuerKey, IssuerName and Uri.

      image_thumb4

      The next artifact that we need add is a SQL Azure Database.

      image26_thumb

      Adding this artifact is only the beginning.  We still need to create our local database in our SQL Express instance.  So what I have done is manually created a Database called PowerOutage and a Table called Outages.

      image_thumb6

      Within this table I have two very simple columns: ID and Address.

      image_thumb10

      So the next question is how do I connect to this database.  If you navigate to the AppFabric Applications Manager which is found within the AppFabric Labs portal, we will see that a SQL Azure DB has been provisioned for us.

      image_thumb12

      Part of this configuration includes our connection string for this Database.  We can access this connection string by clicking on the View button that is part of the Connection String panel.

      image_thumb13

      I have covered up some of the core credential details that are part of my connection string for security reasons.  To make things a little more consistent, what I have decided to do is create a SQL Server account in my local SQL Express that has these same credentials.  This way when I provision to the cloud I only need to change my Data Source.

      image_thumb17

      For the time being I am only interested in my local development fabric so I need to update my connection string to use my local SQL Express version of the database.
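      To illustrate with made-up values, the only piece that differs between the two environments is the Data Source; the Initial Catalog and credentials stay the same:

```
Local:  Data Source=.\SQLEXPRESS;Initial Catalog=myUserId;User ID=myUserId;Password=...
Cloud:  Data Source=myServer.database.windows.net;Initial Catalog=myUserId;User ID=myUserId;Password=...
```

      (The server and user names here are placeholders.)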

       

      image_thumb19

      With our Queue and Database now created and configured, we need to focus on our Code Service.  The purpose of this Code Service is to retrieve messages from our AppFabric Queue and insert them into our SQL Azure table.  We will call this Code Service CodeMobileQueue and then will click the OK button to proceed.

      image17_thumb

      We now need to add references from our Code Service to both our AppFabric Queue and our SQL Azure Instance.  I always like to rename my references so that they have meaningful names.

      image_thumb5

      Inside our Code Service, it is now time to start focusing on the plumbing of our solution.  We need to be able to retrieve messages from the AppFabric Queue and insert them into our SQL Azure table.

      public void Run(CancellationToken cancelToken)
      {
          // Create a reference to our Queue client
          QueueClient qClient = ServiceReferences.CreateQueueMobile();

          // Create a reference to our SQL Azure connection
          SqlConnection sConn = ServiceReferences.CreateSqlQueueMobile();
          MessageReceiver mr = qClient.CreateReceiver();
          BrokeredMessage bm;

          Stream qStream;
          StreamReader sReader;
          string address;

          System.Diagnostics.Trace.TraceInformation("Entering Queue Retrieval " + System.DateTime.Now.ToString());

          // Open the connection to the database once, outside the polling loop,
          // so that we do not attempt to re-open an already-open connection
          sConn.Open();

          while (!cancelToken.IsCancellationRequested)
          {
              while (mr.TryReceive(new TimeSpan(hours: 0, minutes: 30, seconds: 0), out bm))
              {
                  try
                  {
                      // Note: we are using a Stream here instead of a String like in other examples.
                      // The reason for this is that we did not put the message on the wire using a
                      // BrokeredMessage (binary format) like in other examples - we just put on raw text.
                      // The way to get around this is to use a Stream and then a StreamReader
                      // to pull the text out as a String.
                      qStream = bm.GetBody<Stream>();
                      sReader = new StreamReader(qStream);
                      address = sReader.ReadToEnd();

                      // Remove the message from the Queue
                      bm.Complete();

                      System.Diagnostics.Trace.TraceInformation(string.Format("Message received: ID= {0}, Body= {1}", bm.MessageId, address));

                      // Insert the message from the Queue into the database
                      SqlCommand cmd = sConn.CreateCommand();
                      cmd.Parameters.Add(new SqlParameter("@ID", SqlDbType.NVarChar));
                      cmd.Parameters["@ID"].Value = bm.MessageId;
                      cmd.Parameters.Add(new SqlParameter("@Address", SqlDbType.NVarChar));
                      cmd.Parameters["@Address"].Value = address;
                      cmd.CommandText = "Insert into Outages(ID,Address) Values (@ID,@Address)";
                      cmd.CommandType = CommandType.Text;
                      cmd.ExecuteNonQuery();
                      System.Diagnostics.Trace.TraceInformation("Record inserted into Database");
                  }
                  catch (Exception ex)
                  {
                      System.Diagnostics.Trace.TraceError("error occurred " + ex.ToString());
                  }
              }

              // Wait before polling the queue again
              Thread.Sleep(5 * 1000);
          }

          mr.Close();
          qClient.Close();
          sConn.Dispose();
      }

      Testing Application

      We are done with all the coding and configuration for our solution.  Once again I am going to run this application in the local Dev Fabric so I am going to go ahead and type CTRL + F5.  Once my Windows Azure Emulator has been started and our application has been deployed we can start our Windows Phone project.

      For the purpose of this blog post we are going to run our Windows Phone solution in the provided emulator.  However, I have verified that the application can be side-loaded on a WP7 device and does work properly.

      We are now going to populate our Address text box with a value.  In this case I am going to provide 1 Microsoft Way  and click the Submit button.

      image_thumb[1]

      Once we click the Submit button we can expect our first status message update indicating that we have received a positive response from ACS.

      image_thumb[5]

      The next update we will have displayed is one that indicates our message has been successfully sent to our AppFabric Queue.

      image_thumb[3]

      As outlined previously, our WP7 app will publish a message to our AppFabric Queue; from there our Code Service will de-queue the message and then insert a record into a SQL Azure table.  So if we check our Outages table we will discover that a record has been added to our Database.

       

      image_thumb[7]

       

      Conclusion

      Overall I am pretty pleased with how this demo turned out.  I really like the ability to have a loosely coupled interface that a mobile client can utilize.  What is also nice about using a RESTful interface is that we have a lot of flexibility when porting a solution like this over to other platforms.

      Another aspect of this solution that I like is having a durable Queue in the cloud.  In this solution we had a code service de-queuing this message in the cloud.  However, I could also have some code written that is living on-premise that could retrieve these messages from the cloud and then send them to an LOB system like SAP.  Should we have a planned, or unplanned system outage on-premise, I know that all mobile clients can still talk to the Queue in the cloud.

      Categories: AppFabric, Azure, Queue Tags: ,

      AppFabric Apps (June 2011 CTP) Simple Service Bus Topics–Part 2

      July 31, 2011 Leave a comment

      (…Continued from Part 1, if you have not read it please do so to understand the context of this post)

      Solution

      We want customers to be able to submit power outages to our fictitious power company located in the Redmond, Washington area. What is different this time around (from our Queue scenario) is that due to the growth in the Kirkland, Washington area, we have subcontracted power line maintenance to another company. So we want our Redmond system to receive messages for the Redmond area and this other company to receive messages for Kirkland. The Redmond company should not see the Kirkland messages and vice versa.

      Configuring Core Artifacts

      The user interface for this application will once again be an ASP.Net Web Application.  We will add this Web Application by clicking on the Add New Service label which is found on the AppFabric Design View Canvas.

      image

      Next, we will want to provide an appropriate name for this Web App and click on the OK button.

      image

      We now need to create a Service Bus Topic and can do this by once again clicking on the Add New Service label which is found on the AppFabric Design View Canvas.

      image

      Much like we have had to do with other Service Bus artifacts, we need to provide our Service Bus IssuerKey, IssuerName and Uri.  For my URI, I have provided the following value:

      sb://<your_namespace>.servicebus.appfabriclabs.com/SimpleServiceBusTopic

      image

      Note: Notice the RequiresProvisioning property, which is set to True.  When we deploy our AppFabric application the provisioning of our Topic will be taken care of without any additional work on our side. When we shut down the Azure Compute Emulator, this Topic will be removed.

      In many of the May AppFabric CTP examples out there, people handle provisioning tasks by using the ServiceBusNamespaceClient class.  Nowhere in this code do we directly use this class.  I would imagine that, under the hood, the provisioning code is using it.
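      For comparison, explicit provisioning in those May CTP samples looked roughly like the following.  This is a sketch from memory, not something needed in this solution, and the exact signatures may have differed in that CTP:

```csharp
// Hypothetical sketch of explicit provisioning with the May CTP API.
// issuerName, issuerKey and serviceUri are placeholders.
var credential = TransportClientCredentialBase.CreateSharedSecretCredential(issuerName, issuerKey);
var namespaceClient = new ServiceBusNamespaceClient(serviceUri, credential);

// Create the topic and attach a subscription to it
Topic topic = namespaceClient.CreateTopic("SimpleServiceBusTopic");
Subscription redmond = topic.AddSubscription("SimpleServiceBusSubscriptionRedmond");
```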

      Next, in our Web application we need to add a reference to this newly created Topic.

      image

      I like to rename my references to provide a more descriptive title than the default Import1 value.

      image

       

      With our topic now set up we want to add two subscriptions.  The first subscription will be for our Redmond messages and the second subscription will be for our Kirkland messages.

      Redmond

      image

       

      Kirkland

      image

       

      For each of these Subscriptions we need to once again provide the IssuerKey, IssuerName and Uri properties.  The Uri property for subscriptions provides an interesting twist: the actual word “subscriptions” must be included between the name of your Topic and the name of your Subscription.  The convention for these URIs is:

      sb://your_namespace.servicebus.appfabriclabs.com/TopicName/subscriptions/SubscriptionName

       

      image

       

      Much like in the Queue Blog Post we are going to use a Code Service to retrieve messages from our Topic via a Subscription.  We once again need to click on the Add New Service label and select Code.

      image

       

      This Code Service will represent the Redmond Client and therefore we need to add a reference to the SimpleServiceBusSubscriptionRedmond Subscription.

      image

       

      The underlying core of our application should be set and we should be able to build our application.  If we look at the AppFabric Design View Canvas we should see the following:

      image

      Our Diagram View should look like this:

      image

       

      Update Web Application

      At this point we have simply configured all of the artifacts that are required in our solution but we have not written any code so our application will not be very functional.  In the Source view of the Default.aspx  page we want to add a few controls:

      • Text Box for our Address
      • Text Box for our City
      • Button used to submit our form

       

      image

      In the code behind for Default.aspx we will want to add an event handler for the Button clicked event.

      protected void btnSubmit_Click(object sender, EventArgs e)
      {
          TopicClient tc = ServiceReferences.CreateSimpleServiceBusTopic();

          MessageSender ms = tc.CreateSender();

          BrokeredMessage bm = BrokeredMessage.CreateMessage(txtAddress.Text);
          bm.Properties["OutageCity"] = txtCity.Text;

          ms.Send(bm);

          txtAddress.Text = "";
          txtCity.Text = "";
      }

      Some lines of significance include:

      TopicClient tc = ServiceReferences.CreateSimpleServiceBusTopic(); where we are creating an instance of our TopicClient.

      The BrokeredMessage instantiation line includes our message body which is our address that is coming from our Address text box.

      BrokeredMessage bm = BrokeredMessage.CreateMessage(txtAddress.Text);

       

      You may also notice where we assign our City value to a property called OutageCity.  This property is somewhat like a BizTalk Context property: we are going to be able to route our message based upon this value.  We will later use this property when creating a Subscription Filter.

      bm.Properties["OutageCity"] = txtCity.Text;

      Note: this code looks a little different than the May AppFabric CTP code that you have seen.  In the May CTP we had to worry about our Service Bus configuration for our namespace, Issuer name and key.  We also had to create an instance of a MessagingFactory object so that we could create a TopicClient.  I like the AppFabric Application approach better.  It is a little cleaner and all of our configuration info is handled in our Application model.

       

      Update Code Service

      We now need to update our Run method that exists within our Code Service.  As discussed in the Queue blog post, this method will continue to run until a cancellation request is received.  The purpose of this method is to retrieve messages from our Topic via our Redmond Subscription.

      For the purpose of this demonstration we are simply going to log messages that have been retrieved  in our Trace viewer.

      public void Run(CancellationToken cancelToken)
      {
          System.Diagnostics.Trace.TraceInformation("Entering Code Service Loop");

          while (!cancelToken.IsCancellationRequested)
          {
              // The generated service reference is already bound to the SimpleServiceBusTopic
              // Topic and the SimpleServiceBusSubscriptionRedmond Subscription.
              SubscriptionClient subClient = ServiceReferences.CreateSimpleServiceBusSubscription();

              MessageReceiver mr = subClient.CreateReceiver();
              BrokeredMessage bm;

              while (mr.TryReceive(new TimeSpan(hours: 0, minutes: 0, seconds: 5), out bm))
              {
                  System.Diagnostics.Trace.TraceInformation(string.Format("Message received: ID= {0}, Body= {1}", bm.MessageId, bm.GetBody<string>()));
                  bm.Complete();
              }

              Thread.Sleep(5 * 1000);
          }
      }

      Some lines of significance include:

      The line where we create our subscription client.  Because the generated service reference is already bound to our Topic and our Subscription, there is no need to call CreateSubscriptionClient ourselves:

      SubscriptionClient subClient = ServiceReferences.CreateSimpleServiceBusSubscription();

      The next line is our while loop that will attempt to retrieve messages from this subscription, waiting up to 5 seconds per attempt.  If a message is retrieved it will be stored in a BrokeredMessage.

      while (mr.TryReceive(new TimeSpan(hours: 0, minutes: 0, seconds: 5), out bm))

      Creating a Kirkland client

      Our Kirkland client is going to follow a different path than our Redmond client.  At this point everything that we have built exists in the Azure Cloud.  For our Kirkland client, it is going to reside on-premise.  Since our Topics and Subscriptions are provisioned in the cloud we can access them from within the cloud and on-premise.

      For this example we will follow the path that many of the May CTP examples took.  It is a console application that is in a separate solution from this AppFabric Application.  In this application I have added references manually to the ServiceBus CTP assemblies.

       

      using System;
      using System.Collections.Generic;
      using System.Linq;
      using System.Text;
      using Microsoft.ServiceBus;
      using Microsoft.ServiceBus.Messaging;
      using Microsoft.ServiceBus.Description;

      namespace SimpleTopicOnPrem
      {
          class Program
          {
              static void Main(string[] args)
              {

                  Uri sbURI = ServiceBusEnvironment.CreateServiceUri("sb", "<your_namespace>", string.Empty);
                  string name = "owner";
                  string key = "<your_key>";

                  MessagingFactory factory = MessagingFactory.Create(sbURI, TransportClientCredentialBase.CreateSharedSecretCredential(name, key));

                  SubscriptionClient subClient = factory.CreateSubscriptionClient("SimpleServiceBusTopic", "SimpleServiceBusSubscriptionKirkland");

                  //Next we create the MessageReceiver and receive a message.

                  MessageReceiver mr = subClient.CreateReceiver();

                  BrokeredMessage bm;

                  while (mr.TryReceive(new TimeSpan(hours: 0, minutes: 5, seconds: 0), out bm))
                  {
                      Console.WriteLine(string.Format("Message received: ID= {0}, Body= {1}", bm.MessageId, bm.GetBody<string>()));
                      bm.Complete();
                  }

                  Console.WriteLine("Done Reading From Queue");
                  Console.Read();
              }
          }
      }

       

      This code looks pretty similar to our Code Service with the exception of how we create a SubscriptionClient and that we need to deal with our credentials and URI in the application.

       

      Testing the application

      For the purpose of this blog post we are going to keep our AppFabric application in our Development Fabric.  We can provision and deploy our application by typing CTRL + F5.  We should see our web application launch.

      Next we will want to start an instance of our Kirkland On Premise client by typing F5.  This means that both of our applications are up and running and ready to receive messages.

      Since we have not added any filters on our Subscriptions both applications should receive a copy of any message that we submit. 

       

      The first message that we will send will have an Address of Space Needle and a City of Seattle.

      image

       

      In our Trace logs and in our Console application we will discover that our message has been received by both clients.

      image

       

      At this point we have proved that our initial deployment has been successful and that we can broadcast this message to multiple clients.  But, this is not the end state that we desire.  Remember we want the Redmond client to only receive the Redmond messages and the Kirkland client to only receive the Kirkland messages.  In order to accomplish this behavior we need to add Subscription rules.

      Unless I have missed something (and please let me know if I have), there is no way to specify our filter when we create our Subscription through the AppFabric Design Canvas.  The only way that I have seen how to do this is through code by using the ServiceBusNamespaceClient class.  So I could have written this code but opted for a different option.  The AppFabric CAT team recently released a tool called the Service Bus Explorer.  Within this tool, you can provide your credentials and then interrogate your Queues, Topics and Subscriptions.  You can read all about this tool and some more detailed information on their blog post.
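      For reference, here is roughly what that provisioning code could look like.  This is an untested sketch based on the May/June 2011 CTP samples, and the type and member names (ServiceBusNamespaceClient, GetTopic, AddSubscription, SqlFilterExpression) changed between CTP refreshes, so treat it as illustrative only:

      ```csharp
      // Hypothetical sketch: CTP API names may differ from the build you are running.
      Uri managementUri = ServiceBusEnvironment.CreateServiceUri("sb", "<your_namespace>", string.Empty);

      ServiceBusNamespaceClient namespaceClient = new ServiceBusNamespaceClient(
          managementUri,
          TransportClientCredentialBase.CreateSharedSecretCredential("owner", "<your_key>"));

      // Look up the Topic that the AppFabric Application provisioned for us.
      Topic topic = namespaceClient.GetTopic("SimpleServiceBusTopic");

      // Create the Subscription with a rule that only matches Redmond outage messages.
      topic.AddSubscription(
          "SimpleServiceBusSubscriptionRedmond",
          new SqlFilterExpression("OutageCity = 'Redmond'"));
      ```

      The Service Bus Explorer approach described next accomplishes the same thing without writing or redeploying any code, which is why I opted for it here.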

      Once I have connected to my namespace using the Service Bus Explorer, I want to navigate to my Redmond subscription and then delete the existing default rule.  This default rule matches all messages, which is why every message published to the Topic has been delivered through this subscription so far.

      image

      I then want to right mouse click on the Redmond Subscription and Add Rule.  I now need to provide a name for this rule (it can be anything) and a SQL Filter Expression.  In this case I am providing OutageCity=’Redmond’.

      image

      With my Subscription rule established for my Redmond Subscription, let’s submit another message.  This time we will provide an Address of 1 Microsoft Way and a City of Redmond.

      image

       

      Notice that both clients received this message.  The Redmond client received it because the message matched its Subscription rule (filter).  The Kirkland client received it because we have not yet configured a Subscription rule (filter) for it, so it is still configured to receive all messages.

      image

      To prove that the Redmond Subscription rule is working, let’s send in a message that will not match it.  If we make the city equal to Bellevue, only the Kirkland application should receive the message.

      image

       

      We will discover that the Subscription rule is working.  Our Redmond client did not receive this Bellevue message but our Kirkland client did since its Subscription rule is wide open.

      image

      So let’s create a Subscription rule for our Kirkland client.  We will head back to the Service Bus Explorer tool to do this.

      Once again we will delete the Default Rule, this time for the Kirkland Subscription.  We will add a new Rule by right mouse clicking on the Kirkland Subscription and selecting Add Rule.  We will provide this rule with a Name of Kirkland and a SQL Filter Expression of OutageCity=’Kirkland’.

      image

      With our Subscription rule now in place, let’s send in a message that only our Kirkland client will receive.

       

      image

      Sure enough, our Redmond client did not receive a copy of this message since its Subscription rule didn’t match.  Our Kirkland client did receive this message since it did match our Subscription rule.

      image

       

      Closing Notes:

      • You will lose any of these Subscription rules between deployments.  When you shutdown the Dev Fabric emulator, your Topic and Subscriptions will be removed.  When you deploy your application to the Local Dev fabric your Topic and Subscriptions will be deployed but your rules/filters will not return unless you configure them again.
      • Subscription rules are evaluated as messages are published to the Topic, so a Subscription only receives (and its clients only retrieve) messages that match that Subscription’s rule(s).
      • You can have multiple Subscription rules per Subscription.
      • It would be nice if we can provide our Subscription rules in our AppFabric Design view canvas.  This way they could be deployed with our Topic and Subscription(s).
      • Publishers instantiate TopicClient instances; Consumers instantiate SubscriptionClient instances.
      • Overall the technology is pretty cool.  Having true pub/sub in the cloud should open up many opportunities for organizations.
      Categories: AppFabric, Azure, Queue

      AppFabric Apps (June 2011 CTP) Simple Service Bus Topics–Part 1

      July 31, 2011 2 comments

      In my last post we discussed using Azure AppFabric Queues as part of Azure AppFabric Applications.  In this post, we will discuss a couple of related Queue technologies called Topics and Subscriptions.  The purpose of this post will be to describe the conceptual behavior of Topics and Subscriptions and then in a follow up post actually dive into a detailed example of how we can use this technology. 

      What is a Topic?

      A Topic is much like a Queue in some ways.  It is a durable storage container from which consuming clients can retrieve messages.  Data within a Queue can be consumed by a single client, whereas data within a Topic can be consumed by multiple clients, each through an AppFabric Subscription.

      What is a Subscription?

      A Subscription is probably exactly what you think it is.  It is a client’s registration to receive copies of the messages that are published to a Topic.  By default a Subscription receives all messages published to its Topic, but we can also specify a SQL-92 style filter expression, which essentially allows us to “filter” down to just the messages that we are interested in.
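      For example, such a filter is just a SQL-92 style predicate evaluated against the Properties collection of each BrokeredMessage.  Using the OutageCity property from the scenario in the follow-up post (Priority is a hypothetical property, included here only to show a compound predicate):

      ```
      OutageCity = 'Redmond'
      OutageCity = 'Kirkland' AND Priority > 2
      ```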

      Topics + Subscriptions = Pub/Sub

      The combination of Topics and Subscriptions really allows for Pub/Sub in the cloud.  It provides the ability for a publisher to put a message in the cloud and then have consuming client(s) retrieve the messages that are relevant to them.  When a publisher sends a message to the cloud, it is really sending it to a Topic, and when a consuming client retrieves a message it is doing so through a Subscription.

      Why is this important?

      In my opinion, this is pretty cool technology.  Internet Pub/Sub messaging is a relatively new concept in the Microsoft stack.  The idea that you can perform this pub/sub in the cloud provides some unique opportunities in both Business to Business and Business to Consumer scenarios.  In Business to Business scenarios, dealing with network firewalls to allow communication to occur (between organizations) is usually not a fun adventure, and providing the ability to scale usually presents challenges as well.  On the Business to Consumer side, our Topic messages are stored in a durable store and therefore support disconnected client scenarios, provided we mark messages with a suitable time to live (TTL) value.
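      As a sketch of that last point, a publisher can bound how long an unconsumed message survives by setting the TimeToLive property on the BrokeredMessage before sending it.  The 24-hour value and the ms MessageSender variable are illustrative assumptions layered on the Part 2 example, not something from the original code:

      ```csharp
      // Give disconnected subscribers up to a day to collect this message.
      BrokeredMessage bm = BrokeredMessage.CreateMessage("Space Needle");
      bm.TimeToLive = TimeSpan.FromHours(24); // expires if no subscriber receives it within 24 hours

      ms.Send(bm); // ms is a MessageSender created from a TopicClient, as in Part 2
      ```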

      Example

      In the previous Queue post we had an example of a Power Outage application that allows customers with a power outage to report it through a mobile phone browser.  That example works great if we have a single consumer, but what if we have multiple consumers?  Enter Topics and Subscriptions.

      This scenario is very similar.  We want customers to be able to submit power outages to our fictitious power company located in the Redmond, Washington area.  What is different this time around is that due to the growth in the Kirkland, Washington area we have subcontracted power line maintenance to another company.  So we want our Redmond system to receive messages for the Redmond area and this other company to receive messages for Kirkland.  The Redmond company should not see the Kirkland messages and vice versa.

      To support this solution, we will create 1 Topic.  The Web application will submit messages to this Topic.  We will also register 2 Subscriptions on this topic and provide a filter expression that will ensure that our Redmond client receives messages for Redmond and a filter expression that ensures our Kirkland client receives messages for Kirkland.

      image

       

      Please see post 2 that will provide a step by step implementation of this solution.

      Categories: AppFabric, Azure, Queue