
Azure–Service Bus Queues in Worker Roles

June 23, 2012

 

Another nugget of information that I picked up at TechEd North America is a new template that ships as part of the Azure SDK 1.7 called Worker Role with Service Bus Queue. 


What got me interested in this feature is some of the work that I did last year with the AppFabric Applications (Composite Apps) CTP.  I was a big fan of that CTP as it allowed you to wire up different cloud services rather seamlessly.  That CTP has been officially shut down, so I can only assume that this template was introduced to address some of the problems that the AppFabric Applications CTP sought to solve.

 

The Worker Role with Service Bus Queue template works in both Visual Studio 2010 and Visual Studio 2012 RC.  For this post I am going to be using 2012 RC.

I am going to use a similar scenario to the one that I used in the AppFabric Applications CTP.  I will build a simple Web Page that will allow a “customer” to populate a power outage form.  I will then submit this message to a Service Bus Queue and have a Worker Role dequeue it.  For the purpose of this post I will simply write a trace event to prove that I am able to pull the message off of the queue.

 

Building the Application

  • Create a new project by clicking on File (or is it FILE) – New Project
  • Select the Cloud template; initially the pane shows no templates available.  The Azure SDK is currently built on top of .NET 4.0, not 4.5, so we need to select .NET Framework 4 from the framework dropdown.

  • We now need to select the Cloud Services that will make up our solution.  In my scenario I am going to include an ASP.Net Web Role and a Worker Role with Service Bus Queue.


Note: we do have the opportunity to rename these artifacts by hovering over the label and then clicking on the pencil. This was a gap that existed in the old AppFabric Apps CTP.  After renaming my artifacts my solution looks like this:


  • I want to send and receive a strongly typed message so I am going to create a Class Library and call it CustomerEntity.


  • In this project I will simply have one class called Customer with the following properties:

namespace CustomerEntity
{
    public class Customer
    {
        public string Address { get; set; }
        public string City { get; set; }
        public string State { get; set; }
    }
}

 

  • I will then add a reference in both the Web Project and Worker Role projects to this CustomerEntity project.

 

  • Within the PowerOutageWeb project clear out all of the default markup in the Default.aspx page and add the following controls.

<h3>Customer Information:</h3>
Address: <asp:TextBox ID="txtAddress" runat="server"></asp:TextBox><br />   
City: <asp:TextBox ID="txtCity" runat="server"></asp:TextBox> <br />
State: <asp:TextBox ID="txtState" runat="server"></asp:TextBox><br />
<asp:Button ID="btnSubmit" runat="server" Text="Submit"  OnClick="btnSubmit_Click" />
<asp:Label ID="lblResult" runat="server" Text=""></asp:Label>

 

  • Also within the PowerOutageWeb project we need to add references to the Service Bus Assembly:  Microsoft.ServiceBus.dll and Runtime Serialization Assembly: System.Runtime.Serialization.dll

 

  • We now need to add the following using directives:

using CustomerEntity;
using Microsoft.WindowsAzure;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

 

  • Next we need to provide a click event for our submit button and then include the following code:

    protected void btnSubmit_Click(object sender, EventArgs e)
    {
        Customer cs = new Customer();
        cs.Address = txtAddress.Text;
        cs.City = txtCity.Text;
        cs.State = txtState.Text;

        const string QueueName = "PowerOutageQueue";

        // Get the connection string from the configuration manager
        string connectionString = CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");

        var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

        // Check to see if the queue exists; if it doesn't, create it
        if (!namespaceManager.QueueExists(QueueName))
        {
            namespaceManager.CreateQueue(QueueName);
        }

        MessagingFactory factory = MessagingFactory.CreateFromConnectionString(connectionString);

        // Create the queue client
        QueueClient myQueueClient = factory.CreateQueueClient(QueueName);

        BrokeredMessage bm = new BrokeredMessage(cs);

        // Send the message
        myQueueClient.Send(bm);

        // Update the web page
        lblResult.Text = "Message sent to queue";
    }

 

  • Another new feature in this SDK that you may have noticed is the CreateFromConnectionString method that is available to the MessagingFactory and NamespaceManager classes.  This allows us to retrieve our configuration settings from our project properties page.  To access the project properties right mouse click on the particular role and then select Properties.  Next, click on Settings where you will find Key/Value Pairings.  The name of the key that we are interested in is: Microsoft.ServiceBus.ConnectionString and our value is

Endpoint=sb://[your namespace].servicebus.windows.net;SharedSecretIssuer=owner;SharedSecretValue=[your secret]

  • Since both our Web and Worker Roles will be accessing the Queue, we want to ensure that both configuration files have this entry included.  This allows our code to make a connection to the Service Bus Namespace where our Queue may be found.  If we edit this property in these locations, then we do not need to modify the ServiceConfiguration.Cloud.cscfg and ServiceConfiguration.Local.cscfg files manually, because Visual Studio will take care of this for us.
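For reference, the entry that Visual Studio maintains in the configuration files looks roughly like the following sketch (the role name will match whatever you called your project; the setting name is the one used in the code above):

```xml
<!-- Sketch of the relevant fragment of the cloud service configuration (.cscfg) -->
<Role name="PowerOutageWeb">
  <ConfigurationSettings>
    <Setting name="Microsoft.ServiceBus.ConnectionString"
             value="Endpoint=sb://[your namespace].servicebus.windows.net;SharedSecretIssuer=owner;SharedSecretValue=[your secret]" />
  </ConfigurationSettings>
</Role>
```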


  • Next we want to shift focus to the Worker Role and edit the WorkerRole.cs file.  Since we are going to be dequeuing our typed Customer message, we want to include a reference to its namespace:

    using CustomerEntity;

  • Something that you probably noticed when you opened up the WorkerRole.cs file is that there is already some code written for us.  We can leverage most of it; we only need to replace the template’s default message-handling code inside the Run method with the following:

Customer cs = receivedMessage.GetBody<Customer>();
Trace.WriteLine(receivedMessage.SequenceNumber.ToString(), "Received Message");
Trace.WriteLine(cs.Address, "Address");
Trace.WriteLine(cs.City, "City");
Trace.WriteLine(cs.State, "State");
receivedMessage.Complete();

In this code we are going to receive a typed Customer message and then simply write out the contents of this message using the Trace utility.  If we wanted to save this information to a Database, this would be a good place to write that code.

  • We also want to make sure that we update the name of the Queue so that it matches the name that we specified in the Web Role:

// The name of your queue
const string QueueName = "PowerOutageQueue";
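For context, here is roughly how the pieces fit together in the template’s Run method.  This is a sketch based on the SDK 1.7 template; the generated member names (Client, IsStopped) may differ slightly in your version, and the template also wraps the receive in exception handling that I have trimmed here:

```csharp
// Sketch of the Run loop in WorkerRole.cs after our edits.
// QueueClient (called "Client" in the template) is initialized in OnStart.
public override void Run()
{
    Trace.WriteLine("Starting processing of messages");

    while (!IsStopped)
    {
        // Receive blocks until a message arrives or the receive times out
        BrokeredMessage receivedMessage = Client.Receive();

        if (receivedMessage != null)
        {
            // Deserialize the strongly typed payload we sent from the Web Role
            Customer cs = receivedMessage.GetBody<Customer>();
            Trace.WriteLine(receivedMessage.SequenceNumber.ToString(), "Received Message");
            Trace.WriteLine(cs.Address, "Address");
            Trace.WriteLine(cs.City, "City");
            Trace.WriteLine(cs.State, "State");

            // Remove the message from the queue
            receivedMessage.Complete();
        }
    }
}
```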

 

Creating our Queue

We have a few options when it comes to creating the Queue that this solution requires.  The code in our ASP.NET Web page code-behind will take care of it for us, so that method is sufficient for the solution to work.  But perhaps we want a design-time alternative so that we can specify some more advanced features.  In that case we have a few options:

  • Using the http://www.windowsazure.com portal.
  • Similarly we can use the Service Bus Explorer tool that was written by Paolo Salvatori.  Steef-Jan Wiggers has provided an in-depth walkthrough of this tool so I won’t go into more detail here. http://soa-thoughts.blogspot.ca/2012/06/visual-studio-service-bus-explorer.html
  • As part of the Azure 1.7 SDK release, a Visual Studio Service Bus Explorer is now included.  It is accessible from the Server Explorer view from within Visual Studio.  Using this tool we can perform some functions like:
    • Create Queues/Topics
    • Set advanced properties: Queue Size, Time to Live, Lock Duration, etc.
    • Send and Receive Test Messages
    • Get the current Queue Depth
    • Get our Service Bus Connection string


 

Any of these methods will work.  As I mentioned earlier, if we do nothing, the code will take care of it for us.

Testing

We are just going to test our code locally, with the exception of our Service Bus Queue; that is going to be created in Azure.  To test our example:

  • Hit F5 and our website should be displayed


  • Since we have our Trace statements included in our Worker Role code, we need to launch our Compute Emulator.  To do this right mouse click on the Azure icon that is located in your taskbar and select Show Compute Emulator UI.


  • A “Console”-like window will appear.  We will want to click on the ServiceBusWorker label.


  • Switch back to our Web Application, provide some data and click the submit button.  We should see that our results label is updated to indicate that our message was sent to the queue.
  • If we switch back to our Compute Emulator we should discover that our message has been dequeued.


 

Conclusion

While the experience is a little different from that of the AppFabric Applications CTP, it is effective, especially for developers who may not be all that familiar with “integration”.  This template provides a great starting point and allows them to wire up a Service Bus queue to their Web Application very quickly.

Categories: Azure, ServiceBus

Azure Service Bus–Don’t run as Root!

June 20, 2012

 

So I can’t take credit for the catchy tag line for this post; it was inspired by a recent session that Clemens Vasters and Abhishek Lal delivered at TechEd North America 2012.  You can watch the entire session here.

While watching the presentation, this particular segment on not running as root really resonated with me.  I have built some proof-of-concept mobile applications that use REST APIs to send messages to Service Bus Queues.  In these applications I use the default owner key to submit messages.  I knew at the time that this was not a good practice, but since it was just a POC it was acceptable.  I was curious about how I could solve this problem in a better manner, and I think what Microsoft has done here is definitely a step in the right direction.

If you are familiar with the Azure Service Bus, and Azure in general, there are 3 fundamental ‘artifacts’ that are required whenever you try to provision or manipulate Azure Services.  These artifacts include:

  • Namespace
  • Default Issuer (username)
  • Default Key (password)


The Problem

In 99.99% of the demos and blog posts that exist on the interwebs, people are embedding their Default Issuer and Default Key in their solution.  This creates issues on a few different levels:

  • The account really is “root” when it comes to your namespace.  Using your “account” allows you to create services like Queues, Topics, ACS Trust relationships, provision Cache etc.  Do you see the problem with embedding these credentials in your app and then distributing it?
  • If your account does become compromised, it could be used to maliciously manipulate your solution.  If an attacker compromised a solution that processed Customer Orders, what would stop them from adding their own subscriber to the Topic Subscription and receiving a copy of each customer’s Credit Card number?

Like any other security principal, we want to ensure that people, or systems, have the minimum level of security rights needed to perform a specific function.  A parallel example is giving end users Domain Admin rights: you wouldn’t do that, much like you shouldn’t embed your “owner” credentials in your application.

Enter SBAzTool

There is a tool that is part of the Windows Azure 1.7 SDK that can help us assign fine-grained authorization to “Default Name(s)” (Usernames).  You can download the source code for this tool here.  In the rest of this post I will demonstrate how you can use this tool, and then show an example of why it is beneficial and that creating authorization rules is not so hard.

Create Namespace in Portal

Even though I have a functional namespace, I am going to create one from scratch so that we are all beginning at the same starting point.  If you have an existing namespace that you want to use, you can skip the next few steps where I create the namespace.

  • From http://www.windowsazure.com, log in and then select the previous (old) portal, as Service Bus and ACS do not exist in the new portal…yet.
  • Next, click on the Service Bus label and then click on the {New} button


  • In this case I am selecting Access Control, Service Bus, a namespace and a Country/Region.  Since United States (West) is closest to my locale, I will use it.


 

  • So I now have a Namespace called DontRunAsRoot.  It may take a couple of minutes to create.  You will also notice that we have a tree-like structure where our Queues and Topics will be displayed.


 

  • For the purpose of this demo we are going to create a Queue called MyQueue by clicking on the New Queue button.

 

  • We will need to populate the Name text box and then can accept the defaults.


 

  • Voila, we have our newly created Namespace and Queue


  • We are going to need our “owner” key so we might as well get it while we are in the portal.  Click on the View button and then click the Copy to Clipboard button.

 


      Compiling SBAzTool

      Once you have downloaded the SBAzTool you can compile the solution, open a command prompt, and launch the tool by running sbaztool.exe with no arguments to see all of the command-line options.  This documentation is also available on the download page.

       

      StoreOptions

      In order to set permissions for other users/issuers we need to be authenticated ourselves using the Key that we previously retrieved from the portal.  By using the storeoptions argument we can continue to execute commands without having to re-issue our key/password.  To do this we will want to execute the following command:

      sbaztool.exe storeoptions -n <namespace> -k <key>


       

      MakeId

      So let’s now create a new “user” by specifying the makeid command.  In this case we can actually specify a “username” instead of the regular “owner” that we are so used to.

      sbaztool.exe makeid <username>
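      For example, creating the QueueUser issuer that is used in the rest of this post would look like this (assuming the options stored earlier are still in effect):

```
sbaztool.exe makeid QueueUser
```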


      As you can see, a Key has been provided for this user (which I have whited out).  We also have the ability to specify our own password by including a <key>, provided it is a 32-byte, base64-encoded value.
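      If you do want to supply your own key rather than accept a generated one, a value of the right shape can be produced with a few lines of .NET.  This is just a sketch of mine, not part of SBAzTool:

```csharp
using System;
using System.Security.Cryptography;

class KeyGen
{
    static void Main()
    {
        // 32 random bytes, base64-encoded: the shape makeid expects for a caller-supplied key
        var bytes = new byte[32];
        using (var rng = new RNGCryptoServiceProvider())
        {
            rng.GetBytes(bytes);
        }
        Console.WriteLine(Convert.ToBase64String(bytes)); // a 44-character base64 string
    }
}
```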

      Grant

      Now that we have an Issuer/User created, we can assign this user permissions.  The command to do so is:

      grant <operation> <path> <name>

      The available operations that we have access to are:

      • Send
      • Listen
      • Manage

      Send and Listen are pretty self-explanatory, but Manage deserves further elaboration.  We can actually delegate the authority to manage resources to a particular user based upon the path.  So let’s imagine we have a path that looks like this:

      /organization/department/engineering

      If we wanted to allow someone to administer the /engineering services, we could do so using this command.

      For the purpose of this blog post, let’s keep things simple and assign Send and Listen privileges to our QueueUser for our queue, which is called myqueue.
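      Based on the grant syntax above, the two assignments would look something like the following sketch (the exact path form for a root-level queue, shown here simply as myqueue, is my assumption):

```
sbaztool.exe grant Send myqueue QueueUser
sbaztool.exe grant Listen myqueue QueueUser
```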

      Send


      Listen


      Show

      To verify that our permissions have been set correctly we can execute the Show command by providing the following:

      show <path>

      As you can see below, the QueueUser now has Send and Listen permissions on the queue called myqueue.  This is a much better solution than granting rights to the entire namespace when you only need rights on a particular queue.


       

      Test permissions

      So in order to validate that this stuff actually works, and is not smoke and mirrors, I am going to create a very simple console application that will use these credentials to send and receive a message.  Once we have validated that this works, we will pull the Send permission and see what happens.

      The following code will create a QueueClient, send a message to the queue and then receive the message.

        static void Main()
        {
            const string QueueName = "myqueue";
            const string ServiceNamespace = "DontRunAsRoot";
            const string IssuerName = "QueueUser";
            const string IssuerKey = "<removed>";

            // Build the credentials and service URI for our namespace
            TokenProvider credentials = TokenProvider.CreateSharedSecretTokenProvider(IssuerName, IssuerKey);
            Uri serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", ServiceNamespace, string.Empty);

            MessagingFactory factory = null;

            try
            {
                factory = MessagingFactory.Create(serviceUri, credentials);

                // This code assumes that the queue has already been created, since we have
                // only provisioned Send and Listen (not Manage) access
                QueueClient myQueueClient = factory.CreateQueueClient(QueueName);

                Console.WriteLine("\nCreated Queue Client");

                // Create a brokered message
                BrokeredMessage bm = new BrokeredMessage("I hope this works");

                Console.WriteLine("\nSending message to Queue…");

                myQueueClient.Send(bm);

                Console.WriteLine("\nMessage sent to Queue");
                Console.WriteLine("\nPress ENTER to receive message");
                Console.ReadLine();

                bm = myQueueClient.Receive(TimeSpan.FromSeconds(5));

                if (bm != null)
                {
                    Console.WriteLine(string.Format("Message received: Id = {0}, Body = {1}", bm.MessageId, bm.GetBody<string>()));
                    // Further custom message processing could go here…
                    bm.Complete();
                }

                Console.WriteLine("\nNo more messages to process");
                Console.WriteLine("\nPress ENTER to exit");
                Console.ReadLine();
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.ToString());
            }
        }

      When we run the application we will discover that it is executing correctly:


       

      Revoke

      Let’s now make this a little interesting.  Let’s remove the QueueUser’s ability to send messages to the Queue and see what happens.  To do this we will use the following command:

      revoke <operation> <path> <user>
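      Concretely, pulling QueueUser’s Send permission would look something like this sketch (same assumed path form as with grant):

```
sbaztool.exe revoke Send myqueue QueueUser
```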


      To validate that the revoke was successful, let’s run the show command again; the output confirms that the Send permission has been removed.


       

      Let’s now try to send a message and see what happens.


      As expected, we get an authorization exception.

       

      Conclusion

      With the introduction of SBAzTool there is no excuse for using your “owner” credentials when building applications.  SBAzTool has a wide variety of commands that facilitate managing and even delegating permissions.  Since it is a command-line tool, you can even script these permissions as you provision your Service Bus artifacts.

      Categories: Azure, ServiceBus

      Slide Deck from Prairie Dev Con Session

      March 14, 2012

      Thank you to those of you who attended my “Introduction to Microsoft’s Middleware in the Cloud” session yesterday at PrairieDevCon West in Calgary.  I appreciated the level of engagement and the number of questions that were asked.  Below you will find a link to the slide deck you saw yesterday.  I have also added a few slides with screenshots of the demo applications to provide some additional context.

      https://skydrive.live.com/redir.aspx?cid=db51ef47e2bb432a&resid=DB51EF47E2BB432A!5663&parid=DB51EF47E2BB432A!375&authkey=!AOe1Rn7b80mPYdg  (Open in PowerPoint for animations)

      Categories: Uncategorized

      SAP, meet Azure Service Bus – EAI/EDI December 2011 CTP

      December 29, 2011

       

      The Azure Service Bus EAI/EDI December 2011 CTP has been out for about 2 weeks at the time of this blog post.  As soon as I saw the Service Bus Connect feature in the documentation I wanted to try and hook up the Service Bus to SAP.  The organization that I work for utilizes SAP to support many of its core business processes.  We are also heavily invested in BizTalk Server when integrating SAP with other Corporate Systems.  For the past 5 years much of my work experience has involved integration with SAP.  So much so that I had the opportunity to write a couple of chapters on BizTalk-SAP integration in the Microsoft BizTalk 2010 Line of Business Systems Integration book.

      Integrating with SAP is of great interest to me both personally and professionally.  I like the challenge of taking two different types of systems that would seemingly be impossible to integrate and finding a way to do it.  I also enjoy the expression on SAP consultants’ faces when you take a Microsoft product and successfully execute operations inside their system, like creating customer records or creating Work Orders.

      Using the Service Bus Connect feature is not the only way of bridging your On-Premise Line of Business Systems with external parties via cloud-based messaging technologies.  Within the past year Microsoft also introduced a feature called BizTalk Server 2010 AppFabric Connect for Services.  This tool allows BizTalk to expose an endpoint via a Service Bus Relay.  I have also used this mechanism to communicate with SAP via a Mobile Device and it does work.

      There are a few differences between Service Bus Connect and AppFabric Connect for Services.  Some of these differences include:

      • Any message transformations that need to take place actually happen in the cloud instead of On-Premise.  When integrating with SAP, you never want to expose SAP schemas to calling clients; they are ugly to say the least.  In this scenario we can expose a client-friendly, or canonical, schema and then transform this message into our SAP request in Azure.
      • AppFabric Connect for Services utilizes a full deployment of BizTalk in your environment, whereas Service Bus Connect only requires the BizTalk Adapter Pack when communicating with SAP.  With AppFabric Connect for Services, all message transformation and orchestration takes place On-Premise and the cloud (Azure Service Bus) is basically used as a communication relay.

      When connecting to On-Premise Line of Business Systems, both methods require the BizTalk Adapter Pack to be installed On-Premise.  The BizTalk Adapter Pack is included in your BizTalk license.  Licensing details for Service Bus Connect have not been released at the time of this writing.

      The following walkthrough assumes you have some experience with the new Service Bus CTP.  If you haven’t looked at the CTP before I suggest that you visit a few of the following links to get more familiar with the tool:

      Also, it is worth pointing out another blog post written by Steef-Jan Wiggers where he discusses Oracle integration with Service Bus Connect.

      Building our Application

      • The first thing we need to do is to create a new ServiceBus – Enterprise Application Integration project.  In my case I am calling it HelloSAP.


       

      • Since we know that we want to communicate with an On-Premise LOB system like SAP we need to Add a ServiceBus Connect Server.  We can do this by accessing Server Explorer, right mouse clicking on ServiceBus Connect Servers and then selecting Add Server.  When prompted we can provide a host name of localhost since this is a local environment.


       

      • We now can expand our Service Bus Connect Servers hierarchy.  Since we want to build an SAP interface we can right mouse click on SAP and select Add SAP Target


       

      If you have ever used the BizTalk Adapter Pack before, you are now in familiar territory.  This is (almost) the same wizard that we use to generate schemas when connecting to SAP systems via BizTalk.  There is a subtle difference in the bottom-left corner called Configure Target Path, which we will discuss in a few moments.  If you are unfamiliar with this screen you are going to need some help from your SAP BASIS Admin to provide you with the connection details required to connect to SAP.  Also, if you are interested in further understanding everything that is going on in this screen, I recommend you pick up the BizTalk LOB book that I previously talked about, as I discuss the different aspects of this wizard in great detail.  (OK, no more shameless plugs.)


       

      • We now want to select the type of interface that we want to interact with.  For the purpose of this blog post I am going to select a custom IDOC that is used when submitting timesheets from our field personnel.  In my case, the version of SAP that I am connecting to is 700 so that is why I am selecting the ZHR_CATS IDOC that corresponds to this version.  Once again, if you are unsure you will need to speak to your BASIS Admin.


       

      • Notice how we cannot click the OK button after establishing a connection to SAP and selecting an IDOC?  We now need to create a Target Path.  Creating a Target Path will provide the Bridge from the Azure Service Bus into SAP.  Click the Configure button to continue.


       

      Assuming that we have not been through this exercise before, we need to select Add New LobRelay from the Select LOB Relay to host the LOB Target: dropdown list.


       

      Another dialog box will appear.  Within this dialog box we need to provide our CTP Labs namespace, a Relay path, Issuer name and key.  For Relay path, we can really provide whatever we want here; it will essentially make up the latter portion of the URI for the Endpoint that is about to be created.


       

      Now we are prompted to Enter LOB Target sub-path.  Once again, this value can be whatever we choose.  Since the HR Timesheet module inside of SAP is often called CATS, I will go ahead and use this value here.


      • Now with our Target Path configured we are able to select the OK button to proceed.


      • Inside Server Explorer we now have an entry underneath SAP.  This represents our End Point that will bridge requests coming from the cloud to SAP.


      • At this point we haven’t added any artifacts to our Enterprise Application Integration project that we created earlier.  This is about to change.  We need to right mouse click on our SAP endpoint and then select Add schemas to HelloSAP


      • We will now get prompted for some additional information in order to re-establish a connection to SAP so that we can generate Schemas that will enable us to send a message to SAP in a format that it is expecting.  You may also notice that we aren’t being prompted for any SAP server information.  In the Properties grid you will notice that this information is already populated because we had previously specified it when using the Consume Adapter Service Wizard.


      • Inside our solution, we will now discover that we have our SAP schemas in a folder called LOB Schemas.


      For the purpose of this blog post, I have created another folder called Schemas and saved a Custom Schema called CloudRequest.xsd here.  This is the message that our MessageSender application will be sending in once we test our solution.  (BTW: I find the Schema editor that is included in BizTalk much more intuitive and user-friendly than this one.)


      • We now need to create a Map, or Transform, to convert our request message into a request that SAP will understand.


      • Next we need to add a Bridge on to the surface of our Bridge Configuration.  Our Bridge will be responsible for executing our Map that we just created and then our message will get routed to our On-Premise end point so that our Timesheet can be sent to SAP.


      • We now need to set the message type that we expect will enter the bridge.  By double clicking on our TimeSheetBridge we can then use the Message Type picker to select our Custom message type: CloudRequest.


       

      • Once we have selected our message type, we can then select a transform by clicking on the Transform Xml Transform box and then selecting our map from the Maps Collection.


       

      • Before we drag our LOB Connection shape onto our canvas we need to set our Service Namespace.  This is the value that we created when we signed up for the CTP in the Azure Portal.  To set the Service Namespace we need to click on any open space, in the Bridge Configuration canvas, and then look in the Properties Page.  Place your Service Namespace here.


       

      We are now at the point where we need to wire up our XML One-Way Bridge to our On-Premise LOB system.  In order to do so we need to drag our SAP instance onto the Bridge Configuration canvas.


       

      • Next, we need to drag a Connection shape onto the canvas to connect our Bridge to our LOB system.


       

      • The next action that needs to take place is setting up a Filter Condition between our LOB Shape and our Bridge.  You can think of this like creating a subscription.  If we wanted to filter messages by their content we would be able to do so here.  Since we are interested in all messages we will just Match All.  In order to set this property we need to select our Connection arrow then click on the Filter Condition ellipses.


       

      If you have used the BizTalk Adapter Pack in the past you will be familiar with the SOAP Action headers that need to be set in your BizTalk Send Port.  Since we don’t have Send Ports per se, we need to set this action in the Route Action as part of the One-Way Connection shape.  In the Expression text box we want to put the name of our Operation, which is http://Microsoft.LobServices.Sap/2007/03/Idoc/3/ZHR_CATS//700/Send, wrapped with single quotes ‘ ’.  We can obtain this value by selecting our Service Bus Connect Server endpoint and then viewing the Properties page; within the Properties page there is an Operations node that can be expanded and we will find this value there.  In the Destination (Write To) we want to set our Type to Soap and our Identifier to Action.


       

      There is one last configuration step that needs to take place before we enable and deploy our service.  We need to set our Security Type.  We can do so by selecting our SAP – ServiceBus Connect Server instance from Server Explorer and then, in the Properties Page, clicking on the SecurityType ellipses.  Determining which Security Type to use will depend upon how your SAP instance has been configured.  In my case, I am using ConfiguredUsername and I need to provide both a Username and Password.


      • With our configuration set, we can now enable our SAP – ServiceBus Connect Server instance by right mouse clicking on it and then selecting Enable.

      image

       

• We can now deploy our application to Azure by right-clicking on our Visual Studio solution and selecting Deploy.

      image

       

Testing the Application

• In order to test our application we can use the MessageSender tool that is provided with the CTP Samples/Tools.  It simply allows us to submit EDI, or in this case XML, messages to an endpoint in the Azure Service Bus.  In order to successfully submit these messages we need to provide our Service Namespace, Issuer Name, Shared Secret, Service Bus endpoint address, a path to the XML file that we want to submit, and an indication that we are submitting an XML document.  Once we have provided this information we can hit the Enter key and, provided we do not have errors, we will see a “Message sent successfully” message.

      Note: In the image below I have blocked my Shared Secret (in red) for privacy reasons.

      image

      • If we launch our SAP GUI we should discover that it has received a message successfully.

      image

• We can then drill down into the message and discover the information that has been posted to our timesheet.

      image

       

      Exceptions

While testing, I ran into an exception.  In the CATSHOURS field I messed up the format and sent in too much data.  The BizTalk Adapter Pack/SAP Adapter validated this incoming data against the SAP schema that is used to send messages to SAP, and a fault message was returned to the MessageSender application.  I thought that this was pretty interesting.  Why?  In my solution I am using a One-Way Bridge, yet this exception is still propagated to the calling application.  Cool and very beneficial.

       image
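The kind of check the adapter performed can be sketched as schema validation before send: a fixed-precision numeric field like CATSHOURS rejects over-long input. The pattern below is an illustrative assumption (a DEC 5.2-style field), not SAP's actual rule:

```python
# Sketch of field-level schema validation: reject a CATSHOURS value whose
# format carries too much data, as happened in the failed test above.
import re

# Assumed format: up to 3 integer digits, optionally 2 decimal places.
CATSHOURS_PATTERN = re.compile(r"^\d{1,3}(\.\d{1,2})?$")

def validate_catshours(value):
    """Raise, as the adapter did, when the value does not fit the schema."""
    if not CATSHOURS_PATTERN.match(value):
        raise ValueError(f"CATSHOURS value {value!r} does not match the expected format")
    return value

validate_catshours("7.50")          # a well-formed value passes
try:
    validate_catshours("7.505678")  # too much data: fault goes back to the sender
except ValueError as e:
    print("fault returned to sender:", e)
```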

      Conclusion

Overall the experience of using Service Bus Connect was good.  There were a few times I had to forget how we do things in BizTalk and think about how Service Bus Connect does them.  An example of this is the SOAP Action headers that BizTalk developers are used to manipulating inside of Send Ports.  I am not saying one way is better than the other; they are just different.  Another example is the XML Schema editor: I find the BizTalk editor to be much more user friendly.

While I am not convinced that the current state of Service Bus Connect is ready for primetime (they have CTPs for a reason), I am very impressed that the Microsoft team could build this type of functionality into a CTP.  For me personally, this type of functionality (connecting to on-premise LOB systems) is a MUST HAVE as organizations start evolving towards Cloud computing.

      Categories: Azure, EAI/EDI December 2011 CTP Tags:

      Azure Service Bus EAI/EDI December 2011 CTP – New Mapper

      December 17, 2011 11 comments

      In this blog post we are going to explore some of the new functoids that are available in the Azure Service Bus Mapper.

At first glance, the Mapper looks pretty similar to the BizTalk 2010 Mapper. Sure, there are some different icons and perhaps some lipstick applied, but conceptually we are dealing with the same thing, right?

      image

      Wrong! It isn’t until we take a look at the toolbox that we discover this isn’t your Father’s mapper.

      image

      This post isn’t meant to be an exhaustive reference guide for the new mapper but here are some of the new functoids that stick out for me.

       

• MapEach Loop (Loop Operations): This functoid will loop over a repeating record from the source document and evaluate an operation at each iteration of the loop.  If the criteria of the operation are met then a record in the target document will be created.
• Arithmetic Expression (Expressions): No, we haven’t lost our Addition or Subtraction functionality!  Many functoids that you would have found in the BizTalk 2010 Mathematical Functoids section have been consolidated into the Arithmetic Expression operation.
• Logical Expression (Expressions): Similar to the Arithmetic Expressions, these have been consolidated and can be found within a single Expression, including >, <, >=, <=, ==, !=, Logical Negation, Conditional AND and Conditional OR.
• If-Then-Else (Expressions): A much anticipated operation! BizTalk developers have wanted an If-Then-Else functoid for many years.  If a condition evaluates to True then we can provide a particular value; otherwise we can provide a different value.
• List Operations (all of them): Completely new functionality provides us with the ability to manipulate lists within a map.  Functionality includes creating a list, adding an item to the list, selecting a unique group, selecting a value, selecting entries, getting items, and ordering the list.  Wow!  It will be interesting to see how this progresses.
• DateTime Reformat (Date/Time Operations): This one should be useful.  I am constantly re-formatting dates when integrating with SAP.  Usually this formatting winds up in a helper assembly, which tends to be overkill for what really needs to take place.
• Data Lookup (Misc Operations): This one is interesting.  It allows us to access data from SQL Azure within a transform at runtime.
• Generate ID (Misc Operations): This functoid will generate a GUID.  It is the equivalent of calling the Guid.NewGuid() method in .NET.
• Get Context Property (Misc Operations): Another useful operation!  This operation allows us to retrieve a value from context.  This is something that just isn’t possible in BizTalk.
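To make a few of these operations concrete, here are rough stand-alone equivalents of If-Then-Else, DateTime Reformat and Generate ID sketched in Python. These are illustrative stand-ins for the behavior described above, not the CTP's actual API:

```python
# Rough equivalents of three of the new mapper operations.
import uuid
from datetime import datetime

def if_then_else(condition, then_value, else_value):
    """If-Then-Else functoid: pick one of two values based on a condition."""
    return then_value if condition else else_value

def datetime_reformat(value, source_fmt, target_fmt):
    """DateTime Reformat functoid: e.g. SAP's yyyyMMdd to a display format."""
    return datetime.strptime(value, source_fmt).strftime(target_fmt)

def generate_id():
    """Generate ID functoid: the equivalent of Guid.NewGuid() in .NET."""
    return str(uuid.uuid4())

print(if_then_else(5 > 3, "High", "Low"))                   # High
print(datetime_reformat("20111217", "%Y%m%d", "%d/%m/%Y"))  # 17/12/2011
print(len(generate_id()))                                   # 36
```

The DateTime Reformat sketch is exactly the kind of one-liner that, as noted above, otherwise ends up buried in a helper assembly.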
       
      What’s Missing?

I can’t take credit for discovering this, but while chatting with Mikael Håkansson, he mentioned “hey – where is the Scripting functoid?”  Perhaps this is just a limitation of the CTP, but it is definitely something that needs to be addressed for RTM.  It is always nice to be able to fall back on a .Net helper assembly or custom XSLT.

      Conclusion
      While this post was not intended to be comprehensive, I hope it has highlighted some new opportunities that warrant some further investigation.  It is nice to see that Microsoft is evolving and maturing in this area of EAI.

      Introduction to the Azure Service Bus EAI/EDI December 2011 CTP

      December 17, 2011 3 comments

      The Azure team has recently reached a new milestone; delivering on Service Bus enhancements for the December 2011 CTP.  More specifically, this CTP provides Enterprise Application Integration (EAI) and Electronic Data Interchange (EDI) functionality.  Within the Microsoft stack, both EAI and EDI have historically been tackled by BizTalk.  With this CTP we are seeing an early glimpse into how Microsoft envisions these types of integration scenarios being addressed in a Platform as a Service (PaaS) based environment.

       What is included in this CTP?

      There are 3 core areas to this release:

      1. Enterprise Application Integration: In Visual Studio, we will now have the ability to create Enterprise Application Integration and Transform projects. 
2. Connectivity to On-premise LOB applications: This feature allows us to bridge our on-premise world with other trading partners using the Azure platform.  This is of particular interest to me.  My organization utilizes systems like SAP, Oracle and 3rd party work order management and GIS systems.  In our environment, these applications are not going anywhere near the cloud anytime soon.  But we also do a lot of data exchange with external parties, which creates an opportunity to bridge the gap using existing on-premise systems in conjunction with Azure.
      3. Electronic Data Interchange: We now have a Trading Partner Management portal that allows us to manage EDI message exchanges with our various business partners.  The Trading Partner Management Portal is available here.

      What do I need to run the CTP?

      • The first thing you will need is the actual SDK (WindowsAzureServiceBusEAI-EDILabsSDK-x64.exe) which you can download here
• A tool called Service Bus Connect (not to be confused with AppFabric Connect) enables us to bridge on-premise with Azure.  The setup msi can also be accessed from the same page as the SDK.  Within this setup we have the ability to install the Service Bus Connect SDK (which includes the BizTalk Adapter Pack), Runtime and Management tools.  In order for the runtime to be installed, we need to have Windows Server AppFabric 1.0 installed.
      • Once we have the SDKs installed, we will have the ability to create ServiceBus projects in Visual Studio.

      image

      • Since our applications will require resources from the Azure cloud, we need to create an account in the Azure CTP Labs environment.  To create a namespace in this environment, just follow these simple instructions.
      • Next, we need to register our account with the Windows Azure EDI Portal.

      image

       Digging into the samples

      For the purpose of this blog post I will describe some of what is going on in the first Getting Started sample called Order Processing.

When we open up the project and launch the BridgeConfiguration.bcs file we will discover a canvas with a couple of shapes on it.  I have outlined, in black, a Bridge and I have highlighted, in green, what is called a Route Destination.  So the scenario that has been built for us is one that will receive a typed XML message, transform it and then place it on a Service Bus Queue.

      image

      When we dig into the Xml One-Way Bridge shape, we will discover something that looks similar to a BizTalk Pipeline.  Within this “shape” we can add the various message types that are going to be involved in the bridge.  We can then provide the names of the maps that we want to execute.

      image
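The flow inside the bridge can be pictured as a small pipeline: check the message type, run the configured map, then hand the result to the route destination. A Python sketch of that flow, with plain functions standing in for the map and the queue (all names here are hypothetical, not the CTP object model):

```python
# A one-way bridge modeled as: type check -> transform (map) -> route.

def make_bridge(accepted_types, transform, route):
    """Build a bridge function from its message types, map and destination."""
    def bridge(message):
        if message["Type"] not in accepted_types:
            raise ValueError(f"Unrecognized message type: {message['Type']}")
        return route(transform(message))
    return bridge

destination = []  # stands in for the Service Bus Queue destination

bridge = make_bridge(
    accepted_types={"PurchaseOrder"},
    transform=lambda m: {**m, "Body": m["Body"].upper()},  # stand-in for the map
    route=lambda m: destination.append(m) or m,
)

bridge({"Type": "PurchaseOrder", "Body": "<order/>"})
print(len(destination))  # 1
```

A message whose type is not configured on the bridge is rejected, which mirrors the bridge only accepting the message types added to it.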

Within our BridgeConfiguration.bcs “canvas” we need to provide our service namespace.  We can set this by clicking on the whitespace within the canvas and then populating the Service Namespace property.

      image

We need to set this Service Namespace so that we know where to send the result of the Xml One-Way Bridge.  In this case we are sending it to a Service Bus Queue that resides within our namespace.

      image 

With our application configured, we can build our application as we normally would.  In order to deploy, we simply right-click on the solution, or project, and select Deploy.  We will once again be asked to provide our Service Bus credentials.

      image

      The deployment is quick, but then again we aren’t pushing too many artifacts to the Service Bus.

Within the SDK samples, Microsoft has provided us with MessageReceiver and MessageSender tools.  We can use the MessageReceiver tool to create our Queue and receive a message; the MessageSender tool is used to send the message to the Queue.
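The sender/receiver pair can be pictured with an in-process queue: one side enqueues the XML payload, the other blocks until it arrives. This is only a local illustration of the flow, not the Service Bus protocol or the actual tools:

```python
# In-process illustration of the MessageSender / MessageReceiver flow.
import queue

service_bus_queue = queue.Queue()  # stands in for the Service Bus Queue

def message_sender(payload):
    """Enqueue the payload and report success, like the MessageSender tool."""
    service_bus_queue.put(payload)
    return "Message sent successfully"

def message_receiver():
    """Block (briefly) until a message arrives, like the MessageReceiver tool."""
    return service_bus_queue.get(timeout=1)

message_sender("<Order><Id>1</Id></Order>")
print(message_receiver())  # <Order><Id>1</Id></Order>
```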

When we test our application we will be asked to provide an instance of our typed XML message.  It will be sent to the Service Bus, transformed (twice) and then the result will be placed on a Queue.  We will then pull this message off of the queue and the result will be displayed in our Console.

       

      image

      Conclusion

So far it seems pretty cool and it looks like this team is headed in the right direction.  Much like the AppFabric Apps CTP, there will be some gaps in the technology, and some bugs, as this is still just a technical preview.  If you have any feedback for the Service Bus team, or want some help troubleshooting a problem, a forum has been set up for this purpose.

      I am definitely looking forward to digging into this technology further – especially in the area of connecting to on-premise LOB systems such as SAP.  Stay tuned for more posts on these topics.

      Speaking at Prairie Dev Con West–March 2012

      December 2, 2011 1 comment

I have recently been informed that my session abstract has been accepted and I will be speaking at Prairie Dev Con West in March.  My session will focus on Microsoft’s cloud-based middleware platform: Azure AppFabric.  In my session I will be discussing topics such as Service Bus, AppFabric Queues/Subscriptions and bridging on-premise Line of Business systems with cloud-based integration.

You can read more about Prairie Dev Con West here and find more information about registration here.  Do note that there is some early bird pricing in effect.

       

      image

      Categories: AppFabric, Azure, Queue