Azure Mobile Services–Update Entity results in 400 Bad Request

While performing some local testing on an Azure Mobile Services (C# backend) app, I kept getting a 400 Bad Request whenever I tried to perform an update to an Entity:

await MobileService.GetTable<MyEntity>().UpdateAsync(myEntity);

I ensured that my entity object was populated correctly and that I was providing an Id as part of the request, but I was still getting the following error:

Microsoft.WindowsAzure.MobileServices.MobileServiceInvalidOperationException: The request could not be completed. (Bad Request)
   at Microsoft.WindowsAzure.MobileServices.MobileServiceHttpClient.<ThrowInvalidResponse>d__18.MoveNext()
   at Microsoft.WindowsAzure.MobileServices.MobileServiceHttpClient.<SendRequestAsync>d__1d.MoveNext()
   at Microsoft.WindowsAzure.MobileServices.MobileServiceHttpClient.<RequestAsync>d__4.MoveNext()
   at Microsoft.WindowsAzure.MobileServices.MobileServiceTable.<>c__DisplayClass21.<<UpdateAsync>b__20>d__23.MoveNext()
   at Microsoft.WindowsAzure.MobileServices.MobileServiceTable.<TransformHttpException>d__51.MoveNext()
   … (System.Runtime.CompilerServices.TaskAwaiter frames omitted)

Not the most descriptive error, hence my decision to write this post.  The root cause was in an entity that I had declared in my DataObjects class within my C# backend project:

public class MyEntity : EntityData
{
    public string Id { get; set; }   //declaring Id here is the problem; EntityData already defines it
    // ... other properties
}

As soon as I removed the Id property from this class, my updates were successful.

In the end it was a silly error on my part: in your Mobile App (client-side) model classes you do need to declare this Id as part of your class, but that is not the case with your backend classes, since EntityData already provides it.  I have several other entities in this project and knew that you should not declare the Id, but this one slipped through the cracks.
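Here is a minimal sketch of the distinction, using a hypothetical Name property; the backend class inherits Id (along with CreatedAt, UpdatedAt, Version and Deleted) from EntityData, while the client-side model declares its own Id:

//Backend (C# Mobile Services DataObjects): do NOT declare Id here, EntityData already provides it
public class MyEntity : EntityData
{
    public string Name { get; set; }   //hypothetical property
}

//Client (Mobile App model): here you DO declare Id so the SDK can address the record on update
public class MyEntity
{
    public string Id { get; set; }
    public string Name { get; set; }   //hypothetical property
}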

It is also worth noting that during this time I could successfully insert records even though this Id was declared. 

Using the Azure API Management–Management API

I recall the first time I saw this demoed at the MVP Summit last fall and wondered how, or if, I would use it.  My thought process at the time was that Microsoft had already put together nice, functional Administration and Developer portals, so why would I want to call these APIs myself?

Fast-forward about 8 months: I now have a ‘Middleware’ support team and other internal stakeholders who are interested in some of this data.  While I could give all of these users access to the portals I just mentioned, it would feel like just another website that someone has to remember credentials for.  It would also be very disconnected from the other tooling that the Middleware team uses to support interfaces (BAM, Exception Portal, BizTalk360, etc.).  In addition to BizTalk, we also have other Azure services that participate in some API interactions, including Web APIs, SQL Azure and Service Bus topics.  With all of these disparate sites and services, it was time to build an “Integration Portal” where we could consume these external data sources (API Management, SQL Azure, Service Bus) and then link to the other BizTalk tools.

Perhaps I will get deeper into this Integration Portal solution in the future, but for now let’s focus on the Azure API Management – Management API and how we can consume some of the Analytics that we would ordinarily see in our Admin/Developer portal.

Security and Authorization

By default, the Management API is disabled within your API Management tenant. Within the API Management Portal, click on the Security label and then the API Management REST API tab to enable it.  Within this tab:

  • Check the Enable API Management REST API checkbox
  • Set an appropriate Expiry date for your token
  • Click Generate Token button
  • Copy your Authorization header token

image

Building the application

For the purposes of this blog post I am going to build a simple ASP.NET MVC project where I will display my API Management analytics data.  I am not going to go through setting up the MVC project in much detail, but here is a link to a good tutorial.

Within this MVC project I am going to create a new Controller called APIAnalytics and will use a view called Index to display my data.

image

Below is the code that I have included within my Index() method.  You will notice this is marked as an async method as we are making our API Call using a GetAsync method. I have added comments within the code to provide some additional context.

// GET: APIAnalytics
public async Task<ActionResult> Index()
{
    ViewBag.Message = String.Format("API Management calls for today: {0}", DateTime.Now.ToShortDateString());

    //Note that all Azure timestamps are in UTC.  We will use this value as a predicate so that we only return data for the current day
    string currentDate = DateTime.Today.ToUniversalTime().ToString("s");

    //Notice the OData filter where we restrict based upon timestamp
    string url = String.Format("https://<your_tenant_name>.management.azure-api.net/reports/byProduct?api-version=2014-02-14&$filter=timestamp ge datetime'{0}'", currentDate);

    using (System.Net.Http.HttpClient client = new System.Net.Http.HttpClient())
    {
        client.BaseAddress = new Uri(url);
        client.DefaultRequestHeaders.Add("Authorization", "<your_auth_token_from_api_management_portal>");

        client.DefaultRequestHeaders.Accept.Clear();
        client.DefaultRequestHeaders.Accept.Add(new System.Net.Http.Headers.MediaTypeWithQualityHeaderValue("application/json"));

        System.Net.Http.HttpResponseMessage response = await client.GetAsync(url);
        if (response.IsSuccessStatusCode)
        {
            var data = await response.Content.ReadAsStringAsync();

            //We want some typed data so that we can bind to our GridView control. Look for the "APIStats" model class below
            var table = Newtonsoft.Json.JsonConvert.DeserializeObject<APIStats>(data);

            //Within our view we will use a GridView control
            System.Web.UI.WebControls.GridView gView = new System.Web.UI.WebControls.GridView();

            //We are going to wire up a data binding event so we can manipulate column headers, etc.
            gView.RowDataBound += gView_RowDataBound;
            gView.DataSource = table.value;
            gView.DataBind();
            gView.AllowSorting = true;
            gView.AutoGenerateColumns = false;

            gView.AlternatingRowStyle.BackColor = System.Drawing.Color.Azure;
            gView.Width = System.Web.UI.WebControls.Unit.Percentage(90);

            //Render the GridView to a string so the view can emit it with Html.Raw
            using (System.IO.StringWriter sw = new System.IO.StringWriter())
            {
                using (System.Web.UI.HtmlTextWriter htw = new System.Web.UI.HtmlTextWriter(sw))
                {
                    gView.RenderControl(htw);
                    ViewBag.ReturnedData = sw.ToString();
                }
            }
        }
    }

    return View();
}

 

//Here is our data binding event

protected void gView_RowDataBound(object sender, GridViewRowEventArgs e)
{
    if (e.Row.RowType == DataControlRowType.Header)
    {
        e.Row.BackColor = System.Drawing.Color.LightGray;
        e.Row.Cells[0].Text = "API Product";
        e.Row.Cells[0].Width = 300;
        e.Row.Cells[1].Text = "Success Count";
        e.Row.Cells[2].Text = "Block Count";
        e.Row.Cells[3].Text = "Failed Count";
        e.Row.Cells[4].Text = "Other Count";
        e.Row.Cells[5].Text = "Total";
        e.Row.Cells[6].Text = "Bandwidth";
        e.Row.Cells[7].Text = "API Avg Time";
        e.Row.Cells[8].Text = "API Min Time";
        e.Row.Cells[9].Text = "API Max Time";
        e.Row.Cells[10].Text = "Svc Avg Time";
        e.Row.Cells[11].Text = "Svc Min Time";
        e.Row.Cells[12].Text = "Svc Max Time";
    }
    else
    {
        try
        {
            //Round the timing values so that we are not displaying so many decimal places
            e.Row.Cells[7].Text = Math.Round(System.Convert.ToDouble(e.Row.Cells[7].Text), 1).ToString();
            e.Row.Cells[8].Text = Math.Round(System.Convert.ToDouble(e.Row.Cells[8].Text), 1).ToString();
            e.Row.Cells[9].Text = Math.Round(System.Convert.ToDouble(e.Row.Cells[9].Text), 1).ToString();
            e.Row.Cells[10].Text = Math.Round(System.Convert.ToDouble(e.Row.Cells[10].Text), 1).ToString();
            e.Row.Cells[11].Text = Math.Round(System.Convert.ToDouble(e.Row.Cells[11].Text), 1).ToString();
            e.Row.Cells[12].Text = Math.Round(System.Convert.ToDouble(e.Row.Cells[12].Text), 1).ToString();
        }
        catch (Exception)
        {
            throw;
        }
    }
}

 

//Here is our APIStats Model Class

public class APIDetail
{
    public string name { get; set; }

    public int callCountSuccess { get; set; }
    public int callCountBlocked { get; set; }
    public int callCountFailed { get; set; }
    public int callCountOther { get; set; }
    public int callCountTotal { get; set; }
    public int bandwidth { get; set; }
    public double apiTimeAvg { get; set; }
    public double apiTimeMin { get; set; }
    public double apiTimeMax { get; set; }
    public double serviceTimeAvg { get; set; }
    public double serviceTimeMin { get; set; }
    public double serviceTimeMax { get; set; }
}

public class APIStats
{
    public List<APIDetail> value { get; set; }
    public int count { get; set; }
    public object nextLink { get; set; }
}

//Lastly here is our Index.cshtml view

@{
ViewBag.Title = "Index";
}

<h2>Index</h2>
@ViewBag.Message

@Html.Raw(ViewBag.ReturnedData)

Running Application

If we go ahead and run our application, we will see the following web page.  These are all of the API Products that exist within my tenant.  As you can see, I didn't have any calls to my Auto Insurance APIs and was instead just trying out my BizTalk Boot Camp products.

Within the unlimited tier you will see a very large API Max Time.  The reason for that is that I didn't actually have the API backend up during the initial call. On my limited product you will see that I have a Block Count of 3.  The reason for this is that I have rate limiting enabled on that product, and as a result some calls were blocked.

image
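For context, the rate limiting mentioned above is configured with the rate-limit policy in the inbound section at the product scope.  A minimal sketch (the calls and renewal-period values here are illustrative, not necessarily the ones I used) looks like this:

<policies>
  <inbound>
    <base />
    <!-- Allow at most 5 calls per 60 seconds per subscription (illustrative values) -->
    <rate-limit calls="5" renewal-period="60" />
  </inbound>
</policies>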

Conclusion

The Azure API Management – Management API is quite rich and allows organizations to create custom dashboards or integrate this data with related data from other sources.

Within this blog post,  I have only demonstrated one type of call that is available within this API. For more details and to discover the different APIs, please refer to the following web page.

Azure API Management–IP Whitelisting

When implementing API Management solutions, it is a common practice to use IP Whitelisting when interacting with certain trading partners.  The idea is that only traffic originating from a specific IP Address (or range) can call your API proxy.  This usually isn't used on its own, but is combined with other techniques to reduce your attack surface.

Trying to manage IP Whitelisting using firewalls and reverse proxies can sometimes be a complex and messy endeavor, but it is pretty straightforward in Azure API Management.

In my demo, I added the Restrict caller IPs policy at the product level in the inbound section.

image

Once the policy has been added, you can add a specific IP Address or a range if the trading partner has multiple servers that you want to communicate with.  The IP Addresses below are obviously fictitious.

image
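For reference, here is roughly what the resulting Restrict caller IPs policy looks like in the policy editor; the addresses below are placeholders, not real values:

<inbound>
  <base />
  <!-- Only allow calls from the whitelisted address and range (placeholder values) -->
  <ip-filter action="allow">
    <address>192.0.2.34</address>
    <address-range from="198.51.100.0" to="198.51.100.255" />
  </ip-filter>
</inbound>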

Once enabled, if you try to call the API from an IP Address that has not been whitelisted you will get the following error:

image

 

You can find more information about this and other Azure API Management policies here.

Logic Apps: Integrating Custom SharePoint Lists and Salesforce

I have posted a little tutorial video that describes consuming data from a SharePoint Server Custom List and using that data to create new Contacts in Salesforce. The SharePoint Connector takes advantage of the Hybrid Connection capability, which allows messages to flow between the Azure cloud and an on-premises system without requiring ports to be opened in the firewall.

 

Introduction to Azure API Management: Hands-on Lab

I have recently returned from Charlotte where I had a couple of Azure API Management sessions at the BizTalk Bootcamp.  The first presentation was an Introduction to the API Economy, API Management, Azure API Management and how you can use Azure API Management and BizTalk to introduce some agility in your organization.  This was the same session that I delivered at the BizTalk Summit in London.  BizTalk360, who hosted that event, recorded all sessions and started to publish both the talks and the slide decks.  I don’t want to disrupt what they are doing over there so I would encourage you to check out their conference page where my presentation should be available soon.

The second session that I delivered in Charlotte was a Hands-on Lab.  Attendees were encouraged to create an Azure API Management instance in advance of arriving, or during a break (it only takes about 15 minutes to provision).  I then published an ASP.NET Web API in Azure and each attendee had the ability to:

  • Create 2 API Products
  • Create an API and 3 Operations
  • Publish their Products
  • Test the API through the Developer Portal
  • Configure a Rate Limiting scenario

The scenarios are pretty simple, but I was pleased to see that many people who had no previous experience with API Management were able to get an API under management in only 45 minutes.

I will keep this API available in the interim, but if I run into any issues or abuse I may decide to take it down.

Also note that I provisioned my API Management solution many months ago and it looks like there have been some subtle changes to the UI in the Azure API Management Portal.  None of these changes should get in the way of walking through the scenarios but I figured I would bring this up now to avoid further confusion.

Azure App Service Demo Scenario–Part 1

While recording is a lot of work, I find that it is a very useful channel for learning. As I learn more about the new Azure App Service I will be posting some short demonstrations and walkthroughs.

The first post in this series is a quick Logic App demo that includes Twitter and Dropbox integration.  The inspiration for the demo comes from the App Service Documentation which can be found here.

Azure–Service Bus Queues in Worker Roles

 

Another nugget of information that I picked up at TechEd North America is a new template that ships as part of the Azure SDK 1.7 called Worker Role with Service Bus Queue. 

image

What got me interested in this feature is some of the work that I did last year with the AppFabric Applications (Composite Apps) CTP.  I was a big fan of that CTP as it allowed you to wire up different cloud services rather seamlessly.  That CTP has been officially shut down, so I can only assume that this template was introduced to address some of the problems that the AppFabric Applications CTP sought to solve.

 

The Worker Role with Service Bus Queue template works in both Visual Studio 2010 and Visual Studio 2012 RC.  For this post I am going to be using 2012 RC.

I am going to use a similar scenario to the one that I used in the AppFabric Applications CTP.  I will build a simple web page that will allow a “customer” to populate a power outage form.  I will then submit this message to a Service Bus Queue and have a Worker Role dequeue it. For the purpose of this post I will simply write a trace event to prove that I am able to pull the message off of the queue.

 

Building the Application

  • Create a new project by clicking on File (or is it FILE) – New Project
  • Select the Cloud template category and you will see a blank pane with no template available to be selected.  The Azure SDK is currently built on top of .NET 4.0, not 4.5, so we need to select .NET Framework 4.

image

  • We now need to select the Cloud Services that will make up our solution.  In my scenario I am going to include an ASP.NET Web Role and a Worker Role with Service Bus Queue.

image

Note: we do have the opportunity to rename these artifacts by hovering over the label and then clicking on the pencil. This was a gap that existed in the old AppFabric Apps CTP.  After renaming my artifacts my solution looks like this:

image

  • I want to send and receive a strongly typed message so I am going to create a Class Library and call it CustomerEntity.

image

  • In this project I will simply have one class called Customer with the following properties:

namespace CustomerEntity
{
    public class Customer
    {
        public string Address { get; set; }
        public string City { get; set; }
        public string State { get; set; }
    }
}

 

  • I will then add a reference in both the Web Project and Worker Role projects to this CustomerEntity project.

 

  • Within the PowerOutageWeb project clear out all of the default markup in the Default.aspx page and add the following controls.

<h3>Customer Information:</h3>
Address: <asp:TextBox ID="txtAddress" runat="server"></asp:TextBox><br />   
City: <asp:TextBox ID="txtCity" runat="server"></asp:TextBox> <br />
State: <asp:TextBox ID="txtState" runat="server"></asp:TextBox><br />
<asp:Button ID="btnSubmit" runat="server" Text="Submit"  OnClick="btnSubmit_Click" />
<asp:Label ID="lblResult" runat="server" Text=""></asp:Label>

 

  • Also within the PowerOutageWeb project we need to add references to the Service Bus Assembly:  Microsoft.ServiceBus.dll and Runtime Serialization Assembly: System.Runtime.Serialization.dll

 

  • We now need to add the following using statements:

using CustomerEntity;
using Microsoft.WindowsAzure;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

 

  • Next we need to provide a click event for our submit button and then include the following code:

protected void btnSubmit_Click(object sender, EventArgs e)
{
    Customer cs = new Customer();
    cs.Address = txtAddress.Text;
    cs.City = txtCity.Text;
    cs.State = txtState.Text;

    const string QueueName = "PowerOutageQueue";

    //Get the connection string from Configuration Manager
    string connectionString = CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");

    var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

    //Check to see if the Queue exists; if it doesn't, create it
    if (!namespaceManager.QueueExists(QueueName))
    {
        namespaceManager.CreateQueue(QueueName);
    }

    MessagingFactory factory = MessagingFactory.CreateFromConnectionString(connectionString);

    //Create Queue Client
    QueueClient myQueueClient = factory.CreateQueueClient(QueueName);

    BrokeredMessage bm = new BrokeredMessage(cs);

    //Send Message
    myQueueClient.Send(bm);

    //Update Web Page
    lblResult.Text = "Message sent to queue";
}

 

  • Another new feature in this SDK that you may have noticed is the CreateFromConnectionString method that is available on the MessagingFactory and NamespaceManager classes.  This allows us to retrieve our configuration settings from our project properties page.  To access the project properties, right-click on the particular role and then select Properties.  Next, click on Settings, where you will find key/value pairs.  The key that we are interested in is Microsoft.ServiceBus.ConnectionString and our value is:

Endpoint=sb://[your namespace].servicebus.windows.net;SharedSecretIssuer=owner;SharedSecretValue=[your secret]

  • Since both our Web and Worker Roles will be accessing the Queue, we will want to ensure that both configuration files have this entry included.  This will allow our code to connect to the Service Bus namespace where our Queue can be found.  If we edit this property in these locations, then we do not need to modify the Cloud.cscfg and Local.cscfg configuration files ourselves because Visual Studio will take care of that for us.

image

  • Next we want to shift focus to the Worker Role and edit the WorkerRole.cs file.  Since we are going to be dequeuing our typed Customer message, we want to include a reference to its namespace:

    using CustomerEntity;

  • Something that you probably noticed when you opened up the WorkerRole.cs file is that there is already some code written for us.  We can leverage most of it but can delete the code that is highlighted in red below:

image

  • Where we deleted this code, we will want to add the following:

Customer cs = receivedMessage.GetBody<Customer>();
Trace.WriteLine(receivedMessage.SequenceNumber.ToString(), "Received Message");
Trace.WriteLine(cs.Address, "Address");
Trace.WriteLine(cs.City, "City");
Trace.WriteLine(cs.State, "State");
receivedMessage.Complete();

In this code we are going to receive a typed Customer message and then simply write out the contents of the message using the Trace utility.  If we wanted to save this information to a database, this would be a good place to write that code.
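For context, here is a rough sketch of how those lines sit inside the template's Run() method; treat the surrounding loop and error handling as an approximation of what the 1.7 template generates, not an exact copy:

public override void Run()
{
    Trace.WriteLine("Starting processing of messages");

    while (!IsStopped)
    {
        try
        {
            //Client is the QueueClient that the template creates in OnStart()
            BrokeredMessage receivedMessage = Client.Receive();

            if (receivedMessage != null)
            {
                //Receive the typed Customer message and trace its contents
                Customer cs = receivedMessage.GetBody<Customer>();
                Trace.WriteLine(receivedMessage.SequenceNumber.ToString(), "Received Message");
                Trace.WriteLine(cs.Address, "Address");
                Trace.WriteLine(cs.City, "City");
                Trace.WriteLine(cs.State, "State");

                //Remove the message from the queue
                receivedMessage.Complete();
            }
        }
        catch (MessagingException ex)
        {
            if (!ex.IsTransient)
            {
                Trace.WriteLine(ex.Message);
                throw;
            }

            //Transient error: back off briefly and try again
            Thread.Sleep(10000);
        }
    }
}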

  • We also want to make sure that we update the name of the Queue so that it matches the name that we specified in the Web Role:

// The name of your queue
const string QueueName = "PowerOutageQueue";

 

Creating our Queue

We have a few options when it comes to creating the Queue that this solution requires. The code in our ASP.NET Web page code-behind will take care of this for us, so that method is sufficient, but perhaps we want a design-time alternative that lets us specify some more advanced features.  In that case we have a few options:

  • Using the http://www.windowsazure.com portal.
  • Similarly we can use the Service Bus Explorer tool that was written by Paolo Salvatori.  Steef-Jan Wiggers has provided an in-depth walkthrough of this tool so I won't go into more detail here. http://soa-thoughts.blogspot.ca/2012/06/visual-studio-service-bus-explorer.html
  • As part of the Azure 1.7 SDK release, a Visual Studio Service Bus Explorer is now included.  It is accessible from the Server Explorer view from within Visual Studio.  Using this tool we can perform some functions like:
    • Creating Queues/Topics
    • Set advanced properties: Queue Size, Time to Live, Lock Duration etc
    • Send and Receive Test Messages
    • Get the current Queue Depth
    • Get our Service Bus Connection string

image

 

Any of these methods will work.  As I mentioned earlier, if we do nothing, the code will take care of it for us.  If you do want to create the queue from code while still setting the advanced properties mentioned above, a quick sketch follows.
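Purely as an illustration (this is not part of the template or the walkthrough above, and the property values are arbitrary examples), the same advanced properties can be set from code by passing a QueueDescription to the NamespaceManager:

//Create the queue with a few advanced properties instead of the defaults
string connectionString = CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");
var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

if (!namespaceManager.QueueExists("PowerOutageQueue"))
{
    QueueDescription description = new QueueDescription("PowerOutageQueue")
    {
        MaxSizeInMegabytes = 1024,                        //queue size
        DefaultMessageTimeToLive = TimeSpan.FromDays(1),  //time to live
        LockDuration = TimeSpan.FromSeconds(30)           //lock duration
    };

    namespaceManager.CreateQueue(description);
}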

Testing

We are just going to test our code locally, with the exception of our Service Bus Queue, which is going to be created in Azure.  To test our example:

  • Hit F5 and our website should be displayed

image

  • Since we have Trace statements included in our Worker Role code, we need to launch the Compute Emulator.  To do this, right-click on the Azure icon in your taskbar and select Show Compute Emulator UI.

image

  • A “Console”-like window will appear.  We will want to click on the ServiceBusWorker label.

image

  • Switch back to our Web Application, provide some data and click the Submit button.  We should see that our results label is updated to indicate that our message was sent to the queue.
  • If we switch back to our Compute Emulator we should discover that our message has been dequeued.

image

 

Conclusion

While the experience is a little different than that of the AppFabric Applications CTP, it is effective, especially for developers who may not be all that familiar with “integration”.  This template provides a great starting point and allows them to wire up a Service Bus queue to their Web Application very quickly.