Category Archives: C#

Azure App Service: My Experience with Auto-Scale Out

I was recently testing the automatic scaling capabilities of Azure App Service plans. I had a static website and a Web API running off the same Azure App Service plan. It was a Production S1 Plan.

The static website was small (less than 10MB) and the Web API exposed a single method which did some file manipulation on files up to 25MB in size. This had the potential to drive memory usage up as the files would be held in memory while this took place. I wanted to be sure that under load my service wouldn’t run out of memory and die on me. Could Azure’s ability to auto-scale handle this scenario and spin up new instances of the App Service dynamically when the load got heavy? The upsides if this worked are obvious; I wouldn’t have to pay the fixed price of a more expensive plan that provided more memory 100% of the time, instead I’d just pay for the additional instance(s) that get dynamically spun up when I need them and therefore the extra cost during those periods would be warranted.

As an aside before I get started, it’s worth pointing out that memory usage reporting works totally differently on the Dev/Test Free plan than on Production plans. I’d guess this has to do with the fact that the Free plan is a shared plan which doesn’t have its own dedicated resources.

What I noticed here is that if I ran my static website and Web API on the Dev/Test Free plan, my memory usage sat at 0% when idle. As soon as I changed the plan to a Production S1, memory sat at around 55% when idle.

Enabling scale out is really simple: it’s just a matter of setting the trigger(s) for when you want to scale out (create additional instances). I was also impressed with a few of the other options that gave fine-grained control over the sample and cool-down periods to ensure scaling would happen in a sensible and measured way.

Before configuring my scale out rules, I first wanted to check my test rig and measure the load it would put on a single instance so I knew at what level to set my scaling thresholds. This is how the service behaved with just the one instance (no scaling).

You can see that the test was going to consistently get both the CPU and Memory usage above 80%.

Next I went about configuring the scale out rules. Here I’ve set it to scale out if the average CPU Usage > 80% or the Memory Usage > 80%. I also set the maximum instances to 6.

I also liked the option to get notified and receive an email when scaling creates or removes an instance.

So did it work? Let’s see what happened when I started applying some load to both the static website and the Web API.

Before long I started getting emails notifying me that it was scaling up; each new instance resulted in an email like this:

These graphs show what happened as more and more load was gradually applied. The red box is before scaling was enabled and the green box shows how the system behaved as more load was applied and the number of instances grew from 1 to 6. Notice how, while average CPU and Memory usage dropped, the amount of Data Out and Data In during the green period was significantly higher? Not only was the average CPU and Memory usage across the instances lower, the service was able to process a much higher volume of requests.

I have to say, I was pretty impressed when I first watched all this happen automatically in front of my eyes. During testing I was also recording the response codes from every call I was making to the static website and Web API. Not a single request failed during the entire test.

But what happened when I stopped applying such a heavy load on the website and web API? Would it scale down just as gracefully? Read on for part two of this test to find out.

Microsoft Graph API, Throttling & SharePoint Lists/Libraries – HTTP 429 Error Code

When developing against the Microsoft Graph you may find yourself experiencing HTTP 429 error codes now that resource throttling is being implemented in different areas of the Graph.

I came up against a strange and somewhat misleading one this week which is worth being aware of if you are using the Graph to access SharePoint lists and libraries using the /sites/ area of the Graph.

I had a service running which started reporting HTTP 429 error codes. I read through all the latest published documentation to try and figure out how the throttling has been implemented and what the limitations are, to see which part of the code could be triggering the throttling. As you’ll find, the documentation is very non-committal and mostly serves to justify why there are no specific limits, but rather algorithms that dynamically determine the throttling based on a large number of dynamic criteria. All of this sounds really fancy and advanced, but it’s not very helpful when trying to identify what could be causing the throttling issue, or what limit your code is hitting.

Here’s the Microsoft documentation links which are well worth the read:

Microsoft Graph throttling guidance

Updated guidance around SharePoint web service identification and throttling

Avoid getting throttled or blocked in SharePoint Online

(Azure) Throttling pattern

Most of the above advice is summarised in this section taken from one of those official documents on handling throttling with the Graph API (Feb 2018):

Best practices to handle throttling

The following are best practices for handling throttling:

  • Reduce the number of operations per request.
  • Reduce the frequency of calls.
  • Avoid immediate retries, because all requests accrue against your usage limits.

When you implement error handling, use the HTTP error code 429 to detect throttling. The failed response includes the Retry-After field in the response header. Backing off requests using the Retry-After delay is the fastest way to recover from throttling because Microsoft Graph continues to log resource usage while a client is being throttled.

  1. Wait the number of seconds specified in the Retry-After field.

  2. Retry the request.

  3. If the request fails again with a 429 error code, you are still being throttled. Continue to use the recommended Retry-After delay and retry the request until it succeeds.
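The steps above can be sketched in C# using HttpClient. This is a minimal illustration under my own assumptions, not production code; in particular the 5-second fallback delay and the retry cap are choices I've made, not values from the guidance:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

static class GraphRetryHelper
{
    // Pull the server-suggested delay from the Retry-After header,
    // falling back to an assumed 5 seconds if the header is absent.
    public static TimeSpan GetRetryDelay(HttpResponseMessage response) =>
        response.Headers.RetryAfter?.Delta ?? TimeSpan.FromSeconds(5);

    // GET with back-off on HTTP 429, capped at maxAttempts so a request
    // that can never succeed doesn't retry forever.
    public static async Task<HttpResponseMessage> GetWithRetryAsync(
        HttpClient client, string url, int maxAttempts = 5)
    {
        for (int attempt = 1; ; attempt++)
        {
            HttpResponseMessage response = await client.GetAsync(url);
            if ((int)response.StatusCode != 429 || attempt == maxAttempts)
                return response;

            // Wait the number of seconds specified by the service, then retry.
            await Task.Delay(GetRetryDelay(response));
        }
    }
}
```

The cap on attempts matters for the scenario described later in this post, where retrying can never succeed.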

This advice all makes sense: if your code is making a lot of calls (think migrating SharePoint items or doing bulk updates), the Graph may tell you to slow down. When I was investigating my scenario, however, it just didn’t make sense that the code was generating enough traffic to worry the Graph (Office 365 service). The telemetry was telling me the code had made around 2,500 Graph calls spread over a period of 24 hours, and these were also spread across more than 100 users from a number of different Office 365 tenants.

Diving deeper into the telemetry, a pattern quickly emerged: the 429 errors were being returned in response to a Graph call that gets a list item based on a column value. Something along these lines:

{site-id}/lists/{list-id}/items?filter=Fields/Title eq 'testitem'

This call didn’t fail all the time; in fact it only seemed to get the 429 error in less than 10% of cases.

Having spent many hours over the past few years ‘working with’ SharePoint thresholds and query limitations on large lists and libraries, my mind started to move towards thinking that maybe the 429 error was a bit misleading, and the call was actually failing due to the Graph API hitting SharePoint threshold limitations.

So, off to prove my theory. I’ve got a library with just under 5,000 items (5,000 being the SharePoint list view threshold).


Using the Graph API Explorer I can make a call that queries this SharePoint library for a specific item matching on the Title column value being equal to “upload.log” (a file which I know exists in the SharePoint library).


As expected, the item is found and a success code 200 is returned along with the JSON payload in the response body shown above. Time to prove the theory: what if I now add 2 more files to the same document library and repeat the process?

After uploading 2 more files, the library settings now indicate that we have exceeded the list view threshold.


Now executing the same query in the Graph API Explorer gives us the 429 error code. Inspecting the response body, we can see the additional error code of “activityLimitReached” and the message “The application or user has been throttled”.


Why was this error misleading? Neither the error code nor the message specifically indicates that the issue is related to SharePoint thresholds. The documentation and best practice articles (linked at the start of this article) regarding this 429 response are written on the premise that the volume and frequency of calls is responsible for the error, and hence the guidance is to incrementally back off and keep retrying until you get success. This guidance is totally misguided in the case of hitting the underlying SharePoint threshold limitation, because the failure has nothing to do with the volume or frequency of calls you are making. The call will fail if it’s the only call you make all day, and no matter how many times you retry, it will always fail.


How to fix mouse cursor disappearing in Visual Studio & Visual Studio Code

This is a problem I have come across each time I build a new virtual development machine with Visual Studio on it. The problem has been around for a few years now and I always have to search around for the steps to fix it each time it catches me.

I’ve seen this issue in the following versions of Visual Studio and the resolution is the same and works for them all:

  • Visual Studio 2012
  • Visual Studio 2015
  • Visual Studio 2017
  • Visual Studio Code


The Problem

When using Visual Studio the mouse cursor flickers badly or totally disappears when the mouse pointer is in the code editing area of Visual Studio (as shown in the screenshot below).


Moving the mouse cursor outside of this area makes it visible again, and it seems that the mouse pointer is unaffected when using other applications and on the Windows desktop itself.

I’ve found that the problem is much more prevalent when accessing Visual Studio on another machine (e.g. a virtual development machine) via remote desktop.

The Solution

Thankfully the solution is quick and simple:

  • Open Control Panel | Appearance and Personalization | Personalization | Change mouse pointers
  • On the Pointers tab of the dialog change the Scheme to Windows Black (system scheme)


That’s it, your cursor should now be back and stable.


How to: Enable outlining (collapsible statement blocks) for C# code in Visual Studio

I’m often modifying existing C# code in Visual Studio and find myself trying to line up opening and closing braces and trying to figure out what level of nesting I’m currently at. It’s easy when the block fits on a single screen without scrolling, but dive into some complex logic where you’ve got plenty of nested if, else, switch, try and catch blocks, and it’s easy to get disoriented scrolling up and down trying to figure out the logic.

Yes, I can hear the code purists begging me to restructure the code and encapsulate logic away into smaller, more focused methods. I don’t disagree; in fact that might be the reason I’m looking at the code in the first place, but I need to understand the logic before I start pulling it apart.

Whatever the reason, I’m sure you’ve all been in the situation: why can’t I just collapse this switch block, or if block? Wouldn’t that make things simpler? Luckily the solution to this problem is simple – there’s a free Visual Studio extension called C# Outline and it’s available for Visual Studio 2010, 2012, 2013 & 2015.

This nifty extension provides outlining (expand/collapse) for all block elements that use curly braces { }, just like you get out-of-the-box for classes and methods. It’s simple and a massive time saver.


Detect if New Folders are allowed in SharePoint List/Library using Lists web service from remote application (works across SharePoint 2007, 2010, 2013)

Today I found myself tasked with finding a common method for detecting, from a remote client application, whether a SharePoint list/library had been configured to allow new folders. The method also had to work across all versions of SharePoint (well, at least 2007, 2010 and 2013, on-premise and in the cloud). This ruled out using the client side object model, so my investigation turned to web services (yes, they are deprecated in SP2013 so be careful using them going forward).

The Lists web service provides a few different ways to return the XML schema of one or more lists (e.g. GetList or GetListCollection).

Here’s a snippet of the XML you get back from these web service methods.

<List DocTemplateUrl="" DefaultViewUrl="/Lists/announce123/AllItems.aspx" MobileDefaultViewUrl="" ID="{F194025B-A0F2-4318-950A-9197AD8D2285}" Title="announce123" Description="" ImageUrl="/_layouts/15/images/itann.png?rev=23" Name="{F194025B-A0F2-4318-950A-9197AD8D2285}" BaseType="0" FeatureId="00bfea71-d1ce-42de-9c63-a44004ce0104" ServerTemplate="104" Created="20130510 01:18:39" Modified="20131015 03:57:05" LastDeleted="20130802 04:12:40" Version="3" Direction="none" ThumbnailSize="" WebImageWidth="" WebImageHeight="" Flags="603983880" ItemCount="6" AnonymousPermMask="0" RootFolder="" ReadSecurity="1" WriteSecurity="1" Author="9" EventSinkAssembly="" EventSinkClass="" EventSinkData="" EmailAlias="" WebFullUrl="/" WebId="8f69cd67-4cc9-42f4-b104-2fe1e2b7944e" SendToLocation="" ScopeId="fadcba6a-3b00-44b9-a813-db5dc5cf3858" MajorVersionLimit="0" MajorWithMinorVersionsLimit="0" WorkFlowId="" HasUniqueScopes="False" NoThrottleListOperations="False" HasRelatedLists="" Followable="False" AllowDeletion="True" AllowMultiResponses="False" EnableAttachments="True" EnableModeration="False" EnableVersioning="False" HasExternalDataSource="False" Hidden="False" MultipleDataList="False" Ordered="False" ShowUser="True" EnablePeopleSelector="False" EnableResourceSelector="False" EnableMinorVersion="False" RequireCheckout="False" ThrottleListOperations="False" ExcludeFromOfflineClient="False" CanOpenFileAsync="True" EnableFolderCreation="False" IrmEnabled="False" IsApplicationList="False" PreserveEmptyValues="False" StrictTypeCoercion="False" EnforceDataValidation="False" MaxItemsPerThrottledOperation="5000" xmlns="http://schemas.microsoft.com/sharepoint/soap/" />

After diligently looking through all of these attributes, it doesn’t seem like the “New Folders allowed” option is included. One attribute did catch my eye however… the “Flags” attribute. What was it for, and could it hold the secret I was after?

I went into SharePoint, toggled the Allow New Folders setting, reran my code to call the Lists web service, and grabbed the XML again to test whether anything had changed in the Flags attribute.

On my initial run the Flags attribute had a value of “603983880” (no new folders allowed); on my second run the Flags attribute had indeed changed to “67112968” (new folders allowed). Great, somehow this Flags attribute holds the key – but how do you make use of it?

My understanding of flags is that they work at an individual bit level; essentially every bit can have an on/off state, which means a flag can hold the state of many different variables. So, time to get binary! It’s been a few years since I last sat in a classroom and figured out binary/decimal conversions with a pencil and paper, so let’s just cheat – or rather, let’s use a much faster tool for the job.

I started up Windows Calc and put it into Programmer mode.


Next, set the calc to Dec(imal) mode and cut/paste in the initial value of the flag.


Now click on Bin(ary) to convert the number and we get:


This on its own gives us nothing, but if we repeat the steps with the second value of the flag (after toggling the Allow New Folders setting) we get the binary value:


Now we have something to work with: if you align these 2 values under each other you can see that just a single bit has changed.


That’s the bit that represents the “Allow New Folder” setting.

Before we dive into the code for the solution, I’ll explain why the values you see in the code don’t look like the binary numbers above. Binary is damn hard to read, very long to type, and prone to typing errors, so I’ve converted these binary numbers to hex in order to do the comparisons. When you convert these 2 numbers to hex (using calc again) you get:


As you can see, we still have our difference of a single digit in the sequence, but with far fewer digits overall.
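If you'd rather script the conversions than do them in Calc, a quick sketch using the two Flags values from above (the class name here is just for illustration):

```csharp
using System;

class FlagsConversionDemo
{
    static void Main()
    {
        ulong noFolders = 603983880;  // Flags value with Allow New Folders off
        ulong folders   = 67112968;   // Flags value with Allow New Folders on

        Console.WriteLine(noFolders.ToString("X")); // 24001008
        Console.WriteLine(folders.ToString("X"));   // 4001008

        // XOR isolates the single bit that differs between the two values.
        Console.WriteLine((noFolders ^ folders).ToString("X")); // 20000000
    }
}
```

The XOR result, 0x20000000, is exactly the bit we'll test for in the code below.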

So now all that’s left is the code that checks if this bit has been set or not.

// Detect if folders are allowed in this list/library - it's hidden in the Flags attribute.
// The 0x20000000 bit is set when new folder creation is disabled.
UInt64 flags = 0;
bool foldersAllowed = false;
string flagsStr = listNode.Attributes.GetNamedItem("Flags").InnerText;
if (UInt64.TryParse(flagsStr, out flags))
    foldersAllowed = ((flags & 0x20000000UL) == 0UL);
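Plugging the two Flags values observed earlier through the same bit test confirms the behaviour. A self-contained sketch of the check (the class and method names are mine, for illustration only):

```csharp
using System;

class FolderFlagDemo
{
    // The 0x20000000 bit is set when new folder creation is disabled,
    // so folders are allowed when the bit is clear.
    public static bool FoldersAllowed(ulong flags) =>
        (flags & 0x20000000UL) == 0UL;

    static void Main()
    {
        Console.WriteLine(FoldersAllowed(603983880)); // False - no new folders allowed
        Console.WriteLine(FoldersAllowed(67112968));  // True  - new folders allowed
    }
}
```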

How to set the height of a WinForms CheckedListBox to fit to dynamic contents without scrollbars

I recently encountered a problem trying to get the Windows Forms CheckedListBox control to resize its height to exactly fit its contents (items) without showing scrollbars.

Initially I looked at using the obvious PreferredHeight property, but it quickly became evident that the preferred height didn’t give me a usable value. It was out by a few pixels over ten items, and when I got up into hundreds of items it was way out.

I also had to contend with the fact that this code was going to run across XP, Vista and Windows 7/8, potentially across multiple DPI settings (100%, 125%, 150%).

In the end the solution wasn’t too bad.

// Explicitly set height to fit the items (guard against an empty list,
// where GetItemRectangle(0) would throw)
if (checkBoxCtrl.Items.Count > 0)
    checkBoxCtrl.ClientSize = new Size(
        checkBoxCtrl.ClientSize.Width,
        checkBoxCtrl.GetItemRectangle(0).Height * checkBoxCtrl.Items.Count);


The key concepts here are:

  • Set the ClientSize property rather than the Size property. ClientSize is just the internal area the items populate, so you don’t have to worry about padding and border widths applied via VisualStyles
  • All items in my list are the same height, so I can just measure the height of the first item and multiply by the number of items in the list
  • The GetItemRectangle method returns the rectangle of an item in the list, whose height takes into account different DPI settings and the padding/margin between items. Note: this is much simpler than trying to measure the graphic (checkbox glyph) or text of an item and the padding/margin between items.

How to Permanently Remove TFS Source Control Bindings from Visual Studio Solutions (VS2012)

I often have the need to distribute Visual Studio source code to external parties. Internally this source code is under source control in Team Foundation Server (TFS).

It would be nice if you could just take a copy of the source files and remove the source control binding (in Visual Studio). Unfortunately I haven’t found a way to do this. Even when Visual Studio says that all bindings have been removed, if you try to open up the solution you get an error message along the lines of “The solution you are opening is bound to source control on the following Team Foundation Server …”


Under earlier versions of Visual Studio I had used Visual SourceSafe for source control, together with a utility called VSSBindingRemover which did the job quite effectively. It removed all source control files within the solution and project directories and modified the solution and project files themselves to remove the source binding information.

My search for a similar tool for Visual Studio 2012 / TFS resulted in a utility available on CodePlex called VSUnbindSourceControl. Update: this project has now been moved to GitHub

The process for removing bindings is simple:

  • Ensure you don’t have the solution/projects open in Visual Studio
  • Copy your solution to a new directory (because the tool does modify files)
  • Run the utility from the command line: VSUnbindSourceControl.exe "d:\mysolution folder"

Once the tool has finished, all source control bindings will have been removed from the solution and project files.

How to encode a string for use in xml

The Problem

When you are constructing XML using the classes from the System.Xml namespace you really don’t need to worry about encoding characters; you simply set the value of an XmlAttribute and the rest is taken care of for you.

There are some instances, however, where you find yourself constructing a string representation of XML. In my case I was trying to set the content of a dynamic menu in the Office Ribbon. In its simplest form, what I needed to do was give Office a string containing the XML definition of all the buttons on my dynamic menu.

This is easy enough to do; you just need to be careful and ensure you write valid XML, right? Of course that’s right. But the bit that always gets me is knowing which characters need to be encoded (or escaped), and how to do it.

The Solution

If you’ve already gone to the overhead of creating an XmlDocument, or the performance cost of creating an XmlDocument in memory is not an issue, then I like the simplicity of this solution.

/// <summary>
/// Returns an encoded version of the string passed in that
/// is suitable for use in constructing valid XML
/// </summary>
/// <param name="stringToEncode">The string to encode</param>
/// <returns>The string with any reserved chars encoded for use in xml</returns>
public static string EncodeStringForUseInXml(string stringToEncode)
{
    XmlDocument doc = new XmlDocument();
    XmlElement element = doc.CreateElement("temp");
    element.InnerText = stringToEncode;
    return element.InnerXml;
}
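For example, round-tripping a string containing reserved characters shows XmlDocument doing the escaping for us. This is a self-contained version of the method above wrapped in a demo class (the class name and sample input are mine):

```csharp
using System;
using System.Xml;

class XmlEncodeDemo
{
    // Same technique as above: let XmlDocument handle the escaping.
    public static string EncodeStringForUseInXml(string stringToEncode)
    {
        XmlDocument doc = new XmlDocument();
        XmlElement element = doc.CreateElement("temp");
        element.InnerText = stringToEncode;
        return element.InnerXml;
    }

    static void Main()
    {
        // The ampersand and angle brackets are replaced with entities.
        Console.WriteLine(EncodeStringForUseInXml("Tom & Jerry <v1>"));
        // -> Tom &amp; Jerry &lt;v1&gt;
    }
}
```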