How I Enjoyed the Rock Paper Azure Competition

In mid-December, I saw an ad on StackOverflow.com and was immediately intrigued. “Rock, Paper, Azure!” was a contest run by Microsoft in which programmers design bots to compete in a modified game of Rock, Paper, Scissors. The bots had to be hosted on Microsoft’s cloud computing platform, Azure, so it is easy to see Microsoft’s motivation: give away some small prizes to entice developers to try and (hopefully) adopt Azure.

Although I had plenty to keep me busy leading up to Christmas, the Rock, Paper, Azure marketing worked on me. I figured I could spare an hour or two and write the best algorithm I could in that time. Besides, just competing, even with the simplest of bots, would enter me into the grand-prize drawing.

I was immediately reminded of a school project from an early Computer Science course at Ohio State. That contest pitted “bug bots” built by teams of students against each other. Each team started out with a handful of bugs on a large virtual checkerboard. A bug could “convert” another team’s bug by facing it and issuing the “bite” command; the bitten bug then joined the biting bug’s army. The game continued until one team had converted all the bugs. If I remember correctly, there were only a few possible commands:

  • “Detect” if an object (like a wall or another bug) was in front of it
  • “Move” forward 1 square
  • “Rotate” left or right
  • “Bite”

It may have evolved since then, but our bot did surprisingly well despite a very simple algorithm (sketched in code after the list):

  1. If something is in front, turn left, then bite.
  2. Otherwise, move forward, then bite.
  3. Repeat from Step 1.
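
In C#, the whole thing fits in a few lines. The IBugCommands interface below is my own stand-in for the course’s command set, not the actual API we used back then:

public interface IBugCommands
{
    bool Detect();      // is an object (wall or another bug) directly in front?
    void MoveForward(); // move forward one square
    void RotateLeft();
    void Bite();
}

public static class SimpleBugStrategy
{
    // One turn of our bug: turn when blocked, otherwise march, and always bite.
    public static void Step(IBugCommands bug)
    {
        if (bug.Detect())
            bug.RotateLeft();
        else
            bug.MoveForward();
        bug.Bite();
    }
}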

 

I’ve often wondered what additional strategy I would write into my bot if given another opportunity in such a competition. Rock, Paper, Azure was the challenge I was looking for.

Microsoft’s version of “roshambo” came with a few twists, such as the introduction of the dynamite and water balloon moves. Check out all the details and rules here. I liked that the game itself was simple, yet competing against other developers’ bots opened up many options for creative strategy. Additionally, I was extremely impressed with how simple it was to build the basic bot.

Game Rule Highlights:

  • Bots compete against each other by throwing one of Rock, Paper, Scissors, Dynamite, or Water Balloon
  • Normal rules apply, except that Dynamite beats everything but Water Balloon, and Water Balloon loses to everything but Dynamite (see the sketch after this list)
  • Each bot may only use Dynamite 100 times per match
  • The first bot to win 1000 rounds wins the entire match
  • Ties carry over, so the next round can be worth more than 1 win (similar to “skins” in golf)
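
Here is a minimal sketch of those resolution rules. The Move enum and Resolve helper are my own framing for illustration, not types from the contest SDK:

public enum Move { Rock, Paper, Scissors, Dynamite, WaterBalloon }

public static class Rules
{
    // Returns 1 if a beats b, -1 if b beats a, 0 on a tie (ties carry over).
    public static int Resolve(Move a, Move b)
    {
        if (a == b) return 0;
        if (a == Move.Dynamite) return b == Move.WaterBalloon ? -1 : 1;
        if (b == Move.Dynamite) return a == Move.WaterBalloon ? 1 : -1;
        if (a == Move.WaterBalloon) return -1; // loses to everything but Dynamite
        if (b == Move.WaterBalloon) return 1;
        // Normal Rock, Paper, Scissors rules.
        bool aWins = (a == Move.Rock && b == Move.Scissors)
                  || (a == Move.Scissors && b == Move.Paper)
                  || (a == Move.Paper && b == Move.Rock);
        return aWins ? 1 : -1;
    }
}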

 

It took me some iteration to come up with my eventual strategy, which turned out to be admittedly mediocre (98th place out of 162). I realized that my bot could keep track of the history of moves it had made, as well as the moves of my opponent. My plan was to detect whether my opponent was falling into a pattern. I was especially concerned about the end of the match, when we would both be desperately throwing dynamite to close it out. As you can see, my strategy had only a small amount of success.
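
The gist of the pattern detection looked roughly like this. It reuses the Move enum from the rules sketch above, and the method signature is hypothetical; the real contest SDK passes its own game-state types:

using System.Collections.Generic;

public class PatternBot
{
    private readonly List<Move> _opponentHistory = new List<Move>();

    // Called once per round with the opponent's previous move (if any).
    public Move MakeMove(Move? opponentLastMove)
    {
        if (opponentLastMove.HasValue)
            _opponentHistory.Add(opponentLastMove.Value);

        // If the opponent has thrown the same move twice in a row,
        // bet on a third repeat and counter it.
        int n = _opponentHistory.Count;
        if (n >= 2 && _opponentHistory[n - 1] == _opponentHistory[n - 2])
            return Counter(_opponentHistory[n - 1]);

        return Move.Rock; // no pattern detected; fall back to a default
    }

    private static Move Counter(Move predicted)
    {
        switch (predicted)
        {
            case Move.Rock: return Move.Paper;
            case Move.Paper: return Move.Scissors;
            case Move.Scissors: return Move.Rock;
            case Move.Dynamite: return Move.WaterBalloon;
            default: return Move.Rock; // anything beats a Water Balloon
        }
    }
}

A real bot would also have to budget its 100 sticks of dynamite for the endgame scramble mentioned above.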

Nonetheless, I thoroughly enjoyed my time creating and deploying my bot. I encourage Microsoft to keep finding clever ways to get developers interested in learning and using its development platforms. In this contest, I got to expand my mind, learn more about Azure, and even score a free t-shirt. Here’s to the next competition!

Surprising Costs of Windows Azure Hosted Services

This summer I had a few days to twiddle my thumbs between quitting my job and starting with my new employer, Unstoppable Software. Creative ideas ran wild inside my head, and the time off made me feel as if I had infinite time to make them tangible. I whipped up a few websites and wondered where I could host them cheaply, since they were built in ASP.NET MVC. Naturally, the first thing that came to mind was to host them on Windows Azure. After all, my BizSpark benefits included a reasonable amount of free service, and I wanted to learn more about how the hosting service worked.

At first, this turned out to be a great idea. I was able to quickly get my site up and running in a Production environment on Windows Azure with a custom domain. Best of all, hosting was free!

Getting a working site in Production ended up being my biggest mistake.

I soon realized that I wanted a second hosting environment so that I could test development changes without affecting the working application in Production. Windows Azure’s easy-to-use deployment interface suggested the creation of a Staging environment.

Azure Deployment Interface

The Staging site worked great. It allowed me to test with a different URL until I was satisfied with the results, then push my code to Production. Unfortunately, deploying to a second environment doubled my consumption of a key cost metric: Compute Hours!

A Compute Hour is an hour of application uptime multiplied by the number of server instances. It accrues regardless of the deployment’s state (Suspended versus Active) and environment (Production versus Staging), so with one instance in each of two environments, every clock hour bills as two Compute Hours. I was very surprised to discover this when I found a charge (below) for Microsoft Online on my credit card statement. Apparently, other developers found out the hard way as well.
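
Here is the back-of-the-envelope math, assuming one small instance per environment and the $0.12 per Compute Hour rate Azure published at the time:

static class ComputeHours
{
    // Compute Hours accrue per deployed instance, Suspended or Active,
    // Staging or Production alike.
    static double MonthlyCost(int instances)
    {
        const double ratePerComputeHour = 0.12; // USD, published rate at the time
        const int hoursPerMonth = 24 * 30;      // billed around the clock
        return instances * hoursPerMonth * ratePerComputeHour;
    }

    // MonthlyCost(1) = $86.40  (Production only)
    // MonthlyCost(2) = $172.80 (Production plus a suspended Staging instance)
}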

Azure Deployment Cost

I expected Compute Hours to accrue only in my Production environment. I certainly did not expect my billed rate to increase simply by following the best practice of using a Staging environment to test my application, especially when that environment was suspended whenever I was not testing. I had also incorrectly assumed that Compute Hours were only counted while my web application was processing requests (of which I was not receiving many). I had not really thought about the cost, since hosting was free at first. My recommendation to Microsoft is to make it more obvious to developers that the cost doubles when using an additional environment. I would also like to see a long-term Azure strategy for hosting low-traffic sites. Luckily, I found out about the increase in cost without running up a huge bill.

Implications of Windows Azure Container Types

In 7 Reasons I Used Windows Azure for Media Storage, I described the download process involved in streaming a large video through a Silverlight applet using Microsoft’s cloud offering. My scenario involved public containers storing large files (blobs) for a web application. Public containers are convenient because they can be accessed via a simple GET request. Unfortunately, that simplicity invites some unwanted behavior: because each file is reachable via a plain URL, any user on the web can link to it or download a copy for personal use.

If you are already using public containers, do not be alarmed: your storage is not entirely exposed. I tested my site by requesting a URL with the file name removed, and the result was a not-found response. I immediately breathed a sigh of relief. In other words, even public containers do not behave the way IIS would with the Directory Browsing setting enabled.

Example URL: http://{ApplicationName}.blob.core.windows.net/{Container}/

Still, for cases in which public containers are not satisfactory due to their openness, the alternative is to use private containers.

Private containers are similar to public containers and remain fairly simple to use. They require a unique key to be included in the GET request for stored files. This is extremely easy with the Azure SDK sample library, which abstracts away the details of what must be included in the request.

Effectively, the container type determines where the request for Azure blobs comes from. For public containers, the request comes from the client, because a simple URL fetches the file. In contrast, the request for private containers must come from the server: the server-side code embeds the key in the GET request, receives the blob, and processes it or delivers it to the client accordingly.

The obvious benefit of private containers being accessible only to server-side code is that security logic can live there, restricting blob access to specific users based on rules. It also makes it much more difficult, though still possible, to download files stored in private containers for personal use. The drawback is that streaming now passes through the server, greatly increasing the bandwidth consumed by the application.
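
Put concretely, the private-container flow implies an ASP.NET MVC action along these lines. FetchPrivateBlob is a hypothetical helper standing in for the SDK sample library’s signed request; only the pattern matters here:

using System.IO;
using System.Web.Mvc;

public class MediaController : Controller
{
    [Authorize] // the security logic now lives server-side
    public ActionResult Video(string name)
    {
        // The shared key never leaves the server; the blob is relayed
        // to the client, which is where the extra bandwidth comes from.
        Stream blob = FetchPrivateBlob("videos", name);
        return File(blob, "video/x-ms-wmv");
    }

    // Hypothetical helper: signs the GET request with the account's shared
    // key, as the Azure SDK sample library does on your behalf.
    private Stream FetchPrivateBlob(string container, string blobName)
    {
        throw new System.NotImplementedException();
    }
}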

As described above, cases can be made for both public and private containers. The best solution comes from weighing security requirements against bandwidth and development costs. Of course, there are ways to reap the benefits of both paradigms, but the restrictions above cover the “out of the box” web application scenario.

7 Reasons I Used Windows Azure for Media Storage

Windows Azure is one service Microsoft offers as part of its cloud computing platform. Earlier this year, I began leveraging one small piece of the platform on my side projects: “Blob Storage.” Essentially, I am just storing large media files on Microsoft’s servers. Below, I outline why I chose this option over storing the large files with my traditional web host.

General Reasons for Cloud Storage

  1. Stored files are backed up

  2. Workload scalability if traffic increases

  3. Cost scalability if traffic increases

Benefits of Microsoft’s Windows Azure

  • Easy to Learn for .NET Developers

As a pretty heavy Microsoft developer, I found it easier to learn: I already knew how to use the tooling and where to find learning resources.

  • Azure Cheaper than Competitors

According to The Thirsty Developer, Azure was cheaper than competitors.

  • Azure Discounts with Microsoft BizSpark program

Azure is heavily discounted for members of BizSpark, Microsoft’s startup program, which provides free software for a three-year term. BizSpark members get a big discount for 16 months, plus a continued benefit for holding an MSDN membership for as long as it remains current (presumably a minimum of three years).

  • Ease of use

Retrieving and creating stored files is extremely easy with public containers. One can fetch a blob with a simple GET against a REST URI.
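
For instance, with nothing but the .NET WebClient (the account, container, and file names below are placeholders):

using System.Net;

class PublicBlobDownload
{
    static void Main()
    {
        // A public blob is one HTTP GET away; no keys, no SDK required.
        using (var client = new WebClient())
        {
            client.DownloadFile(
                "http://myaccount.blob.core.windows.net/videos/intro.wmv",
                "intro.wmv");
        }
    }
}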

 

How this applied to me

Traditional hosting had a very low threshold for data storage (1 GB), above which adding more storage was very expensive. The same was true for bandwidth, though the threshold was higher (80 GB). With Azure, by contrast, the cost is directly proportional to what is actually stored and delivered, so it is much easier to calculate, and it becomes cheaper once I exceed those thresholds.

Using Azure for storage produces the following download process:

1. When a user requests a video file, he/she receives the web content and the Silverlight applet from my website.

2. Embedded in the web content is a link to a public video from Windows Azure storage.

3. Silverlight uses the link to stream the video content (see the sketch below).
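
Step 3 amounts to very little code on the Silverlight side; roughly this, with a placeholder URI:

using System;
using System.Windows.Controls;

public partial class VideoPage
{
    private void PlayVideo(MediaElement player)
    {
        // MediaElement progressively downloads and plays the blob
        // directly from Azure; my web host never sees the traffic.
        player.Source = new Uri(
            "http://myaccount.blob.core.windows.net/videos/intro.wmv");
        player.Play();
    }
}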

 

The link that my Silverlight applet uses to retrieve the file is a simple HTTP URI, and Silverlight streams the video to the client as it is received. This is nice behavior, especially because the file is downloaded directly to the client browser and does not incur additional bandwidth costs from my traditional web host for every download.

What this means in terms of cost

There are several metrics that increase cost as I scale my use of Windows Azure (see pricing below). Fortunately, Azure is much cheaper than my traditional web hosting provider.

Under my current plan, costing roughly $10 per month, my hosting provider allows me to use:

80 GB of bandwidth per month

1 GB of storage space

Each additional 5 GB of bandwidth per month incurs a $5 fee

Each additional 500 MB of storage above 1 GB incurs a $5 fee

Fees for Windows Azure uploads & downloads can be easily predicted using the metrics below.

Azure Pricing (Effective 02/01/2010)

Compute = $0.12 / hour (Not applicable to this example)

Storage = $0.15 / GB stored / month

Storage transactions = $0.01 / 10K

Data transfers = $0.10 in / $0.15 out / GB

An Example

Let’s say that I average 80 GB of bandwidth usage per month with a traditional hosting provider. Because I would be using my provider’s full allotment, that is the cheapest scenario per GB. Let’s assume that half of my $10 monthly payment covers bandwidth and the other half covers storage. In that case, my cost for bandwidth would be $5 / 80 GB, or about $0.06 / GB.

That may seem cheap, but remember that this price holds only up to 80 GB per month. If my website saw tremendous growth, to say 500 GB per month, I would be in real trouble. For the additional 420 GB, I would pay $420 per month ($5 for each extra 5 GB). My total monthly bill attributed to 500 GB of bandwidth would be $425, or $0.85 / GB.

In contrast, the Azure model is simple and linear. For the low-traffic case of 80 GB per month, it would actually be more expensive ($12 at $0.15 / GB). However, as bandwidth usage grows, the rates do not go through the roof: my total monthly bill for 500 GB of bandwidth would be $75 (assuming all outgoing traffic).
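
The two bandwidth cost curves from this example can be written down directly (a sketch using only the numbers above):

using System;

static class BandwidthCost
{
    // Traditional host: $5 of the plan buys the first 80 GB,
    // then $5 per additional 5 GB block.
    static double Traditional(double gb)
    {
        if (gb <= 80) return 5.0;
        return 5.0 + Math.Ceiling((gb - 80) / 5.0) * 5.0;
    }

    // Azure: flat $0.15 per outbound GB.
    static double Azure(double gb)
    {
        return gb * 0.15;
    }

    // Traditional(80)  = $5      Azure(80)  = $12
    // Traditional(500) = $425    Azure(500) = $75
}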

I could walk through a similar example for storage costs, but I think you see the point. In fact, storage would likely be even worse: since my website collects videos, the amount of storage used would probably grow faster than the bandwidth.

As you can see, Azure storage and bandwidth are much cheaper, especially after scaling past the initial allotment.

Note: Storage transactions are a negligible contributor to cost in comparison.

My First Experience with Microsoft Windows Azure

This week I received my Microsoft Windows Azure invitation token, so I began integrating it into a project of mine.  I wanted to use just the Blob Storage to upload and download large media files to the cloud.

At first, I followed this link as closely as possible to get up and running.  The walk-through seemed simple and helped me quickly understand how to use Blob Storage.  However, once I created the new Azure service in my existing solution, I was quickly led to make decisions that influenced the web project I had already created:

  1. In order to get the Azure processes to run in the background while my web application ran, I had to set the Cloud Service project as the startup project.
  2. A Cloud Service project cannot just run by itself; it requires a web role.  The simplest thing to do seemed to be to point this at my pre-existing web project.
  3. This all worked well at first.  My web project recognized my local Azure service and worked normally.  However, once I tried to upload files to my local Blob Storage, I received errors.  It behaved as if there were an issue with the Trust Level the web role runs under.

Since Azure is still in CTP (meaning free) and I have a brand new account (meaning no pre-existing files have been uploaded), my solution was to test my web upload against a deployed, in-“production” Blob Storage service.  I updated my web.config file and was fairly quickly able to upload and download files from blob storage.  I was rather impressed with how easy it was.

Unfortunately, this created a whole new set of concerns.  Since I am forced to use my production account for Blob Storage, how can I test in my local environment going forward?  Once Azure is launched (reportedly on 02/01/2010), any testing I do will begin to incur significant costs and it will also interfere with production data.

After some deep thinking, some more research, and stumbling across this MSDN link, I believe I backed my way into the solution Microsoft imagined in the first place.

Microsoft has allocated a specific development shared key for local blob storage.  Therefore, one only has to use these standard configuration settings in the local Blob Storage service (the ServiceConfiguration.cscfg file):

<ConfigurationSettings>
  <Setting name="AccountName" value="devstoreaccount1" />
  <Setting name="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==" />
  <Setting name="BlobStorageEndpoint" value="http://127.0.0.1:10000/" />
</ConfigurationSettings>

The piece of the puzzle I was missing is that this Blob Storage service must run from outside my main solution.  Therefore, I put the Blob Storage service project in its own solution and attached it to a dummy web role.  When I run it, it starts the development Azure services and keeps them running in the background.

Attaching to the Blob Storage service from my web project is easy.  I just make sure that I have the configuration below set in my web.config file.

<configuration>
  <appSettings>
    <add key="AccountName" value="devstoreaccount1" />
    <add key="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==" />
    <add key="BlobStorageEndpoint" value="http://127.0.0.1:10000" />
  </appSettings>
</configuration>

Now I can test uploads and downloads on my local machine.

Special Note:  The Windows Azure development environment uses a different URI format to access files.  In production, files stored in public containers can be accessed via this format:

http://<account-name>.blob.core.windows.net/<resource-path>

However, in the development environment, a different format is used:

http://<local-machine-address>:<port>/<account-name>/<resource-path>
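
A small helper makes the difference obvious (the container and file names are placeholders; devstoreaccount1 and port 10000 come from the configuration above):

static class BlobUris
{
    static string Production(string account, string resourcePath)
    {
        return string.Format("http://{0}.blob.core.windows.net/{1}",
            account, resourcePath);
    }

    static string Development(string account, string resourcePath)
    {
        return string.Format("http://127.0.0.1:10000/{0}/{1}",
            account, resourcePath);
    }

    // Production("myaccount", "videos/intro.wmv")
    //   -> http://myaccount.blob.core.windows.net/videos/intro.wmv
    // Development("devstoreaccount1", "videos/intro.wmv")
    //   -> http://127.0.0.1:10000/devstoreaccount1/videos/intro.wmv
}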

See this MSDN Article for more information.