
Tuesday, October 22, 2013

Geotopia: additional features and sending emails

In the last blog post I explained how to use Windows Azure Active Directory and the Windows Azure Cache Service. This post dives a bit deeper into these concepts and also adds SendGrid to the solution in order to send users an email with their temporary password.

The WebAPI Controller will perform the following steps:

1. Create a user in WAAD by using Microsoft.WindowsAzure.ActiveDirectory.GraphHelper. The following snippet achieves this goal. It also generates a temporary password based on a Guid.

Note: your tenant ID can be found on the Windows Azure portal. Go to your application in the directory screen on the portal. Click View Endpoints and you will see a list of endpoints. When you look at your OAuth 2.0 token endpoint you will see a URL of the following shape, with your tenant ID in the path:

https://login.windows.net//oauth2/token?api-version=1.0

//add the user to Windows Azure Active Directory
            string clientId = CloudConfigurationManager.GetSetting("ClientId").ToString();
            string password = CloudConfigurationManager.GetSetting("ClientPassword").ToString();
            // get a token using the helper
            AADJWTToken token = DirectoryDataServiceAuthorizationHelper.GetAuthorizationToken("", clientId, password);
            // initialize a graphService instance using the token acquired from previous step
            DirectoryDataService graphService = new DirectoryDataService("", token);

            User newWAADUser = new Microsoft.WindowsAzure.ActiveDirectory.User();
            newWAADUser.accountEnabled = true;
            newWAADUser.displayName = user.UserName;
            newWAADUser.mailNickname = user.UserName;
            PasswordProfile pwdProfile = new PasswordProfile();
            pwdProfile.forceChangePasswordNextLogin = true;
            pwdProfile.password = Guid.NewGuid().ToString("N").Substring(1, 10) + "!";
            newWAADUser.userPrincipalName = user.UserName + "@geotopia.onmicrosoft.com";
            newWAADUser.passwordProfile = pwdProfile;
            graphService.AddTousers(newWAADUser);
            var response = graphService.SaveChanges();

I registered "FirstUser" as being the user with an emailaddress I own. As you can see in the next figure, the user is added to the Windows Azure Active Directory.


2. An email is sent to the user with his/her temporary password, which was generated in step 1. For sending emails, I use the Windows Azure add-on SendGrid, which can be easily configured.

//send email to user by using SendGrid
            SendGrid myMessage = SendGrid.GetInstance();
            myMessage.AddTo(user.EmailAddress);
            myMessage.From = new MailAddress("info@geotopia.com", "Geotopia Administrator");
            myMessage.Subject = "Attention: Your temporary password for Geotopia";
            myMessage.Text = "Your username on Geotopia is:" + user.UserName + "\n\r";
            myMessage.Text += "Temporary password:" + pwdProfile.password + "\n\r";
            myMessage.Text += "\n\r";
            myMessage.Text += "The first time you sign in with your temporary password, you need to change it.";

            // Create credentials, specifying your user name and password.
            var credentials = new NetworkCredential("", "");

            // Create an SMTP transport for sending email.
            var transportSMTP = SMTP.GetInstance(credentials);

            // Send the email.
            transportSMTP.Deliver(myMessage);

After this, I get an email!


3. The user is added to the neo4j graph db. This snippet was already shown in the previous blog post.
4. The user is added to the Windows Azure Cache. This snippet was also shown in the previous blog post.

So, with everything in place, I can now finalize the search window on Geotopia to look for users and start following them. I will blog about this feature in the next few days.

Happy coding!


Monday, October 21, 2013

Geotopia: searching and adding users (Windows Azure Cache Service)

Now that we are able to sign in and add geotopics, the next step is to actually find users and follow them. The first version of Geotopia will allow everybody to follow everybody; an authorization layer will be added in the future (users need to allow you to follow them, after all).

For fast access and fast search, I decided to use the Windows Azure Cache Service preview to store a table of available users. An entry in the cache is created for everyone that signed up.

First of all, I created a new cache on the Windows Azure portal.


Azure now has a fully dedicated cache up and running for me. Every cache you create comes with a default (named) cache that is used when no additional information is provided.

The next step is to get the necessary libraries and add them to my webrole. Use the NuGet package manager and search for Windows Azure Caching. This adds the right assemblies and modifies the web.config: it adds a config section (dataCacheClients) to the configuration file.

Now with the cache in place and up and running, I can start adding entries to the cache when somebody signs up and make him/her available in the search screen.
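The snippets below assume a DataCache instance called cache. A minimal sketch of how to obtain one (the GeotopiaCache wrapper class is just my illustration; it assumes the NuGet package wired up the default dataCacheClient section in web.config):

using Microsoft.ApplicationServer.Caching;

public static class GeotopiaCache
{
    // DataCacheFactory reads the dataCacheClients section that the NuGet package
    // added to web.config, so no endpoints need to be specified here.
    private static readonly DataCacheFactory Factory = new DataCacheFactory();

    // Every cache comes with a default cache; the snippets below use it as 'cache'.
    public static readonly DataCache Cache = Factory.GetDefaultCache();
}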

Later on, we can also cache Page Output (performance!) and Session State (scalability).

I also created a controller/view combination that allows users to sign up with just a username and an email address. The temporary password will be sent to this email account.

Download the Windows Azure AD Graph Helper at http://code.msdn.microsoft.com/Windows-Azure-AD-Graph-API-a8c72e18 and add it to your solution. Reference it from the project that needs to query the graph and create users.

Summary
The UserController performs the following tasks:
1. Adds the registered user to Windows Azure Active Directory by using the Graph Helper
2. Adds the user to the neo4j graph db to enable him/her to post geotopics
3. Adds the user to the Windows Azure Cache to make the user findable.

This snippet does it all.

            string clientId = CloudConfigurationManager.GetSetting("ClientId").ToString();
            string password = CloudConfigurationManager.GetSetting("ClientPassword").ToString();
            // get a token using the helper
            AADJWTToken token = DirectoryDataServiceAuthorizationHelper.GetAuthorizationToken("", clientId, password);
            // initialize a graphService instance using the token acquired from previous step
            DirectoryDataService graphService = new DirectoryDataService("", token);
            //add to Neo4j graph
            GeotopiaUser user1 = new GeotopiaUser(user.UserName, user.UserName, user.UserName + "@geotopia.onmicrosoft.com");

            var geoUser1 = client.Create(user1,
                new IRelationshipAllowingParticipantNode[0],
                new[]
                {
                    new IndexEntry("GeotopiaUser")
                    {
                        { "Id", user1.id.ToString() }
                    }
                });
            //add to cache
            object result = cache.Get(user.UserName);
            if (result == null)
            {
                // User is not in the cache yet: add him/her so the search screen can find the user.
                cache.Add(user.UserName, user);
            }
            else
            {
                // User is already in the cache.
                Trace.WriteLine(String.Format("User already exists : {0}", user.UserName));
            }

The modal dialog on the Geotopia canvas searches the cache every time a keydown is noticed and displays the users that match the query input of the dialog.

Monday, October 14, 2013

Geotopia, SignalR and Notification Hub

The next extension of Geotopia is to notify users of updates. Everyone who is following 'me' should get a notification of a topic I posted. Install SignalR by using the Package Manager Console or the Manage NuGet Packages screen that appears when you right-click the ASP.NET project. For example, in the Package Manager Console, run:

Install-Package Microsoft.AspNet.SignalR -Version 1.1.3

You can also use newer versions of SignalR.
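Note that with SignalR 1.x the hubs route also needs to be registered at application start; that step isn't shown in this post, so here is a minimal sketch for Global.asax:

using System.Web.Routing;
using Microsoft.AspNet.SignalR;

public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // Registers the /signalr route so clients can reach the hubs (SignalR 1.x only;
        // SignalR 2.x uses an OWIN Startup class instead).
        RouteTable.Routes.MapHubs();

        // ...the rest of the MVC/WebAPI registration goes here.
    }
}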

In the webapi controller that handles the creation of Geotopics, I also update all the clients with this newly created Geotopic.

The SignalR hub is lazy loaded by the Geotopic webapi controller.

protected readonly Lazy<IHubContext> AdminHub =
            new Lazy<IHubContext>(() => GlobalHost.ConnectionManager.GetHubContext<GeotopicHub>());
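The hub class itself isn't shown in this post. Since the JavaScript connects to the geotopichub, a minimal sketch could look like this (the class name GeotopicHub is my assumption):

using Microsoft.AspNet.SignalR;

// An empty hub is enough here: the server pushes to clients via GetHubContext,
// so no hub methods are required (yet).
public class GeotopicHub : Hub
{
}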


The webapi method Post is slightly modified and calls the posted method on the clients.

[System.Web.Http.Authorize]
        public void Post(Geotopic value)
        {
            //user must be logged in.
            var userNode = client.QueryIndex<GeotopiaUser>("GeotopiaUser", IndexFor.Node, String.Format("Id:{0}", User.Identity.Name));

            //now create a topic
            Geotopic newTopic = new Geotopic(value.latitude, value.longitude, value.message);

            var secondtopic = client.Create(newTopic,
                new IRelationshipAllowingParticipantNode[0],
                new[]
                {
                    new IndexEntry("Geotopic")
                    {
                        { "Id", newTopic.id.ToString() }
                    }
                });

            NodeReference reference = userNode.First().Reference;

            client.CreateRelationship(secondtopic, new PostedBy(reference));
            //now SignalR it to all my followers....
            AdminHub.Value.Clients.All.posted(value);
        }

A piece of JavaScript makes all of the above happen. It connects to the geotopichub and responds to the "posted" call from the webapi controller.



To show a nice popup on the Geotopia screen I use Toastr. Get it with: Install-Package Toastr.

When I post a topic, the Toastr package displays a nice toast:



The SignalR addition enables web clients to be updated on the fly. For future releases of mobile clients of Geotopia, I also use the Notification Hub of Windows Azure to enable notifications on mobile devices. To enable this, I create a Service Bus namespace and a notification hub.


Now I have a notification hub available: a mechanism to easily update my mobile clients. On the backend side (my webrole), install the Service Bus package:

Install-Package WindowsAzure.ServiceBus

and use it in your designated class. In my case, I use it in my TopicsController.cs webapi module:

using Microsoft.ServiceBus.Notifications;

I add the following lines to the Post method of my TopicsController:
            NotificationHubClient myNotificationHub = NotificationHubClient.CreateClientFromConnectionString(
                "Endpoint=sb://<your namespace>.servicebus.windows.net/;SharedAccessKeyName=DefaultListenSharedAccessSignature;SharedAccessKey=YOUR_KEY",
                "YOUR_HUB");

            string toast = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
                "<wp:Notification xmlns:wp=\"WPNotification\">" +
                    "<wp:Toast>" +
                        "<wp:Text1>A toast from geotopia!</wp:Text1>" +
                    "</wp:Toast>" +
                "</wp:Notification>";

            Task<NotificationOutcome> result = myNotificationHub.SendMpnsNativeNotificationAsync(toast);

In the result we can see the outcome of the notification call.

For now, everybody receives notifications of topics being posted. The next blog post will demonstrate how to make sure only my followers get my topics by using SignalR groups and adding tags for the Notification Hub.

Friday, October 11, 2013

Creating a topic on the Geotopia Canvas

Next step in realizing Geotopia is enabling logged in users to click on the map and add some information for that specific location on the map.

I enable this by:
- using jQuery UI to show a dialog on the screen that takes a piece of text
- using $.ajax to post the data to my webapi controller
- using the Neo4j client to create a node containing the piece of text, the location and the user who posted it
- creating a PostedBy relationship between the user and the newly created geotopic

Clicking somewhere on the map displays this modal dialog from jQuery UI.




The following opens MyDialog and adds some data to the dialog (the location as well):

  $("#MyDialog").data('location', location).dialog("open");

Here is where I get my location in the dialog function:
 var source = {
                    'latitude': location.latitude,
                    'longitude': location.longitude,
                 
                    'message': $('#topicmessage').val()
                }
This source object is then sent to my webapi controller:

 $.ajax({
            type: 'POST',
            url: '../api/Topics',
            data: source
          });

This causes my webapi controller to fire off the following code:

            //user must be logged in.
            //I take the username, which is indexed in neo4j, to get a reference to the right user node.
            var userNode = client.QueryIndex<GeotopiaUser>("GeotopiaUser", IndexFor.Node, String.Format("Id:{0}", User.Identity.Name));

            //now create a topic
            Geotopic newTopic = new Geotopic(value.latitude, value.longitude, value.message);

            var secondtopic = client.Create(newTopic,
                new IRelationshipAllowingParticipantNode[0],
                new[]
                {
                    new IndexEntry("Geotopic")
                    {
                        { "Id", newTopic.id.ToString() }
                    }
                });

            NodeReference reference = userNode.First().Reference;

            client.CreateRelationship(secondtopic, new PostedBy(reference));
            //now SignalR it to all my followers....

The node graph now shows up like this:

Node 163 is the newly created one.

Here are the details of node 163:



The next step will be to use SignalR to let my followers know that I posted something, together with the Notification Hub to notify mobile users!




Thursday, October 10, 2013

Setting up Neo4j for Geotopia's graph DB

Since my data model is probably going to be very chaotic (lots of relationships between entities), I decided to pick a graph database. Neo4j is my choice, and there is an excellent article out there that helps you set up an environment and host it on Azure! Social networks are graphs, after all. Querying a social graph can lead to massive and complex joins when you use a SQL RDBMS.

http://blog.jongallant.com/2013/03/neo4j-azure-vs2012.html

After following the instructions on Jon's blog, I have a cloud service running Neo4j and I am ready to store some nodes there. I have the Neo4j server running both locally and in my Azure space. Running it locally caused an error due to "filename too long" issues; you can tackle this by changing the output directory of the dev fabric.

To execute CRUD actions on my Neo4j graph db I use Neo4jClient from Readify.

Example: here is a simple representation of a Geotopic and a GeotopiaUser.

    public class Geotopic
    {
        public string id { get; private set; }
        public string latitude { get; set; }
        public string longitude { get; set; }
        public string message { get; set; }

        public Geotopic(string latitude, string longitude, string message)
        {
            this.latitude = latitude;
            this.longitude = longitude;
            this.message = message;

            this.id = Guid.NewGuid().ToString();
        }
    }

    public class GeotopiaUser
    {
        public string id { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }

        public GeotopiaUser(string firstName, string lastName, string id)
        {
            this.FirstName = firstName;
            this.LastName = lastName;

            this.id = id;
        }
    }

These entities do have relationships. A user can post a geotopic. A user can follow another user. A user can like a topic. A user can recommend a topic.
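The PostedBy and FollowRelationship types used in the snippets below follow Neo4jClient's relationship pattern. They aren't shown in this post, so here is a minimal sketch (the relationship type key strings are my assumption):

using Neo4jClient;

public class PostedBy : Relationship,
    IRelationshipAllowingSourceNode<Geotopic>,
    IRelationshipAllowingTargetNode<GeotopiaUser>
{
    public PostedBy(NodeReference targetNode) : base(targetNode) { }

    // The string Neo4j stores as the relationship type.
    public override string RelationshipTypeKey
    {
        get { return "POSTED_BY"; }
    }
}

public class FollowRelationship : Relationship,
    IRelationshipAllowingSourceNode<GeotopiaUser>,
    IRelationshipAllowingTargetNode<GeotopiaUser>
{
    public FollowRelationship(NodeReference targetNode) : base(targetNode) { }

    public override string RelationshipTypeKey
    {
        get { return "FOLLOWS"; }
    }
}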

First, after a user has logged in by using the Windows Azure Active Directory credentials, I make sure that the GeotopiaUser exists in the neo4j graph db. The id of the user is the username that is registered in the active directory. See the code snippet below for how users are created and how relationships between them are established.

Using the Neo4jClient makes it quite easy to create a graph. In this example I create two users, one following the other, and a geotopic that is posted by the first user.

            GraphClient client = new GraphClient(new Uri("http://db/data"));
            client.Connect();

             // Create Indexes
            client.CreateIndex("GeotopiaUser", new IndexConfiguration() { Provider = IndexProvider.lucene, Type = IndexType.exact }, IndexFor.Node); // exact node index
            client.CreateIndex("Geotopic", new IndexConfiguration() { Provider = IndexProvider.lucene, Type = IndexType.exact }, IndexFor.Node); // exact node index

            // Create Entities
            // Users
            GeotopiaUser user1 = new GeotopiaUser("Riccardo", "Becker", "riccardo@geotopia.onmicrosoft.com");

            var geoUser1 = client.Create(user1,
                new IRelationshipAllowingParticipantNode[0],
                new[]
                {
                    new IndexEntry("GeotopiaUser")
                    {
                        { "Id", user1.id.ToString() }
                    }
                });

            //new topic
            Geotopic topic1 = new Geotopic("0.0", "0.0", "First topic!");

            var firsttopic = client.Create(topic1,
                new IRelationshipAllowingParticipantNode[0],
                new[]
                {
                    new IndexEntry("Geotopic")
                    {
                         { "Id", topic1.id.ToString() }
                    }
                });

            client.CreateRelationship(firsttopic, new PostedBy(geoUser1));

            GeotopiaUser user2 = new GeotopiaUser("John", "Doe", "johndoe@geotopia.onmicrosoft.com");

            var geoUser2 = client.Create(user2,
                new IRelationshipAllowingParticipantNode[0],
                new[]
                {
                    new IndexEntry("GeotopiaUser")
                    {
                        { "Id", user2.id.ToString() }
                    }
                });

            //john follows me
            client.CreateRelationship(geoUser2, new FollowRelationship(geoUser1));

Executing this code created the following graph:


Node with id 49 is my first user (riccardo) and node 51 is the second user (John Doe). John follows me and I created one topic. This is set up by creating a "posted_by" relationship between node 50 and 49.

The following code uses the index "GeotopiaUser" and looks up every user in the graph.

            List<Node<GeotopiaUser>> list = client.QueryIndex<GeotopiaUser>("GeotopiaUser", IndexFor.Node, "Id: *").ToList();

            foreach (Node<GeotopiaUser> user in list)
            {
                Console.WriteLine(user.Data.FirstName + " " + user.Data.LastName);
            }

The next post will describe how to post a Geotopic on the canvas of Geotopia and how this is added to the graph DB of Neo4j.

Happy coding!


Thursday, October 3, 2013

The Wireframe of Geotopia

Using the Visual Studio 2013 RC, I created a cloud project with an ASP.NET webrole. When you add a webrole to your cloud project, the screen below appears.



I want to create an MVC app and want to use Windows Azure Active Directory authentication. To enable this you need to create a new Directory in your Active Directory in the Windows Azure portal. When you create a new directory, this screen appears.


Add some users to your directory. In this case, I have "riccardo@geotopia.onmicrosoft.com" for now.
Go back to Visual Studio and select the Change Authentication button. Next, select Organizational Accounts and fill in your directory.


When you click OK, you need to log in with the credentials of one of the users you created in your directory. I log in with riccardo@geotopia.onmicrosoft.com. Next, select Create Project and your MVC5 app is ready to go.

Before this application is able to run locally, in your development fabric, you need to change the return URL in the Windows Azure portal. Go to Applications in the designated directory on the Windows Azure portal. The newly created MVC project appears there; in my case it's Geotopia.WebRole. In this screen you see the app URL, which points to https://localhost:44305. This URL is NOT correct when you run the MVC5 app as a cloud project in the development fabric. Click the application in the portal, select Configure, and change the app URL and the return URL to the correct URL for when your app runs locally in the development fabric. In my case: https://localhost:8080. When you run your app, you get a warning about a problem with the security certificate, but you can ignore this for now.

After logging in successfully, you will be redirected to the correct return URL (configured in the portal), showing you that you are logged in. I also created a Bing Map which centers on the beautiful Isola di Dino in Italy with a bird's eye view.



In the next blog post I will show how to create a Geotopic on the Bing Map, how to store it in Table Storage and how to add your own photographs to it to show your friends how beautiful the places you visited are. This creates an enriched view on top of the Bing Map.





Monday, May 30, 2011

Windows Azure AppFabric Cache next steps

A very straightforward use of Windows Azure AppFabric Caching is to store records from a SQL Azure table (or another source, of course).

Get access to your data cache (assuming your config settings are fine, see previous post).

List<string> lookUpItems = null;

DataCache myDataCache = CacheFactory.GetDefaultCache();
lookUpItems = myDataCache.Get("MyLookUpItems") as List<string>;

if (lookUpItems != null) //there is something in cache obviously
{
    lookUpItems.Add("got these lookups from myDataCache, don't pick me");
}
else //get my items from my datasource and save them in cache.
{
    LookUpEntities myContext = new LookUpEntities(); //EF
    var lookupItems = from lookupitem in myContext.LookUpItems
                      select lookupitem.Value;
    lookUpItems = lookupItems.ToList();

    /* assuming my static table with lookup items might change only once a day or so,
       set the expiration to 1 day. This means that one day after setting the cache item,
       the item will expire and the cache will return null */
    myDataCache.Add("MyLookUpItems", lookUpItems, TimeSpan.FromDays(1));
}

Easy to use and very effective. The more complex and time-consuming your query to your data source (wherever and whatever it is), the more your performance will benefit from this approach. But still consider the price you have to pay! The natural attitude when developing for Azure is always: consider the costs of your approach and try to minimize bandwidth and storage transactions.

Use local caching for speed
You can use local client caching to truly speed up lookups. Remember that the local cache hands you references to the cached objects, so changing them also changes the items that are bound to your comboboxes, for example.
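Enabling the local cache is done on the DataCacheFactory configuration. A minimal sketch, assuming the endpoint and security settings still come from your configuration file (the object count and timeout values are just examples):

using System;
using Microsoft.ApplicationServer.Caching;

// Client-side (local) cache: keep up to 10,000 objects and consider each one fresh
// for 5 minutes before going back to the cache cluster.
var configuration = new DataCacheFactoryConfiguration
{
    LocalCacheProperties = new DataCacheLocalCacheProperties(
        10000,
        TimeSpan.FromMinutes(5),
        DataCacheLocalCacheInvalidationPolicy.TimeoutBased)
};

DataCacheFactory factory = new DataCacheFactory(configuration);
DataCache cache = factory.GetDefaultCache();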

Tuesday, February 22, 2011

A Generic Worker beats all

Windows Azure will charge you for the amount of time your application is up (and maybe running). In order to fully utilize the resources at your disposal, you had better be efficient with your worker roles in general. This blog post is the first in a series showing you how to gain maximum efficiency and still keep your worker role scalable.

The magic behind this is: a generic worker that can handle different tasks.

Consider a worker role as a running program that in the beginning does absolutely nothing but listen to a queue in your storage environment. This queue is the broker for your worker role and will be fed by the outside world (your own apps or apps of others you serve). The message contains a description of a task that the generic worker has to fulfill. The description also contains the location of an uploaded assembly in blob storage, parameters and other information that is needed for the specific task.

After noticing the message on the queue, the worker role will look for the assembly in blob storage, load it in an AppDomain and start executing the task that is described in the message. It can be a long-running calculation or a task that listens to another queue where it will be fed task-specific messages. It can be a single task executed just once or a task that runs forever (as long as the app is deployed, of course). The worker role loads the different assemblies and starts executing the different tasks at configurable intervals or when a new message arrives.
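A bare-bones sketch of that loop is shown below. All names here (the tasks queue, the assemblies container, TaskDescription and IRunnableTask) are illustrative assumptions, not a definitive implementation:

using System;
using System.IO;
using System.Reflection;
using System.Threading;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

public class GenericWorkerRole : RoleEntryPoint
{
    public override void Run()
    {
        var account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("StorageConnectionString"));
        CloudQueue taskQueue = account.CreateCloudQueueClient().GetQueueReference("tasks");
        CloudBlobContainer assemblies = account.CreateCloudBlobClient().GetContainerReference("assemblies");

        while (true)
        {
            CloudQueueMessage message = taskQueue.GetMessage();
            if (message == null)
            {
                Thread.Sleep(TimeSpan.FromSeconds(5));
                continue;
            }

            // The message describes the task: which uploaded assembly/type to run and its parameters.
            TaskDescription task = TaskDescription.Parse(message.AsString);

            // Fetch the uploaded assembly from blob storage.
            byte[] assemblyBytes;
            using (var stream = new MemoryStream())
            {
                assemblies.GetBlockBlobReference(task.AssemblyBlobName).DownloadToStream(stream);
                assemblyBytes = stream.ToArray();
            }

            // Load it (ideally into a separate AppDomain so it can be unloaded again) and run the task.
            Assembly assembly = Assembly.Load(assemblyBytes);
            var runnable = (IRunnableTask)Activator.CreateInstance(assembly.GetType(task.TypeName));
            runnable.Run(task.Parameters);

            taskQueue.DeleteMessage(message);
        }
    }
}

// Illustrative contract for a pluggable task and its queue message (not from this post).
public interface IRunnableTask
{
    void Run(string parameters);
}

public class TaskDescription
{
    public string AssemblyBlobName { get; set; }
    public string TypeName { get; set; }
    public string Parameters { get; set; }

    // For brevity, assume the queue message is simply "blobName;typeName;parameters".
    public static TaskDescription Parse(string message)
    {
        var parts = message.Split(';');
        return new TaskDescription { AssemblyBlobName = parts[0], TypeName = parts[1], Parameters = parts[2] };
    }
}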

Remember, this is a rough description of a generic worker that can be utilized up to 100%. That's what you need; after all, you are paying for it. Don't worry about the CPU getting hot!

To keep this worker role scalable, new instances of the role will need to preload the assemblies already available in the first instance. This requires some administration, but hey, that's why we have storage at our disposal. Imagine a generic worker role that has tens of tasks running. One task is to provide elasticity to itself! When it overruns a certain limit (CPU, max number of threads, a steeply growing number of messages in queue(s)) it will scale itself up! Now that's what I call magic.

The next blog post will show you what the bare generic worker looks like.

Wednesday, February 9, 2011

VM role the sequel

After playing some time with the VM Role beta and stumbling upon strange problems, I found out that the VM Role beta was activated on my CTP08 subscription and not on my regular one. In the Windows Azure portal, having the information uncollapsed, it looks like it's active :-)

Anyway, I am testing the VM role on a small instance now, using remote desktop and checking whether the VM role is appropriate as a replacement for my own local VM images running in our own datacenter. So far, it's looking good. The only thing is: we are running stateless. This means that information that needs to be stored should be stored in a cloud way and not to disk or other local options. Use Azure Drive, TFS hosted somewhere, SkyDrive, Dropbox or other cloud services that let you save information in a reliable way. Saving your work on the C: drive while running a VM role might cause a serious loss if the role gets recycled or crashes and is brought up somewhere else (with yet another C: drive). Although the VM role was never intended to be pure IaaS, it's still a nice alternative that can be very useful in some scenarios.

We'll continue and make some nice differencing disks with specific tools for specific users (developers, testers, desktop workers etc.) and see how it works. Developing with VS2010 on an 8-core cloudy thing with 14 GB of internal memory is a blessing. Having your sources stored on Azure Drive or alternatives and connecting directly to your TFS environment by using Azure Connect combines the best of all worlds and gives you a flexible, cost-effective but most of all quick way of setting up images and also tearing them down fast.

Monday, January 24, 2011

VM Role considerations

After experimenting a lot to get the VM role to work, a few considerations:

- Take some time (a lot of time actually) to prepare your image and follow all prerequisites on http://msdn.microsoft.com/en-us/library/gg465398.aspx. Two important steps to take: build a base image VHD which will be the parent of all your other differencing disks. Differencing disks contain the specific characteristics of the VM role to upload and run. Typically you won't run your base VHD (it's just W2008R2) but it's the differencing disks that have the value add. Think of a development environment containing Visual Studio and other tools for your developers and/or architects, a specific VHD for testers having the test version of VS2010 installed, desktop environments with just Office tooling etc.
- Don't bother trying to upload your sysprep'd W2008R2 VHD from Windows 7 :-) For some reason, after creating the VHD with all the necessary tools on it, csupload still causes some Hyper-V magic to happen. The thing is, Hyper-V magic is not available on Windows 7.
- Use the Set-Connection switch of the csupload app to set a "global" connection, written to disk, in your command session and take it from there.
- From here we started struggling with the actual csupload. The following message was displayed:



It tells me that the subscription doesn't have the VM Role beta enabled yet. The thing is... I did!



I'll just continue the struggle and get it to work. If you have suggestions, please let me know, here or on Twitter @riccardobecker.

Tuesday, January 4, 2011

Things to consider when migrating to Azure part 2

Here are some other issues I stumbled upon while self-learning and researching the migration of current on-premise apps to Azure. As mentioned before, just having things run in the cloud is not that difficult, but having things run in a scalable, well-designed, cost-efficient way that fully uses the possibilities of Azure is something different. Here are some things to consider.

- To be able to meet the SLAs you need to ensure that your app runs with a minimum of two instances (instance count = 2 in your configuration file per deployment of web, worker or VM role)
- To keep things as easy as possible and make as few changes as possible, consider using the SQL Azure Migration Wizard to migrate on-premise databases to SQL Azure databases (http://sqlazuremw.codeplex.com/)
- Moving your intranet applications to Azure probably requires changes in your authentication code. While intranet apps commonly use AD for authentication, web apps in the cloud can still use your AD information, but you need to set up AD federation or use a mechanism like Azure Connect to enable the use of AD in your cloud environment.
- After migrating your SQL database to the cloud you need to change your connection string, but also realize that you connect to a specific database and that you cannot use the USE statement in your code; SQL Azure is about connecting to the database itself (a minimal connection example follows below this list). Also realize that it is not possible to use Windows Authentication. Encrypt your web.config or other config files where your connection strings reside. It's a good habit to treat your application as "insecure" at all times and use a proper threat model to put your finger on possible security breaches. This will keep you alert in any design decision you make regarding security. Look at http://msdn.microsoft.com/en-us/library/ms998283.aspx for how you can encrypt your configuration files using RSA.
- Additional security for your SQL Azure assets can be provided by the firewall, which allows you to specify the IP addresses that are allowed to connect to your SQL Azure database.
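For illustration, a typical SQL Azure connection string and connection look roughly like this (server name, database and credentials are placeholders; in a real app the string lives, encrypted, in your config file):

using System.Data.SqlClient;

public static class SqlAzureExample
{
    private const string ConnectionString =
        "Server=tcp:yourserver.database.windows.net,1433;" +
        "Database=MyDatabase;" +
        "User ID=myuser@yourserver;" +
        "Password=yourpassword;" +
        "Encrypt=True;";

    public static void Connect()
    {
        using (var connection = new SqlConnection(ConnectionString))
        {
            // No Windows Authentication and no USE [database]:
            // SQL Azure connects to one specific database per connection.
            connection.Open();
        }
    }
}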

I'll post more on this blog when I stumble upon more...

Tuesday, December 7, 2010

Performance Counters and Scaling

To make the right decisions on scaling your Windows Azure instances up or down, you need good metrics. These metrics can be performance counters like CPU utilization, free memory etc. Performance counters can be added during startup of a worker or webrole, or remotely from a distance. Adding a performance counter within the logic of the role itself is fairly easy.

public override bool OnStart()
{
    var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Adding CPU performance counters to the default diagnostic configuration
    config.PerformanceCounters.DataSources.Add(
        new PerformanceCounterConfiguration()
        {
            CounterSpecifier = @"\Processor(_Total)\% Processor Time",
            //do the actual probe every 5 seconds.
            SampleRate = TimeSpan.FromSeconds(5)
        });

    //transfer the gathered perfcount data to my storage account every minute
    config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

    // Start the diagnostic monitor with the modified configuration.
    // DiagnosticsConnectionString contains my cloudstorageaccount settings
    DiagnosticMonitor.Start("DiagnosticsConnectionString", config);

    return base.OnStart();
}

From now on this role will gather performance counters forever, giving you the ability to analyze the data.

How do you add a performance counter from a distance?

//Create a new DeploymentDiagnosticManager for a given deployment ID
DeploymentDiagnosticManager ddm = new DeploymentDiagnosticManager(csa, this.PrivateID);

You need a storage account (where your WADPerformanceCountersTable lives) and the deployment ID of your deployment (multiple instances of a web and/or worker role are possible, since every instance is eventually a running VM).

//Get the role instance diagnostics manager for all instances of the role
var ridm = ddm.GetRoleInstanceDiagnosticManagersForRole(RoleName);

//Create a performance counter for processor time
PerformanceCounterConfiguration pccCPU = new PerformanceCounterConfiguration();
pccCPU.CounterSpecifier = @"\Processor(_Total)\% Processor Time";
pccCPU.SampleRate = TimeSpan.FromSeconds(5);

//Create a performance counter for available memory
PerformanceCounterConfiguration pccMemory = new PerformanceCounterConfiguration();
pccMemory.CounterSpecifier = @"\Memory\Available Mbytes";
pccMemory.SampleRate = TimeSpan.FromSeconds(5);

//Set the new diagnostic monitor configuration for each instance of the role
foreach (var ridmN in ridm)
{
    DiagnosticMonitorConfiguration dmc = ridmN.GetCurrentConfiguration();

    //Add the new performance counters to the configuration
    dmc.PerformanceCounters.DataSources.Add(pccCPU);
    dmc.PerformanceCounters.DataSources.Add(pccMemory);

    //Update the configuration
    ridmN.SetCurrentConfiguration(dmc);
}

By applying the code above for a certain role (specified by RoleName), every instance of that role is monitored. Both CPU and memory performance counters are sampled and written to the cloud storage account provided; a sample is taken every 5 seconds per instance and eventually written to storage.

So how do you make the right decisions on scaling? As you might notice, the raw data in storage doesn't help you much and only shows spikes. To discover trends or averages you need to apply smoothing (for example) to your data in order to get a readable graph. See the graph below for the difference.




As you can see the "smoothed" graph is far more readable and shows you exactly how your roleinstance is performing and whether or not it needs to be scaled up/down.

Smoothing algorithms like a simple or weighted moving average can help you make better decisions than examining the raw data from the WADPerformanceCountersTable in your storage account. Implement this smoothing in a worker role running in the same datacenter to avoid massive bandwidth costs when you examine weeks or even months of performance counter data.
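As an illustration, a simple moving average over the raw counter values could be computed like the sketch below (it assumes you already read the CounterValue column from the WADPerformanceCountersTable, ordered by timestamp):

using System;
using System.Collections.Generic;
using System.Linq;

public static class Smoothing
{
    // Simple moving average: each point becomes the average of the last 'windowSize' samples.
    public static IEnumerable<double> SimpleMovingAverage(IList<double> samples, int windowSize)
    {
        for (int i = 0; i < samples.Count; i++)
        {
            int start = Math.Max(0, i - windowSize + 1);
            yield return samples.Skip(start).Take(i - start + 1).Average();
        }
    }
}

Feeding the smoothed series into your scaling rules, instead of the raw 5-second samples, avoids reacting to every spike.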

Good luck with it!

Monday, January 4, 2010

Go to Dallas and pinpoint your needs

Dallas is Microsoft's marketplace where data providers and consumers can meet each other through a simple platform. This platform takes care of billing and makes it easy to subscribe to and unsubscribe from datasets. Dallas enables developers to easily integrate rich data from others into their own application or service. Dallas offers owners of rich data an easy way to offer it to a broad audience and increase their revenue or exposure.

Microsoft PinPoint is a web platform that enables companies or individuals to find SMEs, apps, solutions and services. On the other hand, it enables Microsoft partners to put themselves in the spotlight and show what kind of company they are and what they have to offer. PinPoint is a Yellow Pages-like directory from Microsoft in which you can browse or search to find the company or solution you need.

As of the 4th of January, 7567 apps, 35953 companies and 13448 professional services are listed on PinPoint. The number of datasets on PinPoint is, as of the 4th of January, only about 10, but lots more are coming soon. Several organizations, like AP, NASA and UNData, offer their data for free. You can subscribe to these datasets and use their data in your own application or service. Dallas data can be found through the PinPoint portal (by searching for applications on the Windows Azure Platform) but also by using the Dallas portal and choosing the Catalog tab.

Consuming Dallas datasets is very easy (Microsoft really lowered the bar by defining a standard set of APIs) and you just need an invitation code for Dallas. Go to the Dallas homepage and get an invite.

Consume some data
Go to the Dallas portal and log in with your account. Click Catalog and you should see something like this:



Subscribe to the Associated Press Online Trial Offer. After agreeing, AP Online appears on your subscriptions tab, which opens automatically. You can explore the features of this dataset by clicking "Click here to explore the dataset". A query screen opens and offers the opportunity to query and test the data before integrating it into your application. A very nice feature is "Download C# service classes", which generates a C# class that wraps the AP dataset and simplifies access to the dataset (or parts of it). Add this C# file to your application (e.g. a console application) and add the following code to your Main.

using Microsoft.Dallas.Services;

static void Main(string[] args)
{
    string accountKey = "***** your accountKey *******";
    Guid uniqueUserID = new Guid("**** a userID as a GUID, any GUID****");

    Microsoft.Dallas.Services.NewsCategoriesService service =
        new NewsCategoriesService(accountKey, uniqueUserID);

    List<NewsCategoriesItem> results = service.Invoke();

    foreach (NewsCategoriesItem item in results)
    {
        Console.WriteLine(item.Content);
    }

    Console.ReadLine();
}

Your output should be like this:



As you can see, importing and using data from the Dallas marketplace is fairly easy, and that's just how Microsoft wanted it to be: easy to use and easy to adopt.

This AP dataset is free for now but might become a commercial one in the future. On the Dallas portal, the Access Report tab shows how many times you've accessed a dataset, so you can check whether the billing is correct.



In the Account Keys tab you can manage account keys and create new ones. What's the use of creating multiple account keys? To attribute costs to different logical users (which can also be entities like departments, companies etc.) for accurate billing if you want your customers to pay for using your application or service.

Besides consuming data in your application or service, you can also use PowerPivot to extend Excel and turn it into your own little data warehouse. PowerPivot enables you to easily consume Dallas datasets but also to import your own data and do some data mining on it. A nice example at PDC2009 was to take the weather reports from Dallas, combine them with the sales results of an ice cream company and conclude, how shocking, that the sales of chocolate ice cream dropped radically on a freezing winter day.

As you saw, Dallas is a marketplace for data where consumers and producers can meet and easily do business. Dallas data can also be found on Microsoft's PinPoint portal (and that's how they are related), and therefore Dallas fits exactly into the Microsoft strategy. Consuming data takes no more than a couple of lines of code, and you can use PowerPivot to enhance Excel, turn it into your local data warehouse and perform some nice drill-downs.

That's it for PinPoint and Dallas for now.