Monday, January 24, 2011

VM Role considerations

After a lot of experimenting to get the VM role working, a few considerations:

- Take some time (a lot of time, actually) to prepare your image and follow all the prerequisites on http://msdn.microsoft.com/en-us/library/gg465398.aspx. Two important steps to take: build a base image VHD which will be the parent of all your other differencing disks, and create the differencing disks themselves (see the sketch after this list). Differencing disks contain the specific characteristics of the VM role to upload and run. Typically you won't run your base VHD (it's just W2008R2); it's the differencing disks that add the value. Think of a development environment containing Visual Studio and other tools for your developers and/or architects, a specific VHD for testers with the test edition of VS2010 installed, desktop environments with just Office tooling, etc.
- Don't bother trying to upload your sysprep'd W2008R2 VHD from Windows 7 :-)
For some reason, after creating the VHD with all the necessary tools on it, csupload still needs some Hyper-V magic to happen. The thing is, Hyper-V magic is not available on Windows 7.
- Use the Set-Connection switch of the csupload tool to set a "global" connection, written to disk, for your command session and take it from there (a sketch of the commands follows below).
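To give an idea of the base-plus-differencing setup: a differencing disk can be created with diskpart, as in this minimal sketch (paths and file names are just placeholders; Hyper-V Manager can do the same):

diskpart
DISKPART> create vdisk file="C:\vhd\dev-diff.vhd" parent="C:\vhd\base-w2008r2.vhd"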
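And a rough sketch of the csupload flow itself (switch names as I remember them from the beta documentation, so double-check against the docs; subscription ID, thumbprint, path, name and location are placeholders):

csupload Set-Connection "SubscriptionId=<your-subscription-id>;CertificateThumbprint=<your-management-cert-thumbprint>"
csupload Add-VMImage -LiteralPath "C:\vhd\dev-diff.vhd" -Name "devimage.vhd" -Location "North Europe"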
From here we started struggling with the actual csupload. The following message was displayed:

[Screenshot: csupload error message]
It tells me that the subscription doesn't have the VM Role Beta enabled yet. The thing is... I did!

[Screenshot: the Windows Azure portal showing the VM Role Beta enabled for this subscription]
I'll just continue the struggle and get it to work... if you have suggestions, please let me know, here or on Twitter @riccardobecker.

Tuesday, January 4, 2011

Things to consider when migrating to Azure part 2

Here are some other issues I stumbled upon through self-learning and researching around migrating your current on-premise apps to Azure. As mentioned before, just having things run in the cloud is not that difficult, but having things run in a scalable, well-designed, cost-efficient way that fully uses the possibilities of Azure is something different. Here are some more things to consider.

- To be able to meet the SLA you need to ensure that your app runs with a minimum of two instances (rolecount = 2 in your configuration file per deployment of web, worker or VM role); see the configuration sketch after this list.
- To keep things as easy as possible and make as few changes as possible, consider using the SQL Azure Migration Wizard to migrate on-premise databases to SQL Azure databases (http://sqlazuremw.codeplex.com/).
- Moving your intranet applications to Azure probably requires changes to your authentication code. While intranet apps commonly use AD for authentication, web apps in the cloud can still use your AD information, but you need to set up AD federation or use a mechanism like Azure Connect to enable the use of AD in your cloud environment.
- After migrating your SQL database to the cloud you need to change your connection string, but also realize that you connect to a specific database and cannot use the USE statement in your code to switch between databases; SQL Azure is about connecting to the database itself (see the connection sketch after this list). Also realize that it is not possible to use Windows authentication. Encrypt your web.config or other config files where your connection strings reside. It's a good habit to treat your application as "insecure" at all times and use a proper threat model to put your finger on possible security breaches. This will keep you alert in any design decision you make regarding security. Look at http://msdn.microsoft.com/en-us/library/ms998283.aspx for how you can encrypt your configuration files using RSA.
- Additional security for your SQL Azure assets can be provided by the firewall, which allows you to specify the IP addresses that are allowed to connect to your SQL Azure database (a T-SQL sketch follows below).
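A minimal sketch of the instance count in ServiceConfiguration.cscfg (service and role names are placeholders):

<?xml version="1.0"?>
<ServiceConfiguration serviceName="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="MyWebRole">
    <!-- at least two instances to qualify for the compute SLA -->
    <Instances count="2" />
    <ConfigurationSettings />
  </Role>
</ServiceConfiguration>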
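Connecting to SQL Azure then looks something like this (server, database, user and password are placeholders; note the user@server convention, SQL authentication and the encrypted connection):

using System.Data.SqlClient;

// Placeholder connection string in the format the SQL Azure portal hands out.
string connectionString =
    "Server=tcp:myserver.database.windows.net;Database=mydatabase;" +
    "User ID=myuser@myserver;Password=myPassword;Trusted_Connection=False;Encrypt=True;";

using (var connection = new SqlConnection(connectionString))
{
    // Connects straight to 'mydatabase'; switching databases with USE is not supported.
    connection.Open();
    // ... run your commands here
}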
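The firewall rules can be managed from the portal, or with T-SQL against the master database of your SQL Azure server; a sketch (rule name and IP range are placeholders):

-- Allow a range of addresses to connect
EXEC sp_set_firewall_rule N'Allow office', '131.107.0.1', '131.107.0.255';

-- Inspect the current rules
SELECT * FROM sys.firewall_rules;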

I'll post more on this blog when I stumble upon more...

Generic TableStorage

TableStorage is an excellent and scalable way to store your tabular data cheaply. Working with tables is easy and straightforward, but writing a separate context class for every single one of them is unnecessary if you use generics.

public class DynamicDataContext<T> : TableServiceContext where T : TableServiceEntity
{
    private CloudStorageAccount _storageAccount;
    private string _entitySetName;

    public DynamicDataContext(CloudStorageAccount storageAccount)
        : base(storageAccount.TableEndpoint.AbsoluteUri, storageAccount.Credentials)
    {
        _storageAccount = storageAccount;

        // The table name is derived from the entity type.
        _entitySetName = typeof(T).Name;

        // Ensure the table exists before we start reading or writing.
        var tableStorage = new CloudTableClient(_storageAccount.TableEndpoint.AbsoluteUri,
            _storageAccount.Credentials);
        tableStorage.CreateTableIfNotExist(_entitySetName);
    }

    public void Add(T entityToAdd)
    {
        AddObject(_entitySetName, entityToAdd);
        SaveChanges();
    }

    public void Update(T entityToUpdate)
    {
        UpdateObject(entityToUpdate);
        SaveChanges();
    }

    public void Delete(T entityToDelete)
    {
        DeleteObject(entityToDelete);
        SaveChanges();
    }

    public IQueryable<T> Load()
    {
        return CreateQuery<T>(_entitySetName);
    }
}

This is all you need for addressing your tables and for adding, updating and deleting entities. And best of all, for unleashing LINQ on your entities!

Here's how to use it, for example, with your performance counter data (the WADPerformanceCountersTable) in your storage account.

// Entity matching the columns of the WADPerformanceCountersTable written by Windows
// Azure Diagnostics. The class name must match the table name, because
// DynamicDataContext uses typeof(T).Name as the table name.
public class WADPerformanceCountersTable : TableServiceEntity
{
    public long EventTickCount { get; set; }
    public string CounterName { get; set; }
    public double CounterValue { get; set; }
}

Microsoft.WindowsAzure.StorageCredentialsAccountAndKey sca =
    new Microsoft.WindowsAzure.StorageCredentialsAccountAndKey("performancecounters",
        "blablablablabla");
Microsoft.WindowsAzure.CloudStorageAccount csa =
    new Microsoft.WindowsAzure.CloudStorageAccount(sca, true);

var performanceCountersContext = new DynamicDataContext<WADPerformanceCountersTable>(csa);

// Fire your LINQ; fromTicks, toTicks and PerformanceCounterName hold the query
// window and the counter name you're after.
var performanceCounters = (from perfCounterEntry in performanceCountersContext.Load()
                           where (perfCounterEntry.EventTickCount >= fromTicks) &&
                                 (perfCounterEntry.EventTickCount <= toTicks) &&
                                 (perfCounterEntry.CounterName.CompareTo(PerformanceCounterName) == 0)
                           select perfCounterEntry).ToList();