157 posts categorized "Distributed Applications"

04/01/2011

More identity in da house! thinktecture IdentityServer first code drop

Hot, very hot: Dominick just released the all-new, all-cool (and super modern) thinktecture IdentityServer as a CTP on CodePlex.

Go get it – and give us feedback.
Thanks!

01/28/2011

thinktecture WSCF.blue has hit the 10,000 downloads milestone on CodePlex

After thinktecture StarterSTS hit the 5,000 downloads mark a few weeks ago (congrats again, Dom!), we have now hit 10,000 for our WCF-based web services contract-first tool WSCF.blue.

[Want to learn more about it and the idea behind it? Read this MSDN Magazine article.]

Thanks to a great team!


01/10/2011

thinktecture StarterSTS now officially ‘powered by Windows Azure’

A few hours ago I got the final notice that StarterSTS has now officially been admitted to the Azure cloud Olympus.


OK, Dominick: on to releasing 1.5…

01/06/2011

Microsoft Most Valuable Professional (MVP) for Windows Azure [Architecture]

Thanks to Microsoft – it is great to be part of this community.


12/29/2010

Writing trace data to your beloved .svclog files in Windows Azure (aka ‘XmlWriterTraceListener in the cloud’)

Tracing is probably one of the most discussed topics in the Windows Azure world. Not because it is freaking cool, but because it can be very tedious and at times massively counter-intuitive.

One way of doing tracing is to use System.Diagnostics features like trace sources and trace listeners. These have been in place since .NET 2.0. Since .NET 3.0 and the rise of WCF (Windows Communication Foundation), the XmlWriterTraceListener has also seen extensive use. We can see countless occurrences of the typical .svclog file extension in .NET projects around the world, and we can view these files with the SvcTraceViewer.exe tool from the Windows SDK.
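
In classic .NET this is wired up in a couple of lines – a minimal sketch (the source name, file name and message are just placeholders):

using System.Diagnostics;

class TracingSketch
{
    static void Main()
    {
        // A trace source writing through an XmlWriterTraceListener
        // into a .svclog file, viewable with SvcTraceViewer.exe.
        var source = new TraceSource("MyTraceSource", SourceLevels.All);
        source.Listeners.Add(new XmlWriterTraceListener("app_trace.svclog"));

        source.TraceEvent(TraceEventType.Information, 0, "Hello from tracing land.");
        source.Flush();
        source.Close();
    }
}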

All nice and well. But what about Windows Azure?
In Windows Azure there is a default trace listener called Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener from the Microsoft.WindowsAzure.Diagnostics assembly.

If you use this guy and want to trace data via trace sources, your data will be stored in Windows Azure Storage tables. Take some time to play around with it and you will find that the data in there is close to useless and surely not very consumer-friendly (e.g. try searching for a particular text or error message. Horror).
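
For reference, the listener registration the Visual Studio Azure project templates generate looks roughly like this (quoted from memory, so treat the exact version details with care):

<system.diagnostics>
  <trace>
    <listeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener,
             Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0,
             Culture=neutral, PublicKeyToken=31bf3856ad364e35"
           name="AzureDiagnostics" />
    </listeners>
  </trace>
</system.diagnostics>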

So, taking these two facts together, I thought it would be helpful to have a custom trace listener, configurable purely through my config file, which uses Azure local storage to store .svclog files. From there on I use scheduled transfers (which I demonstrated here) to move the .svclog files (which are now custom error logs for Windows Azure) to Azure blob storage, from where I can open them up with the tool of my choice.

Here is the simplified code:

using System.Configuration;
using System.Diagnostics;
using System.IO;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace Thinktecture.Diagnostics.Azure
{
    public class LocalStorageXmlWriterTraceListener : XmlWriterTraceListener
    {        
        public LocalStorageXmlWriterTraceListener(string initializeData)
            : base(GetFileName(initializeData))
        {
        }

        public LocalStorageXmlWriterTraceListener(string initializeData, string name)
            : base(GetFileName(initializeData), name)
        {
        }

        private static string GetFileName(string initializationData)
        {
            try
            {
                // initializeData is expected in the form
                // '<localResourceName>\<fileName>',
                // e.g. 'TraceFiles\worker_trace.svclog'.
                var localResourceItems = initializationData.Split('\\');
                var localResourceFolder = localResourceItems[0];
                var localResourceFile = localResourceItems[1];

                // Resolve the local resource to its physical root path on the VM.
                var localResource = RoleEnvironment.GetLocalResource(localResourceFolder);

                var fileName = Path.Combine(localResource.RootPath, localResourceFile);

                return fileName;
            }
            catch
            {
                throw new ConfigurationErrorsException(
                    "No valid Windows Azure local resource name found in configuration.");
            }
        }
    }
}


In my Azure role (a worker in this particular case, but it also works with a web role) I configure the trace listener like this:

<configuration>
  <system.diagnostics>
    <trace autoflush="true">
      <listeners>
        <add type="Thinktecture.Diagnostics.Azure.LocalStorageXmlWriterTraceListener, 
               AzureXmlWriterTraceListener, Version=1.0.0.0,
               Culture=neutral, PublicKeyToken=null"
             name="AzureDiagnostics"
             initializeData="TraceFiles\worker_trace.svclog" />
      </listeners>
    </trace>
  </system.diagnostics>
</configuration>
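
One thing to keep in mind: the ‘TraceFiles’ local resource referenced in initializeData has to be declared as local storage in ServiceDefinition.csdef – along these lines:

<LocalResources>
  <LocalStorage name="TraceFiles" sizeInMB="100" cleanOnRoleRecycle="false" />
</LocalResources>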


After scheduling the transfer of my log files folder, I can use a tool like Cerebrata’s Cloud Storage Studio to look at my configured blob container (named ‘traces’) – and I can see my .svclog file.



Double-clicking on the file in blob storage opens it up in Service Trace Viewer. From here on it is all the good ole’ trace file inspection experience.



Note: as you can see, the Service Trace Viewer tool is not just for WCF – but you knew that before!


UPDATE: this does not work properly with Azure SDK 1.3 and full IIS due to permission issues – there is more information in the SDK release notes. Very unfortunate.


Hope this helps.

Transferring your custom trace log files in Windows Azure for remote inspection

You can write your trace data explicitly to files, or use tracing facilities like .NET’s trace source and listener infrastructure (or third-party frameworks like log4net or NLog). So, this is not really news.

In your Windows Azure applications you can configure the diagnostics monitor to include special folders – you obtain a reference to such a folder through a local resource in the VM’s local storage – and the files in those folders will then be transferred to the configured Azure Storage blob container.

Without any further ado:

public override bool OnStart()
{
    Trace.WriteLine("Entering OnStart...");

    // Resolve the local storage folder declared in ServiceDefinition.csdef.
    var traceResource = RoleEnvironment.GetLocalResource("TraceFiles");

    // Tell the diagnostics monitor to transfer that folder
    // to the 'traces' blob container every 10 minutes.
    var config = DiagnosticMonitor.GetDefaultInitialConfiguration();
    config.Directories.DataSources.Add(
        new DirectoryConfiguration
        {
            Path = traceResource.RootPath,
            Container = "traces",
            DirectoryQuotaInMB = 100
        });
    config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(10);

    DiagnosticMonitor.Start("DiagnosticsConnectionString", config);

    return base.OnStart();
}


Note: remember that there are special naming conventions in Azure Storage, e.g. for naming blob storage containers (lowercase letters, numbers and dashes only). So, do not try to use ‘Traces’ as the container name in the above code!
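
If you want to guard against invalid container names programmatically, a rough sketch of such a check could look like this (a simplified, hypothetical helper – it does not enforce every official rule, e.g. the ban on consecutive dashes):

using System.Text.RegularExpressions;

public static class ContainerNames
{
    // Blob container names: 3-63 characters, lowercase letters,
    // digits and dashes, starting and ending with a letter or digit.
    public static bool IsValid(string name)
    {
        return Regex.IsMatch(name, "^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$");
    }
}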

And a side note: of course, this whole process incurs costs – costs for data storage in Azure Storage, costs for transactions (i.e. calls) against Azure Storage, and costs for transferring the data from Azure Storage out of the data center for remote inspection.

Alright – this is the basis for the next blog post, which shows how to use a well-known trace log citizen from System.Diagnostics land in the cloud.

Hope this helps (so far).

12/28/2010

Monitoring Windows Azure applications with System Center Operations Manager (SCOM)

Windows Azure offers a few options for collecting monitoring data at runtime, including event log data, performance counters, Azure logs, your custom logs, IIS logs etc. But there are not really any good ‘official’ monitoring tools from Microsoft – besides a management pack for System Center Operations Manager (SCOM).

To get started monitoring your Azure applications with an enterprise-style systems management solution, you need to do the following:

  • Install SCOM. SCOM is a beast, but the good news is that you can install all the components (including Active Directory and SQL Server) on a single server. Here is a very nice walkthrough – long and tedious, but very good.
  • Download and import the Azure management pack (MP) for SCOM. Note that the MP is still an RC at the time of this writing, but Microsoft support already treats it like an RTM version.
  • Follow the instructions in the guide from the download page on how to discover and start monitoring your Azure applications.

Voilà. If everything worked, you will see something like this:

[Image: SCOM Azure MP in action]


Note: this is a very ‘enterprise-y’ solution – I surely hope to see a more lightweight offering from Microsoft soon, targeted at ISVs and the like.


Hope this helps.

Running a 32-bit IIS application in Windows Azure

Up there, everything is 64-bit. By design.

What to do if you have 32-bit machines locally – erm, sorry: on-premises – and want to move your existing applications, let’s say web applications, to Windows Azure?

For this scenario you need to enable 32-bit application support in (full) IIS in your web role. The following startup task script enables 32-bit applications in the IIS settings by using appcmd.exe.

enable32bit.cmd:

%windir%\system32\inetsrv\appcmd set config -section:applicationPools -applicationPoolDefaults.enable32BitAppOnWin64:true


And this is the necessary startup task defined in the service definition file:

ServiceDefinition.csdef:

<Startup>
   <Task commandLine="enable32bit.cmd" executionContext="elevated" taskType="simple" />
</Startup>


Hope this helps.

12/17/2010

Windows Azure VM Role is still PaaS - if you want IaaS choose Amazon EC2, seriously

‘Nuff said?

Maybe we could blame Microsoft for naming the new VM role feature (Windows Server 2008 R2 is supported as the guest OS; 2011 should bring broader OS support), introduced at PDC10, in a confusing way – but the fact remains:
it is based on the Azure service model, and the Windows Azure Fabric Controller (FC) is still in charge of everything. Although you upload your own prepared VM, the FC may – and will – decide to take your VM instances offline, start new instances, reprogram the load balancers etc.

Yes, Windows Azure Compute is about PaaS (Platform-as-a-Service), also with the VM role now in place. Don’t get confused by the new role feature name. Your applications (and thus roles!) need to be state-agnostic.

BTW: what is not possible with the VM role today is automatic OS updating/patching.
This means you have no feasible way to keep the OS up to date. When you try to run Windows Update, it might work (actually, it should). But then two things can happen:

  • your VM needs to reboot due to the Windows Update patches
  • the FC decides to reboot your VM, or to take it offline and spin up a new instance

In either case you end up with your original VM – bingo (and in the first case you will feel like you are in Groundhog Day… “Hey babe – dududu – I got you babe…”). Therefore, the official hands-on lab shows how to disable Windows Update entirely.

Think about the Windows Azure VM role, twice.

12/16/2010

Sending JSON push notification messages to Urban Airship service with C#

A quick follow-up to a previous post on how to receive push notifications in iOS, e.g. through the Urban Airship cloud service: this time I will show you some snippets you can use to send notifications from your code to Urban Airship, which in turn notifies subscribed devices/apps. This code can run in a service (in my app it is a WCF service) or anywhere else you see fit.

They have a really straightforward JSON API for sending push messages.
This time I tried to model the required JSON payload with WCF’s data contract feature. The following is part of the supported JSON payload modeled in C# (mind the necessary body wrapper):

using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;

[DataContract(Name = "apsNotification")]
public class APSNotification
{
    [DataMember(Name = "aps")]
    public APSBody APS { get; set; }

    [DataContract(Name = "apsBody")]
    public class APSBody
    {
        [DataMember(Name = "badge")]
        public int Badge { get; set; }

        [DataMember(Name = "alert")]
        public string Alert { get; set; }

        [DataMember(Name = "sound")]
        public string Sound { get; set; }
    }

    [DataMember(Name = "aliases")]
    public List<string> Aliases { get; set; }

    public string ToJsonString()
    {
        // Serialize this instance to a JSON string via WCF's data contract support.
        using (var ms = new MemoryStream())
        {
            var ser = new DataContractJsonSerializer(typeof(APSNotification));
            ser.WriteObject(ms, this);
            ms.Seek(0, SeekOrigin.Begin);

            using (var sr = new StreamReader(ms))
            {
                return sr.ReadToEnd();
            }
        }
    }
}
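
For illustration, the serializer then emits JSON along these lines (values made up, formatted for readability; the member order may differ, as DataContractJsonSerializer writes members without an explicit Order alphabetically):

{
  "aliases": ["a-device-alias"],
  "aps": {
    "alert": "You have new data!",
    "badge": 1,
    "sound": "default"
  }
}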

With the help of a little helper class called PushNotifications we can now send out push notifications (given that we have an account with Urban Airship and have set up all the necessary application-specific things) – exception handling omitted:

using System;
using System.Collections.Generic;
using System.Net;
using System.Text;

public static class PushNotifications
{
    public static void SendNotificationMessage(
        string alertText, int badge, string sound,
        string alias, string username, string password)
    {
        var aps = new APSNotification
        {
            APS = new APSNotification.APSBody
            {
                Badge = badge,
                Alert = alertText,
                Sound = sound
            },
            Aliases = new List<string>
            {
                alias
            }
        };
        var json = aps.ToJsonString();

        // POST the JSON payload to the Urban Airship push API
        // using basic authentication.
        var uri = new Uri("https://go.urbanairship.com/api/push/");
        var encoding = new UTF8Encoding();
        var request = (HttpWebRequest)WebRequest.Create(uri);
        request.Method = "POST";
        request.Credentials = new NetworkCredential(username, password);
        request.ContentType = "application/json";
        request.ContentLength = encoding.GetByteCount(json);

        using (var stream = request.GetRequestStream())
        {
            stream.Write(encoding.GetBytes(json), 0, encoding.GetByteCount(json));
        }

        // We do not inspect the response here – checking the
        // status code is left out for brevity.
        var response = request.GetResponse();
        response.Close();
    }
}
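
Calling the helper is then a one-liner – all values below are made-up placeholders (Urban Airship expects your application credentials for the HTTP basic authentication):

PushNotifications.SendNotificationMessage(
    "You have new data!",      // alert text
    1,                         // badge count
    "default",                 // sound
    "a-device-alias",          // alias registered with Urban Airship
    "your-application-key",    // basic auth user name
    "your-application-secret"); // basic auth password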


Now, whenever anything ‘of interest’ happens in my WCF services, I can notify my iOS users using the code above.

Hope this helps.