Web API Feed

Session materials from BASTA! Spring 2015

Here are the slides for my sessions at BASTA! Spring 2015 in Darmstadt:

Many thanks to all attendees and to the conference organizers!

See you next time!


Installing & Running ASP.NET vNext (Alpha 3) on Ubuntu Linux with Mono 3.8 - for real

Yesterday I thought I would try and prove Microsoft’s promise that the new and overall cool ASP.NET vNext would run on many platforms, including Windows, MacOS X, and Linux.

My target platform for this experiment was Linux. The first thing to do in this case is to pick one of the myriad Linux distros. After a bit of investigating I chose the Ubuntu-based Xubuntu. Very slick!

After installing Xubuntu (it took only a few minutes, literally) in a Parallels VM on my MacBook Pro, I went ahead to get Mono running. Turns out that the most reliable way to get recent versions of Mono running on stable Linux distros these days is to suck down the source code and compile it.
So here we go with some simple steps (may take a couple of minutes to get through, though):

sudo apt-get install build-essential    # compiler toolchain for building from source
wget http://download.mono-project.com/sources/mono/mono-3.8.0.tar.bz2
tar -xvf mono-3.8.0.tar.bz2
cd mono-3.8.0/
./configure --prefix=/usr/local         # install into /usr/local
make
sudo make install


This gives us the Mono 3.8.0 CLI:

[Screenshot: the Mono 3.8.0 CLI]


According to the ASP.NET vNext Home repo on GitHub this should be all we need to get started with the samples. NOT. When we build the source of the samples, a network-related exception shows up.

The issue is the certificates used for the package sources. The .NET Framework on Windows uses the Windows certificate store to check whether to accept an SSL certificate from a remote site. Mono has no Windows certificate store; it has its own. By default it is empty, and we need to manage the entries ourselves.

CERTMGR=/usr/local/bin/certmgr
# trust the SSL certificates of the package source hosts
sudo $CERTMGR -ssl -m https://go.microsoft.com
sudo $CERTMGR -ssl -m https://nugetgallery.blob.core.windows.net
sudo $CERTMGR -ssl -m https://nuget.org
sudo $CERTMGR -ssl -m https://www.myget.org/F/aspnetvnext/

# import the Mozilla root certificates into Mono's store
mozroots --import --sync

After these steps we should be able to build the samples from the ASP.NET vNext Home repo successfully, e.g. the HelloMvc sample.

Running kpm restore looks promising, at least:

[Screenshot: kpm restore output]

When we then try to run the sample with k kestrel (Kestrel is the current dev web server in vNext) we get a wonderfully familiar error:

Object reference not set to an instance of an object.

Hooray!!! :-(

After some Googling I found out that the underlying libuv (yes, the same libuv used in node.js) seems to be the problem here. A quick chat with the ASP.NET vNext team revealed that this will likely be fixed in the future, but there are more important things to get done for now.

Anyway, I finally got it to work by replacing libuv as suggested here:

ln -sf /usr/lib/libuv.so.11 native/darwin/universal/libuv.dylib

 

When I now run k kestrel again everything is fine:

[Screenshot: k kestrel up and running]

By default, Kestrel runs on port 5004 – thus opening our browser of choice with http://localhost:5004 gives us the desired result:

[Screenshot: the HelloMvc sample in the browser]
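The same check can be done from a terminal. A quick sketch (assuming curl is installed and Kestrel is still up on its default port):

```shell
# probe the HelloMvc sample on Kestrel's default port 5004;
# --max-time keeps curl from hanging if the server is not up yet
curl -si --max-time 5 http://localhost:5004/ || echo "Kestrel is not reachable"
```

A 200 OK with the rendered HTML of the HelloMvc start page means the whole Mono + vNext stack is working.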

Mission accomplished.
Thanks for listening.


BASTA! Spring 2014: Materials

BASTA! Spring is over, and once again it was great. Thanks to the organizers and of course to the attendees!

Here are the slides for my talks:


Naturally, there are no slides for the full-day workshops on Monday and Friday ;-)

For the Friday workshop "End-to-End Implementation of a Cross-Platform Modern Business Application", here is the "living" GitHub repository that Ingo Rammer and I kept pushing to throughout the day:

https://github.com/thinktecture/basta-endtoend

 

Have fun!


Hands-On Course: "ASP.NET Web API & SignalR: lightweight web-based architectures for you!"

Come and join me Oct 30-31, 2013 in Oslo for a two-day hands-on course organized by ProgramUtvikling.

ASP.NET Web API & SignalR: lightweight web-based architectures for you!

Time for change: whether it is a classic desktop application, a public or internal web site or a variety of mobile apps - customers and end-users long for multiple ways to access and work with data and processes. Web-based architectures, patterns and techniques can make the life of software architects and developers considerably easier in the face of these requirements.

Description:
Let's face it: the current trends and developments, especially in the area of mobile platforms & devices and cloud computing, will force a rethinking of architectural approaches for many software projects and products. If you ignore this today you may be in big trouble tomorrow. How can I reach a plethora of users and client devices? How can I integrate my systems and application parts in a lightweight and interoperable manner? How am I able to push data in near-real-time fashion from my services to my clients?
This course tries to answer these and more questions. Christian Weyer will show you in a pragmatic way how to face the new challenges. See all of this coming to life by using technologies and frameworks like ASP.NET Web API, SignalR, .NET- and HTML5/JavaScript-based clients - mobile or not.

 

More information at the ProgramUtvikling web site.
See you there!

 


New ebook chapter published: Properly integrating SignalR hubs with your AngularJS applications

Ingo and I just published a new book chapter of our henriquat.re online (continuously deployed) ebook.
The topic this time is about properly integrating SignalR hubs with your AngularJS applications to realize near realtime push communication. For web browser, desktop or mobile apps.

"Pushing Data: Integrating With ASP.NET SignalR Hubs"

In modern applications the end users want to get their data. They want it now, and they want it up to date. In fact it does not matter whether these are pure web applications, native desktop installations or mobile apps: everybody wants their data now!

For .NET-minded developers there are a number of options to implement near-real-time push-style communication from the server/the services to the clients/consumers. You can choose plain HTTP or the super-new WebSockets features available in .NET 4.5 together with Windows 8 and Windows Server 2012. But the coolest and increasingly popular approach is to use a new framework: ASP.NET SignalR.

While it is not intended - and surely beyond the scope of this ebook - to give an introduction to or even a deep dive into SignalR, we need to have a look at some concepts and code in order to realize a smooth integration of SignalR and AngularJS.

The final goal of this chapter is to have an AngularJS-style integration of calling and listening to server-side SignalR push services.

 

Enjoy!


Info on my sessions & the workshop at BASTA! Spring 2013

First of all, many thanks again to everyone who came to my two breakout sessions and my full-day workshop in Darmstadt!

Since I didn't use any "real" slide decks in my two sessions but showed demos and code instead, there is nothing to download this time ;) The workshop was the same - interactive!

But here are the promised links to the code & demos:

Thanks and see you soon.

 


Ain’t no IIS: Self-hosting thinktecture IdentityServer v2 – a simple proof-of-concept

There have been a couple of people asking for a sample of how to host the 'non-visual' parts of thinktecture IdentityServer v2 outside of IIS & ASP.NET, e.g. in a Windows service or a console (no, not really…) application.

Here on GitHub you will find a very simple PoC which hosts the OAuth2 token endpoint. That said, it is obviously by no means feature-complete.
This endpoint uses ASP.NET Web API, and thus self-hosting is kind of a piece of cake.

namespace SelfHostConsoleHost
{
    internal class SelfHostServer
    {
        private HttpSelfHostServer selfHost;

        [Import]
        public IConfigurationRepository ConfigurationRepository { get; set; }

        public async void Start(string baseAddress)
        {
            var httpConfig = new HttpSelfHostConfiguration(baseAddress);

            Database.SetInitializer(new ConfigurationDatabaseInitializer());

            Container.Current = new CompositionContainer(new RepositoryExportProvider());
            Container.Current.SatisfyImportsOnce(this);

            ProtocolConfig.RegisterProtocols(httpConfig, ConfigurationRepository);

            selfHost = new HttpSelfHostServer(httpConfig);

            await selfHost.OpenAsync();
        }

        public async void Stop()
        {
            if (selfHost != null)
            {
                await selfHost.CloseAsync();
            }
        }
    }
}

As said, it just offers one endpoint:

namespace SelfHostConsoleHost
{
    public class ProtocolConfig
    {
        public static void RegisterProtocols(HttpConfiguration httpConfiguration, IConfigurationRepository configuration)
        {
            // necessary hack for now - until the DI implementation has been changed
            var a = Assembly.Load("Thinktecture.IdentityServer.Protocols");
 
            var clientAuthConfig = CreateClientAuthConfig();

            httpConfiguration.MessageHandlers.Add(new RequireHttpsHandler());

            if (configuration.OAuth2.Enabled)
            {        
                httpConfiguration.Routes.MapHttpRoute(
                    name: "oauth2token",
                    routeTemplate: Thinktecture.IdentityServer.Endpoints.Paths.OAuth2Token,
                    defaults: new { controller = "OAuth2Token" },
                    constraints: null,
                    handler: new AuthenticationHandler(clientAuthConfig, httpConfiguration)
                );
            }
        }
 
        public static AuthenticationConfiguration CreateClientAuthConfig()
        {
            var authConfig = new AuthenticationConfiguration
            {
                InheritHostClientIdentity = false,
                DefaultAuthenticationScheme = "Basic",
            };

            // accept arbitrary credentials on basic auth header,
            // validation will be done in the protocol endpoint
            authConfig.AddBasicAuthentication((id, secret) => true, retainPassword: true);
 
            return authConfig;
        }
    }
}
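For a quick smoke test of the self-hosted endpoint, a token can be requested with curl. This is only a sketch: the route, client credentials and resource owner credentials below are placeholder assumptions - the actual path is whatever Thinktecture.IdentityServer.Endpoints.Paths.OAuth2Token resolves to, and -k is only needed because of a self-issued SSL certificate (the RequireHttpsHandler enforces HTTPS):

```shell
# request a token via the OAuth2 resource owner password credentials flow
# (base address, route, client and user credentials below are placeholders)
curl -k https://localhost/issue/oauth2/token \
     -u "client:secret" \
     --data "grant_type=password&username=bob&password=abc123&scope=urn:testservice"
```

The basic auth header carries the client credentials; per the configuration above they are accepted as-is and validated later in the protocol endpoint.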

Again: the code is here: Self-Hosted IdentityServer v2 PoC

Hope this helps.


Running thinktecture IdentityServer v2 in a Windows Azure Web Role – from zero to hero (a walkthrough)

OK, I think a couple of you guys have already done it successfully – others are just looking for something written down. Here we go.

Let's start right away by browsing to GitHub and cloning the IdentityServer.v2 repo:

After cloning we have the following code structure in Windows Explorer:

Open Thinktecture.identityServer.sln as an elevated admin (for the Windows Azure Compute Emulator to work correctly). Build the entire solution.

Now choose Add… New project… and add a new Cloud project to the solution.

In the Cloud Service dialog do not choose any new project, just hit OK.

We now add the existing IdSrv WebSite project as a Web Role to the Windows Azure project, like so:

For now, the solution should look something like this:

Alright. On to some essential Cloud stuff now.

We need an SSL certificate. I am going to use an existing self-issued cert from my local machine. This of course needs to be a 'real' certificate if you deploy IdSrv as a production STS to Windows Azure.

Please head over to the WebSite role configuration and the Certificates tab. Specify your desired certificate:

Based on this certificate we now create an SSL endpoint:

OK, this should be it for now.

Let's attack the database side of things. We need a SQL database for our identity configuration and data. I am going to create a new one via the Windows Azure management portal:

Please make a note of the connection string for your SQL database as we still need to change the connection strings inside IdentityServer's configuration files.

Then open up connectionString.config in the Configuration folder inside the WebSite project and adjust the connection strings to point to your SQL database in the Cloud:

 

<connectionStrings>
    <add name="IdentityServerConfiguration"
    connectionString="Server=tcp:….database.windows.net,1433;
    Database=idsrvcloud;User ID=christian@…;Password=...;
    Trusted_Connection=False;Encrypt=True;Connection Timeout=30;"
    providerName="System.Data.SqlClient" />

    <add name="ProviderDB"
    connectionString="Server=tcp:….database.windows.net,1433;
    Database=idsrvcloud;User ID=christian@…;Password=...;
    Trusted_Connection=False;Encrypt=True;Connection Timeout=30;"
    providerName="System.Data.SqlClient" />
</connectionStrings>

… drum roll …

F5 (with the Cloud project as the startup project) and pray …

Enter the basic setup information and you should be good to go. This locally running instance inside the Windows Azure Compute Emulator already uses the Cloud SQL database – just for the record.

Done… well almost … I am spilling the beans already now so that we can save some cycles.

There is an issue with the Membership hash algorithm type on Cloud VMs.

  • Locally: HMACSHA256
  • Azure Cloud Emulator: HMACSHA256
  • Published to Cloud Service: SHA1

So it looks like there must be some machine.config setting in the Cloud Service images – Microsoft is investigating this.

For us it means we need to set the keys explicitly in web.config (you can use a tool like this):

<system.web>
    <machineKey
        decryptionKey="46CD6B691..."
        validationKey="EC4752081..."
        decryption="AES"
        validation="HMACSHA256" />
...
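If you prefer not to use an online generator, random keys of suitable length can also be produced locally. A minimal sketch with openssl (the lengths are assumptions based on common recommendations: a 32-byte AES decryption key and a 64-byte key for HMACSHA256 validation; machineKey expects hexadecimal strings):

```shell
# generate random hex keys for the <machineKey> element
decryption_key=$(openssl rand -hex 32 | tr '[:lower:]' '[:upper:]')   # 32-byte AES key
validation_key=$(openssl rand -hex 64 | tr '[:lower:]' '[:upper:]')   # 64-byte HMACSHA256 key
echo "decryptionKey=\"$decryption_key\""
echo "validationKey=\"$validation_key\""
```

Paste the two values into the machineKey element shown above, and make sure the same keys are used across all role instances.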

OK.

After that we need to export the SSL cert anyway, so that we can upload it to the Cloud Service, e.g. via the management portal.

And then, we finally can publish & deploy to Windows Azure:

After approx. 8 to 10 minutes we have our thinktecture IdentityServer v2 running up in the Cloud.

Hope this helps.


ASP.NET Web API changes from Beta to RC

The official word on changes from Beta to RC for Web API-related topics (filtered from the original page).

  • ASP.NET Web API now uses Json.NET for JSON formatting: The default JSON formatter in ASP.NET Web API now uses Json.NET for JSON serialization. Json.NET provides the flexibility and performance required for a modern web framework.
  • Formatter improvements: The methods on MediaTypeFormatter are now public to enable unit testing of custom formatters. A single formatter can now support multiple text encodings. Use BufferedMediaTypeFormatter to implement simple synchronous formatting support. FormatterContext has been removed. To get access to the request from a formatter on the server, implement GetPerRequestFormatterInstance.
  • Removed System.Json.dll: Because of the overlap with functionality already in Json.NET, the System.Json.dll assembly has been removed.
  • XmlMediaTypeFormatter uses DataContractSerializer by default: The XmlMediaTypeFormatter now uses the DataContractSerializer by default. This means that by default ASP.NET Web API will use the Data Contract programming model for formatting types. You can configure the XmlMediaTypeFormatter to use the XmlSerializer by setting UseXmlSerializer to true.
  • Formatters now always handle the body content: ASP.NET Web API formatters are now used consistently for handling both the request and response content. We have removed IRequestContentReadPolicy. The FormUrlEncodedMediaTypeFormatter class has been updated to use MVC-style model binding, so you can still use the model binding infrastructure for handling form data in the request body.
  • HTTP content negotiation decoupled from ObjectContent: Previously in ASP.NET Web API all HTTP content negotiation logic was encapsulated in ObjectContent, which made it difficult to predict when HTTP content negotiation would occur. We have decoupled HTTP content negotiation from ObjectContent and encapsulated it as an IContentNegotiator implementation. ObjectContent now takes a single formatter. You can run HTTP content negotiation whenever you want using the DefaultContentNegotiator implementation to select an appropriate formatter. IFormatterSelector has been removed.
  • Removed HttpRequestMessage<T> and HttpResponseMessage<T>: Previously there were two ways to specify a request or response with an ObjectContent instance: you could provide an ObjectContent instance directly, or you could use HttpRequestMessage<T> or HttpResponseMessage<T>. Having two ways of doing the same thing complicated request and response handling, so HttpRequestMessage<T> and HttpResponseMessage<T> have been removed. To create content negotiated responses that contain an ObjectContent, use the CreateResponse<T> extension methods on the request message. To send a request that contains an ObjectContent, use the PostAsync<T> extension methods on HttpClient. Or, use the PostAsJsonAsync<T> and PostAsXmlAsync<T> extension methods to specify a request that will be specifically formatted as JSON or XML respectively.
  • Simplified action parameter binding: You can now predictably determine whether an action parameter will be bound to the request body. This ensures that the request stream is not unnecessarily consumed. Parameters with simple types by default come from the URL. Parameters with complex types by default come from the body. There can be only one body parameter. You can explicitly specify whether a parameter comes from the URL or from the body using the [FromUri] and [FromBody] attributes.
  • Query composition is now implemented as a reusable filter: Previously support for query composition was hard-coded into the runtime. Query composition is now implemented as a reusable filter that can be applied as an attribute ([Queryable]) to any action that returns an IQueryable instance. This attribute is now required to enable query composition.
  • Cookies: The HttpRequestMessage and HttpResponseMessage classes expose the HTTP Cookie and Set-Cookie headers as raw strings and not structured classes. This made it cumbersome and error-prone to work with cookies in ASP.NET Web API. To fix this we introduced two new classes, CookieHeaderValue and CookieState, that follow RFC 6265 HTTP State Management Mechanism. You can use the AddCookies extension method to add a Set-Cookie header to a response message. Use the GetCookies extension method to get all of the CookieHeaderValues from a request.
  • HttpMessageInvoker: The HttpMessageInvoker provides a lightweight mechanism to invoke an HttpMessageHandler without the overhead of using HttpClient. Use HttpMessageInvoker for unit testing message handlers and also for invoking message handlers on the server.
  • Response buffering improvements: When web-hosting a web API the response content length is now set intelligently so that responses are not always chunked. Buffering also enables reasonable error messages to be returned when exceptions occur in formatters.
  • Independently control IHttpController selection and activation: Implement IHttpControllerSelector to control IHttpController selection. Implement IHttpControllerActivator to control IHttpController activation. The IHttpControllerFactory abstraction has been removed.
  • Clearer integration with IoC containers that support scopes: The dependency resolver for ASP.NET Web API now supports creating dependency scopes that can be independently disposed. A dependency scope is created for each request and is used for controller activation. Configuring dependency resolution (i.e. HttpConfiguration.DependencyResolver) is optional and is now configured separately from the default services used by ASP.NET Web API (HttpConfiguration.Services). However, the service locator consults the dependency resolver first for required services and then falls back to explicitly configured services.
  • Improved link generation: The ASP.NET Web API UrlHelper now has convenience methods for generating links based on the configured routes and the request URI.
  • Register resources for disposal at the end of the request lifetime: Use the RegisterForDispose extension method on the request to register an IDisposable instance that should be disposed when the request is disposed.
  • Monitoring and diagnostics: You can enable tracing by providing an ITraceWriter implementation and configuring it as a service using the dependency resolver. The ILogger interface has been removed.
  • Create custom help and test pages: You now can easily build custom help and test pages for your web APIs by using the new IApiExplorer service to get a complete runtime description of your web APIs.
  • Entity Framework based scaffolding for web APIs: Use the Add Controller dialog to quickly scaffold a web API controller based on an Entity Framework based model type.
  • Create unit test projects for Web API projects: You can now easily create a unit test project for a Web API project using the New ASP.NET MVC 4 Project dialog box.
  • Unauthorized requests handled by ASP.NET Web API return 401 Unauthorized: Unauthorized requests handled by ASP.NET Web API now return a standard 401 Unauthorized response instead of redirecting the user agent to a login form so that the response can be handled by an Ajax client.
  • Configuration logic for MVC applications moved under the App_Start directory: The configuration logic for MVC applications has been moved from Global.asax.cs to a set of static classes in the App_Start directory. Routes are registered in RouteConfig.cs. Global MVC filters are registered in FilterConfig.cs. Bundling and minification configuration now lives in BundleConfig.cs.
  • Add Controller to any project folder: You can now right click and select Add Controller from any folder in your MVC project. This gives you more flexibility to organize your controllers however you want, including keeping your MVC and Web API controllers in separate folders.

 


Light-weight web-based architectures: Web APIs & Services (or: ASP.NET vs. WCF?) – a personal view

As much as I like WCF (and as much as I made money with it in the past years) – I think it is now time to re-think some approaches and some architectural ideas. What is ahead of a lot of us is what I call ‘light-weight architectures for reach’.

Just to give you some scope: I am one of the original Digerati board members of Indigo (WCF). Some people say I know WCF inside out - I have given several dozen introductions to WCF in seminars, conference and user group sessions worldwide, and did countless consulting gigs in Europe with WCF (and no, I did not really find time to write a book – besides a chapter here).

OK, so here goes...

There is currently a shift - or rather a new focus - in distributed application architecture. And no, I am not going to call it REST, and I am not saying it is the one and only true thing from now on. But we have new drivers like application mash-ups, mobile devices and cloud computing which force us to move away from the good old feature-rich (and somewhat heavy-weight) SOAP-based approaches. This development and this need are undeniable. One possible solution to these new needs is the use of light-weight Web APIs (which can be evolved into a REST-ful API or be used 'just' in an RPC-ish way).

So, which technology framework should we as .NET developers choose to build Web APIs? One choice is to leverage WCF's WebHttp model. After working with it for some years I quickly noticed a number of shortcomings (testability/mockability, serializers, ...) and wished that Redmond would come up with something better. *Of course* it had to be WCF, it had to be part of the overall "let's abstract away communication 'details'". And so it happened that they announced WCF Web API (and I was lucky enough to be part of the advisory board, again). All was fine in my world. It seemed.

Fine until I again realized that it has shortcomings when building somewhat more flexible and advanced Web APIs. But this time the limitations were actually in WCF itself. The 'SOAP message' model of WCF just does not fit at all into this world. Everything the team tried to accomplish to provide features for a web-based paradigm was just trying to put the round peg into the square hole (and vice versa). For a web-based base framework, maybe WCF is not the right choice.
And it turned out that there is a very good framework that can deal with the web and HTTP in a fantastic way. So, the decision to build a .NET web API framework on ASP.NET (and ASP.NET MVC, to be more precise) was somehow natural.
I personally had one very strong wish - something I have learned in the past years is invaluable for a lot of our customers... self-hosting! So, the advisors pushed hard to get self-hosting, and here it is: we can perfectly build our own host processes and host the exact same Web API implementation either there or in IIS/ASP.NET.

Today, I am quite happy with what happened. The only issue I still have is the name: I would definitely have not called it ASP.NET Web API, but rather .NET Web API. If you are working in my geographical area then you may understand why.

Anyway... talking a bit about application architecture (.NET server-side):
I would build a WCF service and an ASP.NET Web API (at least with the RPC-ish approach) as a facade only - the actual logic (business logic or data/resource access) is hidden behind that facade. With this simple pattern it is no problem to have both WCF SOAP services and ASP.NET Web APIs sitting right next to each other happily ever after. I am doing it all the time. And you are then actually able to leverage the full power of each framework - get the most out of SOAP/XSD/WSDL/WS-* and the best out of web APIs/REST/whatever.

Fact is, I am using Web APIs quite a lot these days, as we are building cloud-based systems in increasing numbers and also cross-device mobile apps.

My 2 services cents.
Flame away.