I was asked an interesting question that would potentially apply to lots of products, and I thought it would be worth talking about here. It went like this:

Most sites have a sign-in mechanism that takes passwords from end users via an ASP.NET textbox. We go on to store that password in a string and, by definition, strings are stored in managed memory on the heap; if one happens to take a hang dump, the password will be there in clear text. It is also true that we do not control exactly when the clear text password gets garbage collected. The question, then, is whether immediately storing user entered passwords in a System.Net.NetworkCredential and/or System.Security.SecureString in ASP.NET would provide any worthwhile security benefit.

My initial answer started as a "maybe" and finally morphed into a fairly strong "no".

To be clear, there are lots of situations where managing sensitive information with SecureString is critical. Let's look at a canonical command line example:

using System;
using System.ComponentModel;
using System.Diagnostics;
using System.Security;

public static void Main()
{
    using (var securePwd = new SecureString())
    {
        ConsoleKeyInfo key;
        Console.Write("Enter password: ");
        do
        {
            key = Console.ReadKey(true);
            if (key.Key >= ConsoleKey.A && key.Key <= ConsoleKey.Z)
            {
                securePwd.AppendChar(key.KeyChar);
                Console.Write("*");
            }

        }
        while (key.Key != ConsoleKey.Enter);

        Console.WriteLine();

        try
        {
            Process.Start("Notepad.exe", "MyUser", securePwd, "MYDOMAIN");
        }
        catch (Win32Exception e)
        {
            Console.WriteLine(e.Message);
        }
    }
}

You can see in this console app I am appending each character directly to the SecureString, so there are no strings floating about. I am also taking advantage of the using pattern to ensure the SecureString object is disposed of quickly, or I could have explicitly called the Dispose method to release the memory more directly. Would such a pattern help in ASP.NET? I am inclined to think not. What is important to remember here is that strings are immutable (unchangeable), so after you create one, for any reason, anything you do to it creates a new string for you to manipulate. With that in mind, consider how you get from the w3wp process to your specific app domain in the following image.

w3wp appdomain relationship

If the information you are getting off the wire arrives as strings to begin with, then whatever you deserialize that string into after the fact becomes the equivalent of closing the barn door after the horse has already bolted.
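To make that objection concrete, here is a sketch (names are mine, purely illustrative) of what converting an already-received string into a SecureString looks like. The copy protects nothing, because the immutable source string, and any intermediate copies made along the way, are still sitting on the managed heap awaiting collection:

```csharp
using System;
using System.Security;

static class LateSecureString
{
    // By the time this method runs the plain text password has already lived
    // on the managed heap as an immutable string (Request.Form, model binding,
    // etc.). Copying it into a SecureString cannot scrub those earlier copies;
    // it only adds one protected copy to the pile.
    public static SecureString ToSecureString(string alreadyOnTheHeap)
    {
        var secure = new SecureString();
        foreach (var c in alreadyOnTheHeap)
            secure.AppendChar(c);
        secure.MakeReadOnly();
        return secure;
    }
}
```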

This is a loosely held opinion, and I would love for someone to present a contrary one, even if it covers only a small use case where SecureString might be useful when handling passwords this way.



August 28, 2018 23:49  Comments [0]
Tagged in .NET | ASP.NET
Share on Twitter, Facebook and Google+

I continue to explore ASP.NET Core through the lens of DasBlog, and I am finding it fascinating to juxtapose how problems were solved in 2004 versus 2017; it provides an enlightening perspective on the maturity of web development frameworks. A feature of DasBlog that I need to emulate for a Web Core UI replacement is the ability to define and select a theme for your blog.

The first step was simply storing the current theme name. Previously the DasBlog devs created a custom settings file; ASP.NET Core, however, assumes that app settings will be stored in the appsettings.json file, which of course looks like this:

"DasBlog": {
    "Theme": "DasBlog"
}

IViewLocationExpander provides a straightforward way to view and alter the default paths used by the RazorViewEngine; as you are probably aware, it checks /Views/Shared as well as the /Views/<Controller> folder associated with each controller and action. The following class replaces the shared location with the one I have associated with the theme name stored in the app settings file.

public class DasBlogLocationExpander : IViewLocationExpander
{
    private const string _themeLocation = "/Themes/{0}";
    private string _theme;

    public DasBlogLocationExpander(string theme)
    {
        _theme =  string.Format(_themeLocation, theme);
    }

    public IEnumerable<string> ExpandViewLocations(ViewLocationExpanderContext context, IEnumerable<string> viewLocations)
    {
        return viewLocations.Select(s => s.Replace("/Views/Shared", _theme));
    }

    public void PopulateValues(ViewLocationExpanderContext context)
    {

    }
}

You then wire this up in the Startup class's ConfigureServices method, pulling in the predefined theme folder name as follows:

services.Configure<RazorViewEngineOptions>(rveo => {
        rveo.ViewLocationExpanders.Add(new DasBlogLocationExpander(Configuration.GetSection("DasBlog")["Theme"]));
});

Nice and simple, and this code should work for both ASP.NET Core and MVC 6.
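The expansion itself is easy to see in isolation. The two default search locations below are assumed here for illustration (the real list comes from RazorViewEngineOptions), but the string replacement is exactly what the expander does:

```csharp
using System;
using System.Linq;

static class ThemeExpansionDemo
{
    // Mirrors what the expander does to the Razor view search paths.
    public static string[] Expand(string theme, string[] viewLocations)
    {
        var themePath = string.Format("/Themes/{0}", theme);
        return viewLocations.Select(s => s.Replace("/Views/Shared", themePath)).ToArray();
    }

    public static void Main()
    {
        // Assumed defaults for illustration only.
        var defaults = new[] { "/Views/{1}/{0}.cshtml", "/Views/Shared/{0}.cshtml" };
        foreach (var path in Expand("DasBlog", defaults))
            Console.WriteLine(path);
        // /Views/{1}/{0}.cshtml
        // /Themes/DasBlog/{0}.cshtml
    }
}
```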



July 31, 2017 0:09  Comments [0]
Tagged in ASP.NET | ASP.NET Core

Kestrel is the new cross platform .NET web server (based on libuv) which runs on Linux, Mac and Windows 10 and will, eventually, run on Raspberry Pi. One of the outstanding improvements is the sheer speed: according to some measurements it is about 20 times faster than ASP.NET running on IIS. This is clearly amazing, and having long been an advocate for high performance platforms I was curious about some of the changes that were introduced to accelerate performance in such a drastic way.

The most important changes from my standpoint were the ideas behind reducing garbage collection pressure and taking advantage of advanced CPU instructions. To be clear, there are other changes, but I understood these the clearest and could envision applying them to existing apps.

Quick Garbage Collection Primer

Most developers I converse with are fully aware that .NET garbage collection (GC) is organized into generations (0, 1 and 2); however, the fact that the GC further divides objects into small and large object heaps is occasionally missed. When an object is large (greater than 85,000 bytes) some attributes and actions associated with it become more significant than if the object is small. For instance, compacting it, meaning copying the memory elsewhere on the heap, is considered an expensive operation for larger objects. I tend to think of garbage collection in the following logical and physical layouts:

Logical view of the GC Heap

  • Generation 0 – For short lived objects
  • Generation 1 - Objects that have survived gen 0, this is a buffer between gen 0 and the long lived gen 2
  • Generation 2 - Objects that have survived gen 1, and large objects (> 85000 bytes)

Physical view of the managed heap segment

  • Small object heap (SOH) < 85000 bytes [starts in gen 0]
  • Large object heap (LOH) > 85000 bytes [starts in gen 2, because compaction is expensive]

A GC occurs if one of the following three conditions is present:

  • Allocation exceeds the generation 0 or large object threshold
  • System is in a low memory situation
  • System.GC.Collect is called manually

Reducing Garbage Collection Pressure

Managing memory on our behalf helps us immeasurably but the process is not cost free, so understanding the GC design is essential for efficient application servers. We need to be concerned with reducing GC pressure, by that I mean, reducing the conditions under which a GC is triggered. One really clever way to do this is to reduce the continuous need to allocate strings (which begin in Gen 0) by converting them to bytes and then ensuring that they live in the LOH by using a Memory Pool.

Remember, the goal is to reduce the need for objects to be promoted through the generations unnecessarily (really CPU intensive). So in Kestrel all the strings that are important to the HTTP request/response life cycle (GET, POST, HEAD, etc.) are created as static bytes, made part of a contiguous memory pool, and then pinned. Pinning the memory pool simply prevents the objects from being moved around, which frees the GC from the responsibility of constantly checking whether the memory needs to be decommitted, further reducing GC pressure. During a normal gen 2 collection the GC will take the opportunity to release segments that have no live objects on them back to the OS (by calling VirtualFree), but for pinned objects this would be skipped.
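The SOH/LOH split described above is easy to observe directly: an allocation over the 85,000 byte threshold is reported as generation 2 from the moment it is created.

```csharp
using System;

static class LohDemo
{
    public static void Main()
    {
        // Small allocations start life in generation 0 on the SOH; anything
        // over the 85,000 byte threshold goes straight to the LOH, which the
        // GC reports as generation 2 immediately.
        var small = new byte[1_024];
        var large = new byte[100_000];
        Console.WriteLine(GC.GetGeneration(small)); // 0
        Console.WriteLine(GC.GetGeneration(large)); // 2
    }
}
```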

Large Object Heap (LOH) Graph

Avoiding Strings

All HTTP requests arrive at the designated ports as bytes, and normally we would go about the process of converting them to strings, but as I noted, Kestrel has gone about the business of defining known strings as bytes, so all common comparisons become a mathematical operation rather than a string comparison. Luckily the verbs and headers are all 8 bytes (or less), so you can define each of them with a static long. This means Kestrel can process many requests without dealing with strings. No strings means no allocations and no deallocations, which means reduced GC pressure … yay.

So when you retrieve a POST from the wire you can do a bitwise compare against a statically assigned constant (the actual code is here):

public const string HttpPostMethod = "POST";
private readonly static long _httpGetMethodLong = GetAsciiStringAsLong("GET \0\0\0\0");
private readonly static long _httpPostMethodLong = GetAsciiStringAsLong("POST \0\0\0");

/// <summary> 
/// Checks that up to 8 bytes from <paramref name="begin"/> correspond to a known HTTP method. 
/// </summary> 
public static bool GetKnownMethod(this MemoryPoolIterator begin, out string knownMethod) 
{ 
    knownMethod = null; 
    var value = begin.PeekLong(); 
    
    if ((value & _mask4Chars) == _httpGetMethodLong) 
    { 
        knownMethod = HttpGetMethod; 
        return true; 
    } 
    foreach (var x in _knownMethods) 
    { 
        if ((value & x.Item1) == x.Item2) 
        { 
            knownMethod = x.Item3; 
            return true; 
        } 
    } 
    return false; 
}

private readonly static Tuple<long, long, string>[] _knownMethods = new Tuple<long, long, string>[8];

static MemoryPoolIteratorExtensions()
{
    // ...
    _knownMethods[1] = Tuple.Create(_mask5Chars, _httpPostMethodLong, HttpPostMethod);
    // ...
}

Why do this when a simple IndexOf, Compare or EndsWith method exists? Well, at this layer you are obligated to care about every allocation because it has a direct consequence on the overall speed. Every microsecond counts.
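The trick is easy to reproduce outside of Kestrel. Here is a sketch in the same spirit (the helper name is mine, not Kestrel's, and it assumes a little-endian platform, as Kestrel's code does): pack up to 8 ASCII characters into a long, then a single AND plus a compare replaces a string allocation and comparison.

```csharp
using System;
using System.Text;

static class AsciiLongDemo
{
    // Pack up to 8 ASCII characters into a single long (little-endian),
    // in the spirit of Kestrel's GetAsciiStringAsLong.
    public static long AsciiToLong(string s)
    {
        var bytes = new byte[8];
        Encoding.ASCII.GetBytes(s, 0, s.Length, bytes, 0);
        return BitConverter.ToInt64(bytes, 0);
    }

    public static void Main()
    {
        // Mask that keeps only the first four characters (the low four bytes).
        const long mask4Chars = 0x00000000FFFFFFFF;
        long knownGet = AsciiToLong("GET \0\0\0\0");

        // First eight bytes of an incoming request line.
        long request = AsciiToLong("GET /ind");

        // One AND and one compare instead of allocating and comparing strings.
        Console.WriteLine((request & mask4Chars) == knownGet); // True
    }
}
```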

CPU Instructions

It has been a long time since I have been directly concerned with CPU instructions; however, Intel long ago introduced Advanced Vector Extensions (AVX) which allow Single Instruction Multiple Data (SIMD) operations on Intel architecture CPUs. In simple terms this means Kestrel can look at more than one byte at a time in a single CPU instruction; traditionally, to do this you would need to write your code in assembly language … ugh.

Not to worry: .NET Core uses the RyuJIT compiler (also used by .NET 4.6), which allows the runtime to emit machine code that uses AVX (check out System.Numerics.Vector).

So what does this really mean? It permits you to perform operations on data wider than the register size of the CPU. So on a 64 bit CPU you can actually perform 128 bit operations by using CPU extensions (up to 512 bits with AVX-512). You can operate on 16 bytes (or 2 longs) at a time rather than looping through individual bytes as they are retrieved from the wire.
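System.Numerics.Vector makes that width-of-the-register trick available from ordinary C#. A sketch (the 64 byte buffer and the newline target are arbitrary choices of mine): scan a buffer one SIMD-width chunk at a time instead of byte by byte.

```csharp
using System;
using System.Numerics;

static class VectorScanDemo
{
    // Returns the start index of the first SIMD-width chunk containing the
    // target byte, or -1 if no full chunk contains it. Vector<byte>.Count is
    // hardware dependent (16 with SSE2, 32 with AVX2), so a single compare
    // covers many bytes at once.
    public static int FindChunk(byte[] data, byte target)
    {
        var needle = new Vector<byte>(target);
        for (int i = 0; i <= data.Length - Vector<byte>.Count; i += Vector<byte>.Count)
        {
            if (Vector.EqualsAny(new Vector<byte>(data, i), needle))
                return i;
        }
        return -1;
    }

    public static void Main()
    {
        var data = new byte[64];
        data[37] = (byte)'\n';
        // Prints the start of the chunk holding the newline (hardware dependent).
        Console.WriteLine(FindChunk(data, (byte)'\n'));
    }
}
```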

Summary

Understanding the garbage collection process is critical to building high performance platforms like Kestrel, and while this kind of design consideration is an edge case for most of us, understanding the basics of GC can improve even the simplest modern applications. Kestrel's performance (clocked at 5 million requests a second, I believe) is a testament to a dedicated Microsoft team and its commitment to collaborating with the open source community.




July 14, 2016 4:52  Comments [0]
Tagged in ASP.NET | DotNetCore

Are you old enough to remember the release of ASP.NET? I unfortunately am. I distinctly recollect it presenting a straightforward abstraction over the existing Classic ASP programming model. It took me a few years to realize, however, that the abstraction layer was designed to hide the truth of traditional web development from novices like me, and that this was not necessarily positive. But it genuinely assisted the transition to the web for those of us who were more comfortable in the event driven Windows desktop world.

How about the ASP.NET Page execution cycle diagrams? Just wow! Our world was ruled by HttpModules and HttpHandlers, and web services could only be constructed in ASMX files; it worked, but it represented a kind of cynical compromise for web development standards.

From my perspective, until we got genuine MVC integration we were living and working in a framework that was designed by Microsoft for Microsoft. Now, for the first time in a long time, I am confident that the changes upon us are built on the idea of limited compromise. We have gone from being inexorably tied to IIS and Windows to punching holes in the space-time continuum and popping up in self-host scenarios, and even on Mac and Linux platforms, with the notable release of ASP.NET Core 1.0.

The framework we need

So what is ASP.NET Core?

  • Light-weight and modular HTTP request pipeline
  • Single web stack for Web UI and Web APIs
  • Built on .NET Core
  • Ships via NuGet packages
  • Build and run ASP.NET apps cross-platform on Windows, Mac and Linux

The .NET Core command line (CLI, previously DNX) is the runtime environment (and SDK) that has everything you need to build and run .NET applications on Windows, Mac and Linux. Instructions for installing the SDK can be found here. Creating a new project is as simple as this:

  • Open your command window
  • Create a new folder and navigate to it
  • dotnet new creates a new project
  • dotnet restore brings down the required NuGet packages
  • dotnet run to run the application

That’s it! After doing this there will be an auto generated Program.cs file in the new folder, with a standard void Main() method for a console app. You can pick your text editor and have at it, something like this…

public class Program
{
    public static void Main(string[] args)
    {
        var host = new WebHostBuilder()
            .UseKestrel()
            .UseStartup<DoStuff>()
            .Build();

        host.Run();
    }
}

public class DoStuff
{
    public void Configure(IApplicationBuilder app)
    {            
        app.Run(async (context) =>
        {
            await context.Response.WriteAsync(
                "Doing stuff ... I promise ");                                                
        });
    }
}

Just seeing how quickly you can get started provides a great opportunity for teaching and hackathons: instead of spending the first 2 hours ensuring everyone is on the same page, setup can take just minutes. This is great!

Middleware

HttpModules and Handlers were once my go-to ASP.NET interview talking point; I used to use them to gauge whether folks were “Senior” or not. However, that mechanism is no longer in the pipeline, and we are left with the versatile Middleware, defined below:

Middleware – Pass through components that form a pipeline between a server and application to inspect, route, or modify request and response messages for a specific purpose.
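That definition can be sketched without any ASP.NET dependency at all: each "middleware" is just a function that wraps the rest of the pipeline and can act before and after passing control along. The component names below are mine, purely illustrative.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// A toy model of the middleware concept: each component wraps the rest of
// the pipeline and can act before and/or after calling into it.
static class PipelineDemo
{
    public static async Task<List<string>> Run()
    {
        var log = new List<string>();

        // Terminal handler (the "application" at the end of the pipeline).
        Func<Task> app = () => { log.Add("handler"); return Task.CompletedTask; };

        // Two middleware components, each wrapping the component after it.
        Func<Task> auth = async () => { log.Add("auth in"); await app(); log.Add("auth out"); };
        Func<Task> logging = async () => { log.Add("log in"); await auth(); log.Add("log out"); };

        await logging();
        return log;
    }

    public static void Main()
    {
        Console.WriteLine(string.Join(" -> ", Run().Result));
        // log in -> auth in -> handler -> auth out -> log out
    }
}
```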

You can use Middleware to implement tasks as requests arrive, such as authorization, authentication, session state management, etc. For starters, though, you might consider using existing Middleware (Authentication, Diagnostics, Routing, Working with Static Files); the following adds MVC to the pipeline.

public class DoStuff
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddMvc();
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseMvc();
    }
}

There is so much more to talk about here (Services, Hosting), but I am hoping this sets the groundwork for future forays into the .NET Core space; I am especially interested in what this means for practical memory profiling in the IIS space.



July 6, 2016 2:31  Comments [0]
Tagged in ASP.NET | DotNetCore

Got this really strange error in Visual Studio 2012 today:

"ASP.NET 4.5.256 has not been registered on the Web server.  You need to manually configure your Web server for ASP.NET 4.5.256 in order for your site to run correctly."



October 6, 2015 0:55  Comments [0]
Tagged in ASP.NET | Visual Studio
Share on Twitter, Facebook and Google+
Tag Helpers allow server-side code to participate in the rendering of HTML elements in Razor files. In this regard they are similar to HTML Helper methods; the biggest difference is that Tag Helpers attach to HTML elements in your Views rather than being called as methods embedded within the HTML.


July 17, 2015 1:37  Comments [0]
Tagged in ASP.NET
One of the big changes that arrived with the latest version of ASP.NET is the notion of hosting ASP.NET applications outside of IIS, which essentially means there is no reliance on System.Web. For those of us who have been doing large scale ASP.NET development, such an idea borders on the absurd. IIS has sat at the center of the universe for so long that this, at first glance, looks like a radical departure from our well-trodden path. The truth of the matter is there are many applications that can and should take advantage of a life cycle without IIS, places where web semantics make a lot of sense but IIS simply does not. As more platforms are supported outside Windows, IIS may simply not exist at all. This left a rather large gap of features embedded within IIS that Web API version 1 relied upon, and it is why the future vision of ASP.NET appears to have abandoned an IIS centric approach and embraced security with strong consideration for platform agnostic remedies.


May 21, 2015 3:57  Comments [0]
Tagged in ASP.NET | Web API

I have been heavily invested in WCF for a long time now, but the tide has mostly turned on public facing APIs (and some private ones); the industry has almost exclusively committed to the principles of RESTful APIs. WCF still has many advantages over REST, but its biggest weakness, in my humble opinion, is the limited support for non-Microsoft platforms.

SOAP Service Proxy Generator Support

REST’s natural alignment to HTTP is what makes it so easy to use and implement on the widest variety of devices.

RESTful principles

Introduced by Roy Fielding in his dissertation, Architectural Styles and the Design of Network-based Software Architectures:

“The key principles of REST involve separating your API into logical resources. These resources are manipulated using HTTP requests where the method (GET, POST, PUT, PATCH, DELETE) has specific meaning.”

REST should be thought of as a style rather than a standard, and as with all styles, how it is implemented can be considered a matter of taste; however, the following are considered the formal constraints:

  • Client-server
  • Stateless
  • Cacheable
  • Uniform interface
  • Layered system
  • Code on demand (optional)

To describe how you would apply some of this RESTful design, we will consider a fictitious and arbitrary request to create an API for a library, where we support adding shelves and then adding books to those shelves.

Designing Good Resource URIs

Let's begin with the naming conventions. Because most .NET developers come from an object oriented background, we tend to look at each object as a self contained entity with methods treated as RPC (either directly or via a proxy). This tends to lead to method names with verbs that indicate some specific set of actions; the following would be bad examples of RESTful methods.

  • GetBookShelves
  • SetBookShelves
  • GetBook

With RESTful styling we should try to use pluralized nouns; the HTTP method will tell us what we are actually doing with the URI.

api/bookshelves /// Gets a list of book shelves 
api/bookshelves/1 /// Gets a single book shelf based on the id of 1
api/bookshelves/1/books /// Gets a list of books from the book shelf based on the id of 1
api/bookshelves/1/books/1 /// Gets a book based on the id of 1, from the book shelf with the id of 1

Specific calculations on collections or groups tend to separate the RESTful purists from the pragmatists. For example, if we wanted to know how many pages exist on a shelf (not sure why you would), then that could be defined by a property returned from a simple GET.

api/bookshelves/1

Or I think, more appropriately, we create a new resource for example:

api/bookshelvespagetotals/1

I see the following example a lot when perusing the web for API examples; the truth is this style diverges from RESTful principles but makes much more sense to me.

api/bookshelves/1/total

Responding to Resource Requests

Predictability is really important for folks who are consuming your API, and while consistency in URIs helps discoverability, I would suggest that consistent success/error handling helps developers deal with responses in a compatible manner. The basic consensus on supported HTTP status codes:

  • 20x – Successful
  • 40x - The client did something wrong (formatting, authentication, authorization, etc.)
  • 50x - The server did something wrong, there is no obvious remedy from a client perspective.

Here are some more specific responses I see used for GET, POST, DELETE, PUT and PATCH.

api/bookshelves

  • GET -> 200 (Ok); 404 (Not Found), 500 (Internal Server Error)
  • POST  -> 201 (Created); 400 (Bad Request), 500 (Internal Server Error)

api/bookshelves/1

  • GET -> 200 (Ok); 404 (Not Found), 500 (Internal Server Error)
  • DELETE-> 204 (No content); 404 (Not Found), 400 (Bad Request); 500 (Internal Server Error)
  • PUT -> 200 (Ok); 404 (Not Found), 400 (Bad Request); 500 (Internal Server Error)
  • PATCH -> 200 (Ok); 404 (Not Found), 400 (Bad Request); 500 (Internal Server Error)

In general the following also apply:

  • 401 (Unauthorized) – Invalid/No credentials supplied.
  • 403 (Forbidden) – Authenticated user does not have access to a specific resource.
  • 405 (Method not allowed) – The HTTP method being requested is not supported for this resource.

Creating RESTful ASP.NET Web API

To create a new project in Visual Studio 2013, you can elect to create a new MVC project and then select Web API.

VSWebAPI

This action does a few things in the background, but for now I want to focus on the Controller, which you will notice looks an awful lot like MVC. One of the differences, of course, is the use of System.Web.Http.ApiController vs System.Web.Mvc.Controller. I am told that in ASP.NET 5 the two are unified into a single controller, with methods returning ActionResult by default.

public class BookShelvesController : System.Web.Http.ApiController
{
    // GET api/bookshelves
    public IEnumerable<string> Get()
    {
        return new string[] { "value1", "value2" };
    }

    // GET api/bookshelves/5
    public string Get(int id)
    {
        return "value";
    }

    // POST api/bookshelves
    public void Post([FromBody]string value)
    {
    }

    // PUT api/bookshelves/5
    public void Put(int id, [FromBody]string value)
    {
    }

    // DELETE api/bookshelves/5
    public void Delete(int id)
    {
    }
}

As the comments clearly indicate, the actions map to HTTP methods (GET, POST, PUT, DELETE) by convention. If you wanted a more descriptive name in your controller, you could also prefix your action method name with the HTTP verb, for example “GetCustomer”. Additionally, you could assign HTTP verbs explicitly by adding attributes to any methods you choose:

[HttpGet]
[HttpPost]
[HttpPut]
[HttpDelete]

Configuring Routes

Routes are what map your URI request to an ApiController; the default route looks like this:

api/{controller}/{id}

  • {controller} + “Controller” = ApiController type name.
  • {id} is passed as an argument to the action method.
  • HTTP Verb (Get, Post, Put, Delete) is used to determine action.

Using our example then…

api/bookshelves/26

  • “bookshelves” => BookShelvesController class.
  • HTTP GET implies method that starts with the word “Get” or a method with the attribute of HttpGet.
  • In this case “26” is passed to the Get(object id) method.

Your default Web API routing code is found in App_Start/WebApiConfig.cs (the MVC routes live separately in RouteConfig.cs); the Web API 2 template default looks like this:

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        config.MapHttpAttributeRoutes();

        config.Routes.MapHttpRoute(
            name: "DefaultApi",
            routeTemplate: "api/{controller}/{id}",
            defaults: new { id = RouteParameter.Optional }
        );
    }
}

We can use routing to map multiple URIs to the same set of resources. For example, by implementing some basic changes to routing you can ensure that a call like this:

api/books/1 

Can have equivalency to the following:

api/bookshelves/1/books/1


March 19, 2015 1:12  Comments [0]
Tagged in ASP.NET | Design | Web API

Got into a discussion recently about securing websites via HTTPS and the implications of allowing any part of your site to be loaded via HTTP. The most egregious designs load the site using HTTP and simply complete a form POST for sensitive data via HTTPS. Good enough? Not by a long shot!

Two points I want to bring up about this:

  1. Any part of your site that gets to you via HTTP can easily be manipulated by a man-in-the-middle attack, leaving the integrity of anything in the page in doubt, including the endpoint of your “secure” POST.
  2. Non-technical end users are realizing (if only at a high level) that the green lock in the address bar means a more secure conversation.

It is disappointing, then, to see major sites happily present entire pages (or just resources) over HTTP when there are clear advantages to pushing the entire site via HTTPS. The HTTP Strict Transport Security (HSTS) standard is a relatively new mechanism designed to facilitate this practice in conjunction with a compliant browser; the abstract for HSTS reads as follows:

This specification defines a mechanism enabling web sites to declare themselves accessible only via secure connections and/or for users to be able to direct their user agent(s) to interact with given sites only over secure connections. This overall policy is referred to as HTTP Strict Transport Security (HSTS). The policy is declared by web sites via the Strict-Transport-Security HTTP response header field

Here are some scenarios it helps combat:

User bookmarks or manually types http://example.com and is subject to a man-in-the-middle attacker

  • HSTS automatically redirects HTTP requests to HTTPS for the target domain

Web application that is intended to be purely HTTPS inadvertently contains HTTP links or serves content over HTTP

  • HSTS automatically redirects HTTP requests to HTTPS for the target domain

A man-in-the-middle attacker attempts to intercept traffic from a victim user using an invalid certificate and hopes the user will accept the bad certificate

  • HSTS does not allow a user to override the invalid certificate message

There are a variety of ways to tackle integrating this solution into IIS; here are the ones I have looked at recently.

Configure IIS directly

IIS does have the ability to add custom header fields to the HttpResponse:

  1. Open IIS Manager and navigate to the level you want to manage (these instructions assume IIS 7 or later).
  2. In Features View, double-click HTTP Response Headers.
  3. On the HTTP Response Headers page, in the Actions pane, click Add.
  4. In the Add Custom HTTP Response Header dialog box, enter Strict-Transport-Security in the Name box and max-age=31536000 in the Value box.

You could also add these headers via the web.config, something like this:
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="Strict-Transport-Security" value="max-age=31536000"/>
    </customHeaders>
  </httpProtocol>
</system.webServer>

However, strict adherence to the protocol means that you should not present this custom header over non-secure transport, and unfortunately IIS does not support that type of conditional check. This means even if you force a 30x redirect to HTTPS for all HTTP traffic, that first 30x response over HTTP will contain the custom header.

IIS URL Rewrite

The URL Rewrite module (IIS 7 and above) enables IIS administrators to create powerful customized rules; this one adds the custom header for HTTPS traffic only.

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <outboundRules>
        <rule name="Add Custom Header for HTTPS" enabled="true">
          <match serverVariable="RESPONSE_Strict_Transport_Security" pattern=".*" />
          <conditions>
            <add input="{HTTPS}" pattern="on" ignoreCase="true" />
          </conditions>
          <action type="Rewrite" value="max-age=31536000" />
        </rule>
      </outboundRules>
    </rewrite>
  </system.webServer>
</configuration>

ASP.NET HttpModule

Alternatively, you could solve this by writing an HttpModule that runs within the ASP.NET application context.

public class HSTSModule : IHttpModule
{
    public void Dispose()
    {
    }

    public void Init(HttpApplication context)
    {
        context.PostRequestHandlerExecute += context_PostRequestHandlerExecute;
    }

    void context_PostRequestHandlerExecute(object sender, EventArgs e)
    {
        HttpContext context = ((HttpApplication)sender).Context;

        if (context.Request.IsSecureConnection)
        {
            context.Response.AppendHeader("Strict-Transport-Security", "max-age=31536000");
        }
    }
}

If you elect to use an HttpModule, you should be aware of which files are processed by ASP.NET; some static files (css, js, htm) are purposefully not sent through the ASP.NET pipeline. This can be configured within your Web.config, but you should be aware of the implications of doing so.
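For completeness, the classic (if heavy-handed) switch for routing every request, static files included, through managed modules is runAllManagedModulesForAllRequests. A sketch, where the module registration assumes the HSTSModule class above lives in the root namespace (adjust the type name to your own namespace):

```xml
<system.webServer>
  <modules runAllManagedModulesForAllRequests="true">
    <add name="HSTSModule" type="HSTSModule" />
  </modules>
</system.webServer>
```

Be aware this pushes css, js, and image requests through the full managed pipeline, which has a measurable performance cost.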

Open Source IIS Module

The simplest alternative is to download and deploy the open source HTTP Strict Transport Security IIS Module. If you are comfortable with C++ and writing IIS modules, you can find the code over at GitHub.




November 23, 2014 4:43  Comments [0]
Tagged in ASP.NET | IIS

Connect is the cloud-first, mobile-first, developer-first virtual event taking place today (and tomorrow), and the Microsoft team has been making some pretty amazing announcements that genuinely transform future opportunities for .NET developers.

Microsoft is open sourcing the .NET Framework libraries (MIT license). Projects like Mono, which have relied on contributors who had never looked at disassembled .NET code, can now freely incorporate the .NET Framework sources directly into the Mono project. The code is available here, just amazing!!!

Additionally, Microsoft has begun redesigning .NET as .NET Core, which produces simpler versions of the class libraries; the project is hosted on GitHub here. The .NET Framework team also spent a lot of time over the last year speeding up the JIT compiler and released RyuJIT; this JIT compiler will *also* be available under the same .NET Core release.

This bears repeating: the MIT License is as permissive as it gets, and this also comes with a patent promise! This is Microsoft really living the open source software ideal!

Other notable updates

What announcement are you most eager to check out?





November 12, 2014 16:58  Comments [0]
Tagged in .NET | ASP.NET | Visual Studio

Earlier this year I spent significant time with a vendor troubleshooting requests we were sending across the network. In the end I was convinced of my need to rely almost exclusively on the vendor to verify that the certificates were correctly applied. Well, a colleague shared this link that describes how to configure network tracing for WCF services.

Configure Network Tracing

<configuration>
  <system.diagnostics>
    <sources>
      <source name="System.Net" tracemode="includehex" maxdatasize="1024">
        <listeners>
          <add name="System.Net" />
        </listeners>
      </source>
      <source name="System.Net.Sockets">
        <listeners>
          <add name="System.Net" />
        </listeners>
      </source>
    </sources>
    <switches>
      <add name="System.Net" value="Verbose" />
      <add name="System.Net.Sockets" value="Verbose" />
    </switches>
    <sharedListeners>
      <add name="System.Net"
           type="System.Diagnostics.TextWriterTraceListener"
           initializeData="network.log"
      />
    </sharedListeners>
    <trace autoflush="true" />
  </system.diagnostics>
</configuration>
When troubleshooting complex systems never trust the well-intentioned opinion of even the savviest technical mind; get the facts. This would have saved me days!




July 14, 2014 22:03  Comments [0]
Tagged in .NET | ASP.NET | WCF

I was reading an article over at All Geek Things and realized I have designed quite a few sites that at some point have been victimized by this exact type of behavior:

I noticed that I was getting a bit more traffic to some posts on my site. When I checked my analytics the bounce rate was high and the time on site ranged from no time at all to under 30 seconds. On further checking I found that the images in those posts had been hot linked. This means people were viewing the image from another site without having to visit my site. There are actually websites out there with galleries of images hotlinked from other sites. Quite frankly this is sucky behaviour.

All Geek Things goes on to describe how you can use Google to investigate which folks are hotlinking your content and update WordPress appropriately. I am not sure that I have personally been a victim of this particular kind of malfeasance, but it is relatively easy to guard against it using your ASP.NET web.config file as follows:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Stop Hotlinking">
          <match url=".*\.(png|gif|jpg)$" />
          <conditions>
            <add input="{HTTP_REFERER}" pattern="^$" negate="true" />
            <add input="{HTTP_REFERER}" pattern="^http://(.*\.)?poppastring\.com/.*$" negate="true" />
          </conditions>
          <action type="Rewrite" url="/images/stop_hotlinking.jpg" />
        </rule>
      </rules>
      <rewriteMaps>
        <rewriteMap name="DomainsWhiteList" defaultValue="block">
          <add key="pinterest.com" value="allow" />
        </rewriteMap>
      </rewriteMaps>
    </rewrite>
  </system.webServer>
</configuration>

This blocks almost everybody and even goes a step further by silently redirecting the offenders to an image of my choosing (stop_hotlinking.jpg in this example). However, there are many use cases where hotlinking is desired (Pinterest is one example that comes to mind), and so being able to provide exceptions (white lists) is also helpful, and the rewriteMap section accomplishes that.
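Note that, as declared above, the rule never actually consults the DomainsWhiteList map. One way to wire it up (a sketch, assuming the URL Rewrite module's rewrite-map lookup and condition back-reference syntax) is to capture the referring host in one condition and look it up in the map from the next:

```xml
<conditions>
  <!-- skip requests with no referrer at all -->
  <add input="{HTTP_REFERER}" pattern="^$" negate="true" />
  <!-- capture the referring host into {C:1} -->
  <add input="{HTTP_REFERER}" pattern="^https?://(?:www\.)?([^/]+)" />
  <!-- block unless the map returns "allow" for that host -->
  <add input="{DomainsWhiteList:{C:1}}" pattern="allow" negate="true" />
</conditions>
```

With this approach your own domain (poppastring.com in my case) would also need an "allow" entry in the map, since the explicit self-referrer condition has been replaced by the map lookup.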

Stay safe!




June 18, 2014 15:08  Comments [0]
Tagged in ASP.NET

There was a metric ton of information coming out of Build this year, and I am still trying to parse all of it for my personal development. While the casual observer would have seen the Windows Phone and Azure announcements, it would be patently false to suggest that the ASP.NET platform has been left fallow this year. The following is a list of videos that directly or indirectly touch the world of .NET Web developers!

It will be a few more months, at least, before I get through all these videos. Happy learning!



April 16, 2014 21:50  Comments [0]
Tagged in .NET | ASP.NET | C# | JavaScript | Training

When an MVC application receives a user request there is a whole list of things that occur which, during the development of most remedial applications, you simply do not have to worry about. As developers we usually pick up the story at the point where we write Controllers and Actions.

The MvcHandler is really *the* thing responsible for initiating the ASP.NET pipeline for an MVC application; it receives a Controller instance from the MVC controller factory. These are the five steps to memorize:

  1. MvcHandler creates a Controller Factory.
  2. The Controller Factory creates the corresponding Controller object and MvcHandler calls its Execute method.
  3. ControllerActionInvoker examines the RequestContext and uses it to determine the action to call.
  4. ControllerActionInvoker also determines the values to be passed to the action as parameters.
  5. ControllerActionInvoker then runs the action.

If you forget everything else, please remember the above steps (eventually there will be a test). It does not get much simpler than that; if you want an in-depth reveal of MvcHandler (and trust me, you do) check out this post.
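The five steps can be sketched in rough pseudocode (this is an illustration of the flow, not the actual ASP.NET MVC source; `requestContext` and `controllerName` are assumed to come from the routing data):

```csharp
// Pseudocode sketch of what MvcHandler.ProcessRequest does
public class MvcHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // 1. MvcHandler obtains a controller factory
        IControllerFactory factory = ControllerBuilder.Current.GetControllerFactory();

        // 2. The factory builds the controller named in the route data,
        //    and MvcHandler calls its Execute method
        IController controller = factory.CreateController(requestContext, controllerName);
        controller.Execute(requestContext);

        // 3-5. Inside Execute, the ControllerActionInvoker examines the
        //      RequestContext, binds the action parameters, and runs the action

        factory.ReleaseController(controller);
    }
}
```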



October 7, 2013 22:37  Comments [0]
Tagged in ASP.NET

In spite of extensive exposure to ASP.NET MVC I still tend to solve problems by first using ASP.NET Web Forms. Shallow criticisms aside, Web Forms provides a powerful mechanism for quickly creating complex user interfaces. That said, I still feel guilty about not pushing the MVC pattern just a little further when the opportunity arises. A valid criticism of Web Forms development is its lack of support for test driven development; however, if you have started a Web Forms project and still want to integrate the MVC pattern, I would submit that ASP.NET Web API is one way to do that.

Web API

ASP.NET Web API is a framework that makes it easy to build RESTful (uses the HTTP protocol) applications on the .NET Framework. Creating these services provides the opportunity to move code from “code behind” files into an API controller. Accessing the controller is done via AJAX, which can allow for a more responsive UI. To start, install the Web API package as follows:

Install-Package WebApi.All -Version 0.6.0

This install will allow you to Add->New Item->Web API Controller Class, and this is what you will get by default:

public class EmployeeController : ApiController
{
    // GET api/<controller>
    public IEnumerable<string> Get()
    {
        return new string[] { "value1", "value2" };
    }

    // GET api/<controller>/5
    public string Get(int id)
    {
        return "value";
    }

    // POST api/<controller>
    public void Post([FromBody]string value)
    {
    }

    // PUT api/<controller>/5
    public void Put(int id, [FromBody]string value)
    {
    }

    // DELETE api/<controller>/5
    public void Delete(int id)
    {
    }
}

The new controller shows off some of the coding by convention inherent to Web APIs; for example, the “Post” method will be called whenever a page is posted back to the server. To create other POST methods you can decorate said method with the [HttpPost] attribute. I will leave you to do the heavy lifting of deciding what code should be transferred to the API Controller and how you validate.
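To illustrate that convention, here is a sketch of a controller with both a conventionally named and an attributed POST action (the `Approve` action and its body are hypothetical):

```csharp
public class EmployeeController : ApiController
{
    // Matched by naming convention: handles POST api/employee
    public void Post([FromBody]string value)
    {
    }

    // Not named "Post", so the attribute is required for it to accept POSTs
    [HttpPost]
    public void Approve(int id)
    {
        // hypothetical: mark the employee record as approved
    }
}
```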

Routing

After creating your controller the next step is to create a routing rule that lets the application know which URIs are configured and which controllers will respond to those requests. In the following example the routeTemplate defines a URL that will route to our API Controller given the following format: http://www.PoppaString.com/api/Employee/1.

using System;
using System.Web;
using System.Web.Routing;
using System.Web.Http;

namespace WebApplication1
{
    public class Global : HttpApplication
    {
        void Application_Start(object sender, EventArgs e)
        {
            RouteTable.Routes.MapHttpRoute(
                    name: "ActionApi",
                    routeTemplate: "api/{controller}/{id}",
                    defaults: new { Controller = "Employee" }
                );
        }

With these URIs exposed it becomes a trivial task to start plugging AJAX directly into the webpages and calling these endpoints directly. Hopefully this kind of after-the-fact change will assuage our collective Web Forms guilt.



May 14, 2013 1:28  Comments [0]
Tagged in ASP.NET

This week I was troubleshooting a project that served static JavaScript files from a WCF service. However, I found that when running tests locally my debugging tools indicated that the endpoint was not accessible, which would result in this error:

HTTP Error 404.17 - Not Found
The requested content appears to be script and will not be served by the static file handler.

My first check was to ensure that the IIS mapping was set up correctly, and for my 64-bit machine there were, as I expected, three entries that appropriately matched the 'svc' extension with WCF.

 

[image: the three IIS handler mapping entries for the .svc extension]

 

In my particular case I had some doubts about the order IIS, WCF and .NET were installed on my PC and so I opted to re-register the script maps with IIS using the following command (open a command window as admin):

"%WINDIR%\Microsoft.Net\Framework\v3.0\Windows Communication Foundation\ServiceModelReg.exe" -r

…and success! The above solution was for .NET v3.0; for 4.x the following would be more appropriate:

"%WINDIR%\Microsoft.Net\Framework\v4.0.30319\aspnet_regiis" –i –enable
"%WINDIR%\Microsoft.Net\Framework\v4.0.30319\ServiceModelReg.exe" –r



August 25, 2012 17:26  Comments [0]
Tagged in ASP.NET | JavaScript | WCF

I was trying to make a couple of small changes to a WCF service recently and I hit a brick wall while trying to update my references to that service in a WP7 app. I noticed that instead of using the local IP address it was filling in the machine name and then picking a random port for the WSDL declaration.

The screen shot to the left was what I would see. The WCF service was available at http://127.0.0.1:81/Services/MyService.svc; however, the WSDL for the service was forced to http://[MACHINENAME]:82/Services/MyService.svc?wsdl. Now, I added a host entry for my machine name at 127.0.0.1, but the WSDL is actually at port 82 (and would randomly select other ports). This is not a valid URL (on my machine) even with the service running and would fail all the time. I tried pointing directly to the correct and valid IP/port, but all the references inside were pointing to the wrong location.


The solution: define a behavior and attach it to your service; you can use the behavior to lock in the port number as follows:

<behaviors>
  <serviceBehaviors>
    <behavior>
      <useRequestHeadersForMetadataAddress>
        <defaultPorts>
          <add scheme="http" port="81" />
        </defaultPorts>
      </useRequestHeadersForMetadataAddress>
    </behavior>
  </serviceBehaviors>
</behaviors>


February 20, 2012 0:18  Comments [1]
Tagged in .NET | ASP.NET | WCF

I have been a huge fan of web services over the years, mainly due to their strong coupling with ASP.NET and the familiar HttpContext (request and response) when I needed access to information being sent to the server in the typical HTTP headers. During my recent work with Visual Studio 2010 (.NET 4.0) I immediately started using Windows Communication Foundation (WCF) projects as the primary means of developing service oriented APIs for my Windows Phone 7 apps.

One of the first things I realized I missed was the obvious connection to the HttpContext information (IP address, port, etc.). There is, however, a means to get at similar information within the scope of a WCF project as follows:

OperationContext op = OperationContext.Current;

MessageProperties mpa = op.IncomingMessageProperties;
var epp = mpa[RemoteEndpointMessageProperty.Name] as RemoteEndpointMessageProperty;

string address = epp.Address;
int port = epp.Port;

These properties were made available to us in .NET 3.5. I have also had a couple of chances to integrate HttpRequestMessageProperty, which gives us direct access to things like the HTTP method (GET, POST, etc.), headers and query string.
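Continuing the snippet above, here is a sketch of pulling HttpRequestMessageProperty out of the same incoming message properties (the property and member names are the standard WCF ones; null checks are elided to match the style of the original snippet):

```csharp
var hrmp = mpa[HttpRequestMessageProperty.Name] as HttpRequestMessageProperty;

string method = hrmp.Method;              // e.g. "GET" or "POST"
string query = hrmp.QueryString;          // the raw query string
string userAgent = hrmp.Headers["User-Agent"];
```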



March 20, 2011 19:38  Comments [0]
Tagged in .NET | ASP.NET | C# | WCF

As always, my posts are usually based on experiencing some travesty of code that required me either to change it or endure it. In this case I was looking at a web page whose only purpose was to return data … for the more seasoned among us the preceding sentence should scream murder. The truth is a web page has a metric ton of overhead, and simply using one as a conduit for the delivery of raw, unformatted, non-HTML information (JPEG, text, XML, etc.) is a pure waste of resources. The following is an example of what not to do when you are trying to return data:

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Text;

public partial class WebPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Response.ContentType = "text/xml";
        Response.ContentEncoding = Encoding.UTF8;

        string xml = GetXMLString(); //Not interested in the details

        Response.Write(xml);

    }

}

The next example takes advantage of the generic handler, which has all the flexibility of a web page but none of the overhead of the web page life cycle. While I am returning text/xml this could be any of your defined MIME types.

<%@ WebHandler Language="C#" Class="SomeHandler" %>

using System;
using System.Web;
using System.Text;

public class SomeHandler : IHttpHandler {
    
    public void ProcessRequest (HttpContext context) {
        context.Response.ContentType = "text/xml";
        context.Response.ContentEncoding = Encoding.UTF8;

        string xml = GetXMLString(); //Not interested in the details
        context.Response.Write(xml);
    }
 
    public bool IsReusable {
        get {
            return false;
        }
    }

}

In my humble opinion the above concept should be known to all senior ASP.NET developers; in fact this is one of the first interview questions I ask.



February 9, 2010 17:16  Comments [0]
Tagged in ASP.NET | C#

I was having a problem with validating two fields in a web page the other day. The two fields were not required but it was necessary that if either field was selected then the other field would also be required.

 

Most of the examples I have come across simply require both fields, or require both fields based on another control. Logically speaking, the requirement we have is an XNOR (where Field1 = 'A' and Field2 = 'B'). Only when exactly one text box is filled out should the page flag an error.

[image: validation error shown when only one of the two fields is completed]

 

In order to achieve this I used a CountTrueConditionsValidator from Peter Blum's Validation and More, as follows:

<vam:CountTrueConditionsValidator id="NeedBothFieldsOrNeitherField" runat="server" 
    ErrorMessageLookupID="You need both fields" 
    Minimum="1" Maximum="1" NotCondition="True" EventsThatValidate="OnSubmit"> 
    <Conditions> 
        <vam:RequiredTextCondition ControlIDToEvaluate="Field1" /> 
        <vam:RequiredTextCondition ControlIDToEvaluate="Field2" /> 
    </Conditions> 
</vam:CountTrueConditionsValidator>


In this example the validator counts the number of true conditions from the Conditions element, with Minimum and Maximum both set to 1. This means that exactly one field being filled in constitutes a true scenario (XOR). To make this fulfill our scenario (XNOR) the NotCondition is set to true.

Peter Blum Controls are really flexible and immersive, I will not develop any meaningful website again without them!

 



December 3, 2009 15:26  Comments [0]
Tagged in ASP.NET

At this point I am a JavaScript newbie; I have assiduously avoided a full-fledged dive into the language until recently. That is difficult to believe, as at least 50% of my job involves ASP.NET. However, the introduction of some really cool libraries has forced me into a desperate game of catch up.

One of my recent tasks was to try calling a web service from JavaScript using Visual Studio 2005, and the obvious solution involves the use of XMLHttpRequest. The following sample covers an HTTP POST while passing a couple of simple parameters.

var oReq = getXMLHttpRequest();

if (oReq != null) {
    oReq.open("POST", "http://localhost/mydemo.asmx", true);
    oReq.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    oReq.onreadystatechange = handler; //callback function defined below
    oReq.send("param1=22&name=Michael");
}
else {
    window.alert("AJAX (XMLHTTP) not supported.");
}

function getXMLHttpRequest() 
{
    if (window.XMLHttpRequest) {
        return new window.XMLHttpRequest;
    }
    else {
        try {
            return new ActiveXObject("MSXML2.XMLHTTP.3.0");
        }
        catch(ex) {
            return null;
        }
    }
} 

//called when the service returns
function handler()
{
    if (oReq.readyState == 4 /* complete */) {
        if (oReq.status == 200) {
            alert(oReq.responseText);
        }
    }
}

This works great, but I wanted to look at sending more complex data structures (which for JavaScript generally means arrays) in a really intuitive way. This sounded simple, but I started to run into weird things when trying to send anything but regular vars. I did happen upon a really intuitive JavaScript library created by Mateo Casati, called SoapClient. It provides great support for all types of arrays and even classes; the only real limitation in what you can send is JavaScript itself.

<script type="text/javascript" src="soapclient.js"></script>
<script type="text/javascript">
    var url = "http://localhost/mydemo.asmx";

    function handler(r) {
        alert(r);
    }

    function MyNewSample() {
        var list = "This is a test";
        var pl = new SOAPClientParameters();
        pl.add("list", list);
        SOAPClient.invoke(url, "MyNewSample", pl, true, handler);
    }
</script>

This option became almost immediately obsolete once I was given permission to use Visual Studio 2008 for the project. In VS2008 I am able to take advantage of the ScriptManager (as well as breakpoints in JavaScript). This conferred upon me all the IntelliSense support for my web service that I would need for rapid development. It simply requires that you place the following markup inside a form tag.

<asp:ScriptManager runat="server" ID="scriptManager">
    <Services>
        <asp:ServiceReference path="mydemo.asmx" />
    </Services>
</asp:ScriptManager>

[image: IntelliSense for the web service proxy in Visual Studio]

 



October 22, 2009 3:34  Comments [0]
Tagged in ASP.NET | JavaScript | WCF

I was on a website that I use at least once a year, and through a series of steps that I am unable to repeat I got the following message to appear.

[image: the error message, with identifying details removed]

I have talked about this before: we all need to be as defensive as possible when it comes to error messages. In this case I am not sure I could use this information to do harm (that is not my motive anyway), but it strikes me as odd that the developers decided to let this kind of error bubble to the top. I now know the server name, database name, table name…

I did not include the name of this site to protect the innocent; they have also made it incredibly difficult to contact them and tell them about the problem. Either way, I removed enough information from the above message so that no one else can track down the site or the error.



August 3, 2009 23:55  Comments [0]
Tagged in ASP.NET

I have been doing a fair share of security related audits and programming over the last few years, and the following is a list of my favorite faux pas.

I always feel that giving specific details of errors encountered on your site is a sure fire way to attract trouble. So my first defensive tip is to always use custom error pages.

<customErrors mode="On" defaultRedirect="YourErrorPage.htm" />

Secondly, always ensure that you are capturing application level errors in your application; there are many errors that do not show up within any error handling that you place at the web form level.

void Application_Error(object sender, EventArgs e)
{
   //get reference to the source of the exception chain
   Exception ex = Server.GetLastError().GetBaseException();

   //log the details of the exception!
   EventLog.WriteEntry("PoppaString",
     "MESSAGE: " + ex.Message + 
     "\nSOURCE: " + ex.Source +
     "\nFORM: " + Request.Form.ToString() + 
     "\nQUERYSTRING: " + Request.QueryString.ToString() +
     "\nTARGETSITE: " + ex.TargetSite +
     "\nSTACKTRACE: " + ex.StackTrace, 
     EventLogEntryType.Error);
}

The threat of cross-site scripting is a real one and can be performed in a variety of ways. While most developers tend to validate text input, I have also seen omissions in the validation of cookies and URLs; these inputs are just as open to attack and should be validated before use.

HttpUtility.HtmlEncode(Request.Form["name"]);
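In the same spirit, here is a sketch of treating cookie and query string input with the same suspicion (the cookie and key names are hypothetical):

```csharp
// encode cookie input before echoing it anywhere
HttpCookie cookie = Request.Cookies["prefs"];
if (cookie != null)
{
    string safeCookieValue = HttpUtility.HtmlEncode(cookie.Value);
}

// encode query string input as well
string safeId = HttpUtility.HtmlEncode(Request.QueryString["id"]);
```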

note: This is by no means an exhaustive list and is really only meant to represent a few low hanging fruit in coding securely for ASP.NET.




July 15, 2008 1:46  Comments [0]
Tagged in ASP.NET | Security

Don and I have been going back and forth on why MVC is so important to Web developers, and I must admit I was missing the reasoning. I had been reviewing the demos by various alpha geeks, and I actually got bored with the whole thing, but that was fueled by my lack of understanding. I then came across a great blog post by Rick Strahl, who starts the MVC discussion by building up and subsequently dismantling Web Forms programming based on its weaknesses and strengths. He then continues by showing how MVC helps solve the architectural issue.

The problem, in short, with ASP.NET is that it was built with a marked attempt to pull in Windows application developers. As a result, mythical creatures sprang from Pandora's box in the form of ViewState, PostBack (the event driven model) and the Visual Web Designer. Wonderful as they may be, they inherently promoted the bloat of the entire ASP.NET paradigm. All these features did allow us to move swiftly into the web development world without really knowing HTML, while simultaneously adding complexity to the entire Page execution cycle.

The other problem with ASP.NET is the lack of separation of concerns between business logic and the UI. While most of the work I have done has not included code in the ASPX file, it would be accurate to say that a large portion of the business logic sits in the code behind file, and this can lead to code that is exceptionally difficult to maintain.

It is these problems that MVC is designed to address. So when you watch the next demo, just before your eyes begin to glaze over with the question of why, remember that ASP.NET does have some serious problems. To quote the Don: "How is nice. Why is priceless!"




March 11, 2008 11:34  Comments [0]
Tagged in ASP.NET

In my line of work I am often given the solution before the problem; that is to say, well-meaning engineers often pass on suggestions for problems they have found. Today's example included avoiding some errors we were seeing by setting the ValidateRequest flag to false in the Web.config file.

For example, when the ValidateRequest flag is set to false you are able to send scripts to the server as follows.

[image: a script submitted successfully with ValidateRequest set to false]

The ValidateRequest flag is designed to mitigate the problem of cross-site scripting (XSS) and produce a much more defensive response to script injections, as follows.

[image: the defensive error response produced when ValidateRequest is enabled]

Now, to be safe, all headers, cookies, query strings, form fields and hidden fields should be verified for invalid characters and character sequences by the developer regardless of what this flag is set to. Also, if this flag needs to be modified it should be done on a page-by-page basis and with extreme caution.
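If the flag truly must be disabled, scoping the change to a single page keeps the rest of the site protected; ValidateRequest is a standard @ Page directive attribute (the file and class names here are hypothetical):

```aspx
<%@ Page Language="C#" ValidateRequest="false" CodeFile="Feedback.aspx.cs" Inherits="Feedback" %>
```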



February 8, 2008 2:56  Comments [0]
Tagged in ASP.NET

Up until now I have turned my metaphoric back on AJAX for two reasons. Firstly, we use Peter Blum controls at work which cover, more than adequately, 90% of the use cases we encounter with our clients, so JavaScript validation continues to be something we do not concentrate on or worry about. Secondly, I thought that I would have to get more familiar with JavaScript, which was absolutely incorrect. I will not rant here about why I hate JavaScript development, except to say that if I had an IDE I would be more receptive.

I had been summarily ignoring a couple of training opportunities provided at work until recently, and I have to say that I am completely on board with AJAX; in fact I am now wondering why we are using Peter Blum controls at all? But I digress...

I was given this list of wonderful "How do I?" videos that step you through how to get started, from the point of downloading and installation all the way to making your own AJAX control kit. 

These videos are a must see!



September 21, 2007 0:18  Comments [0]
Tagged in ASP.NET

Due to the simplicity of the XCOPY deployment strategy in ASP.NET you can easily set yourself up for dumb mistakes. It took me a good 15 minutes (a little embarrassing) to realize what my deployment issue was below.

Configuration Error

Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately.
Parser Error Message: Unrecognized attribute 'xmlns'.
Source Error:

Line 1:  <?xml version="1.0"?>
Line 2:  <configuration xmlns="http://schemas.microsoft.com/.NetConfiguration/v2.0">
Line 3:  	<system.web>
Line 4:  		<compilation debug="true">

Source File: C:\Inetpub\wwwroot\TTCOG\web.config    Line: 2


Version Information: Microsoft .NET Framework Version:1.1.4322.2032; ASP.NET Version:1.1.4322.2032

It screamed at me in the red line (and the version information) but I could not quite see it ... then I realized that when you manually create a virtual directory for your project it defaults to version 1.1. This was my first official 2.0 deploy, so I can be excused for this oversight.

[image: the ASP.NET version setting for the virtual directory in IIS]

"Nothing ever comes to one, that is worth having, except as a result of hard work." - Booker T. Washington



July 12, 2007 23:44  Comments [0]
Tagged in ASP.NET

As I read more about Silverlight from Mix 07 I must admit that I am totally confused as to what this will now mean for AJAX. I thought AJAX was supposed to be the rich GUI we have been waiting for with bated breath. Am I missing something or does Silverlight spell the end for AJAX?

I have just finished downloading the ASP.NET Futures, which seems to plug the gap between AJAX and Silverlight, but I am wondering why we need AJAX in the middle at all. I guess I will have to wait and see what the fallout from Mix 07 truly reveals!



May 1, 2007 22:49  Comments [0]
Tagged in ASP.NET | Visual Studio

I have been thinking about the detailed plumbing of calling an ASP.NET web page lately due to some weird things that have happened at work. So I thought it wise to refresh myself, and my two faithful readers, on what actually goes into displaying a page.

I briefly described the HTTP traverse from browser to server before, and will not delve in at that level. However, when a page request is sent to the Web server it cycles through a series of events during its creation and disposal. Being able to understand these events (and their order) is critical for any potential ASP.NET developer. Everyone knows we start with an aspx page and end up with a beautifully rendered HTML page; however, we need to know what happens in between.

1. Object Initialization
Controls on a given page are initialized by declaring the objects in the constructor of the C# code-behind file. If objects are created from within the aspx file they have no attributes or properties available in the code-behind, and there is no reliable way to verify the order in which the controls will be created, or if they will be created at all. The initialization event can be overridden using the OnInit method.

2. Load Viewstate Data
After the Init event, controls can be referenced using their IDs only. During LoadViewState event, the initialized controls receive their first properties from the viewstate information (handled by ASP.NET) that was persisted back to the server on the last submission. The event is overridden using the LoadViewState method and is used to modify the data received by the control.

3. LoadPostData, Processes Postback Data
When a page submits a form, the framework checks each control that implements the IPostBackDataHandler interface for updated data. The page then triggers the LoadPostData event, goes through the page to find each control that implements the interface, and updates the control's state with the correct postback data. ASP.NET checks each control by matching the control's unique ID with the stored name/value pair.

4. Object Load
All objects are arranged in the Control Tree (formerly known as the DOM) and can be referenced easily in code. Objects are now at liberty to apply the client-side properties set in the HTML, such as height, visibility, etc. This is generally considered the hardest working event in the process. This event can be overridden by calling OnLoad.

5. Raise PostBack Change Events
This event occurs immediately after all controls that implement the IPostBackDataHandler interface have been updated with the current postback data. This operation flags each control with true/false based on whether it was changed since the last post. ASP.NET looks for this flag and raises the RaisePostDataChanged event.

6. Process Client-Side PostBack Event
The object which initiated the postback is handled by the RaisePostBackEvent event. The object is usually a control that posted the page back to the server (autopostback) or a submit from a button. The RaisePostBackEvent is last in the series of postbacks.

7. Prerender the Objects
This event is a critical one, as it marks the last chance the developer has to make any persistable changes to the objects. Immediately after the PreRender event changes to objects are locked and can no longer be saved to the viewstate. This event can be overridden using OnPreRender.

8. ViewState Saved
The viewstate is saved after all changes to the page have finalized. At the SaveViewState event, values can be saved to the ViewState object, but changes to page controls are not persisted.

9. Render To HTML
During the Render event, the page coerces each object into rendering itself into HTML. The page collects the HTML for transport to the client browser. When the Render event is overridden, the developer can write their own HTML to the browser that will actually override the HTML gathered by the page. The Render method uses the HtmlTextWriter to create HTML that will be streamed to the client browser. Changes can still technically be made here, but they will only show up at client browser.

10. Disposal
The Dispose event is the opportunity to destroy any objects or references you have created during the creation of the page.
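Several of the steps above call out override points (OnInit, OnLoad, OnPreRender); here is a minimal code-behind sketch of hooking into them (the page class name is hypothetical):

```csharp
public partial class MyPage : System.Web.UI.Page
{
    protected override void OnInit(EventArgs e)
    {
        // step 1: controls are being initialized
        base.OnInit(e);
    }

    protected override void OnLoad(EventArgs e)
    {
        // step 4: control tree is built and postback data has been applied
        base.OnLoad(e);
    }

    protected override void OnPreRender(EventArgs e)
    {
        // step 7: last chance to make changes that persist to viewstate
        base.OnPreRender(e);
    }
}
```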

Phew, that is a lot of steps ... Monorail anyone? I think at some point I should also go over HttpModules and HttpHandlers. They provide really slick ways of jumping into the middle of the page cycle without necessarily touching every page! During the MCP test I noticed they really flogged the server and user control horse to death. It is a wonder to me that HttpModules/Handlers were not covered with equal passion.

"Nothing contributes so much to tranquilizing the mind as a steady purpose - a point on which the soul may fix its intellectual eye." - Mary Shelley



March 16, 2007 21:57  Comments [0]
Tagged in ASP.NET

Service oriented development is dominating my current programming landscape. Service Oriented Architecture (SOA) can be defined as loosely coupled software services that support the requirements of a business process. SOA is also characterized by being technology agnostic; that is, the underlying service can be implemented in a variety of ways (RPC, DCOM, CORBA or Web Services) without the consumer worrying about the source.

Currently most of the projects I am involved with consume web services where the underlying technology is completely unknown to me. As with all banking systems, security is always key, so I wanted to review some of the options available to someone using ASP.NET.

Windows - Basic: Used for non-secure identification of clients, as the user name and password are sent in base 64-encoded strings in plain text. Passwords and user names are encoded, but not encrypted, in this type of authentication. A determined, malicious user equipped with a network-monitoring tool can intercept user names and passwords, so this type of authentication is generally limited to secure networks.

Windows - Basic over SSL: Used with secure identification of clients in Internet scenarios. The user name and password are sent over the network using Secure Sockets Layer (SSL) encryption, rather than plain text. This is relatively easy to configure and works for Internet scenarios. However, using SSL degrades performance.

Windows - Digest: Used for secure identification of clients in Internet scenarios and uses hashing to transmit client credentials in an encrypted manner so the password is not transmitted in clear text. In addition, Digest authentication can work through proxy servers. However, it is not widely supported on other platforms.

Windows - Integrated Windows: Uses NTLM or Kerberos in a cryptographic exchange with the user's Microsoft Internet Explorer Web browser, so the password itself is never sent over the network.

Windows - Client Certificates: Used for secure identification of clients in Internet and intranet scenarios. Requires each client to obtain a certificate from a mutually trusted certificate authority. Certificates are optionally mapped to user accounts, which are used by IIS for authorizing access to the XML Web service.

SOAP headers – Custom: Useful for both secure and non-secure Internet scenarios. User credentials are passed within the SOAP header of the SOAP message. The Web server, regardless of the platform hosting the XML Web service, provides a custom authentication implementation.
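As a hedged sketch of the ASMX flavor of this (the type and member names are my own invention), the service declares a header class and demands it on each method:

```csharp
// Custom SOAP-header authentication for an ASMX web service.
public class AuthHeader : SoapHeader
{
    public string Username;
    public string Password; // in practice a token or hash, never a raw password
}

public class AccountService : System.Web.Services.WebService
{
    public AuthHeader Credentials; // populated from the incoming SOAP header

    [WebMethod]
    [SoapHeader("Credentials", Required = true)]
    public string GetBalance()
    {
        if (Credentials == null || !IsValid(Credentials))
            throw new SoapException("Authentication failed",
                SoapException.ClientFaultCode);
        return "...";
    }

    private bool IsValid(AuthHeader header)
    {
        // Validate against your own user store
        return false;
    }
}
```

The client proxy sets the header once and every call carries the credentials, regardless of the platform on either end.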

"The mystery of government is not how Washington works but how to make it stop." - PJ O'Rourke



January 22, 2007 7:49  Comments [0]
Tagged in ASP.NET | Security | WCF

I have been working on improving some images in ASP.NET and have compiled this code from various MSDN sources. Generally I have found the issues revolve around the concept of anti-aliasing.

private void Button1_Click(object sender, System.EventArgs e)
{
    Bitmap bmp = null;
    Graphics g = null;

    try
    {
        bmp = new Bitmap(@"c:\inetpub\wwwroot\MyWebTest\MYimage.jpg");
        g = Graphics.FromImage(bmp);
        g.CompositingMode = CompositingMode.SourceCopy;
        g.SmoothingMode = SmoothingMode.HighQuality;  //Specifies high quality, low speed rendering
        g.InterpolationMode = InterpolationMode.HighQualityBicubic; //This mode produces the highest quality transformed images.

        Response.ContentType = "image/jpeg";

        //Create a parameter collection with one entry
        EncoderParameters codecParameters = new EncoderParameters(1);
        //Fill the only parameter: JPEG quality of 100
        codecParameters.Param[0] = new EncoderParameter(Encoder.Quality, 100L);
        //Get the codec info for the JPEG encoder
        ImageCodecInfo codecInfo = FindEncoder(ImageFormat.Jpeg);
        //Save the image directly to the response stream
        bmp.Save(Response.OutputStream, codecInfo, codecParameters);
    }
    catch (Exception ex)
    {
        Response.Write(ex.Message);
    }
    finally
    {
        if (g != null)
        {
            g.Dispose();
        }
        if (bmp != null)
        {
            bmp.Dispose();
        }
    }
}

private static ImageCodecInfo FindEncoder(ImageFormat fmt)
{
    //Walk the installed encoders looking for one matching the requested format
    foreach (ImageCodecInfo info in ImageCodecInfo.GetImageEncoders())
    {
        if (info.FormatID.Equals(fmt.Guid))
        {
            return info;
        }
    }
    return null;
}

 



November 15, 2006 6:22  Comments [0]
Tagged in ASP.NET