Tuesday, October 2, 2012

TallComponents discontinues PDFWebViewer.NET

Source code now available on CodePlex (here and here)

On the company blog, CEO Frank Rem states that the product is not part of their core technology and that revenue is insufficient to justify the support load that comes with a browser-based component.

As the original developer of the product I'm happy with TallComponents' decision to open source the product rather than keep it locked down and let it die.

Looking at the source code brings back some good memories of building a nice product. I'm also curious to see what has become of it in the past two years.

Looking back

Technology-wise it hasn't changed much. The server side was built on TallComponents' excellent flagship products PdfKit.NET and PDFRasterizer.NET.

At the time PDFWebViewer.NET was created, ASP.NET MVC was an emerging technology and Microsoft hadn't embraced open source yet. So the client-side implementation was based on the now pretty much obsolete MS Ajax framework.

Though version 2.0 looks like it's been modernized it hasn't really evolved. I'm sure this also accounts for much of the support load that TallComponents must have been experiencing for this product. Especially with HTML5 and mobile devices on the rise, as mentioned on the blog, a full rewrite would probably be in order.

The essence of the product was to display PDF using only common HTML constructs; a couple of years back that meant divs and images driven by plain vanilla JavaScript.

Looking ahead

Nowadays it would probably be safe to implement the control using a canvas, which would make it much easier to implement a lot of the functionality. For example, rotating a page required a round trip to the server in the original implementation but can be done in the browser when using a canvas.
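
Sketched in JavaScript, client-side rotation could look roughly like this (a hypothetical sketch assuming a browser canvas and an already loaded page image; none of these names come from the product):

```javascript
// Sketch only: rotate a rendered PDF page on an HTML5 canvas, client-side.
// rotatedSize is pure logic; drawRotated assumes a browser environment.
function rotatedSize(width, height, degrees) {
  // Quarter turns swap the canvas dimensions.
  return degrees % 180 === 0 ? { width: width, height: height }
                             : { width: height, height: width };
}

function drawRotated(canvas, pageImage, degrees) {
  const size = rotatedSize(pageImage.width, pageImage.height, degrees);
  canvas.width = size.width;
  canvas.height = size.height;
  const ctx = canvas.getContext("2d");
  // Rotate the drawing context around the canvas center, then draw the
  // page image centered on that point.
  ctx.translate(canvas.width / 2, canvas.height / 2);
  ctx.rotate(degrees * Math.PI / 180);
  ctx.drawImage(pageImage, -pageImage.width / 2, -pageImage.height / 2);
}
```

The pure helper computes the new canvas dimensions; the draw function does the actual rotation without ever touching the server.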

That would also eliminate a lot of common issues with alignment, scrollbars and the like.

In addition to that, commonly used javascript frameworks (like jQuery) would probably also go a long way in solving cross-browser scripting issues. It should also help reduce maintenance overhead by reducing the amount of code involved in handling events and manipulating the DOM.
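
For a sense of what that saves, here is roughly the kind of cross-browser event plumbing that had to be hand-written before such frameworks were common (a simplified, illustrative sketch, not code from the product):

```javascript
// Cross-browser event wiring circa 2008: standards-compliant browsers first,
// then old IE's attachEvent, then a last-resort property assignment.
function addEvent(element, name, handler) {
  if (element.addEventListener) {
    element.addEventListener(name, handler, false);
  } else if (element.attachEvent) {
    element.attachEvent("on" + name, handler); // IE 8 and earlier
  } else {
    element["on" + name] = handler; // fallback: onclick-style property
  }
}
```

With jQuery the same wiring collapses to a single cross-browser call like $(element).on("click", handler).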

Life expectancy

An open source project that relies on commercial components (however good they are) is probably doomed. I’ve been supporting and using software components long enough to know that any component (or open source project for that matter) gets developer attention as long as the project that is using the component is in development. After that some occasional maintenance may trigger a bug report but there’s not going to be any developer love.

Since the product was not sold in large volumes there are probably not a lot of developers actively working with PDFWebViewer.NET. Therefore the most likely scenario is that there will be little, if any, activity on the CodePlex projects.

The life expectancy could be improved if somebody decides to replace the proprietary core components with freely available or open-source counterparts.

Final words

As with any software project I’ve worked on I do hope people will continue to use PDFWebViewer.NET. I’d love to see people forking and contributing but, as Frank has hinted in the blog post, the product wasn’t a big seller so I’m not expecting much.

Having said that – feel free to contact me if you need help with these projects.

Friday, May 25, 2012

Fixing Sitefinity 3.7 URL handling–Part 2

A while back I wrote about URL handling in Sitefinity 3.7. The default internal URL handling does not work well with the IIS 7 URL Rewrite module. Over the past weeks I’ve had more problems making Sitefinity 3.7 behave correctly; this time it’s about SEO-friendly 404 handling.

How ASP.NET handles errors

Sitefinity is based on ASP.NET and by default any ASP.NET application will handle errors by either showing a generated error page (the dreaded Yellow Screen Of Death – YSOD) or by redirecting to a predefined error page. This is all configured in web.config:

<customErrors defaultRedirect="Error.aspx" mode="On">
   <error statusCode="404" redirect="404.aspx"/>
</customErrors>

This bit of XML instructs ASP.NET to redirect users to Error.aspx when an (unhandled) error occurs, unless it’s a 404 error in which case the user should be redirected to 404.aspx.

SEO friendly 404 handling

Redirecting is an acceptable way to explain to your visitors what’s going on. Your users get a decent explanation and can continue on their way. If you’re using a CMS like Sitefinity you can manage the 404 page within the CMS and even drop in a smart control that offers relevant suggestions based on the requested URL.

Search engines indexing the site will however have difficulty understanding what is going on. A conversation between a crawler like the Google bot and your site would look like this:

Crawler: GET /some-page-that-does-not-exist
Site:    302 Found, redirect to /404.aspx
Crawler: GET /404.aspx
Site:    200 OK, here’s the friendly 404 page

The conversation ends with a successful HTTP 200 status code, indicating to the search engine that the page was found… The crawler will even index the 404 page unless it’s explicitly told not to via meta tags or robots.txt.

In order for search engines to understand what’s going on, the conversation should look like this:

Crawler: GET /some-page-that-does-not-exist
Site:    404 Not Found, with the friendly 404 page as the response body
Fortunately the customized Sitefinity CMS module from my previous post can provide us with the hooks needed to set this up.

Step 1 – intercept errors

Since the CMS module is an HttpModule, it can register for the ASP.NET Error event. When that event fires we can check the type of error and look up the location of the alternate content from the customErrors section in web.config.

var context = HttpContext.Current;

var error = context.Server.GetLastError() as HttpException;
if ( null != error && error.GetHttpCode() == 404 )
{
  // use the web.config custom errors information to
  // decide whether to redirect
  var config = ( CustomErrorsSection )WebConfigurationManager
                  .GetSection( "system.web/customErrors" );
  if ( config.Mode == CustomErrorsMode.On ||
       ( config.Mode == CustomErrorsMode.RemoteOnly
                        && !context.Request.IsLocal ) )
  {
    // redirect to the error page defined in web.config
    var redirectUrl = config.DefaultRedirect;
    if ( config.Errors["404"] != null )
      redirectUrl = config.Errors["404"].Redirect;
    // now render the content
  }
}

Step 2 – render alternate content

This is where things get interesting. In IIS7 with Integrated Pipeline mode there’s a Server.TransferRequest method that makes it easy to do an internal redirect. It’ll do a full run of the request pipeline. TransferRequest will simulate an actual request and you can specify any parameters you want to pass which will be available in the request through the HttpContext.Params collection.

If not using Integrated Pipeline mode, Server.Transfer can do an internal redirect. The transferred request will however not go through the full ASP.NET pipeline and vital events will not fire; most notably, some events used by Sitefinity to resolve the page that needs to be rendered. The code below works around that by setting up the request the same way Sitefinity would before handing it off to the main entry point.

Both methods will however discard the HTTP status code from the original request. To work around that the status code is reset in the transferred request.

if ( HttpRuntime.UsingIntegratedPipeline )
{
  context.Server.TransferRequest(
                      redirectUrl, true, "GET",
                      new NameValueCollection { { "__sf__error", "404" } } );
}
else
{
  var context404 =
    CmsSiteMap.Provider.FindSiteMapNode( redirectUrl )
      as CmsSiteMapNode;
  if ( null != context404 )
  {
    context.Response.StatusCode = 404;
    CmsUrlContext.Current = context404;
    context.Items["cmspageid"] = context404.PageID;
    context.Server.Transfer( "~/sitefinity/cmsentrypoint.aspx" );
  }
}

In Integrated Pipeline mode you have no control over the executing request. So to reset the HTTP status code, it’s passed to the transferred request using a custom header. In the PostRequestHandlerExecute event handler in the CMS module the header is picked up and used to alter the status code:

private void PostRequestHandlerExecute( object sender, EventArgs e )
{
   var context = HttpContext.Current;
   // Set the error code passed in the headers when TransferRequest was invoked.
   var error = context.Request.Headers["__sf__error"];
   if ( null != error && context.Response.StatusCode == 200 )
   {
      int errorCode;
      if ( Int32.TryParse( error, out errorCode ) )
      {
         context.Response.StatusCode = errorCode;
         context.Response.TrySkipIisCustomErrors = true;
      }
   }
}

Full code and installation instructions available on GitHub.

This code has been tested with Sitefinity 3.7 SP4 and is in use on production systems.

Wednesday, May 9, 2012

Setup Powershell for .NET development

Here’s a quick recipe to set up a PowerShell command prompt for development with Visual Studio and .NET in general.

Being a complete PowerShell noob until quite recently, I was not sure where to start. All the required info is out there but fragmented, so I figured I’d put it here for other developers who run into the same problem.

What I want

I want a PowerShell box to do everything the Visual Studio Command Prompt can do and then some.

  • Run builds with MSBuild
  • Run Visual Studio tools and Windows SDK tools
  • Work with Git from the command line

How I got it working

PowerShell is easily extended using modules and fortunately there are a lot of those available already. Being new to PowerShell I was not exactly sure how to install modules. Thankfully, there’s a module for that too.

Easy module installation with PsGet
PsGet is a handy module installer that supports a directory of modules. Installing new modules is a no-brainer with a single command. Installing PsGet itself (one time only) is a bit more involved:

(new-object Net.WebClient).DownloadString("http://psget.net/GetPsGet.ps1") | iex

The above command is copied verbatim from the PsGet home page.

Support for Visual Studio and MSBuild
Visual Studio comes with a batch file that configures the environment (path and variables) for .NET development. It’s the same batch file used by the Visual Studio Command Prompt. We can use that to configure PowerShell too. However, in PowerShell any changes to the environment by a script or batch file are not transferred to the calling environment. There’s a fix for that in the PowerShell Community eXtensions module (pscx). So we’ll use that:

Install-Module pscx

When PowerShell starts up it loads the user profile first, so we can use that to configure the environment. The path to the profile is stored in the $profile variable; by default it points to Microsoft.PowerShell_profile.ps1 in the WindowsPowerShell folder under your Documents folder.
If the file doesn’t exist yet, create it and add the following lines:

Import-Module Pscx 

# Load Visual Studio environment
Invoke-BatchFile 'C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\vcvarsall.bat' x86

You may need to modify the path if you have a 32-bit Windows version or installed VS in a different location.

Now when you start a PowerShell prompt, you can run MSBuild and all the other tools used in .NET development.

Git support
If you’ve already installed MSysGit you should be able to use git from PowerShell. To further integrate git into PowerShell you can use posh-git. Posh-git provides command completion using the Tab-key and adds the repository status to the prompt. To install using PsGet:

Install-Module posh-git

The installation will update your PowerShell profile to load the module automatically.


Friday, April 20, 2012

Reusing views between areas in ASP.NET MVC

ASP.NET MVC Areas are great for organizing functionality that logically belongs together. However, the default view location logic is somewhat limited.

If I want to use a view from one area in a controller for another, I can specify its path explicitly. This works fine. However, when the view uses EditorTemplates or DisplayTemplates in the corresponding subfolders, the view engine will not be able to locate them: templates are resolved independently from the view being rendered, based on the executing controller and area.

What's going wrong?

The ViewEngine is in charge of resolving views and templates based on the controller name, the view name and the area name. By default the WebFormViewEngine considers the following paths for views in an area ({0} is the view name, {1} the controller name and {2} the area name):

~/Areas/{2}/Views/{1}/{0}.aspx
~/Areas/{2}/Views/{1}/{0}.ascx
~/Areas/{2}/Views/Shared/{0}.aspx
~/Areas/{2}/Views/Shared/{0}.ascx

It then falls back to the non-area based views:

~/Views/{1}/{0}.aspx
~/Views/{1}/{0}.ascx
~/Views/Shared/{0}.aspx
~/Views/Shared/{0}.ascx
As you can see, the view engine expects the view and any EditorTemplate or DisplayTemplate to be located in either the area of the controller or in the main views folder.

So, one possible solution for sharing views and templates between areas is to move the files to the main views folder. However, since my main views folder is crowded enough as it is, I prefer not to do that.

Since the default view engine uses simple path generation based on the area, controller and view names it's not going to be trivial to force it to look for templates in the folders relative to the currently rendering view. That context is simply not available.


An alternate approach is to expand on the Shared folder pattern by introducing a Shared area. We need to tweak the view engine a bit to support that:

public class WebFormViewEngine : System.Web.Mvc.WebFormViewEngine
{
  public WebFormViewEngine()
     : this( null )
  {
  }

  public WebFormViewEngine( IViewPageActivator viewPageActivator ) : base( viewPageActivator )
  {
     AreaViewLocationFormats = AreaViewLocationFormats
        .Union( new[] { "~/Areas/Shared/Views/{1}/{0}.aspx", "~/Areas/Shared/Views/{1}/{0}.ascx" } )
        .ToArray();

     AreaPartialViewLocationFormats = AreaViewLocationFormats;
  }
}
That's all there is to it. We've added some additional search paths for the view engine to consider. It will now look in the Shared area folder for views and templates.
During application startup, you do need to replace the default WebFormViewEngine with this customized version:

ViewEngines.Engines.Remove( ViewEngines.Engines.First( e => e is System.Web.Mvc.WebFormViewEngine ) );
ViewEngines.Engines.Insert( 0, new WebFormViewEngine() );

If you're using Razor views, you can apply a similar fix to the RazorViewEngine.


Reusing views from one area in a controller for another area is possible by specifying the full path to the view. Doing so however breaks the Editor- and DisplayTemplates because they are resolved independently from the view that is being rendered.
The solution is to introduce a Shared area. Move the views and templates there and then tweak the view engine to also consider the shared area when resolving the views and templates.


Tuesday, January 17, 2012

Applying DATEDIFF to DateTimeOffset in SQL Server

In a recent post I described how you can upgrade columns from DateTime to DateTimeOffset and add the missing time zone offset. After you do that, you may notice that the DATEDIFF function does not work the way it did before.
For example:
DECLARE @timeInZone1 datetimeoffset, @timeInZone2 datetimeoffset

-- Two times on the same day in timezone UTC +1 (Western Europe)
SET @timeInZone1 = '2012-01-13 00:00:00 +1:00';
SET @timeInZone2 = '2012-01-13 23:59:59 +1:00';

SELECT DATEDIFF( day, @timeInZone1, @timeInZone2 );
-- result is 1 !!!
Not exactly what I had expected. DATEDIFF(day, x, y) returns the number of day boundaries crossed between x and y. Since both dates are on the same day in the same time zone you’d expect the function to return 0.
It returns 1 however because DATEDIFF ignores the time zone offset and compares the underlying UTC times instead. In UTC the first date is actually January 12, 2012 23:00.
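
The same mechanics can be illustrated outside SQL Server; in this JavaScript sketch (illustrative only), comparing UTC day numbers mirrors what DATEDIFF does here:

```javascript
// DATEDIFF(day, ...) counts midnight boundaries crossed. For datetimeoffset
// values the comparison happens in UTC, so two timestamps on the same local
// day can still straddle a UTC midnight.
const t1 = new Date("2012-01-13T00:00:00+01:00"); // 2012-01-12 23:00 UTC
const t2 = new Date("2012-01-13T23:59:59+01:00"); // 2012-01-13 22:59 UTC

function utcDayNumber(d) {
  return Math.floor(d.getTime() / 86400000); // whole days since epoch, in UTC
}

console.log(utcDayNumber(t2) - utcDayNumber(t1)); // 1: a UTC midnight sits between them

// Dropping the offset compares wall-clock dates instead (the same idea as
// casting to DateTime below):
function localDayNumber(iso) {
  return Math.floor(Date.parse(iso.slice(0, 19) + "Z") / 86400000);
}
console.log(localDayNumber("2012-01-13T23:59:59+01:00") -
            localDayNumber("2012-01-13T00:00:00+01:00")); // 0
```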
To get the result I had expected, I need to compare the local times not the UTC times. The trick is to convert to DateTime first:
SELECT DATEDIFF( day,
    CAST(@timeInZone1 AS DateTime),
    CAST(@timeInZone2 AS DateTime) );
-- result is 0
This assumes however that both dates are in the same time zone. If that is not the case you can use SWITCHOFFSET to normalize on a time zone. Then cast the values to DateTime and apply the DATEDIFF function.
DECLARE @offset int;
SET @offset = 120; -- UTC +2
SELECT DATEDIFF( day,
    CAST(SWITCHOFFSET(@timeInZone1, @offset) AS DateTime),
    CAST(SWITCHOFFSET(@timeInZone2, @offset) AS DateTime) );
-- result is 0

Looks like I’m not the only one that was a little surprised about the behavior.

Friday, January 13, 2012

Converting to DateTimeOffset in SQL Server 2008

SQL Server 2008 introduces a new temporal data type, DateTimeOffset. This is the only data type that can hold both a date-time value and time zone offset information, so it’s perfect for holding data collected from around the world.

I’m preparing an application to support global scalability, so it’s time to upgrade some columns from DateTime to DateTimeOffset. Conversion however is not exactly trivial: unlike, for example, PostgreSQL, SQL Server does not add the time zone offset information for you. You’ll have to do that yourself.

Converting DateTime to DateTimeOffset
When altering the column type to DateTimeOffset, the time zone offset is set to 0, which is UTC.
The actual offset at any moment in time is determined by the base time zone offset (i.e. +1 hour for most of Western Europe) plus daylight saving time (DST). The DST part is 0 in winter and +1 hour in summer.

The base time zone offset is fixed, but the dates DST starts and ends differ every year (though they are totally predictable).
The data set I’m updating spans a time period from early 2009 up to now, January 2012. I wrote a simple function that determines the DST time offset based on the actual date that needs to be converted.

CREATE FUNCTION dbo.fn_DSTOffset( @date1 DATETIME ) RETURNS INT AS
BEGIN
  RETURN CASE WHEN ( ( @date1 > '2009-3-29 2:00' AND @date1 <= '2009-10-25 2:00' )
      OR ( @date1 > '2010-3-28 2:00' AND @date1 <= '2010-10-31 2:00' )
      OR ( @date1 > '2011-3-27 2:00' AND @date1 <= '2011-10-30 2:00' ) )
      THEN 60 -- offset in minutes
      ELSE 0 END;
END;
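
The hard-coded ranges follow the EU rule: DST runs from the last Sunday of March to the last Sunday of October. Those boundary dates are easy to compute for any year; a JavaScript sketch (illustrative, not part of the conversion script):

```javascript
// Last Sunday of a month, in UTC (month is 0-based, as in JavaScript dates).
// EU DST starts on the last Sunday of March and ends on the last Sunday of October.
function lastSunday(year, month) {
  const d = new Date(Date.UTC(year, month + 1, 0)); // day 0 of next month = last day of this month
  d.setUTCDate(d.getUTCDate() - d.getUTCDay());     // step back to the preceding (or same) Sunday
  return d;
}

console.log(lastSunday(2012, 2).toISOString().slice(0, 10)); // "2012-03-25"
console.log(lastSunday(2012, 9).toISOString().slice(0, 10)); // "2012-10-28"
```

This also confirms why the data set (ending January 2012) needs no 2012 entry in the CASE: the 2012 DST period only starts on March 25.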

SQL Server handles time zone offsets in minutes. The actual offset from UTC is the DST offset plus the base offset. A conversion would look like this:

DECLARE @baseoffset INT;
SET @baseoffset = 60; -- UTC +1 = 60 minutes
UPDATE MyTable -- substitute the table being converted
SET [Created] = TODATETIMEOFFSET( CAST( [Created] AS datetime2 ),
                                  @baseoffset + dbo.fn_DSTOffset( [Created] ) );

The TODATETIMEOFFSET function attaches the specified time zone offset without altering the date and time. Note that SWITCHOFFSET is not the right tool for this step: it preserves the same UTC instant and shifts the clock time instead, while the stored clock times are already correct local times.