Sunday, February 28, 2010

The Forgotten Web.Config Inheritance Elegance

When you are running multiple web applications on a server, you sometimes bump into the challenge that web.config settings are cumulative. This can make life sweet for some things; for example, you only need to specify the SMTP server settings once, in the root web.config.

 

There are other times when you wish to have only part of the parent web.config inherited into your application. The magic control element is <location>.

 

You may put this around any child element of <configuration> (remembering to keep it valid XML) and then include or exclude elements.

 

<configuration>
    <location path="PublicSite">
        <system.web>
            <!-- settings for "PublicSite" go here... -->
        </system.web>
    </location>
    <location path="PrivateSite">
        <system.web>
            <!-- settings for "PrivateSite" go here... -->
        </system.web>
    </location>
    <location path="." inheritInChildApplications="false">
        <system.web>
        </system.web>
    </location>
</configuration>

The MSDN documentation is here. It applies to .NET 2.0 and later. It would be nice if it could be placed around any element… hint, hint, hint to MSFT.

Wednesday, February 24, 2010

Using SQL Updateable Views to do live refactoring

Recently I inherited a project with a doubly ugly code base: both the C# and the database were ugly. To describe it simply, property information was gathered from various counties, with the column names matching the terms each county used on its web site, and each county was given its own table. A totally denormalized data model from purgatory.

 

The project is live, and thus options for a refactor were reduced. The first step was to build an application that allowed everything to be mapped to standardized field names. Once that was done, the magic was to create an updatable view on each table using the new field names. Super-setting the fields yielded a UNION-able set of views with the same field names. Voila, I could start doing new development and refactoring against these views without breaking the existing ugly code base, while also getting easy cross-county queries.

 

As more and more refactoring happens, I can then flip the views to tables and retire the tables from hell.  It’s a practical solution to a really messy problem with a relatively clean path.

Monday, February 22, 2010

Adapting the XML mindset with SQL Server

The addition of native XML support to SQL Server was one of the smartest database moves Microsoft has made, and it put the product ahead of all its competitors. The challenge is that many database developers are XML-illiterate, or only marginally literate, so the feature is often underutilized or misused.

 

A common “trick” I use to judge a SQL developer's skills with XML is to ask this question:

  • You have a Class table with ClassId, ClassName, etc.
  • A Student table with StudentId, StudentName, etc.
  • A Registration table containing only ClassId and StudentId

Please write a T-SQL statement to return all students, with the names of the classes they are in, as an XML document.

 

What I look for is whether the person asks, “Do you want two nodes (Student, Class) or three nodes (Student, Registration, Class)?” The answer: I want two, Student and Class. I don't want to see the Registration table.

 

If they seem confused about what I want as output (I tend to look for folks who will fill in the dots automatically), then I sketch this on the board:

<School>
  <Student StudentName="John Adams">
    <Class ClassName="SQL 101" />
  </Student>
  <Student StudentName="John Smith">
    <Class ClassName="SQL 102" />
  </Student>
  <Student StudentName="John Tams">
    <Class ClassName="SQL 101" />
    <Class ClassName="SQL 102" />
  </Student>
</School>

If they ask how you get the <School> element in there, it tells me they should go back to school. Or worse, they try concatenating '<School>' + … + '</School>' around the query. The solution is, of course:

 

Select StudentName, ClassName From Student
Join
    (Select StudentId, ClassName From Class A Join Registration B On
     A.ClassId = B.ClassId) Class
On Student.StudentId = Class.StudentId
Order by StudentName
For Xml Auto, Root('School')

Of course, Order by StudentName, ClassName would also be fine. The next question is simple: in SQL Server Management Studio, how many rows and columns will be displayed (using the default settings)?


Of course, it's one row and one column. If they get that wrong, assuming zero real experience is likely reasonable.

 

If the developer asserts that they are also a C# or .NET developer (or, more dangerously, that .NET is their primary strength but they are strong in SQL), then I hit them with my .NET/SQL XML differentiating question: the above query returns a 30K XML document; show me how you would get this data from SQL Server into C#. I may remind them that we get exactly one row and one column back from the above query.

 

  • The reminder is actually an attempt to lead them down the garden path to the cliff of lemmings. A developer who has not done it will immediately jump to ExecuteScalar(), assuming this was a simple pro-forma question. It's not: the XML will be truncated at 2,033 characters… oops, we are 28K short and have invalid XML (see KB article Q310378: XML Data Is Truncated When You Use SqlDataReader). The naive version of this trap is sketched below.
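To be concrete, the lemming version looks something like this (a sketch only; forXmlQuery and connection stand for the FOR XML query above and an open SqlConnection):

using System.Data.SqlClient;

// Looks reasonable, but SQL Server streams a FOR XML result back to the client
// in chunks of roughly 2,033 characters, and ExecuteScalar() only returns the
// first column of the first row, i.e. only the first chunk.
using (var cmd = new SqlCommand(forXmlQuery, connection))
{
    string xml = (string)cmd.ExecuteScalar();   // truncated, invalid XML
}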

Now there are two reasonable solutions to this.

  • Use ExecuteXmlReader properly (there are a lot of examples on the web, including on Microsoft.com, that will work ONLY for XML under 2,033 characters):
StringWriter sw = new StringWriter();
var rdr = cmd.ExecuteXmlReader();
rdr.Read();
while (rdr.ReadState != ReadState.EndOfFile)
{
    // ReadOuterXml() consumes the current fragment and advances the reader.
    sw.Write(rdr.ReadOuterXml());
}
rdr.Close();
// Note: wrap the accumulated string in a StringReader (not a StreamReader,
// which would treat the string as a file path).
XmlTextReader tr = new XmlTextReader(new StringReader(sw.ToString()));

Or convert it to an nvarchar(max) in T-SQL, which you can then retrieve with an ExecuteScalar. The T-SQL would be:

Select cast(
    (Select StudentName, ClassName From Student
     Join
        (Select StudentId, ClassName From Class A Join Registration B On
         A.ClassId = B.ClassId) Class
     On Student.StudentId = Class.StudentId
     Order by StudentName
     For Xml Auto, Root('School')) as nvarchar(max))

You must remember that the pattern is:

SELECT CAST( (SELECT … FOR XML AUTO) AS NVARCHAR(MAX))

The extra ( ) around the select is essential.
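On the C# side, the CAST version really can be read with ExecuteScalar. A minimal ADO.NET sketch, where connectionString and castQuery are assumed to hold your connection string and the statement above:

using System.Data.SqlClient;

using (var connection = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(castQuery, connection))
{
    connection.Open();
    // The CAST collapses the result to a single nvarchar(max) value:
    // one row, one column, no 2,033-character chunking.
    string xml = (string)cmd.ExecuteScalar();
}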

 

The question, of course, is which performs better. I will address that in a future blog.

 

OK, why do this? The answer is simple: if you are using web services or WCF, the most effective approach (performance-wise) is to avoid any serialization/deserialization of .NET objects and just pass a string from the database, through WCF, to the client. XML is sweet because it can express hierarchy (and reduce data volume by eliminating redundant data). More in future blogs.
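As an illustration of that pass-through idea (a sketch only; the contract name and operation are mine, not from a real project):

using System.ServiceModel;

[ServiceContract]
public interface ISchoolService
{
    // The XML built by SQL Server is handed through as a plain string;
    // no object graph is serialized or deserialized at the service boundary.
    [OperationContract]
    string GetSchoolXml();
}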

Friday, February 19, 2010

Using IEnumerable

The method I posted a couple of days ago is my favorite way to get data from SQL Server; it returns an IEnumerable of a private class. Programming with IEnumerable requires a different thought process than when you are dealing with arrays.

 

For example, when you are dealing with arrays you can check whether they are empty by comparing the Length property against zero, which would be very expensive if you returned all the rows from a SQL Server call just to see if one row existed. With IEnumerable, however, you only need one row to “prove” that some exist. Here is my HasMembers property:

 

public Boolean HasMembers
{
    get
    {
        // Enumeration stops at the first row, so at most one row is ever read.
        foreach (Member member in Members())
            return true;

        return false;
    }
}

Along with my Members() method that returns the IEnumerable, I include this property to quickly check for the existence of members.
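For readers who missed that earlier post, the shape of the Members() iterator is roughly the following. This is only a sketch: the Member class, the query, and the _connectionString field are stand-ins, not the original code.

using System.Collections.Generic;
using System.Data.SqlClient;

private class Member
{
    public int MemberId { get; set; }
    public string MemberName { get; set; }
}

private IEnumerable<Member> Members()
{
    using (var connection = new SqlConnection(_connectionString))
    using (var cmd = new SqlCommand("SELECT MemberId, MemberName FROM Member", connection))
    {
        connection.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // yield return streams one row at a time, so HasMembers reads
                // at most a single row before returning.
                yield return new Member
                {
                    MemberId = reader.GetInt32(0),
                    MemberName = reader.GetString(1)
                };
            }
        }
    }
}

On .NET 3.5 and later, the same check can also be written as Members().Any(), which short-circuits in exactly the same way.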

 


A strategy for building a Section 508/WCAG Telerik radGrid page

Accessibility and translations can often turn code into a Medusa's head. On one hand you want a feature-rich grid, but you are constrained by Section 508. Managing and tracking resource mnemonics across a large project becomes a nightmare (and very, very boring typing for developers, which usually impacts quality control). Another pain with a large site is consistency of presentation.

 

What if I claim that all you need to do is toss the code below into a page and be done with the coding?

 

The Web Page Content:

<telerik:RadGrid runat="server" ID="AccountSummaryGrid" DataSourceID="AccountSummaryDb" AccessKey="X">
    <MasterTableView AutoGenerateColumns="false" DataKeyNames="AccountName">
        <Columns>
            <telerik:GridBoundColumn DataField="AccountName" UniqueName="Grid_AccountName">
            </telerik:GridBoundColumn>
            <telerik:GridBoundColumn DataField="Description" UniqueName="Grid_AccountDescription">
            </telerik:GridBoundColumn>
            <telerik:GridBoundColumn DataField="Balance" UniqueName="Grid_AccountBalance">
            </telerik:GridBoundColumn>
            <telerik:GridBoundColumn DataField="CreditLimit" UniqueName="Grid_AccountCreditLimit">
            </telerik:GridBoundColumn>
            <telerik:GridBoundColumn DataField="OwnershipType" UniqueName="Grid_OwnershipType">
            </telerik:GridBoundColumn>
        </Columns>
    </MasterTableView>
</telerik:RadGrid>

With the code-behind being this “horrible” single line:

protected void Page_Load(object sender, EventArgs e)
{
    RadGridUtilities.ApplyDefaultSettings(AccountSummaryGrid, true);  // true = apply only on the first GET
}

There are many tricks (and enhancements not shown), but the first item is to give the user the option to put the site into Accessibility Mode. If you give that option, then only the pages rendered while in Accessibility Mode must comply with Section 508. You can be as feature-rich as you want elsewhere!

 

This is what the one C# statement above does.

  • Provides web-site-wide defaults to the grid (every grid…). If you want the defaults changed, it happens in one spot only, not in 100 pages!
  • Provides translation keyed off the Control.ID (so no more “Button1” but a descriptive “ButtonToRetrieveStatement”).
  • Modifies the grid for what you want to do for accessibility.

There are a few items that I excluded (like automatically creating Resx entries for every phrase dynamically and self-auditing for 508 compliance) so the code is clearer.
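One piece that is referenced but not shown is Section508.UserSetting. There is nothing magic about it; a minimal sketch (the session-backed storage and the key name are my assumptions) could be:

using System.Web;

public static class Section508
{
    private const string SessionKey = "AccessibilityMode";

    /// <summary>True when the current user has opted into Accessibility Mode.</summary>
    public static bool UserSetting
    {
        get
        {
            object flag = HttpContext.Current.Session[SessionKey];
            return flag != null && (bool)flag;
        }
        set
        {
            HttpContext.Current.Session[SessionKey] = value;
        }
    }
}

With that piece out of the way, here is the ApplyDefaultSettings utility itself: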

/// <summary>
/// Applies the default settings to the page. If user customization is implemented
/// the settings here would come from their preferences stored in a Session object.
/// </summary>
/// <param name="grid">a radGrid control</param>
/// <param name="onGetOnly">Determines if the settings are applied on every postback or only on the first GET</param>
public static void ApplyDefaultSettings(RadGrid grid, bool onGetOnly)
{
    // UniqueNames should contain an underscore (_); grid_ is the recommended prefix so that
    // localization can identify items used in grids.
    char[] sepUnique = { '_', ' ' };

    // Event handler required for Section 508 row scoping (see below).
    grid.ItemCreated += grid_ItemCreatedAddRowScope;

    // Get translations. For items that Telerik may have translated use fallback version. 
    grid.MasterTableView.Caption = ResourceTranslation.TranslateField(string.Format(CultureInfo.CurrentCulture, "%ContentPage%{0}_Caption%", grid.ID));
    grid.MasterTableView.CommandItemSettings.AddNewRecordText = ResourceTranslation.TranslateField("%MasterPage%Grid_AddNewRecordText%", grid.MasterTableView.CommandItemSettings.AddNewRecordText);
    grid.MasterTableView.CommandItemSettings.ExportToCsvText = ResourceTranslation.TranslateField("%MasterPage%Grid_ExportToCsvText%", grid.MasterTableView.CommandItemSettings.ExportToCsvText);
    grid.MasterTableView.CommandItemSettings.ExportToExcelText = ResourceTranslation.TranslateField("%MasterPage%Grid_ExportToExcelText%", grid.MasterTableView.CommandItemSettings.ExportToExcelText);
    grid.MasterTableView.CommandItemSettings.ExportToPdfText = ResourceTranslation.TranslateField("%MasterPage%Grid_ExportToPdfText%", grid.MasterTableView.CommandItemSettings.ExportToPdfText);
    grid.MasterTableView.CommandItemSettings.ExportToWordText = ResourceTranslation.TranslateField("%MasterPage%Grid_ExportToWordText%", grid.MasterTableView.CommandItemSettings.ExportToWordText);
    grid.MasterTableView.CommandItemSettings.RefreshText = ResourceTranslation.TranslateField("%MasterPage%Grid_RefreshText%", grid.MasterTableView.CommandItemSettings.RefreshText);
    grid.MasterTableView.CssClass = ResourceTranslation.TranslateField("%MasterPage%Grid_CssClass%", grid.MasterTableView.CssClass);
    grid.MasterTableView.Summary = ResourceTranslation.TranslateField(string.Format(CultureInfo.CurrentCulture, "%ContentPage%{0}_Summary%", grid.ID));

    // This section would be used if images are to be customized for languages (unlikely)
    //    grid.MasterTableView.CommandItemSettings.AddNewRecordImageUrl = ResourceTranslation.TranslateField(grid.MasterTableView.CommandItemSettings.AddNewRecordImageUrl);
    //    grid.MasterTableView.CommandItemSettings.ExportToCsvImageUrl = ResourceTranslation.TranslateField(grid.MasterTableView.CommandItemSettings.ExportToCsvImageUrl);
    //    grid.MasterTableView.CommandItemSettings.ExportToExcelImageUrl = ResourceTranslation.TranslateField(grid.MasterTableView.CommandItemSettings.ExportToExcelImageUrl);
    //    grid.MasterTableView.CommandItemSettings.ExportToPdfImageUrl = ResourceTranslation.TranslateField(grid.MasterTableView.CommandItemSettings.ExportToPdfImageUrl);
    //    grid.MasterTableView.CommandItemSettings.ExportToWordImageUrl = ResourceTranslation.TranslateField(grid.MasterTableView.CommandItemSettings.ExportToWordImageUrl);
    
    // Site specific options. 
    // If User Options are allowed, the user's setting would be assigned instead.
    grid.AllowPaging = true;
    grid.AllowSorting = true;
    grid.ExportSettings.FileName = grid.ID;
    grid.ExportSettings.IgnorePaging = true;
    grid.ExportSettings.OpenInNewWindow = true;
    grid.ExportSettings.Pdf.Author = "Lassesen Consulting, LLC";
    grid.ExportSettings.Pdf.PageTitle = grid.MasterTableView.Caption;
    grid.ExportSettings.Pdf.Subject = grid.ID;
    grid.MasterTableView.AllowPaging = true;
    grid.MasterTableView.AllowSorting = true;
    grid.MasterTableView.CommandItemDisplay = GridCommandItemDisplay.TopAndBottom;
    grid.MasterTableView.CommandItemSettings.ShowExportToCsvButton = true;
    grid.MasterTableView.CommandItemSettings.ShowExportToExcelButton = true;
    grid.MasterTableView.CommandItemSettings.ShowExportToPdfButton = true;
    grid.MasterTableView.CommandItemSettings.ShowExportToWordButton = true;
    grid.MasterTableView.PagerStyle.Mode = GridPagerMode.NextPrevNumericAndAdvanced;
    grid.PageSize = 60;

    // Changes for accessibility at the grid level.
    // Remember that a handicap can be cognitive or physical -- so trim lots
    // of features.
    if (Section508.UserSetting)
    {
        grid.AllowCustomPaging = false;
        grid.AllowFilteringByColumn = false;
        grid.AllowMultiRowEdit = false;
        grid.AllowMultiRowSelection = false;
        grid.MasterTableView.AllowPaging = false;
        grid.MasterTableView.AllowSorting = false;
        grid.MasterTableView.CommandItemDisplay = GridCommandItemDisplay.None;
        grid.MasterTableView.CommandItemSettings.ShowExportToCsvButton = false;
        grid.MasterTableView.CommandItemSettings.ShowExportToExcelButton = false;
        grid.MasterTableView.CommandItemSettings.ShowExportToPdfButton = false;
        grid.MasterTableView.CommandItemSettings.ShowExportToWordButton = false;
        grid.MasterTableView.ShowGroupFooter = false;
        grid.ShowGroupPanel = false;
        grid.ShowStatusBar = false;
    }

    int colNo = 0;
    foreach (GridColumn col in grid.MasterTableView.Columns)
    {
        // UniqueNames are assumed to be compounded with an underscore
        // separator. Only the last part is used as the HeaderAbbr.
        var translationkey = col.UniqueName;
        var parts = col.UniqueName.Split(sepUnique, System.StringSplitOptions.RemoveEmptyEntries);
        col.HeaderAbbr = parts[parts.Length - 1];
        // Changes for accessibility at the column level
        if (Section508.UserSetting)
        {
            col.AutoPostBackOnFilter = false;
            col.Groupable = false;
            col.HeaderButtonType = GridHeaderButtonType.PushButton;
            col.Resizable = false;
        }
        string tooltipMnemonic = string.Empty;
        //Determine if sortable and adopt appropriate tooltip
        if (grid.AllowSorting)
        {            
            if (! string.IsNullOrEmpty(col.SortExpression))
            {
                tooltipMnemonic = "Sort";
            }
            tooltipMnemonic = string.Format(CultureInfo.CurrentCulture, "%ContentPage%{0}_{1}Tooltip%", translationkey, tooltipMnemonic);
            col.HeaderTooltip = ResourceTranslation.TranslateField(tooltipMnemonic);
        }
        col.HeaderText = ResourceTranslation.TranslateField(string.Format(CultureInfo.CurrentCulture, "%ContentPage%{0}%", translationkey));
    }
}

 

The first item of importance is the event handler that is added:

grid.ItemCreated += grid_ItemCreatedAddRowScope;

 

Grids require at least one <td scope="row"> for Section 508, which is not natively provided by Telerik, so we must add it while the item is being created. This is done with the code below. The columns used are those listed in DataKeyNames, which makes it almost happen by designed accident.

/// <summary>
/// Routine to add scope="row" to the grid for 508 compliance. The items are those specified in
/// DataKeyNames.
/// </summary>
/// <param name="sender">a Telerik RadGrid control</param>
/// <param name="e"></param>
static void grid_ItemCreatedAddRowScope(object sender, GridItemEventArgs e)
{
    if (e.Item is GridDataItem)
    {
        GridDataItem dataItem = e.Item as GridDataItem;
        foreach (string key in dataItem.OwnerTableView.DataKeyNames)
        {
            foreach (GridColumn col in dataItem.OwnerTableView.Columns)
            {
                if (col.IsBoundToFieldName(key))
                {
                    TableCell cell = dataItem[col.UniqueName];
                    cell.Attributes["scope"] = "row";
                }
            }
        }
    }
}

The other item is ResourceTranslation.TranslateField, which is a thin shim between the code and the usual handling of resources. It allows me to identify when a resource is missing, as well as do some fancy stuff like “look up the term and, if you do not find it, use what Telerik provides.”
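As a rough idea of the shape of that shim (a sketch of the concept only, not the production code; the real version also logs or auto-creates missing Resx entries):

using System;
using System.Web;

public static class ResourceTranslation
{
    public static string TranslateField(string mnemonic)
    {
        // No fallback supplied: returning the mnemonic itself makes a missing
        // resource entry visible on the page during testing.
        return TranslateField(mnemonic, mnemonic);
    }

    public static string TranslateField(string mnemonic, string fallback)
    {
        // A mnemonic such as "%MasterPage%Grid_RefreshText%" splits into the
        // global resource class name and the resource key.
        string[] parts = mnemonic.Split(new[] { '%' }, StringSplitOptions.RemoveEmptyEntries);
        object phrase = HttpContext.GetGlobalResourceObject(parts[0], parts[1]);
        // If the phrase is not in the Resx, fall back to whatever the control
        // (for example, Telerik's built-in text) already provides.
        return phrase == null ? fallback : (string)phrase;
    }
}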

 

That’s it.

Thursday, February 18, 2010

Planning for Web Accessibility Compliance

Web accessibility compliance is often an unknown country for web developers. There are a few cases where it is required (or, if specified in the contract and ignored, can change a profitable contract into a major loss leader). A constant problem is looking at the past and not the future: do you really want to deliver a product that will likely violate either the law or the standards within a few months? Well, some folks would really want to do so, in order to have the work of updating the sites to conform next year.

 

The timeline below shows the current state of evolution:

[timeline image]

To understand the trend remember the following:

  • Section 508 was based on the drafts of WCAG 1.0
  • The proposed update of Section 508 was based on drafts of WCAG 2.0
  • The Access Board is an ongoing government group that sets the technical details for Section 508. They have the legal authority to update the standards to incorporate most, if not all, of WCAG 2.0.

Now, if you wish to deliver a STANDARDS-based web site, then since WCAG 2.0 is the current standard (WCAG 1.0 is NO LONGER a standard; it has been superseded) you should target that, which will usually also result in compliance with the 1998 Section 508 rules, with a few exceptions; the exceptions may disappear when the pending updates happen.

 

That's it in a snapshot. Suggested further readings are:

Additionally, if the web site has a market in different countries, then you may be subject to additional regulations. Fortunately, WCAG 2.0 tends to be a superset of such regulations.

Wednesday, February 17, 2010

The forgotten file for .NET web sites and applications: how to throttle a website

I frequently observe that many sites and applications do not have a Global.asax file. One sweet use of such a file is the ability to keep a site appearing well behaved to users, and to throttle load the moment any component starts to fail due to load, network issues, or whatever.

  • First, you NEVER want a site to vomit .NET/C# error messages. Not only is it unprofessional, the information shown can sometimes be used to hack the site.
  • Second, if components are failing due to load or unexpected problems, you really want to tell the users to go away and come back later. The problem is knowing when to send users away.

First we add a global error trap. If you forgot to do a try/catch somewhere in your code, fear not: this will handle it!

/// Page to send the user to if any unexpected error occurs
private static string offlinepage = "~/NotAvailable.htm";
/// Cookie name
private static string offline = "_Offline_";

void Application_Error(object sender, EventArgs e)
{
    if (String.Compare(Request.AppRelativeCurrentExecutionFilePath, offlinepage, true) == 0)
    {
        return;
    }
    // Production/QA Logic
    DateTime offlineuntil = DateTime.Now.AddMinutes(15);
    HttpCookie cookie = new HttpCookie(offline, offlineuntil.ToString());
    cookie.Expires = offlineuntil;
    Response.Cookies.Add(cookie);
    Response.Redirect(offlinepage, true);
}

So when an error occurs, the user is redirected to the offline page AND a cookie is added that lasts for 15 minutes.

 

Now we use another event to make sure that any attempt by this user to connect again is gracefully and politely refused.

void Application_AuthorizeRequest(object sender, EventArgs e)
{
    DateTime now = DateTime.Now;
    // The cookie may be absent or expired, so check for null before reading its value.
    HttpCookie cookie = Request.Cookies[offline];
    if (cookie != null
        && !string.IsNullOrEmpty(cookie.Value)
        && DateTime.TryParse(cookie.Value, out now)
        && now > DateTime.Now)
    {
        if (String.Compare(Request.AppRelativeCurrentExecutionFilePath, offlinepage, true) == 0)
        {
            return;
        }
        Response.Redirect(offlinepage, true);
    }
}

We actually overdid the testing by not only checking for the cookie (which should have expired), but also checking the datetime when the cookie is present (in case something goes odd with the browser's handling of cookies).

 

So what happens? Users are denied access after the first error. If a component is timing out, that's fine: the load will quickly be trimmed to the volume that can be handled without error. No crashing servers. No ugly error messages. A well-behaved website!

 

P.S. The offline page should be an HTML page (unless localization is an issue) so it places minimal load on the IIS server.

Serializing Data Transfer for Performance

Sometimes when you review other people's code you see something that causes you to ask, “Why code it THAT way, instead of {2-6 alternative approaches}?” With the CLR I have long stated, “What you get for performance is often NOT what you expected for performance.” I decided to look at the performance of three ways of passing data via an array (the best performer in an earlier blog) via serialization. I exclude the actual data transfer via TCP that would occur with a web service or WCF, but include the LENGTH of the serialized data as a data point. The steps are:

  • Assemble the data
  • Convert to an object or a structure and add to a List
  • Convert the list to an array
  • Serialize the array
  • Deserialize the array to recover the original array

The objects are simple. A structure: lightweight, and one would expect the best performance:

public struct ViaStructure
{
    public string Name;
    public int NameLength;
    public bool IsEmpty;
}

A classic object equivalent:

public class ViaObjectPast
{
    private string _Name;
    private int _NameLength;
    private bool _IsEmpty;
    public ViaObjectPast()
    {
    }
    public string Name { get { return _Name; } set { _Name = value; } }
    public int NameLength { get { return _NameLength; } set { _NameLength = value; } }
    public bool IsEmpty { get { return _IsEmpty; } set { _IsEmpty = value; } }
}

And the new style object equivalent:

public class ViaObjectPresent
{
    public ViaObjectPresent()
    {
    }
    public string Name { get; set; }
    public int NameLength { get; set; }
    public bool IsEmpty { get; set; }
}

There are two ways of creating and applying values to these:

  • A – using object initializers
  • B – classic property assignment

A:

Array.ForEach(asList, item =>
{
    vitem = new ViaStructure() { Name = item.Name.LocalName, NameLength = item.Name.LocalName.Length, IsEmpty = item.IsEmpty };
    list.Add(vitem);
});

and B:

Array.ForEach(asList, item =>
{
    vitem = new ViaStructure();
    vitem.NameLength = item.Name.LocalName.Length;
    vitem.IsEmpty = item.IsEmpty;
    vitem.Name = item.Name.LocalName;
    list.Add(vitem);
});

Once the list was built, we converted it to an array:

var list1 = list.ToArray();

Then we serialized it via this code, and deserialized it:

using (var sw = new StringWriter())
{
    XmlSerializer ser = new XmlSerializer(typeof(ViaStructure[]));
    ser.Serialize(sw, list1);
    build2 = DateTime.Now;                     // timing marker captured by the test harness
    string serialized = sw.ToString();
    slen = serialized.Length;                  // length of the serialized XML in characters
    using (StringReader sr = new StringReader(serialized))
    {
        var list2 = (ViaStructure[])ser.Deserialize(sr);
        build3 = DateTime.Now;                 // timing marker captured by the test harness
    }
}

A bunch of parallel code follows the pattern of my prior blogs. I executed thirty loops for each case and received the results below.

 

Platform Targeted for x86

(All times are in milliseconds; Length is the serialized string length in characters. Each value is the minimum over the thirty runs.)

Row Labels          Min of BuildArray   Min of Deserialize   Min of Serialize   Min of Length
NoOp                0.0                 0.0                  0.0                0.0
Object Current A    24.4                1072.3               1030.3             20155197.0
Object Current B    23.4                1058.6               1058.6             20155197.0
Object Past A       23.4                1065.4               952.1              19266381.0
Object Past B       24.4                1015.6               955.1              19266381.0
Structure A         39.1                1003.9               2249.0             18970109.0
Structure B         35.2                986.3                2221.7             18970109.0

Platform Targeted for x64

Row Labels          Min of BuildArray   Min of Deserialize   Min of Serialize   Min of Length
NoOp                0.0                 0.0                  0.0                0.0
Object Current A    23.4                1040.0               986.3              20155197.0
Object Current B    23.4                1063.5               994.1              20155197.0
Object Past A       22.5                1017.6               899.4              19266381.0
Object Past B       22.5                1046.9               894.5              19266381.0
Structure A         33.2                996.1                2211.9             18970109.0
Structure B         30.3                1000.0               2230.5             18970109.0

 

Platform Targeted for Any

Row Labels          Min of BuildArray   Min of Deserialize   Min of Serialize   Min of Length
NoOp                0.0                 0.0                  0.0                0.0
Object Current A    25.4                1044.9               1137.7             20155197.0
Object Current B    25.4                1083.0               1117.2             20155197.0
Object Past A       23.4                1025.4               989.3              19266381.0
Object Past B       24.4                1078.1               966.8              19266381.0
Structure A         42.0                1015.6               2225.6             18970109.0
Structure B         37.1                1018.6               2282.2             18970109.0

Conclusions

 

There seems to be no significant difference between using initializers and classic assignment of properties. One shocker is that targeting Any CPU appears to result in the poorest performance (about 10% worse overall). This makes sense because, as “no man can serve two masters,” no code can perform better (or even as well) when it has two targets; the flexibility of going either way comes at a cost.

  • Building the array
    • x64 appears marginally faster (longer runs are needed to determine if this is true).
    • There appears to be no clear difference between the two ways of defining properties.
    • Structures are about 50% slower (unexpected; the .NET CLR may need tuning here).
  • Serialization
    • The current pattern ({get;set;}) runs 10% slower than using private variables (unexpected).
    • Structures are > 120% slower (unexpected).
    • There appears to be no clear difference between the two ways of defining properties.
  • Deserialization
    • Results for the current pattern ({get;set;}) are fuzzy: with x64 it appears slightly faster, with x86 slightly slower.
    • Structures are > 5% faster.
    • There appears to be no clear difference between the two ways of defining properties.
  • Length has more variation than expected:
    • The structure was 5% more compact than the current object pattern ({get;set;}).
    • The structure was 1% more compact than the past object pattern (private variables).

Before jumping to conclusions, I should mention that my source was XML. If I had simply transferred the XML without serialization/deserialization and built the array at the destination, the performance difference could be on the order of 2.2 seconds (serialization) versus 0.023 seconds: a 100-fold difference. This begs the question: why not move either a DataSet (serialized to XML via WriteXml) or raw XML instead of inflicting the heavy cost of serialization?
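The DataSet route, for comparison, is only a few lines. A sketch, assuming connectionString and selectStatement (the same SELECT without the FOR XML clause) are defined elsewhere:

using System.Data;
using System.Data.SqlClient;
using System.IO;

using (var adapter = new SqlDataAdapter(selectStatement, connectionString))
{
    var ds = new DataSet("School");
    adapter.Fill(ds, "Student");

    var sw = new StringWriter();
    ds.WriteXml(sw);                // or simply ds.GetXml()
    string xml = sw.ToString();     // ready to hand to a web service or WCF as-is
}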

A forthcoming blog will look at moving XML from the database to a client with a variety of approaches.