Posts

Showing posts from July, 2010

Time to get SCIENCE into Computer Science

Yesterday while hiking I listened to BBC Discovery, which covered the first controlled experiments on humans, done by a British Navy surgeon, James Lind, on sailors suffering from scurvy. The closing comments about how hard it is for people to stay scientific (basing decisions on good hard data rather than popular belief, religious belief, anecdotes, etc.) struck home with me. Often I have had discussions about which technology path to take, and I usually end up basing my recommendations on the following criteria:

- Number of lines of code needed to produce the intended results (lower is better)
- Application performance – how fast is it going to run?
- Application scalability – can we handle a bigger load?
- Expected time to completion, expected in the mathematical sense: Estimated Time x Probability of being correct
- Maintainability of the code base, which breaks down into:
  - Availability on the market of people skilled enough to do maintenance and development…

SQL Server and Rich-Sparse Data (Real Estate) – Part I

Over the last two years I have been involved with two clients that deal with real estate (different market segments, so no conflict of interest for me). I would describe real estate data as sparse, rich data. The typical scenario is that the data on one house consists of MLS data and County Assessor data. In the case of SQL Server 2008 R2, the number of fields involved quickly exceeds the number of columns available in a non-wide SQL Server table (1,024 columns), but stays below the column limit of a wide table (30,000). Wide tables were introduced in SQL Server 2008, so if you are forced to run on an older version of SQL Server the options change. In the case of wide tables, you are restricted to 4,096 columns for inserts, selects, or updates (despite having 30,000 columns available). A wide table has a column set, which is essentially an untyped XML column that combines all the sparse columns [more info]. This means that you could use XML if you are running SQL Server 2005. There’s…
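A minimal sketch of the wide-table approach described above; the table and column names are my own illustrations, not from the original post:

```sql
-- Illustrative only: sparse columns plus a column set (SQL Server 2008 and later).
CREATE TABLE dbo.Property
(
    PropertyId    int IDENTITY(1,1) PRIMARY KEY,
    ParcelNumber  varchar(40) NOT NULL,
    -- SPARSE columns use no storage when NULL, which suits rich-sparse
    -- MLS and County Assessor data where most fields are empty per row.
    PoolType      varchar(20) SPARSE NULL,
    RoofMaterial  varchar(30) SPARSE NULL,
    HoaFeeMonthly money       SPARSE NULL,
    -- The column set exposes every sparse column as one untyped XML fragment.
    AllAttributes xml COLUMN_SET FOR ALL_SPARSE_COLUMNS
);

-- Selecting the column set returns only the non-NULL sparse values as XML.
INSERT dbo.Property (ParcelNumber, PoolType) VALUES ('123-456-789', 'InGround');
SELECT ParcelNumber, AllAttributes FROM dbo.Property;
```

Note that once a column set is defined, SELECT * returns the column set XML instead of the individual sparse columns, which is part of what makes the 4,096-column DML limit workable.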

#if and [Conditional]

Old-school folks who have done Fortran, C, C++, etc. know our old friend, #if. .NET introduced a [Conditional] attribute and gives examples of its use. On occasion I have seen newer developers voice a dislike of seeing #if in code and advocate the use of [Conditional] as a standard. This does not fly for me, simply because it results in code becoming twisted. Often the reason for the resistance is simply unfamiliarity with #if, or in some cases a past experience with horrible usage has left its conditioning on their psyche (I reviewed the code base of a major (non-Microsoft) provider of tax software and saw nightmare usage patterns with #if). For example, the following code WILL NOT COMPILE:

Code Snippet

    public void Init(HttpApplication context)
    {
        context.EndRequest += EndRoutine;
    }

    [Conditional("TRACE")]
    private void EndRoutine(object o, EventArgs e)
    {

Error 1  Cannot create delegate with …
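The underlying difference can be sketched as follows (method names are illustrative). [Conditional] only removes *calls* to a method, so the method cannot be used as a delegate target (the CS1618 error above), while #if removes the code itself and therefore works anywhere:

```csharp
using System;
using System.Diagnostics;

class TracingDemo
{
    // Calls to this method are stripped by the compiler when TRACE
    // is not defined; the method body itself is always compiled.
    [Conditional("TRACE")]
    static void TraceMessage(string text)
    {
        Console.WriteLine("TRACE: " + text);
    }

    static void Main()
    {
        TraceMessage("entered Main");   // vanishes when TRACE is not defined

#if TRACE
        // #if excludes the code outright, so it can do what [Conditional]
        // cannot: wire a method up as an event handler.
        AppDomain.CurrentDomain.ProcessExit +=
            delegate { Console.WriteLine("TRACE: exiting"); };
#endif
    }
}
```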

Tracking user navigation on a web site

Typically you put up a website and hope everything goes well and gets used. It is often a good idea to track which pages users actually go to, and from where. A novice developer would likely add code to every page to record this information. A better developer might add the code to a master page. A better (and simpler) solution is to just drop an IHttpModule onto the website and have it record. If you produce many websites, then just compile it to a DLL and add it to each one. The code is very simple, as shown below:

Code Snippet

    using System;
    using System.Web;
    using System.Configuration;
    using System.Data;
    using System.Data.SqlClient;

    public class RequestTracking : IHttpModule
    {
        private HttpApplication httpApp;
        private string _Connection;

        public void Init(HttpApplication httpApp)
        {
            this.httpApp = httpApp;
            httpApp.BeginRequest += new EventHandler(httpApp_BeginRequest);
            _Connec…
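A self-contained sketch of such a module, along the lines described above; the connection-string name, table, and columns are my assumptions, not from the original post:

```csharp
using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Web;

public class RequestTrackingModule : IHttpModule
{
    private string _connection;

    public void Init(HttpApplication app)
    {
        // "TrackingDb" is an assumed connection-string name.
        _connection = ConfigurationManager
            .ConnectionStrings["TrackingDb"].ConnectionString;
        app.BeginRequest += OnBeginRequest;
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpRequest request = ((HttpApplication)sender).Request;
        object referrer = (request.UrlReferrer == null)
            ? (object)DBNull.Value
            : request.UrlReferrer.ToString();

        // "PageHit" is an assumed table with Url, Referrer, HitAt columns.
        using (SqlConnection conn = new SqlConnection(_connection))
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO PageHit (Url, Referrer, HitAt) " +
            "VALUES (@url, @ref, GETUTCDATE())", conn))
        {
            cmd.Parameters.AddWithValue("@url", request.RawUrl);
            cmd.Parameters.AddWithValue("@ref", referrer);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }

    public void Dispose() { }
}
```

The module then gets registered once per site in web.config (under httpModules for IIS 6 / classic mode, or modules for IIS 7 integrated mode), with no per-page code at all.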

The Art of Application UI Design

One of my projects involves a client that has an excellent idea but no experience in UI design. This is often the beginning of conflicts between “this is what I envision” and “this is best design practice”. There can be a lot of head-bumping; for example, for each dialog/page title bar:

- The customer wants the firm logo and service mark on all of them, and really wants to sell the motto.
- The developer wants the name of the dialog/functionality there so the user knows where they are:
  - For a support call, it makes it easy to help: just ask for the title at the top.
  - It allows the user to scroll through dialog titles to select where they want to go (depending on what a dialog is, and the environment).

I do not know the solution; my recommendation is to ask the customer to read some design books as a start. Some examples of items on my shelf dealing with web design:

- Web Design in a Nutshell, Jennifer Niederst, O’Reilly
- Web Navigation, Designing the…

Extending Sitemaps to provide rich page features and controls

On one web project there was a need to turn the ability to print individual pages on and off. The print control is on the master page, and the administrator wishes to be able to easily adjust which pages may print. The solution is actually simple: add a new attribute to the sitemap node. For illustration we will use an XmlSiteMap, as shown below, with @mayPrint:

    <?xml version="1.0" encoding="utf-8" ?>
    <siteMap xmlns="http://schemas.microsoft.com/AspNet/SiteMap-File-1.0">
      <siteMapNode url="" title="" description="">
        <siteMapNode url="default.aspx" title="Home Page" description="" mayPrint="true" />
        <siteMapNode url="about.aspx" title="About Us" description="" mayPrint="false" />
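On the master page, the custom attribute can then be read back from the current sitemap node. A minimal sketch, where `PrintControl` is an assumed control ID:

```csharp
using System;
using System.Web;
using System.Web.UI;

public partial class Site : MasterPage
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Custom sitemap attributes are exposed through SiteMapNode's
        // string indexer; an attribute that is not present returns null.
        SiteMapNode node = SiteMap.CurrentNode;
        bool mayPrint = (node != null) && (node["mayPrint"] == "true");

        // Hide the print control on pages flagged mayPrint="false" (or unflagged).
        PrintControl.Visible = mayPrint;
    }
}
```

Because the flag lives in the sitemap file, the administrator can change which pages print by editing XML, with no recompile.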

The sweetest HTML/Web Site Validator around -- Qualidator

I came across this tool last week and have been using the free version (and I am likely to upgrade soon to get more features). As the name implies, it not only evaluates technical conformity, but also evaluates the quality of the HTML and CSS on the page. Validation separates HTML workmen from HTML professionals/craftsmen. Workmen simply get the coarse job done: you want a door installed, it’s installed. There may be misalignments, gaps, missing cosmetic hardware, etc. The workmen may see them, but unless nagged (or threatened) will not do anything more. Craftsmen take pride in their craft: the door will fit, be aligned, look good, open and close smoothly, and the trim will be well cut and fitted. On the web you will hear voices saying that validation is not needed; you will hear fewer voices saying it’s absolutely essential for any professional site. Often the difference is a lack of skills, maturity, and discipline in one of those groups. “If you don’t have the bandwidt…

Doing Web-Based Standards Validation

There are a lot of sites, especially W3C, that provide site-wide validation. On the other side are web sites that require logons or are internal only; never the twain can meet. Or can they? I have a reasonable solution:

- To Web.Config, add an AppSetting, “StandardsReview”, that indicates whether this site is in review mode.
- Automate things like logins with a specific account if any page is called and the user is not logged in. I use master pages, so this is easy.
- Put an if/else around any code that you want to protect from accidental calls. Since you are using a specific account above, the account may be a safe testing account and this coding is not needed. It really should be such a safe account.

The following assumes that the validators will walk all of the links from the home page. On the home page I drop this simple code:

    <div style="display: none">
      <uc:SiteValidationSitemap runat="server" ID=…
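The AppSetting check and automated login described above could look like this. The key name StandardsReview is from the post; the helper class and the account name are my own illustrations:

```csharp
using System;
using System.Configuration;
using System.Web;
using System.Web.Security;

public static class StandardsReview
{
    // One AppSetting flags the whole site as being in review mode.
    public static bool Enabled
    {
        get
        {
            return string.Equals(
                ConfigurationManager.AppSettings["StandardsReview"],
                "true", StringComparison.OrdinalIgnoreCase);
        }
    }

    // Called from the master page: silently signs the validator in with
    // a safe, limited test account so protected pages can be walked.
    public static void EnsureLoggedIn(HttpContext context)
    {
        if (Enabled &&
            (context.User == null || !context.User.Identity.IsAuthenticated))
        {
            FormsAuthentication.SetAuthCookie("validation-test-account", false);
        }
    }
}
```

The same Enabled flag can guard any side-effecting code you do not want a crawling validator to trigger.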