Showing posts from March, 2010

Configuring any SQL Server – the forgotten homework…

Often developers just toss SQL Server onto any handy drive and forget about it. Later, when they are asked to provide guidance on installing SQL Server, they often provide nothing, or slip into pro-forma “mom’s apple pie” statements. Today I decided to do a little homework and downloaded the SQLIO tool from Microsoft to get some numbers on my drives. I have two RAID-0 logical drives, one RAID-1 logical drive, one SSD drive and an external SATA drive. In general, SATA drives are developer grade and definitely not production grade – production grade means SCSI drives. Running just one test on the drives (random-access writes), I got the following interesting table. I opted for random writes because they tend to bring out the nasty.

Drive               Latency Avg   Latency Max   IOs/Sec   MBs/Sec
SSD (Intel)         32            414           966       121
Mirror @ 5200 RPM   207           515           153       …

Sanity for equipping employees

I work as an independent consultant, and one of the reasons I like doing so is that my productivity is not inhibited by poor equipment! I have worked as a consultant for major firms, as well as an employee for folks like Microsoft, and have constantly been dumbfounded by the triumph of accountants over employee productivity, i.e. capital cost versus fixed cost. The profit-center aspect of employees seems to disappear from the equations. The typical developer today earns between $60-100K; adding in benefits, we are often looking at $100-200K. Giving an employee a $2-3K budget a year for hardware and software upgrades, spent as they think would best serve their tasks (subject to a reasonableness review – ideally by an external source), seems so simple and productive that I can only assume some folks never took a business management course… Today you can get a workstation with built-in RAID on the motherboard, 12 GB of DDR3 RAM, four true cores (with hyperthreading), a four-terabyte drive…

It is not

A quick way to programmatically post to Twitter is to create a URL that posts directly to their web site. This little tidbit you can find on several other blog posts out there (link this one). Here is the C# code I am using to launch the default browser from a WinForms application:

    private void twitterToolStripButton_Click(object sender, EventArgs e)
    {
        Cursor.Current = Cursors.WaitCursor;
        Uri twitterUri = new Uri(String.Format("{0}",
            HttpUtility.UrlEncode(((INotification)_current).Tweet)));
        try
        {
            System.Diagnostics.Process.Start(twitterUri.ToString());
        }
        finally
        {
            Cursor.Current = Cursors.Default;
        }
    }

This works nicely even when the Tweet has hash marks (#) etc. – because of the HttpUtility.UrlEncode. However, in the original code I was going to use … instead of …, which is a problem – since the subnet of www redirects to the …
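A small standalone illustration of why the HttpUtility.UrlEncode call matters (a sketch of mine, not the post's original code): a raw '#' in a URL begins the fragment and never reaches the server, so an unencoded hashtag would be silently dropped.

```csharp
using System;
using System.Web; // requires a reference to System.Web.dll

class UrlEncodeDemo
{
    static void Main()
    {
        // UrlEncode turns '#' into %23 (and spaces into '+'),
        // so the whole tweet text survives the round trip.
        string tweet = "Testing #SQLServer perf";
        string encoded = HttpUtility.UrlEncode(tweet);
        Console.WriteLine(encoded); // Testing+%23SQLServer+perf
    }
}
```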

A Summary of why you should aggressively phase out SQL IMAGE, TEXT and NTEXT

In my earlier post, I demonstrated the poor SQL performance of blobs (up to 4x worse). These are thankfully deprecated as of SQL Server 2005 [1] and may soon no longer exist. By blobs I mean the following data types:

Text
nText
Image

So in theory no one should have been using these for new development since 2005 – well, people have this horrible habit of not reading documentation – which keeps me very gainfully employed! Some additional motivation to purge code of these is simple: they require much more complex coding to access – for example, reading an image from a column returned in myReader (a SqlDataReader) [2]:

    // Create a file to hold the output.
    fs = new FileStream("employee" + emp_id + ".bmp", FileMode.OpenOrCreate, FileAccess.Write);
    bw = new BinaryWriter(fs);
    startIndex = 0;
    retval = myReader.GetBytes(1, startIndex, outbyte, 0, bufferSize);
    while (retval == bufferSize)
    {
        bw.Write(outbyte);
        bw.Flush();
        …
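The excerpt above is cut off mid-loop; a complete version of this chunked-read pattern looks roughly like the following (a sketch assuming column ordinal 0 is the employee id, ordinal 1 is the image, and the reader was opened with CommandBehavior.SequentialAccess):

```csharp
using System.Data.SqlClient;
using System.IO;

class BlobReader
{
    // Streams the image column (ordinal 1) of each row to a .bmp file,
    // one buffer-sized chunk at a time.
    static void SaveEmployeePhotos(SqlDataReader myReader)
    {
        const int bufferSize = 100; // chunk size; pick what suits your rows
        byte[] outbyte = new byte[bufferSize];

        while (myReader.Read())
        {
            int emp_id = myReader.GetInt32(0);
            using (var fs = new FileStream("employee" + emp_id + ".bmp",
                                           FileMode.OpenOrCreate, FileAccess.Write))
            using (var bw = new BinaryWriter(fs))
            {
                long startIndex = 0;
                long retval = myReader.GetBytes(1, startIndex, outbyte, 0, bufferSize);
                while (retval == bufferSize)
                {
                    bw.Write(outbyte);
                    startIndex += bufferSize;
                    retval = myReader.GetBytes(1, startIndex, outbyte, 0, bufferSize);
                }
                // Write whatever remains in the final, partial chunk.
                bw.Write(outbyte, 0, (int)retval);
                bw.Flush();
            }
        }
    }
}
```

Compare that with a varchar(max)/varbinary(max) column, which you can simply read with GetValue or stream the same way, without the deprecated types' restrictions.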

Code-less passing multiple selected items to a DataSource

Sometimes on an ASPX page you need to use an item like a multiple-select ListBox or a CheckBoxList, but if this is to be a parameter to a DataSource, you find that there is no suitable property to pass as a parameter! My solution is simple: just extend the control and add a new property that extracts the multiple values into an XML string. Why an XML string? Simple: because it is robust for whatever happens to be in the Value of each item. So we have this piece of simple code (using XmlTextWriter for performance):

    namespace Ken.Web.UI.WebControls
    {
        using System.Xml;
        using System.IO;

        public class CheckBoxList : System.Web.UI.WebControls.CheckBoxList
        {
            public string CheckedValuesXml
            {
                get
                {
                    using (var swriter = new StringWriter())
                    {
                        using (var xwriter = new XmlTextWriter(swriter) { Formatting = Formatting.Indented })
                        {
                            xwriter.WriteStartDocument();
                            xwriter.WriteStartElement(ID);
                            …
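Since the excerpt is truncated, here is one way the rest of that property might look (a sketch of mine, not the post's exact code; it assumes each checked item's Value is written as a <Value> element under a root named after the control's ID):

```csharp
namespace Ken.Web.UI.WebControls
{
    using System.IO;
    using System.Web.UI.WebControls;
    using System.Xml;

    public class XmlCheckBoxList : CheckBoxList
    {
        // Serializes the Value of every checked item into a small XML document,
        // suitable for passing as a single DataSource parameter.
        public string CheckedValuesXml
        {
            get
            {
                using (var swriter = new StringWriter())
                using (var xwriter = new XmlTextWriter(swriter) { Formatting = Formatting.Indented })
                {
                    xwriter.WriteStartDocument();
                    xwriter.WriteStartElement(ID);
                    foreach (ListItem item in Items)
                    {
                        if (item.Selected)
                        {
                            xwriter.WriteElementString("Value", item.Value);
                        }
                    }
                    xwriter.WriteEndElement();
                    xwriter.WriteEndDocument();
                    xwriter.Flush();
                    return swriter.ToString();
                }
            }
        }
    }
}
```

On the SQL side, the resulting string can be shredded with the nodes() method of the xml data type.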

Code samples for Advanced C# Programming by Paul Kimmel

This week I wanted to look at some code samples for PerfMon in C#. I found a lot of pages referencing a PerfMon.sln, including on MSDN – but no one could find the source code. The above book ( ) had an entire chapter on it and was available as an eBook. After a few emails back and forth with McGraw-Hill Education, they finally sent me a downloadable location for the source: … I suspect other folks would appreciate getting access to the samples! The purpose of getting the code samples was to build a check-box tree to select the counters that I wanted to include in SQL Server Management Data Warehouse (aka SQL Server Performance Studio), so that it’s click the counter, click a button, and you are logging those counters in your MDW database.

Before you ship code – make sure there are no leaks etc

Whether you are coding in C# or T-SQL, you can often have a variety of challenges in your code that never expose themselves on a development box. Good practice is to create unit tests that run forever, exercising all code paths, and to see what happens. When you do this, you want to open PerfMon and watch the following items carefully (there are other items that are performance related – these are items indicating common code problems).

Object      Performance Counter               Issue
Memory      Free System Page Table Entries    > 5000: Memory Leak
Memory      Pool Non-Paged Bytes              > 175 MB: Memory Leak
Memory      Pool Paged Bytes                  > 250 MB: Memory Leak
Memory      Pages Per Second                  > 1000: Memory Leak
Processor   % User Time                       > 80%: Code refactor may be needed
Process     Handle Count                      > 10000: Handle Leak
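The same counters can also be sampled from code, so a long-running test harness can log them itself. A minimal sketch using System.Diagnostics.PerformanceCounter (category and counter names as they appear in PerfMon; Windows only):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class LeakWatch
{
    static void Main()
    {
        // "Free System Page Table Entries" lives under the Memory object.
        using (var ptes = new PerformanceCounter("Memory", "Free System Page Table Entries"))
        using (var nonPaged = new PerformanceCounter("Memory", "Pool Nonpaged Bytes"))
        {
            // The first NextValue() call primes a counter; sample again after a pause.
            ptes.NextValue();
            nonPaged.NextValue();
            Thread.Sleep(1000);
            Console.WriteLine("Free System PTEs: {0:N0}", ptes.NextValue());
            Console.WriteLine("Pool Nonpaged:    {0:N0} bytes", nonPaged.NextValue());
        }
    }
}
```

Run this periodically during the endless unit-test loop and alert when a reading crosses the thresholds in the table above.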

A very inconvenient bug for Array.ForEach

Today I was trying to debug a user control and kept hitting an error thrown on this line in the debugger:

    Controls.Add(cell = new HtmlGenericControl("span"));

To make it more confusing, the message was:

    Exception Details: System.ArgumentNullException: Value cannot be null.
    Parameter name: array

So now I had double confusion – a statement that should never throw an error, complaining about an ARRAY! Finally, I got frustrated and just let the page error out, and noticed that the error shown on the page was at a different location:

    Line 36: Array.ForEach(GetEmailAddresses(), item =>
    Line 37: {
    Line 38:     sb.Append("<br/>");

In other words, the interactive debugger was pointing to the wrong line! I am very interested to see if this problem will occur in the RTM of VS2010. Be warned if you are using Array.ForEach!
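The "Parameter name: array" message is itself a clue: Array.ForEach throws exactly that ArgumentNullException when the array passed to it is null. So a GetEmailAddresses() that returns null instead of an empty array would produce this error. A standalone repro (GetEmailAddresses here is my stand-in for the post's method):

```csharp
using System;

class ForEachRepro
{
    // Stand-in for the post's method: returns null instead of an empty array.
    static string[] GetEmailAddresses() { return null; }

    static void Main()
    {
        try
        {
            Array.ForEach(GetEmailAddresses(), item => Console.WriteLine(item));
        }
        catch (ArgumentNullException ex)
        {
            Console.WriteLine(ex.ParamName); // array
        }
    }
}
```

Returning an empty array (Array.Empty-style) from such helpers avoids the exception entirely.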

Checking Bandwidth to SQL Server

Once in a while the bandwidth to a remote SQL Server accounts for poor performance. In some odd cases, the problem can be tracked down to someone accidentally dropping a 10BaseT switch or hub into a route that is otherwise 1000BaseT. This can be very hard to detect, because your machine reports the speed of the first link to the internet and not the speed across the entire route. There is an easy way to check if you have a big table on your SQL Server. Open SSMS, connect to the database, and press Shift-Alt-S (Query / Include Client Statistics). Execute a SELECT * FROM {tablename} and look at the Client Statistics tab. Divide bytes received from server by client processing time to get a rough estimate. Two example runs: one resulted in 136, the other in 11207. The first case was hitting a SQL Server over a 3G link. The second case was hitting a database on the local network. A nice simple way of checking bandwidth end-to-end.
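As a sanity check on the arithmetic (the units are my assumption: Client Statistics reports bytes received and a processing time in milliseconds, so the quotient is bytes per millisecond, roughly KB per second):

```csharp
using System;

class BandwidthEstimate
{
    // Rough end-to-end throughput from SSMS Client Statistics numbers.
    // Assumes bytes received divided by client processing time in ms,
    // which yields bytes/ms (approximately KB/s).
    public static double BytesPerMs(long bytesReceived, long processingTimeMs)
    {
        return (double)bytesReceived / processingTimeMs;
    }

    static void Main()
    {
        // Hypothetical figures in the same ballpark as the post's two runs.
        Console.WriteLine(BytesPerMs(1360000, 10000)); // 136   (3G link)
        Console.WriteLine(BytesPerMs(11207000, 1000)); // 11207 (local network)
    }
}
```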

Why you want to run Server 2008 R2 and not Server 2003 if you have a newer Dev Box.

Windows Server 2003 is licensed by core: Enterprise Edition supports 8 SMP cores. With new CPUs supporting 6 cores with hyperthreading, you end up with 12 logical cores – 4 of which will not be used by the OS! This can be a bummer if you are running SQL Server Enterprise Edition, because it will use only the cores supported by the OS (i.e. 8, not the 12 that you have)… oops! On the other hand, with Server 2008 R2 the licensing is by x64 socket. So a single CPU with 12 cores will use all of them with any edition (including Web and Standard). In short, you will get more throughput with Server 2008 R2 Standard Edition than you would with Server 2003 Enterprise Edition. If you go for a fancy box, you could be running 48 cores on 4 x64 sockets with Server 2008 R2 Standard Edition (something that required the Data Center Edition with Server 2003).

A silly SQL Performance test

A question came up on the relative performance of the old blobs (Image, Text, NText) and the newer nvarchar(max) and varbinary(max). It seemed a moot question, since the blobs are deprecated in SQL Server 2008 and may not see the light of 2011. Still, relative performance was an interesting problem to try to assess natively in SQL Server, and I like interesting questions. Some Bingoogling found nothing on the web evaluating actual performance between the two – so an experiment was in order. First I created a simple table:

    CREATE TABLE [dbo].[PerfTest](
        [Id] [int] IDENTITY(1,1) NOT NULL,
        [FieldA] [nvarchar](max) NULL,
        [FieldB] [ntext] NULL,
        [FieldC] [image] NULL,
        [FieldD] [text] NULL,
        [FieldE] [varchar](max) NULL,
        [FieldF] [varbinary](max) NULL)

Then I grabbed a 500K-row table and inserted data into FieldE. Once inserted, I updated all of the other fields from it – so a…

SSD Drives and SQL Server Temp DB

I’m in the midst of spec’ing out a new server, and I know that it will have an SSD drive in it on which the operating system and key applications will sit. The idea came up that this is also where I should put my TempDB. Lightning performance, right? After all, the literature reports:

    with striping and short stroking of 10 high-end 15,000rpm hard disk drives performance gains of more than 16× are attainable, resulting in more than 3,000 random reads per second. [*]

I am coming to the conclusion that this could be a dumb idea if implemented blindly. There are two main types of SSD drives:

single-level cell (SLC) NAND memory – typical enterprise.
multi-level cell (MLC) NAND flash memory – typical consumer; cheaper, slower, with as much as 10 times fewer read/writes over its life span.

So I looked at the specification of one drive, the Super Talent MasterDrive PX Series 64GB 2.5in SATA II Solid State Drive (SLC NAND flash), and read: Read: Unlimited…

Best organization for a one man consulting shop…

This question came up twice this last week, so I thought I would dump my logic and see what people may correct me on… First, the difference between an LLC and a proprietorship is simple – with an LLC you are not personally liable, so an LLC is definitely preferred. For an S-Corp versus a C-Corp, the difference is that with a C-Corp you may have double taxation. Bottom line: an LLC (which may be taxed as a proprietorship) or an S-Corp. For a short summary see here. The argument is often to go with an S-Corporation because you can take the money out as distributions and avoid employment taxes. This is true if the reasonable salary for your skill set is below the magic ceiling of $106,800. If you are a software consultant – forget it – the first time the IRS does an audit, your low "reasonable salary" will be laughed at and you will be hit with back taxes, penalties and interest! Above $106,800 there is no more Social Security tax; the rest is subject to a 2.9% Medicare tax. So there is…

Making a full data backup of SQL Server in XML via .Net

I have often worked with clients whose data is on a hosted SQL Server database. They have often encountered problems getting a copy of the database onto their local machine, or have experienced pain when the hosting service failed. They can script the database fine, but often scripting the data will not work. The code below does a full database TABLE backup into a collection of XML files. The logic is simple:

Get all of the tables in the database (aka CATALOG).
Do a SELECT * on each table, specifying that the data should be formatted as XML.
  This will write BINARY and IMAGE data correctly.
  This will write XML data correctly.
Read the XML (which many sites have bad code for).
Write it to a folder with today’s date.

Nice and simple! Just code your configuration file and away you go!

    using System;
    using System.Configuration;
    using System.IO;
    using System.Collections.Generic;
    using System.Xml;
    using System.Text;
    using System.Data;
    …
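The steps above can be sketched as follows (a simplified standalone version of mine, not the post's full program; the connection string, FOR XML options, and file-naming scheme are all assumptions):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Xml;

class XmlBackup
{
    static void Main()
    {
        // Assumption: in the real program this comes from the configuration file.
        string connString = "Server=.;Database=MyDb;Integrated Security=true";
        string folder = Path.Combine(".", DateTime.Today.ToString("yyyy-MM-dd"));
        Directory.CreateDirectory(folder);

        using (var conn = new SqlConnection(connString))
        {
            conn.Open();
            // Step 1: all user tables in the catalog.
            DataTable tables = conn.GetSchema("Tables",
                new[] { null, null, null, "BASE TABLE" });

            foreach (DataRow row in tables.Rows)
            {
                string schema = (string)row["TABLE_SCHEMA"];
                string name = (string)row["TABLE_NAME"];

                // Step 2: SELECT * formatted as XML; BINARY BASE64 makes
                // binary/image columns round-trip correctly.
                var cmd = new SqlCommand(String.Format(
                    "SELECT * FROM [{0}].[{1}] FOR XML AUTO, ELEMENTS, BINARY BASE64, ROOT('{1}')",
                    schema, name), conn);

                // Steps 3-4: stream the XML out to today's folder.
                using (XmlReader reader = cmd.ExecuteXmlReader())
                using (XmlWriter writer = XmlWriter.Create(
                    Path.Combine(folder, schema + "." + name + ".xml")))
                {
                    reader.MoveToContent();
                    writer.WriteNode(reader, true);
                }
            }
        }
    }
}
```

ExecuteXmlReader is the piece "many sites have bad code for": it streams the FOR XML result instead of concatenating the chunked rows by hand.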

A Deception building controls off WebControl!

When you build a web control inheriting from WebControl, you will see AccessKey as a property and assume that if you assign to it, it will appear in the HTML produced. WRONG. Not only does it not appear, but if you naively do something like

    this.Attributes.Add("accesskey", AccessKey);

you will see it in the HTML, but usually it will NOT work… The first problem is that accesskey is valid only on a few HTML elements according to the W3C. They are:

A
Area
Button
Input
Label
Legend
TextArea

Table elements, divisions and spans may NOT use accesskey. So what happens when you want the access key to jump to a pure data <table>? The answer is simple: add an <a name="xyz" accesskey="V" /> – none of the other elements are appropriate. This also answers the question of how to give an access key to a <span>, <div>, <ul>, <ol>, etc. So if you are writing a web control based on WebControl…
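In a WebControl-derived control, one workable pattern for the named-anchor trick is to render the anchor just before the element itself. A sketch of mine (assuming the control renders a plain <table>; the anchor name is hypothetical):

```csharp
using System.Web.UI;
using System.Web.UI.WebControls;

public class AccessKeyTable : WebControl
{
    // accesskey is not valid on <table>, so render an empty named anchor
    // immediately before the table and hang the access key on that.
    protected override void Render(HtmlTextWriter writer)
    {
        if (!string.IsNullOrEmpty(AccessKey))
        {
            writer.AddAttribute(HtmlTextWriterAttribute.Name, ClientID + "_anchor");
            writer.AddAttribute(HtmlTextWriterAttribute.Accesskey, AccessKey);
            writer.RenderBeginTag(HtmlTextWriterTag.A);
            writer.RenderEndTag();
        }

        writer.RenderBeginTag(HtmlTextWriterTag.Table);
        // ... table rows would be rendered here ...
        writer.RenderEndTag();
    }
}
```

Pressing the access key then moves focus to the anchor, which sits right at the table.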

A robust set of HTML Access Key assignments

Access keys are used to ensure easier navigation of a web page by the blind and others. They are required for Section 508 accessibility. One of the problems is that different browsers consume different keys, or do not support certain keys. Additionally, some accessibility add-ins also consume some of the keys. In short, it’s a minefield! The following is a recommendation on how to walk this minefield, following government standards and the quirks of various browsers and add-ins. If there are any problems, please email or post. NOTE: Some items are duplicated – this is not avoidable in this minefield. There are only five open slots unless you get into the funky symbols.

Key   Conflict in Usage
A     IE
B     NS7, O7
C     …