Wednesday, June 17, 2015

Azure VM Licensing Error

When you log in to a Windows Azure VM, you may receive a popup saying that you need to configure a licensing server, and that you only have X days of access left if you don't.

The fix is to sign in to the Azure portal, download the RDP file for that virtual machine, and access the VM only through that RDP file.

That RDP file retains access to the server even after the licensing grace period ends.


Umbraco CMS Refresh

Umbraco is an ASP.NET-based content management system with many moving pieces and configuration elements.

Recycle/Refresh
While working on the Merchello package, I've learned a few ways to cycle Umbraco when something isn't working right (a quick way to script step 1 is shown after the list).

  1. Touch Web.Config
  2. Reindex Examine Indexes via Back Office
  3. Delete files in \App_Data\Temp
  4. Republish content node or entire content tree. 
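
For step 1, "touching" Web.config just means updating its timestamp so ASP.NET recycles the app domain. A minimal C# sketch, assuming a placeholder site path that you would replace with your own Umbraco root:

    using System;
    using System.IO;

    class TouchWebConfig
    {
        static void Main()
        {
            // Placeholder path - point this at your Umbraco site root
            var webConfig = Path.Combine(@"C:\inetpub\wwwroot\MyUmbracoSite", "Web.config");

            // Updating the last-write time is enough for ASP.NET to recycle the app domain
            File.SetLastWriteTimeUtc(webConfig, DateTime.UtcNow);
            Console.WriteLine("Touched " + webConfig);
        }
    }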

Sunday, January 26, 2014

Parsing Word Document XML via MsXml in VBA

An old friend pinged me: he was having some success loading Word documents saved as XML in his VBA routines in Word, but he was having massive problems getting XPath to work. I resolved the basic issue and then asked him to give me his hardest problem so I could show some example code. The last time I did this was a few years ago; MSXML was (and is) a bit lame compared to later XML implementations, so his troubles were entirely reasonable.

 

Declaring namespaces

You can’t use XPath on a document with namespaces without declaring those namespaces. The process is pretty simple: just convert the namespace declarations at the top of the XML document into a VB string. There happen to be many of them in a Word XML document.

 

Dim domPrefix As String
domPrefix = "xmlns:wpc='http://schemas.microsoft.com/office/word/2010/wordprocessingCanvas' "
domPrefix = domPrefix + "xmlns:mc='http://schemas.openxmlformats.org/markup-compatibility/2006' "
domPrefix = domPrefix + "xmlns:o='urn:schemas-microsoft-com:office:office' "
domPrefix = domPrefix + "xmlns:r='http://schemas.openxmlformats.org/officeDocument/2006/relationships' "
domPrefix = domPrefix + "xmlns:m='http://schemas.openxmlformats.org/officeDocument/2006/math' "
domPrefix = domPrefix + "xmlns:v='urn:schemas-microsoft-com:vml' "
domPrefix = domPrefix + "xmlns:wp14='http://schemas.microsoft.com/office/word/2010/wordprocessingDrawing' "
domPrefix = domPrefix + "xmlns:wp='http://schemas.openxmlformats.org/drawingml/2006/wordprocessingDrawing' "
domPrefix = domPrefix + "xmlns:w10='urn:schemas-microsoft-com:office:word' "
domPrefix = domPrefix + "xmlns:w='http://schemas.openxmlformats.org/wordprocessingml/2006/main' "
domPrefix = domPrefix + "xmlns:w14='http://schemas.microsoft.com/office/word/2010/wordml' "
domPrefix = domPrefix + "xmlns:w15='http://schemas.microsoft.com/office/word/2012/wordml' "
domPrefix = domPrefix + "xmlns:wpg='http://schemas.microsoft.com/office/word/2010/wordprocessingGroup' "
domPrefix = domPrefix + "xmlns:wpi='http://schemas.microsoft.com/office/word/2010/wordprocessingInk' "
domPrefix = domPrefix + "xmlns:wne='http://schemas.microsoft.com/office/word/2006/wordml' "
domPrefix = domPrefix + "xmlns:wps='http://schemas.microsoft.com/office/word/2010/wordprocessingShape'"
The next item is just loading the XML file; this was his existing code and it worked fine:
    Set xmlDoc = New MSXML2.DOMDocument60
    xmlDoc.validateOnParse = False
    xmlDoc.async = False
    If Not xmlDoc.Load(sFile) Then  'sFile is the path to the Word XML file
        sErr = xmlDoc.parseError.ErrorCode & "  " & xmlDoc.parseError.reason
        GoTo errXML
    End If
The next step was setting two properties on the xmlDoc (unlike later implementations, no XmlNamespaceManager is needed). The key one uses the domPrefix string from above.
    xmlDoc.setProperty "SelectionNamespaces", domPrefix
    xmlDoc.setProperty "SelectionLanguage", "XPath"

So if he wanted to find all of the tables, it was as simple as:

Dim foundNodes As MSXML2.IXMLDOMNodeList  
Set foundNodes = xmlDoc.DocumentElement.SelectNodes("//w:tbl")

His tough problem

He wanted to find all the images in the document and obtain their dimensions, the associated file name, plus some related details. The code was pretty clean to write with XPath, as shown below:

    ' Find the actual drawing nodes
    Dim node As MSXML2.IXMLDOMNode
    Dim node2 As MSXML2.IXMLDOMNode
    Dim node3 As MSXML2.IXMLDOMNode
    Dim iNode2 As Long
    Set foundNodes = xmlDoc.DocumentElement.SelectNodes("//w:drawing")
    Debug.Print foundNodes.Length & " Drawings found"
    iNode2 = 1
    For Each node In foundNodes
        Debug.Print "Image #" & iNode2
        ' We search from this node
        Set node2 = node.SelectSingleNode(".//wp:extent")
        Debug.Print node2.Attributes(0).BaseName & "=" & node2.Attributes(0).NodeValue
        Debug.Print node2.Attributes(1).BaseName & "=" & node2.Attributes(1).NodeValue
        Set node2 = node.SelectSingleNode(".//wp:docPr")
        Debug.Print node2.Attributes(2).BaseName & "=" & node2.Attributes(2).NodeValue
        ' Directly finding the parent p node does not appear to be supported, so...
        Set node3 = node.ParentNode
        Do While node3.BaseName <> "p"
            Set node3 = node3.ParentNode
        Loop
        Set node2 = node3.SelectSingleNode(".//w:pStyle")
        Debug.Print node2.Attributes(0).BaseName & "=" & node2.Attributes(0).NodeValue
        iNode2 = iNode2 + 1
    Next node
There were a few hiccups, such as finding that the ancestor:: and descendant:: axes did not appear to work, but there were easy workarounds for that.
Running it against a sample document prints each drawing's extent (cx/cy), name, and paragraph style to the Immediate window.
Friend helped, some memories refreshed, and a blog post that will hopefully help someone else!

Saturday, January 18, 2014

Merging two AspNet Profiles – more complex than one would think!

During a recent code revision, a developer accidentally removed the applicationName attribute from the SqlProfileProvider element in Web.Config. The consequence became apparent when support started to report that customers' saved report definitions (which were stored in that profile) had disappeared.

   <add name="ErrorProfileProvider" connectionStringName="LocalSQLServer" type="System.Web.Profile.SqlProfileProvider" />
The problem was identified, but there was a second problem: a significant number of new customers had since saved report definitions to this erroneous profile provider. Simply switching back by restoring the applicationName would make those report definitions disappear, which customer support did not want to inflict on new customers – a rather bad experience.
My initial take was “Oh, I have used multiple membership providers in one web application in the past, so we just need to use the same pattern…” 
  • Oops that pattern did not work, Grrr.
  • Searching StackOverflow and other sites did not turn up any “canned solutions”.
  • We looked at merging the data in SQL Server; it could be done, but the T-SQL was complicated and would require major setup in QA to test properly. Not a good scenario for an agile solution.
What I ended up doing was my original pattern, but I had to go to lower-level coding to do it (i.e. using the inherited provider classes' features). 
First, I indeed put multiple profiles in web.config as was my original plan:
<profile inherits="AdminConsole.CustomProfile" defaultProvider="GoodProfileProvider" enabled="true">
  <providers>
    <clear />
    <add name="GoodProfileProvider" applicationName="/Admin Console" connectionStringName="LocalSQLServer" type="System.Web.Profile.SqlProfileProvider" />
    <add name="ErrorProfileProvider" applicationName="/" connectionStringName="LocalSQLServer" type="System.Web.Profile.SqlProfileProvider" />
  </providers>
</profile>
The next step was to write a method (ProfileMerge) that would be called when users logged in; I added it as a static method in my profile class. The code is below; the basic steps were:
  1. Check whether the customer has two profiles; if not (only the correct one), then we are done.
    1. Otherwise move one into the other “in bulk”.
  2. Next we need to check when the last update happened; the old profile had been offline, so:
    1. Older than the error one – we need to update.
    2. Newer – we have already updated (no need to create an “updated” field to track this).
  3. We used the inherited provider classes to call GetPropertyValues so we could process the information as collections.
  4. We walked the ErrorData (the recently created saved report definitions) to find any recent definitions that needed to be merged into the old profile data. When we found one, we called a merging routine (specific to how the reports were being saved).
  5. Once all of this was done, we set the results into the good profile and returned. The profile was already being saved in the calling code, so there was not a need to do an explicit save.
   public class CustomProfile : ProfileBase
    {
        public static void ProfileMerge(string userName, bool isForced)
        {
            try
            {
                if (String.IsNullOrWhiteSpace(userName))
                    return;
                int total = 0;
                var profile = CustomProfile.Create(userName);
                if (profile==null || profile.IsAnonymous) return;
                var errorPM = (SqlProfileProvider)System.Web.Profile.ProfileManager.Providers["ErrorProfileProvider"];
                if (errorPM == null) return;
                var errorDate = DateTime.MaxValue;
                foreach (ProfileInfo item in errorPM.FindProfilesByUserName(ProfileAuthenticationOption.Authenticated, userName, 0, 1, out total))
                {
                    errorDate = item.LastUpdatedDate;
                }
                var goodPM = (SqlProfileProvider)System.Web.Profile.ProfileManager.Providers["GoodProfileProvider"];
                if (goodPM == null)
                {
                    //Should never occur
                    AdminConsole.Diagnostics.SplunkLogger.Fatal("Profile not found for " + userName);
                    return;
                }
                // If date is in the future, there is no data to merge
                if (errorDate > DateTime.Now)
                    return;
                // Already done the update BUT allow manual repeat if needed.
                if (profile.LastUpdatedDate > errorDate && ! isForced) return;
                var errorData = errorPM.GetPropertyValues(profile.Context, CustomProfile.Properties);
                var goodData = goodPM.GetPropertyValues(profile.Context, CustomProfile.Properties);
                foreach (SettingsPropertyValue item in errorData)
                {
                    if(item.PropertyValue !=null && ! String.IsNullOrWhiteSpace( item.PropertyValue.ToString()))
                    switch (item.Name)
                    {
                        case "UserSavedFilters_Consumer":
                        case "UserSavedFilters_TransactionUsage":
                        case "UserSavedFilters":
                            goodData[item.Name].PropertyValue= MergeStrings(item.Name,goodData[item.Name].PropertyValue.ToString(),item.PropertyValue.ToString());
                            break;
                    }
                }
                goodPM.SetPropertyValues(profile.Context, goodData);
            }
            catch (Exception ex)
            {
                AdminConsole.Diagnostics.SplunkLogger.Error(ex);
            }
        }
        private static string MergeStrings(string name,string oldData, string newData)
        {
            var logEntry = new Dictionary<string, string>();
            logEntry.Add("AP_name", name);
            logEntry.Add("AP_old", oldData);
            logEntry.Add("AP_current", newData);
            char[] spliter = { '$' };
            if (String.IsNullOrWhiteSpace(oldData)) return newData;
            if (String.IsNullOrWhiteSpace(newData)) return oldData;
            var oldCollection = oldData.Split(spliter, StringSplitOptions.RemoveEmptyEntries);
            var newCollection = newData.Split(spliter, StringSplitOptions.RemoveEmptyEntries);
            var merge = new HashSet<String>();
            foreach (string item in oldCollection)
            {
                if (!merge.Contains(item))
                    merge.Add(item);
            }
            foreach (string item in newCollection)
            {
                if (!merge.Contains(item))
                    merge.Add(item);
            }
            var result = new StringBuilder();
            foreach (string item in merge)
            {
                result.Append(item);
                result.Append("$");
            }
            logEntry.Add("AP_result", result.ToString());
            AdminConsole.Diagnostics.SplunkLogger.Debug(logEntry);
            return result.ToString();
        }
    }
You may have noted that I included the ability to ignore dates and force a merge. This provides a backdoor mechanism for support if the unexpected happens: they can enter the userid on a dark page (visible only to them) and force the update to happen again, which logs the results of the merge so we can see why our code (and QA) failed.
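For reference, the dark page's code-behind only needs to call the static method with the force flag set. A rough sketch, with placeholder control names:

    protected void btnForceMerge_Click(object sender, EventArgs e)
    {
        // txtUserName and lblResult are assumed controls on the support-only dark page
        var userName = txtUserName.Text.Trim();

        // isForced = true bypasses the LastUpdatedDate check inside ProfileMerge
        CustomProfile.ProfileMerge(userName, true);
        lblResult.Text = "Merge re-run for " + userName + "; see the Splunk log for the merge details.";
    }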
The good news is that the above has rolled into production and our QA person reports that customer service is happy because everything was returned as expected. We did not need to use our dark page.

Tuesday, December 31, 2013

Converting SalesForce SOQL results to a Linked Dataset for use by C#

In my last post, I illustrated the 'simple' way of getting the data back: just creating an XmlDocument from the results. This is not always efficient; in fact, I have gotten out-of-memory errors on some queries.

In this post I will take the same query and show a little sleight of code. The query was:

SELECT Name, 
  (SELECT Name, Email FROM Contacts), 
  (SELECT CaseNumber, CreatedDate, Subject, Status, Resolution__c, ClosedDate, SuppliedEmail, SuppliedName FROM Cases
   WHERE CreatedDate = LAST_N_DAYS:366 
   AND Origin != 'Internal' 
   AND OwnerId != '00G40000001RTyyEAG' 
   AND RecordTypeId IN ('01240000000UTWP', '01240000000UWxM', '01240000000UWxl', '01240000000UWxb')
  ) 
 FROM Account 
 WHERE ID IN (SELECT AccountId FROM Contact WHERE Email IN ({0}))

Extension Method to Handle SOQL Result Columns

The code below takes the results from the query and creates a DataTable from one column. This is our fundamental building block suitable for all SalesForce queries.

You could call this in a nested manner if you have nested objects in your query -- but I assume just one layer deep.

    /// <summary>
    /// Converts a Salesforce-style XmlElement into a DataTable
    /// </summary>
    /// <param name="element">A "Column" returned from Salesforce</param>
    /// <returns>DataTable</returns>
    public static DataTable SalesForceToDataTable(this XmlElement element)
    {
        var result = new DataTable(element.LocalName);
            if (element.ChildNodes.Count == 1)
            {
                if (!result.Columns.Contains(element.LocalName))
                {
                    result.Columns.Add(element.LocalName);
                }
                DataRow row = result.NewRow();
                row[element.LocalName] = element.InnerText;
                result.Rows.Add(row);
            }
            else
                foreach (XmlNode child in element.ChildNodes)
                {
                    switch (child.LocalName)
                    {
                        case "size":
                        case "queryLocator":
                        case "done": break;
                        case "records":
                            DataRow row = result.NewRow();

                            foreach (XmlElement field in child.ChildNodes)
                            {
                                var colName = element.LocalName + "_" + field.LocalName;

                                if (!result.Columns.Contains(colName))
                                {
                                    result.Columns.Add(colName);
                                }
                                row[colName] = field.InnerText;
                            }
                            result.Rows.Add(row);
                            break;
                    }
                }
        return result;
    }

Building and linking a DataSet across columns

As we saw in the last post, each subquery's results come back inside a single column of the parent record.

The routine above handles each column. To put it all together, we use the code below.

public static DataSet QueryCases(string myquery)
{
    var result = new DataSet();
        var binding = new AdminConsole.WebSalesForce.SforceService();
        binding.SessionHeaderValue = new AdminConsole.WebSalesForce.SessionHeader();
        var salesforce = SaleforceUrl;
        if (salesforce != null)
        {
            binding.SessionHeaderValue.sessionId = salesforce.sessionId;
            binding.Url = salesforce.serverUrl;

            binding.UseDefaultCredentials = true;
            var results = binding.query(myquery);
            var done = false;
            while (!done && results.size > 0)
            {
                var masterKey = 0;
                foreach (var record in results.records)
                {
                    var tableIndex = 0;
                    foreach (XmlElement key in record.Any)
                    {
                        var newTable = key.SalesForceToDataTable();
                        newTable.Columns.Add(DataSetKey, masterKey.GetType());
                        foreach (DataRow row in newTable.Rows)
                        {
                            row[DataSetKey] = masterKey;
                        }
                        // On the first table we ADD to dataset, on subsequent we append the records
                        if (masterKey > 0)
                        {
                            foreach (DataRow row in newTable.Rows)
                            {
                                result.Tables[tableIndex].ImportRow(row);
                            }
                        }
                        else
                        {
                            result.Tables.Add(newTable);
                            // Add the relation only the first time this child table is added to the DataSet
                            if (result.Tables.Count > 1)
                            {
                                result.Relations.Add(result.Tables[0].Columns[DataSetKey], newTable.Columns[DataSetKey]);
                            }
                        }
                        tableIndex++;
                    }
                    masterKey++;
                }
                done = results.done;
                if (!done)
                {
                    // Fetch the next batch; without queryMore the loop would never advance
                    results = binding.queryMore(results.queryLocator);
                }
            }
        }
    return result;
}

Converting Linked DataSet to DataTable

What we are doing is adding a new key that ties the data in each column together and adding it as a relation on the DataSet. This is still generic code -- it works for almost any query.
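Once the relations are in place you can already navigate parent-to-child rows directly. A small sketch, assuming Cases was the second child table added (and is therefore Relations[1]):

    // linked is the DataSet returned by QueryCases
    foreach (DataRow account in linked.Tables["Name"].Rows)
    {
        // Relations[1] is the Name -> Cases relation when Cases is the second child table added
        foreach (DataRow caseRow in account.GetChildRows(linked.Relations[1]))
        {
            Console.WriteLine(account["Name"] + " : " + caseRow["Cases_CaseNumber"]);
        }
    }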

The last step is flattening this dataset into a DataTable so I can display it in a Telerik Grid. I need to match each case to its contact (which would be easy with a proper SQL join). My solution is shown below:


private DataTable FlattenDataSet(DataSet dataset)
{
        // Assumption: the flattened table is cached in Session so later grid binds can reuse it
        if (Session["SalesForce"] != null)
        {
            return (DataTable)Session["SalesForce"];
        }
        var result = new DataTable();
        try
        {
            // Create all columns
            foreach (DataTable table in dataset.Tables)
            {
                foreach (DataColumn col in table.Columns)
                {
                    if (!result.Columns.Contains(col.ColumnName))
                        result.Columns.Add(col.ColumnName);
                }
            }
            //Import the 1:1 relationships into the table
            foreach (DataRow row in dataset.Tables["Cases"].Rows)
            {
                result.ImportRow(row);
            }
            // Import the 1: of the 1:m relationship
            foreach (DataRow master in dataset.Tables["Name"].Rows)
            {
                DataRow[] rd = result.Select(DAL.daSalesForce.DataSetKey + "=" + master[DAL.daSalesForce.DataSetKey]);
                foreach (DataRow row in rd)
                {
                    foreach (DataColumn col in dataset.Tables[0].Columns)
                    {
                        row[col.ColumnName] = master[col.ColumnName];
                    }
                }
            }
            // For each contact (expected to be fewer than cases) We lookup the cases they submitted
            foreach (DataRow master in dataset.Tables["Contacts"].Rows)
            {
                DataRow[] rd = result.Select("Cases_ContactId='" + master["Contacts_Id"] + "'");
                foreach (DataRow row in rd)
                {
                    foreach (DataColumn col in dataset.Tables["Contacts"].Columns)
                    {
                        row[col.ColumnName] = master[col.ColumnName];
                    }
                }
            }
            Session["SalesForce"] = result;   // cache for reuse (assumption, matching the check above)
        }
        catch (Exception ex)
        {
            AdminConsole.Diagnostics.SplunkLogger.Error(ex);
        }
        return result;
}

Bottom Line

Salesforce SOQL results can be handled in .NET by using an extension method and moving the data into a DataSet. Once the data is there, you may need a little DataSet grunting to get it into the appropriate shape.
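
As a rough end-to-end sketch (the template constant, email values, and grid name are placeholders, not part of the code above), the pieces wire together like this:

    string emails = "'alice@example.com','bob@example.com'";        // placeholder customer emails
    string soql = String.Format(CasesSoqlTemplate, emails);         // CasesSoqlTemplate holds the SOQL shown earlier
    DataSet linked = QueryCases(soql);
    DataTable flat = FlattenDataSet(linked);

    CasesGrid.DataSource = flat;    // CasesGrid: the Telerik grid on the page
    CasesGrid.DataBind();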

Monday, December 30, 2013

Handling Salesforce SOQL Relationships (aka Joins) in C#

Salesforce SOQL is not SQL, which often causes great frustration for experienced database developers when they work with it. Being of the generation that worked with network data models and hierarchical data models - i.e. pre-relational databases (as well as databases on tape!) - I actually find a lot of "familiar tones" in SOQL.

Recently I needed to build a query that in SQL would be something like:

SELECT Account.Name AS AccountName, 
  Contact.Name, Contact.Email, 
  Case.CaseNumber, Case.CreatedDate, Case.Subject, Case.Status, Case.Resolution__c,  Case.ClosedDate, Case.SuppliedEmail,      Case.SuppliedName 
 FROM Account  JOIN Case ON ... JOIN Contact ON ....
 WHERE Contact.Email in ({0})
  AND Case.CreatedDate = LAST_N_DAYS:366 
   AND Case.Origin != 'Internal' 
   AND Case.OwnerId !=     '00G40000001RTyyEAG' 
   AND Case.RecordTypeId IN ('01240000000UTWP', '01240000000UWxM', '01240000000UWxl', '01240000000UWxb')

The equivalent query in SOQL turned out to be less complex to read (because of the intrinsic joins that exist in Salesforce). Understanding these intrinsic relationships is essential for writing queries!

SELECT Name, 
  (SELECT Name, Email FROM Contacts), 
  (SELECT CaseNumber, CreatedDate, Subject, Status, Resolution__c,  ClosedDate, SuppliedEmail,SuppliedName FROM Cases
   WHERE CreatedDate = LAST_N_DAYS:366 
   AND Origin != 'Internal' 
   AND OwnerId !=     '00G40000001RTyyEAG' 
   AND RecordTypeId IN ('01240000000UTWP', '01240000000UWxM', '01240000000UWxl', '01240000000UWxb')

  ) 
 FROM Account  
 WHERE ID IN (SELECT AccountId FROM Contact Where Email in ({0}))

The next piece is dealing with the query results. When you execute the above in the Developer Console, you will see the data returned as JSON.
This means a little magic is needed when the response is received. What we have received is effectively a dataset consisting of three linked tables. Two approaches seem obvious:
  • Explode the dataset into a single data table
  • Use the dataset as a linked table
In my case, I needed to display the results in a flat grid of cases, so I went down the first path. There's a little magic needed because one account has many contacts (1:n) and many cases (1:m). This means you need to walk the cases and look up the contact for each case record. As a result, I had to include ContactId in the Cases subquery so the data structure can be flattened by matching on the contact ID. The result should be a table with m rows.


SELECT Name, 
  (SELECT Id, Name, Email FROM Contacts), 
  (SELECT ContactId, CaseNumber, CreatedDate, Subject, Status, Resolution__c,  ClosedDate, SuppliedEmail,SuppliedName FROM Cases
   WHERE CreatedDate = LAST_N_DAYS:366 
   AND Origin != 'Internal' 
   AND OwnerId !=     '00G40000001RTyyEAG' 
   AND RecordTypeId IN ('01240000000UTWP', '01240000000UWxM', '01240000000UWxl', '01240000000UWxb')

  ) 
 FROM Account  
 WHERE ID IN (SELECT AccountId FROM Contact Where Email in ({0}))

Examining the results in C# reveals that we are dealing with XML, which we can convert easily with the usual XML facilities.
I noticed that applying LINQ to SOQL results seems to be difficult for some people, but if you work at the XML level, much of the pain disappears quickly.

The code below uses the "fat" XmlDocument (and thus should not be used in real systems) for purposes of illustration:

        public static XmlDocument QueryCases(string myquery)
        {
            XmlDocument dom = new XmlDocument();
            dom.LoadXml("");
            XmlNode recordNode;
            try
            {
                var binding = new AdminConsole.WebSalesForce.SforceService();
                binding.SessionHeaderValue = new AdminConsole.WebSalesForce.SessionHeader();
                var salesforce = SaleforceUrl;
                if (salesforce != null)
                {
                    binding.SessionHeaderValue.sessionId = salesforce.sessionId;
                    binding.Url = salesforce.serverUrl;

                    binding.UseDefaultCredentials = true;
                    var results = binding.query(myquery);
                    var done = false;
                    while (!done && results.size > 0)
                    {
                        dom.DocumentElement.AppendChild(recordNode = dom.CreateElement("record"));
                        foreach (var record in results.records)
                        {
                            foreach (XmlElement key in record.Any)
                            {
                                recordNode.AppendChild(dom.ImportNode(key, true));
                            }
                        }
                        done = results.done;
                        if (!done)
                        {
                            // Fetch the next batch; without queryMore the loop would never advance
                            results = binding.queryMore(results.queryLocator);
                        }
                    }
                }
            }
            catch (Exception ex)
            {
                AdminConsole.Diagnostics.SplunkLogger.Error(ex);
            }           
            return dom;
        }

Now you have your data in a format that you can work with easily via LINQ, etc.
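
As a small example of that last step, LINQ to XML can pull values straight out of the returned document. A minimal sketch (it needs System.Linq, System.Xml and System.Xml.Linq; CaseNumber is just one of the fields from the query):

        public static void PrintCaseNumbers(XmlDocument dom)
        {
            // dom is the XmlDocument returned by QueryCases above
            XDocument xdoc = XDocument.Parse(dom.OuterXml);

            // The sObject elements carry a Salesforce namespace, so match on the local name
            var caseNumbers = from el in xdoc.Descendants()
                              where el.Name.LocalName == "CaseNumber"
                              select el.Value;

            foreach (var number in caseNumbers)
                Console.WriteLine(number);
        }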

Wednesday, October 16, 2013

Doing Tests and Unit Tests of AspForm Web Applications: Setup

This is the beginning of a series of posts dealing with AspForm unit tests.

We have two options for running Unit Tests. Both have advantages and disadvantages.

Running Unit Tests Against your local development Box

A test would be written as shown below
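
A representative version, using the same attributes as the recommended template at the end of this post; the port, page, and assertion are placeholders:

        [TestClass]
        public class DefaultPageTests
        {
            [TestMethod]
            [HostType("ASP.NET")]
            [UrlToTest("http://localhost:666/default.aspx")]
            public void DefaultPage_Loads()
            {
                // The test body executes inside the ASP.NET worker process for the site on port 666,
                // so application code (profiles, session, business logic) is available to exercise here.
                Assert.IsTrue(true, "Placeholder assertion - real tests exercise page or business logic.");
            }
        }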



Running it produces the usual pass/fail result in the Visual Studio test results window.


To have the test work, we MUST configure IIS to serve the appropriate site, for example on port 666.


Check list:
  • Physical Path is pointing to the image you wish to test
  • IIS port and path are matching
  • IIS Server is running

Running Unit Tests Against a Development Server

Note: You need to STOP the above server for the following to work.
Our test is now written as:
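
Roughly the same test, but hosted by the ASP.NET Development Server; the source folder is the path from the template below, so adjust it to your own web project:

        [TestMethod]
        [HostType("ASP.NET")]
        [AspNetDevelopmentServerHost("C:\\GitHub\\develop\\AdminConsole", "/")]
        [UrlToTest("http://localhost/default.aspx")]
        public void DefaultPage_Loads_OnDevServer()
        {
            // The AspNetDevelopmentServerHost attribute starts the Development Server against the
            // source folder and redirects the UrlToTest request to it.
            Assert.IsTrue(true, "Placeholder assertion.");
        }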


The AspNetDevelopmentServerHost must be pointing at the source folder you wish to test.
We must also change the Web Application project settings so the site can be found and is available.


Again the test will run (with the ability to debug into the web application if needed).

Gotcha of this approach

The Web.Config is rewritten at the start and at the end of running unit tests. Occasionally you will get error messages about web.config being locked by another process.

Recommended Approach

The following template is recommended:
        [TestMethod]
        [HostType("ASP.NET")]
        [UrlToTest("http://localhost:666/default.aspx?unittest=devadmin@avalara.com?Unittestlogin=devadmin@avalara.com")]
       // [AspNetDevelopmentServerHost("C:\\GitHub\\develop\\AdminConsole", "/")]
This has the following advantages:
  • If you have multiple branches, then by just changing the IIS settings to point at each branch, you can run the tests against each branch without touching the files.
  • If there is a team working on the project, then there is no dependency on the physical location of the files (a big plus!)
  • If there is a problem that needs investigation:
    • Turn off IIS
    • Uncomment the AspNetDevelopmentServerHost line and adjust the path (if needed)