Friday, July 31, 2015

CascadiaJS Best Sessions

Here are the session videos that blew my mind. The session titles are terrible; ignore them. The presenters are amazing.

Shay Howe - CSS as a Service: Maintaining Style

Ben Straub - Hacking Culture with JavaScript

Andrei Kashcha - A story of package managers, graphs, and one million vertices





Friday, July 3, 2015

How to disable Umbraco's Client Dependency Handler (dependencyhandler.axd) for development

One of the tasks of the Client Dependency handler (DependencyHandler.axd) used in Umbraco is to manage the return of client-side files such as CSS and JavaScript. For live code, this caching makes sense, but during development it just makes the troublesome line of code harder to find.



While in development, make sure you change the web.config compilation setting to debug="true".
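
For reference, the relevant web.config fragment looks like this (any other attributes already on the compilation element, such as targetFramework, stay as they are):

  <system.web>
    <!-- debug="true" makes the Client Dependency framework serve the raw CSS/JS files -->
    <compilation debug="true" />
  </system.web>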


Then refresh the page. You should now see the individual client files in the network trace instead of requests to the dependency handler. For more information, review the docs.

For more extensive configuration, find the ClientDependency.config file in the Umbraco website's /Config directory, off of the root.

Wednesday, June 17, 2015

Azure VM Licensing Error

When you log in to a Windows Azure VM, you may receive a popup saying that you need to configure the licensing server, and that you only have X days of access left if you don't.

You need to sign in to the Azure portal, download the RDP file for that virtual machine, and access the VM only through that RDP file.

That RDP file retains access to the server even after the licensing grace period ends.


Umbraco CMS Refresh

Umbraco is an ASP.NET-based content management system. It has many pieces and configuration elements.

Recycle/Refresh
While I have been working on the Merchello package, I've learned a few ways to cycle Umbraco when something isn't working right.

  1. Touch Web.config (see the sketch after this list for scripting this)
  2. Reindex the Examine indexes via the back office
  3. Delete the files in \App_Data\Temp
  4. Republish the content node or the entire content tree.
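
When I want to script steps 1 and 3 instead of doing them by hand, a minimal C# sketch looks something like this (the site path is just a placeholder for your own install):

    // Recycle helpers: "touch" Web.config and clear the cached files under App_Data\Temp
    using System;
    using System.IO;

    class UmbracoRecycle
    {
        static void Main()
        {
            var siteRoot = @"C:\inetpub\MyUmbracoSite";   // hypothetical site root

            // Step 1: touching Web.config updates its timestamp, which recycles the application
            File.SetLastWriteTimeUtc(Path.Combine(siteRoot, "Web.config"), DateTime.UtcNow);

            // Step 3: delete the temporary files so Umbraco rebuilds them on the next start
            var temp = Path.Combine(siteRoot, @"App_Data\Temp");
            if (Directory.Exists(temp))
            {
                Directory.Delete(temp, recursive: true);
            }
        }
    }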

Sunday, January 26, 2014

Parsing Word Document XML via MSXML in VBA

An old friend pinged me: while he was having some success loading Word documents saved as XML in his VBA routines, he was having massive problems getting XPath to work. I resolved the basic issue and then asked him to give me his hardest problem so I could show some example code. The last time I did this was a few years ago; MSXML was (and is) a bit lame compared to later implementations, so his troubles were very reasonable to have.

 

Declaring namespaces

You can’t use XPath on a document with namespaces without providing those namespaces. The process is pretty simple: just convert the namespace declarations at the top of the XML document into a VB string. There happen to be many of them in a Word XML document.

 

Dim domPrefix As String
domPrefix = "xmlns:wpc='http://schemas.microsoft.com/office/word/2010/wordprocessingCanvas' "
domPrefix = domPrefix + "xmlns:mc='http://schemas.openxmlformats.org/markup-compatibility/2006' "
domPrefix = domPrefix + "xmlns:o='urn:schemas-microsoft-com:office:office' "
domPrefix = domPrefix + "xmlns:r='http://schemas.openxmlformats.org/officeDocument/2006/relationships' "
domPrefix = domPrefix + "xmlns:m='http://schemas.openxmlformats.org/officeDocument/2006/math' "
domPrefix = domPrefix + "xmlns:v='urn:schemas-microsoft-com:vml' "
domPrefix = domPrefix + "xmlns:wp14='http://schemas.microsoft.com/office/word/2010/wordprocessingDrawing' "
domPrefix = domPrefix + "xmlns:wp='http://schemas.openxmlformats.org/drawingml/2006/wordprocessingDrawing' "
domPrefix = domPrefix + "xmlns:w10='urn:schemas-microsoft-com:office:word' "
domPrefix = domPrefix + "xmlns:w='http://schemas.openxmlformats.org/wordprocessingml/2006/main' "
domPrefix = domPrefix + "xmlns:w14='http://schemas.microsoft.com/office/word/2010/wordml' "
domPrefix = domPrefix + "xmlns:w15='http://schemas.microsoft.com/office/word/2012/wordml' "
domPrefix = domPrefix + "xmlns:wpg='http://schemas.microsoft.com/office/word/2010/wordprocessingGroup' "
domPrefix = domPrefix + "xmlns:wpi='http://schemas.microsoft.com/office/word/2010/wordprocessingInk' "
domPrefix = domPrefix + "xmlns:wne='http://schemas.microsoft.com/office/word/2006/wordml' "
domPrefix = domPrefix + "xmlns:wps='http://schemas.microsoft.com/office/word/2010/wordprocessingShape'"
The next item is just loading the XML file; this was his existing code and it worked fine.
    Dim xmlDoc As MSXML2.DOMDocument60
    Set xmlDoc = New MSXML2.DOMDocument60
    xmlDoc.validateOnParse = False
    xmlDoc.async = False
    If Not xmlDoc.Load(sFile) Then   ' sFile is the path to the Word XML file
        sErr = xmlDoc.parseError.errorCode & "  " & xmlDoc.parseError.reason
        GoTo errXML
    End If
The next item was just setting two properties on the xmlDoc (unlike later implementations, no XmlNamespaceManager is needed). The key one uses the domPrefix string from above.
    xmlDoc.setProperty "SelectionNamespaces", domPrefix
    xmlDoc.setProperty "SelectionLanguage", "XPath"

So if he wanted to find all of the tables, it was as simple as:

Dim foundNodes As MSXML2.IXMLDOMNodeList  
Set foundNodes = xmlDoc.DocumentElement.SelectNodes("//w:tbl")

His tough problem

He wanted to find all the images in the document, obtain their dimensions and the associated file name, plus some related information. The code was pretty clean to write with XPath, as shown below.

    ' Find the actual drawing nodes
    Dim node As MSXML2.IXMLDOMNode
    Dim node2 As MSXML2.IXMLDOMNode
    Dim node3 As MSXML2.IXMLDOMNode
    Dim iNode2 As Integer

    Set foundNodes = xmlDoc.DocumentElement.SelectNodes("//w:drawing")
    Debug.Print foundNodes.Length & " Drawings found"
    iNode2 = 1
    For Each node In foundNodes
        Debug.Print "Image #" & iNode2
        ' The image dimensions (cx and cy, in EMUs) are attributes of wp:extent
        Set node2 = node.SelectSingleNode(".//wp:extent")
        Debug.Print node2.Attributes(0).BaseName & "=" & node2.Attributes(0).NodeValue
        Debug.Print node2.Attributes(1).BaseName & "=" & node2.Attributes(1).NodeValue
        ' The image name/description comes from wp:docPr
        Set node2 = node.SelectSingleNode(".//wp:docPr")
        Debug.Print node2.Attributes(2).BaseName & "=" & node2.Attributes(2).NodeValue
        ' Directly finding the parent p node does not appear to be supported, so walk up the tree
        Set node3 = node.ParentNode
        Do While node3.BaseName <> "p"
            Set node3 = node3.ParentNode
        Loop
        ' Report the paragraph style of the enclosing paragraph
        Set node2 = node3.SelectSingleNode(".//w:pStyle")
        Debug.Print node2.Attributes(0).BaseName & "=" & node2.Attributes(0).NodeValue
        iNode2 = iNode2 + 1
    Next node
There were a few hiccups, such as finding that the ancestor:: and descendant:: axes did not appear to work, but there were easy workarounds for that.
The output is shown below.
[Output screenshot]
Friend helped, some memories refreshed, and a blog post that will hopefully help someone else!

Saturday, January 18, 2014

Merging two AspNet Profiles – more complex than one would think!

During a recent code revision, a developer accidentally removed the applicationName attribute from the SqlProfileProvider element in Web.config. The consequence became apparent when support started to report that customers' saved report definitions (which were stored in the profile) had disappeared.

   <add name="ErrorProfileProvider" connectionStringName="LocalSQLServer" type="System.Web.Profile.SqlProfileProvider" />
The problem was identified, but there was a second problem… We had a significant number of new customers who had also saved report definitions to this error profile provider. Simply switching back by restoring the applicationName would mean that these report definitions would disappear (which customer support did not want to see happen to new customers – a rather bad experience).
My initial take was “Oh, I have used multiple membership providers in one web application in the past, so we just need to use the same pattern…” 
  • Oops, that pattern did not work. Grrr.
  • Searching StackOverflow and other sites did not turn up any “canned solutions”.
  • We looked at trying to merge the data in SQL Server; it could be done, but the T-SQL code was complicated and would require major setup in QA to test properly. Not a good scenario for an agile solution.
What I ended up doing was my original pattern, but I had to go to lower-level coding to do it (i.e. using features of the inherited classes).
First, I indeed put multiple profile providers in web.config, as was my original plan:
<profile inherits="AdminConsole.CustomProfile" defaultProvider="GoodProfileProvider" enabled="true">
  <providers>
    <clear />
    <add name="GoodProfileProvider" applicationName="/Admin Console" connectionStringName="LocalSQLServer" type="System.Web.Profile.SqlProfileProvider" />
    <add name="ErrorProfileProvider" applicationName="/" connectionStringName="LocalSQLServer" type="System.Web.Profile.SqlProfileProvider" />
  </providers>
</profile>
The next step was to write a method (ProfileMerge) that would be called when users logged in. I added it as a static method in my profile class. The code is below; the basic steps were:
  1. Check whether the customer has two profiles; if they only have the correct one, then we are done.
    1. Otherwise, move one into the other “in bulk”.
  2. Next, we need to check when the last update happened; the old one was offline, so:
    1. Older than the error one – we need to update.
    2. Newer – we have already updated (no need to create an “updated” field to track this).
  3. We used the inherited classes to call GetPropertyValues so we could process the information as collections.
  4. We walked the error data (the recently created saved report definitions) to find any recent definitions that we needed to merge into the old profile data. When we found one, we called a merging routine (specific to how the reports were being saved).
  5. Once all of this was done, we set the results into the good profile and returned. The profile was already being saved in the calling code, so there was no need to do an explicit save.
   public class CustomProfile : ProfileBase
    {
        public static void ProfileMerge(string userName, bool isForced)
        {
            try
            {
                if (String.IsNullOrWhiteSpace(userName))
                    return;
                int total = 0;
                var profile = CustomProfile.Create(userName);
                if (profile==null || profile.IsAnonymous) return;
                var errorPM = (SqlProfileProvider)System.Web.Profile.ProfileManager.Providers["ErrorProfileProvider"];
                if (errorPM == null) return;
                var errorDate = DateTime.MaxValue;
                foreach (ProfileInfo item in errorPM.FindProfilesByUserName(ProfileAuthenticationOption.Authenticated, userName, 0, 1, out total))
                {
                    errorDate = item.LastUpdatedDate;
                }
                var goodPM = (SqlProfileProvider)System.Web.Profile.ProfileManager.Providers["GoodProfileProvider"];
                if (goodPM == null)
                {
                    //Should never occur
                    AdminConsole.Diagnostics.SplunkLogger.Fatal("Profile not found for " + userName);
                    return;
                }
                // If date is in the future, there is no data to merge
                if (errorDate > DateTime.Now)
                    return;
                // Already done the update BUT allow manual repeat if needed.
                if (profile.LastUpdatedDate > errorDate && ! isForced) return;
                var errorData = errorPM.GetPropertyValues(profile.Context, CustomProfile.Properties);
                var goodData = goodPM.GetPropertyValues(profile.Context, CustomProfile.Properties);
                foreach (SettingsPropertyValue item in errorData)
                {
                    if(item.PropertyValue !=null && ! String.IsNullOrWhiteSpace( item.PropertyValue.ToString()))
                    switch (item.Name)
                    {
                        case "UserSavedFilters_Consumer":
                        case "UserSavedFilters_TransactionUsage":
                        case "UserSavedFilters":
                            goodData[item.Name].PropertyValue= MergeStrings(item.Name,goodData[item.Name].PropertyValue.ToString(),item.PropertyValue.ToString());
                            break;
                    }
                }
                goodPM.SetPropertyValues(profile.Context, goodData);
            }
            catch (Exception ex)
            {
                AdminConsole.Diagnostics.SplunkLogger.Error(ex);
            }
        }
        private static string MergeStrings(string name,string oldData, string newData)
        {
            var logEntry = new Dictionary<string, string>();
            logEntry.Add("AP_name", name);
            logEntry.Add("AP_old", oldData);
            logEntry.Add("AP_current", newData);
            char[] spliter = { '$' };
            if (String.IsNullOrWhiteSpace(oldData)) return newData;
            if (String.IsNullOrWhiteSpace(newData)) return oldData;
            var oldCollection = oldData.Split(spliter, StringSplitOptions.RemoveEmptyEntries);
            var newCollection = newData.Split(spliter, StringSplitOptions.RemoveEmptyEntries);
            var merge = new HashSet<String>();
            foreach (string item in oldCollection)
            {
                if (!merge.Contains(item))
                    merge.Add(item);
            }
            foreach (string item in newCollection)
            {
                if (!merge.Contains(item))
                    merge.Add(item);
            }
            var result = new StringBuilder();
            foreach (string item in merge)
            {
                result.Append(item);
                result.Append("$");
            }
            logEntry.Add("AP_result", result.ToString());
            AdminConsole.Diagnostics.SplunkLogger.Debug(logEntry);
            return result.ToString();
        }
    }
You may have noted that I included the ability to ignore the dates and force a merge. This was done to provide a backdoor mechanism for support if the unexpected happened. They could enter the user id on a dark page (visible only to them) and force the update to happen again, which would log the results of the merge so we could find out why our code (and QA) had failed.
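
For illustration, the two call sites might look roughly like this (the control and event names are hypothetical, not taken from our actual pages):

    // Normal sign-in: merge quietly, respecting the last-updated date check
    protected void LoginControl_LoggedIn(object sender, EventArgs e)
    {
        CustomProfile.ProfileMerge(LoginControl.UserName, isForced: false);
    }

    // Support-only "dark page": force the merge to run again and log the results
    protected void ForceMergeButton_Click(object sender, EventArgs e)
    {
        CustomProfile.ProfileMerge(UserNameTextBox.Text, isForced: true);
    }
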
The good news is that the above has rolled into production and our QA person reports that customer service is happy because everything was returned as expected. We did not need to use our dark page.

Tuesday, December 31, 2013

Converting SalesForce SOQL results to a Linked Dataset for use by C#

In my last post, I illustrated the 'simple' way of getting the data back - by just creating an XmlDocument from the results. This is not always efficient -- in fact, I have gotten out-of-memory errors on some queries.

In this post I will take the same query and show a little sleight of code. The query was:

SELECT Name, 
  (SELECT Name, Email FROM Contacts), 
  (SELECT CaseNumber, CreatedDate, Subject, Status, Resolution__c,  ClosedDate, SuppliedEmail,SuppliedName FROM Cases
   WHERE CreatedDate = LAST_N_DAYS:366 
   AND Origin != 'Internal' 
   AND OwnerId !=     '00G40000001RTyyEAG' 
   AND RecordTypeId IN ('01240000000UTWP', '01240000000UWxM', '01240000000UWxl', '01240000000UWxb')

  ) 
 FROM Account  


 WHERE ID IN (SELECT AccountId FROM Contact Where Email in ({0}))

Extension Method to Handle SOQL Result Columns

The code below takes the results from the query and creates a DataTable from one column. This is our fundamental building block suitable for all SalesForce queries.

You could call this in a nested manner if you have nested objects in your query -- but I assume just one layer deep.

    /// <summary>
    /// Converts a Salesforce-style XmlElement into a DataTable
    /// </summary>
    /// <param name="element">A "column" returned from Salesforce</param>
    /// <returns>DataTable</returns>
    public static DataTable SalesForceToDataTable(this XmlElement element)
    {
        var result = new DataTable(element.LocalName);
        if (element.ChildNodes.Count == 1)
        {
            // A simple field: one column, one row holding the element's text
            if (!result.Columns.Contains(element.LocalName))
            {
                result.Columns.Add(element.LocalName);
            }
            DataRow row = result.NewRow();
            row[element.LocalName] = element.InnerText;
            result.Rows.Add(row);
        }
        else
        {
            foreach (XmlNode child in element.ChildNodes)
            {
                switch (child.LocalName)
                {
                    case "size":
                    case "queryLocator":
                    case "done": break;   // skip the query metadata elements
                    case "records":
                        // Each "records" element becomes a row; columns are prefixed with the parent name
                        DataRow row = result.NewRow();
                        foreach (XmlElement field in child.ChildNodes)
                        {
                            var colName = element.LocalName + "_" + field.LocalName;
                            if (!result.Columns.Contains(colName))
                            {
                                result.Columns.Add(colName);
                            }
                            row[colName] = field.InnerText;
                        }
                        result.Rows.Add(row);
                        break;
                }
            }
        }
        return result;
    }

Building and linking a DataSet across columns

As we saw in our last post, we have subqueries appearing in a single column, as illustrated below:
[Screenshot of the query results]

The routine above handles each column. To put it all together, we use the code below.

public static DataSet QueryCases(string myquery)
{
    var result = new DataSet();
    var binding = new AdminConsole.WebSalesForce.SforceService();
    binding.SessionHeaderValue = new AdminConsole.WebSalesForce.SessionHeader();
    var salesforce = SaleforceUrl;
    if (salesforce != null)
    {
        binding.SessionHeaderValue.sessionId = salesforce.sessionId;
        binding.Url = salesforce.serverUrl;
        binding.UseDefaultCredentials = true;

        var results = binding.query(myquery);
        var done = false;
        var masterKey = 0;
        while (!done && results.size > 0)
        {
            foreach (var record in results.records)
            {
                var tableIndex = 0;
                foreach (XmlElement key in record.Any)
                {
                    var newTable = key.SalesForceToDataTable();

                    // Tag every row with a key that ties the columns of this record together
                    newTable.Columns.Add(DataSetKey, masterKey.GetType());
                    foreach (DataRow row in newTable.Rows)
                    {
                        row[DataSetKey] = masterKey;
                    }

                    // On the first record we ADD the tables to the dataset;
                    // on subsequent records we append their rows to the existing tables
                    if (masterKey > 0)
                    {
                        foreach (DataRow row in newTable.Rows)
                        {
                            result.Tables[tableIndex].ImportRow(row);
                        }
                    }
                    else
                    {
                        result.Tables.Add(newTable);
                        if (result.Tables.Count > 1)
                        {
                            result.Relations.Add(result.Tables[0].Columns[DataSetKey], newTable.Columns[DataSetKey]);
                        }
                    }
                    tableIndex++;
                }
                masterKey++;
            }
            done = results.done;
            if (!done)
            {
                // Fetch the next batch of records
                results = binding.queryMore(results.queryLocator);
            }
        }
    }
    return result;
}

Converting Linked DataSet to DataTable

What we are doing is adding a new key to tie the data in each column together and adding it as a relation in the dataset. This is still generic code -- it works for almost any query.

The last step is flattening this dataset into a DataTable so I can display it in a Telerik grid. I need to match each case to its contact (which would be easy with a proper SQL language). My solution is shown below:


private DataTable FlattenDataSet(DataSet dataset)
{
    // Flatten a freshly queried dataset; otherwise fall back to the copy cached in session state
    if (dataset != null)
    {
        var result = new DataTable();

        // Create all columns from every table in the dataset
        foreach (DataTable table in dataset.Tables)
        {
            foreach (DataColumn col in table.Columns)
            {
                if (!result.Columns.Contains(col.ColumnName))
                    result.Columns.Add(col.ColumnName);
            }
        }

        // Import the 1:1 relationships into the table
        foreach (DataRow row in dataset.Tables["Cases"].Rows)
        {
            result.ImportRow(row);
        }

        // Import the 1 side of the 1:m relationship (the Account name)
        foreach (DataRow master in dataset.Tables["Name"].Rows)
        {
            DataRow[] rd = result.Select(DAL.daSalesForce.DataSetKey + "=" + master[DAL.daSalesForce.DataSetKey]);
            foreach (DataRow row in rd)
            {
                foreach (DataColumn col in dataset.Tables[0].Columns)
                {
                    row[col.ColumnName] = master[col.ColumnName];
                }
            }
        }

        // For each contact (expected to be fewer than cases) we look up the cases they submitted
        foreach (DataRow master in dataset.Tables["Contacts"].Rows)
        {
            DataRow[] rd = result.Select("Cases_ContactId='" + master["Contacts_Id"] + "'");
            foreach (DataRow row in rd)
            {
                foreach (DataColumn col in dataset.Tables["Contacts"].Columns)
                {
                    row[col.ColumnName] = master[col.ColumnName];
                }
            }
        }
        return result;
    }
    return (DataTable)Session["SalesForce"];
}

Bottom Line

Salesforce SOQL results can be handled in .NET by using an extension method and moving the data into a DataSet. Once the data is there, you may have to do a little DataSet grunting to get it transformed into the appropriate format.
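
To put the pieces in context, a call site might look something like the sketch below (the query-template constant, the containing DAL class, and the grid name are assumptions, not code from these posts):

    // Run the SOQL query, flatten the linked DataSet, and bind the result to the grid
    string emails = "'someone@example.com'";                 // placeholder filter values
    string soql = string.Format(CaseQueryTemplate, emails);  // the query shown at the top of this post
    DataSet linked = DAL.daSalesForce.QueryCases(soql);
    DataTable flat = FlattenDataSet(linked);
    CasesGrid.DataSource = flat;                             // e.g. a Telerik grid on the page
    CasesGrid.DataBind();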