Monday, December 5, 2022

GraphQL on GitHub: a fragment for aggregated repo data

GitHub provides the GraphQL explorer to experiment with GraphQL data and learn how to shape your queries. When your group of queries grows to the point of repeating objects and their fields, it's time to move to fragments.

A fragment in GraphQL allows you to have:

  • readability - a well-named fragment shortens queries and mutations
  • reusability - reuse fragments across queries and mutations
  • performance - on the client, fragments and their components are a cache layer
  • type-safety - the code generator that builds your GraphQL SDK emits a type for each named fragment, so you can access deeply nested objects without hand-writing the types and type guards you would otherwise have to manage

Typical places to create and use fragments to DRY up your GraphQL queries are the most common schema objects. For the GitHub GraphQL schema, those include User and Repository.

To get the entire list of repositories in a GitHub org, you need to handle cursor-based paging as well as the returned results. An example query looks like:

query OrgReposAg(
  $organization: String!
  $pageSize: Int
  $after: String
) {
  organization(login: $organization) {
    repositories(
      after: $after
      first: $pageSize
      orderBy: { field: STARGAZERS, direction: DESC }
    ) {
      totalCount
      pageInfo {
        startCursor
        hasNextPage
        endCursor
      }
      edges {
        cursor
        node {
          ...MyRepoFields
        }
      }
    }
  }
}

# This fragment extracts each repository in the edges array
# to a named type MyRepoFields, created by codegen
fragment MyRepoFields on Repository {
  repositoryName: name
  id
  url
  descriptionHTML
  updatedAt
  stargazers {
    totalCount
  }
  forks {
    totalCount
  }
  issues(states: [OPEN]) {
    totalCount
  }
  pullRequests(states: [OPEN]) {
    totalCount
  }
}

The OrgReposAg query uses the named fragment, MyRepoFields, to extract the Repository object fields and aggregations needed, such as the total stargazers, forks, open issues, and open pull requests.

This query uses the after variable to page through the org's repo list, 100 repos at a time. The variables object initially contains:

{
  "organization": "Azure-samples",
  "after": null,
  "pageSize": 100
}

To page through all the repos in an org, each request needs to capture the paging information from pageInfo.endCursor, then pass that value as the after variable so the next request knows where to start.
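Here's a minimal C# sketch of that paging loop (the GITHUB_TOKEN environment variable and the trimmed-down query string are my assumptions for illustration, not part of the original app):

// Page through an org's repos by following pageInfo.endCursor until
// hasNextPage is false. GITHUB_TOKEN is an assumed environment variable
// holding a personal access token.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

public static class OrgRepoPager
{
    // Trimmed-down version of the OrgReposAg query above; paste the full
    // query and MyRepoFields fragment here in real code.
    private const string Query = @"
query OrgReposAg($organization: String!, $pageSize: Int, $after: String) {
  organization(login: $organization) {
    repositories(after: $after, first: $pageSize) {
      pageInfo { hasNextPage endCursor }
      edges { node { ...MyRepoFields } }
    }
  }
}
fragment MyRepoFields on Repository { repositoryName: name id url }";

    public static async Task Main()
    {
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Bearer", Environment.GetEnvironmentVariable("GITHUB_TOKEN"));
        client.DefaultRequestHeaders.UserAgent.ParseAdd("org-repo-pager");

        string after = null;
        bool hasNextPage = true;

        while (hasNextPage)
        {
            var payload = JsonSerializer.Serialize(new
            {
                query = Query,
                variables = new { organization = "Azure-samples", pageSize = 100, after }
            });
            var response = await client.PostAsync("https://api.github.com/graphql",
                new StringContent(payload, Encoding.UTF8, "application/json"));
            response.EnsureSuccessStatusCode();

            using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
            var pageInfo = doc.RootElement
                .GetProperty("data").GetProperty("organization")
                .GetProperty("repositories").GetProperty("pageInfo");

            // The endCursor of this page becomes the after variable of the next.
            hasNextPage = pageInfo.GetProperty("hasNextPage").GetBoolean();
            after = pageInfo.GetProperty("endCursor").GetString();
            // ...process the edges array here...
        }
    }
}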

If you need the list of repos in an org and you weren't using a fragment, you would need to create your own TypeScript type for the repository fields. That in itself isn't a huge barrier, but you have to manage that type separately from the generated types, and maintain it over the life of the project as people come and go. It becomes one more thing that has to be known and verified.

This becomes increasingly problematic as the query changes over time, with deeper nesting of items or remapping of data to the final shape needed by the UI. Moving the core information of the query to a fragment allows the query to change organically without interfering with your ability to get at the core information. It also provides assurance that when you need this core information in other queries in the same schema, the same core information is returned in the same named fragment (and its corresponding type in the SDK) in each place it is used.

If you have a different solution, let me know. @dfberry

Saturday, November 27, 2021

Migrate Azure Scheduler jobs to Azure Functions


Recently, I migrated a scheduled job from .NET Framework 4.6 to .NET Core, and from Azure App Service to Azure Functions. This was part of a larger effort to move to .NET Core, as well as the retirement of Azure Scheduler, which the job depended on.

While there is nothing wrong with .NET Framework 4.6 or Azure App Service, the project was moving toward separate, smaller, quicker deployments for the various projects instead of a single monolithic deployment.


Existing scheduled job: Add feature flags

Add the ability to control your job with a feature flag.
  1. In your Azure App Service, create a feature flag for every scheduled job. This can be as simple as an app setting with a value, or you can use Azure App Configuration.
  2. Add code for the feature flag and make sure it has a default value if it isn't detected in the environment.
  3. Add logging statements to report your runtime feature flag value.
  4. Deploy the code to Azure App Service and make sure you can stop and start the scheduled job with the feature flag before continuing.

// Default to enabled: only skip the job when the flag is explicitly "false".
string FeatureFlagFindRetweets = Environment.GetEnvironmentVariable("FeatureFlagScheduledJobFindRetweets");
if (!String.IsNullOrEmpty(FeatureFlagFindRetweets) && FeatureFlagFindRetweets.ToLower() == "false")
{
    log.WriteLine($"FindRetweet executed at: {DateTime.Now} feature flag disabled");
    return;
}
log.WriteLine($"FindRetweet executed at {DateTime.Now} feature flag enabled");

New Azure Function: Timer trigger

Create a new Azure Function app to replace the scheduled job and add the same feature flags.

  1. Create an Azure Function app locally with Visual Studio 2019 and the latest .NET Core you intend to support.
  2. Create a timer trigger and add the same code inside the function as your original scheduled job, including the logging (see the sketch after this list).
  3. Add the same feature flags you use in the original scheduled job. While Azure Functions has an easy way to disable a function, a feature flag allows for greater flexibility in the future.
  4. Add any dependencies the original project used with the NuGet Package Manager.
  5. Build your function.
  6. You probably have a few build errors because the .NET versions are different and there may be breaking changes. These are usually minor issues that take a little investigation. Common issues I've come across are:
    • HTTP client library changes - used to integrate with other APIs
    • JSON (de)serialization library changed from Newtonsoft.Json to System.Text.Json
    • Entity Framework changes - or whatever database library you use
    • Authentication library changes
  7. Fix your build issues but don't change any of the logic of the code yet. This is a straight migration.
  8. Once your project builds, migrate your tests and perform the same dependency migrations. 
  9. Run your tests and verify your new timer trigger logic still succeeds.
  10. Deploy your timer trigger with your existing deployment infrastructure making sure the feature flags are not enabled. 
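Here's a minimal sketch of what the migrated timer trigger (step 2) might look like. The function name and hourly NCRONTAB schedule are assumptions; the body reuses the feature flag check from above:

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class FindRetweetsFunction
{
    // "0 0 * * * *" runs at the top of every hour; match this to your
    // original Azure Scheduler job's schedule.
    [FunctionName("FindRetweets")]
    public static void Run([TimerTrigger("0 0 * * * *")] TimerInfo timer, ILogger log)
    {
        // Same feature flag check as the original scheduled job.
        string flag = Environment.GetEnvironmentVariable("FeatureFlagScheduledJobFindRetweets");
        if (!String.IsNullOrEmpty(flag) && flag.ToLower() == "false")
        {
            log.LogInformation($"FindRetweet executed at: {DateTime.Now} feature flag disabled");
            return;
        }
        log.LogInformation($"FindRetweet executed at {DateTime.Now} feature flag enabled");

        // ...original job logic migrated from the scheduled job...
    }
}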

Disable Azure Scheduler job, enable Azure Functions timer trigger

Azure App Service and Azure Functions both allow you to control the configuration settings through a variety of methods, including the Azure CLI, PowerShell, the Azure SDKs, and the Azure portal.
  1. Determine how you want to automatically deploy configuration changes. For this initial switch-over, in a low-priority job, you could change these feature flags manually in the Azure portal. For more critical jobs, you should automate the task and include it as part of your deployment pipeline (see the CLI sketch after this list).
  2. Disable the original scheduler job and verify with logging that the job executed but didn't continue past the feature flag check.
  3. Enable the timer trigger and verify with logging that the job executed and did continue past the feature flag check.
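For example, with the Azure CLI, the switch-over could look like this (the app, function app, and resource group names are placeholders):

az webapp config appsettings set --name <APP-SERVICE-NAME> --resource-group <RESOURCE-GROUP> --settings "FeatureFlagScheduledJobFindRetweets=false"

az functionapp config appsettings set --name <FUNCTION-APP-NAME> --resource-group <RESOURCE-GROUP> --settings "FeatureFlagScheduledJobFindRetweets=true"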

Have a better method? 

Let me know @dfberry

Saturday, November 13, 2021

Entity Framework 6 won't save (update) an entity returned from Stored Procedure

Symptom

The issue was that code that previously worked (update a property, then save) wasn't working any more. The application is a .NET 4.6 Web API project using Entity Framework 6.

Issue

The fix isn't in the code which saves (updates) the entity. The problem is that the entity being updated isn't correctly formed when it comes back from the stored procedure. Before the SP was added, the entity was returned from EF directly.

Fix

After the stored procedure completes, refetch the entity using straight Entity Framework code such as `await TwitterAccounts.FirstAsync(table => table.TwitterHandle == twitterHandle);`, where TwitterAccounts is a table in the EF context.

Step to debug issue: capture save result

The original EF code to save didn't capture the returned int, the number of rows changed.

Before

account.IsProperty = true;
_context.SaveChanges();
return Ok();

After

account.IsProperty = true;
int result = _context.SaveChanges();
if (result == 0)
{
    throw new Exception("property was not updated");
}
return Ok(account);

Step to debug issue: see T-SQL from Entity Framework in Visual Studio Output window

Once the row count was returned as zero, the question was why. To see the actual T-SQL sent to the database, I added the following code to the BotContext class, which inherits from DbContext. The constructor now includes:

#if DEBUG
Database.Log = s => System.Diagnostics.Debug.WriteLine(s);
#endif

This shows all T-SQL from Entity Framework in the Output window. It showed that EF wasn't producing any T-SQL, which meant there was either an issue with the variables (those hadn't changed) or no changes were detected in the entity.

Step to debug issue: see the EF changes detected

To see the changes detected, I added the following code to my BotContext class:

public void DisplayTrackedEntities(DbChangeTracker changeTracker)
{
    var entries = changeTracker.Entries();
    foreach (var entry in entries)
    {
        System.Diagnostics.Debug.WriteLine("Entity Name: {0}", entry.Entity.GetType().FullName);
        System.Diagnostics.Debug.WriteLine("Status: {0}", entry.State);
    }
}

Then to use the code, call this method after the entity is changed but before it is saved:

account.IsProperty = false;
DisplayTrackedEntities(context.ChangeTracker);
int result = context.SaveChanges();

if (result == 0)
{
    throw new Exception("disable-auto-tweets not updated successfully");
}
return Ok(account);

At this point, because no changes were detected, I knew the recent switch to a SP was the cause: changes couldn't be detected on this artificial entity.

Step to debug issue: get EF entity after SP update

Originally the code checked a value and returned the account using something like:

await TwitterAccounts.FirstAsync(table => table.TwitterHandle == twitterHandle);

That method changed to use a stored procedure which returned the TwitterAccount entity type. That entity wasn't getting updated. Changing the method to fetch and return the EF entity after the SP ran fixed the issue: the account was updated.
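The working pattern looks roughly like this (the method name and stored procedure are placeholders, not the project's actual code):

using System.Data.Entity;        // FirstAsync
using System.Data.SqlClient;     // SqlParameter
using System.Threading.Tasks;

public async Task<TwitterAccount> UpdateAndGetAccountAsync(string twitterHandle)
{
    // Run the stored procedure for its side effects only. An entity
    // materialized from the SP result is "artificial": the change tracker
    // can't detect changes on it.
    await _context.Database.ExecuteSqlCommandAsync(
        "EXEC UpdateTwitterAccount @handle",
        new SqlParameter("@handle", twitterHandle));

    // Refetch with straight EF so later property changes are tracked and saved.
    return await _context.TwitterAccounts
        .FirstAsync(table => table.TwitterHandle == twitterHandle);
}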

Have a better fix? 

There is probably some way to get EF to update my entity returned from the SP. Do you know what it is? Let me know on Twitter.

@dfberry

Tuesday, November 2, 2021

Convert GoDaddy PEM certificate to PFX certificate to use in Azure App Service

 When you purchase GoDaddy certificates, you should get 3 files:

  • *.crt file
  • *.pem file
  • *-key.txt file

1. Rename the key file with the .key extension so any future tools can find it by its extension.

2. If you download and/or open the key file on a Windows computer, your key file may now have the wrong encoding. Use a bash terminal and the iconv CLI tool to convert it to the correct encoding, writing a new file with a "2" added at the end of the filename to indicate the different file:

iconv -c -f UTF-8 -t ASCII your-domain-name.key > your-domain-name.2.key

3. Convert to the PFX format with the openssl CLI tool:

openssl pkcs12 -export -out your-domain-name.pfx -inkey your-domain-name.2.key -in your-domain-name.crt

4. You need to enter a new password when creating the PFX. Remember the password; you will need it when you add your certificate to Azure App Service.

5. In the Azure portal, for your App Service, select TLS/SSL settings. 

6. Add a new TLS/SSL setting.

7. Select your TLS/SSL settings:

  • Your new PFX file from your local file system
  • Your password from step 4
  • The thumbprint and TLS/SSL type - there should be one choice in the drop-down box

8. Select Add Binding.

9. Restart your App Service.

10. On the Azure portal's Overview for your App Service, select your URL such as https://YOUR-DOMAIN. 

11. The browser may take a few seconds to reload. Notice the browser indicates your website is secure with the typical lock icon. 
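If you'd rather script steps 5 through 9, the Azure CLI can do the upload and binding too (app name, resource group, and thumbprint are placeholders):

az webapp config ssl upload --name <APP-NAME> --resource-group <RESOURCE-GROUP> --certificate-file your-domain-name.pfx --certificate-password <PFX-PASSWORD>

az webapp config ssl bind --name <APP-NAME> --resource-group <RESOURCE-GROUP> --certificate-thumbprint <THUMBPRINT> --ssl-type SNI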


@dfberry

Oct 2021 - Copy an Azure SQL Database to dev computer


There are several blog posts on this site answering how to copy an Azure SQL database. Since they are older or in some cases lost (Wayne's Microsoft blog posts are gone), I'll link to the Microsoft docs. 

Enterprise data or functionality?

Don't use these processes if you need change detection or other enterprise-level requirements. Copy the Azure SQL database and keep it in the cloud.

Make a copy of your Azure Cloud database

To copy an existing Azure SQL database in the Azure portal, you can copy or export. Generally, I choose to export to an Azure Storage blob container as a bacpac file. This allows me to use it as either a cloud or local database.

The bacpac file includes data and schema.

Watch for export completion

To watch for export completion, go to the Azure SQL server (not the database) in the portal and use the Import/Export History in the Data Management section.

Download bacpac file from Azure Storage

To download the bacpac file, open the Azure Storage container in the Azure portal. Select the bacpac file, then download it to your local computer.

Restore SQL database to local machine from bacpac file

Using SSMS, import a data-tier application. Select the downloaded bacpac file. 
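If you prefer the command line over SSMS, the SqlPackage tool can do the same import (server and database names are placeholders):

sqlpackage /Action:Import /SourceFile:"your-database.bacpac" /TargetServerName:"localhost" /TargetDatabaseName:"YourDatabaseCopy"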

Because the Azure SQL database is a contained database, you need to enable contained database authentication on your local server with the following T-SQL executed against the master database.

EXEC sp_configure 'contained database authentication', 1;
GO
RECONFIGURE;
GO


@dfberry

Monday, October 25, 2021

Azure Functions + System.Text.JSON Deserialize exception

If you run into a System.IO.FileNotFoundException from an Azure Function when it is deserializing JSON with System.Text.Json into your class, make sure you are not using System.Text.Json 5.0.2.

Issue

FileNotFoundException: Could not load file or assembly 'System.Text.Encodings.Web, Version=5.0.0.0, Culture=neutral

Fix

Downgrade to System.Text.Json 5.0.1. 
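In the project file, that means pinning the package reference; a minimal sketch of the relevant line:

<PackageReference Include="System.Text.Json" Version="5.0.1" />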

Project description

The issue was found while moving code from a working .NET Framework 4.6.1 Web API project into a .NET Core 3.1 class library, which is called from an Azure Functions 3 app. The deserialization happens in the class library. The manual and automated testing described in this post were both local to my development machine, not in the Azure cloud.

Initially I tested it manually through the Azure Function. Later, as I thought the issue was about the JSON returned, I added automated programmatic tests that call into the class library from a separate test project. The test project worked, so the issue had to be the Azure Functions runtime or its dependencies.

Before unit tests

Before the unit tests, I knew the FileNotFound exception was a symptom, but I assumed the real problem was the JSON being deserialized. The deserialization code went from the HTTP response content directly into a stream wrapped by the deserialization call, so it threw immediately, without showing, in the Visual Studio 2019 debugger, what JSON was returned.

Adding unit tests

I broke apart the response from the deserialization. That allowed me to see the JSON and create unit tests. When I realized the JSON was deserializable from unit tests, I thought it was something about how the debugger was compensating for the text; there could be some difference between the raw text in the unit test and the text processed by the Azure Function. Adding Debug.Write statements to both manual and automated testing runs showed that the text was the same and that the problem was still with deserialization.
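The split looked roughly like this (MyType and the response variable are placeholders):

// Before: stream straight into the deserializer, so the payload is never visible.
// var result = await JsonSerializer.DeserializeAsync<MyType>(
//     await response.Content.ReadAsStreamAsync());

// After: capture the JSON first so it can be logged and unit tested.
string json = await response.Content.ReadAsStringAsync();
System.Diagnostics.Debug.WriteLine(json);
MyType result = JsonSerializer.Deserialize<MyType>(json);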

At this point, I assumed it was a runtime issue but thought it was something about the HTTP call and how it was creating the JSON.

Offload deserialization to a queue

As a bonus, breaking up the JSON fetch from the deserialization allows the system to offload the deserialization and downstream tasks to a queue. In that architecture, where an Azure Function timeout isn't at risk for the caller, I could add as much data cleaning as needed in future sprints. 

The runtime problem

Now that tests could deserialize, the hunt was on for the runtime problem. The JSON deserialization worked in .NET 4.6.1 Framework with Newtonsoft deserialization but not the Azure Function with System.Text.Json. At this point, searching the Internet led to several issues about System.Text.Json in Azure Functions. Downgrading the library to 5.0.1 was the listed fix.  

Other problems

Before the unit tests, other things I tried:

* Downloaded and set up the Fusion logger to look for mismatched dependencies - nothing turned up with that.

* Tried to add a lock around the code to see into the function call, because it was in an async loop. The thought was that some missed async handling somewhere was showing up here. The lock in .NET Core was cumbersome and unfamiliar, and wasn't blocking when I wanted it to, so I stopped going down that rabbit hole.


@dfberry



Sunday, August 22, 2021

Create Azure DevOps Pipeline for React App deployment to Azure app service

Disclaimer: All opinions in this post are mine and not my employer's (Microsoft). If you know of a more correct or performant way to accomplish the work discussed in this post, please let me know at javascript-developer@outlook.com. Many projects I work on are already in progress by the time I work on them. I can't change previous design or architecture choices, but just solve a specific technical issue.

Secrets stored in Azure Key Vault

All secrets for the React app are stored in Azure Key Vault. These secrets need to be pulled from Key Vault and set into the environment so that the `npm run build` script uses those values. In order for the Azure DevOps pipeline to connect to Azure Key Vault, you need to complete some work before you develop your pipeline:

  • Create a Key Vault and store your React build secrets, such as an Azure Functions key used to authenticate and use the Function. Your secret doesn't have to have the same name as your React build variable. Please don't try. Key Vault secret names don't allow underscores anyway. Just give the secret a human-readable name; the pipeline can map between the secret name and the build environment name.
  • Create a connection from Pipelines to Key Vault - in Azure DevOps, in the project settings, add a service connection to Azure Key Vault. This process creates a service principal. You can find all your service principals in the Azure portal, under the Azure Active Directory section; service principals are part of App registrations.
  • Set up Azure Key Vault access policies - in the Azure portal, find your Key Vault and add an access policy for the pipeline's service principal with the `get` and `list` secret permissions.

Create an Azure DevOps Pipeline to build React app

The React app is deployed to an Azure App Service, wrapped in a .NET Core application. The vmImage is `windows-latest`.

The YAML file is:


# ASP.NET Core (.NET Framework)
# Build and test ASP.NET Core projects targeting the full .NET Framework.
# Add steps that publish symbols, save build artifacts, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/dotnet-core

trigger:
- main

pool:
  vmImage: 'windows-latest'

# Variables used in this pipeline
# The `dinatrue` variable is a test used in the echo command to understand
# the syntax for bringing in variables.
variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'
  dinatrue: 'hello dina'
  buildDate: $[format('{0:yyyy}{0:MM}{0:dd}', pipeline.startTime)]


steps:

# Get React app secrets from Key Vault
# Key Vault is used for projects beyond this single client app. If the Key Vault
# were only for this client app, change the `SecretsFilter` value to `*`.
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'SQL Projects (98c...)'
    KeyVaultName: 'MSTwitterKeyVault'
    SecretsFilter: 'PublicApiMessageBeforeLoginKey,PublicApiGetUtcNowKey,PublicApiUrl,ServerUri'
    RunAsPreJob: false

# Verify/debug variables
- task: CmdLine@2
  inputs:
    script: 'echo %dinatrue% %buildDate%'

# Set Key Vault secrets to React app environment variables
# Verify/debug values with the SET command
# secrets' values will be `***` on purpose
- task: CmdLine@2
  inputs:
    script: 'SET > env.log && cat env.log'
  env:
    REACT_APP_APP_SERVER_BASE_URL_PUBLIC_API_APP_MESSAGE_BEFORE_LOGIN_KEY: $(PublicApiMessageBeforeLoginKey)
    REACT_APP_APP_SERVER_BASE_URL_PUBLIC_API_GET_UTC_NOW_KEY: $(PublicApiGetUtcNowKey)
    REACT_APP_APP_SERVER_BASE_URL_PUBLIC_API: $(PublicApiUrl)
    REACT_APP_SERVER_URL: $(ServerUri)
    REACT_APP_CACHE_BUST: $(buildDate)

# Restore NuGet packages for the solution
- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

# Build project, which ultimately builds React app into `$(build.artifactStagingDirectory)\WebApp.zip`
- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:DesktopBuildPackageLocation="$(build.artifactStagingDirectory)\WebApp.zip" /p:DeployIisAppPath="Default Web Site"'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

# Deploy Zip file to Azure App Service, slot named `client-stage`, using webDeploy
- task: AzureRmWebAppDeployment@4
  inputs:
    ConnectionType: 'AzureRM'
    azureSubscription: 'SQL Projects (98c...)'
    appType: 'webApp'
    WebAppName: 'MSTwitterApp'
    deployToSlotOrASE: true
    ResourceGroupName: 'MSTwitterBot'
    SlotName: 'client-stage'
    packageForLinux: '$(build.artifactStagingDirectory)\WebApp.zip'
    enableCustomDeployment: true
    DeploymentType: 'webDeploy'