Saturday, November 27, 2021

Migrate Azure Scheduler jobs to Azure Functions

Migrate a code-based Azure Scheduler job to Azure Functions

Recently, I migrated a scheduled job from .NET Framework 4.6 to .NET Core and from Azure App Service to Azure Functions. This was part of a larger effort to move to .NET Core, prompted in part by the retirement of Azure Scheduler, which the jobs depended on.

While there is nothing wrong with .NET Framework 4.6 or Azure App Service, the project was moving toward separate, smaller, quicker deployments for the various projects instead of a single monolithic deployment.

Existing scheduled job: Add feature flags

Add the ability to control your job with a feature flag.
  1. In your Azure App Service, create a feature flag for every scheduled job. This can be as simple as an app setting with a true/false value, or you can use Azure App Configuration. 
  2. Add code for the feature flag and make sure it has a default value if it isn't detected in the environment. 
  3. Add logging statements to report your runtime feature flag value. 
  4. Deploy the code to Azure App Service and make sure you can stop and start the scheduled job with the feature flag before continuing. 

string FeatureFlagFindRetweets = Environment.GetEnvironmentVariable("FeatureFlagScheduledJobFindRetweets");
if (!String.IsNullOrEmpty(FeatureFlagFindRetweets) && FeatureFlagFindRetweets.ToLower() == "false")
{
    log.WriteLine($"FindRetweet executed at: {DateTime.Now} feature flag disabled");
    return;
}
log.WriteLine($"FindRetweet executed at: {DateTime.Now} feature flag enabled");

New Azure Function: Timer trigger

Create a new Azure Function app to replace the scheduled job and add the same feature flags.

  1. Create an Azure Function app locally with Visual Studio 2019 and the latest .NET Core you intend to support.
  2. Create a timer trigger and add the same code inside the function as your original scheduled job including the logging.  
  3. Add the same feature flags you use in the original scheduled job. While Azure Functions has an easy way to disable a function, a feature flag allows for greater flexibility in the future. 
  4. Add any dependencies the original project used with NuGet Manager. 
  5. Build your function. 
  6. You probably have a few build errors because the .NET versions are different and there may be breaking changes. These are usually minor issues that take a little investigation. Common issues I've come across are:
    • HTTP client library changes - used to integrate with other APIs
    • JSON (de)serialization library changed from Newtonsoft.Json to System.Text.Json
    • Entity Framework changes - or whatever database library you use. 
    • Authentication library changes
  7. Fix your build issues but don't change any of the logic of the code yet. This is a straight migration. 
  8. Once your project builds, migrate your tests and perform the same dependency migrations. 
  9. Run your tests and verify your new timer trigger logic still succeeds.
  10. Deploy your timer trigger with your existing deployment infrastructure making sure the feature flags are not enabled. 
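Put together, the timer trigger shell looks something like this sketch (the schedule is an assumption; the function and feature-flag names are carried over from the example above; adjust both to your job):

```csharp
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class FindRetweets
{
    // Runs at the top of every hour; match this CRON expression to the
    // original Azure Scheduler schedule.
    [FunctionName("FindRetweets")]
    public static void Run([TimerTrigger("0 0 * * * *")] TimerInfo myTimer, ILogger log)
    {
        string flag = Environment.GetEnvironmentVariable("FeatureFlagScheduledJobFindRetweets");
        if (!String.IsNullOrEmpty(flag) && flag.ToLower() == "false")
        {
            log.LogInformation($"FindRetweet executed at: {DateTime.Now} feature flag disabled");
            return;
        }

        log.LogInformation($"FindRetweet executed at: {DateTime.Now} feature flag enabled");
        // Original scheduled job logic goes here, unchanged.
    }
}
```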

Disable Azure Schedule job, enable Azure Function Timer trigger

Azure App Service and Azure Functions both allow you to control Configuration Settings through a variety of methods, including the Azure CLI, PowerShell, the Azure SDKs, and the Azure portal. 
  1. Determine how you want to automatically deploy configuration changes. For this initial switch over, in a low priority job, you could change these feature flags manually in the Azure portal. For more critical jobs, you should automate the task and include it as part of your deployment pipeline. 
  2. Disable the original scheduler job and verify with logging that the job executed but didn't continue past the feature flag check. 
  3. Enable the timer trigger and verify with logging that the job executed and did continue past the feature flag check. 
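For the automated route, the flag flip can be scripted with the Azure CLI. This is a sketch; the resource group and app names are placeholders:

```
# Disable the original job in the App Service
az webapp config appsettings set \
  --resource-group my-rg --name my-app-service \
  --settings FeatureFlagScheduledJobFindRetweets=false

# Enable the replacement timer trigger in the Function app
az functionapp config appsettings set \
  --resource-group my-rg --name my-function-app \
  --settings FeatureFlagScheduledJobFindRetweets=true
```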

Have a better method? 

Let me know @dfberry

Saturday, November 13, 2021

Entity Framework 6 won't save (update) an entity returned from Stored Procedure


The issue was that code that previously worked (update a property, then save) wasn't working any more. The application is a .NET 4.6 Web API project using Entity Framework 6. 


The fix wasn't in the code that saved (updated) the entity. The problem was that the entity being updated wasn't correctly formed by the stored procedure. Before the SP was added, the entity was returned from EF directly.  


After the stored procedure completes, refetch the entity using straight Entity Framework code such as `await TwitterAccounts.FirstAsync(table => table.TwitterHandle == twitterHandle);` where TwitterAccounts is a table in the EF Context. 

Step to debug issue: capture save result

The original EF code to save didn't capture the int returned from SaveChanges, the number of rows changed. 

Before:

account.IsProperty = true;
_context.SaveChanges();
return Ok();

After:

account.IsProperty = true;
int result = _context.SaveChanges();
if (result == 0)
    throw new Exception("property was not updated");
return Ok(account);

Step to debug issue: see T-SQL from Entity Framework in Visual Studio Output window

Once the row count was returned as zero, the issue was why? To see the actual T-SQL sent to the database, I added the following code to the BotContext class, which inherits from DbContext. The constructor now includes:

Database.Log = s => System.Diagnostics.Debug.WriteLine(s);

This shows all T-SQL from Entity Framework in the output window. This showed that EF wasn't producing any T-SQL which meant there was either an issue with the variables (those hadn't changed) or there were no changes detected in the entity. 
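In context, the constructor looks roughly like this (a sketch; the connection string name and the DbSet are placeholders for illustration):

```csharp
using System.Data.Entity;

public class BotContext : DbContext
{
    public BotContext() : base("name=BotContext")
    {
        // Echo every T-SQL statement EF generates to the Output window.
        Database.Log = s => System.Diagnostics.Debug.WriteLine(s);
    }

    public DbSet<TwitterAccount> TwitterAccounts { get; set; }
}
```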

Step to debug issue: see the EF changes detected

To see the changes detected, I added the following code to my BotContext class:

        public void DisplayTrackedEntities(DbChangeTracker changeTracker)
        {
            var entries = changeTracker.Entries();
            foreach (var entry in entries)
            {
                System.Diagnostics.Debug.WriteLine("Entity Name: {0}", entry.Entity.GetType().FullName);
                System.Diagnostics.Debug.WriteLine("Status: {0}", entry.State);
            }
        }

Then to use the code, call this method after the entity is changed but before it is saved:

account.IsProperty = false;
DisplayTrackedEntities(_context.ChangeTracker);
int result = _context.SaveChanges();

if (result == 0)
    throw new Exception("disable-auto-tweets not updated successfully");
return Ok(account);

At this point, because no changes were detected, I knew the recent switch to a SP caused the issue that changes couldn't be detected on this artificial entity. 

Step to debug issue: get EF entity after SP update

Originally the code checked a value and returned the account using something like:

await TwitterAccounts.FirstAsync(table => table.TwitterHandle == twitterHandle);

That method changed to use a stored procedure which returned the TwitterAccount entity type, and that entity wasn't getting updated. Changing the method to fetch and return the EF entity after the SP update fixed the issue: the account was updated. 
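The working shape of the method is roughly this sketch (the stored procedure name is hypothetical; the refetch line is the one from above):

```csharp
// Run the stored procedure, then refetch with plain EF so the
// returned entity is tracked by the change tracker.
await _context.Database.ExecuteSqlCommandAsync(
    "EXEC UpdateTwitterAccount @handle",
    new SqlParameter("@handle", twitterHandle));

var account = await _context.TwitterAccounts
    .FirstAsync(table => table.TwitterHandle == twitterHandle);
return Ok(account);
```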

Have a better fix? 

There is probably some way to get EF to update my entity returned from the SP. Do you know what it is? Let me know on Twitter.



Tuesday, November 2, 2021

Convert GoDaddy PEM certificate to PFX certificate to use in Azure App Service

When you purchase GoDaddy certificates, you should get three files:

  • *.crt file
  • *.pem file
  • *-key.txt file
1. Rename the key file to use the .key extension so any future tools can find it by its extension.

2. If you download and/or open the key file on a Windows computer, your key file may now have the wrong encoding. Use a bash terminal and the iconv CLI tool to convert to the correct encoding in a new file with a "2" added at the end of the filename to indicate the different file:

iconv -c -f UTF-8 -t ASCII your-domain-name.key > your-domain-name.2.key

3. Convert to the PFX format with the openssl CLI tool:

openssl pkcs12 -export -out your-domain-name.pfx -inkey your-domain-name.2.key -in your-domain-name.crt

4. You need to enter a new password when creating the PFX. Remember the password, you will need it when you add your certificate to Azure App Service.  
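If you want to sanity-check the openssl steps before touching the real certificate, you can dry-run the same pkcs12 export with a throwaway self-signed cert (all file names and the password here are placeholders):

```shell
# Create a throwaway key and self-signed cert
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout test.key -out test.crt -subj "/CN=example.test"

# Same export as step 3, with the password supplied inline
openssl pkcs12 -export -out test.pfx -inkey test.key -in test.crt \
  -passout pass:changeit

# Verify the PFX is readable with that password
openssl pkcs12 -info -in test.pfx -passin pass:changeit -noout && echo "PFX OK"
```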

5. In the Azure portal, for your App Service, select TLS/SSL settings. 

6. Add a new TLS/SSL settings. 

7. Select your TLS/SSL settings:

  • Your new local file system's PFX file
  • Your password from step 4
  • Select the thumbprint and TLS/SSL type - there should be one choice in the drop-down box
8. Select Add Binding

9. Restart your App Service.

10. On the Azure portal's Overview for your App Service, select your URL such as https://YOUR-DOMAIN. 

11. The browser may take a few seconds to reload. Notice the browser indicates your website is secure with the typical lock icon. 


Oct 2021 - Copy an Azure SQL Database to dev computer

There are several blog posts on this site answering how to copy an Azure SQL database. Since they are older or in some cases lost (Wayne's Microsoft blog posts are gone), I'll link to the Microsoft docs. 

Enterprise data or functionality?

Don't use these processes if you need change detection or other enterprise-level requirements. Copy the Azure SQL database and keep it in the cloud.

Make a copy of your Azure Cloud database

To copy an existing Azure SQL Database in the Azure portal, you can copy or export. Generally, I choose to export to Azure Storage Blob container as a bacpac file. This allows me to use it as either a cloud or local database.

The bacpac file includes data and schema.

Watch for export completion

To watch for export completion, go to the Azure SQL server (not the database) in the portal and use Import/Export History in the Data Management section.

Download bacpac file from Azure Storage

To download the bacpac file, open the Azure Storage container in the Azure portal. Select the bacpac file, then download it to your local computer.

Restore SQL database to local machine from bacpac file

Using SSMS, import a data-tier application. Select the downloaded bacpac file. 
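If you prefer the command line over the SSMS wizard, SqlPackage can do the same import (a sketch; the server, database, and file names are placeholders):

```
SqlPackage /Action:Import ^
  /SourceFile:"C:\temp\mydatabase.bacpac" ^
  /TargetServerName:"(localdb)\MSSQLLocalDB" ^
  /TargetDatabaseName:"mydatabase"
```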

Because the Azure SQL database is a contained database, you need to change the master database on your local server with the following T-SQL executed against the master database.

exec sp_configure 'contained database authentication', 1
reconfigure




Monday, October 25, 2021

Azure Functions + System.Text.JSON Deserialize exception

If you run into a System.IO.FileNotFoundException exception from the Azure Function when it is deserializing JSON with System.Text.Json to your class, make sure you are not using System.Text.Json 5.0.2. 


FileNotFoundException: Could not load file or assembly 'System.Text.Encodings.Web, Version=, Culture=neutral


Downgrade to System.Text.Json 5.0.1. 
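The downgrade is a one-line change in the class library's project file (a sketch of the relevant fragment):

```xml
<ItemGroup>
  <!-- Pinned to 5.0.1; 5.0.2 triggered the FileNotFoundException in the Azure Function -->
  <PackageReference Include="System.Text.Json" Version="5.0.1" />
</ItemGroup>
```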

Project description

The issue was found while moving code from a working .NET Framework 4.6.1 web api project into a .NET Core 3.1 class library, which is called from an Azure Function 3. The deserialization happens in the class library. The manual and automated testing described in the post were both local to my development machine, not on the Azure Cloud.

Initially, I tested manually through the Azure Function. Later, as I began to suspect the JSON returned, I added automated programmatic tests that call into the class library from a separate test project. The test project worked, so the issue had to be the Azure Function runtime or its dependencies.  

Before unit tests

Before the unit tests, I knew the FileNotFound exception was a symptom, but I assumed the real problem was the JSON being deserialized. The code went straight from the HTTP response content to a stream wrapped in the deserialization call, so it threw immediately, without letting me see, in the Visual Studio 2019 debugger, what JSON was returned. 

Adding unit tests

I broke apart the response from the deserialization. That allowed me to see the JSON and create unit tests. When I realized the JSON was deserializable from unit tests, I thought it was something about how the debugger was compensating for the text: there could be some difference between the raw text in the unit test and the text processed by the Azure Function. Adding Debug.Write statements to both the manual and automated test runs showed that the text was the same and that the problem was still with deserialization. 

At this point, I assumed it was a runtime issue but thought it was something about the HTTP call and how it was creating the JSON.

Offload deserialization to a queue

As a bonus, breaking up the JSON fetch from the deserialization allows the system to offload the deserialization and downstream tasks to a queue. In that architecture, where an Azure Function timeout isn't at risk for the caller, I could add as much data cleaning as needed in future sprints. 

The runtime problem

Now that tests could deserialize, the hunt was on for the runtime problem. The JSON deserialization worked in .NET 4.6.1 Framework with Newtonsoft deserialization but not the Azure Function with System.Text.Json. At this point, searching the Internet led to several issues about System.Text.Json in Azure Functions. Downgrading the library to 5.0.1 was the listed fix.  

Other problems

Before the unit tests, other things tried:

* Downloaded and setup the Fusion logger to look for mismatched dependencies - nothing turned up with that. 

* Tried to add a lock around the code to see into the function call, because it was in an async loop. The thought was this was some missed async handling somewhere that was showing up here. The lock in .NET Core was cumbersome and unfamiliar and wasn't blocking when I wanted it to so I stopped going down that rabbit hole. 


Sunday, August 22, 2021

Create Azure DevOps Pipeline for React App deployment to Azure app service

Disclaimer: All opinions in this post are mine and not my employer's (Microsoft). If you know of a more correct or performant way to accomplish the work discussed in this post, please let me know. Many projects I work on are already in progress by the time I join them. I can't change previous design or architecture choices; I just solve a specific technical issue. 

Secrets stored in Azure Key Vault

All secrets for the React app are stored in Azure Key Vault. These secrets need to be pulled from Key Vault and set into the environment so that the `npm build` script uses those values. In order for the Azure DevOps Pipeline to connect to Azure Key Vault, you need to complete some work before you develop your Pipeline:

  • Create a Key Vault and store your React build secrets, such as an Azure Functions key used to authenticate and use the Function. Your secret doesn't have to have the same name as your React build variable, and you shouldn't try to match them: Key Vault secret names don't allow underscores anyway. Just give the secret a human-readable name; the pipeline can map between the secret name and the build environment variable name.
  • Create a connection from Pipelines to Key Vault - In Azure DevOps, in the project settings, add a service connection to Azure Key Vault. This process creates a service principal. You can find all your service principals in Azure Portal, under the Azure Active Directory section. Service Principals are part of Azure App Registrations. 
  • Set up Azure Key Vault access policies - In the Azure portal, find your Key Vault, and add the Pipeline's Service Principal for `list` and `get`. 
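The access-policy step can be scripted as well (a sketch; the vault name and service principal ID are placeholders):

```
az keyvault set-policy --name my-key-vault \
  --spn 00000000-0000-0000-0000-000000000000 \
  --secret-permissions get list
```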

Create an Azure DevOps Pipeline to build React app

The React app is deployed to an Azure app service wrapped in a .NET Core application. The vmImage is `windows-latest`. 

The YAML file is:

# ASP.NET Core (.NET Framework)
# Build and test ASP.NET Core projects targeting the full .NET Framework.
# Add steps that publish symbols, save build artifacts, and more:

# Variables used in this pipeline
# The `dinatrue` variable is a test used in the echo command to understand
# the syntax for bringing in variables.
variables:
  buildPlatform: 'Any CPU'
  dinatrue: 'hello dina'
  buildDate: $[format('{0:yyyy}{0:MM}{0:dd}', pipeline.startTime)]

# Get React app secrets from Key Vault
# Key Vault is used for projects beyond this single client app. If the Key Vault
# were only for this client app, change the `SecretsFilter` value to `*`.
    azureSubscription: 'SQL Projects (98c...)'

# Verify/debug variables
    script: 'echo %dinatrue% %buildDate%'

# Set Key Vault secrets to React app environment variables
# Verify/debug values with SET command
# secrets' values will be `***` on purpose
    script: 'SET > env.log && cat env.log'

# Get source code from repo

# Build project, which ultimately builds React app into `$(build.artifactStagingDirectory)\`
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:DesktopBuildPackageLocation="$(build.artifactStagingDirectory)\" /p:DeployIisAppPath="Default Web Site"'

# Deploy Zip file to Azure app service, slot named `client-stage` using webDeploy
    azureSubscription: 'SQL Projects (98c...)'
Friday, August 6, 2021

Moving database from Azure SQL to localdb

I recently moved an Azure SQL database back to a local development database and ran into a few issues. I took these notes so that they might help the next person that hits the problem.

In my local SSMS, I use the Import/Export wizard with the datasource using SQL Server native client 11.0. This moves the tables and data. Any destination tables will not have IDENTITY as the source tables did. 

Solution # 1 

Move away from INT IDENTITY to use GUIDs. This requires work in the database and client code but is the better choice if you need to move data out of the source database then back into the source database. 

Solution #2

A more immediate fix gets past the blocker that your inserts don't autoincrement. 


The following steps are completed in SSMS in the destination (local) database, which should have all the data but not the IDENTITY column.

  • Rename mytable to mytable2.
  • Generate CREATE, INSERT, and SELECT scripts for the table.
  • Modify the CREATE script to use the table name (change mytable2 to mytable) and change the PK row to include the identity requirement.

CREATE TABLE [dbo].[mytable](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [Text] [nvarchar](40) NULL
)

  • Run the creation script.
  • Create a second script from the INSERT/SELECT scripts.

-- To keep the original Id values, temporarily allow identity inserts
SET IDENTITY_INSERT [dbo].[mytable] ON

INSERT INTO [dbo].[mytable] ([Id], [Text])
  SELECT [Id], [Text]
  FROM [dbo].[mytable2]

SET IDENTITY_INSERT [dbo].[mytable] OFF