Saturday, November 27, 2021

Migrate Azure Scheduler jobs to Azure Functions


Migrate a code-based Azure Scheduler job to Azure Functions


Recently, I migrated a scheduled job from .NET Framework 4.6 to .NET Core and from Azure App Service to Azure Functions. This was part of a larger effort to move to .NET Core, prompted by the retirement of Azure Scheduler.

While there is nothing wrong with .NET Framework 4.6 or Azure App Service, the project was moving toward separate, smaller, quicker deployments for the various projects instead of a single monolithic deployment.


Existing scheduled job: Add feature flags

Add the ability to control your job with a feature flag.
  1. In your Azure App Service, create a feature flag for every scheduled job. This can be as simple as an app setting with a value, or you can use Azure App Configuration. 
  2. Add code for the feature flag and make sure it has a default value if it isn't detected in the environment. 
  3. Add logging statements to report your runtime feature flag value. 
  4. Deploy the code to Azure App Service and make sure you can stop and start the scheduled job with the feature flag before continuing. 

string FeatureFlagFindRetweets = Environment.GetEnvironmentVariable("FeatureFlagScheduledJobFindRetweets");
if (!string.IsNullOrEmpty(FeatureFlagFindRetweets) && FeatureFlagFindRetweets.Equals("false", StringComparison.OrdinalIgnoreCase))
{
    log.WriteLine($"FindRetweet executed at: {DateTime.Now} feature flag disabled");
    return;
}
log.WriteLine($"FindRetweet executed at: {DateTime.Now} feature flag enabled");

New Azure Function: Timer trigger

Create a new Azure Function app to replace the scheduled job and add the same feature flags.

  1. Create an Azure Function app locally with Visual Studio 2022 and the latest .NET Core version you intend to support.
  2. Create a timer trigger and add the same code inside the function as your original scheduled job including the logging.  
  3. Add the same feature flags you use in the original scheduled job. While Azure Functions has an easy way to disable a function, a feature flag allows for greater flexibility in the future. 
  4. Add any dependencies the original project used with NuGet Manager. 
  5. Build your function. 
  6. You probably have a few build errors because the .NET versions are different and there may be breaking changes. These are usually minor issues that take a little investigation. Common issues I've come across are:
    • HTTP client library changes - used to integrate with other APIs
    • JSON (de)serialization library changed from Newtonsoft.Json to System.Text.Json
    • Entity Framework changes - or whatever database library you use. 
    • Authentication library changes
  7. Fix your build issues but don't change any of the logic of the code yet. This is a straight migration. 
  8. Once your project builds, migrate your tests and perform the same dependency migrations. 
  9. Run your tests and verify your new timer trigger logic still succeeds.
  10. Deploy your timer trigger with your existing deployment infrastructure, making sure the feature flags are not enabled. 
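Putting steps 2 and 3 together, a minimal timer trigger with the same feature flag check might look like the following sketch. The function name and CRON schedule are assumptions; match them to your original job.

```csharp
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class FindRetweetsFunction
{
    // Hypothetical schedule: runs at the top of every hour.
    // Adjust the CRON expression to match the original scheduled job.
    [FunctionName("FindRetweets")]
    public static void Run([TimerTrigger("0 0 * * * *")] TimerInfo timer, ILogger log)
    {
        // Same feature flag name as the original scheduled job.
        string featureFlag = Environment.GetEnvironmentVariable("FeatureFlagScheduledJobFindRetweets");
        if (!string.IsNullOrEmpty(featureFlag) && featureFlag.Equals("false", StringComparison.OrdinalIgnoreCase))
        {
            log.LogInformation($"FindRetweet executed at: {DateTime.Now} feature flag disabled");
            return;
        }
        log.LogInformation($"FindRetweet executed at: {DateTime.Now} feature flag enabled");

        // ... original job logic goes here, unchanged ...
    }
}
```

Note the flag defaults to enabled when the setting is missing, matching the original job's behavior.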

Disable Azure Schedule job, enable Azure Function Timer trigger

Azure App Service and Azure Functions both allow you to control the configuration settings through a variety of methods, including the Azure CLI, PowerShell, the Azure SDKs, and the Azure portal. 
  1. Determine how you want to automatically deploy configuration changes. For this initial switchover, in a low-priority job, you could change these feature flags manually in the Azure portal. For more critical jobs, you should automate the task and include it as part of your deployment pipeline. 
  2. Disable the original scheduler job and verify with logging that the job executed but didn't continue past the feature flag check. 
  3. Enable the timer trigger and verify with logging that the job executed and did continue past the feature flag check. 
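If you automate the switchover, the flags can be flipped with the Azure CLI. A sketch, where the resource group and app names (`my-rg`, `my-app-service`, `my-function-app`) are placeholders:

```shell
# Disable the job in the original App Service...
az webapp config appsettings set \
  --resource-group my-rg --name my-app-service \
  --settings FeatureFlagScheduledJobFindRetweets=false

# ...and enable it in the new Function app.
az functionapp config appsettings set \
  --resource-group my-rg --name my-function-app \
  --settings FeatureFlagScheduledJobFindRetweets=true
```

Changing an app setting restarts the app, so the new flag values take effect on the next scheduled run.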

Have a better method? 

Let me know @dfberry

Saturday, November 13, 2021

Entity Framework 6 won't save (update) an entity returned from Stored Procedure

Symptom

Code that previously worked (update a property, then save) stopped working. The application is a .NET 4.6 Web API project using Entity Framework 6. 

Issue

The fix isn't in the code that saves (updates) the entity. The cause is that the entity being updated isn't correctly formed by the stored procedure, so the EF context doesn't detect changes to it. Before the SP was added, the entity was returned from EF directly.  

Fix

After the stored procedure completes, refetch the entity using straight Entity Framework code such as `await TwitterAccounts.FirstAsync(table => table.TwitterHandle == twitterHandle);` where TwitterAccounts is a table in the EF Context. 
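As a sketch, the repository method ends up running the update through the stored procedure, then refetching through the context so a tracked entity is returned. The SP name and parameter below are hypothetical; the refetch line is the one from the post.

```csharp
using System.Data.SqlClient;
using System.Data.Entity;

// Run the update through the stored procedure
// (the SP name and parameter are hypothetical).
await _context.Database.ExecuteSqlCommandAsync(
    "EXEC UpdateTwitterAccount @handle",
    new SqlParameter("@handle", twitterHandle));

// Refetch with straight EF so the returned entity is
// tracked by the context and future saves work.
var account = await _context.TwitterAccounts
    .FirstAsync(table => table.TwitterHandle == twitterHandle);
return account;
```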

Step to debug issue: capture save result

The original EF code to save didn't capture the returned int of number of rows changed. 

Before

account.IsProperty = true;
_context.SaveChanges();
return Ok();

After

account.IsProperty = true;
int result = _context.SaveChanges();
if (result == 0)
{
    throw new Exception("property was not updated");
}
return Ok(account);

Step to debug issue: see T-SQL from Entity Framework in Visual Studio Output window

Once the row count came back as zero, the question was why. To see the actual T-SQL sent to the database, I added the following code to the BotContext class, which inherits from DbContext. The constructor now includes:

#if DEBUG
Database.Log = s => System.Diagnostics.Debug.WriteLine(s);
#endif

This shows all T-SQL from Entity Framework in the Output window. It showed that EF wasn't producing any T-SQL, which meant there was either an issue with the variables (those hadn't changed) or no changes were detected in the entity. 

Step to debug issue: see the EF changes detected

To see the changes detected, I added the following code to my BotContext class:

public void DisplayTrackedEntities(DbChangeTracker changeTracker)
{
    var entries = changeTracker.Entries();
    foreach (var entry in entries)
    {
        System.Diagnostics.Debug.WriteLine("Entity Name: {0}", entry.Entity.GetType().FullName);
        System.Diagnostics.Debug.WriteLine("Status: {0}", entry.State);
    }
}

Then to use the code, call this method after the entity is changed but before it is saved:

account.IsProperty = false;
DisplayTrackedEntities(context.ChangeTracker);
int result = context.SaveChanges();

if (result == 0)
{
    throw new Exception("disable-auto-tweets not updated successfully");
}
return Ok(account);

At this point, because no changes were detected, I knew the recent switch to a stored procedure caused the issue: changes couldn't be detected on this artificial entity. 

Step to debug issue: get EF entity after SP update

Originally the code checked a value and returned the account using something like:

await TwitterAccounts.FirstAsync(table => table.TwitterHandle == twitterHandle);

That method was changed to use a stored procedure which returned the TwitterAccount entity type. That entity wasn't getting updated. Changing the method to fetch and return the EF entity after the SP completed fixed the issue: the account was updated. 
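One possible alternative, which I haven't verified for this case: attach the SP-returned entity to the context and mark it modified, so EF generates the UPDATE without a refetch. A sketch using the standard EF6 change-tracking APIs:

```csharp
using System.Data.Entity;

// account came back from the stored procedure,
// so the context isn't tracking it. Attach it and
// mark it modified so SaveChanges emits an UPDATE.
_context.TwitterAccounts.Attach(account);
_context.Entry(account).State = EntityState.Modified;
int result = _context.SaveChanges();
```

Note that marking the whole entity Modified updates every mapped column, not just the changed one.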

Have a better fix? 

There is probably some way to get EF to update my entity returned from the SP. Do you know what it is? Let me know on Twitter.

@dfberry

 



Tuesday, November 2, 2021

Convert GoDaddy PEM certificate to PFX certificate to use in Azure App Service

When you purchase GoDaddy certificates, you should get 3 files:

  • *.crt file
  • *.pem file
  • *-key.txt file
1. Rename the key file to use the .key extension so any future tools can find it by its extension.

2. If you download and/or open the key file on a Windows computer, it may now have the wrong encoding. Use a bash terminal and the iconv CLI tool to convert it to the correct encoding, writing a new file with a "2" added to the filename to distinguish it:


iconv -c -f UTF-8 -t ASCII your-domain-name.key > your-domain-name.2.key



3. Convert to the PFX format with the openssl CLI tool:

openssl pkcs12 -export -out your-domain-name.pfx -inkey your-domain-name.2.key -in your-domain-name.crt

4. You need to enter a new password when creating the PFX. Remember the password; you will need it when you add your certificate to Azure App Service.  
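Before uploading, you can optionally confirm the PFX is readable and bundles both the key and certificate (openssl prompts for the password from step 4):

```shell
openssl pkcs12 -info -in your-domain-name.pfx -noout
```

If the password is correct and the file is well-formed, the command prints the bag summaries and exits without error.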

5. In the Azure portal, for your App Service, select TLS/SSL settings. 

6. Add a new TLS/SSL setting. 

7. Select your TLS/SSL settings:

  • Your new PFX file from the local file system
  • Your password from step 4
  • The thumbprint and TLS/SSL type - there should be one choice in the drop-down box
8. Select Add Binding

9. Restart your App Service.

10. On the Azure portal's Overview for your App Service, select your URL such as https://YOUR-DOMAIN. 

11. The browser may take a few seconds to reload. Notice the browser indicates your website is secure with the typical lock icon. 


@dfberry

Oct 2021 - Copy an Azure SQL Database to dev computer


There are several blog posts on this site answering how to copy an Azure SQL database. Since they are older or in some cases lost (Wayne's Microsoft blog posts are gone), I'll link to the Microsoft docs. 

Enterprise data or functionality?

Don't use these processes if you need change detection or other enterprise-level requirements. Copy the Azure SQL database and keep it in the cloud.

Make a copy of your Azure Cloud database

To copy an existing Azure SQL Database in the Azure portal, you can copy or export. Generally, I choose to export to Azure Storage Blob container as a bacpac file. This allows me to use it as either a cloud or local database.

The bacpac file includes data and schema.
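The export can also be scripted with the Azure CLI instead of the portal. A sketch, where every name, credential, and URI is a placeholder for your own resources:

```shell
az sql db export \
  --resource-group my-rg \
  --server my-sql-server \
  --name my-database \
  --admin-user sqladmin \
  --admin-password "$SQL_PASSWORD" \
  --storage-key "$STORAGE_KEY" \
  --storage-key-type StorageAccessKey \
  --storage-uri "https://mystorage.blob.core.windows.net/backups/my-database.bacpac"
```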

Watch for export completion

To watch for export completion, from the Azure SQL server (not the database) in the portal, use the Import/Export History in the Data Management section.

Download bacpac file from Azure Storage

To download the bacpac file, open the Azure Storage container in the Azure portal. Select the bacpac file, then download it to your local computer.

Restore SQL database to local machine from bacpac file

Using SSMS, import a data-tier application. Select the downloaded bacpac file. 
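If you prefer the command line over SSMS, the same import can be done with the SqlPackage tool. A sketch, where the file, server, and database names are placeholders:

```shell
SqlPackage /Action:Import \
  /SourceFile:"my-database.bacpac" \
  /TargetServerName:localhost \
  /TargetDatabaseName:MyLocalCopy \
  /TargetTrustServerCertificate:True
```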

Because the Azure SQL database is a contained database, you need to change the master database on your local server with the following T-SQL executed against the master database.

EXEC sp_configure 'contained database authentication', 1;
GO
RECONFIGURE;
GO


@dfberry