Sunday, August 22, 2021

Create Azure DevOps Pipeline for React App deployment to Azure app service

 Disclaimer: All opinions in this post are mine and not my employer's (Microsoft). If you know of a more correct or performant way to accomplish the work discussed in this post, please let me know at javascript-developer@outlook.com. Many projects are already in progress by the time I work on them; I can't change previous design or architecture choices, only solve a specific technical issue. 

Secrets stored in Azure Key Vault

All secrets for the React app are stored in Azure Key Vault. These secrets need to be pulled from Key Vault and set into the environment so that the `npm build` script uses those values. In order for the Azure DevOps Pipeline to connect to Azure Key Vault, you need to complete some work before you develop your Pipeline:

  • Create a Key Vault and store your React build secrets, such as an Azure Functions key used to authenticate to and call the Function. Your secret doesn't have to have the same name as your React build variable - don't try to force it; Key Vault secret names don't allow underscores anyway. Give the secret a human-readable name and let the pipeline map between the secret name and the build environment variable name (see the sketch after this list).
  • Create a connection from Pipelines to Key Vault - In Azure DevOps, in the project settings, add a service connection to Azure Key Vault. This process creates a service principal. You can find all your service principals in the Azure portal, under the Azure Active Directory section; service principals are part of Azure App registrations. 
  • Set up Azure Key Vault access policies - In the Azure portal, find your Key Vault and add an access policy that grants the pipeline's service principal the `get` and `list` secret permissions. 
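
For context, here's a minimal sketch (TypeScript) of how the React app consumes those mapped environment variables at build time. Create React App inlines any `process.env.REACT_APP_*` value when `npm build` runs. The variable names match the pipeline's env block below; the `getServerMessage` function and its endpoint path are hypothetical.

// Minimal sketch: build-time environment variables as the React app sees them.
// The names match the pipeline's env block; the endpoint path is hypothetical.
const apiBaseUrl = process.env.REACT_APP_APP_SERVER_BASE_URL_PUBLIC_API;
const messageBeforeLoginKey =
  process.env.REACT_APP_APP_SERVER_BASE_URL_PUBLIC_API_APP_MESSAGE_BEFORE_LOGIN_KEY;
const cacheBust = process.env.REACT_APP_CACHE_BUST; // buildDate from the pipeline

// Hypothetical call to the Azure Function, passing the function key pulled from Key Vault.
export async function getServerMessage(): Promise<string> {
  const response = await fetch(
    `${apiBaseUrl}/api/MessageBeforeLogin?code=${messageBeforeLoginKey}&v=${cacheBust}`
  );
  return response.text();
}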

Create an Azure DevOps Pipeline to build React app

The React app is wrapped in a .NET Core application and deployed to an Azure App Service. The pipeline's vmImage is `windows-latest`. 

The YAML file is:


# ASP.NET Core (.NET Framework)
# Build and test ASP.NET Core projects targeting the full .NET Framework.
# Add steps that publish symbols, save build artifacts, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/dotnet-core

trigger:
- main

pool:
  vmImage: 'windows-latest'

# Variables used in this pipeline
# The `dinatrue` variable is a test used in the echo command to understand the syntax for
# bringing in variables.
variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'
  dinatrue: 'hello dina'
  buildDate: $[format('{0:yyyy}{0:MM}{0:dd}', pipeline.startTime)]


steps:

# Get React app secrets from Key Vault.
# The Key Vault is used for projects beyond this single client app. If the Key Vault
# were only for this client app, change the `SecretsFilter` value to `*`.
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'SQL Projects (98c...)'
    KeyVaultName: 'MSTwitterKeyVault'
    SecretsFilter: 'PublicApiMessageBeforeLoginKey,PublicApiGetUtcNowKey,PublicApiUrl,ServerUri'
    RunAsPreJob: false

# Verify/debug variables
- task: CmdLine@2
  inputs:
    script: 'echo %dinatrue% %buildDate%'

# Map the Key Vault secrets to React app environment variables.
# Verify/debug the values with the SET command; the secrets' values show as `***` on purpose.
- task: CmdLine@2
  inputs:
    script: 'SET > env.log && cat env.log'
  env:
    REACT_APP_APP_SERVER_BASE_URL_PUBLIC_API_APP_MESSAGE_BEFORE_LOGIN_KEY: $(PublicApiMessageBeforeLoginKey)
    REACT_APP_APP_SERVER_BASE_URL_PUBLIC_API_GET_UTC_NOW_KEY: $(PublicApiGetUtcNowKey)
    REACT_APP_APP_SERVER_BASE_URL_PUBLIC_API: $(PublicApiUrl)
    REACT_APP_SERVER_URL: $(ServerUri)
    REACT_APP_CACHE_BUST: $(buildDate)

# Restore NuGet packages for the solution
- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

# Build the solution, which ultimately builds the React app into `$(build.artifactStagingDirectory)\WebApp.zip`
- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:DesktopBuildPackageLocation="$(build.artifactStagingDirectory)\WebApp.zip" /p:DeployIisAppPath="Default Web Site"'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

# Deploy the zip file to the Azure app service slot named `client-stage` using Web Deploy
- task: AzureRmWebAppDeployment@4
  inputs:
    ConnectionType: 'AzureRM'
    azureSubscription: 'SQL Projects (98c...)'
    appType: 'webApp'
    WebAppName: 'MSTwitterApp'
    deployToSlotOrASE: true
    ResourceGroupName: 'MSTwitterBot'
    SlotName: 'client-stage'
    packageForLinux: '$(build.artifactStagingDirectory)\WebApp.zip'
    enableCustomDeployment: true
    DeploymentType: 'webDeploy'


Friday, August 6, 2021

Moving database from Azure SQL to localdb

I recently moved an Azure SQL database back to a local development database and ran into a few issues. I took these notes so that they might help the next person who hits the same problems.

In my local SSMS, I use the Import/Export wizard with the data source set to SQL Server Native Client 11.0. This moves the tables and data, but the destination tables will not have IDENTITY columns as the source tables did. 

Solution #1

Move away from INT IDENTITY columns to GUIDs. This requires work in the database and the client code, but it is the better choice if you need to move data out of the source database and then back into it. 

Solution #2

A more immediate fix to get past the blocking issue that your inserts don't auto-increment. 

Steps:

The following steps are completed in SSMS against the destination (local) database, which has all the data but not the IDENTITY property.

  • Rename mytable to mytable2.
  • Generate CREATE, INSERT, and SELECT scripts for the table.
  • Modify the CREATE script to use the original table name (change mytable2 to mytable) and change the primary key column to include the IDENTITY property.

CREATE TABLE [dbo].[mytable](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [Text] [nvarchar](40) NULL
) ON [PRIMARY]
GO

  • Run the CREATE script.
  • Create a second script from the INSERT and SELECT scripts, wrapped in IDENTITY_INSERT:

SET IDENTITY_INSERT mytable ON
INSERT INTO [dbo].[mytable]
           ([Id]
           ,[Text])
SELECT [Id]
      ,[Text]
  FROM [dbo].[mytable2]     
SET IDENTITY_INSERT mytable OFF


Thursday, December 10, 2020

Node.js, npm, yarn, & those lock files

The following is my opinion and doesn't represent my employer. 

Install Node.js

If you are like me and work across operating systems and environments (local, container, cloud), you want one or two bullet-proof ways to install Node.js. The Node.js organization has done a great job of listing those. 

I'm sticking with the following:

* For Windows installations - use the Windows download
* For all other installations, including VM, Container, Mac, *Nix - use the bash script

Never, under any circumstances, use the apt or apt-get package manager to install Node.js. At this point, that is equivalent to a code-smell. 

NPM vs YARN

NPM should be your package manager of choice when installing an NPM package. If you run into problems, log an issue.

Debugging Node.js projects with NVM and lock files

The Node Version Manager (nvm) and lock files exist so the developer of a project can get back to an exact version of the entire development environment to debug an issue. When you install and use someone's package, you don't need their lock file. NPM doesn't install dependencies for a package based on that package's lock file. 

Check in the lock file so you can get back to a specific development environment when you need to. 



Friday, March 6, 2020

Hosting front-end applications on Azure

I have a catalog of small front-end JavaScript apps, built with Angular (ng) and React (CRA and Next), that I would like to host on Azure.

Hosting on Azure

Azure provides features for a web host:

* Options from big to small - full VM, container, web app, or host static website
* HTTPS and URL - every Azure resource is served via HTTPS with a dedicated URL - no need to buy a certificate or domain name until the project requires one.

No Configuration changes

In order to reduce hidden problems, each front-end app should install and run with no configuration changes from the development environment, such as changing asset routing from absolute to relative paths, or renaming folders and subfolders.

Consistent & immediate deployment

Once the app is ready to deploy as a collection of static files, there shouldn't be a need to spend time preparing the hosting environment. The front-end system deployment steps should be consistent regardless of which front-end framework the project uses. Deployment failure and debugging should be minimized considering it is just a static front-end app.

Reverse proxy considerations

One solution I've used in previous years requires a reverse proxy like Nginx and control of a domain name. This violates the configuration and immediate-deployment goals, but it is a strong consideration when the project is ready for production deployment - the cost and control of an Azure VM make that an easy choice.

Azure Storage versus Azure Web app

This narrowed the choices to:
* Azure Blob Storage static website - would only handle front-end apps completely disconnected from the back-end
* Azure Web app allows both front and back-end apps.

Both are easy to configure with minimal changes in the Azure portal.

Pricing and availability

* Azure Blob Storage is priced by data storage and throughput
* Azure Web app comes with a monthly charge

Since these are very small hobby apps, Azure Blob Storage seems the best choice for now.

Preparing a test app

To test Azure Blob Storage, use the Next.js static export process to build a static app into a local `out` directory in the project folder.


How to deploy static files 

After creating an Azure Storage resource, in the Azure portal, configure the Static website. This creates a Blob container named $web in the Storage account.


Copy one of the endpoints to use later to test that the static website works.


Upload files to the $web container. 

Download and use Azure Storage Explorer to move the front-end website files and folders to the new $web container. 
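
If you'd rather script the upload than use Storage Explorer, here is a rough TypeScript sketch using the @azure/storage-blob package. The connection string environment variable, the content-type map, and the local `out` folder are assumptions.

// Rough sketch: upload an exported static site to the $web container.
// The connection string variable name and ./out folder are assumptions.
import { BlobServiceClient, ContainerClient } from "@azure/storage-blob";
import { promises as fs } from "fs";
import * as path from "path";

// Small content-type map so the browser renders files instead of downloading them.
const contentTypes: Record<string, string> = {
  ".html": "text/html", ".js": "application/javascript", ".css": "text/css",
  ".json": "application/json", ".png": "image/png", ".svg": "image/svg+xml",
};

async function uploadFolder(container: ContainerClient, localFolder: string, prefix = ""): Promise<void> {
  for (const entry of await fs.readdir(localFolder, { withFileTypes: true })) {
    const fullPath = path.join(localFolder, entry.name);
    const blobName = prefix ? `${prefix}/${entry.name}` : entry.name;
    if (entry.isDirectory()) {
      await uploadFolder(container, fullPath, blobName);
    } else {
      const contentType = contentTypes[path.extname(entry.name)] ?? "application/octet-stream";
      await container.getBlockBlobClient(blobName).uploadFile(fullPath, {
        blobHTTPHeaders: { blobContentType: contentType },
      });
    }
  }
}

const service = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING as string);
uploadFolder(service.getContainerClient("$web"), "./out").catch(console.error);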
 

Test the static website

Use the URL endpoint provided (when you configured the static website) to verify your website works.

View usage metrics

Azure provides metrics to see how the website is doing.

 


Hosting more sites

Azure Storage builds out containers with URL routes: domain / container / file.  The route includes the container name and doesn't, by default, make assumptions about the content type to return when serving the file.

In order to host more than one static website from the same Azure Storage resource, all of the configuration I'm trying to avoid would need to be done for each site, and I would need to manage content types.

Creating a new Storage account is easy, cheap, and simple, so each new site will be hosted in its own Azure Storage resource. When it is time to assign a custom domain name, create a CDN and then map the name to that resource.

CLI to create static website

Now that basic usage through the portal is done, use the Azure CLI to create Azure Storage and configure a static website.

Further investigation

The solution needs to solve for management tasks such as listing all static websites and updating them via a GitHub hook. Building that automation is a next step; a rough sketch follows.
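
Here is a rough TypeScript sketch of that automation, shelling out to the Azure CLI. The resource group, account name, and location are placeholders.

// Rough automation sketch: create a storage account, enable its static website,
// and upload the exported site by wrapping the Azure CLI. Names are placeholders.
import { execSync } from "child_process";

function az(command: string): string {
  return execSync(`az ${command}`, { encoding: "utf8" });
}

const resourceGroup = "my-static-sites-rg"; // placeholder
const accountName = "mystaticsite001";      // placeholder, must be globally unique
const location = "westus2";                 // placeholder

az(`storage account create --name ${accountName} --resource-group ${resourceGroup} --location ${location} --sku Standard_LRS --kind StorageV2`);
az(`storage blob service-properties update --account-name ${accountName} --static-website --index-document index.html --404-document 404.html`);
az(`storage blob upload-batch --account-name ${accountName} --source ./out --destination '$web'`);

// Print the static website endpoint so it can be tested in a browser.
console.log(az(`storage account show --name ${accountName} --resource-group ${resourceGroup} --query primaryEndpoints.web --output tsv`));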





Friday, February 28, 2020

OSS @Microsoft - Docs

Open source software (OSS) at Microsoft allows everyone to see, comment, and contribute to the products, services, documentation, sample code, and SDKs they use. I work at Microsoft and my opinions are my own.

The Microsoft documentation set is not a single GitHub repository. It is many repositories, all with active writers, support engineers, and SLAs.

When you see an issue in the docs or you think a concept or technique is unclear, GitHub repositories allow you to:

  • Add an issue, via an Edit button, that is sent directly to the Product group or Content Developer. 
  • Create a pull request (PR), that is sent directly to the Product group or Content Developer. 


Is this open source though? Absolutely. You can immediately impact all users of the docs in a positive way.

Two things you can do when logging an issue against the docs:

  1. Be specific. Your comment is attached to the entire doc, so point out where in the doc the issue is and which wording, image, or sample code is problematic. 
  2. Be available. Sometimes an issue is deeper than a simple fix. If someone has a follow-up question, respond with information if you have it. 


Can you make the fix via a pull request? Yes. That speeds up the process.


Wednesday, May 1, 2019

Fetching a Private Key From An Azure Key Vault Certificate

If you create a private certificate in Azure Key Vault and use the fancy features like auto-rotation, you might like to be able to fetch the private key from the vault and rehydrate it as an X509Certificate2 instance in your C# code.

Here is how you do that:

// Authenticate to Key Vault; GetToken is the callback described below.
KeyVaultClient keyVaultClient = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(GetToken));

// Get the certificate metadata, then the secret that holds the full PFX.
var certificateBundle = await keyVaultClient.GetCertificateAsync(certificateIdentifier);
var certificateSecret = await keyVaultClient.GetSecretAsync(certificateBundle.SecretIdentifier.Identifier);

// The secret value is a base64-encoded PFX with a blank password.
byte[] certificateDecoded = Convert.FromBase64String(certificateSecret.Value);
var certificate = new X509Certificate2(certificateDecoded, password: "");

The certificate bundle passed back from the GetCertificateAsync call has a .Cer property; however, that is just the bytes for the public key. If you do this:

var publicCertificate = new X509Certificate2(certificateBundle.Cer);

The X509Certificate2 instance will only contain the public key. Instead, you need to fetch the full secret and decode it to bytes. Once you do that, the only other thing you need to know is that Azure Key Vault stores the private certificate with a blank password.
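
If you need the same private key from Node.js, here is a rough TypeScript sketch using the newer @azure/identity and @azure/keyvault-secrets packages, assuming the certificate uses the default PKCS#12 content type. The vault URL and certificate name are placeholders.

// Rough Node.js equivalent using the newer SDKs; vault URL and certificate
// name are placeholders. The certificate's private key lives in the secret
// that shares the certificate's name; the value is a base64-encoded PFX.
import { DefaultAzureCredential } from "@azure/identity";
import { SecretClient } from "@azure/keyvault-secrets";

async function getPfx(): Promise<Buffer> {
  const credential = new DefaultAzureCredential();
  const client = new SecretClient("https://my-vault.vault.azure.net", credential);

  // Requires the secrets `get` access policy, just like the C# version.
  const secret = await client.getSecret("my-certificate-name");
  return Buffer.from(secret.value ?? "", "base64"); // PFX bytes, blank password
}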

GetToken is a method, not shown here, that the KeyVaultClient uses to fetch the authentication token used to access both the certificate and the secret. Note that the caller needs both the secrets `get` and the certificates `get` access policies set in the portal for the Azure Key Vault.  

For more information about how GetToken works, see https://docs.microsoft.com/en-us/azure/key-vault/key-vault-developers-guide



Sunday, June 24, 2018

Regex to search VSCode for Azure subscription keys

Before you check in code, make sure the Azure subscription keys are removed or replaced with obvious markers.

In VSCode, select the magnifying glass, then select the last icon on the search line, ".*", which indicates the search uses a regular expression. Enter [a-z0-9]{32} in the search text box and press Enter.

The search results appear below the search text box. Scan the results for any highlighted keys that are real key values.
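
To run the same check outside VSCode, here is a rough TypeScript sketch that scans a folder for 32-character lowercase hex-looking strings. The starting folder and extension filter are assumptions; tune them for your project.

// Rough sketch: scan source files for strings that look like Azure subscription
// keys (32 lowercase letters/digits). Folder and extension filter are assumptions.
import { promises as fs } from "fs";
import * as path from "path";

const keyPattern = /[a-z0-9]{32}/g;
const extensions = new Set([".js", ".ts", ".tsx", ".json"]);

async function scan(folder: string): Promise<void> {
  for (const entry of await fs.readdir(folder, { withFileTypes: true })) {
    const fullPath = path.join(folder, entry.name);
    if (entry.isDirectory() && entry.name !== "node_modules") {
      await scan(fullPath);
    } else if (extensions.has(path.extname(entry.name))) {
      const text = await fs.readFile(fullPath, "utf8");
      const matches = text.match(keyPattern);
      if (matches) {
        // Review each match; some may be harmless 32-character strings.
        console.warn(`${fullPath}: possible key(s) ${matches.join(", ")}`);
      }
    }
  }
}

scan(".").catch(console.error);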