Cosmos Status Code 400 - SubStatus Code 1004

When you call Cosmos DB in Gateway mode and get an HTTP 400 status code, the Cosmos DB gateway is telling you the request contains invalid data or is missing required parameters. A sub-status code of 1004 means the query doesn't support cross-partition execution. Cosmos DB can spread partitions (as defined by the partition key) across multiple nodes in the region, but some queries can't be executed on separate nodes and then combined in the response. For example, queries with ORDER BY, TOP, DISTINCT, OFFSET, LIMIT, and GROUP BY require the data from those separate nodes to be evaluated together. To solve this issue, do the aggregating, ordering, and grouping on the client after the data is returned, or target just a single partition with queries that contain this syntax.
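The client-side approach can be sketched in Python. This is a minimal illustration of ordering and grouping documents after they come back from per-partition queries; the document shape and field names are hypothetical:

```python
from itertools import groupby

# Hypothetical documents returned by per-partition queries
# (the queries themselves contain no ORDER BY / GROUP BY).
docs = [
    {"category": "b", "price": 5},
    {"category": "a", "price": 3},
    {"category": "a", "price": 7},
]

# Order on the client instead of in the query.
docs.sort(key=lambda d: d["category"])

# Group and aggregate on the client (sum of price per category).
# groupby requires the input to be sorted by the grouping key.
totals = {
    category: sum(d["price"] for d in group)
    for category, group in groupby(docs, key=lambda d: d["category"])
}
print(totals)  # {'a': 10, 'b': 5}
```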

GraphQL on GitHub: a fragment for aggregated repo data

GitHub provides the GraphQL Explorer to play with GraphQL data and learn how to shape your queries. When your group of queries grows to the point of repeating objects and their fields, it's time to move to fragments. A fragment in GraphQL gives you:

- readability: a well-named fragment shortens queries and mutations
- reusability: reuse fragments across queries and mutations
- performance: on the client, fragments and their components are a cache layer
- type-safety: the code generator that builds your GraphQL SDK includes named fragments, so you can access deeply nested objects as fragments without the types and type guards you would otherwise need to manage

Typical places to create and use fragments to DRY up your GraphQL queries are the most common schema objects. For a GitHub GraphQL schema, those can include User and Repository. To get the entire list of repositories in a GitHub org, you also need to handle the cursor/paging as well as the returned results. An example
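A sketch of such a fragment against the GitHub GraphQL schema follows; the field selection is illustrative, not exhaustive:

```graphql
# Reusable selection of repository fields.
fragment RepoFields on Repository {
  name
  description
  stargazerCount
  updatedAt
}

# Paged query over an organization's repositories, reusing the fragment.
query OrgRepos($org: String!, $after: String) {
  organization(login: $org) {
    repositories(first: 100, after: $after) {
      pageInfo {
        hasNextPage
        endCursor
      }
      nodes {
        ...RepoFields
      }
    }
  }
}
```

To page through all repositories, run the query repeatedly, passing the previous response's `endCursor` as `$after` until `hasNextPage` is false.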

Migrate Azure Scheduler jobs to Azure Functions

Migrate a code-based Azure Scheduler job to Azure Functions. Recently, I migrated a scheduled job from .NET Framework 4.6 to .NET Core, and from Azure App Service to Azure Functions. This was part of a larger effort to move to .NET Core, prompted by the retirement of Azure Scheduler, which the jobs depended on. While there is nothing wrong with .NET Framework 4.6 or Azure App Service, the project was moving toward separate, smaller, quicker deployments for the various projects instead of a single monolithic deployment. Existing scheduled job: add feature flags. Add the ability to control your job with a feature flag. In your Azure App Service, create a feature flag for every scheduled job. This can be as simple as an application setting with a value, or you can use Azure App Configuration. Add code for the feature flag and make sure it has a default value if it isn't detected in the environment. Add logging statements to report your runtime feature flag value. Deploy the code to Azure App Service.
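The flag-with-a-default pattern can be sketched as follows. The original project is .NET; this Python version only illustrates the idea, and the flag name is hypothetical:

```python
import logging
import os

logging.basicConfig(level=logging.INFO)


def job_enabled(flag_name: str, default: bool = True) -> bool:
    """Read a per-job feature flag from the environment, falling back to a default."""
    raw = os.environ.get(flag_name)
    if raw is None:
        # Flag not detected in the environment: report and use the default.
        logging.info("Flag %s not set; defaulting to %s", flag_name, default)
        return default
    enabled = raw.strip().lower() in ("1", "true", "yes")
    # Report the runtime feature flag value, as the steps above suggest.
    logging.info("Flag %s resolved to %s", flag_name, enabled)
    return enabled


if job_enabled("RUN_NIGHTLY_CLEANUP"):
    pass  # run the scheduled work
```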

Entity Framework 6 won't save (update) an entity returned from Stored Procedure

Symptom: code that previously worked, update a property then save, wasn't working any more. The application is a .NET 4.6 Web API project using Entity Framework 6. Issue: the fix isn't in the code that saves (updates) the entity; the problem was that the entity being updated isn't correctly formed by the stored procedure. Before the SP was added, the entity was returned from EF directly. Fix: after the stored procedure completes, refetch the entity using straight Entity Framework code such as `await TwitterAccounts.FirstAsync(table => table.TwitterHandle == twitterHandle);` where TwitterAccounts is a table in the EF context. Step to debug the issue: capture the save result. The original EF code to save didn't capture the returned int, the number of rows changed. Before: `account.IsProperty = true; _context.SaveChanges(); return Ok();` After: `account.IsProperty = true; int result = _context.SaveChanges(); if (result == 0) { throw new Exception("property

Convert GoDaddy PEM certificate to PFX certificate to use in Azure App Service

When you purchase GoDaddy certificates, you should get 3 files: a *.crt file, a *.pem file, and a *-key.txt file.

1. Change the key file's name to use the .key extension so any future tools can find it by its extension.
2. If you download and/or open the key file on a Windows computer, your key file may now have the wrong encoding. Use a bash terminal and the iconv CLI tool to convert it to the correct encoding in a new file, with a "2" added at the end of the filename to indicate the different file: iconv -c -f UTF-8 -t ASCII your-domain-name.key > your-domain-name.2.key
3. Convert to the PFX format with the openssl CLI tool: openssl pkcs12 -export -out your-domain-name.pfx -inkey your-domain-name.2.key -in your-domain-name.crt
4. You need to enter a new password when creating the PFX. Remember the password; you will need it when you add your certificate to Azure App Service.
5. In the Azure portal, for your App Service, select TLS/SSL settings.
6. Add a new TLS/SSL setting.

Oct 2021 - Copy an Azure SQL Database to dev computer

There are several blog posts on this site answering how to copy an Azure SQL database. Since they are older or in some cases lost (Wayne's Microsoft blog posts are gone), I'll link to the Microsoft docs. Enterprise data or functionality? Don't use these processes if you need change detection or other enterprise-level requirements; copy the Azure SQL database and keep it in the cloud. Make a copy of your Azure cloud database: to copy an existing Azure SQL database in the Azure portal, you can copy or export. Generally, I choose to export to an Azure Storage blob container as a bacpac file. This allows me to use it as either a cloud or local database. The bacpac file includes data and schema. Watch for export completion: from the Azure SQL server (not the database) in the portal, use the Import/Export History in the Data Management section. Download the bacpac file from Azure Storage: to download the bacpac file, go to the Azure Storage container in the Azure portal.

Azure Functions + System.Text.JSON Deserialize exception

If you run into a System.IO.FileNotFoundException from an Azure Function when it is deserializing JSON with System.Text.Json to your class, make sure you are not using System.Text.Json 5.0.2. Issue: FileNotFoundException: Could not load file or assembly 'System.Text.Encodings.Web, Version=, Culture=neutral. Fix: downgrade to System.Text.Json 5.0.1. Project description: the issue was found while moving code from a working .NET Framework 4.6.1 Web API project into a .NET Core 3.1 class library, which is called from an Azure Function 3. The deserialization happens in the class library. The manual and automated testing described in the post were both local to my development machine, not in the Azure cloud. Initially, it was tested manually through the Azure Function. Later, as I thought the issue was about the JSON returned, I added automated programmatic tests to call into the class library in a separate test project. The test project worked, so the issue had to be the Azure
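The downgrade fix amounts to pinning the package version in the class library's project file. A sketch of what that looks like in a .csproj (the surrounding project file content is assumed):

```xml
<!-- Pin System.Text.Json to 5.0.1 to avoid the 5.0.2 assembly-load failure. -->
<ItemGroup>
  <PackageReference Include="System.Text.Json" Version="5.0.1" />
</ItemGroup>
```

After changing the version, restore and rebuild so the Azure Function picks up the downgraded dependency.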