OdeToCode by K. Scott Allen
(c) 2004 to 2018 OdeToCode LLC
scott@OdeToCode.com

REST Client is the Best Client
http://odetocode.com/blogs/scott/archive/2018/11/08/rest-client-is-the-best-client.aspx

<p><a href="https://marketplace.visualstudio.com/items?itemName=humao.rest-client">REST Client</a> is a Visual Studio Code extension for sending HTTP requests and viewing responses from the productive confines of Code itself.</p><p><img width="1604" height="570" title="" style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" alt="REST Client Extension for VS Code" src="https://odetocode.com/images2/Open-Live-Writer/REST-Client-is-the-Best-Client_12D9C/testhttp_1.png" border="0"></p>
<p>There is no shortage of tools for building HTTP messages, but many of these tools have become unnecessarily complex. I have to select the HTTP method from a tiny drop-down list, enter a header by typing the key and value into two separate inputs, and toggle open a third input to enter the body content. These tools try too hard.</p>
<p>In contrast, REST Client lets me type into an interface designed for creating and editing text – a text editor. I can check my experiments into source control and use standard editor features like search and replace.</p>
<p>On top of all this simplicity, add REST Client features like variables and the ability to keep multiple requests in the same file, and you have a tool that is both easy to use and powerful.</p>
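As a sketch of what those features look like in practice, a single `.http` file can define a variable and hold several requests separated by `###` lines. The host and routes below are placeholders, not a real API:

```http
@baseUrl = https://api.example.com

### Get a list of widgets
GET {{baseUrl}}/widgets
Accept: application/json

### Create a widget
POST {{baseUrl}}/widgets
Content-Type: application/json

{
  "name": "test widget"
}
```

Click "Send Request" above any request to see the response in a side-by-side editor pane.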
<p>Give it a try!</p>

Thu, 08 Nov 2018 10:03:00 Z

Tracking Down a Mysterious 502.5 Error in ASP.NET Core
http://odetocode.com/blogs/scott/archive/2018/11/05/tracking-down-a-mysterious-502-5-error-in-asp-net.aspx

<p><img width="636" height="687" title="" align="right" style="margin: 0px 0px 10px 10px; float: right; display: inline; background-image: none;" alt="Azure Application Insights Extension" src="https://odetocode.com/images2/Open-Live-Writer/4170c30240e6_1034F/extensions_1.png" border="0">When I posted <a href="https://odetocode.com/blogs/scott/archive/2018/07/16/7-tips-for-troubleshooting-asp-net-core-startup-errors.aspx">7 Tips for Troubleshooting ASP.NET Core Startup Errors</a>, I thought I had covered every tip needed to track down any possible startup error. In particular, the last step never fails to reveal the culprit.</p>
<p>Then, the last step failed me.</p>
<p>This particular application would not run in an Azure App Service but would run <em>everywhere</em> else - in development, in debug, in release, <em>and</em> in the command line console on the App Service server. How’s that possible?</p>
<h3>Soup du Jour</h3>
<p>There is a proverb that says "too many cooks spoil the broth", meaning that when too many people try to add their own expertise to the same project, the result is a disaster. Such is currently the case when two chefs, Visual Studio and the Azure portal, both try to cook a pot of Application Insights.</p>
<p>Application Insights is an invaluable resource for monitoring and troubleshooting an application, but AppInsights has also been a bit flaky to setup with ASP.NET Core in the past. Sometimes AI <a href="https://docs.microsoft.com/en-us/azure/application-insights/app-insights-asp-net-troubleshoot-no-data">doesn’t work</a> no matter how hard you try, and sometimes AI <em>does</em> work when you <a href="https://github.com/aspnet/AspNetCore/issues/2051#issuecomment-327709318">try to make it stop</a>.</p>
<p>Despite the flakiness, AppInsights works great once everything is configured. It is no surprise that both the Azure Portal and Visual Studio encourage the use of AppInsights, but this encouragement leads to common questions about the best approach for using AppInsights.</p>
<p>Am I supposed to add the AppInsights NuGet package to my project? This "build-time" configuration allows an app to use the entire SDK and send custom telemetry to AppInsights.</p>
<p>Or, am I supposed to set up AppInsights as an Azure App Service extension? This is known as "run-time" configuration and doesn’t require any code changes or new deployments.
The <a href="https://docs.microsoft.com/en-us/azure/application-insights/app-insights-azure-web-apps#run-time-or-build-time">official doc</a> uses wording to encourage the "build-time" approach, but you can also find <a href="https://docs.microsoft.com/en-us/azure/application-insights/app-insights-monitor-performance-live-website-now">a page</a> that says to use both approaches for the best experience. </p>
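For reference, the "build-time" approach comes down to a package reference in the project file, roughly like the fragment below (the version number is only an example from that era; use whatever the NuGet feed offers):

```xml
<ItemGroup>
  <!-- Adds the full AppInsights SDK at build time -->
  <PackageReference Include="Microsoft.ApplicationInsights.AspNetCore" Version="2.5.1" />
</ItemGroup>
```

The "run-time" approach adds no package at all; the App Service extension injects equivalent assemblies when the worker process starts.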
<p>The failing application took the "both" approach, and it turns out the "both" approach was the source of the error.</p>
<h3>Finding the Failure</h3>
<p>When the application failed with an HTTP 502.5 error, I went to Kudu and ran the app from the console. The application would work from the console, but consistently fail when launched from the worker process in a Windows App Service. This behavior was curious, because both approaches run the same app on the same virtual server with the same file system and the same environment variables. But, decades of debugging experience gave me the insight that <strong>something</strong> about the environment <strong>must</strong> be different.</p>
<p>Ultimately, I found the true source of the failure by watching the live stream of the app’s standard output.</p><p><img width="2585" height="599" title="" style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" alt="ASP.NET Core live stream standard output in Azure App Service" src="https://odetocode.com/images2/Open-Live-Writer/4170c30240e6_1034F/livestream_3.png" border="0"></p>
<p>Even this error was a bit curious since the <code>dotnet publish</code> operation included the required AppInsights assembly in the output. I dug in further and eventually looked at Kudu diagnostics for the <code>w3wp.exe</code> worker process. That’s when the answer jumped out at me.</p><p><img width="1789" height="508" title="" style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" alt="Azure App Service Worker Process Environment Variables" src="https://odetocode.com/images2/Open-Live-Writer/4170c30240e6_1034F/stRTUPASM_1.png" border="0"></p>
<p>ASP.NET Core 2.0 <a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/host/platform-specific-configuration?view=aspnetcore-2.1&amp;tabs=windows">introduced</a> the <code>ASPNETCORE_HOSTINGSTARTUPASSEMBLIES</code> environment variable seen above. The variable allows a host to inject additional startup logic into a compiled application by naming assemblies with <code>IHostingStartup</code> components inside. I had no idea what StartupBootstrapper could be, but a quick search revealed this assembly to be part of the Azure AppInsights extension. In the end, the AppInsights package installed by Visual Studio and the AppInsights extension installed by Azure were incompatible versions. To get the app working, I could do any of the following:</p>
<ul>
<li>Uninstall the AppInsights extension in the App Service</li>
<li>Remove the AppInsights package reference from the project</li>
<li>Leave both in place, but make sure the package version and extension version are compatible</li>
</ul>
<p>Option #1 seems like the best choice, since one never knows when the extension will update and break the application again.</p>
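When chasing a problem like this, it helps to see the variable for yourself. From the Kudu cmd console, `set` with a prefix lists matching environment variables, and ASP.NET Core 2.x also recognizes a `ASPNETCORE_PREVENTHOSTINGSTARTUP` setting to switch off injected startup assemblies entirely (to affect the worker process, define it as an App Service app setting rather than in the console):

```
REM List any hosting startup assemblies configured in this environment:
set ASPNETCORE_HOSTINGSTARTUPASSEMBLIES

REM ASP.NET Core 2.x honors this switch to disable IHostingStartup injection:
set ASPNETCORE_PREVENTHOSTINGSTARTUP=true
```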
<h3>Summary</h3>
<p>If you think you know exactly what your ASP.NET Core application does during startup, you could be wrong. <code>IHostingStartup</code> components give platforms an extensibility point to change existing applications, but they can also lead to unexpected problems when dependencies conflict. Check <code>ASPNETCORE_HOSTINGSTARTUPASSEMBLIES</code> if you run into strange startup problems.</p>

Mon, 05 Nov 2018 09:03:00 Z

.NET Core Opinion #6 - Be Wary of GUI Build Tools
http://odetocode.com/blogs/scott/archive/2018/10/29/net-core-opinion-6-ndash-be-wary-of-gui.aspx

<p>It&rsquo;s not that GUI build tools are bad, per se, but you have to watch out for tools that use too much magic, and for tools that don&rsquo;t let you version your build with the same zeal you apply to the source code of the system you are building.</p>
<p>Let&rsquo;s start with the second type of tool.</p>
<h3>Why Text Files Rule</h3>
<p>Many open source .NET Core projects use <a href="https://www.appveyor.com/">AppVeyor</a> to build, test, and deploy applications and NuGet packages. When defining an AppVeyor build, you have a choice of using a GUI, or a configuration file.</p>
<p>With the GUI, you can point and click your way to a deployment. This is a perfectly acceptable approach for some projects, but lacks some of the <em>rigor</em> you might need for larger projects or teams.</p>
<p><img style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" title="" src="https://odetocode.com/images2/Open-Live-Writer/5c6f4529f315_C63F/guibuild_1.png" alt="AppVeyor GUI Build" width="804" height="455" border="0" /></p>
<p>You can also define your build by checking a YAML configuration file into the root of your repository. Here&rsquo;s an excerpt:</p>
<p><a href="https://odetocode.com/images2/Open-Live-Writer/5c6f4529f315_C63F/ymlbuild.png"><img style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" title="" src="https://odetocode.com/images2/Open-Live-Writer/5c6f4529f315_C63F/ymlbuild_thumb.png" alt="AppVeyor YAML Build" width="660" height="369" border="0" /></a></p>
<p>Think about the advantages of the source-controlled YAML approach:</p>
<ul>
<li>
<p>You can version the build with the rest of your software</p>
</li>
<li>
<p>You can use standard diff tools to see what has changed</p>
</li>
<li>
<p>You can see who changed the build</p>
</li>
<li>
<p>You can copy and share your build with other projects</p>
</li>
</ul>
<p>Also note that in the screenshot above, the YAML file calls out to a build script &ndash; <code>build.ps1</code>. I believe you should encapsulate as many build steps as possible into a script you can run on the build server <em>and</em> on a development machine. Doing so allows you to make changes, test changes, and troubleshoot build steps quickly. You can use MSBuild, Powershell, <a href="https://cakebuild.net/">Cake</a>, or any technology that makes builds easier. Integration points, like publishing to NuGet, will stay as configurable steps in the build platform.</p>
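A file along these lines illustrates the idea; the image name, script name, and deploy settings below are placeholders you would tailor to your own project:

```yaml
# appveyor.yml - checked into the repository root
version: 1.0.{build}
image: Visual Studio 2017

build_script:
  # Encapsulate the real build steps in a script that also runs locally
  - ps: .\build.ps1

test_script:
  - cmd: dotnet test

deploy:
  # Integration points stay as configurable steps in the build platform
  provider: NuGet
  api_key:
    secure: <encrypted-key-here>
```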
<p>Azure Pipelines also offers GUI editors for authoring build and release pipelines.</p>
<p><img style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" title="" src="https://odetocode.com/images2/Open-Live-Writer/5c6f4529f315_C63F/azuregui_3.png" alt="Azure Pipelines GUI Editor" width="804" height="1053" border="0" /></p>
<p>Fortunately, a YAML option arrived recently and is <a href="https://docs.microsoft.com/en-us/azure/devops/release-notes/2018/sprint-142-update">getting better with every sprint</a>.</p>
<p><img style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" title="" src="https://odetocode.com/images2/Open-Live-Writer/5c6f4529f315_C63F/cac_1.png" alt="Configure Azure Pipelines with YAML" width="1205" height="248" border="0" /></p>
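The Azure Pipelines equivalent is an `azure-pipelines.yml` file in the repository root. A minimal sketch for a .NET Core project, circa the schema available at the time, might look like this (the trigger branch and image name are assumptions):

```yaml
# azure-pipelines.yml - version-controlled build definition
trigger:
  - master

pool:
  vmImage: 'ubuntu-16.04'

steps:
  - script: dotnet build --configuration Release
    displayName: 'Build'
  - script: dotnet test --configuration Release
    displayName: 'Test'
```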
<h3>Magic Tools</h3>
<p>Magic tools are tools like Visual Studio. For over 20 years, Visual Studio has met the goal of making complex tasks simple. However, any time you want to perform a more complex task, the simplicity and hidden complexity can get in the way.</p>
<p>For example, what does the &ldquo;Build&rdquo; command in Visual Studio do, exactly? Call MSBuild? Which MSBuild? I have 10 copies of msbuild.exe on a relatively fresh machine. What parameters are passed? All the details are hidden and there&rsquo;s nothing Visual Studio gives me as a starting point if I want to create a build pipeline on some other platform.</p>
<p>Another example of magic is the Docker support in Visual Studio. It is nice that I can right-click an ASP.NET Core project and say &ldquo;Add Docker Support&rdquo;. Seconds later I&rsquo;ll be able to build an image, and not only run my project in a container, but debug through code executing in a container.</p>
<p>But, try to build or run the same image outside of Visual Studio and you&rsquo;ll discover just how much context is hidden by tooling. You have to dig around in the build output to discover some of the parameters, and then you'll realize VS is mapping user secrets and setting up a different environment for the container. You might also notice the quiet installation of <code>Microsoft.VisualStudio.Azure.Containers.Tools.Targets</code> into your project, but you won't find any documentation or source code for this NuGet package.</p>
<p>I think tooling like the Docker tooling is exactly where VS Code gains traction. VS Code relies on configuration files and scripts that make complex tasks simpler without making customization and understanding inaccessible. Want to make a simple change to the Visual Studio approach? Don&rsquo;t tell me you are going to <a href="https://stackoverflow.com/questions/44280024/use-custom-visual-studio-run-configuration-with-docker/49155353#49155353">edit the IL in Microsoft&rsquo;s assembly</a>! VS Code is to Visual Studio what YAML files are to GUI editors.</p>
<h3>Summary</h3>
<p>To wrap up this post in a single sentence: build and release definitions need source control, too.</p>

Mon, 29 Oct 2018 10:03:00 Z

Major Updates for my Building Secure Services in Azure Course
http://odetocode.com/blogs/scott/archive/2018/10/23/major-updates-for-my-building-secure-services-in-azure-course.aspx

<p>I’ve completely reworked my <a href="https://www.pluralsight.com/courses/microsoft-azure-dotnet-secure-services-applications" target="_blank">secure services course</a> from scratch. There are a lot of demos across a wide range of technologies here, including:</p>
<h3>Docker Containers</h3>
<ul>
<li><p>Building ASP.NET Core projects using Visual Studio and Docker containers.</p>
</li>
<li><p>Deploying container images using Docker Hub and Azure App Services for Linux</p>
</li>
<li><p>Setting up continuous deployment for containers</p>
</li>
</ul>
<h3>Automation and Azure Resource Manager</h3>
<ul>
<li><p>Using ARM templates to deploy and provision resources in Azure (infrastructure as code)</p>
</li>
<li><p>Setting up Azure Key Vault</p>
</li>
<li><p>Storing secrets in Key Vault for use in ARM templates</p>
</li>
</ul>
<h3>Microservices and Container Orchestration</h3>
<ul>
<li><p>Using the new IHttpClientFactory and resiliency patterns for HTTP networking in ASP.NET Core</p>
</li>
<li><p>Container orchestration using Docker compose</p>
</li>
<li><p>Creating and using an Azure Container Registry (ACR)</p>
</li>
<li><p>Deploying multiple images using ACR and Compose</p>
</li>
</ul>
<h3>Cloud Identity</h3>
<ul>
<li><p>Creating your own test instance of Azure Active Directory</p>
</li>
<li><p>Authentication with OpenID Connect (OIDC) and Azure Active Directory</p>
</li>
<li><p>Securing APIs using Azure Active Directory and JWT tokens</p>
</li>
<li><p>Invoking secure APIs</p>
</li>
<li><p>Setting up an Azure B2C instance and defining your own policies</p>
</li>
<li><p>Securing an application using Azure B2C.</p>
</li>
</ul>
<p>Note: this updated course is an hour shorter than the original course. Pluralsight authors generally want to make courses longer, not shorter, but I learned how to tell a better story this second time around. Also, the Docker story and tooling have markedly improved since last year, which saves time.</p><p><a href="https://app.pluralsight.com/library/courses/microsoft-azure-dotnet-secure-services-applications/table-of-contents" target="_blank"><img width="1028" height="579" title="" style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" alt="Building Secure Services in Azure" src="https://odetocode.com/images2/Open-Live-Writer/f1a6b24e91c0_9732/image_3.png" border="0"></a></p>
<p>I hope you enjoy the <a href="https://www.pluralsight.com/courses/microsoft-azure-dotnet-secure-services-applications" target="_blank">course</a>!</p>

Tue, 23 Oct 2018 10:03:00 Z

.NET Core Opinion #5 - Deployment Scripts and Templates
http://odetocode.com/blogs/scott/archive/2018/10/17/net-core-opinion-5-deployment-scripts-and-templates.aspx

<p>Previously, we looked at <a href="https://odetocode.com/blogs/scott/archive/2018/09/13/net-core-opinion-3-ndash-other-folders-to-include.aspx">some folders</a> to include in your source code repository. One folder I didn’t mention at the time is a <code>deployment</code> folder.</p>
<p>Not every project needs a <code>deployment</code> folder, but if you are building an application, a service, or a component that requires a deployment, then this folder is useful, even if a deployment is as simple as copying files to a well-known location.</p>
<p>What goes into the folder?</p>
<h3>Setup Instructions</h3>
<p>At one extreme, the folder might contain markdown instructions on how to set up a development environment, or a list of prerequisites to develop and run the software. There’s nothing automated about markdown files, but the developer starting this week doesn’t need to figure out the setup by trial and error.</p>
<h3>Configuration as Code</h3>
<p>At the other extreme, you can automate anything these days. Does a project need specific software on Windows for development? Write a script to call <a href="https://chocolatey.org/">Chocolatey</a>. Does the project use resources in Azure for development? The <a href="https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest">Azure CLI</a> is easy to use, and Azure Resource Manager templates can declaratively take on some of the load.</p><p><img width="787" height="519" title="" style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" alt="Generating an ARM template from the Azure portal puts you one step closer to automating the setup of an entire resource group. " src="https://odetocode.com/images2/Open-Live-Writer/5136ef32f927_8E28/arm_1.png" border="0"></p>
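As a sketch, even a small template in the <code>deployment</code> folder captures decisions that would otherwise live only in the portal. The resource name and SKU below are placeholders:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Web/serverfarms",
      "apiVersion": "2016-09-01",
      "name": "my-app-plan",
      "location": "[resourceGroup().location]",
      "sku": { "name": "S1" }
    }
  ]
}
```

Deploying the template with the Azure CLI (`az group deployment create`) makes the environment reproducible from source control.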
<p>Ruthlessly automating software from development to production requires time and dedication, but the benefits are enormous. Not wasting time on setup and debugging misconfigurations is one advantage. Being able to duplicate a given environment with no additional work comes in handy, too.</p>

Wed, 17 Oct 2018 10:03:00 Z

Tackling Costs in Azure
http://odetocode.com/blogs/scott/archive/2018/10/15/tackling-costs-in-azure.aspx

<p>"Cloud Computing Governance" sounds like a talk I’d want to attend after lunch when I need an afternoon nap, but ever since the CFO walked into my office waving Azure invoices in the air, the topic is on my mind.</p>
<p>It seems when you turn several teams of software developers loose in the cloud, you typically set the high-level priorities like so:</p>
<ol>
<li>Make it secure</li>
<li>Make it fast</li>
<li>Make it more secure</li>
</ol>
<p>Missing from the list is the priority to "make the monthly cost as cheap as possible", but cost is easy to overlook when the focus is on security, quality, and scalability. After the CFO left, I reviewed what was happening across a dozen Azure subscriptions and I started to make some notes:</p>
<p><img width="580" height="772" title="" style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" alt="Cloud Costs Adhoc Impromptu Unstructured Analysis " src="https://odetocode.com/images2/Open-Live-Writer/0b9c0f868141_CA1A/IMG_0027_1.jpg" border="0"></p>
<p>Yes, there are 104 un-pooled Azure SQL instances, and 38 app services running on 30 app service plans.</p>
<h3>Cutting Costs</h3>
<p>There are countless people in the world who want to sell tools and consulting services to help a company reduce costs in the cloud. To me, outside consultants start with only one of the three areas of expertise needed to optimize cost. The three areas are:</p>
<ol>
<li>Expertise in Azure features, pricing, and licensing</li>
<li>In depth knowledge of the system under development</li>
<li>An understanding of where the business is headed in 1 to 3 years, including an understanding of the contractual relationships with SaaS customers.</li>
</ol>
<p>In Venn diagram form:</p>
<p><img width="1024" height="665" title="" style="border: 0px currentcolor; border-image: none; margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" alt="The Ideal Person to Optimize Cost as a Function of Cloud" src="https://odetocode.com/images2/Open-Live-Writer/0b9c0f868141_CA1A/cloudcostvenn_3.png" border="0"></p>
<p>Let’s dig into the details of where these three areas of knowledge come into play.</p>
<h3>Cutting Tools</h3>
<p>Let’s say your application needs data from dozens of large customers. How will the data move into Azure? An outside consultant can’t just say “Event Hubs” or “Data Factory” without knowing some details. Is the data size measured in GB or TB? How often does the data move? Where does the data live at the customer? What needs to happen with the data in the cloud? Will any of these answers change in a year?</p>
<p>Without a good understanding of the Azure offerings, a tech person often answers with the technology they already know. A SQL oriented developer, for example, will use Data Factory to pump data into an Azure SQL database. But, this isn’t the most cost effective answer if the data requires heavy duty processing after delivery, because Azure SQL instances are priced for line of business transactions that need atomicity, reliability, redundancy, high availability, and automatic backups, not hardcore compute and I/O.</p>
<p>But let’s say the answer is SQL Server. Now what?</p>
<h3>Cutting Boards</h3>
<p>Now a consultant needs to dig deeper to find out the best approach to SQL Server in the cloud. There are three broad approaches:</p>
<ol>
<li>SQL Server on a virtual machine</li>
<li>SQL Server as a managed instance</li>
<li>Azure SQL Database</li>
</ol>
<p>Option #1 is best for lift and shift solutions, but there is no need to take on the responsibility for clustering, upgrades, and backups if you can start with PaaS instead of IaaS. Option #2 is also designed for moving on-prem applications to the cloud, because a managed instance has better compatibility with an on-prem SQL Server, but without some of the IaaS and management hassles. For greenfield development, option #3 is the best option for most scenarios.</p>
<p>Once you’ve decided on option 3, there are two more levels of cost and performance options to consider. It’s not so much that Azure SQL is complicated, but Microsoft provides flexibility to cover different business scenarios. For any given Azure SQL instance, you can:</p>
<ol>
<li>Run the database as a single database</li>
<li>Add the database to a pool for resource sharing</li>
</ol>
<p>Option #1 is the best choice when you manage a single database, or when a database has unique performance characteristics. A pool is usually better on a cost-to-performance basis when you have 3 or more databases.</p>
<p>After you’ve decided to pool, the next step is to decide how you’ll specify the performance characteristics of the pool. Will you use DTUs, or will you use vCores? DTUs are frustratingly vague, but we do know that 20 DTUs are twice as powerful as 10 DTUs. vCores are at least a bit familiar, because we’ve equated CPUs with performance capability for decades.</p>
<h3>Cutting Edge</h3>
<p>One significant difference between the DTU model and the vCore model is that only the vCore model allows for reserved instances and the “hybrid benefit”. Both of these options can lead to huge cost savings, but both require some business knowledge.</p>
<p>The “hybrid benefit” is the ability to bring your own SQL Server license. The benefit is ideal for moving SQL databases from on-prem to the cloud, because you can make use of a license you already own. Or, perhaps your organization already has a number of free licenses from the Microsoft partner program, or discounted licenses from enterprise agreements.</p>
<p>Reserved instances will save you 21 to 33 percent if you commit to a certain level of provisioning for 1 to 3 years. If your customers sign one-year contracts to use your service, a one-year reserved instance is a quick cost savings with little risk.</p>
<p>If everything I’ve said so far makes it sound like you could benefit from using a spreadsheet to run hypothetical tests, then yes, setting up a spreadsheet does help.</p>
<p><img width="1028" height="653" title="" style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" alt="Cloud Cost Optimization Engine with Automatic Dependency Management" src="https://odetocode.com/images2/Open-Live-Writer/0b9c0f868141_CA1A/image_8.png" border="0"></p>
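In place of a spreadsheet, the same what-if math is easy to sketch in a few lines of Python. Every price below is a hypothetical placeholder, not an Azure quote; plug in real numbers from the pricing pages for your region and tier:

```python
def monthly_cost(instances, price_per_instance, reserved_discount=0.0):
    """Monthly cost for a set of identical resources, with an optional
    reserved-instance discount (e.g. 0.21 for a 1-year reservation)."""
    return instances * price_per_instance * (1 - reserved_discount)

# 104 un-pooled databases at a placeholder $30/month each
unpooled = monthly_cost(104, 30.0)

# The same workload consolidated into 8 elastic pools at $220/month each
pooled = monthly_cost(8, 220.0)

# A 1-year reservation at the low end of the 21-33% savings range
pooled_reserved = monthly_cost(8, 220.0, reserved_discount=0.21)

print(f"un-pooled:         ${unpooled:,.2f}")
print(f"pooled:            ${pooled:,.2f}")
print(f"pooled + reserved: ${pooled_reserved:,.2f}")
```

Running hypotheticals like these against real prices is exactly the exercise the spreadsheet above automates.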
<h3>Cutting Ends</h3>
<p>Once you have a plan, you have to enforce it and reevaluate it as time moves forward. But logging into the portal and eyeballing resources only works for a small number of resources. If things go as planned, I’ll be blogging about an automated approach over the next few months.</p>

Mon, 15 Oct 2018 10:03:00 Z

An Updated Cloud Patterns and Architecture Course
http://odetocode.com/blogs/scott/archive/2018/10/02/an-updated-cloud-patterns-and-architecture-course.aspx

<p>I’ve updated my <a href="https://app.pluralsight.com/library/courses/microsoft-azure-dotnet-cloud-architecture/table-of-contents">Cloud Patterns and Architecture</a> course on Pluralsight.</p>
<p>The overall goal of this course is to show the technologies and techniques you can use to build scalable, resilient, and highly available applications in the cloud, specifically with Azure.</p><p><a href="https://app.pluralsight.com/library/courses/microsoft-azure-dotnet-cloud-architecture/table-of-contents" target="_blank"><img width="1028" height="539" title="" style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" alt="Sample System Design with Azure Platform Services" src="https://odetocode.com/images2/Open-Live-Writer/bd4ac59021a7_8394/image_9.png" border="0"></a></p>
<p>In addition to walking through sample architectures, demonstrating design patterns, and adding bits of theory on topics like the CAP theorem, here are some of the lower level demos in the course:</p>
<ul>
<li><p>Setting up Azure Traffic Manager and using Traffic Manager profiles to route traffic to a geo-distributed web application.</p>
</li>
<li><p>Setting up Azure Service Bus to send and receive queued messages.</p>
</li>
<li><p>Creating an Azure Redis Cache and using the cache with an SDK, as well as configuring the cache to operate behind the ASP.NET Core <code>IDistributedCache</code> interface.</p>
</li>
<li><p>Provisioning a Content Delivery Network (Azure CDN) and pushing static web site content into the CDN.</p>
</li>
<li><p>Importing a web API into Azure API Management using OpenAPI and the ASP.NET Core Swashbuckle package, then configuring an API to apply a throttling policy.</p>
</li>
<li><p>Creating, tweaking, running, and analyzing load tests using Azure DevOps and Visual Studio load testing tools.</p>
</li>
</ul>
<p>And more! I hope you enjoy <a href="https://app.pluralsight.com/library/courses/microsoft-azure-dotnet-cloud-architecture/table-of-contents">the course</a>!</p>

Tue, 02 Oct 2018 10:03:00 Z

Thoughts on Azure DevOps
http://odetocode.com/blogs/scott/archive/2018/10/01/thoughts-on-azure-devops.aspx

<p><img width="204" height="204" title="" align="right" style="float: right; display: inline; background-image: none;" alt="Azure DevOps" src="https://odetocode.com/images2/Open-Live-Writer/Thoughts-on-Azure-DevOps_8346/image_3.png" border="0">I’ve been using Azure DevOps since the early days when the service carried the name <em>Visual Studio Online</em>. I’ve used the service for both professional projects and personal projects, and I’ve enjoyed the service so much I’ve demonstrated and recommended the service on consulting gigs, at workshops, at user groups, and in Pluralsight courses.</p>
<p>ADO has sometimes been a hard sell. The previous monikers for the service tied the product to Windows, Visual Studio, and Microsoft, so getting a Node developer to sit down and see a continuous delivery pipeline for a pure JS project wasn’t always easy. People already <em>in</em> the Microsoft ecosystem would also resist given the baggage of its on-premises ancestor, Team Foundation Server. And by baggage, I mean heavy enterprise baggage overstuffed with XML. I’ve gotten a lot of work done with TFS over the years, but TFS is also the only Microsoft product I’ve upgraded by hiring an external expert. I did not want to learn the installation incantations for the unholy amalgamation of TFS, SQL Server, SSRS, and SharePoint. TFS is also the only source code provider I’ve seen crash a commercial-grade network router while fetching source code.</p>
<p>But, the past is gone, and no other service exemplifies the evolution of Microsoft quite as well as evolution of TFS to Azure DevOps. We’ve gone from a centralized behemoth with an enterprise focus to a modern looking and sleek cloud platform that aggressively supports small open source projects as well as larger organizations.</p>
<p>Here are the constituent services that form Azure DevOps.</p>
<h3>Azure Pipelines</h3>
<p>Pipelines provide a platform for building, testing, packaging, and deploying applications. It’s a feature-rich build system that is extensible and easy to use. I’d consider this the crown jewel of Azure DevOps. All the heavy lifting uses build machines that the service transparently provisions in the cloud. Here are three more fun facts:</p>
<ul>
<li><p>Pipelines are not tied to source control in Azure. You can pull source from public and private repositories, including GitHub.</p>
</li>
<li><p>Build minutes for OSS projects are free and unlimited.</p>
</li>
<li><p>You can build anything for anyone since the supported build environments include Linux, Windows, and macOS.</p>
</li>
</ul>
<p>My biggest complaint about Pipelines in the past has been the inability to define builds using source-controlled text files instead of the web UI. Fortunately, YAML files have come to the rescue, and the ability to codify and version build and release definitions should soon be generally available.</p><p><img width="640" height="417" title="" style="margin-right: auto; margin-left: auto; float: none; display: block;" alt="Azure Pipelines at Work" src="https://odetocode.com/images2/Open-Live-Writer/Thoughts-on-Azure-DevOps_8346/image_6.png"></p>
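<p>To give a flavor of the YAML approach, here&rsquo;s a minimal sketch of a build definition checked in at the root of a repository. The file name, virtual machine image, and steps are illustrative and will vary by project.</p><pre class="brush: text; gutter: false; toolbar: false; "># azure-pipelines.yml - a hypothetical build for a .NET Core project
pool:
  vmImage: 'Ubuntu-16.04'

steps:
- script: dotnet restore
  displayName: Restore packages
- script: dotnet build --configuration Release
  displayName: Build
- script: dotnet test --configuration Release
  displayName: Run tests
</pre>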
<h3>Azure Boards</h3>
<p>Boards are where a team can track issues, bugs, work items, and epics. There are Kanban boards, of course, and custom workflows. The service is well-featured, particularly since it is free for 5 users and about $9,000 USD a year for 100 users (note that developers with MSDN subscriptions will have free access). There are other products that have many more bells and whistles, but they’ll also start license negotiations at $20,000 for 100 users.</p>
<h3>Azure Repos</h3>
<p>Git source control with an unlimited number of private or public repositories.</p>
<h3>Azure Test Plans</h3>
<p>Automated tests will typically execute in a Pipeline. The Test Plans service is more of a place for tests <em>not</em> executing in a pipeline, so this service covers manual tests and exploratory tests, as well as load tests (which are automated, but fall here for some reason).</p>
<p>The load testing features are the only features I’m qualified to speak about since I’ve been using the testing tools in VS Enterprise for years. Unfortunately, the tools themselves remain pretty much unchanged over these years and feel dated. The test recorder requires Internet Explorer and a plugin. The “Browser Mix” feature will allow you to make HTTP requests using an IE9 UA string, but there is no mention of any browser released <em>after</em> IE9, and even having a browser mix feature in 2018 is questionable.</p>
<p>Behind the scenes, the load testing artifacts are relatively simple XML files, so it is possible to avoid the tools in some workflows.</p>
<p>On the plus side, the load testing framework can intelligently provision hardware in the cloud to generate load. There is no need to set up test agents and monitor the agents to see if they are overworked. See my <a href="https://app.pluralsight.com/library/courses/microsoft-azure-dotnet-cloud-architecture/table-of-contents" target="_blank">Cloud Patterns</a> course for more.</p><p><img width="644" height="216" title="" style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" alt="Azure Load Test Results" src="https://odetocode.com/images2/Open-Live-Writer/Thoughts-on-Azure-DevOps_8346/image_12.png" border="0"></p>
<h3>Azure Artifacts</h3>
<p>Your own ultimate package repository for Maven, npm, and NuGet packages. Publish packages here for internal use at your organization.</p>
<h3>Extensions</h3>
<p>The <em>app store</em> for DevOps contains some high-quality extensions. There is also an extensive HTTP API throughout the DevOps service to integrate with other products and your own custom utilities. Between the API and the custom extensions, there is always a way to make something work in DevOps; all you need is the will.</p>
<h3>How This Relates to GitHub</h3>
<p>My opinion: GitHub is community-focused; Azure DevOps is focused on organizations. But, there is some crossover. If you have an OSS project, you’ll want to host on GitHub and build in Pipelines.</p>
<h3>Summary</h3>
<p><a href="https://docs.microsoft.com/en-us/azure/devops/release-notes/">Look for yourself</a> at the aggressive and transparent evolution of Azure DevOps over the years. My only worry today is Azure DevOps using the word "DevOps" in the name. DevOps <em>requires</em> a <strong>way of thinking</strong> and a <strong>culture</strong>. I hope organizations don't adopt DevOps tools in the same way they adopted Agile tools and then proclaimed themselves Agile.</p>Mon, 01 Oct 2018 15:10:00 Zhttp://odetocode.com/blogs/scott/archive/2018/09/21/net-core-opinion-4-ndash-increase-productivity-with-dev.aspxhttp://odetocode.com/blogs/scott/archive/2018/09/21/net-core-opinion-4-ndash-increase-productivity-with-dev.aspxscott@OdeTocode.com.NET Core Opinion #4 - Increase Productivity with Dev Scripts<p>In a <a href="https://odetocode.com/blogs/scott/archive/2018/09/13/net-core-opinion-3-ndash-other-folders-to-include.aspx">previous post</a> I mentioned that a <code>scripts</code> directory can be a welcome addition to any source code repository. What goes into <code>scripts</code>? Anything you can automate to make a developer&rsquo;s life easier!</p>
<h3>Examples for Inspiration</h3>
<p>Here&rsquo;s a script I&rsquo;ve used to simplify adding an EF migration. All I need to do from the command line is <code>addmigration [migration_name]</code>.</p>
<pre class="brush: text; gutter: false; toolbar: false; ">pushd src\beaverleague.data
dotnet ef migrations add %1
dotnet ef database update
popd
</pre>
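<p>On macOS or Linux, a rough bash equivalent might look like the following sketch, assuming the same <code>src/beaverleague.data</code> layout and the EF Core CLI tools installed:</p><pre class="brush: text; gutter: false; toolbar: false; ">#!/bin/bash
# addmigration.sh [migration_name] - mirrors the Windows script above
pushd src/beaverleague.data &gt; /dev/null
dotnet ef migrations add "$1"
dotnet ef database update
popd &gt; /dev/null
</pre>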
<p>I also have a <code>recreatedb</code> script I can use to start fresh after pulling changes.</p>
<pre class="brush: text; gutter: false; toolbar: false; ">pushd src\beaverleague.web
dotnet run dropdb migratedb seeddb stop
popd
</pre>
<p>More on how the parameters above work in a future post.</p>
<p>The EF repo itself uses a <a href="https://github.com/aspnet/EntityFrameworkCore/tree/release/2.2/tools">tools folder</a> instead of a scripts folder, but the idea is the same. Inside you&rsquo;ll find scripts to clean up test environments by dropping and shrinking databases, like this one that uses a combination of the <code>sqlcmd</code> and <code>sqllocaldb</code> command-line tools, as well as a script to query for all the non-system databases in a DB instance.</p>
<pre class="brush: text; gutter: false; toolbar: false; ">@echo off
sqlcmd -S "(localdb)\mssqllocaldb" -i "DropAllDatabases.sql" -o "DropAll.sql"
sqlcmd -S "(localdb)\mssqllocaldb" -i "DropAll.sql"
del "DropAll.sql"
sqllocaldb stop mssqllocaldb
sqllocaldb delete mssqllocaldb
ShrinkLocalDBModel.cmd
</pre>
<p>For more examples and ideas, check out the <a href="https://github.com/Microsoft/TypeScript/tree/master/scripts">TypeScript</a> repo with scripts for everything from running tests to automating GitHub issues with OctoKit. There&rsquo;s the <a href="https://github.com/Microsoft/vscode/tree/master/scripts">vscode</a> repo with scripts to set up an environment. The repo to <a href="https://github.com/Microsoft/dotnet-framework-docker/tree/master/scripts">build the official .NET Docker images</a> includes PowerShell scripts to execute <code>docker pull</code> with retries.</p>
<p>These are all examples where 5 or 6 lines of script code can not only save time for the entire team in the long run, but also codify a common operation.</p>
<h3>dotnet</h3>
<p>I specifically want to call out special capabilities of the <code>dotnet</code> CLI tool. We&rsquo;ve always had the ability to build, publish, and package from the command line, but the new <a href="https://docs.microsoft.com/en-us/dotnet/core/tools/global-tools">global tools</a> feature gives us an npm-ishly easy path to installing <em>new</em> tools and using them from anywhere.</p>
<p>Here are some of the tools I use.</p>
<ul>
<li><a href="https://www.nuget.org/packages/dotnet-certes/">dotnet-certes</a> &ndash; automate certificate acquisition.</li>
<li><a href="https://www.nuget.org/packages/dotnet-aspnet-codegenerator/">dotnet-aspnet-codegenerator</a> &ndash; scaffold new items without using Visual Studio.</li>
<li><a href="https://www.nuget.org/packages/Cake.Tool/">dotnet-cake</a> &ndash; execute Cake builds.</li>
<li><a href="https://www.nuget.org/packages/dotnet-ignore/">dotnet-ignore</a> &ndash; download .gitignore starter files from GitHub.</li>
<li><a href="https://www.nuget.org/packages/dotnet-script/">dotnet-script</a> &ndash; execute C# scripts.</li>
<li><a href="https://www.nuget.org/packages/dotnet-search/">dotnet-search</a> &ndash; search for NuGet packages.</li>
<li><a href="https://www.nuget.org/packages/dotnet-serve/">dotnet-serve</a> &ndash; a simple HTTP server like npm&rsquo;s http-server.</li>
<li><a href="https://www.nuget.org/packages/git-status-cli/">dotnet-git-status-cli</a> &ndash; look at the status of multiple repositories in a directory tree.</li>
<li><a href="https://www.nuget.org/packages/srihash-cli">dotnet-srihash-cli</a> &ndash; compute the sub-resource integrity hash for script tags.</li>
</ul>
<p>Nate McMaster maintains a more complete <a href="https://github.com/natemcmaster/dotnet-tools">list</a> of global tools.</p>
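<p>Installing and running a global tool takes one command each. As a sketch, with the .NET Core 2.1 SDK or later on the path (the port number is arbitrary, and options vary by tool):</p><pre class="brush: text; gutter: false; toolbar: false; "># install once, then use from any directory
dotnet tool install --global dotnet-serve

# serve the current directory over HTTP
dotnet serve --port 8080
</pre>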
<h3>Summary</h3>
<p>Take advantage of the command-line renaissance in .NET Core to speed up a <em>repeatable</em> development process.</p>Fri, 21 Sep 2018 10:03:00 Zhttp://odetocode.com/blogs/scott/archive/2018/09/17/how-the-next-delegate-works-in-asp-net-core-middleware.aspxhttp://odetocode.com/blogs/scott/archive/2018/09/17/how-the-next-delegate-works-in-asp-net-core-middleware.aspxscott@OdeTocode.comHow the Next Delegate Works In ASP.NET Core Middleware<p>How does <code>next</code> know how to call the next piece of middleware in the HTTP processing pipeline? I’ve been asked this question more than once when helping to write middleware components for ASP.NET Core.</p>
<p>I thought it might be fun to answer the question by showing the code for an implementation of <code>IApplicationBuilder</code>. Keep in mind the code is meant to demonstrate how to build a middleware pipeline. There is no error handling, no optimizations, no pipeline branching features, and no service provider.</p>
<h3>The Goal</h3>
<p>We want an app builder with a <code>Use</code> method just like a real application builder &ndash; that is, a <code>Use</code> method that takes a <code>Func&lt;RequestDelegate, RequestDelegate&gt;</code>. This <code>Func&lt;&gt;</code> represents a middleware component.</p>
<p>When we invoke the function we have to pass in a <code>next</code> delegate that represents the next piece of middleware in the pipeline. What we get back when we invoke the function is a <em>second</em> function that we can use to process each individual HTTP request.</p>
<p>The code below looks just like the code in the Configure method of a web app, although the middleware doesn’t do any real work. Instead, the components write log statements into a fake HTTP context.</p>
<pre class="brush: csharp; gutter: false; toolbar: false; ">app.Use(next =&gt;
{
return async ctx =&gt;
{
ctx.AddLogItem("Enter middleware 1");
await next(ctx);
ctx.AddLogItem("Exit middleware 1");
};
});
app.Use(next =&gt;
{
return async ctx =&gt;
{
ctx.AddLogItem("Enter middleware 2");
await next(ctx);
ctx.AddLogItem("Exit middleware 2");
};
});
app.Use(next =&gt;
{
return async ctx =&gt;
{
ctx.AddLogItem("Enter middleware 3");
await next(ctx);
ctx.AddLogItem("Exit middleware 3");
};
});
</pre>
<p>If we were to look at the log created during execution of the test, we should see log entries in this order:</p>
<pre class="brush: text; gutter: false; toolbar: false; ">Enter middleware 1
Enter middleware 2
Enter middleware 3
Exit middleware 3
Exit middleware 2
Exit middleware 1
</pre>
<p>In a unit test with the above code, I expect to be able to use the app builder to build a pipeline for processing requests represented by an <code>HttpContext</code>.</p>
<pre class="brush: csharp; gutter: false; toolbar: false; ">var pipeline = app.Build();
var request = new TestHttpContext();
pipeline(request);
var log = request.GetLogItem();
Assert.Equal(6, log.Count);
Assert.Equal("Enter middleware 1", log[0]);
Assert.Equal("Exit middleware 1", log[5]);
</pre>
<h3>The Implementation</h3>
<p>Each time there is a call to <code>app.Use</code>, we are going to need to keep track of the middleware component the code is adding to the pipeline. We’ll use the following class to hold the component. The class will also hold the <code>next</code> pointer, which we’ll have to compute later after all the calls to <code>Use</code> are finished and we know which component comes next. We’ll also store the <code>Process</code> delegate, which represents the HTTP message processing function returned by the component <code>Func</code> (which we can’t invoke until we know what comes next).</p>
<pre class="brush: csharp; gutter: false; toolbar: false; ">public class MiddlewareComponentNode
{
public RequestDelegate Next;
public RequestDelegate Process;
public Func&lt;RequestDelegate, RequestDelegate&gt; Component;
}
</pre>
<p>In the application builder class, we only need to store a list of the components being registered with each call to <code>Use</code>. Later, when building the pipeline, the ability to look forwards and backwards from a given component will prove useful, so we’ll add the components to a linked list.</p>
<pre class="brush: csharp; gutter: false; toolbar: false; ">public void Use(Func&lt;RequestDelegate, RequestDelegate&gt; component)
{
var node = new MiddlewareComponentNode
{
Component = component
};
Components.AddLast(node);
}
LinkedList&lt;MiddlewareComponentNode&gt; Components = new LinkedList&lt;MiddlewareComponentNode&gt;();
</pre>
<p>The real magic happens in <code>Build</code>. We’ll start with the last component in the pipeline and loop until we reach the first component. For each component, we have to create the <code>next</code> delegate. <code>next</code> will either point to the processing function for the next middleware component or, for the last component, be a function we provide that has no logic, or perhaps sets the response status to 404. Once we have the <code>next</code> delegate, we can invoke the component function to create the processing function itself.</p>
<pre class="brush: csharp; gutter: false; toolbar: false; ">public RequestDelegate Build()
{
var node = Components.Last;
while(node != null)
{
node.Value.Next = GetNextFunc(node);
node.Value.Process = node.Value.Component(node.Value.Next);
node = node.Previous;
}
return Components.First.Value.Process;
}
private RequestDelegate GetNextFunc(LinkedListNode&lt;MiddlewareComponentNode&gt; node)
{
if(node.Next == null)
{
// no more middleware components left in the list
return ctx =&gt;
{
// consider a 404 status since no other middleware processed the request
ctx.Response.StatusCode = 404;
return Task.CompletedTask;
};
}
else
{
return node.Next.Value.Process;
}
}
</pre>
<p>This has been a "Build Your Own AppBuilder" exercise. "Build your own ________" exercises like this are a fun challenge and a good way to understand how a specific piece of software works behind the scenes.</p>Mon, 17 Sep 2018 10:03:00 Z