Use .NET 5.0 To Be Rid of Windows Server!

I work at a three-letter company. The majority of our development teams are heavily invested in Java and .NET Framework. Our team has traditionally shied away from .NET Framework for what I'm guessing is a pretty common reason: it only runs on Windows. For many shops this probably isn't a big deal, but for our team it was a bit of an annoyance. Most of our software gets wrapped up in containers and deployed to an on-site Kubernetes cluster. However, we have one small application that needs to talk to a proprietary vendor library written in .NET Framework. That means 99% of our stuff is deployed using the same methods and in the same location. The other 1% was this little app, and it caused us quite a few headaches. Luckily, .NET 5.0 came out recently. It represents the convergence of .NET Core and .NET Framework into a single, cross-platform runtime. This is one of the few times in my career that a single upgrade has magically solved everything.

The library in question is an older collection of .NET Framework .dlls (not .NET Standard, which can be referenced from both Framework and Core projects) that lets us communicate with the vendor's systems. It's basically a platform-dependent SDK. Our only requirement was to expose a table of information from the vendor's system via the SDK. At the time, the simplest thing to do seemed to be an ASP.NET Core app that rendered a very basic HTML table constructed from vendor data. It worked just fine and tested great. The whole project wasn't more than a few thousand SLOC.
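To give a sense of how thin the wrapper was, the core of it amounted to something like the sketch below. `IVendorClient` and `VendorRecord` are hypothetical stand-ins for the vendor SDK's types, which I can't share here; the real app just did the equivalent, with a Razor view that loops over the rows and emits a plain `<table>`.

```csharp
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;

// Hypothetical stand-ins for the vendor SDK's types; the real names differ.
public class VendorRecord
{
    public string Id { get; set; }
    public string Status { get; set; }
}

public interface IVendorClient
{
    IEnumerable<VendorRecord> GetRecords();
}

public class HomeController : Controller
{
    private readonly IVendorClient _vendor;

    public HomeController(IVendorClient vendor) => _vendor = vendor;

    public IActionResult Index()
    {
        // Pull the rows from the vendor system via the SDK and hand them to a
        // Razor view that renders them as a basic HTML table.
        return View(_vendor.GetRecords());
    }
}
```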

Everything was going swimmingly all the way through deployment. We knew that because we had to target .NET Framework for our project to build, we would need to stand up a Windows Server instance. So we did that and used Visual Studio + IIS to create a deployment pipeline. This was alright: click Publish and the new version builds, deploys, and goes live. Then things started to get weird.
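For the curious, the constraint lives in the project file: ASP.NET Core 2.x apps can still target .NET Framework, which is what lets you reference old Framework-only .dlls and also what chains you to Windows. The snippet below is a rough sketch rather than our actual project file; the framework version, package version, and the VendorSdk reference are illustrative.

```xml
<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <!-- Targeting .NET Framework because the vendor .dlls require it;
         this is also what makes the app Windows-only. -->
    <TargetFramework>net472</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- ASP.NET Core 2.x packages still support net4xx targets. -->
    <PackageReference Include="Microsoft.AspNetCore.Mvc" Version="2.1.1" />
    <!-- Illustrative reference to the vendor's Framework-only assembly. -->
    <Reference Include="VendorSdk">
      <HintPath>libs\VendorSdk.dll</HintPath>
    </Reference>
  </ItemGroup>
</Project>
```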

Our application had some config telling it which servers to log into. This was stored in appsettings.Env.json, where Env changed between Development and Production. For whatever reason, regardless of the environment we specified (typically ASPNETCORE_ENVIRONMENT=Production), the config would revert to Development after an application restart. There were a few problems interacting at once here. First, why was the app restarting? If we started it up, it would run great; if we left it alone for a while, it would die. Accessing its URL again would slowly bring it back to life, and when it came back to life it had the Development config. Only a manual app restart would bring the Production config back, and then only until it went to sleep again.
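For context on why the environment name matters so much: the default host builder layers its configuration files based on that name, and the name is whatever ASPNETCORE_ENVIRONMENT the worker process sees when it starts. Our Program.cs was just the stock template, roughly the sketch below (Startup is the usual scaffolded class).

```csharp
using Microsoft.AspNetCore;
using Microsoft.AspNetCore.Hosting;

public class Program
{
    public static void Main(string[] args)
    {
        // CreateDefaultBuilder layers configuration in this order (later wins):
        // appsettings.json, then appsettings.{Environment}.json, then
        // environment variables and command-line arguments.
        // {Environment} is read from ASPNETCORE_ENVIRONMENT at process start,
        // so whichever value the restarted worker process actually sees
        // decides whether appsettings.Production.json or
        // appsettings.Development.json ends up layered on top.
        WebHost.CreateDefaultBuilder(args)
               .UseStartup<Startup>()
               .Build()
               .Run();
    }
}
```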

Keep in mind this was a purely default deployment environment. Vanilla Windows Server 2016, base install of IIS with no tweaking. Even the ASP.NET Core project was scaffolded by the dotnet CLI tool. We just added some source code that changed the homepage by running a few library functions.

Eventually we tracked down the cause of the config issue, but that lone Windows box kept nagging at us on a few fronts:

- resources
- patching/AMI upgrades
- IIS weirdness

We had built a quick ASP.NET Core app on top of their .NET Framework library, fired up Windows Server, and shoved it there. It wouldn't let us forget about it. For a myriad of reasons, maintaining IIS as a reverse proxy to the application (for certificates, etc.) produced strange and unusual behavior. For example, IIS comes with a collection of