The .NET Stacks #53: 🚀 This issue was compiled ahead of time

This week, we talk about Blazor WebAssembly AOT and get updates from the community.

Dave Brock

Happy Monday, all. I wanted to thank everybody for the birthday wishes as the newsletter turned 1 last week. Also, welcome to all the new subscribers!

This week wasn't as hectic, as we're all recovering from Build, but as always, there's plenty to discuss. Let's get started.

  • One big thing: Taking a look at Blazor WebAssembly AOT
  • The little things: NuGet improvements in Visual Studio, Tye gets a VS Code extension
  • Last week in the .NET world

One big thing: Taking a look at Blazor WebAssembly AOT

With Blazor, you've got two hosting models to consider, Blazor Server and Blazor WebAssembly.

With Blazor Server, your download size isn't a concern. You can leverage server capabilities with all the .NET-compatible APIs, and thin clients are supported. You will, however, need to consider the higher latency and how the app scales with many concurrent users. (Unless you have a lot of concurrent connections, it likely won't be an issue.)

With Blazor WebAssembly, you can leverage client capabilities, and you get a fully functioning app once the client downloads it. If you want to embrace Jamstack principles with a SPA calling off to serverless APIs (with attractive options like Azure Static Web Apps), Blazor WebAssembly is a nice option. The download size is larger, though, and apps take significantly longer to load.

In my experience across the community, many Blazor scenarios are geared toward Blazor Server. Often, folks are also packaging an ASP.NET Core API, and the download size and load times of WebAssembly might be holding them back. "I want to wait until we get AOT," I've heard a lot of people say.

Last week, with .NET 6 Preview 4, Blazor ahead-of-time (AOT) compilation is finally here. With AOT, you can compile .NET code directly to WebAssembly to help boost runtime performance. Without AOT, Blazor WebAssembly apps run on a .NET IL interpreter, meaning .NET code on WebAssembly runs significantly slower than on a normal .NET runtime (like with Blazor Server).

I gave it a go using Steve Sanderson's Picture Fixer app featured in Build talks and the .NET 6 Preview 4 release announcement. As mentioned in the announcement, you need to do two things:

  • Install the Blazor WebAssembly AOT build tools as an optional .NET SDK workload
  • Add <RunAOTCompilation>true</RunAOTCompilation> to your project file
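Concretely, the setup looks something like this. The workload name below is the one used in the .NET 6 Preview 4 timeframe; since this is all still in preview, treat it as subject to change:

```
dotnet workload install microsoft-net-sdk-blazorwebassembly-aot
```

And in your project file:

```
<PropertyGroup>
  <RunAOTCompilation>true</RunAOTCompilation>
</PropertyGroup>
```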

Then, you'll need to publish your application. On my beefy Dell XPS 15 and its 64 gigs of RAM, it took almost 15 minutes to AOT-compile 15 assemblies, using the Emscripten toolchain to do the heavy lifting. The team notes they are working on speeding this up, but it's good to know this wait only occurs at publish time, not during local development. With AOT in place, you can see dramatic improvements in data-intensive tasks specifically.

As with the decision to go with Blazor Server or Blazor WebAssembly, you need to consider tradeoffs when introducing AOT with your Blazor projects. As AOT-compiled projects are typically double the size, you need to weigh the value of trading load-time performance for runtime performance. You can pick and choose when you use AOT, of course, so typical use cases would take a hybrid approach and leverage AOT specifically for data-intensive tasks.

When Steve Sanderson talked to us a while back, he said:

If we can get Blazor WebAssembly to be faster than JS in typical cases (via AoT compilation, which is very achievable) and somehow simultaneously reduce download sizes to the point of irrelevance, then it would be very much in the interests of even strongly JS-centric teams to reconsider and look at all the other benefits of C#/.NET too.

AOT is a big step in that direction. The size of the .NET runtime will never compare to small JS frameworks, but closing the gap on download size and performance will be a giant step towards providing a modern, fast SPA solution for folks open to the .NET ecosystem.


The little things: NuGet improvements in Visual Studio, Tye gets a VS Code extension

Last week, Christopher Gill announced that NuGet Package Suggestions are coming to Visual Studio 16.10. With IntelliCode Package Suggestions, NuGet uses your project's metadata—like which packages you have installed and your project type—to suggest packages you might need. Package suggestions are shown in the NuGet Package Manager UI before you enter a search query.

As for the fine print, it currently only works on the project level and does not suggest packages outside of nuget.org. Also, it won't support deprecated packages or ones that are transitively installed.


Also from last week, Microsoft rolled out a Visual Studio Code extension for Tye. If you aren't familiar, Project Tye is an open-source project that helps developers work with microservices and distributed applications.

We're seeing a lot of nice updates for Tye, and I wonder when it'll get out of the "it's experimental, things might break" phase. For example, the official repo on GitHub still states: "Project Tye is an open source experiment at the moment. We are using this time to try radical ideas to improve microservices developer productivity and see what works...For the duration of the experiment, consider every part of the tye experience to be volatile." It has a lot of nice use cases with Dapr, for example, and I'd love to see when Microsoft is going to put a ring on it.
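If you haven't tried it, getting started is light-touch: Tye can infer services from a solution, or you can describe them in a tye.yaml file. Here's a minimal sketch—the app name, service names, and project paths are all hypothetical:

```
# tye.yaml - a minimal, hypothetical example
name: my-distributed-app
services:
- name: frontend
  project: src/Frontend/Frontend.csproj
- name: backend
  project: src/Backend/Backend.csproj
```

From there, `tye run` launches all the services together and gives you a local dashboard for logs and endpoints.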


🌎 Last week in the .NET world

🔥 The Top 3

📢 Announcements

📅 Community and events

🌎 Web development

🥅 The .NET platform

⛅ The cloud

🔧 Tools

📱 Xamarin

🏗 Design, testing, and best practices

🎤 Podcasts

🎥 Videos

Consider subscribing to The .NET Stacks, my free weekly newsletter. I write about news and trends, interview community leaders, and catch you up fast. (No spam, ever, and unsubscribe whenever you want.)

