Native AOT
Native ahead-of-time (AOT) compilation is a new feature that went live with the general availability of .NET 7. Native AOT produces an application that is pre-compiled to native code. This allows users of the app to run it on a machine without the .NET runtime being installed.
One of the benefits of native AOT is start-up speed. Because the application is pre-compiled to native code there is no need for just-in-time (JIT) compilation; instead, the app is ready to run immediately. This is beneficial in environments with lots of deployed instances, like AWS Lambda. Native AOT applications target a specific runtime and must be compiled on the OS and processor architecture that they will run on in production.
Due to the underlying OS of AWS Lambda, native AOT will only run on the x86_64 architecture.
There are some limitations to native AOT, the biggest being the lack of support for run-time code generation. This has an impact on any systems that use unconstrained reflection. Both System.Text.Json and Newtonsoft.Json rely heavily on reflection to function, meaning any JSON (de)serialization using either of these libraries will require changes.
Let's dive into how you can run native AOT applications on AWS Lambda.
AWS Tooling
AWS announced tooling to make it easier to build and deploy native AOT applications to Lambda. This tooling makes use of Docker, so ensure you have Docker running on your system. Also ensure you have version 5.6.0 or later of Amazon.Lambda.Tools installed.
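As a minimal sketch, assuming Amazon.Lambda.Tools is managed as a .NET global tool, it can be installed or updated with the .NET CLI:

dotnet tool install -g Amazon.Lambda.Tools
dotnet tool update -g Amazon.Lambda.Tools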
AWS also announced pre-built templates to quickly get started with native AOT. Ensure you have the latest version of Amazon.Lambda.Templates installed.
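A sketch of installing the template package, assuming the dotnet new template workflow on a recent SDK:

dotnet new install Amazon.Lambda.Templates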
Getting Started
To get started with your first native AOT Lambda function run the following command to start a new project:
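As a sketch, assuming the lambda.NativeAOT template from Amazon.Lambda.Templates and a hypothetical project name:

dotnet new lambda.NativeAOT -n MyNativeAotFunction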
Open up the project in the IDE of your choice and let's have a look at the code.
Function Code
Native AOT compiles application code down to a single binary. This means the application entry point needs to be a static Main() method. The Main method uses the LambdaBootstrapBuilder class that comes from the Amazon.Lambda.RuntimeSupport NuGet package. The .Create method bootstraps the Lambda runtime, passing in the actual FunctionHandler method as well as a serializer to use. The .RunAsync() method makes the function ready to receive requests.
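As an illustration, here is a minimal sketch of such an entry point. The project name, the string handler logic, and the serializer context name are assumptions for the example, and it uses the Amazon.Lambda.Core and Amazon.Lambda.Serialization.SystemTextJson packages alongside Amazon.Lambda.RuntimeSupport:

using System;
using System.Text.Json.Serialization;
using System.Threading.Tasks;
using Amazon.Lambda.Core;
using Amazon.Lambda.RuntimeSupport;
using Amazon.Lambda.Serialization.SystemTextJson;

namespace MyNativeAotFunction;

public class Function
{
    // Static entry point required because native AOT produces a single executable.
    private static async Task Main()
    {
        Func<string, ILambdaContext, string> handler = FunctionHandler;

        // Bootstrap the Lambda runtime with the handler and a source generated serializer,
        // then start listening for invocations.
        await LambdaBootstrapBuilder.Create(handler,
                new SourceGeneratorLambdaJsonSerializer<LambdaFunctionJsonSerializerContext>())
            .Build()
            .RunAsync();
    }

    // Illustrative handler: echoes the input back in upper case.
    public static string FunctionHandler(string input, ILambdaContext context)
    {
        return input.ToUpperInvariant();
    }
}

// Source generated serializer context covering the handler's input and output type.
[JsonSerializable(typeof(string))]
public partial class LambdaFunctionJsonSerializerContext : JsonSerializerContext
{
}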
(De)Serialization
As mentioned earlier, native AOT removes the support for using common JSON (de)serialization libraries. In .NET 6, Microsoft introduced source generated serializers. Source generated serialization generates the code required for (de)serialization at compile time.
To use source generated serializers you need to specify a partial class that inherits from the JsonSerializerContext class. Annotations are then added to that class to define which objects to generate compile-time code for. In this instance, this is just a string. All objects you need to (de)serialize need to be added as an annotation, including any Lambda event sources like SQSEvent or APIGatewayHttpApiV2ProxyRequest.
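As a sketch of how additional event sources could be registered, assuming the event types come from the Amazon.Lambda.SQSEvents and Amazon.Lambda.APIGatewayEvents packages:

using System.Text.Json.Serialization;
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.SQSEvents;

// Each [JsonSerializable] attribute tells the source generator to emit
// (de)serialization code for that type at compile time.
[JsonSerializable(typeof(string))]
[JsonSerializable(typeof(SQSEvent))]
[JsonSerializable(typeof(APIGatewayHttpApiV2ProxyRequest))]
public partial class LambdaFunctionJsonSerializerContext : JsonSerializerContext
{
}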
Project File Updates
Updates are required to the csproj file to enable both native AOT and to allow the code to run on AWS Lambda. The first is to set the TargetFramework to net7.0.
Native AOT on Lambda makes use of Lambda custom runtimes. Custom runtimes allow you to bring your own runtime to Lambda. When using a custom runtime, the Lambda service looks for a file named bootstrap. For that reason, the compiled assembly needs to be output with the name bootstrap. The final change is to set the PublishAot flag to true.
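A sketch of the relevant csproj properties; everything beyond TargetFramework, AssemblyName, and PublishAot is an assumption about the surrounding project file:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net7.0</TargetFramework>
    <OutputType>Exe</OutputType>
    <!-- The Lambda custom runtime looks for an executable named bootstrap. -->
    <AssemblyName>bootstrap</AssemblyName>
    <!-- Enable native ahead-of-time compilation on publish. -->
    <PublishAot>true</PublishAot>
  </PropertyGroup>
</Project>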
Trimming Options
Native AOT compilation trims your application code, making the bundle size as small as possible. Many NuGet libraries are not yet 'trim friendly', meaning required pieces of code may be removed. Microsoft provides a way to exclude libraries from trimming using the trimming options built into the compiler. To exclude an assembly from trimming, specify it as a <TrimmerRootAssembly>.
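A sketch of excluding an assembly from trimming in the csproj; AWSSDK.Core is used here purely as an example of a library to root:

<ItemGroup>
  <!-- Keep this whole assembly; the trimmer will not remove code from it. -->
  <TrimmerRootAssembly Include="AWSSDK.Core" />
</ItemGroup>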
Deploying
When ready to deploy, it's as simple as using the deploy-function command in the CLI global tooling. This command downloads a Docker image built using Amazon Linux 2 (AL2) as a base image. Your local file system is then attached to a running container, and your code is compiled within this container. If the deploy-function command is executed on a machine running AL2, the publish will run as normal.
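A sketch of the deploy step, assuming the hypothetical function name used earlier:

dotnet lambda deploy-function MyNativeAotFunction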