Simplifying distributed application development with Azure Kubernetes Service

AKS, the Azure Kubernetes Service, is Azure’s managed Kubernetes offering and its main host for container-based microservice applications. It’s a powerful tool, one that simplifies building and managing Kubernetes infrastructures while offering advanced monitoring and management tooling. So it wasn’t surprising to see Microsoft give it a lot of attention at Build, with several important updates.

Kubernetes is a complex tool, one that gives you a platform that runs containers hosting your services. It’s designed to manage distributed systems, working across a cluster of servers and scaling in response to demand. Cloud services like Azure scale compute resources differently from traditional architectures, as they manage the underlying virtual machines transparently. That approach has led to the development of the Virtual Kubelet, a way of treating Kubernetes nodes as serverless elements controlled by the cloud provider.

AKS support for Virtual Kubelets has led to the launch of AKS virtual nodes. Your code runs in a Kubernetes node as normal, but as compute demand increases, AKS adds new virtual nodes as required, scaling back automatically as demand drops. If all your resources are in a single node, you get the benefits of a serverless environment hosting your containers. It’s a useful way of quickly adding scaling to apps built on ASP.NET Core or Node.js, without having to change your code or manage your infrastructure.
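In practice, virtual nodes are switched on as an AKS add-on from the Azure CLI. As a hedged sketch (the resource, cluster, and subnet names below are placeholders; the feature requires a cluster using Azure CNI networking, with a subnet set aside for the serverless containers):

```shell
# Enable the virtual nodes add-on on an existing AKS cluster.
# Names are illustrative; the subnet is dedicated to the
# Azure Container Instances that back the virtual nodes.
az aks enable-addons \
  --resource-group myResourceGroup \
  --name myAKSCluster \
  --addons virtual-node \
  --subnet-name myVirtualNodeSubnet
```

Once the add-on is running, pods scheduled onto the virtual node burst into serverless capacity without you provisioning any extra VMs.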

Similar serverless scaling features come with KEDA, Kubernetes Event Driven Autoscaling. Developed by Microsoft and Red Hat, KEDA is being added to AKS to give you a new way of scaling your environments. Instead of scaling based on resource usage, you will be able to scale AKS applications as events arrive, reducing the risk of bottlenecks in message-based applications like IoT servers.

KEDA unlocks rich connectivity directly to the event sources, enabling rapid and proactive scaling while preserving direct interaction with the event. Source: Microsoft
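KEDA is configured declaratively. As an illustrative sketch (the names are invented and the exact schema depends on your KEDA version), a ScaledObject ties a deployment to an event source, here an Azure Storage queue, so replicas track queue depth rather than CPU:

```yaml
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: orders-consumer-scaler
spec:
  scaleTargetRef:
    name: orders-consumer      # the Deployment KEDA scales
  minReplicaCount: 0           # scale to zero when the queue is empty
  maxReplicaCount: 20
  triggers:
    - type: azure-queue
      metadata:
        queueName: orders
        queueLength: "5"       # target messages per replica
        connectionFromEnv: AZURE_STORAGE_CONNECTION
```

Scaling to zero is what gives KEDA its serverless feel: when no messages are waiting, the consumer costs nothing.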

Perhaps the most important AKS announcement was the news that Azure Dev Spaces for AKS is now generally available. Designed to make it easy for developers to quickly roll out and test distributed applications, a Dev Space is created with a single Azure CLI command. Dev Spaces aren’t only for existing code: another one-line command scaffolds a new environment, creating Dockerfiles and Helm charts that make your code easier to manage and deploy.
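In outline (cluster and resource names here are placeholders), the command-line workflow looks something like this:

```shell
# Enable Dev Spaces on an AKS cluster (installs the azds tooling).
az aks use-dev-spaces --resource-group myResourceGroup --name myAKSCluster

# In your project directory: scaffold a Dockerfile and Helm chart.
azds prep --public

# Build the container and run the service in your Dev Space.
azds up
```

The `azds up` step builds in the cluster, so you don’t need a local Docker daemon to get code running in your shared space.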

Once it’s set up, you deploy code to your Dev Space from your preferred development environment, only deploying the code you’re working on. It’s a space you share with colleagues and use to test your code before you push it into your version control system. There’s a live connection between a supported IDE and the Dev Spaces containers, so code is synchronised and tested as you write it.

Dev Spaces gives you a shortcut to debugging code, using a hostname prefix to route traffic to development services, letting you drop them into a live environment for testing without affecting live traffic. You then use the existing .NET Core remote debugging tools to monitor your new code.

Azure API Management meets Functions

One of the lesser-known Azure services is its API management platform, designed to provide a gateway for microservices by controlling access to, securing, and helping publish your APIs. Its new Consumption tier is designed to help manage microservice APIs, with pay-per-action pricing. It’s an ideal tool for managing HTTP triggers for Azure Functions serverless code, tying the two services into a common pricing model. Using Azure API Management you publish OpenAPI (Swagger) definitions for your APIs, and if you’re using Application Insights it will ensure that API usage is logged alongside application performance.
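Standing up a Consumption-tier instance is itself a single CLI call. As a sketch (the instance name, resource group, and publisher details below are placeholders; the tier is selected through the SKU):

```shell
# Create an API Management instance on the pay-per-action Consumption tier.
az apim create \
  --name my-apim-gateway \
  --resource-group myResourceGroup \
  --publisher-name "Contoso" \
  --publisher-email api@contoso.com \
  --sku-name Consumption
```

With the gateway in place, your Function Apps’ HTTP endpoints can be published behind it rather than exposed directly.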

Azure API Management is designed to apply policy-based management to your APIs, an ideal approach to controlling external access to your services. It ties all your APIs to a single endpoint address, reducing your attack surface and keeping risks low. It provides one place to authenticate users, and protects them and their apps from changes to your code by versioning access to your APIs. Instead of opening your code to the world, using API Management with Functions gives them a front door, one that comes with its own digital doorman.

Running Linux Azure App Services for the web

Azure App Services is one of the oldest parts of Azure, hosting web and backend-as-a-service applications. Dating back to the original Azure service, its web jobs helped lead the development of Azure’s Serverless Functions.

Microsoft now supports App Service on Linux, with a fully managed compute environment and a range of different development stacks, including Node.js, Java, PHP, Python, Ruby, and .NET Core. All you need to do is choose the appropriate image for your code and Azure will automate deployment and operations. This lets you run existing code on Azure, porting it across to the appropriate image and modifying where necessary to use Azure storage. Any platform updates will be delivered automatically, as well as bug fixes.
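As a sketch of the deployment flow (the names and the runtime string below are illustrative; `az webapp list-runtimes --linux` shows the images currently on offer), you create a Linux plan and then a web app on one of the built-in images:

```shell
# Create a Linux App Service plan, then a web app on a built-in Node.js image.
az appservice plan create \
  --name myLinuxPlan \
  --resource-group myResourceGroup \
  --is-linux \
  --sku B1

az webapp create \
  --name my-node-app \
  --resource-group myResourceGroup \
  --plan myLinuxPlan \
  --runtime "NODE|10.14"
```

From there, code can be pushed via Git, ZIP deploy, or your CI pipeline, and Azure keeps the underlying image patched.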

App Service on Linux supports a number of built-in images in order to increase developer productivity. If the runtime your application requires is not supported in the built-in images, there are instructions on how to build your own Docker image to deploy to Web App for Containers. Source: Microsoft

The latest update changes the pricing model to include a permanent free tier, so you can use App Services to experiment with new and existing code without worrying about changing your billing status. There’s now improved support for enterprise-grade networking, allowing it to work with Azure virtual networks and to use tools like ExpressRoute or VPNs to connect to your on-premises applications and services. It’s an approach that lets you run your backend services on premises, with Azure hosting a scalable frontend web application using App Services.
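As an illustrative sketch of the networking side (resource names are placeholders), a web app is joined to a virtual network from the CLI, after which it can reach services on any network connected to that VNet, including on-premises systems over ExpressRoute or a VPN:

```shell
# Connect a web app to a subnet in an Azure virtual network.
az webapp vnet-integration add \
  --name my-node-app \
  --resource-group myResourceGroup \
  --vnet myVNet \
  --subnet myAppSubnet
```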

Integration at scale with Logic Apps

Azure Logic Apps are often left to one side when we talk about Azure’s developer offerings. But low-code tools like Logic Apps, with their focus on workflow and application integration, are a powerful business tool, simplifying the connections between line-of-business software.

Azure Integration Services lets you connect cloud and on-premises applications through a unified set of cloud services. Source: Microsoft

Now part of Azure’s Integration Services with API Management, Service Bus, and Event Grid, Logic Apps are an important part of Microsoft’s enterprise development strategy. At Build, Microsoft highlighted two key Logic Apps developments. First is a new option that allows you to write and deploy your own code as a Logic App step. With it you can quickly write JavaScript expressions to process data being transferred through a workflow, adding your own actions and triggers, as well as transforms using the new workflowContext object.
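As a minimal sketch of the idea (the trigger schema here is invented, and inside Logic Apps the `workflowContext` object is supplied by the runtime; we mock it below so the snippet is self-contained):

```javascript
// Mock of the workflowContext object the Logic Apps runtime provides;
// the real shape depends on your trigger and connectors.
const workflowContext = {
  trigger: { outputs: { body: { Subject: "quarterly report" } } }
};

// Body of an Inline Code action: read a field from the trigger's output
// and transform it before passing the result to the next workflow step.
function inlineCodeAction(ctx) {
  return ctx.trigger.outputs.body.Subject.toUpperCase();
}

console.log(inlineCodeAction(workflowContext)); // prints "QUARTERLY REPORT"
```

In a real workflow the `return` value of the snippet becomes the action’s output, available to later steps in the designer.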

Microsoft has improved how Logic Apps connects to some key business applications with a new SAP connector. If you haven’t been looking at using Logic Apps (or the closely related Flow) in your applications, it may well be worth giving them a try.

The members of the Grey Matter Managed Services team are experts at managing Azure related projects. If you need their technical advice or wish to discuss Azure options and costs, please call them on 01364 654200.
