Microsoft hosted a virtual event called Microsoft Connect(); to make strategic announcements related to developer tools, platforms, and cloud services. With the latest enhancements, Azure gained new capabilities for running containerized applications, industrial IoT workloads, and machine learning training jobs.
Microsoft Connect(); is typically scheduled just before AWS re:Invent, and the Azure team uses the event to counter some of the announcements made by its archrival. This year, Microsoft chose to host it a week later, but the announcements are no less significant than those made at premier events such as Build and Ignite.
The first and most exciting announcement is the general availability of Azure ML Service – a Platform as a Service (PaaS) offering to train, experiment with, and deploy machine learning and deep learning models. The service joins Microsoft’s AI and ML portfolio, which comprises Cognitive Services and Azure ML Studio.
After going back and forth with its ML PaaS strategy, Microsoft finally got it right. Firstly, the service is based on the popular Jupyter Notebooks for developing ML models in Python. Microsoft built native APIs for other Azure services, such as storage and compute, that data scientists can mix and match with familiar modules like NumPy and Pandas. The service can be used to test a model locally before running the training job on expensive but powerful GPU clusters. Developers and data scientists can use the ML framework of their choice; the service has excellent support for Scikit-learn, TensorFlow, CNTK, and PyTorch, among others.
Personally, I like the fact that Microsoft has exposed APIs for model serving based on Kubernetes, serverless containers, and plain vanilla VMs. This integration delivers a closed loop across all the phases involved in ML development, including data prep, feature engineering, training, experimentation, hyperparameter tuning, model management, and scalable deployment.
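The model-serving path is built around a scoring script that the platform wraps with a web service: an init() function that loads the model once at container startup, and a run() function invoked per request. The sketch below illustrates that contract in a self-contained way – the hard-coded linear "model" and its coefficients are placeholders for what a real script would deserialize from the model registry.

```python
import json

# Stand-in "model": hypothetical coefficients. A real scoring script
# would load a registered model file from disk inside init().
MODEL = {"weights": [0.5, -0.2], "bias": 1.0}

model = None

def init():
    """Called once when the serving container starts; load the model here."""
    global model
    model = MODEL  # real scripts deserialize the registered model instead

def run(raw_data):
    """Called per request; raw_data is a JSON string of feature rows."""
    rows = json.loads(raw_data)["data"]
    preds = [
        sum(w * x for w, x in zip(model["weights"], row)) + model["bias"]
        for row in rows
    ]
    return json.dumps({"predictions": preds})

init()
response = run(json.dumps({"data": [[1.0, 2.0], [0.0, 0.0]]}))
print(response)
```

The same script body works unchanged whether the service deploys it to an AKS cluster, an ACI container, or a VM – which is what makes the Kubernetes/serverless/VM choice a deployment-time decision rather than a code change.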
One of the key differentiators of Azure ML Service is the AutoML feature for classical machine learning problems such as regression and classification. AutoML simplifies generating ML models by taking over the complex parts of training, such as algorithm selection and hyperparameter tuning.
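At its core, the AutoML idea is a search loop: fit several candidate models, score each on held-out data, and keep the best. The toy sketch below shows that loop for a regression problem with two hypothetical candidates (a mean baseline and a 1-D least-squares line); the real service, of course, searches far larger spaces of algorithms and hyperparameters.

```python
# Toy AutoML: try candidate models, keep the one with the lowest
# validation error. Candidate names and data are illustrative only.

def fit_mean(xs, ys):
    """Baseline candidate: always predict the training mean."""
    mean = sum(ys) / len(ys)
    return lambda x: mean

def fit_line(xs, ys):
    """Candidate: simple 1-D least-squares regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return lambda x: my + slope * (x - mx)

def auto_select(candidates, train, valid):
    """Fit every candidate on the training split and return the one
    with the lowest validation mean squared error."""
    (tx, ty), (vx, vy) = train, valid
    def mse(m):
        return sum((m(x) - y) ** 2 for x, y in zip(vx, vy)) / len(vx)
    fitted = {name: fit(tx, ty) for name, fit in candidates.items()}
    best = min(fitted, key=lambda name: mse(fitted[name]))
    return best, fitted[best]

candidates = {"mean": fit_mean, "line": fit_line}
train = ([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])  # y = 2x + 1
valid = ([4.0, 5.0], [9.0, 11.0])
name, model = auto_select(candidates, train, valid)
```

On this linear data the line candidate wins; the value of AutoML is running exactly this kind of comparison, at scale, without the data scientist hand-writing it.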
Overall, Azure ML Service is one of the most mature ML platforms in the public cloud that covers the majority of the use cases expected by enterprise customers.
As a key contributor to the Open Neural Network Exchange (ONNX), Microsoft has been championing the need for interoperability among deep learning frameworks. The company announced that it is open sourcing the ONNX runtime, encouraging OS, platform, and tool vendors to offer tighter integration. ONNX is jointly developed by Microsoft, AWS, and Facebook.
ML.NET, the machine learning toolkit for .NET developers, enables them to create and infuse custom AI into applications without prior experience in developing or tuning machine learning models. At Connect();, Microsoft released ML.NET 0.8, the latest public preview of the toolkit.
When it comes to IoT and edge, Microsoft is already leading the pack. It further enhanced the platform with the general availability of the Azure Stream Analytics module for IoT Edge, along with updates to the Azure IoT Device Simulation solution, the Azure IoT Remote Monitoring suite, and the Azure Time Series Insights service.
Azure Stream Analytics on IoT Edge enables developers to deploy near-real-time intelligence closer to IoT devices. It runs within the Azure IoT Edge framework, which means it can be easily deployed and managed through Azure IoT Hub – the cloud-based device gateway.
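The typical edge workload is windowed aggregation over telemetry – roughly what a Stream Analytics query with a GROUP BY TumblingWindow(second, 5) clause computes. The self-contained sketch below mimics that behavior in plain Python: events are bucketed into fixed, non-overlapping time windows and averaged per window (the sensor readings are hypothetical).

```python
from collections import defaultdict

def tumbling_avg(events, window_seconds):
    """events: iterable of (timestamp_seconds, value) pairs.
    Returns {window_start: average of values in that window},
    mimicking a tumbling-window AVG aggregation."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for ts, value in events:
        # Non-overlapping windows: each event falls into exactly one.
        start = (ts // window_seconds) * window_seconds
        sums[start] += value
        counts[start] += 1
    return {start: sums[start] / counts[start] for start in sums}

# Hypothetical temperature readings as (seconds, degrees C) pairs.
readings = [(0, 20.0), (3, 22.0), (7, 30.0), (9, 34.0), (12, 28.0)]
averages = tumbling_avg(readings, window_seconds=5)
```

Running this logic on the device, rather than shipping every raw reading to the cloud, is precisely the latency and bandwidth win that Stream Analytics on IoT Edge delivers.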
The Azure IoT Device Simulation solution accelerator makes it possible to script complex device behavior, include multiple device models in a single simulation, and let simulations run for as long as necessary to emulate real-world scenarios.
Azure IoT Remote Monitoring solution accelerator now supports integration with IoT Edge.
There are quite a few announcements related to containers, Kubernetes, and functions.
ACI, the serverless container platform on Azure, now supports graphics processor unit (GPU)-enabled virtual machines, giving developers additional choices for running containers in VMs, and enabling them to run compute-intensive jobs such as machine learning.
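With GPU support, a container group can declare a GPU in its resource request just as it declares CPU and memory. Below is a hedged sketch of an ACI deployment YAML for a GPU-backed training job; the image, name, and command are placeholders, and field names follow the 2018-10-01 API version as I understand it – check the current ACI reference before relying on them.

```yaml
apiVersion: '2018-10-01'
location: eastus
name: gpu-training-job          # placeholder name
type: Microsoft.ContainerInstance/containerGroups
properties:
  osType: Linux
  restartPolicy: Never          # batch job: run once, don't restart
  containers:
  - name: trainer
    properties:
      image: tensorflow/tensorflow:1.12.0-gpu   # placeholder image
      command: ["python", "train.py"]           # hypothetical entry point
      resources:
        requests:
          cpu: 1.0
          memoryInGB: 12
          gpu:
            count: 1
            sku: K80            # GPU SKU, e.g. K80, P100, or V100
```

A file like this can be deployed with the Azure CLI (for example, az container create with a resource group and the YAML file), and billing stops when the job's container exits – the serverless appeal for bursty training workloads.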
AKS, Microsoft’s managed Kubernetes service, got an autoscale feature. Based on the upstream Kubernetes cluster autoscaler project, AKS cluster autoscaling automatically adds and removes nodes to meet the needs of the workloads that the customer has deployed, subject to minimum and maximum thresholds.
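The decision rule the autoscaler applies can be sketched in a few lines: scale out when pods are pending for lack of capacity, scale in when nodes sit idle, and always clamp to the configured bounds. The one-node-at-a-time steps below are a hypothetical simplification – the real autoscaler simulates bin-packing to size its moves – but the shape of the logic is the same.

```python
def desired_node_count(current, pending_pods, idle_nodes,
                       min_nodes, max_nodes):
    """Toy version of the cluster-autoscaler decision rule."""
    if pending_pods > 0:
        # Pods can't be scheduled for lack of capacity: add a node.
        target = current + 1
    elif idle_nodes > 0:
        # Spare capacity is going unused: remove an idle node.
        target = current - 1
    else:
        target = current
    # The customer-configured thresholds always win.
    return max(min_nodes, min(max_nodes, target))
```

For example, a 3-node cluster with pending pods grows to 4, but a 5-node cluster already at its maximum of 5 stays put – the clamp is what keeps autoscaling from running away with costs.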
Microsoft is making it easy to integrate serverless containers (ACI) with Kubernetes (AKS) through Virtual Node. With this support, workloads can be seamlessly deployed either onto the Kubernetes cluster’s VM-backed nodes or onto cheaper, per-second-billed ACI containers.
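From the developer's side, targeting the virtual node is plain Kubernetes: a node selector plus a toleration on the pod spec steer it to the Virtual Kubelet-backed node instead of the VM pool. A hedged sketch, with selector and toleration keys as I recall them from the Virtual Kubelet ACI provider docs (the pod name and image are placeholders):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: burst-workload          # placeholder name
spec:
  containers:
  - name: app
    image: nginx                # placeholder image
  # Steer the pod to the Virtual Kubelet-backed virtual node.
  nodeSelector:
    kubernetes.io/role: agent
    type: virtual-kubelet
  # Tolerate the taint that keeps ordinary pods off the virtual node.
  tolerations:
  - key: virtual-kubelet.io/provider
    operator: Exists
```

Everything else – services, config, kubectl workflows – stays unchanged, which is what makes bursting from AKS into ACI feel seamless.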
Virtual Kubelet, the open source technology that enables the integration between Kubernetes and other control planes, has been donated to the Cloud Native Computing Foundation (CNCF) by Microsoft. As an official CNCF project, Virtual Kubelet will gain better adoption across the Kubernetes ecosystem.
In a bid to close the gap with the competition, Microsoft is moving fast in adding new capabilities to Azure. Its investments in compute, data, AI, IoT, and edge are transforming Azure into a robust enterprise cloud platform.