Microsoft Azure AI, data, and application innovations help turn your AI ambitions into reality

Welcome to Microsoft Ignite 2023! The past year has been one of true transformation. Companies are seeing real benefits today and are eager to explore what’s next—including how they can do more with their data investments, build intelligent applications, and uncover what AI can do for their business.

We recently commissioned a study through IDC and uncovered insights into how AI is driving business results and economic impact for organizations worldwide. More than 2,000 business leaders surveyed confirmed they're already using AI to improve employee experiences, engage customers, and bend the curve on innovation.

The study illustrates the business value of AI, but it really comes to life through the stories of how our customers and partners are innovating today. Customers like Heineken, Thread, Moveworks, the National Basketball Association (NBA), and many more are putting AI technologies to work for their businesses and their own customers and employees.

From modern data solutions uniquely suited for the era of AI to beloved developer tools and application services, we're building Microsoft Azure as the AI supercomputer for customers, no matter their starting point.

This week at Ignite, the pace of innovation isn’t slowing down. We’ll share more stories about how organizations are turning to new solutions to drive their business forward. We’re also announcing many new capabilities and updates to make it easier than ever to use your favorite tools, maximize existing investments, save time, and innovate on Azure as a trusted platform.

Modern data solutions to power AI transformation

Every intelligent app starts with data—and your AI is only as good as your data—so a modern data and analytics platform is increasingly important. The integration of data and AI services and solutions can be a unique competitive advantage because every organization’s data is unique.

Last year, we introduced the Microsoft Intelligent Data Platform as an integrated platform to bring together operational databases, analytics, and governance and enable you to integrate all your data assets seamlessly in a way that works for your business.

At Ignite this week, we are announcing the general availability of Microsoft Fabric, our most integrated data and AI solution yet, within the Intelligent Data Platform. Microsoft Fabric can empower you in ways that weren't possible before with a unified data platform. This means you can bring AI directly to your data, no matter where it lives. This helps foster an AI-centered culture that scales the value of your data, so you can spend more time innovating and less time integrating.

EDP is a global energy company that aims to transform the world through renewable energy sources. They’re using Microsoft Fabric and OneLake to simplify data access across data storage, processing, visualization, and AI workflows. This allows them to fully embrace a data-driven culture where they have access to high-value insights and decisions are made with a comprehensive view of the data environment.

We’re also announcing Fabric as an open and extensible platform. We will showcase integrations with many of our partners like LSEG, Esri, Informatica, Teradata and SAS, who have been demonstrating the possibilities of bringing their product experiences as workloads into Fabric, widening their reach and breadth of capabilities.

Every organization is eager to save time and money as they transform. We're announcing several new features and updates for Azure SQL that make Azure the ideal and most cost-effective place for your data. Updates include lower pricing for Azure SQL Database Hyperscale compute, a free trial offer for Azure SQL Managed Instance, and a wave of other new features.

Lufthansa Technik AG has been running Azure SQL to support its application platform and data estate, leveraging fully managed capabilities to empower teams across functions. They’re joining on stage during a breakout session on cloud-scale databases, so you can learn more about their experience directly. 

Easily build, scale, and deploy multimodal generative AI experiences responsibly with Azure

The AI opportunity for businesses is centered on the incredible power of generative AI. We’re inspired by customers who are now nimbly infusing content generation capabilities to transform all kinds of apps into intuitive, contextual experiences that impress and captivate their own customers and employees.

Siemens Digital Industries is one company using Azure AI to enhance its manufacturing processes by enabling seamless communication on the shop floor. Their newest solution helps field engineers report issues in their native language, promoting inclusivity, efficient problem resolution, and faster response times. 

Today organizations need more comprehensive, unified tools to build for this next wave of generative AI-based applications. This is why we’re announcing new updates that push the boundaries of AI innovation and make it easier for customers to responsibly deploy AI at scale across their business.

Everything you need to build, test, and deploy AI innovations in one convenient location

At Ignite, we’re thrilled to introduce the public preview of Azure AI Studio, a groundbreaking platform for AI developers by Microsoft. Everything organizations need to tackle generative AI is now in one place: cutting-edge models, data integration for retrieval augmented generation (RAG), intelligent search capabilities, full-lifecycle model management, and content safety. 

We continue to expand choice and flexibility in generative AI models beyond Azure OpenAI Service. We announced the model catalog at Build, and at Ignite we're announcing Models as a Service, managed API endpoints coming soon within the model catalog. This will enable pro developers to easily integrate new foundation models such as Meta's Llama 2, G42's Jais, Cohere's Command, and Mistral's premium models into their applications as an API endpoint, and to fine-tune models with custom training data, all without having to manage the underlying GPU infrastructure. This functionality will help eliminate the complexity for our customers and partners of provisioning resources and managing hosting.
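As a rough sketch of what calling one of these managed endpoints could look like, the Python snippet below assembles an OpenAI-style chat request. The endpoint URL, key, and exact payload schema are placeholders and assumptions for illustration; the model catalog provides the authoritative sample code for each model.

```python
import json

# Hypothetical endpoint URL and key: Models as a Service endpoints are
# provisioned per model in the model catalog, so both values below are
# placeholders, not real resources.
ENDPOINT_URL = "https://example-llama-2-70b.eastus2.inference.ai.azure.com/v1/chat/completions"
API_KEY = "<your-endpoint-key>"

def build_chat_request(user_message: str, temperature: float = 0.7) -> dict:
    """Assemble headers and an OpenAI-style chat payload for a hosted model.

    The payload shape mirrors the common chat-completions format; schemas
    can vary slightly between model providers, so check the sample code
    shown for your specific model.
    """
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    }
    payload = {
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "temperature": temperature,
        "max_tokens": 256,
    }
    return {"url": ENDPOINT_URL, "headers": headers, "body": json.dumps(payload)}

request = build_chat_request("Summarize our Q3 sales notes in two sentences.")
# Send with any HTTP client, e.g. requests.post(request["url"], ...)
```

Because the endpoint is just an HTTPS API, any HTTP client or the Azure SDKs can send this request once the real URL and key are in place.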

Large language model (LLM) orchestration and grounding via RAG are top of mind as momentum for LLM-based AI applications grows. Prompt flow, a tool for managing prompt engineering workflows and LLMOps, is now in preview in Azure AI Studio and generally available in Azure Machine Learning. Prompt flow provides a comprehensive solution that simplifies prototyping, experimenting, iterating on, and deploying your AI applications.

We’re also announcing at Ignite that Azure AI Search, formerly Azure Cognitive Search, is now available in Azure AI Studio so everything remains in one convenient location for developers to save time and boost productivity.

Azure AI Content Safety is also available in Azure AI Studio so developers can easily evaluate model responses all in one unified development platform. We're also announcing the preview of new features inside Azure AI Studio, powered by Azure AI Content Safety, to address harms and security risks introduced by large language models. The new features help identify and prevent attempted unauthorized modifications, and detect when large language models generate material that leverages third-party intellectual property and content.

With Azure AI Content Safety, developers can monitor human and AI-generated content across languages and modalities and streamline workflows with customizable severity levels and built-in blocklists.
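To make the pattern concrete, here is a minimal, purely illustrative sketch of customizable severity thresholds plus a blocklist in Python. It is not the Azure AI Content Safety SDK; the category names, the 0-7 severity scale, and the blocklist terms are assumptions chosen to mirror the controls described above.

```python
# Hypothetical custom blocklist terms for a given application.
BLOCKLIST = {"leaked-codename", "internal-only"}

# Maximum tolerated severity (illustrative 0-7 scale) per harm category;
# an application would tighten or loosen these per use case.
THRESHOLDS = {"hate": 2, "violence": 2, "sexual": 0, "self_harm": 0}

def moderate(text: str, scores: dict) -> tuple[bool, list]:
    """Return (allowed, reasons) given category severity scores for `text`.

    `scores` stands in for what a moderation service might return for
    human- or AI-generated content.
    """
    reasons = []
    # Screen the text against the custom blocklist.
    for term in BLOCKLIST:
        if term in text.lower():
            reasons.append(f"blocklist:{term}")
    # Compare each category's severity against its configured threshold.
    for category, severity in scores.items():
        if severity > THRESHOLDS.get(category, 0):
            reasons.append(f"severity:{category}={severity}")
    return (not reasons, reasons)

allowed, why = moderate("Here is the internal-only roadmap", {"hate": 0})
# allowed is False because the blocklist term "internal-only" matched.
```

The same two dials, per-category severity limits and custom term lists, are what let a workflow apply one policy consistently across languages and modalities.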

It’s great to see customers already leveraging this to build their AI solutions. In just six months, Perplexity brought Perplexity Ask, a conversational answer engine, to market with Azure AI Studio. They were able to streamline and expedite AI development, get to market faster, scale quickly to support millions of users, and cost-effectively deliver security and reliability.

If you're creating a custom copilot, improving search, enhancing call centers, developing bots, or a blend of all of these, Azure AI Studio offers everything you need. You can check out Eric Boyd's blog to learn more about Azure AI Studio.


Generative AI is now multimodal

We are excited to open a new chapter in the generative AI journey for our customers with GPT-4 Turbo with Vision, in preview and coming soon to Azure OpenAI Service and Azure AI Studio. With GPT-4 Turbo with Vision, developers can deliver multimodal capabilities in their applications.

We are adding several new updates to Azure AI Vision. In combination with our Azure AI Vision service, GPT-4 Turbo with Vision can see, understand, and make inferences from visual inputs and associated text-based prompt instructions, enabling scenarios like video analysis and video Q&A.
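As an illustration, a multimodal request interleaves text and image parts within a single chat message. The sketch below builds such a request body in Python; the message shape follows the common chat-completions convention for vision-enabled models, and the image URL and question are hypothetical.

```python
def build_vision_message(question: str, image_url: str) -> dict:
    """Build one user message mixing a text part and an image part.

    The content-parts shape below follows the chat-completions convention
    for vision-enabled models; verify against the service reference for
    the model you deploy.
    """
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": question},
            {"type": "image_url", "image_url": {"url": image_url}},
        ],
    }

payload = {
    "messages": [build_vision_message(
        "What safety issues are visible on this shop floor?",
        "https://example.com/shop-floor.jpg",  # placeholder image URL
    )],
    "max_tokens": 300,
}
# `payload` would be sent to a vision-enabled chat deployment.
```

The key design point is that text and images travel in the same message, so the model can reason over both together rather than in separate calls.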

In addition to GPT-4 Turbo with Vision, we are happy to share other new innovations in Azure OpenAI Service: GPT-4 Turbo in preview, GPT-3.5 Turbo 16K (1106) reaching general availability at the end of November, and the image model DALL-E 3, in preview now.

Search in the era of AI

Effective retrieval techniques, like those powered by search, can vastly improve the quality of responses and reduce latency, which is essential for generative AI apps: they must be grounded in content from your data or websites to augment the responses generated by LLMs. A common practice for the knowledge retrieval step in RAG is vector search.

Azure AI Search is a robust information retrieval and search platform that enables organizations to use their own data to deliver hyper-personalized experiences in generative AI applications. We’re announcing the general availability of vector search for fast, highly relevant results from data.

Vector search is a method of searching for information within various data types, including images, audio, text, video, and more. It’s one of the most critical elements of AI-powered, intelligent apps, and the addition of this capability is our latest AI-ready functionality to come to our Azure databases portfolio.
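To illustrate the idea, here is a tiny self-contained sketch of vector search in Python: documents and queries are represented as vectors, and documents are ranked by cosine similarity to the query. Real systems use model-generated embeddings with hundreds or thousands of dimensions and approximate nearest-neighbor indexes rather than the brute-force scan shown here.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def vector_search(query_vec, index, k=2):
    """Return the k document ids whose vectors are most similar to the query."""
    scored = sorted(index.items(),
                    key=lambda item: cosine_similarity(query_vec, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

# Toy 3-dimensional "embeddings" for three hypothetical documents.
index = {
    "returns-policy": [0.9, 0.1, 0.0],
    "shipping-times": [0.1, 0.9, 0.2],
    "warranty-info":  [0.8, 0.2, 0.1],
}

print(vector_search([1.0, 0.0, 0.0], index, k=2))
# → ['returns-policy', 'warranty-info']
```

In a RAG pipeline, the top-k documents returned here would be stitched into the prompt to ground the LLM's answer.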

Semantic ranker, formerly known as semantic search, is also generally available and provides access to the same machine learning-powered search re-ranking technology used to power Bing. Your generative AI applications can deliver the highest quality responses to every user question with a feature-rich vector database integrated with state-of-the-art relevance technology.


Accelerate your AI journey responsibly and with confidence

At Microsoft, we're committed to safe and responsible AI. That commitment goes beyond ethical values and foundational principles, which are critically important. We're integrating these principles into the products, services, and tools we release so organizations can build on a foundation of security, risk management, and trust.

We are pleased to announce new updates at Ignite to help customers pursue AI responsibly and with confidence.

Setting the standard for responsible AI innovation—expanding our Copilot Copyright Commitment

Microsoft has set the standard with services and tools like Azure AI Content Safety, the Responsible AI Dashboard, model monitoring, and our industry-leading commitment to defend and indemnify commercial customers from lawsuits for copyright infringement.   

Today, we are announcing the expansion of the Copilot Copyright Commitment, now called the Customer Copyright Commitment (CCC), to customers using Azure OpenAI Service. As more customers build with generative AI inside their organizations, they are inspired by the potential of this technology and are eager to commercialize it externally.

By extending the CCC to Azure OpenAI Service, Microsoft is broadening our commitment to defend our commercial customers and pay for any adverse judgments if they are sued for copyright infringement for using the outputs generated by Azure OpenAI Service. This benefit will be available starting December 1, 2023. 

As part of this expansion, we've published new documentation to help Azure OpenAI Service customers implement technical measures and other best practices to mitigate the risk of infringing content. Customers will need to comply with the documentation to take advantage of the benefit. Azure OpenAI Service is a developer service and comes with a shared commitment to build responsibly. We look forward to customers leveraging it as they build their own copilots.

Announcing the Azure AI Advantage offer

We want to be your trusted partner as you deliver next-gen, transformative experiences with pioneering AI technology, a deeply integrated platform, and leading cloud security.  

Azure offers a full, integrated stack purpose-built for cloud-native, AI-powered applications, accelerating your time to market and giving you a competitive edge and superior performance. To help on that journey, we are happy to introduce a new offer to help new and existing Azure AI and GitHub Copilot customers realize the value of Azure AI and Azure Cosmos DB together and get on the fast track to developing AI-powered applications. You can learn more about the Azure AI Advantage offer and register here.

Azure Cosmos DB and Azure AI combined deliver many benefits, including enhanced reliability of generative AI applications through the speed of Azure Cosmos DB, a world-class infrastructure and security platform to grow your business while safeguarding your data, and provisioned throughput to scale seamlessly as your application grows.

Azure AI services and GitHub Copilot customers deploying their AI apps to Azure Kubernetes Service may be eligible for additional discounts. Speak to your Microsoft representative to learn more. 

Empowering all developers with AI-powered tools

There is so much in store this week at Ignite to improve the developer experience, save developers time, and increase their productivity as they build intelligent applications. Let's dive into what's new.

Updates for Azure Cosmos DB—the database for the era of AI

For developers to deliver apps more efficiently and with reduced production costs, at Ignite we’re sharing new features in Azure Cosmos DB.

Now in preview, dynamic scaling gives developers new flexibility to scale databases up or down and brings cost savings to customers, especially those with operations around the globe. We're also bringing AI deeper into the developer experience and increasing productivity with the preview of Microsoft Copilot for Azure, enabling natural language queries in Azure Cosmos DB.

Bond Brand Loyalty turned to Azure Cosmos DB to scale to more than two petabytes of transaction data while maintaining security and privacy for their own customers. On Azure, Bond built a modern offering to support extensive security configurations, reducing onboarding time for new clients by 20 percent.

We’re announcing two exciting updates to enable developers to build intelligent apps: general availability of both Azure Cosmos DB for MongoDB vCore and vector search in Azure Cosmos DB for MongoDB vCore.

Azure Cosmos DB for MongoDB vCore allows developers to build intelligent applications with full support for MongoDB data stored in Azure Cosmos DB, which unlocks opportunities for app development thanks to deep integration with other Azure services. That means developers can enjoy the benefits of native Azure integrations, low total cost of ownership (TCO), and a familiar vCore architecture when migrating existing applications or building new ones. 

Vector search in Azure Cosmos DB for MongoDB vCore allows developers to seamlessly integrate data stored in Azure Cosmos DB into AI-powered applications, including those using Azure OpenAI Service embeddings. Built-in vector search enables you to efficiently store, index, and query high-dimensional vector data, and eliminates the need to transfer the data outside of your Azure Cosmos DB database.
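As a sketch, a vector query against Azure Cosmos DB for MongoDB vCore can be expressed as an aggregation pipeline with a `cosmosSearch` stage, built below in Python. The field name `embedding`, the projected fields, and the query vector are assumptions for illustration; consult the service documentation for the exact stage options available to your account.

```python
def build_vector_search_pipeline(query_vector, path="embedding", k=5):
    """Build a MongoDB aggregation pipeline for a nearest-neighbor query.

    `path` names the document field that holds the stored embedding vector
    (an assumption for this sketch), and `k` is the number of nearest
    neighbors to return.
    """
    return [
        {
            "$search": {
                "cosmosSearch": {
                    "vector": query_vector,  # embedding of the user's query
                    "path": path,
                    "k": k,
                }
            }
        },
        # Project only the fields the application needs from each match.
        {"$project": {"_id": 0, "title": 1, "content": 1}},
    ]

pipeline = build_vector_search_pipeline([0.12, -0.45, 0.78], k=3)
# With a live PyMongo connection: results = collection.aggregate(pipeline)
```

Because the query runs inside the database, the embeddings never leave Azure Cosmos DB, which is the point made above about avoiding data transfer.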

PostgreSQL developers have had built-in vector search in Azure Database for PostgreSQL and Azure Cosmos DB for PostgreSQL since this summer. Now, they can take advantage of the public preview of the Azure AI extension in Azure Database for PostgreSQL to build rich generative AI solutions powered by LLMs.

KPMG Australia used the vector search capability when they turned to Azure OpenAI Service and Azure Cosmos DB to build their own copilot application. The KymChat app has helped employees speed up productivity and streamline operations. The solution is also being made available to KPMG customers through an accelerator that combines KymChat’s use cases, features, and lessons learned, helping customers accelerate their AI journey.

Building cloud-native and intelligent applications

Intelligent applications combine the power of AI and cloud-scale data with cloud-native app development to create highly differentiated digital experiences. The synergy between cloud-native technologies and AI is a tangible opportunity for evolving traditional applications, making them intelligent, and delivering more value to end users. We’re dedicated to continually enhancing Azure Kubernetes Service to meet these evolving demands of AI for customers who are just getting started as well as those who are more advanced.

Customers can now run specialized machine learning workloads like LLMs on Azure Kubernetes Service more cost-effectively and with less manual configuration. The Kubernetes AI toolchain operator automates LLM deployment on AKS across available CPU and GPU resources by selecting optimally sized infrastructure for the model. It makes it possible to easily split inferencing across multiple lower-GPU-count virtual machines (VMs), increasing the number of Azure regions where workloads can run, eliminating wait times for higher-GPU-count VMs, and lowering overall cost. Customers can also run preset open-source models hosted on AKS, significantly reducing costs and overall inference service setup time while eliminating the need for teams to be experts on available infrastructure.

Azure Kubernetes Fleet Manager is now generally available and enables multi-cluster and at-scale scenarios for Azure Kubernetes Service clusters. Fleet Manager gives admins global-scale management of workload distribution across clusters and facilitates platform and application updates, so developers can rest assured they are running on the latest and most secure software.

We’ve also been sharing learnings about how to help engineering organizations enable their own developers to get started and be productive quickly, while still ensuring systems are secure, compliant, and cost-controlled. Microsoft is providing a core set of technology building blocks and learning modules to help organizations get started on their journey to establish a platform engineering practice. 

New Microsoft Dev Box capabilities to improve the developer experience

Maintaining a developer workstation that can build, run, and debug your application is critical to keeping up with the pace of modern development teams. Microsoft Dev Box provides developers with secure, ready-to-code developer workstations for hybrid teams of any size. 

We’re introducing new preview capabilities to give development teams more granular control over their images, the ability to connect to Hosted Networks to simplify connecting to your resources securely, and templates to make it easier to get up and running. Paired with new capabilities coming to Azure Deployment Environments, it’s easier than ever to deploy those projects to Azure.

Build upon a reliable and scalable foundation with .NET 8

.NET 8 is a big leap forward toward making .NET one of the best platforms for building intelligent cloud-native applications, with the first preview of .NET Aspire, an opinionated, cloud-ready stack for building observable, production-ready, distributed cloud-native applications. It includes curated components for cloud-native fundamentals including telemetry, resilience, configuration, and health checks. The stack makes it easier to discover, acquire, and configure essential dependencies for cloud-native applications on day 1 and day 100.

.NET 8 is also the fastest version of .NET ever, with developer productivity enhancements across the stack, whether you are building for the cloud, a full-stack web app, a desktop or mobile app using .NET MAUI, or integrating AI to build the next copilot for your app. These are available in Visual Studio, which also releases today.

Azure Functions and Azure App Service fully support .NET 8 on both Linux and Windows, and both Azure Kubernetes Service and Azure Container Apps also support .NET 8 today.

There are no limits to your innovation potential with Azure

There's so much rolling out this week across data, AI, and digital applications, so I hope you'll tune in to the virtual Ignite experience to hear about the full slate of announcements and more about how you can put Azure to work for your business.

This week’s announcements are proof of our commitment to helping customers take that next step of innovation and stay future-ready. I can’t wait to see how your creativity and new innovations unfold for your business. 

You can check out these resources to learn more about everything shared today. We hope you have a great Ignite week!