Building the Agentic Future
As a business built by developers, for developers, Microsoft has spent decades making it faster, easier and more exciting to create great software. And developers everywhere have turned everything from BASIC and the .NET Framework, to Azure, VS Code, GitHub and more into the digital world we all live in today. But nothing compares to what's on the horizon as agentic AI redefines both how we build and the apps we're building.

In fact, the promise of agentic AI is so strong that market forecasts predict we're on track to reach 1.3 billion AI agents by 2028. Our own data, from 1,500 organizations around the world, shows agent capabilities have jumped as a driver for AI applications from near last to a top-three priority when comparing deployments earlier this year to applications being defined today. Of those organizations building AI agents, 41% chose Microsoft to build and run their solutions, significantly more than any other vendor. But within software development the opportunity is even greater, with approximately 50% of businesses intending to incorporate agentic AI into software engineering this year alone.

Developers face a fascinating yet challenging world of complex agent workflows, a constant pipeline of new models, new security and governance requirements, and the continued pressure to deliver value from AI, fast, all while contending with decades of legacy applications and technical debt. This week at Microsoft Build, you can see how we're making this future a reality with new AI-native developer practices and experiences, by extending the value of AI across the entire software lifecycle, and by bringing critical AI, data, and toolchain services directly to the hands of developers, in the most popular developer tools in the world.

Agentic DevOps

AI has already transformed the way we code, with 15 million developers using GitHub Copilot today to build faster. But coding is only a fraction of the developer's time.
Extending agents across the entire software lifecycle means developers can move faster from idea to production, boost code quality, and strengthen security, while removing the burden of low-value, routine, time-consuming tasks. We can even address decades of technical debt and keep apps running smoothly in production. This is the foundation of agentic DevOps—the next evolution of DevOps, reimagined for a world where intelligent agents collaborate with developer teams and with each other.

Agents introduced today across GitHub Copilot and Azure operate like a member of your development team, automating and optimizing every stage of the software lifecycle, from performing code reviews and writing tests to fixing defects and building entire specs. Copilot can even collaborate with other agents to complete complex tasks like resolving production issues. Developers stay at the center of innovation, orchestrating agents for the mundane while focusing their energy on the work that matters most.

Customers like EY are already seeing the impact: "The coding agent in GitHub Copilot is opening up doors for each developer to have their own team, all working in parallel to amplify their work. Now we're able to assign tasks that would typically detract from deeper, more complex work, freeing up several hours for focus time." - James Zabinski, DevEx Lead at EY

You can learn more about agentic DevOps and the new capabilities announced today from Amanda Silver, Corporate Vice President of Product, Microsoft Developer Division, and Mario Rodriguez, Chief Product Officer at GitHub. And be sure to read more from GitHub CEO Thomas Dohmke about the latest with GitHub Copilot.
At Microsoft Build, see agentic DevOps in action in the following sessions, available both in-person May 19 - 22 in Seattle and on-demand:

BRK100: Reimagining Software Development and DevOps with Agentic AI
BRK113: The Agent Awakens: Collaborative Development with GitHub Copilot
BRK118: Accelerate Azure Development with GitHub Copilot, VS Code & AI
BRK131: Java App Modernization Simplified with AI
BRK102: Agent Mode in Action: AI Coding with Vibe and Spec-Driven Flows
BRK101: The Future of .NET App Modernization Streamlined with AI

New AI Toolchain Integrations

Beyond these new agentic capabilities, we're also releasing new integrations that bring key services directly to the tools developers are already using. From the 150 million GitHub users to the 50 million monthly users of the VS Code family, we're making it easier for developers everywhere to build AI apps.

If GitHub Copilot changed how we write code, Azure AI Foundry is changing what we can build. And the combination of the two is incredibly powerful. Now we're bringing leading models from Azure AI Foundry directly into your GitHub experience and workflow, with a new native integration. GitHub Models lets you experiment with leading models from OpenAI, Meta, Cohere, Microsoft, Mistral and more. Test and compare performance while building models directly into your codebase, all within GitHub. You can easily compare model performance and price side by side and swap models with a simple, unified API. And in keeping with our enterprise commitment, teams can set guardrails so model selection is secure, responsible, and in line with your team's policies.

Meanwhile, new Azure Native Integrations give developers seamless access to a curated set of 20 software services from DataDog, New Relic, Pinecone, Pure Storage Cloud and more, directly through the Azure portal, SDK, and CLI.
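The "simple, unified API" model swap described for GitHub Models above can be illustrated with a short sketch. The endpoint URL and model identifiers below are illustrative assumptions for an OpenAI-compatible chat-completions interface, not official values:

```python
import json

# Hypothetical OpenAI-compatible endpoint; the real GitHub Models URL may differ.
ENDPOINT = "https://models.example.com/chat/completions"

def build_chat_request(model: str, user_prompt: str) -> str:
    """Build a chat-completions request body; swapping models changes one field."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_prompt}],
    }
    return json.dumps(payload)

# Comparing two providers side by side means changing only the "model" field;
# the rest of the request (and your calling code) stays identical.
for model in ("openai/gpt-4o-mini", "mistral-ai/mistral-small"):
    body = build_chat_request(model, "Summarize this pull request.")
    print(json.loads(body)["model"])
```

In practice the guardrails mentioned above would sit in front of this call, restricting which values of `model` a team is allowed to send.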
With Azure Native Integrations, developers get the flexibility to work with their preferred vendors across the AI toolchain with simplified single sign-on and management, while staying in Azure. Today, we are pleased to announce the addition of even more developer services:

Arize AI: Arize's platform provides essential tooling for AI and agent evaluation, experimentation, and observability at scale. With Arize, developers can easily optimize AI applications through tools for tracing, prompt engineering, dataset curation, and automated evaluations. Learn more.

LambdaTest HyperExecute: LambdaTest HyperExecute is an AI-native test execution platform designed to accelerate software testing. It enables developers and testers to run tests up to 70% faster than traditional cloud grids by optimizing test orchestration and observability and by streamlining TestOps to expedite release cycles. Learn more.

Mistral: Mistral and Microsoft announced a partnership today, which includes integrating Mistral La Plateforme as part of Azure Native Integrations. Mistral La Plateforme provides pay-as-you-go API access to Mistral AI's latest large language models for text generation, embeddings, and function calling. Developers can use this AI platform to build AI-powered applications with retrieval-augmented generation (RAG), fine-tune models for domain-specific tasks, and integrate AI agents into enterprise workflows.

MongoDB (Public Preview): MongoDB Atlas is a fully managed cloud database that provides scalability, security, and multi-cloud support for modern applications. Developers can use it to store and search vector embeddings, implement retrieval-augmented generation (RAG), and build AI-powered search and recommendation systems. Learn more.

Neon: Neon Serverless Postgres is a fully managed, autoscaling PostgreSQL database designed for instant provisioning, cost efficiency, and AI-native workloads.
Developers can use it to rapidly spin up databases for AI agents, store vector embeddings with pgvector, and scale AI applications seamlessly. Learn more.

Java and .NET App Modernization

Shipping to production isn't the finish line—and maintaining legacy code shouldn't slow you down. Today we're announcing comprehensive resources to help you successfully plan and execute app modernization initiatives, along with new agents in GitHub Copilot to help you modernize at scale, in a fraction of the time. In fact, customers like Ford China are seeing breakthrough results, reducing up to 70% of their Java migration efforts by using GitHub Copilot to automate middleware code migration tasks.

Microsoft's App Modernization Guidance applies decades of enterprise apps experience to help you analyze production apps and prioritize modernization efforts, while applying best practices and technical patterns to ensure success. And now GitHub Copilot transforms the modernization process, handling code assessments, dependency updates, and remediation across your production Java and .NET apps (support for mainframe environments is coming soon!). It generates and executes update plans automatically, while giving you full visibility, control, and a clear summary of changes. You can even raise modernization tasks in GitHub Issues from our proven service Azure Migrate to assign to developer teams. Your apps are more secure, maintainable, and cost-efficient, faster than ever.

Learn how we're reimagining app modernization for the era of AI with the new App Modernization Guidance and the modernization agent in GitHub Copilot to help you modernize your complete app estate.

Scaling AI Apps and Agents

Sophisticated apps and agents need an equally powerful runtime. And today we're advancing our complete portfolio, from serverless with Azure Functions and Azure Container Apps, to the control and scale of Azure Kubernetes Service.
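Storing embeddings with pgvector, as mentioned above for Neon, comes down to ranking rows by vector distance. As a rough, database-free sketch, here is the cosine distance that pgvector-style nearest-neighbour queries compute, in plain Python (document names and vector values are invented for illustration):

```python
import math

def cosine_distance(a, b):
    """Cosine distance (1 - cosine similarity), used to rank nearest embeddings."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

# A tiny in-memory "table" of embeddings; a real app would store these rows
# in Postgres and let the pgvector extension do the ranking server-side.
store = {"doc-a": [1.0, 0.0], "doc-b": [0.7, 0.7], "doc-c": [0.0, 1.0]}
query = [1.0, 0.1]

ranked = sorted(store, key=lambda k: cosine_distance(query, store[k]))
print(ranked[0])  # nearest document to the query vector
```

The point of the sketch is the shape of the operation: embed the query, compute a distance against stored vectors, return the closest rows as retrieval candidates.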
At Build we're simplifying how you deploy, test, and operate open-source and custom models on Kubernetes through the Kubernetes AI Toolchain Operator (KAITO); making it easy to run inference on AI models with the flexibility, auto-scaling, pay-per-second pricing, and governance of Azure Container Apps serverless GPU; helping you create real-time, event-driven workflows for AI agents by integrating Azure Functions with Azure AI Foundry Agent Service; and much, much more.

The platform you choose to scale your apps has never been more important. With new integrations with Azure AI Foundry, advanced automation that reduces developer overhead, and simplified operations, security and governance, Azure's app platform can help you deliver the sophisticated, secure AI apps your business demands. To see the full slate of innovations across the app platform, check out: Powering the Next Generation of AI Apps and Agents on the Azure Application Platform

Tools that keep pace with how you need to build

This week we're also introducing new enhancements to our tooling to help you build as fast as possible and explore what's next with AI, all directly from your editor. GitHub Copilot for Azure brings Azure-specific tools into agent mode in VS Code, keeping you in the flow as you create, manage, and troubleshoot cloud apps. Meanwhile, the Azure Tools for VS Code extension pack brings everything you need to build apps on Azure using GitHub Copilot to VS Code, making it easy to discover and interact with the cloud services that power your applications.

Microsoft's gallery of AI App Templates continues to expand, helping you rapidly move from concept to production app, deployed on Azure. Each template includes a fully working application, complete with app code, AI features, infrastructure as code (IaC), configurable CI/CD pipelines with GitHub Actions, and an application architecture, ready to deploy to Azure.
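The event-driven agent workflows described above (a Functions-style trigger handing incoming events to an agent) follow a simple dispatch pattern. This is a generic sketch, not the Azure Functions programming model; the event types and handler names are hypothetical:

```python
from queue import Queue

# Hypothetical agent handler: in a real system this would call an agent service.
def handle_support_ticket(event):
    return f"agent triaged: {event['subject']}"

# Route each event type to a handler, as a trigger binding would.
HANDLERS = {"ticket.created": handle_support_ticket}

def run_once(events: Queue):
    """Drain the queue, dispatching each event to its registered handler."""
    results = []
    while not events.empty():
        event = events.get()
        handler = HANDLERS.get(event["type"])
        if handler:
            results.append(handler(event))
    return results

inbox = Queue()
inbox.put({"type": "ticket.created", "subject": "login fails"})
print(run_once(inbox))
```

In the managed version of this pattern, the queue, scaling, and retries are the platform's job; your code is only the handler.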
These templates reflect the most common patterns and use cases we see across our AI customers, from getting started with AI agents to building GenAI chat experiences with your enterprise data, and helping you learn how to use best practices such as keyless authentication. Learn more by reading the latest on Build Apps and Agents with Visual Studio Code and Azure.

Building the agentic future

The emergence of agentic DevOps, the new wave of development powered by GitHub Copilot, and the new services launching across Microsoft Build will be transformative. But just as we've seen over the first 50 years of Microsoft's history, the real impact will come from the global community of developers. You all have the power to turn these tools and platforms into advanced AI apps and agents that make every business move faster, operate more intelligently, and innovate in ways that were previously impossible. Learn more and get started with GitHub Copilot.

Reimagining App Modernization for the Era of AI
This blog highlights the key announcements and innovations from Microsoft Build 2025. It focuses on how AI is transforming the software development lifecycle, particularly in app modernization. Key topics include the use of GitHub Copilot for accelerating development and modernization, the introduction of the Azure SRE agent for managing production systems, and the launch of the App Modernization Guidance to help organizations modernize their applications with AI-first design. The blog emphasizes the strategic approach to modernization, aiming to reduce complexity, improve agility, and deliver measurable business outcomes.

Diagnose Web App Issues Instantly—Just Drop a Screenshot into Conversational Diagnostics
It's that time of year again—Microsoft Build 2025 is here! And in the spirit of pushing boundaries with AI, we're thrilled to introduce a powerful new preview feature in Conversational Diagnostics.

📸 Diagnose with a Screenshot

No more struggling to describe a tricky issue or typing out long explanations. With this new capability, you can simply paste, upload, or drag a screenshot into the chat. Conversational Diagnostics will analyze the image, identify the context, and surface relevant diagnostics for your selected Azure resource—all in seconds. Whether you're debugging a web app or triaging a customer issue, this feature helps you move from problem to insight faster than ever. Thank you!

FSI Knowledge Mining and Intelligent Document Processing Reference Architecture
FSI customers such as insurance companies and banks rely on their vast amounts of data to provide sometimes hundreds of individual products to their customers. From assessing product suitability, underwriting, fraud investigations, and claims handling, many employees and applications depend on accessing this data to do their jobs efficiently. Since the capabilities of GenAI have been realised, we have been helping our customers in this market transform their business with unified systems that simplify access to this data and speed up the processing times of these core tasks, while remaining compliant with the numerous regulations that govern the FSI space. Combining Knowledge Mining with Intelligent Document Processing provides a powerful solution to reduce the manual effort and inefficiencies of ensuring data integrity and retrieval across the many use cases that most of our customers face daily.

What is Knowledge Mining and Intelligent Document Processing?

Knowledge Mining is a process that transforms large, unstructured data sets into searchable knowledge stores. Traditional search methods often rely on keyword matching, which can miss the context of the information. In contrast, knowledge mining uses advanced techniques like natural language processing (NLP) to understand the context and meaning behind the data, providing a robust searching mechanism that can look across all these data sources and understand the relationships between the data, therefore providing more accurate and relevant results.

Intelligent Document Processing (IDP) is a workflow automation technology designed to scan, read, extract, categorise, and organise meaningful information from large streams of data. Its primary function is to extract valuable information from extensive data sets without human input, thereby increasing processing speed and accuracy while reducing costs.
By leveraging a combination of Artificial Intelligence (AI), Machine Learning (ML), Optical Character Recognition (OCR), and Natural Language Processing (NLP), IDP handles both structured and unstructured documents. By ensuring that the processed data meets the "gold standard" - structured, complete, and compliant - IDP helps organizations maintain high-quality, reliable, and actionable data.

The Power of Knowledge Mining and Intelligent Document Processing as a Unified Solution

Knowledge Mining excels at quickly responding to natural language queries, providing valuable insights and making previously unsearchable data accessible. At the same time, IDP ensures that the processed data meets the "gold standard"—structured, complete, and compliant—making it both reliable and actionable. Together, these technologies empower organisations to harness the full potential of their data, driving better decision-making and improved efficiency.

__________________________________________________________________

Meet Alex: A Day in the Life of a Fraud Case Worker

Responsibilities:
Investigate potential fraud cases by manually searching across multiple systems.
Read and analyse large volumes of information to filter out relevant data.
Ensure compliance with regulatory requirements and maintain data accuracy.
Prepare detailed reports on findings and recommendations.

Lost in Data: The Struggles of Manual Fraud Investigation

Alex receives a new fraud case and starts by manually searching through multiple systems to gather information. This process takes several hours, and Alex has to read through numerous documents and emails to filter out relevant data. The inconsistent data formats and locations make it challenging to ensure accuracy. By the end of the day, Alex is exhausted and has only made limited progress on the case.
Effortless Efficiency: Fraud Investigation Transformed with Knowledge Mining and IDP

Alex receives a new fraud case and needs to gather all relevant information quickly. Instead of manually searching through multiple systems, Alex inputs the following natural language query into the unified system: "Show me all documents, emails, and notes related to the recent transactions of client X that might indicate fraudulent activity." The system quickly retrieves and presents a comprehensive summary of all relevant documents, emails, and notes, ensuring that the data is structured, complete, and compliant. This allows Alex to focus on analysing the data and making informed decisions, significantly improving the efficiency and accuracy of the investigation.

How have Knowledge Mining and IDP transformed Alex's role?

Before implementing Knowledge Mining and Intelligent Document Processing, Alex faced a manual process of searching across multiple systems to gather information. This was time-consuming and labour-intensive, often leading to delays in investigations. The overwhelming volume of data from various sources made it difficult to filter out relevant information, and the inconsistent data formats and locations increased the risk of errors. This high workload not only reduced Alex's efficiency but also led to burnout and decreased job satisfaction.

However, with the introduction of a unified system powered by Knowledge Mining and IDP, these challenges were significantly mitigated. Automated searches using natural language queries allowed Alex to quickly find relevant information, while IDP ensured that the data processed was structured, complete, and compliant. This unified system provided a comprehensive view of the data, enabling Alex to make more informed decisions and focus on higher-value tasks, ultimately improving productivity and job satisfaction.
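The single natural-language query Alex issues above depends on ranking documents from many sources by relevance rather than by exact keyword match. Real systems use NLP embeddings for this; as a toy stand-in, a term-overlap ranker shows the shape of the retrieval step (documents and query are invented):

```python
def score(query: str, doc: str) -> float:
    """Fraction of query terms that appear in the document (toy relevance score)."""
    q_terms = set(query.lower().split())
    d_terms = set(doc.lower().split())
    return len(q_terms & d_terms) / len(q_terms)

# Documents pulled from different systems (email, case notes), all searchable
# through one index instead of one search per system.
docs = {
    "email-17": "transactions for client x flagged as unusual",
    "note-03": "quarterly budget review meeting notes",
}
query = "recent transactions client x fraudulent activity"

ranked = sorted(docs, key=lambda k: score(query, docs[k]), reverse=True)
print(ranked[0])
```

Swapping the scoring function for embedding similarity is what turns this keyword toy into the context-aware search the unified system provides.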
____________________________________________________________________

Example Architecture

Knowledge Mining

Users can interact with the system through a portal on the customer's front-end of choice. This serves as the entry point for submitting queries and accessing the knowledge mining service. Front-end options could include web apps, container services or serverless integrations.

Azure AI Search provides powerful RAG capabilities, while Azure OpenAI provides access to large language models to summarise responses. Combined, these services take the user's query, search the knowledge base, and return relevant information, which can be augmented as required. Prompt engineering can customise how the data is returned.

You define the data sources that your Azure AI Search service will consume. These can be Azure storage services or other data repositories. Data that meets a pre-defined gold standard is queried by Azure AI Search and relevant data is returned to the user. Gold standard data could be based on compliance or business needs.

Power BI can be used to create analytical reports based on the data retrieved and processed. This step involves visualising the data in an interactive and user-friendly manner, allowing users to gain insights and make data-driven decisions.

Intelligent Document Processing (Optional)

Azure Data Factory is a data integration service that allows you to create workflows for moving and transforming data at scale. Business data can be easily ingested into your Azure data storage solutions using pre-built connectors. This event-driven approach ensures that as new data is generated, it can automatically be processed and made ready for use in your knowledge mining solution.

Data can be transformed using Azure Functions apps and Azure OpenAI. Through prompt engineering, the large language model (LLM) can highlight specific issues in the documents, such as grammatical errors, irrelevant content, or incomplete information.
The LLM can then be used to rewrite text to improve clarity and accuracy, add missing information, or reformat content to adhere to guidelines. Transformed data is stored as gold standard data.

____________________________________________________________________

Additional Cloud Considerations

Networking

VNETs (Virtual Networks) are a fundamental component of cloud infrastructure that enable secure and isolated networking configurations within a cloud environment. They allow different resources, such as virtual machines, databases, and services, to communicate with each other securely. Virtual networks ensure that services such as Azure AI Search, Azure OpenAI, and Power BI can securely communicate with each other. This is crucial for maintaining the integrity and confidentiality of sensitive financial data.

ExpressRoute or VPN connections are expected to be used when connecting on-premises infrastructure to Azure. Azure ExpressRoute provides a private, reliable, and high-speed connection between your data center and Microsoft Azure. It allows you to extend your infrastructure to Azure by providing private access to resources deployed in Azure Virtual Networks and public services like App Service, and private endpoints to various other services. This private peering ensures that your traffic never enters the public Internet, enhancing security and performance. ExpressRoute uses Border Gateway Protocol (BGP) for dynamic routing between your on-premises networks and Azure, ensuring efficient and secure data exchange. It also offers built-in redundancy and high availability, making it a robust solution for critical workloads.

Azure Front Door is a cloud-based Content Delivery Network (CDN) and application delivery service provided by Microsoft.
It offers several key features, including global load balancing, dynamic site acceleration, SSL offloading, and a web application firewall, making it an ideal solution for optimizing and protecting web applications. We expect Front Door to be used in scenarios where the architecture serves users outside the organisation.

Azure API Management is expected to be used when rolling the solution out to larger groups, at which point we would integrate additional security, rate limiting, load balancing, and so on.

Monitoring and Governance

Azure Monitor: This service collects and analyses telemetry data from various resources, providing insights into the performance and health of the system. It enables proactive identification and resolution of issues, ensuring the system runs smoothly.

Azure Cost Management and Billing: Provides tools for monitoring and controlling costs associated with the solution. It offers insights into spending patterns and resource usage, enabling efficient financial governance.

Application Insights: Provides application performance monitoring (APM) designed to help you understand how your applications are performing and to identify issues that may affect their performance and reliability.

These components together ensure that the Knowledge Mining and Intelligent Document Processing solution is monitored for performance, secured against threats, compliant with regulations, and managed efficiently from a cost perspective.

____________________________________________________________________

Next steps:
Identify the data and its sources that will feed into your own Knowledge Mine.
Consider whether you also need to implement Intelligent Document Processing to ensure data quality.
Define your 'gold standards'. These guidelines will determine how your data might be transformed.
Consider how to provide access to the data through an application portal, choosing the right front-end technology for your use case.
Once you have configured Azure AI Search to point to the chosen data, consider how you might augment responses using Azure AI LLM models.

Useful resources
AI Landing Zone reference architecture
Azure and OpenAI with API Manager
Secure connectivity from on-premises to Azure hosted solutions

Unlock New AI and Cloud Potential with .NET 9 & Azure: Faster, Smarter, and Built for the Future
.NET 9, now available to developers, marks a significant milestone in the evolution of the .NET platform, pushing the boundaries of performance, cloud-native development, and AI integration. This release, shaped by contributions from over 9,000 community members worldwide, introduces thousands of improvements that set the stage for the future of application development. With seamless integration with Azure and a focus on cloud-native development and AI capabilities, .NET 9 empowers developers to build scalable, intelligent applications with unprecedented ease.

Expanding Azure PaaS Support for .NET 9

With the release of .NET 9, a comprehensive range of Azure Platform as a Service (PaaS) offerings now fully support the platform's new capabilities, including the latest .NET SDK for any Azure developer. This extensive support allows developers to build, deploy, and scale .NET 9 applications with optimal performance and adaptability on Azure. Additionally, developers can access a wealth of architecture references and sample solutions to guide them in creating high-performance .NET 9 applications on Azure's powerful cloud services:

Azure App Service: Run, manage, and scale .NET 9 web applications efficiently. Check out this blog to learn more about what's new in Azure App Service.
Azure Functions: Leverage serverless computing to build event-driven .NET 9 applications with improved runtime capabilities.
Azure Container Apps: Deploy microservices and containerized .NET 9 workloads with integrated observability.
Azure Kubernetes Service (AKS): Run .NET 9 applications in a managed Kubernetes environment with expanded ARM64 support.
Azure AI Services and Azure OpenAI Services: Integrate advanced AI and OpenAI capabilities directly into your .NET 9 applications.
Azure API Management, Azure Logic Apps, Azure Cognitive Services, and Azure SignalR Service: Ensure seamless integration and scaling for .NET 9 solutions.
These services provide developers with a robust platform to build high-performance, scalable, and cloud-native applications while leveraging Azure's optimized environment for .NET.

Streamlined Cloud-Native Development with .NET Aspire

.NET Aspire is a game-changer for cloud-native applications, enabling developers to build distributed, production-ready solutions efficiently. Available in preview with .NET 9, Aspire streamlines app development, with cloud efficiency and observability at its core. The latest updates in Aspire include secure defaults, Azure Functions support, and enhanced container management. Key capabilities include:

Optimized Azure Integrations: Aspire works seamlessly with Azure, enabling fast deployments, automated scaling, and consistent management of cloud-native applications.
Easier Deployments to Azure Container Apps: Designed for containerized environments, .NET Aspire integrates with Azure Container Apps (ACA) to simplify the deployment process. Using the Azure Developer CLI (azd), developers can quickly provision and deploy .NET Aspire projects to ACA, with built-in support for Redis caching, application logging, and scalability.
Built-In Observability: A real-time dashboard provides insights into logs, distributed traces, and metrics, enabling local and production monitoring with Azure Monitor.

With these capabilities, .NET Aspire allows developers to deploy microservices and containerized applications effortlessly on ACA, streamlining the path from development to production in a fully managed, serverless environment.

Integrating AI into .NET: A Seamless Experience

In our ongoing effort to empower developers, we've made integrating AI into .NET applications simpler than ever. Our strategic partnerships, including collaborations with OpenAI, LlamaIndex, and Qdrant, have enriched the AI ecosystem and strengthened .NET's capabilities.
This year alone, usage of Azure OpenAI services has surged to nearly a billion API calls per month, illustrating the growing impact of AI-powered .NET applications.

Real-World AI Solutions with .NET

.NET has been pivotal in driving AI innovations. From internal teams like Microsoft Copilot creating AI experiences with .NET Aspire to tools like GitHub Copilot, developed with .NET to enhance productivity in Visual Studio and VS Code, the platform showcases AI at its best. KPMG Clara is a prime example, developed to enhance audit quality and efficiency for 95,000 auditors worldwide. By leveraging .NET and scaling securely on Azure, KPMG implemented robust AI features aligned with strict industry standards, underscoring .NET and Azure as the backbone for high-performing, scalable AI solutions.

Performance Enhancements in .NET 9: Raising the Bar for Azure Workloads

.NET 9 introduces substantial performance upgrades with over 7,500 merged pull requests focused on speed and efficiency, ensuring .NET 9 applications run optimally on Azure. These improvements contribute to reduced cloud costs and provide a high-performance experience across Windows, Linux, and macOS. To see how significant these performance gains can be for cloud services, take a look at what past .NET upgrades achieved for Microsoft's high-scale internal services:

Bing achieved a major reduction in startup times, enhanced efficiency, and decreased latency across its high-performance search workflows.
Microsoft Teams improved efficiency by 50%, reduced latency by 30–45%, and achieved up to 100% gains in CPU utilization for key services, resulting in faster user interactions.
Microsoft Copilot and other AI-powered applications benefited from optimized runtime performance, enabling scalable, high-quality experiences for users.

Upgrading to the latest .NET version offers similar benefits for cloud apps, optimizing both performance and cost-efficiency.
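Latency reductions like the 30-45% figures cited above are typically verified by comparing latency percentiles before and after an upgrade. A minimal sketch of that comparison, with invented sample values, might look like:

```python
def percentile(samples, p):
    """Nearest-rank percentile of a list of latency samples (milliseconds)."""
    ordered = sorted(samples)
    rank = max(1, round(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Invented latency samples: one set on the old runtime, one after upgrading.
before = [120, 135, 150, 180, 400]
after = [80, 90, 100, 110, 220]

p95_before = percentile(before, 95)
p95_after = percentile(after, 95)
improvement = (p95_before - p95_after) / p95_before
print(f"p95 improved {improvement:.0%}")
```

Comparing tail percentiles (p95/p99) rather than averages is the usual choice here, since tail latency is what users of a high-scale service actually feel.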
For more information on updating your applications, check out the .NET Upgrade Assistant. For additional details on ASP.NET Core, .NET MAUI, NuGet, and more enhancements across the .NET platform, check out the full Announcing .NET 9 blog post.

Conclusion: Your Path to the Future with .NET 9 and Azure

.NET 9 isn't just an upgrade—it's a leap forward, combining cutting-edge AI integration, cloud-native development, and unparalleled performance. Paired with Azure's scalability, these advancements provide a trusted, high-performance foundation for modern applications. Get started by downloading .NET 9 and exploring its features. Leverage .NET Aspire for streamlined cloud-native development, deploy scalable apps with Azure, and embrace new productivity enhancements to build for the future. For additional insights on ASP.NET, .NET MAUI, NuGet, and more, check out the full Announcing .NET 9 blog post. Explore the future of cloud-native and AI development with .NET 9 and Azure—your toolkit for creating the next generation of intelligent applications.

Past due! Act now to upgrade from these retired Azure services
This is your friendly reminder that the following Azure services were retired on August 31, 2024:

- Azure App Service Environment v1 and v2
- Logic Apps Integration Service Environment
- Azure API Management stv1

Congratulations to the majority of our customers who have completed the migration to the latest versions! Your timely actions have ensured the continued security and performance of your applications and data. For those still running the retired environments, it is crucial to migrate immediately to avoid security risks and data loss. As part of the retirement process, Azure has already begun decommissioning the hardware. Your retired environment may experience intermittent outages, or it may be suspended. Please complete your migration as soon as possible.

Azure App Service Environment (ASE) v1 and v2: If your environment experiences any intermittent outages, it is important that you acknowledge the outages in the Azure Portal and begin work to migrate immediately. You may also request a grace period to complete the migration. If there is no request for a grace period or no action from customers after repeated reminders, the environment may be suspended or deleted, or we may attempt to auto-migrate to the new version. Please consider this only as a last resort and complete the migration using the available resources. This last-resort scenario may require additional configuration from customers to bring the applications back online. If your environment has been automatically migrated, please visit the product documentation to learn more: Prevent and recover from an auto-migration of an App Service Environment - Azure App Service Environment | Microsoft Learn.

Logic Apps Integration Service Environment (ISE): Customers who remain on ISE after the retirement date may have experienced outages. To avoid service disruptions, please export your logic app workflows from ISE to Logic Apps Standard at the earliest opportunity.
As of October 1, 2024, Logic Apps executions on all ISE Developer and ISE Premium instances have been stopped, and these instances are now read-only. Logic apps deployed to these instances will be available for export for a limited time. Read-only instances will continue to incur standard charges, so to avoid unnecessary costs we recommend deleting any instances that are no longer in use. From January 6, 2025, all ISE instances (Developer and Premium) will start being deleted, incurring loss of data.

Azure API Management stv1: Customers who remain on APIM stv1 after the retirement date may have experienced outages. As of October 1, 2024, remaining APIM stv1 service instances have started to undergo auto-migration to the APIM stv2 compute platform. Automatic migration may cause downtime for upstream API consumers, and customers may need to update their network dependencies. All affected customers will be notified of the ongoing automatic migration one week in advance through emails to the subscription administrators and Azure Service Health portal notifications. To avoid service disruptions, please migrate instances running on stv1 to stv2 at the earliest opportunity. The latest migration option addresses the networking dependencies, particularly the need for new subnets and IP changes: you can now retain the original IPs, both public and private, significantly simplifying the migration process.

What is the impact on support and SLA?

As of September 1, 2024, the Service Level Agreement (SLA) no longer applies to continued use of the retired products beyond the retirement date. Azure customer support will continue to handle support cases in a commercially reasonable manner. No new security and compliance investments will be made, and the ability to effectively mitigate issues that might arise from lower-level Azure dependencies may be impaired due to the retirement.

What is the call to action?
If you are still running one or more of these services, please use the resources listed here (announcements, Learn Live episodes, and migration resources) to complete the migration at the earliest opportunity.

App Service Environment version 1 and version 2 will be retired on 31 August 2024
- Learn Live: Episode 1; Bonus episode: Side by side migration
- Migration resources: App Service Environment version 3 migration; Using the in-place migration feature; Auto migration overview and grace period; Estimate your cost savings

Integration Services Environment will be retired on 31 August 2024
- Learn Live: Episode 2
- Migration resources: Logic Apps Standard migration; Export ISE workflows to a Standard logic app; ISE Retirement FAQ

Support for API Management instances hosted on the stv1 platform will be retired by 31 August 2024
- Learn Live: Episode 3
- Migration resources: API Management STV2 migration

Exciting Updates Coming to Conversational Diagnostics (Public Preview)
Last year, at Ignite 2023, we unveiled Conversational Diagnostics (Preview), a revolutionary tool integrated with AI-powered capabilities to enhance problem-solving for Windows Web Apps. This year, we're thrilled to share what's new and forthcoming for Conversational Diagnostics (Preview). Get ready to experience a broader range of functionalities and expanded support across various Azure products, making your troubleshooting journey even more seamless and intuitive.

Unlock Business Growth with Azure Application Platform innovations
AI is reshaping industries, driving transformation in how businesses operate, communicate, and serve customers. In today's fast-evolving Generative AI landscape, businesses feel the urgency to transform their customer experiences and business processes with AI applications. In a recent study, IDC found that Generative AI usage has jumped significantly, up from 55% in 2023 to 75% in 2024. The return on investment is also significant: IDC found that companies on average experience a 3.7x return on their Generative AI investment.

To deliver business impact with AI applications, businesses are equally looking to modernize existing applications. However, building and modernizing applications to deliver scalable, reliable, and highly performant AI applications can be complex, time-consuming, and resource-intensive. Microsoft, a Leader in the Gartner Magic Quadrant for Cloud Application Platforms and for Container Management, provides a comprehensive application platform designed to address these challenges. Azure's application platform and services give developers and IT operators a comprehensive suite of services to build, deploy, and scale modern intelligent applications quickly, securely, and efficiently. Join us at Microsoft Ignite 2024, from Nov 18-22 in Chicago and online, to discover the latest Azure Application Platform updates, enhancements, and tools to accelerate your AI app development and app modernization journey.

Accelerate business growth with AI Apps

From container-based services with Azure Kubernetes Service (AKS) and Azure Container Apps (ACA), to Platform as a Service (PaaS) offerings like Azure App Service, to powerful integration capabilities with Azure Integration Services and serverless capabilities with Azure Functions, Azure's application platform provides a complete, end-to-end solution for building, deploying, and managing AI applications, all in one place.
The unified platform enables businesses to go from idea to production faster by leveraging an extensive array of 1600+ AI models in the Azure AI Studio model catalog, integration with popular developer tools like GitHub, GitHub Copilot, and Visual Studio, and real-time transactional data in Azure databases. At Microsoft Ignite, we are announcing several new enhancements to our application platform to help you build transformational AI applications:

Real-time AI inferencing with serverless GPUs: Azure Container Apps now supports serverless GPUs in public preview. Serverless GPUs enable you to seamlessly run AI workloads on demand with automatic scaling, optimized cold start, per-second billing with scale down to zero when not in use, and reduced operational overhead, supporting easy real-time custom model inferencing and other machine learning tasks. Learn more.

Azure Container Apps dynamic sessions: Dynamic sessions in Azure Container Apps, now generally available, provide fast, sandboxed, ephemeral compute suitable for running AI-generated, untrusted code at scale in multi-tenancy scenarios. Each session has full compute isolation using Hyper-V, giving you easy access to fast, ephemeral, sandboxed compute on Azure without managing infrastructure. Learn more.

AI capabilities in Azure Logic Apps: AI capabilities are now available in the Azure Logic Apps Consumption SKU in public preview, offering advanced features like the Azure AI Search connector, Azure OpenAI connector, and Form Recognizer. These enhancements enable intelligent document processing, enhanced search, and language capabilities for more intelligent workflows. Additionally, Azure Logic Apps Standard now supports Templates, providing pre-built workflow solutions to streamline integration workflow development. Learn more.

AI App Template Gallery: To help developers quickly get started with building AI apps, we've created an AI App Template Gallery that features templates for common AI use cases.
With these templates, you can start building AI apps on Azure in as little as 5 minutes.

Free services for AI apps: Start building AI apps with free Azure application, data, and AI services to minimize upfront costs. Explore which services offer free monthly amounts to estimate the cost for your project, and discover specialized programs for startups and students to develop AI-powered apps on Azure. Learn more.

Learn how customers like H&R Block, Pimco, Novo Nordisk, Harvey, Jato Dynamics, Medigold Health, and C.H. Robinson are delivering transformational business impact with AI applications on Azure.

Modernize your Apps for AI and continuous innovation

To remain competitive, organizations must keep up with modern app development trends and meet evolving customer expectations. This means accelerating application development speed, enhancing scalability and agility, and overcoming the limitations of legacy applications, which can be costly to maintain and challenging to innovate on. At Microsoft Ignite, we are announcing several new features and enhancements to help you accelerate your app modernization and become AI-ready faster.

Azure App Service: We are excited to announce the general availability of sidecars in Azure App Service, a versatile pathway for organizations to modernize existing apps and add powerful new capabilities without the significant rewrites normally required in the main application code. Sidecars let you add new capabilities, such as AI, logging, monitoring, and security SDKs, alongside your primary application without the need to significantly modify and redeploy the app. Learn more.

GitHub Copilot upgrade assistant for Java: Keeping Java apps up to date with the latest versions can be a time-consuming task. The GitHub Copilot upgrade assistant for Java, currently in private preview, enables you to use AI to simplify upgrading Java applications with autonomous AI agents, ensuring trust and transparency throughout the upgrade experience. Learn more.
Bring Your Own License for JBoss EAP on Azure App Service: We are excited to announce that general availability of Bring Your Own License for JBoss EAP on Azure App Service is coming in December 2024. You can use existing JBoss EAP subscriptions to deploy applications to Azure App Service, making it far easier to move existing applications to the cloud while retaining support from both Red Hat and Microsoft across the application lifecycle. Learn more.

Logic Apps Hybrid Deployment Model: Azure Logic Apps has introduced a new Hybrid Deployment Model, enabling businesses to run integration workflows on their own infrastructure: on-premises, in private clouds, or in other public clouds. This model allows greater flexibility for meeting specific regulatory, privacy, and network requirements while still benefiting from the rich library of 1400+ Logic Apps connectors for seamless integration with enterprise systems. Learn more.

Azure Migrate application assessments: The newly released application-aware assessment capability in Azure Migrate helps with application-level migrations, rather than individual servers or application components. The Azure Migrate app and code assessment tool and its GitHub Copilot Chat integration offer more granular code assessment and AI-assisted suggestions for the changes required to successfully run .NET and Java applications on Azure. Learn more.

Learn how customers like Commercial Bank of Dubai, Scandinavian Airlines, Nexi, Ossur, Sapiens, MSC Mediterranean Shipping Company, and Finastra leverage Azure's application platform to modernize their business-critical applications and deliver enhanced customer experiences.

Scale and secure enterprise-grade production apps

When working with customers on modernizing their applications, we consistently hear about challenges in securing and scaling legacy systems. Outdated infrastructure often leads to operational bottlenecks, slows development, and impacts competitiveness.
With rising cyber threats, older systems can lack the robust security measures necessary to protect data and applications. Microsoft Azure addresses these needs with a globally available, enterprise-grade platform that integrates security at every level. Check out these Azure application platform announcements at Microsoft Ignite to help you scale and secure your enterprise applications:

Scale, secure, and optimize Azure Kubernetes Service (AKS): AKS is designed to simplify Kubernetes adoption for developers and operators of all skill levels. With the latest updates, AKS is now more user-friendly, secure, and cost-efficient, allowing you to focus on building and scaling your applications without worrying about the underlying infrastructure. Read the summaries below and this blog for more details.

- Cluster upgrades are now more reliable and efficient, and multiple clusters can be upgraded automatically. Additionally, AKS Automatic (preview) will now dynamically select an appropriate virtual machine (VM) size based on the capacity in your Azure subscription.
- You now have full visibility into runtime and host vulnerabilities in your AKS cluster. The AKS security dashboard (now available in preview as a blade in the Azure portal) offers a simplified, streamlined view of security vulnerabilities and remediation insights for resource owners and cluster administrators.
- Trusted launch (generally available) enhances the security of certain virtual machines by protecting against advanced and persistent attack techniques.
- Intelligent workload scheduling in Azure Kubernetes Fleet Manager is now generally available, giving operators more control when deploying workloads to optimize resource utilization and simplify multi-cluster scenarios.
- Auto-instrumentation through Application Insights (coming soon) makes it easier to access telemetry like metrics, requests, and dependencies, as well as visualizations like the application dashboard and application map.
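As a concrete illustration of the automatic cluster upgrades mentioned above, an upgrade channel can be enabled declaratively on a cluster. A minimal Bicep sketch follows; the cluster name, node pool sizing, and API version are assumptions for illustration, not details from this announcement:

```bicep
// Hypothetical minimal AKS cluster that follows the 'stable' auto-upgrade channel,
// so Kubernetes version upgrades are applied automatically as stable releases ship.
resource aks 'Microsoft.ContainerService/managedClusters@2024-05-01' = {
  name: 'demo-aks'                 // placeholder cluster name
  location: resourceGroup().location
  identity: { type: 'SystemAssigned' }
  properties: {
    dnsPrefix: 'demo-aks'
    autoUpgradeProfile: {
      upgradeChannel: 'stable'     // other channels include 'patch', 'rapid', 'node-image'
    }
    agentPoolProfiles: [
      {
        name: 'system'
        mode: 'System'
        count: 2
        vmSize: 'Standard_D4s_v5'  // placeholder VM size
      }
    ]
  }
}
```

The same setting can also be changed on an existing cluster through the portal or CLI; the declarative form is shown here because it keeps the upgrade policy versioned alongside the rest of the cluster definition.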
Expanded GenAI gateway capabilities in Azure API Management: Azure API Management has expanded support for GPT-4o models, including both text and image-based capabilities. Additionally, the new generative AI gateway token quota allows flexible token quotas (daily, weekly, or monthly), helping organizations control costs and track usage patterns for effective resource management. Learn more.

Achieve instant fast scaling with Azure Functions: Flex Consumption is a new Azure Functions hosting plan that builds on the Consumption pay-per-second serverless billing model, with automatic scale down to zero when not in use for cost efficiency. With the Flex Consumption plan now generally available, you can integrate with your virtual network at no extra cost, ensuring secure and private communication with no considerable impact on your app's scale-out performance. Learn more.

Learn how customers like BMW, ABN-AMRO, SPAR, and ClearBank are scaling and operating mission-critical, enterprise-grade production apps on the Azure application platform.

With the Azure Application Platform, you can build new AI applications and modernize your existing applications faster, while ensuring end-to-end security and scalability across the entire app development journey, from development to production. Join us at Ignite this week and learn more about updates for Azure Digital and App Innovation: Here.