Sunday, March 30, 2025

Deploying Azure ML Prompt Flow to Azure App Service: A Step-by-Step Guide

Introduction

When exploring deployment options for my recent Azure ML Prompt Flow project, I found that Azure App Service offered the perfect balance of simplicity and functionality. This approach stood out for its ability to get AI applications into production quickly with minimal overhead.

My Integration Scenario

My specific use case centered on having Microsoft Copilot Studio handle all UI orchestration, connecting to BotFramework-WebChat via the Direct Line API. Copilot Studio communicates with the deployed Azure Prompt Flow backend through REST API workflow tasks. The BotFramework-WebChat implementation is architected around a Redux state-management pattern, ensuring efficient data flow and a responsive, dynamic user experience.
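As a rough sketch of the Copilot Studio side of this architecture: BotFramework-WebChat opens a Direct Line v3 conversation and then drives it through its Redux store. The secret below is a placeholder; in production you would exchange the channel secret for a short-lived token rather than shipping the secret to the client.

```shell
# Placeholder secret; exchange it for a short-lived token in production
DIRECT_LINE_SECRET="<your-direct-line-secret>"

# Start a Direct Line v3 conversation (the same endpoint WebChat talks to)
curl -s --max-time 5 -X POST \
  "https://directline.botframework.com/v3/directline/conversations" \
  -H "Authorization: Bearer $DIRECT_LINE_SECRET" \
  || echo "request failed (check network and Direct Line secret)"
```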

Solution Architecture

Architecture Diagram - Azure ML Prompt Flow with App Service



Why Choose Azure App Service for Prompt Flow?

- Quick Deployment: Get your AI flows into production faster

- Minimal Infrastructure Management: Focus on your application, not infrastructure

- Perfect for Smaller to Medium-Scale Applications: Right-sized solution

- Simple Yet Scalable Solution: Start small and scale as needed

Step-by-Step Deployment Guide

Pre-requisites and Environment Setup

Before starting the deployment process, ensure you have the following prerequisites installed and configured:

- Python 3.12 or higher

- PowerShell or Git-Bash

- Azure CLI

- Docker Desktop (installed and configured)

- Conda Virtual Environment (created and activated)

- Prompt Flow Package (installed via pip)
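To confirm the environment before proceeding, a small shell check like the following can help; the `version_ok` helper and the `pf-env` environment name are my own illustrative choices, not part of the Prompt Flow tooling.

```shell
# version_ok A B: succeeds when version A >= version B (field-wise numeric compare)
version_ok() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

pyver=$(python3 -c 'import sys; print("%d.%d" % sys.version_info[:2])')
if version_ok "$pyver" "3.12"; then
  echo "Python $pyver OK"
else
  echo "need Python 3.12 or higher, found $pyver"
fi

# With the checks passing, the environment setup looks like:
#   conda create --name pf-env python=3.12 --yes   # "pf-env" is an assumed name
#   conda activate pf-env
#   pip install promptflow promptflow-tools
```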

Local Setup: Building and Testing Azure ML Prompt Flows

To begin, I cloned the Microsoft Prompt Flow repository as the foundation for my local development environment. Building directly on this codebase allowed me to leverage the existing Prompt Flow CLI and core functionalities.

Repository: https://github.com/microsoft/promptflow.git

Within this cloned repository, I created my specific Prompt Flow in VS Code, tailoring it to my application's requirements. To ensure proper functionality before deployment, I followed these steps for local testing:

1. Connection Setup: I established the necessary connections, such as the Azure OpenAI connection, using the Prompt Flow CLI:

   ```bash

   pf connection create --file .\honda-PROD\azure_openai_connection.yaml

   ```

   This step ensured that my flow could access the required AI models and services without issues.

2. Local Flow Serving: I then served my Prompt Flow locally using the `pf flow serve` command:

   ```bash

   pf flow serve --source .\honda-PROD\ --port 8085 --host localhost

   ```

   This allowed me to access my flow via `http://localhost:8085/` for immediate testing and iteration.
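   For a quick smoke test of the locally served flow, a curl POST against the `/score` route works well. The `question` input field below is an assumption; match it to whatever inputs your flow actually defines.

   ```shell
   endpoint="http://localhost:8085/score"
   payload='{"question": "What does the flow classify?"}'   # input name is an assumption

   # POST only when `pf flow serve` is actually running
   if curl -s -o /dev/null --max-time 2 "http://localhost:8085/"; then
     curl -s -X POST "$endpoint" -H "Content-Type: application/json" -d "$payload"
   else
     echo "flow is not being served on port 8085"
   fi
   ```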

Preparing for Azure App Service Deployment

Build and Deploy the Flow

Steps:

1. Login to Azure portal using CLI

   ```bash

   az login    # Authenticate yourself

   ```

2. Create a Resource Group

   ```bash

   az group create --name rg-for-honda-pf-app-service --location eastus2

   az group list --output table    # or --output json, to verify the group exists

   ```

3. Create a Container Registry

   ```bash

   az acr create \

     --resource-group <resource-group-name> \

     --name <container-registry-name> \

     --sku <sku> \

     --location <region>

   ```

   Handy sample Azure CLI commands:

   ```bash

   az acr create --name mycrforpf --resource-group rg-for-honda-pf-app-service --sku Basic --location eastus2    # ACR SKUs are Basic, Standard, or Premium

   az acr update --name mycrforpf --admin-enabled true

   ```

4. Build the flow as a Docker-format app

   Use the command below to build the flow into a Docker-format app:

   ```bash

   pf flow build --source ../../flows/standard/web-classification --output dist --format docker

   ```

   This will generate the Dockerfile for you inside the `dist` folder.
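   Before pushing anywhere, the generated image can be smoke-tested locally. Connection secrets are injected as environment variables at run time; `OPEN_AI_CONNECTION_API_KEY` below is an assumed name derived from a connection called `open_ai_connection`, so adapt it to your own connection name.

   ```shell
   # Build and run the image produced by `pf flow build` from the dist folder
   image="web-classification-serve"
   if [ -d dist ]; then
     docker build -t "$image" ./dist
     # Assumed variable name, derived from the connection "open_ai_connection"
     docker run -d -p 8080:8080 \
       -e OPEN_AI_CONNECTION_API_KEY="<your-api-key>" \
       "$image"
   else
     echo "run 'pf flow build' first so the dist folder exists"
   fi
   ```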

5. Deploy the flow to Azure App Service

   The code provided by Microsoft is available in:

   - `/examples/tutorials/flow-deploy/azure-app-service/deploy.sh` (Bash Version)

   - `/examples/tutorials/flow-deploy/azure-app-service/deploy.ps1` (PowerShell Version)

   Use the above deploy script to build and deploy the image.
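   In rough outline, those deploy scripts build the image, push it to the container registry, and point an App Service at it. The sketch below uses assumed resource names and the B1 plan size; the real scripts parameterize all of this, so treat it as orientation, not a replacement.

   ```shell
   # Assumed names throughout; the real deploy scripts parameterize these
   RG="rg-for-honda-pf-app-service"
   ACR="mycrforpf"
   IMAGE="$ACR.azurecr.io/promptflow-serve:latest"
   APP="honda-pf-app"

   if command -v az >/dev/null && az account show >/dev/null 2>&1; then
     az acr login --name "$ACR"
     docker build -t "$IMAGE" ./dist
     docker push "$IMAGE"
     az appservice plan create --name "${APP}-plan" --resource-group "$RG" \
       --is-linux --sku B1    # B1 for the sketch; size the plan to your workload
     az webapp create --name "$APP" --resource-group "$RG" \
       --plan "${APP}-plan" --deployment-container-image-name "$IMAGE"
   else
     echo "not logged in to Azure; run 'az login' first"
   fi
   ```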

Testing the Deployed Flow

Once your flow is deployed to Azure App Service, you can test it by sending a POST request to the endpoint or by browsing the test page. 

To test the flow:

1. **Use a REST client** like Postman or curl to send a POST request to your endpoint

   - Sample endpoint: `https://honda-pf-99d9m.azurewebsites.net/score`

   - Make sure to set the Content-Type header to `application/json`

   - Include your request payload in the body of the POST request

2. **Test via the built-in test page** that comes with the deployment

   - Access the test page by navigating to your App Service URL in a browser

   - This provides a simple interface to test your flow without additional tools

The deployed flow exposes an API endpoint that follows the same interface patterns as when you test locally, making it straightforward to transition from development to production.
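Putting those pieces together, the deployed endpoint can be exercised with a single curl call. The hostname is the sample endpoint above, and the `question` field is an assumed input name; substitute your flow's real inputs.

```shell
endpoint="https://honda-pf-99d9m.azurewebsites.net/score"
payload='{"question": "What does the flow classify?"}'   # assumed input name

# Capture only the HTTP status; 200 means the flow scored the payload
status=$(curl -s -o /dev/null --max-time 5 -w '%{http_code}' -X POST "$endpoint" \
  -H "Content-Type: application/json" -d "$payload")
echo "HTTP $status"
```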

Conclusion

By following these steps, I was able to successfully deploy my Azure ML Prompt Flow to Azure App Service, creating a robust and scalable solution for my AI application. This approach provided the perfect balance of simplicity and functionality, allowing me to get my application into production quickly with minimal overhead.

The combination of Microsoft Copilot Studio for UI orchestration and Azure App Service for backend deployment created a powerful and flexible architecture that can be adapted to a variety of AI application scenarios.

Join the Conversation: Together We Learn

If you've found a better way to handle certain aspects of the deployment, please share your insights - I'm always looking to improve this workflow!
