Sunday, March 30, 2025

Deploying Azure ML Prompt Flow to Azure App Service: A Step-by-Step Guide

Introduction

When exploring deployment options for my recent Azure ML Prompt Flow project, I found that Azure App Service offered the perfect balance of simplicity and functionality. This approach stood out for its ability to get AI applications into production quickly with minimal overhead.

My Integration Scenario

My specific use case centered on having Microsoft Copilot Studio handle all UI orchestration, connecting via the Direct Line API through BotFramework-WebChat. Copilot Studio communicates with the deployed Prompt Flow backend through REST API workflow tasks. The BotFramework-WebChat implementation was architected around a Redux state management pattern, ensuring efficient data flow and a responsive, dynamic user experience.

Solution Architecture

Architecture Diagram - Azure ML Prompt Flow with App Service



Why Choose Azure App Service for Prompt Flow?

- Quick Deployment: Get your AI flows into production faster

- Minimal Infrastructure Management: Focus on your application, not infrastructure

- Perfect for Smaller to Medium-Scale Applications: Right-sized solution

- Simple Yet Scalable Solution: Start small and scale as needed

Step-by-Step Deployment Guide

Pre-requisites and Environment Setup

Before starting the deployment process, ensure you have the following prerequisites installed and configured:

- Python 3.12 or higher

- PowerShell or Git-Bash

- Azure CLI

- Docker Desktop (installed and configured)

- Conda Virtual Environment (created and activated)

- Prompt Flow Package (installed via pip)

Local Setup: Building and Testing Azure ML Prompt Flows

To begin, I cloned the Microsoft Prompt Flow repository as the foundation for my local development environment. Building directly on this codebase allowed me to leverage the existing Prompt Flow CLI and core functionalities.

Repository: https://github.com/microsoft/promptflow.git

Within this cloned repository, I created my specific Prompt Flow in VS Code, tailoring it to my application's requirements. To ensure proper functionality before deployment, I followed these steps for local testing:

1. Connection Setup: I established the necessary connections, such as the Azure OpenAI connection, using the Prompt Flow CLI:

   ```bash

   pf connection create --file .\honda-PROD\azure_openai_connection.yaml

   ```

   This step ensured that my flow could access the required AI models and services without issues.
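
   The connection file itself is a small YAML document. A hypothetical example is shown below, following the Prompt Flow Azure OpenAI connection format; every value (name, key, endpoint, API version) is a placeholder to substitute with your own:

   ```yaml
   # Hypothetical azure_openai_connection.yaml -- replace all values with your own
   $schema: https://azuremlschemas.azureedge.net/promptflow/latest/AzureOpenAIConnection.schema.json
   name: azure_openai_connection
   type: azure_open_ai
   api_key: "<your-api-key>"
   api_base: "https://<your-resource-name>.openai.azure.com/"
   api_type: azure
   api_version: "2024-02-01"
   ```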

2. Local Flow Serving: I then served my Prompt Flow locally using the `pf flow serve` command:

   ```bash

   pf flow serve --source .\honda-PROD\ --port 8085 --host localhost

   ```

   This allowed me to access my flow via `http://localhost:8085/` for immediate testing and iteration.
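
   With the flow served locally, you can exercise the `/score` endpoint directly. A minimal sketch using curl follows; the input field name (`question`) is an assumption — replace it with whatever inputs your flow actually defines:

   ```shell
   # Hypothetical input payload -- replace "question" with your flow's real input field(s)
   PAYLOAD='{"question": "What does this flow do?"}'

   # POST it to the locally served flow (succeeds only while `pf flow serve` is running on port 8085)
   curl -s -X POST "http://localhost:8085/score" \
     -H "Content-Type: application/json" \
     -d "$PAYLOAD" || true
   ```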

Preparing for Azure App Service Deployment

Build and Deploy the FLOW

Steps:

1. Log in to Azure using the CLI

   ```bash

   az login    # Authenticate yourself

   ```

2. Create a Resource Group

   ```bash

   az group create --name rg-for-honda-pf-app-service --location eastus2

   az group list --output table    # or --output json

   ```

3. Create a Container Registry (ACR)

   ```bash

   az acr create \

     --resource-group <resource-group-name> \

     --name <container-registry-name> \

     --sku <sku> \

     --location <region>

   ```

   Handy sample Azure CLI commands:

   ```bash

   # Note: valid ACR SKUs are Basic, Standard, and Premium (ASP-P0v3 is an App Service plan SKU, not an ACR SKU)
   az acr create --name mycrforpf --resource-group rg-for-honda-pf-app-service --sku Basic --location eastus2

   az acr update --name mycrforpf --admin-enabled true

   ```

4. Build the FLOW as a Docker-format app

   Use the following command to export the flow in Docker format:

   ```bash

   pf flow build --source ../../flows/standard/web-classification --output dist --format docker

   ```

   This generates the Dockerfile and supporting files inside the `dist` folder.
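
   Inspecting the generated `dist` folder before deploying is worth the minute it takes. Its exact contents depend on your promptflow version, so treat the layout below as a rough sketch rather than an exhaustive listing:

   ```
   dist/
   ├── Dockerfile        # image definition generated by pf flow build
   ├── flow/             # your flow definition and source files
   ├── connections/      # connection YAMLs (secrets referenced via environment variables)
   ├── runit/            # service startup scripts
   └── settings.json     # environment variable placeholders for the container
   ```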

5. Deploy the FLOW to Azure as an App Service

   The code provided by Microsoft is available in:

   - `/examples/tutorials/flow-deploy/azure-app-service/deploy.sh` (Bash Version)

   - `/examples/tutorials/flow-deploy/azure-app-service/deploy.ps1` (PowerShell Version)

   Use the above deploy script to build and deploy the image.
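
   A hypothetical invocation of the Bash version is shown below. The flag names reflect the script as I found it, so check the script's header comments for the exact parameters in your promptflow version:

   ```bash
   bash deploy.sh --path dist -i <image-tag> -n <app-name> -g rg-for-honda-pf-app-service
   ```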

Testing the Deployed Flow

Once your flow is deployed to Azure App Service, you can test it by sending a POST request to the endpoint or by browsing the test page. 

To test the flow:

1. **Send a POST request using a REST client** such as Postman or curl

   - Sample endpoint: `https://honda-pf-99d9m.azurewebsites.net/score`

   - Make sure to set the Content-Type header to `application/json`

   - Include your request payload in the body of the POST request

2. **Test via the built-in test page** that comes with the deployment

   - Access the test page by navigating to your App Service URL in a browser

   - This provides a simple interface to test your flow without additional tools
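
The curl approach from local testing carries over unchanged; only the host differs. Here is a hedged example against the sample endpoint above — the input field name (`question`) is an assumption, so use your flow's actual input names:

```shell
# Your deployed App Service URL plus the /score route
ENDPOINT="https://honda-pf-99d9m.azurewebsites.net/score"

# Hypothetical payload -- field names depend on your flow's inputs
PAYLOAD='{"question": "Hello from production"}'

# -w prints the HTTP status code so you can confirm a 200 at a glance
curl -s -o /dev/null -w '%{http_code}\n' -X POST "$ENDPOINT" \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD" || true
```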

The deployed flow exposes an API endpoint that follows the same interface patterns as when you test locally, making it straightforward to transition from development to production.

Conclusion

By following these steps, I was able to successfully deploy my Azure ML Prompt Flow to Azure App Service, creating a robust and scalable solution for my AI application. This approach provided the perfect balance of simplicity and functionality, allowing me to get my application into production quickly with minimal overhead.

The combination of Microsoft Copilot Studio for UI orchestration and Azure App Service for backend deployment created a powerful and flexible architecture that can be adapted to a variety of AI application scenarios.

Join the Conversation: Together We Learn

If you've found a better way to handle certain aspects of the deployment, please share your insights - I'm always looking to improve this workflow!

Saturday, March 29, 2025

Unleash Your LLM Potential: A Practical Guide to Deploying an Azure ML Prompt Flow

Introduction


In the rapidly evolving landscape of Large Language Models (LLMs), efficiently deploying your AI applications is crucial. Today, I want to share my recent exploration into deploying Azure Machine Learning Prompt Flows, a powerful tool for streamlining the entire LLM application development lifecycle.

What is Azure ML Prompt Flow?

Azure ML Prompt Flow is more than just a development tool; it's a complete ecosystem designed to streamline the entire lifecycle of AI application development.

Think of it as an orchestrator that lets you chain together prompts, Python scripts, data sources, and evaluation metrics in a structured pipeline.

My Recent Project: Diving into Deployment

My goal was simple: transform a promising AI prototype into a robust, production-ready application. Azure ML Prompt Flow promised to simplify the process, and I was eager to see if it lived up to the hype. It definitely did, providing a structured way to build, test, and iterate on my LLM-powered application.

My Journey Through Azure Prompt Flow Deployment Strategies

Deploying an AI application is no longer a one-size-fits-all endeavor. My recent project with Azure Prompt Flow illuminated the complexity and flexibility of modern AI deployment strategies. Drawing directly from Microsoft's official documentation, I'll break down the four primary deployment approaches that can transform your AI project from a prototype to a production-ready solution.

Deployment Approaches: A Deep Dive

These are the four available approaches recommended by Microsoft:

**1. Deploy to Azure App Service**

Azure App Service offers a fully managed platform for hosting web applications. This approach is particularly compelling for developers seeking:

* Rapid deployment

* Minimal infrastructure management

* Easy scaling capabilities

* Simplified web application hosting

**2. Deploy a flow using Docker**

Docker lets you package your Prompt Flow and its dependencies into a portable container, offering consistency and portability across environments:

Key capabilities:

* Package entire application environment

* Ensure consistency across development and production

* Simplify dependency management

* Enable seamless migration across different infrastructures

**3. Deploy a flow using Kubernetes**

For applications demanding maximum scalability and reliability, Kubernetes emerges as the gold standard:

Key benefits:

* Advanced container orchestration

* Automatic scaling and load balancing

* High availability architecture

* Complex microservice management

**4. Deploy the Prompt Flow code to Azure AI Foundry**

Microsoft's Azure AI Foundry represents the next evolution in AI application deployment:

Azure AI Foundry is a newer offering that provides an integrated development environment for creating and sharing AI solutions.

This option is great for collaborative development and sharing of AI solutions.

Key capabilities:

* Integrated AI development environment

* Streamlined model management

* Enhanced collaboration tools

* Comprehensive AI solution lifecycle support

Choosing Your Deployment Strategy

Selecting the right approach depends on multiple factors:

**Project Complexity:**

* Simple web app → Azure App Service

* Consistent environment needs → Docker

* Enterprise-scale applications → Kubernetes

* Collaborative AI development → Azure AI Foundry

**Other Factors:**

* Scalability Requirements

* Team Expertise

* Infrastructure Constraints

* Performance Expectations

My two cents:

If you're looking to deploy Azure ML Prompt Flows, don't be afraid to experiment. Choose a deployment method that aligns with your needs and comfort level, and be prepared to get your hands dirty. The learning experience is worth it!

I'd love to hear about your experiences deploying Prompt Flows! Share your stories and tips in the comments below.


Happy deploying!!!

Thursday, March 6, 2025

Webchat.JS / DirectLine API - how to clear the chat bot's previous messages

```js
store.getState().activities = [];
```

For more details, see this Stack Overflow post:
 
https://stackoverflow.com/questions/79487310/webchat-js-direct-line-api-copilot-studio-reset-the-chat-bot-messages