129 changes: 129 additions & 0 deletions COBO-INTEGRATION-TEST.md
## COBO Agent E2E Test


1. Get the Bicep template
```powershell
azd init --template https://github.com/Azure-Samples/azd-ai-starter-basic
```

2. Install the required azd version and extension

Install the azd daily build:
```powershell
powershell -ex AllSigned -c "Invoke-RestMethod 'https://aka.ms/install-azd.ps1' -OutFile 'install-azd.ps1'; ./install-azd.ps1 -Version 'daily'"
```
Open a new PowerShell window, then run:
```powershell
azd version
```
It should report the expected commit hash. Most recently:
```
azd version 1.20.2 (commit 2063759a9d972b4b4b8d9a5052bc4b5fa664d7e7)
```

Install the extension:
```powershell
azd extension install azure.foundry.ai.agents
```
Then list the installed extensions:
```powershell
azd ext list
```
You should see output like:
```
Id                        Name                                  Version  Installed Version  Source
azure.coding-agent        Coding agent configuration extension  0.5.1                       azd
azure.foundry.ai.agents   AI Foundry Agents                     0.0.2    0.0.2              azd
microsoft.azd.demo        Demo Extension                        0.3.0                       azd
microsoft.azd.extensions  AZD Extensions Developer Kit          0.6.0                       azd
```

3. Initialize the agent sample
```powershell
azd ai agent init -m https://github.com/coreai-microsoft/foundry-golden-path/tree/main/idea-to-proto/01-build-agent-in-code/agent-catalog-code-samples/cobo-calculator-agent
```

Your folder structure should now look like this:
```
cobo-calculator-agent/
├── .azure/
├── infra/
├── agent.yaml
├── azure.yaml
├── Dockerfile
├── langgraph_agent_calculator.py
└── requirements.txt
```


4. Test
```powershell
azd up
```
Use the following parameters if you don't have a test subscription:
```
? Select an Azure Subscription to use: 87. azure-openai-agents-exp-nonprod-01 (921496dc-987f-410f-bd57-426eb2611356)
? Enter a value for the 'aiDeploymentsLocation' infrastructure parameter: 24. (US) West US 2 (westus2)
? Enter a value for the 'enableCoboAgent' infrastructure parameter: True
```
Contact migu@microsoft to request access to the subscription.


When it finishes, you should see console output like:
```
--- Testing Agent with Data Plane Call ---
Data Plane POST URL: https://ai-account-kply6uaglbh5u.services.ai.azure.com/api/projects/migu-cobo-int-1602/openai/responses?api-version=2025-05-15-preview
Data Plane Payload: {
  "stream": false,
  "agent": {
    "version": "2",
    "name": "Cobo Calculator Agent",
    "type": "agent_reference"
  },
  "input": "Tell me a joke."
}
Data Plane POST completed. Response:
{
  "metadata": {},
  "temperature": null,
  "top_p": null,
  "user": null,
  "model": "",
  "background": false,
  "tools": [],
  "id": "caresp_33443885792eaac0004iX4VWbkLbS4rTYxSKosWlyE6h1DZeFF",
  "object": "response",
  "status": "completed",
  "created_at": 1761349040,
  "error": null,
  "incomplete_details": null,
  "output": [
    {
      "type": "message",
      "id": "msg_33443885792eaac0009lEaLBx6qSEMnII3pakiMz10q6ZnlBVI",
      "status": "completed",
      "role": "assistant",
      "content": [
        {
          "type": "output_text",
          "text": "Why don't skeletons fight each other?\n\nBecause they don't have the guts!",
          "annotations": []
        }
      ]
    }
  ],
  "instructions": null,
  "parallel_tool_calls": false,
  "conversation": null,
  "agent": {
    "type": "agent_id",
    "name": "Cobo Calculator Agent",
    "version": "2"
  }
}

======================================
Azure Portal Links
======================================
Container App: https://portal.azure.com/#@/resource/subscriptions/921496dc-987f-410f-bd57-426eb2611356/resourceGroups/rg-migu-cobo-int-1602/providers/Microsoft.App/containerApps/ca-migu-cobo-int-1602-kply6uaglb
```
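The data-plane call shown in the output above can also be replayed by hand against the deployed project. The sketch below is illustrative only: the endpoint shape and `api-version` are copied from the sample output, the token scope matches the one `langgraph_agent_calculator.py` uses, `<ai-account>` and `<project>` are placeholders you must substitute, and `build_payload`/`call_agent` are hypothetical helper names, not part of the sample.

```python
import json
import urllib.request


def build_payload(agent_name: str, prompt: str, version: str = "2") -> dict:
    """Assemble the agent-reference payload shown in the sample output above."""
    return {
        "stream": False,
        "agent": {"version": version, "name": agent_name, "type": "agent_reference"},
        "input": prompt,
    }


def call_agent(endpoint: str, prompt: str) -> dict:
    """POST the payload to the project's /openai/responses endpoint."""
    # DefaultAzureCredential picks up `az login` or managed identity,
    # the same way the agent code does.
    from azure.identity import DefaultAzureCredential

    token = DefaultAzureCredential().get_token(
        "https://cognitiveservices.azure.com/.default"  # scope used by the agent code
    )
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(build_payload("Cobo Calculator Agent", prompt)).encode(),
        headers={
            "Authorization": f"Bearer {token.token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example (requires a deployed project; substitute your own values):
# endpoint = ("https://<ai-account>.services.ai.azure.com/api/projects/<project>"
#             "/openai/responses?api-version=2025-05-15-preview")
# response = call_agent(endpoint, "What's 3 / 1.5 + 2?")
```

The response JSON has the shape shown above, so the reply text sits at `response["output"][0]["content"][0]["text"]`.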
16 changes: 16 additions & 0 deletions Dockerfile
# syntax=docker/dockerfile:1

FROM mcr.microsoft.com/devcontainers/python:3.11

WORKDIR /app

# Copy application sources
COPY . .

# Install Python dependencies
RUN pip install -r requirements.txt

# Expose the port that the agent server uses
EXPOSE 8088

CMD ["python", "langgraph_agent_calculator.py"]
27 changes: 27 additions & 0 deletions agent.yaml
agent:
  kind: container
  name: Cobo Calculator Agent
  description: This Agent can perform basic arithmetic calculations such as addition, subtraction, multiplication, and division.
  metadata:
    example:
      - role: user
        content: |-
          What's 3 / 1.5 + 2?
    tags:
      - example
      - learning
    authors:
      - migu

models:
  - id: gpt-4o-mini
    provider: azure
    deployment: {{model_deployment_name}}

parameters:
  model_deployment_name:
    schema:
      type: string
      default: gpt-4o-mini
      description: the name of your model deployment
    required: true
21 changes: 21 additions & 0 deletions azure.yaml
requiredVersions:
  extensions:
    # the azd ai agent extension is required for this template
    "azure.foundry.ai.agents": ">=0.0.1"

services:
  cobo-agent:
    project: .
    language: py
    host: containerapp
    docker:
      remoteBuild: true

hooks:
  postdeploy:
    windows:
      shell: pwsh
      run: ./scripts/postdeploy.ps1
      continueOnError: true
      interactive: true
    posix:
      shell: sh
      run: ./scripts/postdeploy.sh
      continueOnError: true
      interactive: true
2 changes: 2 additions & 0 deletions infra/main.bicep
output AI_FOUNDRY_PROJECT_PRINCIPAL_ID string = enableCoboAgent ? coboAgent!.out
output AI_FOUNDRY_PROJECT_TENANT_ID string = enableCoboAgent ? coboAgent!.outputs.AI_FOUNDRY_PROJECT_TENANT_ID : ''
output AI_FOUNDRY_RESOURCE_ID string = '/subscriptions/${subscription().subscriptionId}/resourceGroups/${resourceGroupName}/providers/Microsoft.CognitiveServices/accounts/${aiProject.outputs.aiServicesAccountName}'
output AI_FOUNDRY_PROJECT_RESOURCE_ID string = aiProject.outputs.projectId
// Mock output of AGENT_NAME which should be populated by azd extension by reading from agent.yaml
output AGENT_NAME string = 'Cobo Calculator Agent'
2 changes: 1 addition & 1 deletion infra/main.parameters.json
"value": "${AZURE_PRINCIPAL_TYPE}"
},
"aiProjectDeploymentsJson": {
"value": "${AI_PROJECT_DEPLOYMENTS:=[]}"
"value": "[{\"name\":\"gpt-4o-mini\",\"model\":{\"name\":\"gpt-4o-mini\",\"format\":\"OpenAI\",\"version\":\"2024-07-18\"},\"sku\":{\"name\":\"GlobalStandard\",\"capacity\":50}}]"
},
"aiProjectConnectionsJson": {
"value": "${AI_PROJECT_CONNECTIONS:=[]}"
142 changes: 142 additions & 0 deletions langgraph_agent_calculator.py
import os

from dotenv import load_dotenv
from langchain.chat_models import init_chat_model
from langchain_core.messages import SystemMessage, ToolMessage
from langchain_core.tools import tool
from langgraph.graph import (
    END,
    START,
    MessagesState,
    StateGraph,
)
from typing_extensions import Literal
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

from azure.ai.agentshosting import from_langgraph

load_dotenv()

deployment_name = os.getenv("AZURE_OPENAI_CHAT_DEPLOYMENT_NAME", "gpt-4o-mini")
api_key = os.getenv("AZURE_OPENAI_API_KEY", "")

if api_key:
    llm = init_chat_model(f"azure_openai:{deployment_name}")
else:
    credential = DefaultAzureCredential()
    token_provider = get_bearer_token_provider(
        credential, "https://cognitiveservices.azure.com/.default"
    )
    llm = init_chat_model(
        f"azure_openai:{deployment_name}",
        azure_ad_token_provider=token_provider,
    )


# Define tools
@tool
def multiply(a: int, b: int) -> int:
    """Multiply a and b.

    Args:
        a: first int
        b: second int
    """
    return a * b


@tool
def add(a: int, b: int) -> int:
    """Add a and b.

    Args:
        a: first int
        b: second int
    """
    return a + b


@tool
def divide(a: int, b: int) -> float:
    """Divide a by b.

    Args:
        a: first int
        b: second int
    """
    return a / b


# Augment the LLM with tools
tools = [add, multiply, divide]
tools_by_name = {tool.name: tool for tool in tools}
llm_with_tools = llm.bind_tools(tools)


# Nodes
def llm_call(state: MessagesState):
    """LLM decides whether to call a tool or not"""
    return {
        "messages": [
            llm_with_tools.invoke(
                [
                    SystemMessage(
                        content="You are a helpful assistant tasked with performing arithmetic on a set of inputs."
                    )
                ]
                + state["messages"]
            )
        ]
    }


def tool_node(state: dict):
    """Performs the tool call"""
    result = []
    for tool_call in state["messages"][-1].tool_calls:
        tool = tools_by_name[tool_call["name"]]
        observation = tool.invoke(tool_call["args"])
        result.append(ToolMessage(content=observation, tool_call_id=tool_call["id"]))
    return {"messages": result}


# Conditional edge: route to the tool node or end, based on whether the LLM made a tool call
def should_continue(state: MessagesState) -> Literal["Action", END]:
    """Decide whether to continue the loop or stop, based on whether the LLM made a tool call"""
    messages = state["messages"]
    last_message = messages[-1]
    # If the LLM made a tool call, perform the action
    if last_message.tool_calls:
        return "Action"
    # Otherwise, stop (reply to the user)
    return END


# Build workflow
agent_builder = StateGraph(MessagesState)

# Add nodes
agent_builder.add_node("llm_call", llm_call)
agent_builder.add_node("environment", tool_node)

# Add edges to connect nodes
agent_builder.add_edge(START, "llm_call")
agent_builder.add_conditional_edges(
    "llm_call",
    should_continue,
    {
        "Action": "environment",
        END: END,
    },
)
agent_builder.add_edge("environment", "llm_call")

# Compile the agent
agent = agent_builder.compile()

if __name__ == "__main__":
    adapter = from_langgraph(agent)
    adapter.run()
3 changes: 3 additions & 0 deletions requirements.txt
python-dotenv
my-agents-adapter[langgraph]==0.0.7
azure-identity