
Commit 776ab25

Merge pull request #17 from Ravi80335/feat/tc-mcp-agent
Backend Agent integrated for MS Teams tab app
2 parents d4a6a22 + 8602c72 commit 776ab25


76 files changed: +20703 -116 lines changed

.dockerignore

Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
# Ignore node_modules
node_modules
teamsTab/node_modules

# Ignore build artifacts
dist
teamsTab/dist

# Ignore logs
*.log

.env.example

Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
#
TOPCODER_API_BASE_URL="https://api.topcoder.com/v5"
AUTH0_M2M_TOKEN_URL="https://auth0proxy.topcoder-dev.com/token" # "https://topcoder-dev.auth0.com/oauth/token"
AUTH0_M2M_AUDIENCE="https://m2m.topcoder-dev.com/"
AUTH0_CLIENT_ID=""

# Config for the LLM Agent for the MS Teams tab app
# Azure AD SSO Configuration (from your App Registration in Entra ID)
AZURE_AD_AUDIENCE="api://diamondlike-crosstied-yuette.ngrok-free.app/82d17b02-2d34-4594-b243-09c516aad2e8"
AZURE_AD_TENANT_ID="9c8f4932-d163-42f7-aed6-b0117730a6e6"
IS_SAME_AZURE_AD_TENANT=false # set true to require the logged-in user's tenant to match
MOCK_AZURE_AD_VALIDATION=false # set true to use a mock token instead of Azure AD

AWS_ACCESS_KEY_ID=""
AWS_SECRET_ACCESS_KEY=""
AWS_BEDROCK_REGION="us-east-1"
AWS_BEDROCK_MODEL_ID="anthropic.claude-3-5-sonnet-20240620-v1:0"

MONGO_DB_URL="mongodb://localhost:27017/teams-ai-agent"


# *** *********************************** *** #
# *** Frontend /teamsTab VITE Environment *** #
# Backend API URL for the frontend
VITE_API_BASE_URL="https://api.topcoder.com/v6/mcp/agent"

# For local development in a web browser, use this variable to bypass MS Teams-related code such as authentication and theming.
VITE_IS_NOT_TEAMS_TAB=true

# Use this variable to bypass Azure AD SSO and use a mock token.
# NOTE: In production, use Azure AD to authenticate requests.
VITE_MOCK_VALIDATE_TOKEN=true
Dockerfile

Lines changed: 6 additions & 0 deletions
@@ -5,10 +5,16 @@ FROM node:22.13.1-alpine
 RUN apk add --no-cache bash
 RUN apk update

+# Declare ARGs to receive the variables from build command.
+ARG VITE_API_BASE_URL
+
+ENV VITE_API_BASE_URL=$VITE_API_BASE_URL
 WORKDIR /app
 COPY . .
 RUN npm install pnpm -g
 RUN pnpm install
+RUN pnpm run build:frontend
 RUN pnpm run build
 RUN chmod +x appStartUp.sh
 CMD ./appStartUp.sh
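
The `ARG`/`ENV` pair exists because Vite inlines `VITE_`-prefixed variables into the frontend bundle at build time, so `VITE_API_BASE_URL` must already be set when `pnpm run build:frontend` runs. As a rough illustration (the file path and constant names below are assumptions, not the repository's actual module), the frontend would consume it like this:

```ts
// teamsTab/src/config.ts (illustrative path; the real module may differ)
// Vite replaces `import.meta.env.VITE_API_BASE_URL` with the literal value that was
// present at build time, which is why the Dockerfile passes it as a build ARG.
export const API_BASE_URL: string =
  import.meta.env.VITE_API_BASE_URL ?? "http://localhost:3000/v6/mcp/agent";

// Example usage: derive the chat endpoint from the configured base URL.
export const CHAT_ENDPOINT = `${API_BASE_URL}/chat`;
```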

README.md

Lines changed: 189 additions & 55 deletions
@@ -1,56 +1,190 @@
-# Topcoder Model Context Protocol (MCP) Server
-
-## Authentication Based Access via Guards
-
-Tools/Resources/Prompts support authentication via TC JWT and/or M2M JWT. Providing JWT in the requests to the MCP server will result in specific listings and bahavior based on JWT access level/roles/permissions.
-
-#### Using `authGuard` - requires TC jwt presence for access
-
-```ts
-@Tool({
-  name: 'query-tc-challenges-private',
-  description:
-    'Returns a list of Topcoder challenges based on the query parameters.',
-  parameters: QUERY_CHALLENGES_TOOL_PARAMETERS,
-  outputSchema: QUERY_CHALLENGES_TOOL_OUTPUT_SCHEMA,
-  annotations: {
-    title: 'Query Public Topcoder Challenges',
-    readOnlyHint: true,
-  },
-  canActivate: authGuard,
-})
-```
-
-#### Using `checkHasUserRole(Role.Admin)` - TC Role based guard
-
-```ts
-@Tool({
-  name: 'query-tc-challenges-protected',
-  description:
-    'Returns a list of Topcoder challenges based on the query parameters.',
-  parameters: QUERY_CHALLENGES_TOOL_PARAMETERS,
-  outputSchema: QUERY_CHALLENGES_TOOL_OUTPUT_SCHEMA,
-  annotations: {
-    title: 'Query Public Topcoder Challenges',
-    readOnlyHint: true,
-  },
-  canActivate: checkHasUserRole(Role.Admin),
-})
-```
-
-#### Using `canActivate: checkM2MScope(M2mScope.QueryPublicChallenges)` - M2M based access via scopes
-
-```ts
-@Tool({
-  name: 'query-tc-challenges-m2m',
-  description:
-    'Returns a list of Topcoder challenges based on the query parameters.',
-  parameters: QUERY_CHALLENGES_TOOL_PARAMETERS,
-  outputSchema: QUERY_CHALLENGES_TOOL_OUTPUT_SCHEMA,
-  annotations: {
-    title: 'Query Public Topcoder Challenges',
-    readOnlyHint: true,
-  },
-  canActivate: checkM2MScope(M2mScope.QueryPublicChallenges),
-})
# Microsoft Teams AI Agent

A production-grade Microsoft Teams Tab app featuring a conversational AI agent powered by **LangChain.js** and **AWS Bedrock**.
The agent integrates with MCP tools to answer real-time queries beyond its base model knowledge.
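
As a rough illustration of this stack (not the repository's actual agent code, which additionally wires in MCP tools, prompts, and conversation memory), a LangChain.js chat model backed by Bedrock can be created and streamed like the sketch below; the model ID and region mirror `.env.example`:

```ts
// Illustrative sketch only; see the backend source for the real agent wiring.
import { ChatBedrockConverse } from "@langchain/aws";

const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0", // AWS_BEDROCK_MODEL_ID in .env.example
  region: process.env.AWS_BEDROCK_REGION ?? "us-east-1",
});

async function demo(): Promise<void> {
  // Stream a single completion; the production agent relays chunks over SSE instead.
  const stream = await llm.stream("Summarize what the Topcoder MCP tools can do.");
  for await (const chunk of stream) {
    console.log(chunk.content);
  }
}

demo().catch(console.error);
```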

---

## ✨ Key Features

* **Conversational AI:** Powered by LangChain.js + AWS Bedrock (Claude 3.5 Sonnet).
* **External Tools:** Fetches live contextual data through external integrations.
* **Real-time Chat Streaming:** Uses SSE for continuous agent thought updates.
* **Conversation History:** Stored and grouped in MongoDB for persistence.
* **Azure AD SSO:** Secure Teams authentication for verified access.
* **Fluent UI + Teams SDK:** Seamless user experience inside Teams.
* **Flexible Deployment:** Manual dev setup + production-ready Docker build.

---

## 🧩 Technology Stack

| Frontend (Vite) | Backend (Node.js + Express) |
| --------------------------- | ----------------------------------------- |
| ✅ React + TypeScript (Vite) | ✅ LangChain.js + AWS Bedrock (Claude 3.5) |
| ✅ Fluent UI + Teams SDK | ✅ MongoDB (Mongoose ODM) |
| ✅ Vite environment support | ✅ MCP Gateway for tool access |

| AI & Security | Infrastructure |
| --------------------------------- | --------------------------------- |
| ✅ AWS Bedrock (Claude 3.5 Sonnet) | ✅ Docker-based production build |
| ✅ Azure Active Directory (SSO) | ✅ Environment-based configuration |
| ✅ JWT Validation + JWKS-RSA | ✅ ngrok for local Teams testing |

---

## 🧠 Local Development Setup (Manual)

You can run the frontend and backend separately for local testing.
This is the preferred approach during active development.

### Prerequisites

* **Node.js:** v22.x
* **MongoDB:** Local instance or MongoDB Atlas
* **ngrok:** To expose your servers for Teams testing
* **[Azure AD App Registration: For SSO](./docs/AzureConfig.md)**
* **AWS Bedrock access**

### Step 1: Clone Repository

```bash
git clone https://github.com/topcoder-platform/tc-mcp.git
cd tc-mcp
```

---

### Step 2: Backend Setup

```bash
pnpm install
cp .env.example .env
# Edit .env with MongoDB, Azure, and AWS credentials, plus the frontend (VITE_*) variables
pnpm start:dev
```

The backend will run at `http://localhost:3000/v6/mcp/*`.

---

### Step 3: Frontend Setup

```bash
cd teamsTab
pnpm install
pnpm run dev
```

The frontend will run at `http://localhost:5173/teamsTab`.

---

### Step 4: Expose Local Servers for Teams

To test inside Teams, both servers must be publicly reachable. It is best to reserve a static ngrok URL for the frontend, so Azure AD and the MS Teams app only need to be configured with it once; a static URL for the backend is preferable as well.

```bash
# For frontend
ngrok http --url=your-ngrok-static-url-frontend.app 5173
# For backend
ngrok http 3000
```

You’ll get two public URLs:

* Frontend → `https://your-ngrok-static-url-frontend.app`
* Backend → `https://<backend-id>.ngrok-free.app`

> **Note:**
> * The ngrok frontend URL should be added to the allowed hosts in `teamsTab/vite.config.ts` for the development environment (see the sketch below).
> * The ngrok backend URL should be set as `VITE_API_BASE_URL` in `.env` for development.
>   Example: `VITE_API_BASE_URL=https://<backend-id>.ngrok-free.app/v6/mcp/agent`
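
For reference, the allowed-hosts change mentioned above would typically look like the sketch below. It is an assumed shape only: merge the `server.allowedHosts` entry into the repository's existing `teamsTab/vite.config.ts` (plugins, base path, and other options are kept as they are; the hostname shown is the placeholder from the ngrok command above).

```ts
// teamsTab/vite.config.ts (illustrative sketch; keep the repo's existing options)
import { defineConfig } from "vite";

export default defineConfig({
  base: "/teamsTab/", // the app is served under /teamsTab, as the URLs above show
  server: {
    port: 5173,
    // Allow the ngrok hostname so the Vite dev server accepts proxied requests.
    allowedHosts: ["your-ngrok-static-url-frontend.app"],
  },
});
```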

---

### Step 5: Configure Teams Manifest

Edit:

```
teamsTab/appPackageDev/manifest.json
```

### Read: [MsTeamsConfig.md](./docs/MsTeamsConfig.md)

---

### Step 6: Sideload App in Teams

1. Zip the following from `teamsTab/appPackageDev/`:

   * `manifest.json`
   * `color.png`
   * `outline.png`
2. Go to **Microsoft Teams → Apps → Upload a custom app** and upload the zip.

You can now test the full AI agent directly in the Teams client.

---

# 🐳 Production / Deployment (Dockerized)

When you’re ready to deploy (e.g., on **Railway**, **Render**, or **AWS ECS**), use the provided `Dockerfile`.

### Dockerfile Overview

* Installs dependencies
* Builds the frontend (`teamsTab/`)
* Builds the backend
* Starts the server using `appStartUp.sh`
* Frontend & Backend Agent are available at a single URL (see the sketch after this list):
  - `http://localhost:3000/teamsTab` - Frontend for the MS Teams app
  - `http://localhost:3000/v6/mcp/agent` - Backend Agent for the frontend
  - `http://localhost:3000/v6/mcp/*` - Other backend endpoints - `/mcp`, `/sse`, etc.
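
The Summary table further down notes that NestJS serves the built frontend in production. One common way to achieve the single-URL layout listed above is `@nestjs/serve-static`; the sketch below is illustrative of that pattern and is not necessarily how this repository wires it up (module file name and paths are assumptions):

```ts
// app.module.ts (illustrative excerpt): serve the built Vite app under /teamsTab.
import { Module } from "@nestjs/common";
import { ServeStaticModule } from "@nestjs/serve-static";
import { join } from "path";

@Module({
  imports: [
    ServeStaticModule.forRoot({
      rootPath: join(__dirname, "..", "teamsTab", "dist"), // output of `pnpm run build:frontend`
      serveRoot: "/teamsTab", // matches http://localhost:3000/teamsTab
    }),
    // ...the MCP / agent modules stay mounted under /v6/mcp/*
  ],
})
export class AppModule {}
```

With this layout a single container exposes both the static tab app and the agent API, which is why only port 3000 is published in the run command below.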

### Build and Run

```bash
docker build -t teams-ai-agent .
docker run -d -p 3000:3000 teams-ai-agent
```

> **Note:**
> * Configure environment variables **directly in your hosting platform’s dashboard**, such as **Railway**, **AWS ECS / Lightsail**, or **Render**; no `.env` file is needed.
> * Most CI/CD platforms automatically supply environment variables for the required build arguments when running the Docker build.
>   For example, the build command would look like:
>   `docker build --build-arg VITE_API_BASE_URL="https://api.topcoder.com/v6/mcp/agent" -t teams-ai-agent .`
>
> **💡 Note:**
> * A local Docker build will use the root `.env` (it is not listed in `.dockerignore`), so there is no need to pass `VITE_API_BASE_URL` as a build arg when running `docker build -t teams-ai-agent .`

---

### Optional: ngrok for Local Preview in Docker

You can still run:

```bash
ngrok http --url=your-ngrok-static-url-frontend.app 3000
```

and use that public URL in your Teams manifest for quick cloud-like testing.

---

### Read: [AzureConfig.md](./docs/AzureConfig.md), [MsTeamsConfig.md](./docs/MsTeamsConfig.md)

### Summary

| Mode | How to Run | Notes |
| -------------- | ---------------------------------------------- | --------------------------------------------- |
| **Local Dev** | Run frontend + backend separately | Fast iteration, live reload |
| **Production** | `docker build && docker run` | Uses built Vite files; NestJS serves the frontend |
| **Teams Test** | Use ngrok URLs | Needed for Teams to access your local servers |

docs/Agent.md

Lines changed: 84 additions & 0 deletions
@@ -0,0 +1,84 @@
## Teams AI Agent: System Architecture

This document describes the high-level components of the application and their relationships. It shows how our services, hosted on Azure, interact with each other and with external services to deliver the full functionality to a user within Microsoft Teams.

### Architectural Summary

1. **Client-Side:** The user interacts with the React application, which is served as a Tab inside the Microsoft Teams client. The frontend's primary responsibilities are rendering the UI, managing client-side state, and initiating authenticated API calls.
2. **Authentication:** Azure Active Directory is the identity provider. The frontend uses the Teams JS SDK to get an SSO token, which is sent with every API request. The backend validates this token on every call to ensure the request is secure and authorized (see the token-validation sketch after this list).
3. **Backend:** The Node.js application, hosted on Azure App Service, is the core of the system.
   * It securely loads all its secrets (API keys, connection strings) from **Azure Key Vault** using a passwordless **Managed Identity**.
   * It exposes a single primary API endpoint (`/v6/mcp/agent/chat`) that uses Server-Sent Events (SSE) for real-time communication.
   * It instantiates a **LangChain Agent** to handle the conversational logic.
4. **AI & Tools:** The LangChain Agent orchestrates calls to external services. It sends the user's prompt and conversation history to **AWS Bedrock** for processing and calls the **Topcoder MCP Gateway** when the AI model decides a tool is needed to answer a question.
5. **Data Persistence:** All conversation history is stored in MongoDB, providing a scalable and durable memory for the agent.
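
The token check described in point 2 is typically implemented with `jsonwebtoken` plus `jwks-rsa`, fetching and caching Azure AD's public signing keys. The sketch below shows the general shape under the settings from `.env.example`; it is illustrative of the technique, not the repository's exact middleware, and the file and property names are assumptions.

```ts
// validateToken.ts (illustrative sketch of JWKS-based Azure AD token validation)
import type { NextFunction, Request, Response } from "express";
import { verify, type JwtHeader, type SigningKeyCallback } from "jsonwebtoken";
import jwksClient from "jwks-rsa";

const tenantId = process.env.AZURE_AD_TENANT_ID ?? "";

// The JWKS client downloads and caches Azure AD's public signing keys.
const client = jwksClient({
  jwksUri: `https://login.microsoftonline.com/${tenantId}/discovery/v2.0/keys`,
  cache: true,
});

// Resolve the signing key referenced by the token's `kid` header.
function getKey(header: JwtHeader, callback: SigningKeyCallback): void {
  client.getSigningKey(header.kid, (err, key) => callback(err, key?.getPublicKey()));
}

export function validateToken(req: Request, res: Response, next: NextFunction): void {
  const token = (req.headers.authorization ?? "").replace(/^Bearer\s+/i, "");
  verify(token, getKey, { audience: process.env.AZURE_AD_AUDIENCE }, (err, decoded) => {
    if (err) {
      res.status(401).json({ error: "Invalid or expired token" });
      return;
    }
    // Attach the decoded claims for downstream handlers (property name is illustrative).
    (req as Request & { user?: unknown }).user = decoded;
    next();
  });
}
```

Caching the signing keys is what allows the "using cached public keys" step in the sequence diagram below to avoid a network round trip on every request.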

### Sequence Diagram: A Single Chat Message with a Tool Call

This diagram illustrates the step-by-step flow of data and method calls for a "happy path" scenario where a user sends a message, the agent decides to use a tool, and then responds with a summary.

```mermaid
sequenceDiagram
    participant User
    participant Frontend as React Frontend
    participant Backend as Node.js Backend
    %% AzureAD is declared explicitly so it appears next to Backend in the rendered diagram
    participant AzureAD as Azure AD
    participant LangChain as LangChain Agent
    participant CosmosDB as MongoDB
    participant Bedrock as AWS Bedrock
    participant MCP as Topcoder MCP

    User->>Frontend: Types "Show me an active challenge" and clicks Send
    Frontend->>Backend: POST /v6/mcp/agent/chat (with SSO Token)

    rect rgb(230, 240, 255)
        note over Backend: Middleware: `validateToken` runs
        Backend->>AzureAD: Verify Token Signature (using cached public keys)
        AzureAD-->>Backend: OK
    end

    Backend->>LangChain: Create Agent Instance
    LangChain->>CosmosDB: getMessageHistory(sessionId)
    CosmosDB-->>LangChain: Return previous messages

    LangChain->>Bedrock: streamEvents(prompt, history, tools)
    Bedrock-->>LangChain: Stream Chunks (Decides to use a tool)

    loop Streaming Response to Client
        LangChain-->>Backend: Yields 'thinking' chunks
        Backend-->>Frontend: SSE: event: message, data: {type: "chunk", ...}
    end

    LangChain-->>Backend: Yields 'tool_start' event for `query-tc-challenges`
    Backend-->>Frontend: SSE: event: message, data: {type: "tool_start", ...}

    LangChain->>MCP: callTool('query-tc-challenges', {status: 'Active'})
    MCP-->>LangChain: Return JSON result of challenges

    LangChain-->>Backend: Yields 'tool_result' event with data
    Backend-->>Frontend: SSE: event: message, data: {type: "tool_result", ...}

    LangChain->>Bedrock: streamEvents(prompt, history, tool_result)
    Bedrock-->>LangChain: Stream Final Summary Chunks

    loop Streaming Final Response
        LangChain-->>Backend: Yields final 'text' chunks
        Backend-->>Frontend: SSE: event: message, data: {type: "chunk", ...}
    end

    rect rgb(255, 245, 230)
        note over Backend, CosmosDB: Finalization
        LangChain->>CosmosDB: addMessages(user_prompt, final_ai_response)
        CosmosDB-->>LangChain: OK
        Backend-->>Frontend: SSE: event: end
    end
```
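
The SSE lines in the diagram correspond to plain `text/event-stream` writes. The following framework-agnostic sketch shows how the backend could relay agent events to the client; the event name and payload shapes follow the diagram, while the function and type names are illustrative rather than the repository's actual handler.

```ts
// Illustrative sketch: relaying agent events over Server-Sent Events.
import type { Response } from "express";

type AgentEvent =
  | { type: "chunk"; content: string }
  | { type: "tool_start"; tool: string }
  | { type: "tool_result"; tool: string; result: unknown };

export async function relayAgentStream(
  events: AsyncIterable<AgentEvent>, // produced by the LangChain agent's streaming loop
  res: Response,
): Promise<void> {
  res.setHeader("Content-Type", "text/event-stream");
  res.setHeader("Cache-Control", "no-cache");
  res.setHeader("Connection", "keep-alive");

  for await (const event of events) {
    // Matches the diagram: event: message, data: {type: "chunk" | "tool_start" | "tool_result", ...}
    res.write(`event: message\ndata: ${JSON.stringify(event)}\n\n`);
  }

  // Matches the diagram's closing "SSE: event: end".
  res.write("event: end\ndata: {}\n\n");
  res.end();
}
```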

### Sequence Summary

1. **Request & Auth:** The user sends a prompt. The frontend sends it to the backend API along with the SSO token, which is validated.
2. **Memory Retrieval:** The LangChain agent is created and immediately fetches the conversation history from Cosmos DB to provide context for the LLM.
3. **First LLM Call:** The agent sends the full context to AWS Bedrock. Bedrock analyzes the request and decides that it needs to use the `query-tc-challenges` tool. It streams back its initial thoughts and this tool-use instruction.
4. **Tool Execution:** The backend streams the "thinking" and "tool_start" status to the frontend. It then makes a direct API call to the Topcoder MCP Gateway.
5. **Second LLM Call:** Once the tool result is received, the agent sends this new information back to AWS Bedrock, asking it to synthesize a final, human-readable answer.
6. **Final Response:** Bedrock streams the final summary. The backend relays these text chunks to the frontend, which displays them to the user.
7. **Finalization:** Once the stream is complete, the agent's memory manager saves the new user message and the final AI response back to MongoDB for future conversations (see the sketch below). The SSE connection is then closed.
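
Steps 2 and 7 rely on a per-session message store. With the Mongoose ODM listed in the technology stack, the `getMessageHistory` / `addMessages` calls from the diagram could be backed by a model along the following lines; the schema, model, and helper names are illustrative assumptions, not the repository's actual code.

```ts
// Illustrative Mongoose sketch for the per-session conversation memory.
import { Schema, model } from "mongoose";

interface ChatMessage {
  sessionId: string;
  role: "human" | "ai";
  content: string;
  createdAt: Date;
}

const chatMessageSchema = new Schema<ChatMessage>({
  sessionId: { type: String, required: true, index: true },
  role: { type: String, enum: ["human", "ai"], required: true },
  content: { type: String, required: true },
  createdAt: { type: Date, default: Date.now },
});

export const ChatMessageModel = model<ChatMessage>("ChatMessage", chatMessageSchema);

// getMessageHistory(sessionId): messages for one session, oldest first.
export async function getMessageHistory(sessionId: string): Promise<ChatMessage[]> {
  return ChatMessageModel.find({ sessionId }).sort({ createdAt: 1 }).lean();
}

// addMessages(...): persist the user prompt and the final AI response after the stream ends.
export async function addMessages(sessionId: string, userPrompt: string, aiResponse: string): Promise<void> {
  await ChatMessageModel.insertMany([
    { sessionId, role: "human", content: userPrompt },
    { sessionId, role: "ai", content: aiResponse },
  ]);
}
```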
