The modern, high-performance VK Teams Bot API toolkit for Rust
Complete ecosystem: Client library + CLI tools + MCP server + Storage infrastructure
- Fast: Rust performance with zero-cost abstractions
- Complete Toolkit: Library, CLI, MCP server, and storage in one ecosystem
- AI-Ready: Native Model Context Protocol (MCP) support for LLM integration
- Smart Storage: PostgreSQL + vector search for semantic message analysis
- Developer-First: Intuitive CLI with auto-completion and colored output
- Enterprise-Grade: Memory-safe, concurrent, production-ready
- Modular: Use only what you need; each component works independently
 
```bash
cargo install vkteams-bot-cli

# Set your credentials (or use a config file)
export VKTEAMS_BOT_API_TOKEN="your_token_here"
export VKTEAMS_BOT_API_URL="your_api_url"

# Send your first message
vkteams-bot-cli send-text -u user123 -m "Hello from Rust!"
```

```toml
[dependencies]
vkteams-bot = "0.11"
tokio = { version = "1.0", features = ["full"] }
```

```rust
use vkteams_bot::Bot;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let bot = Bot::with_default_version("API_TOKEN", "API_URL");

    // Send a message
    bot.send_text("chat_id", "Hello, World!").await?;

    // Listen for events
    let events = bot.get_events().await?;
    println!("Received {} events", events.len());

    Ok(())
}
```

| Component | Description | Version |
|---|---|---|
| Core Library | High-performance async VK Teams Bot API client | vkteams-bot v0.11 |
| CLI Tool | Feature-complete command-line interface with storage | vkteams-bot-cli v0.7 |
| MCP Server | AI/LLM integration via Model Context Protocol | vkteams-bot-mcp v0.4 |
| Macros | Development productivity macros | vkteams-bot-macros |
- PostgreSQL Integration: Full event and message history storage
- Vector Search: Semantic search using the pgvector extension
- AI Embeddings: OpenAI and Ollama support for text embeddings
- Smart Search: Full-text and semantic similarity search
 
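At its core, the semantic search described above ranks stored messages by the similarity of their embedding vectors to the query's embedding. A minimal, self-contained sketch of that ranking step (pure Rust, with toy vectors standing in for real OpenAI/Ollama embeddings and no pgvector involved):

```rust
/// Cosine similarity between two equal-length embedding vectors.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        0.0
    } else {
        dot / (norm_a * norm_b)
    }
}

fn main() {
    // Toy embeddings: in practice these come from the configured
    // embedding provider (OpenAI or Ollama).
    let query = vec![0.9_f32, 0.1, 0.0];
    let messages = vec![
        ("deployment failed on staging", vec![0.8_f32, 0.2, 0.1]),
        ("lunch menu for friday", vec![0.1_f32, 0.9, 0.3]),
    ];

    // Rank messages by similarity to the query, highest first.
    let mut ranked: Vec<_> = messages
        .iter()
        .map(|(text, emb)| (*text, cosine_similarity(&query, emb)))
        .collect();
    ranked.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    println!("best match: {}", ranked[0].0);
}
```

In production the same comparison is pushed down into PostgreSQL via pgvector's distance operators, so ranking happens in the database rather than in application code.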
- 30+ AI Tools: Messages, files, chats, and storage operations
- Interactive Setup: Automatic chat ID elicitation; no manual configuration needed
- Context Management: Automatic conversation context retrieval
- CLI-as-Backend: Unified architecture for consistency
 
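Under the hood, an MCP client (such as Claude Desktop) talks to the server over JSON-RPC 2.0, and tool discovery uses the protocol's `tools/list` method. A hedged sketch of the request shape only, built with plain string formatting rather than a real JSON library or transport:

```rust
/// Build a JSON-RPC 2.0 request body of the kind MCP clients send.
/// Illustrative only: real clients use a JSON library and a
/// stdio or HTTP transport to reach the server process.
fn jsonrpc_request(id: u64, method: &str) -> String {
    format!(r#"{{"jsonrpc":"2.0","id":{},"method":"{}"}}"#, id, method)
}

fn main() {
    // Typically the client's first call: discover the available tools.
    let list_tools = jsonrpc_request(1, "tools/list");
    println!("{}", list_tools);
}
```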
```bash
# Start all services
docker-compose up -d

# Start only essential services
docker-compose --profile relational-database up -d

# Add vector search
docker-compose --profile vector-search up -d
```

- Enterprise Chat Automation: HR bots, IT support, business process automation
- AI-Powered Assistants: LLM integration with Claude and ChatGPT via MCP
- DevOps Integration: CI/CD notifications, monitoring alerts, deployment status
- Business Intelligence: Data reporting, analytics dashboards, scheduled reports
- Knowledge Management: Semantic search across chat history
- Internal Tools: Custom workflows, approval processes, team coordination
 
```bash
# Interactive event monitoring with filtering
vkteams-bot-cli get-events -l true | grep "ALARM"

# Batch file operations
find ./reports -name "*.pdf" | xargs -I {} vkteams-bot-cli send-file -u team_lead -p {}

# Semantic search in message history
vkteams-bot-cli storage search-semantic "deployment issues last week"

# Get conversation context for AI
vkteams-bot-cli storage get-context -c chat123 --limit 50

# Storage statistics
vkteams-bot-cli storage stats
```

Integrate VK Teams bots directly with AI assistants:
Claude Desktop configuration:

```json
{
  "mcpServers": {
    "vkteams-bot": {
      "command": "vkteams-bot-mcp",
      "env": {
        "VKTEAMS_BOT_API_TOKEN": "your_token",
        "VKTEAMS_BOT_API_URL": "your_api_url",
        "DATABASE_URL": "postgresql://localhost/vkteams"
      }
    }
  }
}
```

Note: a chat ID is no longer required in the configuration. The MCP server automatically prompts for it when first used.
Now Claude can:
- Auto-configure: chat ID is requested interactively when first needed
- Send messages and manage files
- Search chat history semantically
- Get conversation context
- Execute complex workflows
 
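Conversation-context retrieval (the `storage get-context --limit 50` behavior shown earlier) comes down to returning the most recent N messages in chronological order. A minimal sketch over a hypothetical in-memory history, standing in for the real PostgreSQL-backed query:

```rust
/// Return the last `limit` messages in chronological order,
/// mirroring what `storage get-context --limit N` hands to the LLM.
fn get_context<'a>(history: &'a [&'a str], limit: usize) -> &'a [&'a str] {
    // saturating_sub avoids underflow when limit exceeds history length.
    let start = history.len().saturating_sub(limit);
    &history[start..]
}

fn main() {
    let history = ["msg1", "msg2", "msg3", "msg4", "msg5"];
    let context = get_context(&history, 3);
    println!("{:?}", context); // the three most recent messages
    assert_eq!(context.to_vec(), vec!["msg3", "msg4", "msg5"]);
}
```

Keeping the window bounded is what lets the MCP server feed a long-running chat to an LLM without exceeding its context budget.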
Create `.config/shared-config.toml`:

```toml
[api]
token = "your_bot_token"
url = "your_api_url"

[storage]

[storage.database]
url = "postgresql://localhost/vkteams"
auto_migrate = true

[storage.embedding]
provider = "ollama"  # or "openai"
model = "nomic-embed-text"
endpoint = "http://localhost:11434"

[mcp]
enable_storage_tools = true
enable_file_tools = true
```

```bash
# Clone and build
git clone https://github.com/bug-ops/vkteams-bot
cd vkteams-bot
cargo build --release

# Run tests with coverage
cargo llvm-cov nextest

# Check documentation
cargo doc --open

# Run with Docker
docker-compose up -d
```

- API Docs: docs.rs/vkteams-bot
- VK Teams Bot API: teams.vk.com/botapi
- Examples: GitHub Examples
- MCP Protocol: Model Context Protocol