
VKTeams-Bot (unofficial)


The modern, high-performance VK Teams Bot API toolkit for Rust 🦀
Complete ecosystem: Client library + CLI tools + MCP server + Storage infrastructure

✨ Why?

  • 🚀 Fast: Rust performance with zero-cost abstractions
  • 🛠️ Complete Toolkit: Library, CLI, MCP server, and storage in one ecosystem
  • 🤖 AI-Ready: Native Model Context Protocol (MCP) support for LLM integration
  • 💾 Smart Storage: PostgreSQL + vector search for semantic message analysis
  • ⚡ Developer-First: Intuitive CLI with auto-completion and colored output
  • 🏢 Enterprise-Grade: Memory-safe, concurrent, production-ready
  • 📦 Modular: Use only what you need - each component works independently

🚀 Quick Start

Install the CLI (fastest way to get started)

cargo install vkteams-bot-cli

# Set your credentials (or use config file)
export VKTEAMS_BOT_API_TOKEN="your_token_here"
export VKTEAMS_BOT_API_URL="your_api_url"

# Send your first message
vkteams-bot-cli send-text -u user123 -m "Hello from Rust! 🦀"

Use as a Library

Add the crates to Cargo.toml:

[dependencies]
vkteams-bot = "0.11"
tokio = { version = "1.0", features = ["full"] }

Then in src/main.rs:

use vkteams_bot::Bot;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let bot = Bot::with_default_version("API_TOKEN", "API_URL");

    // Send a message
    bot.send_text("chat_id", "Hello, World! 🌍").await?;

    // Listen for events
    let events = bot.get_events().await?;
    println!("Received {} events", events.len());

    Ok(())
}
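
If you want the bot to keep reacting to chat activity, the same get_events call can be wrapped in a simple long-poll loop. The sketch below is a minimal, illustrative example built only from the constructor and get_events call shown above; parsing individual events is omitted because the exact event type depends on the crate version.

use vkteams_bot::Bot;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let bot = Bot::with_default_version("API_TOKEN", "API_URL");

    // Long-poll loop: repeatedly fetch new events and report how many arrived.
    // Inspect or match on the returned events here to react to messages.
    loop {
        let events = bot.get_events().await?;
        println!("Received {} events", events.len());
    }
}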

🔧 Complete Ecosystem

Component | Description | Version
📚 Core Library | High-performance async VK Teams Bot API client | vkteams-bot v0.11
🖥️ CLI Tool | Feature-complete command-line interface with storage | vkteams-bot-cli v0.7
🤖 MCP Server | AI/LLM integration via Model Context Protocol | vkteams-bot-mcp v0.4
⚙️ Macros | Development productivity macros | vkteams-bot-macros

🆕 Storage & AI Features

💾 Storage Infrastructure

  • PostgreSQL Integration: Full event and message history storage
  • Vector Search: Semantic search using pgvector extension
  • AI Embeddings: OpenAI and Ollama support for text embeddings
  • Smart Search: Full-text and semantic similarity search (see the sketch after this list)
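
Because each component works independently, the storage features can also be scripted from your own tools. The sketch below is only an illustration of that workflow, not part of the library API: it shells out to the vkteams-bot-cli storage search-semantic command shown in the CLI highlights further down and prints whatever the CLI writes to stdout.

use std::process::Command;

fn main() -> std::io::Result<()> {
    // Run the CLI's semantic search over stored message history.
    // The subcommand and example query mirror the CLI highlights section.
    let output = Command::new("vkteams-bot-cli")
        .args(["storage", "search-semantic", "deployment issues last week"])
        .output()?;

    println!("{}", String::from_utf8_lossy(&output.stdout));
    Ok(())
}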

🤖 MCP Integration

  • 30+ AI Tools: Messages, files, chats, storage operations
  • Interactive Setup: Automatic chat ID elicitation - no manual configuration needed
  • Context Management: Automatic conversation context retrieval
  • CLI-as-Backend: Unified architecture for consistency

🐳 Docker Support

# Start all services
docker-compose up -d

# Start only essential services
docker-compose --profile relational-database up -d

# Add vector search
docker-compose --profile vector-search up -d

🎯 Use Cases

  • 🏢 Enterprise Chat Automation: HR bots, IT support, business process automation
  • 🤖 AI-Powered Assistants: LLM integration with Claude, ChatGPT via MCP
  • ⚡ DevOps Integration: CI/CD notifications, monitoring alerts, deployment status
  • 📊 Business Intelligence: Data reporting, analytics dashboards, scheduled reports
  • 🔍 Knowledge Management: Semantic search across chat history
  • 🔧 Internal Tools: Custom workflows, approval processes, team coordination

🚀 CLI Highlights

# Interactive event monitoring with filtering
vkteams-bot-cli get-events -l true | grep "ALARM"

# Batch file operations
find ./reports -name "*.pdf" | xargs -I {} vkteams-bot-cli send-file -u team_lead -p {}

# Semantic search in message history
vkteams-bot-cli storage search-semantic "deployment issues last week"

# Get conversation context for AI
vkteams-bot-cli storage get-context -c chat123 --limit 50

# Storage statistics
vkteams-bot-cli storage stats

🤖 AI Integration (MCP)

Integrate VK Teams bots directly with AI assistants. Add the MCP server to your Claude Desktop configuration (claude_desktop_config.json):

{
  "mcpServers": {
    "vkteams-bot": {
      "command": "vkteams-bot-mcp",
      "env": {
        "VKTEAMS_BOT_API_TOKEN": "your_token",
        "VKTEAMS_BOT_API_URL": "your_api_url",
        "DATABASE_URL": "postgresql://localhost/vkteams"
      }
    }
  }
}

Note: Chat ID is no longer required in configuration! The MCP server automatically prompts for it when first used.

Now Claude can:

  • Auto-configure: Chat ID is requested interactively when first needed
  • Send messages and manage files
  • Search chat history semantically
  • Get conversation context
  • Execute complex workflows

βš™οΈ Configuration

Create .config/shared-config.toml:

[api]
token = "your_bot_token"
url = "https://api.vk.com"

[storage]
[storage.database]
url = "postgresql://localhost/vkteams"
auto_migrate = true

[storage.embedding]
provider = "ollama"  # or "openai"
model = "nomic-embed-text"
endpoint = "http://localhost:11434"

[mcp]
enable_storage_tools = true
enable_file_tools = true
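
If you prefer environment variables to the config file (as in the Quick Start above), the library can be wired up from them at startup. This is a minimal sketch that reads the same VKTEAMS_BOT_API_TOKEN and VKTEAMS_BOT_API_URL variables and reuses the constructor from the library example; "chat_id" remains a placeholder.

use std::env;
use vkteams_bot::Bot;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Pick up the same variables used in the Quick Start section.
    let token = env::var("VKTEAMS_BOT_API_TOKEN")?;
    let url = env::var("VKTEAMS_BOT_API_URL")?;

    // Same constructor and send_text call as in the library example above.
    let bot = Bot::with_default_version(token.as_str(), url.as_str());
    bot.send_text("chat_id", "Configured from the environment").await?;

    Ok(())
}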

πŸ› οΈ Development

# Clone and build
git clone https://github.com/bug-ops/vkteams-bot
cd vkteams-bot
cargo build --release

# Run tests with coverage
cargo llvm-cov nextest

# Check documentation
cargo doc --open

# Run with Docker
docker-compose up -d

📖 Documentation

Full API reference for the vkteams-bot crate is available on docs.rs.
