Technical deep-dive into the design and implementation of kibob. It covers:
- Design Philosophy
- Architecture Overview
- Core Modules
- Data Flow
- Extension Points
- Testing Strategy
- Performance Considerations
Core design principles:
- Modularity - Loosely coupled components with clear interfaces
- Extensibility - Easy to add new object types, storage backends, transformations
- Type Safety - Leverage Rust's type system to prevent bugs at compile time
- Async First - Non-blocking I/O for efficient network and file operations
- Explicit Over Implicit - Clear data flow, no hidden magic
- Testability - Every component can be tested in isolation
The Extract-Transform-Load pattern provides:
- Separation of concerns - Network, business logic, and storage are independent
- Pipeline composition - Chain operations declaratively
- Reusability - Extractors and loaders work with any transformers
- Observability - Each stage can be instrumented independently
Rust was chosen for:
- Performance - Near C-level speed without garbage collection pauses
- Safety - No null pointer exceptions, data races, or memory leaks
- Concurrency - Fearless concurrency with Tokio async runtime
- Ecosystem - Excellent HTTP, JSON, and CLI libraries
- Single binary - No runtime dependencies, easy distribution
High-level component layout:

```
┌─────────────────────────────────────────────────────────────┐
│                          CLI Layer                          │
│                  (src/main.rs, src/cli.rs)                  │
└───────────────────────────────┬─────────────────────────────┘
                                │
                                ▼
┌─────────────────────────────────────────────────────────────┐
│                    Pipeline Orchestration                   │
│                    (src/etl/pipeline.rs)                    │
└───────────────────────────────┬─────────────────────────────┘
                                │
                ┌───────────────┼───────────────┐
                ▼               ▼               ▼
          ┌───────────┐  ┌────────────┐  ┌──────────┐
          │ Extractor │  │ Transformer│  │  Loader  │
          │  (Pull)   │  │ (Process)  │  │  (Push)  │
          └─────┬─────┘  └──────┬─────┘  └────┬─────┘
                │               │             │
                ▼               ▼             ▼
        ┌──────────────┐ ┌─────────────┐ ┌───────────────┐
        │ Kibana Client│ │  Transform  │ │    Storage    │
        │  (HTTP API)  │ │    Logic    │ │ (Files/NDJSON)│
        └──────────────┘ └─────────────┘ └───────────────┘
```
Module dependency graph:

```
main.rs
└─> cli.rs
    ├─> etl/pipeline.rs
    │   ├─> etl/extract.rs   (trait)
    │   ├─> etl/transform.rs (trait)
    │   └─> etl/load.rs      (trait)
    ├─> kibana/saved_objects/
    │   ├─> extractor.rs (implements Extractor)
    │   ├─> loader.rs    (implements Loader)
    │   └─> manifest.rs  (data structures)
    ├─> storage/
    │   ├─> directory.rs (implements Loader/Extractor)
    │   ├─> ndjson.rs    (implements Loader/Extractor)
    │   └─> gitignore.rs (utility)
    ├─> transform/
    │   ├─> field_dropper.rs (implements Transformer)
    │   ├─> field_escaper.rs (implements Transformer)
    │   └─> managed_flag.rs  (implements Transformer)
    └─> client/
        ├─> kibana.rs (HTTP client)
        └─> auth.rs   (authentication)
```
The ETL framework is the heart of kibob's architecture. It defines three core traits:
```rust
#[async_trait]
pub trait Extractor {
    async fn extract(&self) -> Result<Vec<Value>>;
}
```

Purpose: Fetch data from a source (Kibana API, files, etc.)

Implementations:
- `SavedObjectsExtractor` - Fetches from the Kibana API
- `DirectoryReader` - Reads from the filesystem
- `NdjsonReader` - Parses NDJSON files
```rust
#[async_trait]
pub trait Transformer {
    async fn transform(&self, data: Vec<Value>) -> Result<Vec<Value>>;
}
```

Purpose: Modify data between extraction and loading

Implementations:
- `FieldDropper` - Removes metadata fields (`managed`, `updated_at`, etc.)
- `FieldEscaper` - Escapes JSON strings for Kibana
- `FieldUnescaper` - Unescapes JSON strings for readability
- `ManagedFlagAdder` - Adds the `managed` flag
Chaining Example:

```rust
// Pull pipeline: Kibana → Clean → Unescape → Files
let pipeline = Pipeline::new()
    .with_extractor(SavedObjectsExtractor::new(client, manifest))
    .with_transformer(FieldDropper::new(vec!["managed", "updated_at"]))
    .with_transformer(FieldUnescaper::new(vec!["attributes.kibanaSavedObjectMeta"]))
    .with_loader(DirectoryWriter::new("objects/"));
```

```rust
#[async_trait]
pub trait Loader {
    async fn load(&self, data: Vec<Value>) -> Result<usize>;
}
```

Purpose: Write data to a destination (Kibana API, files, etc.)

Implementations:
- `SavedObjectsLoader` - Uploads to the Kibana API
- `DirectoryWriter` - Writes to the filesystem
- `NdjsonWriter` - Creates NDJSON files
```rust
pub struct Pipeline {
    extractor: Option<Box<dyn Extractor>>,
    transformers: Vec<Box<dyn Transformer>>,
    loader: Option<Box<dyn Loader>>,
}

impl Pipeline {
    pub async fn execute(&self) -> Result<usize> {
        // 1. Extract
        let extractor = self
            .extractor
            .as_ref()
            .ok_or_else(|| eyre!("pipeline has no extractor"))?;
        let mut data = extractor.extract().await?;

        // 2. Transform (chain)
        for transformer in &self.transformers {
            data = transformer.transform(data).await?;
        }

        // 3. Load
        let loader = self
            .loader
            .as_ref()
            .ok_or_else(|| eyre!("pipeline has no loader"))?;
        let count = loader.load(data).await?;
        Ok(count)
    }
}
```

Kibana-specific implementations of the ETL traits.
```rust
pub struct SavedObjectsManifest {
    pub version: String,
    pub objects: Vec<ObjectReference>,
}

pub struct ObjectReference {
    pub type_: String,
    pub id: String,
    pub attributes: Option<Value>,
}
```

Purpose: Tracks which objects to manage

Format: JSON file at `manifest/saved_objects.json`
Responsibilities:
- Read manifest to get object list
- Call Kibana export API with object IDs
- Parse NDJSON response
- Return as `Vec<Value>`
Key Code:
```rust
impl Extractor for SavedObjectsExtractor {
    async fn extract(&self) -> Result<Vec<Value>> {
        let response = self.client
            .export_objects(&self.manifest.objects)
            .await?;
        let objects = parse_ndjson(&response)?;
        Ok(objects)
    }
}
```

Responsibilities:
- Receive objects as `Vec<Value>`
- Convert to NDJSON format
- Call Kibana import API
- Handle import results
Key Features:
- Overwrites existing objects (idempotent)
- Supports managed flag
- Error handling for conflicts
File and directory operations.
Structure:
```
objects/
├── dashboard/
│   ├── abc-123.json
│   └── xyz-789.json
├── visualization/
│   └── def-456.json
└── index-pattern/
    └── logs-*.json
```
DirectoryReader:
- Scans directory tree
- Groups by object type
- Loads JSON files
- Returns `Vec<Value>`
DirectoryWriter:
- Receives `Vec<Value>`
- Organizes by type into subdirectories
- Pretty-prints JSON (2-space indent)
- Handles special characters in filenames
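As an illustration of the `objects/{type}/{id}.json` layout, here is a hypothetical std-only sketch of the path-building step. The `sanitize` rule shown (replacing only path separators) is an assumption; the exact special-character handling in kibob is not specified here:

```rust
use std::path::PathBuf;

/// Replace characters that would change the directory structure.
/// (Assumed rule: only path separators are rewritten; wildcards like
/// `*` in index-pattern IDs are kept, matching the tree above.)
fn sanitize(name: &str) -> String {
    name.chars()
        .map(|c| match c {
            '/' | '\\' => '_',
            c => c,
        })
        .collect()
}

/// Build the objects/{type}/{id}.json path a DirectoryWriter-style
/// component would use.
fn object_path(root: &str, type_: &str, id: &str) -> PathBuf {
    let mut path = PathBuf::from(root);
    path.push(sanitize(type_));
    path.push(format!("{}.json", sanitize(id)));
    path
}

fn main() {
    let p = object_path("objects", "index-pattern", "logs-*");
    println!("{}", p.display()); // objects/index-pattern/logs-*.json
}
```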
Format: Newline-delimited JSON
```
{"type":"dashboard","id":"abc","attributes":{...}}
{"type":"visualization","id":"xyz","attributes":{...}}
```

NdjsonReader:
- Reads the file line-by-line
- Parses each line as JSON
- Skips empty lines
- Returns `Vec<Value>`

NdjsonWriter:
- Receives `Vec<Value>`
- Serializes each object as single-line JSON
- Appends a newline
- Writes to file
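The reader/writer pair reduces to a simple line discipline. A minimal std-only sketch (records are treated as pre-serialized JSON strings here; the real implementation presumably goes through `serde_json::Value`):

```rust
/// NdjsonWriter core: one record per line, each followed by a newline.
fn write_ndjson(records: &[&str]) -> String {
    let mut out = String::new();
    for r in records {
        out.push_str(r);
        out.push('\n');
    }
    out
}

/// NdjsonReader core: split on newlines, skipping empty lines.
fn read_ndjson(input: &str) -> Vec<String> {
    input
        .lines()
        .filter(|l| !l.trim().is_empty())
        .map(|l| l.to_string())
        .collect()
}

fn main() {
    let records = [
        r#"{"type":"dashboard","id":"abc"}"#,
        r#"{"type":"visualization","id":"xyz"}"#,
    ];
    let file = write_ndjson(&records);
    let parsed = read_ndjson(&file);
    // Round trip preserves every record
    assert_eq!(parsed, records);
    println!("round-trip ok: {} records", parsed.len());
}
```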
```rust
pub struct GitIgnore {
    patterns: Vec<String>,
}

impl GitIgnore {
    pub fn should_ignore(&self, path: &Path) -> bool {
        // Pattern matching logic (elided)
        todo!()
    }

    pub fn ensure_patterns(&mut self, path: &Path) {
        // Add patterns to .gitignore if missing
    }
}
```

Patterns added:
- `.env*` - Never commit credentials
- `*.ndjson` - Temporary export files
- `manifest.json.bak` - Backup files
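The matching logic above is elided in the source; a hypothetical sketch that supports just the two pattern shapes listed (leading `*` as suffix match, trailing `*` as prefix match, otherwise exact) might look like this. kibob's real matcher may differ:

```rust
/// Minimal glob subset (assumption): "*.ext" matches by suffix,
/// "name*" matches by prefix, anything else matches exactly.
fn matches(pattern: &str, name: &str) -> bool {
    if let Some(suffix) = pattern.strip_prefix('*') {
        name.ends_with(suffix)
    } else if let Some(prefix) = pattern.strip_suffix('*') {
        name.starts_with(prefix)
    } else {
        pattern == name
    }
}

/// A file is ignored if any configured pattern matches it.
fn should_ignore(patterns: &[&str], name: &str) -> bool {
    patterns.iter().any(|p| matches(p, name))
}

fn main() {
    let patterns = [".env*", "*.ndjson", "manifest.json.bak"];
    assert!(should_ignore(&patterns, ".env.local"));
    assert!(should_ignore(&patterns, "export.ndjson"));
    assert!(!should_ignore(&patterns, "manifest.json"));
    println!("pattern checks passed");
}
```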
Data transformation implementations.
Purpose (FieldDropper): Remove unwanted metadata fields
Example:

```rust
let dropper = FieldDropper::new(vec![
    "managed",
    "updated_at",
    "version",
]);
```

Before:

```json
{"type": "dashboard", "id": "abc", "managed": true, "version": "8.0"}
```

After:

```json
{"type": "dashboard", "id": "abc"}
```

Purpose (FieldEscaper / FieldUnescaper): Handle Kibana's JSON string escaping

Why needed: Kibana stores JSON objects as escaped strings:

```json
{
  "attributes": {
    "kibanaSavedObjectMeta": "{\"searchSourceJSON\": \"{\\\"query\\\":{}}\"}"
  }
}
```

FieldUnescaper (Pull): Converts strings to objects for readability
FieldEscaper (Push): Converts objects back to strings for Kibana
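The escaping itself is ordinary JSON string escaping applied one level at a time. A string-level toy sketch (the real transformers presumably round-trip through `serde_json`; this version handles only quotes and backslashes):

```rust
/// Escape a raw string for embedding inside a JSON string value.
/// (Quotes and backslashes only; a full implementation would
/// serialize via serde_json.)
fn escape(s: &str) -> String {
    let mut out = String::new();
    for c in s.chars() {
        match c {
            '"' => out.push_str("\\\""),
            '\\' => out.push_str("\\\\"),
            c => out.push(c),
        }
    }
    out
}

/// Inverse of `escape`: drop one level of backslash escaping.
fn unescape(s: &str) -> String {
    let mut out = String::new();
    let mut chars = s.chars();
    while let Some(c) = chars.next() {
        if c == '\\' {
            // The escaped character follows the backslash literally
            if let Some(next) = chars.next() {
                out.push(next);
            }
        } else {
            out.push(c);
        }
    }
    out
}

fn main() {
    let inner = r#"{"query":{}}"#;
    let escaped = escape(inner);
    assert_eq!(escaped, r#"{\"query\":{}}"#);
    assert_eq!(unescape(&escaped), inner);
    println!("escaped: {}", escaped);
}
```

Each additional level of nesting (as in `searchSourceJSON` above) applies one more round of the same escaping.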
Purpose (ManagedFlagAdder): Add `managed: true/false` to objects

Usage:

```rust
let adder = ManagedFlagAdder::new(true); // managed: true
```

Effect: Controls whether objects are editable in the Kibana UI
HTTP client for Kibana API.
```rust
pub struct KibanaClient {
    base_url: String,
    client: reqwest::Client,
    auth: Auth,
    space: String,
}

impl KibanaClient {
    pub async fn export_objects(&self, refs: &[ObjectReference]) -> Result<String> {
        let url = format!("{}/api/saved_objects/_export", self.base_url);
        let body = create_export_body(refs);
        let response = self.client
            .post(&url)
            .header("kbn-xsrf", "true")
            .json(&body)
            .send()
            .await?;
        Ok(response.text().await?)
    }

    pub async fn import_objects(&self, ndjson: &str, overwrite: bool) -> Result<()> {
        // Kibana takes `overwrite` as a query parameter and expects the
        // NDJSON as a named file part in the multipart body.
        let url = format!(
            "{}/api/saved_objects/_import?overwrite={}",
            self.base_url, overwrite
        );
        let file = multipart::Part::text(ndjson.to_string())
            .file_name("import.ndjson");
        let form = multipart::Form::new().part("file", file);
        self.client
            .post(&url)
            .header("kbn-xsrf", "true")
            .multipart(form)
            .send()
            .await?;
        Ok(())
    }
}
```

```rust
pub enum Auth {
    None,
    Basic { username: String, password: String },
    ApiKey { key: String },
}

impl Auth {
    pub fn apply(&self, request: RequestBuilder) -> RequestBuilder {
        match self {
            Auth::None => request,
            Auth::Basic { username, password } => {
                request.basic_auth(username, Some(password))
            }
            Auth::ApiKey { key } => {
                request.header("Authorization", format!("ApiKey {}", key))
            }
        }
    }
}
```

Helper functions compose pipelines for the CLI commands.
```rust
pub async fn pull_saved_objects(output_dir: &str) -> Result<usize> {
    // Load manifest
    let manifest = load_saved_objects_manifest(output_dir)?;

    // Create client
    let client = load_kibana_client()?;

    // Build pipeline
    let pipeline = Pipeline::new()
        .with_extractor(SavedObjectsExtractor::new(client, manifest))
        .with_transformer(FieldDropper::new(vec!["managed", "updated_at"]))
        .with_transformer(FieldUnescaper::new(vec!["attributes"]))
        .with_loader(DirectoryWriter::new(format!("{}/objects", output_dir)));

    // Execute
    pipeline.execute().await
}

pub async fn push_saved_objects(input_dir: &str, managed: bool) -> Result<usize> {
    let client = load_kibana_client()?;

    let pipeline = Pipeline::new()
        .with_extractor(DirectoryReader::new(format!("{}/objects", input_dir)))
        .with_transformer(FieldEscaper::new(vec!["attributes"]))
        .with_transformer(ManagedFlagAdder::new(managed))
        .with_loader(SavedObjectsLoader::new(client));

    pipeline.execute().await
}
```

Pull flow (`kibob pull`):

```
┌─────────────────────────────────────────────────────────────┐
│ 1. Load Manifest                                            │
│    manifest/saved_objects.json → ObjectReference[]          │
└──────────────────────────┬──────────────────────────────────┘
                           │
┌──────────────────────────▼──────────────────────────────────┐
│ 2. SavedObjectsExtractor                                    │
│    POST /api/saved_objects/_export                          │
│    Returns: NDJSON string                                   │
└──────────────────────────┬──────────────────────────────────┘
                           │
┌──────────────────────────▼──────────────────────────────────┐
│ 3. Parse NDJSON → Vec<Value>                                │
│    Parse each line as JSON object                           │
└──────────────────────────┬──────────────────────────────────┘
                           │
┌──────────────────────────▼──────────────────────────────────┐
│ 4. FieldDropper                                             │
│    Remove: managed, updated_at, version, references         │
└──────────────────────────┬──────────────────────────────────┘
                           │
┌──────────────────────────▼──────────────────────────────────┐
│ 5. FieldUnescaper                                           │
│    Convert escaped JSON strings to objects                  │
│    "attributes.kibanaSavedObjectMeta.searchSourceJSON"      │
└──────────────────────────┬──────────────────────────────────┘
                           │
┌──────────────────────────▼──────────────────────────────────┐
│ 6. DirectoryWriter                                          │
│    Write to: objects/{type}/{id}.json                       │
│    Pretty print with 2-space indent                         │
└─────────────────────────────────────────────────────────────┘
```
Push flow (`kibob push`):

```
┌─────────────────────────────────────────────────────────────┐
│ 1. DirectoryReader                                          │
│    Scan: objects/{type}/*.json                              │
│    Returns: Vec<Value>                                      │
└──────────────────────────┬──────────────────────────────────┘
                           │
┌──────────────────────────▼──────────────────────────────────┐
│ 2. FieldEscaper                                             │
│    Convert objects to escaped JSON strings                  │
│    For Kibana compatibility                                 │
└──────────────────────────┬──────────────────────────────────┘
                           │
┌──────────────────────────▼──────────────────────────────────┐
│ 3. ManagedFlagAdder                                         │
│    Add: "managed": true/false                               │
└──────────────────────────┬──────────────────────────────────┘
                           │
┌──────────────────────────▼──────────────────────────────────┐
│ 4. Convert to NDJSON                                        │
│    Serialize each object as single-line JSON                │
│    Join with newlines                                       │
└──────────────────────────┬──────────────────────────────────┘
                           │
┌──────────────────────────▼──────────────────────────────────┐
│ 5. SavedObjectsLoader                                       │
│    POST /api/saved_objects/_import                          │
│    multipart/form-data with NDJSON                          │
│    overwrite=true                                           │
└─────────────────────────────────────────────────────────────┘
```
Adding a new object type requires only a manifest update:
- Update the manifest format (if needed)
- No code changes required! The ETL pipeline is object-type agnostic

Example: Add Canvas workpads

```json
{
  "objects": [
    {
      "type": "canvas-workpad",
      "id": "my-workpad-id",
      "attributes": {"title": "My Workpad"}
    }
  ]
}
```

To add a new storage backend, implement the `Loader` and `Extractor` traits:
```rust
pub struct S3Storage {
    bucket: String,
    prefix: String,
}

#[async_trait]
impl Loader for S3Storage {
    async fn load(&self, data: Vec<Value>) -> Result<usize> {
        // Upload each object to S3 (elided)
        todo!()
    }
}

#[async_trait]
impl Extractor for S3Storage {
    async fn extract(&self) -> Result<Vec<Value>> {
        // Download objects from S3 (elided)
        todo!()
    }
}
```

To add a new transformer, implement the `Transformer` trait:
```rust
pub struct TitlePrefixer {
    prefix: String,
}

#[async_trait]
impl Transformer for TitlePrefixer {
    async fn transform(&self, mut data: Vec<Value>) -> Result<Vec<Value>> {
        for obj in &mut data {
            if let Some(title) = obj.pointer_mut("/attributes/title") {
                // Use as_str() so the prefix is applied to the raw string,
                // not to the JSON-quoted form of the value
                if let Some(s) = title.as_str() {
                    *title = Value::String(format!("{}{}", self.prefix, s));
                }
            }
        }
        Ok(data)
    }
}

// Usage:
pipeline
    .with_transformer(TitlePrefixer::new("[PROD] "))
    .execute().await?;
```

To add a new CLI command:
1. Add a variant to the `Commands` enum in `src/main.rs`
2. Create a helper function in `src/cli.rs`
3. Wire it up in the `match` statement
Example: Add a `validate` command

```rust
// In src/main.rs
Commands::Validate { dir } => {
    validate_project(&dir).await?;
}

// In src/cli.rs
pub async fn validate_project(dir: &str) -> Result<()> {
    // Load manifest
    let manifest = load_saved_objects_manifest(dir)?;

    // Check all referenced files exist
    for obj in &manifest.objects {
        let path = format!("{}/objects/{}/{}.json", dir, obj.type_, obj.id);
        if !Path::new(&path).exists() {
            return Err(eyre!("Missing object file: {}", path));
        }
    }
    Ok(())
}
```

Each module has comprehensive unit tests:
```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_field_dropper() {
        let dropper = FieldDropper::new(vec!["managed"]);
        let input = json!({"id": "abc", "managed": true});
        let output = dropper.drop_fields(input);
        assert_eq!(output, json!({"id": "abc"}));
    }
}
```

Integration tests are located in the `tests/` directory:
```rust
// tests/etl_integration.rs
#[tokio::test]
async fn test_pull_push_roundtrip() -> Result<()> {
    // Create test data
    let temp_dir = TempDir::new()?;
    let dir = temp_dir.path().to_str().unwrap();

    // Pull from Kibana
    pull_saved_objects(dir).await?;

    // Verify files exist
    assert!(temp_dir.path().join("objects/dashboard").exists());

    // Push back to Kibana
    push_saved_objects(dir, true).await?;
    Ok(())
}
```

Mocking uses the ETL traits for dependency injection:
```rust
pub struct MockExtractor {
    pub data: Vec<Value>,
}

#[async_trait]
impl Extractor for MockExtractor {
    async fn extract(&self) -> Result<Vec<Value>> {
        Ok(self.data.clone())
    }
}

// Test pipeline without a real Kibana
let pipeline = Pipeline::new()
    .with_extractor(MockExtractor { data: test_data })
    .with_transformer(FieldDropper::new(vec!["managed"]))
    .with_loader(MockLoader::new());
```

Run tests with coverage:

```shell
cargo test --all
cargo tarpaulin --out Html
```

Current coverage: ~85% (targeting 90%+)
All network and file operations use Tokio for non-blocking I/O:
```rust
// Multiple requests in parallel
let futures = objects.iter()
    .map(|obj| client.fetch_object(obj))
    .collect::<Vec<_>>();
let results = futures::future::join_all(futures).await;
```

Memory efficiency:
- Streaming NDJSON parsing - Don't load the entire export into memory
- Incremental processing - Transform objects one at a time
- String interning - Reuse common strings (type names, field names)
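The streaming point can be illustrated with std's `BufRead`: records are handled one line at a time instead of materializing the whole export. This is a sketch only; real code would parse each line with `serde_json` and feed it to a transformer:

```rust
use std::io::{BufRead, BufReader, Cursor};

/// Process an NDJSON stream line-by-line without buffering the whole
/// input. `handle` is called once per non-empty line; returns the
/// number of records seen.
fn stream_ndjson<R: std::io::Read>(
    reader: R,
    mut handle: impl FnMut(&str),
) -> std::io::Result<usize> {
    let mut count = 0;
    for line in BufReader::new(reader).lines() {
        let line = line?;
        if line.trim().is_empty() {
            continue; // skip blank separator lines
        }
        handle(&line);
        count += 1;
    }
    Ok(count)
}

fn main() -> std::io::Result<()> {
    let export = "{\"type\":\"dashboard\",\"id\":\"abc\"}\n\n\
                  {\"type\":\"visualization\",\"id\":\"xyz\"}\n";
    let n = stream_ndjson(Cursor::new(export), |line| {
        println!("record: {}", line);
    })?;
    assert_eq!(n, 2);
    Ok(())
}
```

Because only one line is resident at a time, peak memory stays proportional to the largest single object rather than the whole export.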
reqwest reuses HTTP connections:

```rust
let client = reqwest::Client::builder()
    .pool_max_idle_per_host(10)
    .build()?;
```

Benchmarks (informal, local Kibana):

```shell
# 100 dashboards, local Kibana
kibob pull ./test-project
# Time: ~2.3s

# 100 dashboards, push
kibob push ./test-project
# Time: ~3.1s

# Memory usage: ~15MB peak
```

Planned optimizations:

- Caching Layer
  - Cache manifests in memory
  - Skip unchanged objects during sync
- Incremental Sync
  - Compare checksums
  - Only transfer changed objects
- Parallel Processing
  - Process multiple objects concurrently
  - Batch API requests
- Plugin System
  - Dynamic transformer loading
  - Custom extractors/loaders as plugins
- Observability
  - Structured logging with `tracing`
  - Metrics collection
  - OpenTelemetry integration
Future features under consideration:
- Watch mode - Auto-sync on file changes
- Bidirectional sync - Merge changes from both sides
- Conflict resolution - Handle concurrent edits
- Delta encoding - Transfer only diffs
Want to extend kibob? See CONTRIBUTING.md for:
- Development setup
- Code style guidelines
- How to add new features
- Pull request process
- Kibana API Docs: https://www.elastic.co/guide/en/kibana/current/api.html
- Tokio Docs: https://tokio.rs/
- Async Trait: https://docs.rs/async-trait/
- reqwest Docs: https://docs.rs/reqwest/
Questions? Open an issue: https://github.com/VimCommando/kibana-object-manager/issues