ai.log
A Rust-based static blog generator with AI integration capabilities.
Overview
ai.log is part of the ai ecosystem: a static site generator that creates blogs with built-in AI features for content enhancement and atproto integration. The system follows the yui system principles and uses a dual-layer MCP architecture.
Architecture
Dual MCP Integration
ai.log MCP Server (API Layer)
- Role: Independent blog API
- Port: 8002
- Location: ./src/mcp/
- Function: Core blog generation and management
ai.gpt Integration (Server Layer)
- Role: AI integration gateway
- Port: 8001 (within ai.gpt)
- Location: ../src/aigpt/mcp_server.py
- Function: AI memory system + HTTP proxy to ai.log
Data Flow
Claude Code → ai.gpt (Server/AI) → ai.log (API/Blog) → Static Site
                    ↑                      ↑
              Memory System          File Operations
              Relationship AI        Markdown Processing
              Context Analysis       Template Rendering
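Conceptually, ai.gpt sits in front of ai.log as an HTTP proxy: blog-related tool calls are forwarded to the ai.log API layer, which writes markdown and renders the static site. The sketch below illustrates that hop in Rust; the endpoint path and payload shape are hypothetical (the real integration layer is the Python server at ../src/aigpt/mcp_server.py), so read it as an illustration of the data flow, not the actual API.

// Illustration of the proxy hop from ai.gpt to ai.log.
// Hypothetical route and payload; requires the reqwest (blocking, json)
// and serde_json crates.
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::new();

    // A blog-related request is forwarded to the ai.log MCP server on port 8002.
    let response = client
        .post("http://localhost:8002/tools/create_blog_post") // hypothetical route
        .json(&json!({
            "title": "Daily Notes",
            "content": "Generated from ai.gpt memories..."
        }))
        .send()?;

    // ai.log stores the markdown post and later renders it into the static site.
    println!("ai.log responded with status {}", response.status());
    Ok(())
}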
Features
- Static Blog Generation: Inspired by Zola, built with Rust
- AI-Powered Content: Memory-driven article generation via ai.gpt
- 🌍 Ollama Translation: Multi-language markdown translation with structure preservation
- atproto Integration: OAuth authentication and comment system (planned)
- MCP Integration: Seamless Claude Code workflow
Installation
cargo install ailog
Usage
Standalone Mode
# Initialize a new blog
ailog init myblog
# Create a new post
ailog new "My First Post"
# Build the blog
ailog build
# Serve locally
ailog serve
# Start MCP server
ailog mcp --port 8002
# Generate documentation
ailog doc readme --with-ai
ailog doc api --output ./docs
ailog doc structure --include-deps
# Translate documents (requires Ollama)
ailog doc translate --input README.md --target-lang en
ailog doc translate --input docs/api.md --target-lang ja --model qwen2.5:latest
# Clean build files
ailog clean
AI Ecosystem Integration
When integrated with ai.gpt, use natural language:
- "ブログ記事を書いて" → Triggers
log_ai_content
- "記事一覧を見せて" → Triggers
log_list_posts
- "ブログをビルドして" → Triggers
log_build_blog
Documentation & Translation
Generate comprehensive documentation and translate content:
- "READMEを生成して" → Triggers
log_generate_docs
- "APIドキュメントを作成して" → Generates API documentation
- "プロジェクト構造を解析して" → Creates structure documentation
- "このファイルを英語に翻訳して" → Triggers
log_translate_document
- "マークダウンを日本語に変換して" → Uses Ollama for translation
MCP Tools
ai.log Server (Port 8002)
- create_blog_post - Create new blog post
- list_blog_posts - List existing posts
- build_blog - Build static site
- get_post_content - Get post by slug
- translate_document ⭐ - Ollama-powered markdown translation
- generate_documentation ⭐ - Code analysis and documentation generation
ai.gpt Integration (Port 8001)
- log_create_post - Proxy to ai.log + error handling
- log_list_posts - Proxy to ai.log + formatting
- log_build_blog - Proxy to ai.log + AI features
- log_get_post - Proxy to ai.log + context
- log_system_status - Health check for ai.log
- log_ai_content ⭐ - AI memory → blog content generation
- log_translate_document 🌍 - Document translation via Ollama
- log_generate_docs 📚 - Documentation generation
Documentation Generation Tools
- doc readme - Generate README.md from project analysis
- doc api - Generate API documentation
- doc structure - Analyze and document project structure
- doc changelog - Generate changelog from git history
- doc translate 🌍 - Multi-language document translation
Translation Features
- Language Support: English, Japanese, Chinese, Korean, Spanish
- Markdown Preservation: Code blocks, links, images, tables maintained
- Auto-Detection: Automatically detects Japanese content
- Ollama Integration: Uses local AI models for privacy and cost-efficiency
- Smart Processing: Section-by-section translation with structure awareness
Configuration
ai.log Configuration
- Location: ~/.config/syui/ai/log/
- Format: TOML configuration
ai.gpt Integration
- Configuration: ../config.json
- Auto-detection: ai.log tools enabled when the ./log/ directory exists
- System prompt: Automatically triggers blog tools for related queries
AI Integration Features
Memory-Driven Content Generation
- Source: ai.gpt memory system
- Process: Contextual memories → AI analysis → Blog content
- Output: Structured markdown with personal insights
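As a rough illustration of that process, the sketch below shows how contextual memories could be turned into a structured markdown post. The types, field names, and Zola-style +++ frontmatter are hypothetical; the actual ai.gpt data model and ai.log frontmatter format may differ.

// Hypothetical sketch: turning ai.gpt memories into a markdown post.
struct Memory {
    summary: String,
    insight: String,
}

fn render_post(title: &str, memories: &[Memory]) -> String {
    // Zola-style TOML frontmatter shown for illustration only.
    let mut post = format!("+++\ntitle = \"{}\"\n+++\n\n", title);
    for memory in memories {
        // Each memory becomes a short section with the AI's insight attached.
        post.push_str(&format!("## {}\n\n{}\n\n", memory.summary, memory.insight));
    }
    post
}

fn main() {
    let memories = vec![Memory {
        summary: "Conversation about Rust static site generators".into(),
        insight: "Zola-style templating keeps the build fast and simple.".into(),
    }];
    println!("{}", render_post("Daily Notes", &memories));
}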
Automatic Workflows
- Daily blog posts from accumulated memories
- Content enhancement and suggestions
- Related article recommendations
- Multi-language content generation
atproto Integration (Planned)
OAuth 2.0 Authentication
- Client metadata: public/client-metadata.json
- Comment system integration
- Data sovereignty: Users own their comments
- Collection storage in atproto
Comment System
- atproto account login
- Distributed comment storage
- Real-time comment synchronization
Build & Deploy
GitHub Actions
# .github/workflows/gh-pages.yml
- name: Build ai.log
  run: |
    cd log
    cargo build --release
    ./target/release/ailog build
Cloudflare Pages
- Static output: ./public/
- Automatic deployment on main branch push
- AI content generation during build process
Development Status
✅ Completed Features
- Project structure and Cargo.toml setup
- CLI interface (init, new, build, serve, clean, mcp, doc)
- Configuration system with TOML support
- Markdown parsing with frontmatter support
- Template system with Handlebars
- Static site generation with posts and pages
- Development server with hot reload
- MCP server integration (both layers)
- ai.gpt integration with 6 tools
- AI memory system connection
- 📚 Documentation generation from code
- 🔍 Rust project analysis and API extraction
- 📝 README, API docs, and structure analysis
- 🌍 Ollama-powered translation system
- 🚀 Complete MCP integration with ai.gpt
- 📄 Markdown-aware translation preserving structure
- Test blog with sample content and styling
🚧 In Progress
- AI-powered content enhancement pipeline
- atproto OAuth implementation
📋 Planned Features
- Advanced template customization
- Plugin system for extensibility
- Real-time comment system
- Multi-blog management
- VTuber integration (ai.verse connection)
Integration with ai Ecosystem
System Dependencies
- ai.gpt: Memory system, relationship tracking, AI provider
- ai.card: Future cross-system content sharing
- ai.bot: atproto posting and mention handling
- ai.verse: 3D world blog representation (future)
yui System Compliance
- Uniqueness: Each blog post tied to individual identity
- Reality Reflection: Personal memories → digital content
- Irreversibility: Published content maintains historical integrity
Getting Started
1. Standalone Usage
git clone [repository]
cd log
cargo run -- init my-blog
cargo run -- new "First Post"
cargo run -- build
cargo run -- serve
2. AI Ecosystem Integration
# Start ai.log MCP server
cargo run -- mcp --port 8002
# In another terminal, start ai.gpt
cd ../
# ai.gpt startup commands
# Use Claude Code with natural language blog commands
Documentation Generation Features
📚 Automatic README Generation
# Generate README from project analysis
ailog doc readme --source ./src --with-ai
# Output: Enhanced README.md with:
# - Project overview and metrics
# - Dependency analysis
# - Module structure
# - AI-generated insights
📖 API Documentation
# Generate comprehensive API docs
ailog doc api --source ./src --format markdown --output ./docs
# Creates:
# - docs/api.md (main API overview)
# - docs/module_name.md (per-module documentation)
# - Function signatures and documentation
# - Struct/enum definitions
🏗️ Project Structure Analysis
# Analyze and document project structure
ailog doc structure --source . --include-deps
# Generates:
# - Directory tree visualization
# - File distribution by language
# - Dependency graph analysis
# - Code metrics and statistics
📝 Git Changelog Generation
# Generate changelog from git history
ailog doc changelog --from v1.0.0 --explain-changes
# Creates:
# - Structured changelog
# - Commit categorization
# - AI-enhanced change explanations
🤖 AI-Enhanced Documentation
When --with-ai is enabled:
- Content Enhancement: AI improves readability and adds insights
- Context Awareness: Leverages ai.gpt memory system
- Smart Categorization: Automatic organization of content
- Technical Writing: Professional documentation style
🌍 Translation System
Ollama-Powered Translation
ai.log includes a comprehensive translation system powered by Ollama AI models:
# Basic translation
ailog doc translate --input README.md --target-lang en
# Advanced translation with custom settings
ailog doc translate \
  --input docs/technical-guide.ja.md \
  --target-lang en \
  --source-lang ja \
  --output docs/technical-guide.en.md \
  --model qwen2.5:latest \
  --ollama-endpoint http://localhost:11434
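Under the hood, translation talks to a locally running Ollama instance over its HTTP API. Below is a minimal Rust sketch of such a call using Ollama's standard /api/generate endpoint; the prompt wording and request shape that ailog actually builds may differ.

// Minimal sketch of a translation request against the Ollama HTTP API.
// Requires the reqwest (blocking, json) and serde_json crates.
use serde_json::{json, Value};

fn translate(text: &str, target_lang: &str) -> Result<String, Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::new();
    let body = json!({
        "model": "qwen2.5:latest",
        "prompt": format!("Translate the following markdown into {}:\n\n{}", target_lang, text),
        "stream": false
    });
    let response: Value = client
        .post("http://localhost:11434/api/generate")
        .json(&body)
        .send()?
        .json()?;
    // The non-streaming response carries the generated text in the "response" field.
    Ok(response["response"].as_str().unwrap_or_default().to_string())
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("{}", translate("# はじめに\nこれはテストです。", "en")?);
    Ok(())
}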
Translation Features
📄 Markdown-Aware Processing
- Code Block Preservation: All code snippets remain untranslated
- Link Maintenance: URLs and link structures preserved
- Image Handling: Alt text can be translated while preserving image paths
- Table Translation: Table content translated while maintaining structure
- Header Preservation: Markdown headers translated with level maintenance
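Code-block preservation boils down to tracking fenced regions while walking the document. The simplified sketch below shows the idea; it is not the actual ailog implementation, and fence markers would be re-attached during reconstruction (omitted here for brevity).

// Simplified sketch: split markdown into translatable text and preserved
// code blocks by tracking ``` fences line by line.
enum Segment {
    Text(String),
    Code(String),
}

fn split_markdown(source: &str) -> Vec<Segment> {
    let mut segments = Vec::new();
    let mut in_code = false;
    let mut buffer = String::new();
    for line in source.lines() {
        if line.trim_start().starts_with("```") {
            // Close the current segment and flip between text and code mode.
            if !buffer.is_empty() {
                segments.push(if in_code {
                    Segment::Code(buffer.clone())
                } else {
                    Segment::Text(buffer.clone())
                });
                buffer.clear();
            }
            in_code = !in_code;
            continue;
        }
        buffer.push_str(line);
        buffer.push('\n');
    }
    if !buffer.is_empty() {
        segments.push(if in_code { Segment::Code(buffer) } else { Segment::Text(buffer) });
    }
    // Text segments go to the translator; Code segments are copied through unchanged.
    segments
}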
🎯 Smart Language Detection
- Auto-Detection: Automatically detects Japanese content using Unicode ranges
- Manual Override: Specify source language for precise control
- Mixed Content: Handles documents with multiple languages
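Detection by Unicode range is straightforward; here is a minimal sketch of the idea (the exact ranges ailog checks may differ).

// Minimal sketch: treat a document as Japanese if it contains Hiragana,
// Katakana, or CJK ideographs.
fn contains_japanese(text: &str) -> bool {
    text.chars().any(|c| matches!(c,
        '\u{3040}'..='\u{309F}'   // Hiragana
        | '\u{30A0}'..='\u{30FF}' // Katakana
        | '\u{4E00}'..='\u{9FFF}' // CJK Unified Ideographs
    ))
}

fn main() {
    assert!(contains_japanese("これは日本語のドキュメントです"));
    assert!(!contains_japanese("This document is English only."));
    println!("language detection sketch OK");
}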
🔧 Flexible Configuration
- Model Selection: Choose from available Ollama models
- Custom Endpoints: Use different Ollama instances
- Output Control: Auto-generate or specify output paths
- Batch Processing: Process multiple files efficiently
Supported Languages
| Language | Code | Direction | Model Optimized |
|---|---|---|---|
| English | en | ↔️ | ✅ qwen2.5 |
| Japanese | ja | ↔️ | ✅ qwen2.5 |
| Chinese | zh | ↔️ | ✅ qwen2.5 |
| Korean | ko | ↔️ | ⚠️ Basic |
| Spanish | es | ↔️ | ⚠️ Basic |
Translation Workflow
1. Parse Document: Analyze markdown structure and identify sections
2. Preserve Code: Isolate code blocks and technical content
3. Translate Content: Process text sections with Ollama AI
4. Reconstruct: Rebuild document maintaining original formatting
5. Validate: Ensure structural integrity and completeness
Integration with ai.gpt
# Via ai.gpt MCP tools
await log_translate_document(
    input_file="README.ja.md",
    target_lang="en",
    model="qwen2.5:latest"
)
Requirements
- Ollama: Install and run Ollama locally
- Models: Download supported models (qwen2.5:latest recommended)
- Memory: Sufficient RAM for model inference
- Network: For initial model download only
Configuration Examples
Basic Blog Config
[blog]
title = "My AI Blog"
description = "Personal thoughts and AI insights"
base_url = "https://myblog.example.com"
[ai]
provider = "openai"
model = "gpt-4"
translation = true
Advanced Integration
// ../config.json (ai.gpt)
{
  "mcp": {
    "servers": {
      "ai_gpt": {
        "endpoints": {
          "log_ai_content": "/log_ai_content",
          "log_create_post": "/log_create_post"
        }
      }
    }
  }
}
Troubleshooting
MCP Connection Issues
- Ensure the ai.log server is running: cargo run -- mcp --port 8002
- Check that the ai.gpt config includes the log endpoints
- Verify that the ./log/ directory exists relative to ai.gpt
Build Failures
- Check Rust version: rustc --version
- Update dependencies: cargo update
- Clear cache: cargo clean
AI Integration Problems
- Verify ai.gpt memory system is initialized
- Check AI provider configuration
- Ensure sufficient context in memory system
License
© syui
Part of the ai ecosystem: ai.gpt, ai.card, ai.log, ai.bot, ai.verse, ai.shell