ai.log
AI-powered static blog generator with ATProto integration, part of the ai.ai ecosystem.
🎯 Gitea Action Usage
Use ailog in your Gitea Actions workflow:
name: Deploy Blog
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: ai/log@v1
        with:
          content-dir: 'content'
          output-dir: 'public'
          ai-integration: true
          atproto-integration: true
      - uses: cloudflare/pages-action@v1
        with:
          apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
          accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
          projectName: my-blog
          directory: public
🚀 Quick Start
Development Setup
# 1. Clone and setup
git clone https://git.syui.ai/ai/log
cd log
# 2. Start development services
./run.zsh serve # Blog development server
./run.zsh c # Cloudflare tunnel (example.com)
./run.zsh o # OAuth web server
./run.zsh co # Comment system monitor
# 3. Start Ollama (for Ask AI)
brew install ollama
ollama pull gemma2:2b
OLLAMA_ORIGINS="https://example.com" ollama serve
Production Deployment
# 1. Build static site
hugo
# 2. Deploy to GitHub Pages
git add .
git commit -m "Update blog"
git push origin main
# 3. Automatic deployment via GitHub Actions
# Site available at: https://yourusername.github.io/repo-name
ATProto Integration
# 1. OAuth Client Setup (oauth/client-metadata.json)
{
  "client_id": "https://example.com/client-metadata.json",
  "client_name": "ai.log Blog System",
  "redirect_uris": ["https://example.com/oauth/callback"],
  "scope": "atproto",
  "grant_types": ["authorization_code", "refresh_token"],
  "response_types": ["code"],
  "application_type": "web",
  "dpop_bound_access_tokens": true
}
# 2. Comment System Configuration
# Collection: ai.syui.log (comments)
# User Management: ai.syui.log.user (registered users)
# 3. Services
./run.zsh o # OAuth authentication server
./run.zsh co # ATProto Jetstream comment monitor
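For orientation, a record written to the ai.syui.log collection can be modeled with serde as in the sketch below. The field names (text, url, createdAt) are illustrative assumptions, not the project's actual lexicon definition.

```rust
use serde::{Deserialize, Serialize};

// Hypothetical shape of an ai.syui.log comment record.
// Field names are illustrative assumptions, not the real lexicon definition.
#[derive(Debug, Serialize, Deserialize)]
struct CommentRecord {
    #[serde(rename = "$type")]
    record_type: String, // "ai.syui.log"
    text: String,        // comment body
    url: String,         // URL of the blog post being commented on
    #[serde(rename = "createdAt")]
    created_at: String,  // ISO 8601 timestamp
}

fn main() -> serde_json::Result<()> {
    let record = CommentRecord {
        record_type: "ai.syui.log".into(),
        text: "Nice post!".into(),
        url: "https://example.com/posts/first-post".into(),
        created_at: "2025-01-01T00:00:00Z".into(),
    };
    println!("{}", serde_json::to_string_pretty(&record)?);
    Ok(())
}
```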
Development with run.zsh
# Development
./run.zsh serve
# Production (with Cloudflare Tunnel)
./run.zsh tunnel
# OAuth app development
./run.zsh o
# Comment system monitoring
./run.zsh co
📋 Commands
Command | Description
---|---
./run.zsh c | Enable Cloudflare tunnel (example.com) for OAuth
./run.zsh o | Start OAuth web server (port 4173 = example.com)
./run.zsh co | Start comment system (ATProto stream monitor)
🏗️ Architecture (Pure Rust + HTML + JS)
ai.log/
├── oauth/ # 🎯 OAuth files (protected)
│ ├── oauth-widget-simple.js # Self-contained OAuth widget
│ ├── oauth-simple.html # OAuth authentication page
│ ├── client-metadata.json # ATProto configuration
│ └── README.md # Usage guide
├── my-blog/ # Blog content and templates
│ ├── content/posts/ # Markdown blog posts
│ ├── templates/ # Tera templates
│ ├── static/ # Static assets (OAuth copied here)
│ └── public/ # Generated site (build output)
├── src/ # Rust blog generator
├── scripts/ # Build and deployment scripts
└── run.zsh # 🎯 Main build script
✅ Node.js Dependencies Eliminated
- ❌ package.json - Removed
- ❌ node_modules/ - Removed
- ❌ npm run build - Not needed
- ✅ Pure JavaScript OAuth implementation
- ✅ CDN-free, self-contained code
- ✅ Rust-only build process
📖 Original Features
Overview
ai.log is a next-generation static blog generator with a modern interface inspired by Anthropic Docs. Through deep integration with ai.gpt, local AI features, and atproto OAuth support, it offers an experience beyond conventional blog systems.
Key Features
🎨 Modern Interface
- Anthropic Docs-style design: professional and easy to read
- Timeline layout: Bluesky-like timeline UI
- Automatic TOC: table of contents generated in the right sidebar
- Responsive: works on mobile and desktop
🤖 Ask AI Feature ✅
- Local AI: question answering via Ollama (gemma2:2b)
- Authentication required: access controlled by ATProto OAuth
- Top page only: answers focused on blog content
- CORS resolved: cross-origin issues handled via the OLLAMA_ORIGINS setting
- Profile integration: displays the ATProto profile image as the AI avatar
- Response tuning: 80-character limit plus high temperature for varied answers
- Loading indicator: single-line loading state using a Font Awesome icon
🔧 Ask AI Setup
# 1. Install Ollama
brew install ollama
ollama pull gemma2:2b
# 2. Start with CORS configured
OLLAMA_ORIGINS="https://example.com" ollama serve
# 3. Configure the AI DID (my-blog/templates/base.html)
const aiConfig = {
systemPrompt: 'You are a helpful AI assistant.',
aiDid: 'did:plc:your-ai-bot-did'
};
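Under the hood, Ask AI sends the question to Ollama's standard /api/generate endpoint. The sketch below shows the shape of that call from Rust, assuming the default localhost:11434 endpoint; the blog's real request (system prompt, length limit, avatar handling) lives in the browser-side code configured above.

```rust
use serde_json::json;

// Minimal sketch of an Ollama /api/generate call (the endpoint Ask AI relies on).
// The prompt and model here are examples; the blog configures its own system prompt.
#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    let body = json!({
        "model": "gemma2:2b",
        "prompt": "Summarize the latest blog post in one sentence.",
        "stream": false
    });
    let res: serde_json::Value = reqwest::Client::new()
        .post("http://localhost:11434/api/generate")
        .json(&body)
        .send()
        .await?
        .json()
        .await?;
    // Ollama returns the generated text in the "response" field.
    println!("{}", res["response"]);
    Ok(())
}
```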
🌐 Decentralized SNS Integration
- atproto OAuth: log in with a Bluesky account
- Comment system: decentralized SNS comments
- Data sovereignty: users own their data
🔗 Ecosystem Integration
- ai.gpt: document sync and AI feature integration
- MCP Server: supports operations driven from ai.gpt
- ai.wiki: automatic documentation sync
Architecture
Dual MCP Integration
ai.log MCP Server (API Layer)
- Role: Independent blog API
- Port: 8002
- Location: ./src/mcp/
- Function: Core blog generation and management
ai.gpt Integration (Server Layer)
- Role: AI integration gateway
- Port: 8001 (within ai.gpt)
- Location: ../src/aigpt/mcp_server.py
- Function: AI memory system + HTTP proxy to ai.log
Data Flow
Claude Code → ai.gpt (Server/AI) → ai.log (API/Blog) → Static Site
                   ↑                       ↑
             Memory System           File Operations
             Relationship AI         Markdown Processing
             Context Analysis        Template Rendering
Features
- Static Blog Generation: Inspired by Zola, built with Rust
- AI-Powered Content: Memory-driven article generation via ai.gpt
- 🌍 Ollama Translation: Multi-language markdown translation with structure preservation
- atproto Integration: OAuth authentication and comment system (planned)
- MCP Integration: Seamless Claude Code workflow
Installation
cargo install ailog
Usage
Standalone Mode
# Initialize a new blog
ailog init myblog
# Create a new post
ailog new "My First Post"
# Build the blog
ailog build
# Serve locally
ailog serve
# Start MCP server
ailog mcp --port 8002
# Generate documentation
ailog doc readme --with-ai
ailog doc api --output ./docs
ailog doc structure --include-deps
# Translate documents (requires Ollama)
ailog doc translate --input README.md --target-lang en
ailog doc translate --input docs/api.md --target-lang ja --model qwen2.5:latest
# Clean build files
ailog clean
AI Ecosystem Integration
When integrated with ai.gpt, use natural language:
- "ブログ記事を書いて" → Triggers
log_ai_content
- "記事一覧を見せて" → Triggers
log_list_posts
- "ブログをビルドして" → Triggers
log_build_blog
Documentation & Translation
Generate comprehensive documentation and translate content:
- "READMEを生成して" → Triggers
log_generate_docs
- "APIドキュメントを作成して" → Generates API documentation
- "プロジェクト構造を解析して" → Creates structure documentation
- "このファイルを英語に翻訳して" → Triggers
log_translate_document
- "マークダウンを日本語に変換して" → Uses Ollama for translation
MCP Tools
ai.log Server (Port 8002)
- create_blog_post - Create new blog post
- list_blog_posts - List existing posts
- build_blog - Build static site
- get_post_content - Get post by slug
- translate_document ⭐ - Ollama-powered markdown translation
- generate_documentation ⭐ - Code analysis and documentation generation
ai.gpt Integration (Port 8001)
- log_create_post - Proxy to ai.log + error handling
- log_list_posts - Proxy to ai.log + formatting
- log_build_blog - Proxy to ai.log + AI features
- log_get_post - Proxy to ai.log + context
- log_system_status - Health check for ai.log
- log_ai_content ⭐ - AI memory → blog content generation
- log_translate_document 🌍 - Document translation via Ollama
- log_generate_docs 📚 - Documentation generation
Documentation Generation Tools
- doc readme - Generate README.md from project analysis
- doc api - Generate API documentation
- doc structure - Analyze and document project structure
- doc changelog - Generate changelog from git history
- doc translate 🌍 - Multi-language document translation
Translation Features
- Language Support: English, Japanese, Chinese, Korean, Spanish
- Markdown Preservation: Code blocks, links, images, tables maintained
- Auto-Detection: Automatically detects Japanese content
- Ollama Integration: Uses local AI models for privacy and cost-efficiency
- Smart Processing: Section-by-section translation with structure awareness
Configuration
ai.log Configuration
- Location: ~/.config/syui/ai/log/
- Format: TOML configuration
ai.gpt Integration
- Configuration: ../config.json
- Auto-detection: ai.log tools enabled when the ./log/ directory exists
- System prompt: Automatically triggers blog tools for related queries
AI Integration Features
Memory-Driven Content Generation
- Source: ai.gpt memory system
- Process: Contextual memories → AI analysis → Blog content
- Output: Structured markdown with personal insights
Automatic Workflows
- Daily blog posts from accumulated memories
- Content enhancement and suggestions
- Related article recommendations
- Multi-language content generation
atproto Integration (Planned)
OAuth 2.0 Authentication
- Client metadata: public/client-metadata.json
- Comment system integration
- Data sovereignty: Users own their comments
- Collection storage in atproto
Comment System
- ATProto Stream Monitoring: Real-time Jetstream connection monitoring
- Collection Tracking: Monitors the ai.syui.log collection for new comments
- User Management: Automatically adds commenting users to the ai.syui.log.user collection
- Comment Display: Fetches and displays comments from registered users
- OAuth Integration: atproto account login via Cloudflare tunnel
- Distributed Storage: Comments stored in user-owned atproto collections
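The monitoring loop itself is conceptually simple: subscribe to a Jetstream instance, filter commit events for the watched collection, and record new comments and their authors. A minimal sketch of that idea, assuming tokio-tungstenite and a public Jetstream endpoint (the project's actual implementation may differ):

```rust
use futures_util::StreamExt;
use tokio_tungstenite::connect_async;

// Sketch only: subscribe to Jetstream and print commits to the ai.syui.log collection.
// The endpoint URL and event handling are assumptions, not the project's real code.
#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let url =
        "wss://jetstream2.us-east.bsky.network/subscribe?wantedCollections=ai.syui.log";
    let (mut ws, _) = connect_async(url).await?;
    while let Some(msg) = ws.next().await {
        let msg = msg?;
        if !msg.is_text() {
            continue;
        }
        let event: serde_json::Value = serde_json::from_str(msg.to_text()?)?;
        // Only react to new records in the watched collection.
        if event["commit"]["collection"] == "ai.syui.log"
            && event["commit"]["operation"] == "create"
        {
            println!("new comment from {}", event["did"]);
            // The real monitor would also register the author in ai.syui.log.user.
        }
    }
    Ok(())
}
```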
Build & Deploy
GitHub Actions
# .github/workflows/gh-pages.yml
- name: Build ai.log
  run: |
    cd log
    cargo build --release
    ./target/release/ailog build
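For context, that build step sits inside a normal Pages workflow that checks out the repository and then publishes the generated site. A minimal sketch, assuming the widely used peaceiris/actions-gh-pages action and a ./public output directory; the repository's actual workflow may differ:

```yaml
# Sketch only: paths and the deploy action are assumptions.
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build ai.log
        run: |
          cd log
          cargo build --release
          ./target/release/ailog build
      - name: Deploy to GitHub Pages
        uses: peaceiris/actions-gh-pages@v4
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./public
```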
Cloudflare Pages
- Static output: ./public/
- Automatic deployment on main branch push
- AI content generation during build process
Development Status
✅ Completed Features
- Project structure and Cargo.toml setup
- CLI interface (init, new, build, serve, clean, mcp, doc)
- Configuration system with TOML support
- Markdown parsing with frontmatter support
- Template system with Handlebars
- Static site generation with posts and pages
- Development server with hot reload
- MCP server integration (both layers)
- ai.gpt integration with 6 tools
- AI memory system connection
- 📚 Documentation generation from code
- 🔍 Rust project analysis and API extraction
- 📝 README, API docs, and structure analysis
- 🌍 Ollama-powered translation system
- 🚀 Complete MCP integration with ai.gpt
- 📄 Markdown-aware translation preserving structure
- 💬 ATProto comment system with Jetstream monitoring
- 🔄 Real-time comment collection and user management
- 🔐 OAuth 2.1 integration with Cloudflare tunnel
- 🤖 Ask AI feature with Ollama integration
- ⚡ CORS resolution via OLLAMA_ORIGINS
- 🔒 Authentication-gated AI chat
- 📱 Top-page-only AI access pattern
- Test blog with sample content and styling
🚧 In Progress
- AI-powered content enhancement pipeline
- Advanced comment moderation system
📋 Planned Features
- Advanced template customization
- Plugin system for extensibility
- Real-time comment system
- Multi-blog management
- VTuber integration (ai.verse connection)
Integration with ai Ecosystem
System Dependencies
- ai.gpt: Memory system, relationship tracking, AI provider
- ai.card: Future cross-system content sharing
- ai.bot: atproto posting and mention handling
- ai.verse: 3D world blog representation (future)
yui System Compliance
- Uniqueness: Each blog post tied to individual identity
- Reality Reflection: Personal memories → digital content
- Irreversibility: Published content maintains historical integrity
Getting Started
1. Standalone Usage
git clone [repository]
cd log
cargo run -- init my-blog
cargo run -- new "First Post"
cargo run -- build
cargo run -- serve
2. AI Ecosystem Integration
# Start ai.log MCP server
cargo run -- mcp --port 8002
# In another terminal, start ai.gpt
cd ../
# ai.gpt startup commands
# Use Claude Code with natural language blog commands
Documentation Generation Features
📚 Automatic README Generation
# Generate README from project analysis
ailog doc readme --source ./src --with-ai
# Output: Enhanced README.md with:
# - Project overview and metrics
# - Dependency analysis
# - Module structure
# - AI-generated insights
📖 API Documentation
# Generate comprehensive API docs
ailog doc api --source ./src --format markdown --output ./docs
# Creates:
# - docs/api.md (main API overview)
# - docs/module_name.md (per-module documentation)
# - Function signatures and documentation
# - Struct/enum definitions
🏗️ Project Structure Analysis
# Analyze and document project structure
ailog doc structure --source . --include-deps
# Generates:
# - Directory tree visualization
# - File distribution by language
# - Dependency graph analysis
# - Code metrics and statistics
📝 Git Changelog Generation
# Generate changelog from git history
ailog doc changelog --from v1.0.0 --explain-changes
# Creates:
# - Structured changelog
# - Commit categorization
# - AI-enhanced change explanations
🤖 AI-Enhanced Documentation
When --with-ai is enabled:
- Content Enhancement: AI improves readability and adds insights
- Context Awareness: Leverages ai.gpt memory system
- Smart Categorization: Automatic organization of content
- Technical Writing: Professional documentation style
🌍 Translation System
Ollama-Powered Translation
ai.log includes a comprehensive translation system powered by Ollama AI models:
# Basic translation
ailog doc translate --input README.md --target-lang en
# Advanced translation with custom settings
ailog doc translate \
--input docs/technical-guide.ja.md \
--target-lang en \
--source-lang ja \
--output docs/technical-guide.en.md \
--model qwen2.5:latest \
--ollama-endpoint http://localhost:11434
Translation Features
📄 Markdown-Aware Processing
- Code Block Preservation: All code snippets remain untranslated
- Link Maintenance: URLs and link structures preserved
- Image Handling: Alt text can be translated while preserving image paths
- Table Translation: Table content translated while maintaining structure
- Header Preservation: Markdown headers translated with level maintenance
🎯 Smart Language Detection
- Auto-Detection: Automatically detects Japanese content using Unicode ranges
- Manual Override: Specify source language for precise control
- Mixed Content: Handles documents with multiple languages
🔧 Flexible Configuration
- Model Selection: Choose from available Ollama models
- Custom Endpoints: Use different Ollama instances
- Output Control: Auto-generate or specify output paths
- Batch Processing: Process multiple files efficiently
Supported Languages
Language | Code | Direction | Model Optimized
---|---|---|---
English | en | ↔️ | ✅ qwen2.5
Japanese | ja | ↔️ | ✅ qwen2.5
Chinese | zh | ↔️ | ✅ qwen2.5
Korean | ko | ↔️ | ⚠️ Basic
Spanish | es | ↔️ | ⚠️ Basic
Translation Workflow
- Parse Document: Analyze markdown structure and identify sections
- Preserve Code: Isolate code blocks and technical content
- Translate Content: Process text sections with Ollama AI
- Reconstruct: Rebuild document maintaining original formatting
- Validate: Ensure structural integrity and completeness
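The key step is the second one: code must be separated from prose before anything is sent to the model. The sketch below shows one way to do that segmentation for fenced code blocks; it illustrates the approach rather than ailog's actual implementation, and translate_stub stands in for the Ollama call.

```rust
// Split a markdown document into (is_code, text) segments so that prose can be
// translated while fenced code blocks pass through untouched.
fn split_markdown(doc: &str) -> Vec<(bool, String)> {
    let mut segments: Vec<(bool, String)> = Vec::new();
    let mut in_code = false;
    for line in doc.lines() {
        let is_fence = line.trim_start().starts_with("```");
        let opening = is_fence && !in_code;
        if opening {
            in_code = true; // the opening fence line belongs to the code segment
        }
        match segments.last_mut() {
            Some((is_code, text)) if *is_code == in_code => {
                text.push_str(line);
                text.push('\n');
            }
            _ => segments.push((in_code, format!("{line}\n"))),
        }
        if is_fence && !opening {
            in_code = false; // the closing fence ends the code segment
        }
    }
    segments
}

// Stand-in for the real Ollama request; returns the prose unchanged.
fn translate_stub(prose: &str) -> String {
    prose.to_string()
}

fn main() {
    let doc = ["# Title", "", "Some prose.", "", "```rust", "fn kept() {}", "```", "", "More prose.", ""]
        .join("\n");
    let rebuilt: String = split_markdown(&doc)
        .into_iter()
        .map(|(is_code, text)| if is_code { text } else { translate_stub(&text) })
        .collect();
    print!("{rebuilt}");
}
```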
Integration with ai.gpt
# Via ai.gpt MCP tools
await log_translate_document(
input_file="README.ja.md",
target_lang="en",
model="qwen2.5:latest"
)
Requirements
- Ollama: Install and run Ollama locally
- Models: Download supported models (qwen2.5:latest recommended)
- Memory: Sufficient RAM for model inference
- Network: For initial model download only
Configuration Examples
Basic Blog Config
[blog]
title = "My AI Blog"
description = "Personal thoughts and AI insights"
base_url = "https://myblog.example.com"
[ai]
provider = "openai"
model = "gpt-4"
translation = true
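On the Rust side, a config of this shape can be read with serde and the toml crate. The structs below mirror the example above and are only a sketch; ailog's real configuration types may have more fields.

```rust
use serde::Deserialize;

// Hypothetical structs mirroring the example config above; not ailog's actual types.
#[derive(Debug, Deserialize)]
struct Config {
    blog: Blog,
    ai: Ai,
}

#[derive(Debug, Deserialize)]
struct Blog {
    title: String,
    description: String,
    base_url: String,
}

#[derive(Debug, Deserialize)]
struct Ai {
    provider: String,
    model: String,
    translation: bool,
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumed filename; the config lives under ~/.config/syui/ai/log/ per the section above.
    let raw = std::fs::read_to_string("config.toml")?;
    let config: Config = toml::from_str(&raw)?;
    println!("building \"{}\" for {}", config.blog.title, config.blog.base_url);
    Ok(())
}
```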
Advanced Integration
// ../config.json (ai.gpt)
{
  "mcp": {
    "servers": {
      "ai_gpt": {
        "endpoints": {
          "log_ai_content": "/log_ai_content",
          "log_create_post": "/log_create_post"
        }
      }
    }
  }
}
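Those endpoints are plain HTTP routes on the ai.gpt server (port 8001), so they can be exercised directly. A minimal sketch, assuming a JSON body with title and content fields; the actual request schema is defined by ai.gpt and may differ:

```rust
use serde_json::json;

// Sketch only: the endpoint path comes from the config above, but the request and
// response shapes are assumptions, not ai.gpt's actual schema.
#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    let res = reqwest::Client::new()
        .post("http://localhost:8001/log_create_post")
        .json(&json!({ "title": "Hello", "content": "First post via MCP" }))
        .send()
        .await?;
    println!("status: {}", res.status());
    Ok(())
}
```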
Troubleshooting
MCP Connection Issues
- Ensure the ai.log server is running: cargo run -- mcp --port 8002
- Check that the ai.gpt config includes the log endpoints
- Verify the ./log/ directory exists relative to ai.gpt
Build Failures
- Check Rust version: rustc --version
- Update dependencies: cargo update
- Clear cache: cargo clean
AI Integration Problems
- Verify ai.gpt memory system is initialized
- Check AI provider configuration
- Ensure sufficient context in memory system
systemd
$ sudo vim /usr/lib/systemd/system/ollama.service
[Service]
Environment="OLLAMA_ORIGINS=https://example.com"
# Copy the unit files into the systemd directory
sudo cp ./systemd/system/ailog-stream.service /etc/systemd/system/
sudo cp ./systemd/system/cloudflared-log.service /etc/systemd/system/
# Set permissions
sudo chmod 644 /etc/systemd/system/ailog-stream.service
sudo chmod 644 /etc/systemd/system/cloudflared-log.service
# Reload the systemd configuration
sudo systemctl daemon-reload
# Enable and start the services
sudo systemctl enable ailog-stream.service
sudo systemctl enable cloudflared-log.service
sudo systemctl start ailog-stream.service
sudo systemctl start cloudflared-log.service
# Check service status
sudo systemctl status ailog-stream.service
sudo systemctl status cloudflared-log.service
# Follow the logs
journalctl -u ailog-stream.service -f
journalctl -u cloudflared-log.service -f
Configuration notes:
- User=syui runs the services under a user account
- Restart=always automatically restarts them on abnormal exit
- After=network.target starts them only after the network is up
- StandardOutput=journal makes logs available via journalctl
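Put together, ailog-stream.service looks roughly like the sketch below. The ExecStart command and paths are placeholders; only the User, Restart, After, and StandardOutput settings are taken from the notes above.

```ini
# Sketch only: adjust ExecStart and WorkingDirectory to the actual install.
[Unit]
Description=ailog ATProto comment stream monitor
After=network.target

[Service]
User=syui
WorkingDirectory=/home/syui/log
ExecStart=/home/syui/log/target/release/ailog <comment-monitor-subcommand>
Restart=always
StandardOutput=journal

[Install]
WantedBy=multi-user.target
```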
License
© syui
Part of the ai ecosystem: ai.gpt, ai.card, ai.log, ai.bot, ai.verse, ai.shell