add ask AI

2025-06-13 15:01:08 +09:00
parent 962017f922
commit fb0e5107cf
14 changed files with 506 additions and 383 deletions

View File

@ -34,7 +34,8 @@
"Bash(./run.zsh:*)",
"Bash(npm run dev:*)",
"Bash(./target/release/ailog:*)",
"Bash(rg:*)"
"Bash(rg:*)",
"Bash(../target/release/ailog build)"
],
"deny": []
}

View File

@ -1,150 +0,0 @@
# ai.log Deployment Guide
## 🌐 Cloudflare Tunnel Setup
ATProto OAuth requires HTTPS for proper CORS handling. Use Cloudflare Tunnel for secure deployment.
### Prerequisites
1. **Install cloudflared**:
```bash
brew install cloudflared
```
2. **Login and create tunnel** (if not already done):
```bash
cloudflared tunnel login
cloudflared tunnel create ailog
```
3. **Configure DNS**:
- Add a CNAME record: `log.syui.ai` → `[tunnel-id].cfargotunnel.com`
### Configuration Files
#### `cloudflared-config.yml`
```yaml
tunnel: a6813327-f880-485d-a9d1-376e6e3df8ad
credentials-file: /Users/syui/.cloudflared/a6813327-f880-485d-a9d1-376e6e3df8ad.json
ingress:
- hostname: log.syui.ai
service: http://localhost:8080
originRequest:
noHappyEyeballs: true
- service: http_status:404
```
#### Production Client Metadata
`static/client-metadata-prod.json`:
```json
{
"client_id": "https://log.syui.ai/client-metadata.json",
"client_name": "ai.log Blog Comment System",
"client_uri": "https://log.syui.ai",
"redirect_uris": ["https://log.syui.ai/"],
"grant_types": ["authorization_code"],
"response_types": ["code"],
"token_endpoint_auth_method": "none",
"application_type": "web"
}
```
### Deployment Commands
#### Quick Start
```bash
# All-in-one deployment
./scripts/tunnel.sh
```
#### Manual Steps
```bash
# 1. Build for production
PRODUCTION=true cargo run -- build
# 2. Start local server
cargo run -- serve --port 8080 &
# 3. Start tunnel
cloudflared tunnel --config cloudflared-config.yml run
```
### Environment Detection
The system automatically detects the environment (a minimal sketch follows this list):
- **Development** (`localhost:8080`): Uses local client-metadata.json
- **Production** (`log.syui.ai`): Uses HTTPS client-metadata.json
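A minimal sketch of what this detection can look like on the client (the helper name `resolveClientMetadataUrl` is illustrative; the hostnames and URLs are the ones listed above):
```typescript
// Pick the OAuth client metadata URL based on the current hostname.
function resolveClientMetadataUrl(): string {
  const { hostname, origin } = window.location;
  if (hostname === 'localhost' || hostname === '127.0.0.1') {
    // Development: metadata served by the local server on port 8080
    return `${origin}/client-metadata.json`;
  }
  // Production: HTTPS metadata served behind the Cloudflare Tunnel
  return 'https://log.syui.ai/client-metadata.json';
}

console.log('Client ID:', resolveClientMetadataUrl());
```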
### CORS Resolution
✅ **With Cloudflare Tunnel**:
- HTTPS domain: `https://log.syui.ai`
- Valid SSL certificate
- Proper CORS headers
- ATProto OAuth works correctly
❌ **With localhost**:
- HTTP only: `http://localhost:8080`
- CORS restrictions
- ATProto OAuth may fail
### Troubleshooting
#### ATProto OAuth Errors
```javascript
// Check client metadata URL in browser console
console.log('Environment:', window.location.hostname);
console.log('Client ID:', clientId);
```
#### Tunnel Connection Issues
```bash
# Check tunnel status
cloudflared tunnel info ailog
# Test local server
curl http://localhost:8080/client-metadata.json
```
#### DNS Propagation
```bash
# Check DNS resolution
dig log.syui.ai
nslookup log.syui.ai
```
### Security Notes
- **Client metadata** is publicly accessible (required by ATProto)
- **Credentials file** contains tunnel secrets (keep secure)
- **HTTPS only** for production OAuth
- **Domain validation** by ATProto servers
### Integration with ai.ai Ecosystem
This deployment enables:
- **ai.log**: Comment system with ATProto authentication
- **ai.card**: Shared OAuth widget
- **ai.gpt**: Memory synchronization via ATProto
- **ai.verse**: Future 3D world integration
### Monitoring
```bash
# Monitor tunnel logs
cloudflared tunnel --config cloudflared-config.yml run --loglevel debug
# Monitor blog server
tail -f /path/to/blog/logs
# Check ATProto connectivity
curl -I https://log.syui.ai/client-metadata.json
```
---
**🔗 Live URL**: https://log.syui.ai
**📊 Status**: Production Ready
**🌐 ATProto**: OAuth Enabled

View File

@ -1,205 +0,0 @@
# OAuth Integration Changes for ai.log
## Overview
A comment system using ATProto/Bluesky OAuth authentication has been integrated into the ailog blog system.
## Implemented Features
### 1. OAuth Authentication System
- Full OAuth 2.1 flow using **ATProto BrowserOAuthClient**
- One-click sign-in with a Bluesky account
- Session persistence and refresh-token support
### 2. Comment System
- Comment posting by authenticated users
- Direct storage to the ATProto collection (`ai.syui.log`)
- Real-time comment display and deletion
- Comment retrieval across multiple PDSes
### 3. Admin Features
- User list management for administrators
- DID resolution and automatic profile fetching (see the sketch after this list)
- Record display and editing in JSON format
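As a sketch, the DID-to-profile resolution can go through the public Bluesky AppView, the same endpoint the OAuth app's components call in this commit (the `ResolvedProfile` shape is an assumption for illustration):
```typescript
interface ResolvedProfile {
  did: string;
  handle: string;
  displayName?: string;
  avatar?: string;
}

// Resolve a DID (or handle) to profile information via the public AppView.
async function resolveProfile(actor: string): Promise<ResolvedProfile> {
  const res = await fetch(
    `https://public.api.bsky.app/xrpc/app.bsky.actor.getProfile?actor=${encodeURIComponent(actor)}`
  );
  if (!res.ok) throw new Error(`Profile lookup failed: ${res.status}`);
  const data = await res.json();
  return {
    did: data.did,
    handle: data.handle,
    displayName: data.displayName,
    avatar: data.avatar,
  };
}
```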
## Technical Changes
### aicard-web-oauth (React OAuth App)
#### New Files
```
aicard-web-oauth/
├── src/
│ ├── services/
│ │ ├── atproto-oauth.ts # BrowserOAuthClient wrapper
│ │ └── auth.ts # Legacy auth service
│ ├── components/
│ │ ├── OAuthCallback.tsx # OAuth callback handler
│ │ └── OAuthCallbackPage.tsx
│ └── utils/
│ ├── oauth-endpoints.ts # OAuth endpoint utilities
│ └── oauth-keys.ts # OAuth configuration
```
#### Key Changes
- **App.tsx**: URL parameter/hash detection, added detailed debug logging
- **vite.config.ts**: Fixed output filenames (`comment-atproto.js/css`)
- **main.tsx**: Changed the React mount point to `comment-atproto`
#### OAuthCallback.tsx Features
- Detects both query parameters and hash parameters
- Automatic URL cleanup after authentication (`window.history.replaceState`)
- Supports both popup and direct navigation
- Fallback authentication and error handling (sketched below)
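A simplified sketch of the detection and cleanup described above (the real component additionally handles popup messaging and the fallback path):
```typescript
// Read OAuth parameters from either the query string or the hash fragment.
function readOAuthParams(): { code: string | null; state: string | null } {
  const query = new URLSearchParams(window.location.search);
  const hash = new URLSearchParams(window.location.hash.replace(/^#/, ''));
  return {
    code: query.get('code') ?? hash.get('code'),
    state: query.get('state') ?? hash.get('state'),
  };
}

// Once the session is established, strip the OAuth parameters from the address bar.
function cleanupOAuthUrl(): void {
  window.history.replaceState({}, document.title, window.location.pathname);
}
```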
### ailog (Rust Static Site Generator)
#### OAuth Callback Route
**src/commands/serve.rs**:
```rust
} else if path.starts_with("/oauth/callback") {
// Handle OAuth callback - serve the callback HTML page
match serve_oauth_callback().await {
Ok((ct, data, cc)) => ("200 OK", ct, data, cc),
Err(e) => // Error handling
}
}
```
#### OAuth Callback HTML
- Detects and processes ATProto authentication parameters
- Redirects with hash parameters (`#code=...&state=...`)
- Supports communication between the popup and the opener window
- Temporary data storage via localStorage (see the sketch below)
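The inline logic of that callback page could look roughly like this sketch (the localStorage key name is an assumption):
```typescript
// Stash the parameters, then hand them to the SPA on "/" as hash parameters.
const params = new URLSearchParams(window.location.search);
const code = params.get('code');
const state = params.get('state');

if (code && state) {
  // Temporary backup in case the hash is lost during navigation.
  localStorage.setItem('oauth_callback_temp', JSON.stringify({ code, state, ts: Date.now() }));

  if (window.opener) {
    // Popup flow: notify the opener window, then close the popup.
    window.opener.postMessage({ type: 'oauth_callback', code, state }, window.location.origin);
    window.close();
  } else {
    // Direct navigation: redirect to the top page with hash parameters.
    window.location.href = `/#code=${encodeURIComponent(code)}&state=${encodeURIComponent(state)}`;
  }
}
```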
### Template Integration
#### base.html (ailog templates)
```html
<!-- OAuth Comment System - Load in head for early initialization -->
<script type="module" crossorigin src="/assets/comment-atproto.js"></script>
<link rel="stylesheet" crossorigin href="/assets/comment-atproto.css">
```
#### index.html / post.html
```html
<!-- OAuth Comment System -->
<div id="comment-atproto"></div>
```
### OAuth Configuration
#### client-metadata.json
```json
{
"client_id": "https://log.syui.ai/client-metadata.json",
"redirect_uris": [
"https://log.syui.ai/oauth/callback",
"https://log.syui.ai/"
],
"scope": "atproto transition:generic",
"dpop_bound_access_tokens": true
}
```
## Infrastructure
### Cloudflare Tunnel
```yaml
# cloudflared-config.yml
ingress:
- hostname: log.syui.ai
service: http://localhost:4173 # ailog serve
```
### Build Process
1. **aicard-web-oauth**: `npm run build` → `dist/assets/`
2. **Asset copy**: `dist/assets/*` → `my-blog/public/assets/`
3. **ailog build**: Template processing + static file serving
## Data Flow
### OAuth Authentication Flow (a client-side sketch follows the listing)
```
1. User clicks "atproto" button
2. BrowserOAuthClient initiates OAuth flow
3. Redirect to Bluesky authorization server
4. Callback to https://log.syui.ai/oauth/callback
5. ailog serves OAuth callback HTML
6. JavaScript processes parameters and redirects with hash
7. React app detects hash parameters and completes authentication
8. URL cleanup removes OAuth parameters
```
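For steps 1–3, a minimal client-side sketch with `@atproto/oauth-client-browser` might look like the following; the `load`/`init`/`signIn` usage reflects that package's documented API rather than code from this commit, and the handle is a placeholder:
```typescript
import { BrowserOAuthClient } from '@atproto/oauth-client-browser';

// Load the published client metadata and prepare the OAuth client.
const client = await BrowserOAuthClient.load({
  clientId: 'https://log.syui.ai/client-metadata.json',
  handleResolver: 'https://bsky.social',
});

// init() restores an existing session or completes a pending callback (steps 5-7).
const result = await client.init();

if (!result) {
  // Steps 2-3: redirect the browser to the Bluesky authorization server.
  await client.signIn('alice.bsky.social');
}
```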
### Comment Posting Flow (see the sketch after this listing)
```
1. Authenticated user writes comment
2. React app calls ATProto API
3. Record saved to ai.syui.log collection
4. Comments reloaded from all configured PDS endpoints
5. Real-time display update
```
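Steps 2–4 can be sketched with the same `putRecord`/`listRecords` pattern the OAuth app uses for its chat records in this commit; the comment record fields shown here are assumptions, not the actual `ai.syui.log` schema:
```typescript
import { AtpAgent } from '@atproto/api';

async function postComment(agent: AtpAgent, userDid: string, text: string) {
  const now = new Date();
  const rkey = now.toISOString().replace(/[:.]/g, '-');

  // Step 3: save the record to the user's repo under the ai.syui.log collection.
  await agent.api.com.atproto.repo.putRecord({
    repo: userDid,
    collection: 'ai.syui.log',
    rkey,
    record: {
      $type: 'ai.syui.log',
      text,
      url: window.location.href,
      createdAt: now.toISOString(),
    },
  });

  // Step 4: reload comments from this PDS (the real app repeats this per configured PDS).
  const res = await agent.api.com.atproto.repo.listRecords({
    repo: userDid,
    collection: 'ai.syui.log',
    limit: 100,
  });
  return res.data.records;
}
```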
## Configuration Files
### Required Files
- `my-blog/static/client-metadata.json` - OAuth client configuration
- `aicard-web-oauth/.env.production` - Production environment variables
- `cloudflared-config.yml` - Tunnel routing configuration
### Development Files
- `aicard-web-oauth/.env.development` - Development settings
- `aicard-web-oauth/public/client-metadata.json` - Local OAuth metadata
## Key Fixes
### 1. Build System
- Fixed Vite output filenames (`comment-atproto.js/css`)
- Automated client-metadata.json updates at build time
### 2. OAuth Callback Handling
- Hash parameter support, optimized for the SPA architecture
- URL cleanup for a clean user experience
- Supports both popup and direct navigation
### 3. Error Handling
- Fallback authentication on network errors
- Re-authentication when the session expires
- Clear error messages when OAuth parameters are missing
### 4. Session Management
- Combined use of localStorage and sessionStorage
- Proper management of the OAuth state and code verifier
- Cross-tab session sharing (see the sketch below)
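An illustrative sketch of the storage split and cross-tab sharing (the key name is an assumption, not the app's actual key):
```typescript
const SESSION_KEY = 'oauth_session'; // illustrative key name

// Persist the session in localStorage and mirror a working copy into sessionStorage.
function saveSession(session: object): void {
  const json = JSON.stringify(session);
  localStorage.setItem(SESSION_KEY, json);   // survives restarts, shared across tabs
  sessionStorage.setItem(SESSION_KEY, json); // per-tab working copy
}

function loadSession<T>(): T | null {
  const json = sessionStorage.getItem(SESSION_KEY) ?? localStorage.getItem(SESSION_KEY);
  return json ? (JSON.parse(json) as T) : null;
}

// Cross-tab sharing: a localStorage write in one tab fires `storage` in the others.
window.addEventListener('storage', (e) => {
  if (e.key === SESSION_KEY && e.newValue) {
    sessionStorage.setItem(SESSION_KEY, e.newValue);
  }
});
```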
## Tested Features
**Verified working**
- OAuth authentication (Bluesky)
- Comment posting and deletion
- Session persistence
- URL parameter cleanup
- Multi-PDS support
- Admin features
**Remaining test items**
- Behavior in incognito/private mode
- Simultaneous use in multiple tabs
- Long-term session behavior
## Operations Notes
### Deployment Steps
1. `cd aicard-web-oauth && npm run build`
2. `cp -r dist/assets/* ../my-blog/public/assets/`
3. `cd my-blog && cargo build --release`
4. Verify with `ailog serve`
### Troubleshooting
- OAuth errors: check `redirect_uris` in client-metadata.json
- Comments not displaying: check the API response in the Network tab
- Build errors: check the Node.js/npm versions and dependencies
## Related Links
- [ATProto OAuth Specification](https://atproto.com/specs/oauth)
- [Bluesky OAuth Documentation](https://github.com/bluesky-social/atproto/blob/main/packages/api/OAUTH.md)
- [BrowserOAuthClient API](https://github.com/bluesky-social/atproto/tree/main/packages/oauth-client-browser)

View File

@ -22,9 +22,15 @@ highlight_code = true
minify = false
[ai]
enabled = false
enabled = true
auto_translate = false
comment_moderation = false
ask_ai = true
provider = "ollama"
model = "gemma3:4b"
host = "https://ollama.yourdomain.com"
system_prompt = "You are a helpful AI assistant trained on this blog's content."
ai_did = "did:plc:your-ai-bot-did"
# 3. Build your blog
ailog build
@ -125,10 +131,14 @@ ai.log is inspired by [Anthropic Docs](https://docs.anthropic.com/)
- **Responsive**: Mobile and desktop support
### 🤖 AI Integration Features
- **Ask AI**: Q&A powered by a local LLM (Ollama)
- **Ask AI**: Q&A powered by a local LLM (Ollama)
- Available only on the top page
- Requires atproto OAuth authentication
- CORS issues resolved via Cloudflare Tunnel
- **Auto-translation**: Automatic Japanese ↔ English generation
- **AI article enhancement**: Automatic content improvement
- **AI comments**: Generates short one-line comments on articles
- **Customizable AI settings**: system_prompt, ai_did, profile integration (a gating sketch follows this list)
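One way the top-page-only, login-gated access can be expressed, mirroring how `App.tsx` passes flags into `AIChat` in this commit (treating the absence of a post `rkey` as "top page" is an assumption for illustration):
```tsx
import React from 'react';
import { AIChat } from './components/AIChat';
import { appConfig } from './config/app';
import { User } from './services/auth';

// Render the Ask AI widget only when the feature flags are on, the visitor is
// authenticated via atproto OAuth, and the current page is the top page.
export function AskAIGate({ user }: { user: User | null }) {
  const isTopPage = !appConfig.rkey; // post pages carry an rkey
  const enabled = appConfig.aiEnabled && appConfig.aiAskAi && isTopPage;
  if (!enabled || !user) return null;
  return <AIChat user={user} isEnabled={enabled} />;
}
```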
### 🌐 Decentralized SNS Integration
- **atproto OAuth**: Log in with a Bluesky account
@ -341,6 +351,10 @@ Generate comprehensive documentation and translate content:
- **💬 ATProto comment system with Jetstream monitoring**
- **🔄 Real-time comment collection and user management**
- **🔐 OAuth 2.1 integration with Cloudflare tunnel**
- **🤖 Ask AI feature with Ollama integration**
- **⚡ CORS resolution via OLLAMA_ORIGINS**
- **🔒 Authentication-gated AI chat**
- **📱 Top-page-only AI access pattern**
- Test blog with sample content and styling
### 🚧 In Progress

View File

@ -7,7 +7,18 @@ VITE_ADMIN_DID=did:plc:uqzpqmrjnptsxezjx4xuh2mn
# Collection names for OAuth app
VITE_COLLECTION_COMMENT=ai.syui.log
VITE_COLLECTION_USER=ai.syui.log.user
VITE_COLLECTION_CHAT=ai.syui.log.chat
# Collection names for ailog (backward compatibility)
AILOG_COLLECTION_COMMENT=ai.syui.log
AILOG_COLLECTION_USER=ai.syui.log.user
AILOG_COLLECTION_CHAT=ai.syui.log.chat
# AI Configuration
VITE_AI_ENABLED=true
VITE_AI_ASK_AI=true
VITE_AI_PROVIDER=ollama
VITE_AI_MODEL=gemma3:4b
VITE_AI_HOST=https://ollama.syui.ai
VITE_AI_SYSTEM_PROMPT="You are a helpful AI assistant trained on this blog's content. You can answer questions about the articles, provide insights, and help users understand the topics discussed."
VITE_AI_DID=did:plc:4hqjfn7m6n5hno3doamuhgef

View File

@ -1,5 +1,6 @@
import React, { useState, useEffect } from 'react';
import { OAuthCallback } from './components/OAuthCallback';
import { AIChat } from './components/AIChat';
import { authService, User } from './services/auth';
import { atprotoOAuthService } from './services/atproto-oauth';
import { appConfig } from './config/app';
@ -83,8 +84,8 @@ function App() {
}
};
// Jetstream + Cache example
const jetstream = setupJetstream();
// Jetstream + Cache example (disabled for now)
// const jetstream = setupJetstream();
// Load comments from cache
const loadCachedComments = () => {
@ -102,7 +103,10 @@ function App() {
// If there is no cache, fetch from ATProto (regardless of auth state)
if (!loadCachedComments()) {
console.log('No cached comments found, loading from ATProto...');
loadAllComments(); // URLフィルタリングを無効にして全コメント表示
} else {
console.log('Cached comments loaded successfully');
}
// Handle popstate events for mock OAuth flow
@ -144,6 +148,7 @@ function App() {
// Load all comments for display (this will be the default view)
// Temporarily disable URL filtering to see all comments
console.log('OAuth session found, loading all comments...');
loadAllComments();
// Load user list records if admin
@ -164,6 +169,7 @@ function App() {
// Load all comments for display (this will be the default view)
// Temporarily disable URL filtering to see all comments
console.log('Legacy auth session found, loading all comments...');
loadAllComments();
// Load user list records if admin
@ -174,6 +180,7 @@ function App() {
setIsLoading(false);
// Load comments regardless of auth state
console.log('No auth session found, loading all comments anyway...');
loadAllComments();
};
@ -480,6 +487,7 @@ function App() {
console.log('Known users used:', knownUsers);
setComments(enhancedComments);
console.log('Comments state updated with', enhancedComments.length, 'comments');
// Save to cache (valid for 5 minutes)
if (pageUrl) {
@ -1076,6 +1084,8 @@ function App() {
</section>
</main>
{/* AI Chat Component - handles all AI functionality */}
<AIChat user={user} isEnabled={appConfig.aiEnabled && appConfig.aiAskAi} />
</div>
);
}

View File

@ -0,0 +1,260 @@
import React, { useState, useEffect } from 'react';
import { User } from '../services/auth';
import { atprotoOAuthService } from '../services/atproto-oauth';
import { appConfig } from '../config/app';
interface AIChatProps {
user: User | null;
isEnabled: boolean;
}
export const AIChat: React.FC<AIChatProps> = ({ user, isEnabled }) => {
const [chatHistory, setChatHistory] = useState<any[]>([]);
const [isLoading, setIsLoading] = useState(false);
const [isProcessing, setIsProcessing] = useState(false);
const [aiProfile, setAiProfile] = useState<any>(null);
// Get AI settings from environment variables
const aiConfig = {
enabled: import.meta.env.VITE_AI_ENABLED === 'true',
askAi: import.meta.env.VITE_AI_ASK_AI === 'true',
provider: import.meta.env.VITE_AI_PROVIDER || 'ollama',
model: import.meta.env.VITE_AI_MODEL || 'gemma3:4b',
host: import.meta.env.VITE_AI_HOST || 'https://ollama.syui.ai',
systemPrompt: import.meta.env.VITE_AI_SYSTEM_PROMPT || 'You are a helpful AI assistant trained on this blog\'s content.',
aiDid: import.meta.env.VITE_AI_DID || 'did:plc:uqzpqmrjnptsxezjx4xuh2mn',
};
// Fetch AI profile on load
useEffect(() => {
const fetchAIProfile = async () => {
if (!aiConfig.aiDid) {
console.log('No AI DID configured');
return;
}
try {
// Try with agent first
const agent = atprotoOAuthService.getAgent();
if (agent) {
console.log('Fetching AI profile with agent for DID:', aiConfig.aiDid);
const profile = await agent.getProfile({ actor: aiConfig.aiDid });
console.log('AI profile fetched successfully:', profile.data);
setAiProfile({
did: aiConfig.aiDid,
handle: profile.data.handle || 'ai-assistant',
displayName: profile.data.displayName || 'AI Assistant',
avatar: profile.data.avatar || null,
description: profile.data.description || null
});
return;
}
// Fallback to public API
console.log('No agent available, trying public API for AI profile');
const response = await fetch(`https://public.api.bsky.app/xrpc/app.bsky.actor.getProfile?actor=${encodeURIComponent(aiConfig.aiDid)}`);
if (response.ok) {
const profileData = await response.json();
console.log('AI profile fetched via public API:', profileData);
setAiProfile({
did: aiConfig.aiDid,
handle: profileData.handle || 'ai-assistant',
displayName: profileData.displayName || 'AI Assistant',
avatar: profileData.avatar || null,
description: profileData.description || null
});
return;
}
} catch (error) {
console.log('Failed to fetch AI profile, using defaults:', error);
setAiProfile({
did: aiConfig.aiDid,
handle: 'ai-assistant',
displayName: 'AI Assistant',
avatar: null,
description: 'AI assistant for this blog'
});
}
};
fetchAIProfile();
}, [aiConfig.aiDid]);
useEffect(() => {
if (!isEnabled || !aiConfig.askAi) return;
// Listen for AI question posts from base.html
const handleAIQuestion = async (event: any) => {
if (!user || !event.detail || !event.detail.question || isProcessing) return;
console.log('AIChat received question:', event.detail.question);
setIsProcessing(true);
try {
await postQuestionAndGenerateResponse(event.detail.question);
} finally {
setIsProcessing(false);
}
};
// Add listener with a small delay to ensure it's ready
setTimeout(() => {
window.addEventListener('postAIQuestion', handleAIQuestion);
console.log('AIChat event listener registered');
// Notify that AI is ready
window.dispatchEvent(new CustomEvent('aiChatReady'));
}, 100);
return () => {
window.removeEventListener('postAIQuestion', handleAIQuestion);
};
}, [user, isEnabled, isProcessing]);
const postQuestionAndGenerateResponse = async (question: string) => {
if (!user || !aiConfig.askAi) return;
setIsLoading(true);
try {
const agent = atprotoOAuthService.getAgent();
if (!agent) throw new Error('No agent available');
// 1. Post question to ATProto
const now = new Date();
const rkey = now.toISOString().replace(/[:.]/g, '-');
const questionRecord = {
$type: appConfig.collections.chat,
question: question,
url: window.location.href,
createdAt: now.toISOString(),
author: {
did: user.did,
handle: user.handle,
avatar: user.avatar,
displayName: user.displayName || user.handle,
},
context: {
page_title: document.title,
page_url: window.location.href,
},
};
await agent.api.com.atproto.repo.putRecord({
repo: user.did,
collection: appConfig.collections.chat,
rkey: rkey,
record: questionRecord,
});
console.log('Question posted to ATProto');
// 2. Get chat history
const chatRecords = await agent.api.com.atproto.repo.listRecords({
repo: user.did,
collection: appConfig.collections.chat,
limit: 10,
});
let chatHistoryText = '';
if (chatRecords.data.records) {
chatHistoryText = chatRecords.data.records
.map((r: any) => {
if (r.value.question) {
return `User: ${r.value.question}`;
} else if (r.value.answer) {
return `AI: ${r.value.answer}`;
}
return '';
})
.filter(Boolean)
.join('\n');
}
// 3. Generate AI response from the configured provider (currently Ollama, reached via the configured host)
let aiAnswer = '';
if (aiConfig.provider === 'ollama') {
const prompt = `${aiConfig.systemPrompt}
${chatHistoryText ? `履歴: ${chatHistoryText}` : ''}
質問: ${question}
簡潔に回答:`;
const response = await fetch(`${aiConfig.host}/api/generate`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
model: aiConfig.model,
prompt: prompt,
stream: false,
options: {
temperature: 0.7,
top_p: 0.9,
num_predict: 80, // Shorter responses for faster generation
}
}),
});
if (!response.ok) {
throw new Error('AI API request failed');
}
const data = await response.json();
aiAnswer = data.response;
}
// 4. Immediately dispatch event to update UI
console.log('Dispatching AI response with profile:', aiProfile);
window.dispatchEvent(new CustomEvent('aiResponseReceived', {
detail: {
answer: aiAnswer,
aiProfile: aiProfile,
timestamp: now.toISOString()
}
}));
// 5. Save AI response in background
const answerRkey = now.toISOString().replace(/[:.]/g, '-') + '-answer';
const answerRecord = {
$type: appConfig.collections.chat,
answer: aiAnswer,
question_rkey: rkey,
url: window.location.href,
createdAt: now.toISOString(),
author: {
did: aiConfig.aiDid,
handle: 'AI Assistant',
displayName: 'AI Assistant',
},
};
// Save to ATProto asynchronously (don't wait for it)
agent.api.com.atproto.repo.putRecord({
repo: user.did,
collection: appConfig.collections.chat,
rkey: answerRkey,
record: answerRecord,
}).catch(err => {
console.error('Failed to save AI response to ATProto:', err);
});
} catch (error) {
console.error('Failed to generate AI response:', error);
window.dispatchEvent(new CustomEvent('aiResponseError', {
detail: { error: 'AI応答の生成に失敗しました' }
}));
} finally {
setIsLoading(false);
}
};
// This component doesn't render anything - it just handles the logic
return null;
};

View File

@ -0,0 +1,79 @@
import React, { useState, useEffect } from 'react';
import { AtpAgent } from '@atproto/api';
interface AIProfile {
did: string;
handle: string;
displayName?: string;
avatar?: string;
description?: string;
}
interface AIProfileProps {
aiDid: string;
}
export const AIProfile: React.FC<AIProfileProps> = ({ aiDid }) => {
const [profile, setProfile] = useState<AIProfile | null>(null);
const [loading, setLoading] = useState(true);
useEffect(() => {
const fetchAIProfile = async () => {
try {
// Use the public AppView API to get profile information without authentication
const agent = new AtpAgent({ service: 'https://public.api.bsky.app' });
const response = await agent.getProfile({ actor: aiDid });
setProfile({
did: response.data.did,
handle: response.data.handle,
displayName: response.data.displayName,
avatar: response.data.avatar,
description: response.data.description,
});
} catch (error) {
console.error('Failed to fetch AI profile:', error);
// Fallback to basic info
setProfile({
did: aiDid,
handle: 'ai-assistant',
displayName: 'AI Assistant',
description: 'AI assistant for this blog',
});
} finally {
setLoading(false);
}
};
if (aiDid) {
fetchAIProfile();
}
}, [aiDid]);
if (loading) {
return <div className="ai-profile-loading">Loading AI profile...</div>;
}
if (!profile) {
return null;
}
return (
<div className="ai-profile">
<div className="ai-avatar">
{profile.avatar ? (
<img src={profile.avatar} alt={profile.displayName || profile.handle} />
) : (
<div className="ai-avatar-placeholder">🤖</div>
)}
</div>
<div className="ai-info">
<div className="ai-name">{profile.displayName || profile.handle}</div>
<div className="ai-handle">@{profile.handle}</div>
{profile.description && (
<div className="ai-description">{profile.description}</div>
)}
</div>
</div>
);
};

View File

@ -4,15 +4,21 @@ export interface AppConfig {
collections: {
comment: string;
user: string;
chat: string;
};
host: string;
rkey?: string; // Current post rkey if on post page
aiEnabled: boolean;
aiAskAi: boolean;
aiProvider: string;
aiModel: string;
aiHost: string;
}
// Generate collection names from host
// Format: ${reg}.${name}.${sub}
// Example: log.syui.ai -> ai.syui.log
function generateCollectionNames(host: string): { comment: string; user: string } {
function generateCollectionNames(host: string): { comment: string; user: string; chat: string } {
try {
// Remove protocol if present
const cleanHost = host.replace(/^https?:\/\//, '');
@ -31,14 +37,16 @@ function generateCollectionNames(host: string): { comment: string; user: string
return {
comment: collectionBase,
user: `${collectionBase}.user`
user: `${collectionBase}.user`,
chat: `${collectionBase}.chat`
};
} catch (error) {
console.warn('Failed to generate collection names from host:', host, error);
// Fallback to default collections
return {
comment: 'ai.syui.log',
user: 'ai.syui.log.user'
user: 'ai.syui.log.user',
chat: 'ai.syui.log.chat'
};
}
}
@ -61,22 +69,36 @@ export function getAppConfig(): AppConfig {
const collections = {
comment: import.meta.env.VITE_COLLECTION_COMMENT || autoGeneratedCollections.comment,
user: import.meta.env.VITE_COLLECTION_USER || autoGeneratedCollections.user,
chat: import.meta.env.VITE_COLLECTION_CHAT || autoGeneratedCollections.chat,
};
const rkey = extractRkeyFromUrl();
// AI configuration
const aiEnabled = import.meta.env.VITE_AI_ENABLED === 'true';
const aiAskAi = import.meta.env.VITE_AI_ASK_AI === 'true';
const aiProvider = import.meta.env.VITE_AI_PROVIDER || 'ollama';
const aiModel = import.meta.env.VITE_AI_MODEL || 'gemma2:2b';
const aiHost = import.meta.env.VITE_AI_HOST || 'https://ollama.syui.ai';
console.log('App configuration:', {
host,
adminDid,
collections,
rkey: rkey || 'none (not on post page)'
rkey: rkey || 'none (not on post page)',
ai: { enabled: aiEnabled, askAi: aiAskAi, provider: aiProvider, model: aiModel, host: aiHost }
});
return {
adminDid,
collections,
host,
rkey
rkey,
aiEnabled,
aiAskAi,
aiProvider,
aiModel,
aiHost
};
}

View File

@ -10,14 +10,21 @@ import { OAuthEndpointHandler } from './utils/oauth-endpoints'
// DISABLED: This may interfere with BrowserOAuthClient
// OAuthEndpointHandler.init()
ReactDOM.createRoot(document.getElementById('comment-atproto')!).render(
<React.StrictMode>
<BrowserRouter>
<Routes>
<Route path="/oauth/callback" element={<OAuthCallbackPage />} />
<Route path="/list" element={<CardList />} />
<Route path="*" element={<App />} />
</Routes>
</BrowserRouter>
</React.StrictMode>,
)
// Mount React app to all comment-atproto divs
const mountPoints = document.querySelectorAll('#comment-atproto');
console.log(`Found ${mountPoints.length} comment-atproto mount points`);
mountPoints.forEach((mountPoint, index) => {
console.log(`Mounting React app to comment-atproto #${index + 1}`);
ReactDOM.createRoot(mountPoint as HTMLElement).render(
<React.StrictMode>
<BrowserRouter>
<Routes>
<Route path="/oauth/callback" element={<OAuthCallbackPage />} />
<Route path="/list" element={<CardList />} />
<Route path="*" element={<App />} />
</Routes>
</BrowserRouter>
</React.StrictMode>,
);
});

run.zsh
View File

@ -17,8 +17,8 @@ function _env() {
}
function _server() {
_env
lsof -ti:$port | xargs kill -9 2>/dev/null || true
lsof -ti:11434 | xargs kill -9 2>/dev/null || true
cd $d/my-blog
cargo build --release
$ailog build
@ -26,12 +26,10 @@ function _server() {
}
function _server_public() {
_env
cloudflared tunnel --config $d/cloudflared-config.yml run
}
function _oauth_build() {
_env
cd $oauth
nvm use 21
npm i
@ -43,11 +41,17 @@ function _oauth_build() {
}
function _server_comment() {
_env
cargo build --release
AILOG_DEBUG_ALL=1 $ailog stream start
AILOG_DEBUG_ALL=1 $ailog stream start my-blog
}
function _server_ollama(){
brew services stop ollama
OLLAMA_ORIGINS="https://log.syui.ai" ollama serve
}
_env
case "${1:-serve}" in
tunnel|c)
_server_public
@ -58,6 +62,12 @@ case "${1:-serve}" in
comment|co)
_server_comment
;;
ollama|ol)
_server_ollama
;;
proxy|p)
_server_proxy
;;
serve|s|*)
_server
;;

View File

@ -30,6 +30,7 @@ title = "My Blog"
description = "A blog powered by ailog"
base_url = "https://example.com"
language = "ja"
author = "Your Name"
[build]
highlight_code = true
@ -88,7 +89,7 @@ comment_moderation = false
</div>
<footer class="main-footer">
<p>&copy; 2025 {{ config.title }}</p>
<p>&copy; {{ config.author | default(value=config.title) }}</p>
</footer>
<script>

View File

@ -53,6 +53,39 @@ pub async fn build(project_dir: PathBuf) -> Result<()> {
.and_then(|v| v.as_str())
.unwrap_or("ai.syui.log.user");
let collection_chat = oauth_config.get("collection_chat")
.and_then(|v| v.as_str())
.unwrap_or("ai.syui.log.chat");
// Extract AI config if present
let ai_config = config.get("ai")
.and_then(|v| v.as_table());
let ai_enabled = ai_config
.and_then(|ai| ai.get("enabled"))
.and_then(|v| v.as_bool())
.unwrap_or(false);
let ai_ask_ai = ai_config
.and_then(|ai| ai.get("ask_ai"))
.and_then(|v| v.as_bool())
.unwrap_or(false);
let ai_provider = ai_config
.and_then(|ai| ai.get("provider"))
.and_then(|v| v.as_str())
.unwrap_or("ollama");
let ai_model = ai_config
.and_then(|ai| ai.get("model"))
.and_then(|v| v.as_str())
.unwrap_or("gemma2:2b");
let ai_host = ai_config
.and_then(|ai| ai.get("host"))
.and_then(|v| v.as_str())
.unwrap_or("https://ollama.syui.ai");
// 4. Create .env.production content
let env_content = format!(
r#"# Production environment variables
@ -64,10 +97,19 @@ VITE_ADMIN_DID={}
# Collection names for OAuth app
VITE_COLLECTION_COMMENT={}
VITE_COLLECTION_USER={}
VITE_COLLECTION_CHAT={}
# Collection names for ailog (backward compatibility)
AILOG_COLLECTION_COMMENT={}
AILOG_COLLECTION_USER={}
AILOG_COLLECTION_CHAT={}
# AI Configuration
VITE_AI_ENABLED={}
VITE_AI_ASK_AI={}
VITE_AI_PROVIDER={}
VITE_AI_MODEL={}
VITE_AI_HOST={}
"#,
base_url,
base_url, client_id_path,
@ -75,8 +117,15 @@ AILOG_COLLECTION_USER={}
admin_did,
collection_comment,
collection_user,
collection_chat,
collection_comment,
collection_user
collection_user,
collection_chat,
ai_enabled,
ai_ask_ai,
ai_provider,
ai_model,
ai_host
);
// 5. Find oauth directory (relative to current working directory)

View File

@ -17,6 +17,7 @@ pub struct SiteConfig {
pub description: String,
pub base_url: String,
pub language: String,
pub author: Option<String>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
@ -30,6 +31,12 @@ pub struct AiConfig {
pub enabled: bool,
pub auto_translate: bool,
pub comment_moderation: bool,
pub ask_ai: Option<bool>,
pub provider: Option<String>,
pub model: Option<String>,
pub host: Option<String>,
pub system_prompt: Option<String>,
pub ai_did: Option<String>,
pub api_key: Option<String>,
pub gpt_endpoint: Option<String>,
pub atproto_config: Option<AtprotoConfig>,
@ -135,6 +142,7 @@ impl Default for Config {
description: "A blog powered by ailog".to_string(),
base_url: "https://example.com".to_string(),
language: "ja".to_string(),
author: None,
},
build: BuildConfig {
highlight_code: true,
@ -144,6 +152,12 @@ impl Default for Config {
enabled: false,
auto_translate: false,
comment_moderation: false,
ask_ai: Some(false),
provider: Some("ollama".to_string()),
model: Some("gemma3:4b".to_string()),
host: None,
system_prompt: Some("You are a helpful AI assistant trained on this blog's content.".to_string()),
ai_did: None,
api_key: None,
gpt_endpoint: None,
atproto_config: None,