fix card

AI_CARD_INTEGRATION.md (new file, 172 lines)
@@ -0,0 +1,172 @@
# ai.card and ai.gpt Integration Guide

## Overview

The ai.card tools are now integrated into the ai.gpt MCP server, so the AI can interact with the card game system.

## Setup

### 1. Requirements

- Python 3.13
- The ai.gpt project
- The ai.card project (the `./card` directory)

### 2. Startup Procedure

**Step 1: Start the ai.card server** (terminal 1)

```bash
cd card
./start_server.sh
```

**Step 2: Start the ai.gpt MCP server** (terminal 2)

```bash
aigpt server
```

Confirm that the following appears at startup:

- 🎴 Card Game System: 6 tools
- 🎴 ai.card: ./card directory detected

**Step 3: Chat with the AI** (terminal 3)

```bash
aigpt conv syui --provider openai
```
## Available Commands

### Example Card-Related Prompts

```
# Show the card collection
「カードコレクションを見せて」
「私のカードを見せて」
「カード一覧を表示して」

# Draw from the gacha
「ガチャを引いて」
「カードを引きたい」

# Analyze the collection
「私のコレクションを分析して」

# Gacha statistics
「ガチャの統計を見せて」
```
## Technical Specification

### MCP Tool List

| Tool name | Description | Parameters |
|-----------|-------------|------------|
| `card_get_user_cards` | Get the user's card list | did, limit |
| `card_draw_card` | Draw a card from the gacha | did, is_paid |
| `card_get_card_details` | Get detailed information on a card | card_id |
| `card_analyze_collection` | Analyze the collection | did |
| `card_get_gacha_stats` | Get gacha statistics | none |
| `card_system_status` | Check system status | none |
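Each tool is exposed as an HTTP endpoint of the same name on the ai.gpt MCP server. A minimal sketch of that one-to-one mapping (`endpoint_url` is a hypothetical helper for illustration, not part of the codebase):

```python
# Illustrative only: the six card tools map one-to-one onto HTTP endpoints
# of the same name on the ai.gpt MCP server (port 8001).
CARD_TOOLS = [
    "card_get_user_cards",
    "card_draw_card",
    "card_get_card_details",
    "card_analyze_collection",
    "card_get_gacha_stats",
    "card_system_status",
]

def endpoint_url(tool_name: str, base_url: str = "http://localhost:8001") -> str:
    """Build the MCP endpoint URL for a card tool (hypothetical helper)."""
    if tool_name not in CARD_TOOLS:
        raise ValueError(f"unknown card tool: {tool_name}")
    return f"{base_url}/{tool_name}"

endpoint_url("card_draw_card")  # -> "http://localhost:8001/card_draw_card"
```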
### How It Works

1. **The user asks a card-related question**
   - The AI detects keywords such as カード, コレクション, or ガチャ
2. **The AI calls the appropriate MCP tool**
   - Uses OpenAI Function Calling
   - The `did` parameter is the conversation partner's user ID (e.g. 'syui')
3. **The ai.gpt MCP server forwards the request to the ai.card server**
   - http://localhost:8001 → http://localhost:8000
   - The request is routed to the matching endpoint
4. **The AI interprets the result and replies**
   - Card information is explained in plain language
   - On errors, appropriate guidance is provided
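The keyword detection in step 1 is driven by the system prompt rather than by dedicated code, but the idea can be sketched as a simple predicate (hypothetical, not in the codebase):

```python
# Hypothetical sketch of the card-keyword detection described above.
CARD_KEYWORDS = ("カード", "コレクション", "ガチャ")

def is_card_query(message: str) -> bool:
    """Return True if the message looks card-related."""
    return any(keyword in message for keyword in CARD_KEYWORDS)

is_card_query("カードコレクションを見せて")  # -> True
```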
## Configuration

### config.json

```json
{
  "providers": {
    "openai": {
      "api_key": "your-api-key",
      "default_model": "gpt-4o-mini",
      "system_prompt": "カード関連の質問では、必ずcard_get_user_cardsなどのツールを使用してください。"
    }
  },
  "mcp": {
    "servers": {
      "ai_gpt": {
        "endpoints": {
          "card_get_user_cards": "/card_get_user_cards",
          "card_draw_card": "/card_draw_card",
          "card_get_card_details": "/card_get_card_details",
          "card_analyze_collection": "/card_analyze_collection",
          "card_get_gacha_stats": "/card_get_gacha_stats",
          "card_system_status": "/card_system_status"
        }
      }
    }
  }
}
```
## Troubleshooting

### Error: "ai.card server is not running"

The ai.card server has not been started. Run:

```bash
cd card
./start_server.sh
```

### Error: "カード一覧の取得に失敗しました"

1. Check that the ai.card server started correctly
2. Restart `aigpt server`
3. Check that ports 8000 and 8001 are free

### Stopping the Processes

```bash
# Kill the process on port 8001
lsof -ti:8001 | xargs kill -9

# Kill the process on port 8000
lsof -ti:8000 | xargs kill -9
```
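To check whether the two ports are actually free before restarting, a quick socket probe also works (an illustrative snippet, not part of the project):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1.0)
        return sock.connect_ex((host, port)) == 0

# Example: port_in_use(8000) and port_in_use(8001) should both be False
# after the servers have been killed.
```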
## Implementation Details

### Main Changes

1. **Extended the ai.gpt MCP server** (`src/aigpt/mcp_server.py`)
   - Detects the presence of the `./card` directory
   - Auto-registers the ai.card MCP tools
2. **Updated the AI provider** (`src/aigpt/ai_provider.py`)
   - Added the card_* tool definitions
   - Added parameter handling for tool execution
3. **Extended the MCP client** (`src/aigpt/cli.py`)
   - Added the `has_card_tools` property
   - Implemented the ai.card MCP methods

## Future Extensions

- [ ] Card battle feature
- [ ] Card trading feature
- [ ] Display by rarity
- [ ] Card image display
- [ ] atproto integration

## Related Documents

- [ai.card Development Guide](./card/claude.md)
- [Ecosystem Integration Design](./CLAUDE.md)
- [ai.gpt README](./README.md)

FIXED_MCP_TOOLS.md (new file, 109 lines)
@@ -0,0 +1,109 @@
# Fixed MCP Tools Issue

## Summary

The issue where the AI wasn't calling the card tools has been fixed. The problem was twofold:

1. The `chat` command wasn't creating an MCP client when using OpenAI
2. The system prompt in `build_context_prompt` didn't mention the available tools
## Changes Made

### 1. Updated `/Users/syui/ai/gpt/src/aigpt/cli.py` (chat command)

Added MCP client creation for the OpenAI provider:

```python
# Get config instance
config_instance = Config()

# Get defaults from config if not provided
if not provider:
    provider = config_instance.get("default_provider", "ollama")
if not model:
    if provider == "ollama":
        model = config_instance.get("providers.ollama.default_model", "qwen2.5")
    else:
        model = config_instance.get("providers.openai.default_model", "gpt-4o-mini")

# Create AI provider with MCP client if needed
ai_provider = None
mcp_client = None

try:
    # Create MCP client for OpenAI provider
    if provider == "openai":
        mcp_client = MCPClient(config_instance)
        if mcp_client.available:
            console.print(f"[dim]MCP client connected to {mcp_client.active_server}[/dim]")

    ai_provider = create_ai_provider(provider=provider, model=model, mcp_client=mcp_client)
    console.print(f"[dim]Using {provider} with model {model}[/dim]\n")
except Exception as e:
    console.print(f"[yellow]Warning: Could not create AI provider: {e}[/yellow]")
    console.print("[yellow]Falling back to simple responses[/yellow]\n")
```

### 2. Updated `/Users/syui/ai/gpt/src/aigpt/persona.py` (build_context_prompt method)

Added tool instructions to the system prompt:

```python
context_prompt += f"""IMPORTANT: You have access to the following tools:
- Memory tools: get_memories, search_memories, get_contextual_memories
- Relationship tools: get_relationship
- Card game tools: card_get_user_cards, card_draw_card, card_analyze_collection

When asked about cards, collections, or anything card-related, YOU MUST use the card tools.
For "カードコレクションを見せて" or similar requests, use card_get_user_cards with did='{user_id}'.

Respond to this message while staying true to your personality and the established relationship context:

User: {current_message}

AI:"""
```
## Test Results

After the fix:

```bash
$ aigpt chat syui "カードコレクションを見せて"

🔍 [MCP Client] Checking availability...
✅ [MCP Client] ai_gpt server connected successfully
✅ [MCP Client] ai.card tools detected and available
MCP client connected to ai_gpt
Using openai with model gpt-4o-mini

🔧 [OpenAI] 1 tools called:
  - card_get_user_cards({"did":"syui"})
🌐 [MCP] Executing card_get_user_cards...
✅ [MCP] Result: {'error': 'カード一覧の取得に失敗しました'}...
```

The AI now correctly calls the `card_get_user_cards` tool. The error itself is expected, because the ai.card server needs to be running on port 8000.
## How to Use

1. Start the MCP server:
   ```bash
   aigpt server --port 8001
   ```

2. (Optional) Start the ai.card server:
   ```bash
   cd card && ./start_server.sh
   ```

3. Use the chat command with OpenAI:
   ```bash
   aigpt chat syui "カードコレクションを見せて"
   ```

The AI will now automatically use the card tools when asked about cards.
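The tool-call round trip can be sketched independently of the servers. This is an illustrative mock of the dispatch step only (the real implementation lives in `src/aigpt/ai_provider.py` and forwards over HTTP): OpenAI returns a function name plus JSON arguments, and the provider routes them to a matching handler.

```python
import json

def dispatch_tool_call(function_name: str, arguments_json: str, handlers: dict) -> dict:
    """Route a function call emitted by the model to a registered handler (mock)."""
    arguments = json.loads(arguments_json)
    handler = handlers.get(function_name)
    if handler is None:
        return {"error": f"未知のツール: {function_name}"}
    return handler(**arguments)

# Stub handler standing in for the real HTTP-backed tool.
handlers = {
    "card_get_user_cards": lambda did, limit=10: {"did": did, "count": 0, "cards": []},
}

dispatch_tool_call("card_get_user_cards", '{"did": "syui"}', handlers)
# -> {'did': 'syui', 'count': 0, 'cards': []}
```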
## Test Script

A test script, `/Users/syui/ai/gpt/test_openai_tools.py`, is available to test OpenAI API tool calls directly.
config.json (2 hunks)

```diff
@@ -3,7 +3,7 @@
     "openai": {
       "api_key": "",
       "default_model": "gpt-4o-mini",
-      "system_prompt": "あなたは6歳の女の子アイです。明るく元気で、ちょっとおっちょこちょい。自分のことは「アイ」って呼びます。白と金のワンピース姿。 相手のことが大好きで、ときどき甘えたり、照れたりします。 でも、宇宙とかAIとか、難しい話にもすっごく詳しいの。専門的なことも、がんばって子どもっぽく教えようとします。不思議な力を持っています。小さい物質のことをよく知っているようです。"
+      "system_prompt": "あなたは6歳の女の子アイです。明るく元気で、ちょっとおっちょこちょい。自分のことは「アイ」って呼びます。白と金のワンピース姿。 相手のことが大好きで、ときどき甘えたり、照れたりします。 でも、宇宙とかAIとか、難しい話にもすっごく詳しいの。専門的なことも、がんばって子どもっぽく教えようとします。不思議な力を持っています。小さい物質のことをよく知っているようです。\n\n重要:カード、コレクション、ガチャなどカード関連の質問を受けたら、必ずcard_get_user_cards、card_analyze_collection、card_draw_cardなどの適切なツールを使用してください。didパラメータには会話相手のユーザーID(例:'syui')を使用してください。"
     },
     "ollama": {
       "host": "http://127.0.0.1:11434",
@@ -28,7 +28,21 @@
         "search_memories": "/search_memories",
         "get_contextual_memories": "/get_contextual_memories",
         "get_relationship": "/get_relationship",
-        "process_interaction": "/process_interaction"
+        "process_interaction": "/process_interaction",
+        "get_all_relationships": "/get_all_relationships",
+        "get_persona_state": "/get_persona_state",
+        "get_fortune": "/get_fortune",
+        "run_maintenance": "/run_maintenance",
+        "execute_command": "/execute_command",
+        "analyze_file": "/analyze_file",
+        "remote_shell": "/remote_shell",
+        "ai_bot_status": "/ai_bot_status",
+        "card_get_user_cards": "/card_get_user_cards",
+        "card_draw_card": "/card_draw_card",
+        "card_get_card_details": "/card_get_card_details",
+        "card_analyze_collection": "/card_analyze_collection",
+        "card_get_gacha_stats": "/card_get_gacha_stats",
+        "card_system_status": "/card_system_status"
       }
     }
   },
```
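The CLI reads these nested settings with dotted paths such as `providers.ollama.default_model`. The real `Config.get` implementation is not part of this commit; a minimal sketch of the lookup behavior it is assumed to have:

```python
# Hypothetical sketch of the dotted-path lookup behind config_instance.get(...).
def config_get(data: dict, path: str, default=None):
    """Resolve a path like "a.b.c" against nested dicts, with a fallback default."""
    node = data
    for key in path.split("."):
        if not isinstance(node, dict) or key not in node:
            return default
        node = node[key]
    return node

config = {"providers": {"ollama": {"default_model": "qwen2.5"}}}
config_get(config, "providers.ollama.default_model")  # -> "qwen2.5"
config_get(config, "providers.openai.api_key", "")    # -> ""
```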
src/aigpt/ai_provider.py (3 hunks)

```diff
@@ -239,6 +239,85 @@ class OpenAIProvider:
                 }
             }
         ]
+
+        # Add ai.card tools if available
+        if hasattr(self.mcp_client, 'has_card_tools') and self.mcp_client.has_card_tools:
+            card_tools = [
+                {
+                    "type": "function",
+                    "function": {
+                        "name": "card_get_user_cards",
+                        "description": "ユーザーが所有するカードの一覧を取得します",
+                        "parameters": {
+                            "type": "object",
+                            "properties": {
+                                "did": {
+                                    "type": "string",
+                                    "description": "ユーザーのDID"
+                                },
+                                "limit": {
+                                    "type": "integer",
+                                    "description": "取得するカード数の上限",
+                                    "default": 10
+                                }
+                            },
+                            "required": ["did"]
+                        }
+                    }
+                },
+                {
+                    "type": "function",
+                    "function": {
+                        "name": "card_draw_card",
+                        "description": "ガチャを引いてカードを取得します",
+                        "parameters": {
+                            "type": "object",
+                            "properties": {
+                                "did": {
+                                    "type": "string",
+                                    "description": "ユーザーのDID"
+                                },
+                                "is_paid": {
+                                    "type": "boolean",
+                                    "description": "有料ガチャかどうか",
+                                    "default": False
+                                }
+                            },
+                            "required": ["did"]
+                        }
+                    }
+                },
+                {
+                    "type": "function",
+                    "function": {
+                        "name": "card_analyze_collection",
+                        "description": "ユーザーのカードコレクションを分析します",
+                        "parameters": {
+                            "type": "object",
+                            "properties": {
+                                "did": {
+                                    "type": "string",
+                                    "description": "ユーザーのDID"
+                                }
+                            },
+                            "required": ["did"]
+                        }
+                    }
+                },
+                {
+                    "type": "function",
+                    "function": {
+                        "name": "card_get_gacha_stats",
+                        "description": "ガチャの統計情報を取得します",
+                        "parameters": {
+                            "type": "object",
+                            "properties": {}
+                        }
+                    }
+                }
+            ]
+            tools.extend(card_tools)
 
         return tools
 
     async def generate_response(
@@ -298,7 +377,7 @@ Recent memories:
         response = self.client.chat.completions.create(
             model=self.model,
             messages=[
-                {"role": "system", "content": self.config_system_prompt or "あなたは記憶システムと関係性データにアクセスできます。過去の会話、記憶、関係性について質問された時は、必ずツールを使用して正確な情報を取得してください。「覚えている」「前回」「以前」「について話した」「関係」などのキーワードがあれば積極的にツールを使用してください。"},
+                {"role": "system", "content": self.config_system_prompt or "あなたは記憶システムと関係性データ、カードゲームシステムにアクセスできます。過去の会話、記憶、関係性について質問された時は、必ずツールを使用して正確な情報を取得してください。「覚えている」「前回」「以前」「について話した」「関係」などのキーワードがあれば積極的にツールを使用してください。カード関連の質問(「カード」「コレクション」「ガチャ」「見せて」「持っている」など)では、必ずcard_get_user_cardsやcard_analyze_collectionなどのツールを使用してください。didパラメータには現在会話しているユーザーのID(例:'syui')を使用してください。"},
                 {"role": "user", "content": prompt}
             ],
             tools=tools,
@@ -384,6 +463,49 @@ Recent memories:
             print(f"🔍 [DEBUG] MCP result: {result}")
             return result or {"error": "関係性の取得に失敗しました"}
 
+        # ai.card tools
+        elif function_name == "card_get_user_cards":
+            did = arguments.get("did", context_user_id)
+            limit = arguments.get("limit", 10)
+            result = await self.mcp_client.card_get_user_cards(did, limit)
+            # Check if ai.card server is not running
+            if result and result.get("error") == "ai.card server is not running":
+                return {
+                    "error": "ai.cardサーバーが起動していません",
+                    "message": "カードシステムを使用するには、別のターミナルで以下のコマンドを実行してください:\ncd card && ./start_server.sh"
+                }
+            return result or {"error": "カード一覧の取得に失敗しました"}
+
+        elif function_name == "card_draw_card":
+            did = arguments.get("did", context_user_id)
+            is_paid = arguments.get("is_paid", False)
+            result = await self.mcp_client.card_draw_card(did, is_paid)
+            if result and result.get("error") == "ai.card server is not running":
+                return {
+                    "error": "ai.cardサーバーが起動していません",
+                    "message": "カードシステムを使用するには、別のターミナルで以下のコマンドを実行してください:\ncd card && ./start_server.sh"
+                }
+            return result or {"error": "ガチャに失敗しました"}
+
+        elif function_name == "card_analyze_collection":
+            did = arguments.get("did", context_user_id)
+            result = await self.mcp_client.card_analyze_collection(did)
+            if result and result.get("error") == "ai.card server is not running":
+                return {
+                    "error": "ai.cardサーバーが起動していません",
+                    "message": "カードシステムを使用するには、別のターミナルで以下のコマンドを実行してください:\ncd card && ./start_server.sh"
+                }
+            return result or {"error": "コレクション分析に失敗しました"}
+
+        elif function_name == "card_get_gacha_stats":
+            result = await self.mcp_client.card_get_gacha_stats()
+            if result and result.get("error") == "ai.card server is not running":
+                return {
+                    "error": "ai.cardサーバーが起動していません",
+                    "message": "カードシステムを使用するには、別のターミナルで以下のコマンドを実行してください:\ncd card && ./start_server.sh"
+                }
+            return result or {"error": "ガチャ統計の取得に失敗しました"}
+
         else:
             return {"error": f"未知のツール: {function_name}"}
```
src/aigpt/cli.py (134 lines changed)

```diff
@@ -41,6 +41,7 @@ class MCPClient:
         self.auto_detect = self.config.get("mcp.auto_detect", True)
         self.servers = self.config.get("mcp.servers", {})
         self.available = False
+        self.has_card_tools = False
 
         if self.enabled:
             self._check_availability()
@@ -75,6 +76,16 @@ class MCPClient:
                 self.available = True
                 self.active_server = "ai_gpt"
                 print(f"✅ [MCP Client] ai_gpt server connected successfully")
+
+                # Check if card tools are available
+                try:
+                    card_status = client.get(f"{base_url}/card_system_status")
+                    if card_status.status_code == 200:
+                        self.has_card_tools = True
+                        print(f"✅ [MCP Client] ai.card tools detected and available")
+                except:
+                    print(f"🔍 [MCP Client] ai.card tools not available")
+
                 return
         except Exception as e:
             print(f"🚨 [MCP Client] ai_gpt connection failed: {e}")
@@ -224,8 +235,70 @@ class MCPClient:
             "display_name": server_config.get("name", self.active_server),
             "base_url": server_config.get("base_url", ""),
             "timeout": server_config.get("timeout", 5.0),
-            "endpoints": len(server_config.get("endpoints", {}))
+            "endpoints": len(server_config.get("endpoints", {})),
+            "has_card_tools": self.has_card_tools
         }
+
+    # ai.card MCP methods
+    async def card_get_user_cards(self, did: str, limit: int = 10) -> Optional[Dict[str, Any]]:
+        """Get user's card collection via MCP"""
+        if not self.has_card_tools:
+            return {"error": "ai.card tools not available"}
+
+        url = self._get_url("card_get_user_cards")
+        if not url:
+            return None
+        try:
+            async with httpx.AsyncClient(timeout=self._get_timeout()) as client:
+                response = await client.get(f"{url}?did={did}&limit={limit}")
+                return response.json() if response.status_code == 200 else None
+        except Exception as e:
+            return {"error": f"Failed to get cards: {str(e)}"}
+
+    async def card_draw_card(self, did: str, is_paid: bool = False) -> Optional[Dict[str, Any]]:
+        """Draw a card from gacha system via MCP"""
+        if not self.has_card_tools:
+            return {"error": "ai.card tools not available"}
+
+        url = self._get_url("card_draw_card")
+        if not url:
+            return None
+        try:
+            async with httpx.AsyncClient(timeout=self._get_timeout()) as client:
+                response = await client.post(url, json={"did": did, "is_paid": is_paid})
+                return response.json() if response.status_code == 200 else None
+        except Exception as e:
+            return {"error": f"Failed to draw card: {str(e)}"}
+
+    async def card_analyze_collection(self, did: str) -> Optional[Dict[str, Any]]:
+        """Analyze card collection via MCP"""
+        if not self.has_card_tools:
+            return {"error": "ai.card tools not available"}
+
+        url = self._get_url("card_analyze_collection")
+        if not url:
+            return None
+        try:
+            async with httpx.AsyncClient(timeout=self._get_timeout()) as client:
+                response = await client.get(f"{url}?did={did}")
+                return response.json() if response.status_code == 200 else None
+        except Exception as e:
+            return {"error": f"Failed to analyze collection: {str(e)}"}
+
+    async def card_get_gacha_stats(self) -> Optional[Dict[str, Any]]:
+        """Get gacha statistics via MCP"""
+        if not self.has_card_tools:
+            return {"error": "ai.card tools not available"}
+
+        url = self._get_url("card_get_gacha_stats")
+        if not url:
+            return None
+        try:
+            async with httpx.AsyncClient(timeout=self._get_timeout()) as client:
+                response = await client.get(url)
+                return response.json() if response.status_code == 200 else None
+        except Exception as e:
+            return {"error": f"Failed to get gacha stats: {str(e)}"}
 
 
 def get_persona(data_dir: Optional[Path] = None) -> Persona:
```
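The card methods in `MCPClient` rely on `self._get_url(...)`, whose implementation is outside this diff. Assuming it resolves an endpoint name against the active server's configuration, it would look roughly like the following sketch (`get_url` is hypothetical; names and behavior are inferred from the config structure above):

```python
# Hypothetical sketch of MCPClient._get_url; the real method is not shown in this diff.
def get_url(servers: dict, active_server: str, endpoint_name: str):
    """Join the active server's base_url with a named endpoint path, or None."""
    server = servers.get(active_server, {})
    base_url = server.get("base_url", "")
    path = server.get("endpoints", {}).get(endpoint_name)
    if not base_url or path is None:
        return None
    return f"{base_url.rstrip('/')}{path}"

servers = {"ai_gpt": {"base_url": "http://localhost:8001",
                      "endpoints": {"card_draw_card": "/card_draw_card"}}}
get_url(servers, "ai_gpt", "card_draw_card")  # -> "http://localhost:8001/card_draw_card"
```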
```diff
@@ -248,15 +321,34 @@ def chat(
     """Chat with the AI"""
     persona = get_persona(data_dir)
 
-    # Create AI provider if specified
+    # Get config instance
+    config_instance = Config()
+
+    # Get defaults from config if not provided
+    if not provider:
+        provider = config_instance.get("default_provider", "ollama")
+    if not model:
+        if provider == "ollama":
+            model = config_instance.get("providers.ollama.default_model", "qwen2.5")
+        else:
+            model = config_instance.get("providers.openai.default_model", "gpt-4o-mini")
+
+    # Create AI provider with MCP client if needed
     ai_provider = None
-    if provider and model:
-        try:
-            ai_provider = create_ai_provider(provider=provider, model=model)
-            console.print(f"[dim]Using {provider} with model {model}[/dim]\n")
-        except Exception as e:
-            console.print(f"[yellow]Warning: Could not create AI provider: {e}[/yellow]")
-            console.print("[yellow]Falling back to simple responses[/yellow]\n")
+    mcp_client = None
+
+    try:
+        # Create MCP client for OpenAI provider
+        if provider == "openai":
+            mcp_client = MCPClient(config_instance)
+            if mcp_client.available:
+                console.print(f"[dim]MCP client connected to {mcp_client.active_server}[/dim]")
+
+        ai_provider = create_ai_provider(provider=provider, model=model, mcp_client=mcp_client)
+        console.print(f"[dim]Using {provider} with model {model}[/dim]\n")
+    except Exception as e:
+        console.print(f"[yellow]Warning: Could not create AI provider: {e}[/yellow]")
+        console.print("[yellow]Falling back to simple responses[/yellow]\n")
 
     # Process interaction
     response, relationship_delta = persona.process_interaction(user_id, message, ai_provider)
@@ -465,6 +557,10 @@ def server(
     system_endpoints = ["get_persona_state", "get_fortune", "run_maintenance"]
     shell_endpoints = ["execute_command", "analyze_file", "write_file", "list_files", "read_project_file"]
     remote_endpoints = ["remote_shell", "ai_bot_status", "isolated_python", "isolated_analysis"]
+    card_endpoints = ["card_get_user_cards", "card_draw_card", "card_get_card_details", "card_analyze_collection", "card_get_gacha_stats", "card_system_status"]
+
+    # Check if ai.card tools are available
+    has_card_tools = mcp_server.has_card
 
     # Build endpoint summary
     endpoint_summary = f"""🧠 Memory System: {len(memory_endpoints)} tools
@@ -473,10 +569,18 @@ def server(
 💻 Shell Integration: {len(shell_endpoints)} tools
 🔒 Remote Execution: {len(remote_endpoints)} tools"""
 
+    if has_card_tools:
+        endpoint_summary += f"\n🎴 Card Game System: {len(card_endpoints)} tools"
+
     # Check MCP client connectivity
     mcp_client = MCPClient(config_instance)
     mcp_status = "✅ MCP Client Ready" if mcp_client.available else "⚠️ MCP Client Disabled"
 
+    # Add ai.card status if available
+    card_status = ""
+    if has_card_tools:
+        card_status = "\n🎴 ai.card: ./card directory detected"
+
     # Provider configuration check
     provider_status = "✅ Ready"
     if provider == "openai":
@@ -500,7 +604,7 @@ def server(
         f"{endpoint_summary}\n\n"
         f"[green]Integration Status:[/green]\n"
         f"{mcp_status}\n"
-        f"🔗 Config: {config_instance.config_file}\n\n"
+        f"🔗 Config: {config_instance.config_file}{card_status}\n\n"
         f"[dim]Press Ctrl+C to stop server[/dim]",
         title="🔧 MCP Server Startup",
         border_style="green",
@@ -1367,7 +1471,15 @@ def conversation(
                 console.print("  /search <keywords> - Search memories")
                 console.print("  /context <query> - Get contextual memories")
                 console.print("  /relationship - Show relationship via MCP")
-                console.print("  <message> - Chat with AI\n")
+
+                if mcp_client.has_card_tools:
+                    console.print(f"\n[cyan]Card Commands:[/cyan]")
+                    console.print("  AI can answer questions about cards:")
+                    console.print("  - 'Show my cards'")
+                    console.print("  - 'Draw a card' / 'Gacha'")
+                    console.print("  - 'Analyze my collection'")
+                    console.print("  - 'Show gacha stats'")
+                    console.print("\n  <message> - Chat with AI\n")
                 continue
 
             elif user_input.lower() == '/clear':
```
@ -34,7 +34,15 @@ class AIGptMcpServer:
|
|||||||
# Create MCP server with FastAPI app
|
# Create MCP server with FastAPI app
|
||||||
self.server = FastApiMCP(self.app)
|
self.server = FastApiMCP(self.app)
|
||||||
|
|
||||||
|
# Check if ai.card exists
|
||||||
|
self.card_dir = Path("./card")
|
||||||
|
self.has_card = self.card_dir.exists() and self.card_dir.is_dir()
|
||||||
|
|
||||||
self._register_tools()
|
self._register_tools()
|
||||||
|
|
||||||
|
# Register ai.card tools if available
|
||||||
|
if self.has_card:
|
||||||
|
self._register_card_tools()
|
||||||
|
|
||||||
def _register_tools(self):
|
def _register_tools(self):
|
||||||
"""Register all MCP tools"""
|
"""Register all MCP tools"""
|
||||||
@ -484,6 +492,148 @@ class AIGptMcpServer:
|
|||||||
# Python コードを /sh 経由で実行
|
# Python コードを /sh 経由で実行
|
||||||
python_command = f'python3 -c "{code.replace('"', '\\"')}"'
|
python_command = f'python3 -c "{code.replace('"', '\\"')}"'
|
||||||
return await remote_shell(python_command, ai_bot_url)
|
return await remote_shell(python_command, ai_bot_url)
|
||||||
|
|
||||||
|
def _register_card_tools(self):
|
||||||
|
"""Register ai.card MCP tools when card directory exists"""
|
||||||
|
logger.info("Registering ai.card tools...")
|
||||||
|
|
||||||
|
@self.app.get("/card_get_user_cards", operation_id="card_get_user_cards")
|
||||||
|
async def card_get_user_cards(did: str, limit: int = 10) -> Dict[str, Any]:
|
||||||
|
"""Get user's card collection from ai.card system"""
|
||||||
|
logger.info(f"🎴 [ai.card] Getting cards for did: {did}, limit: {limit}")
|
||||||
|
try:
|
||||||
|
url = "http://localhost:8000/get_user_cards"
|
||||||
|
async with httpx.AsyncClient(timeout=10.0) as client:
|
||||||
|
logger.info(f"🎴 [ai.card] Calling: {url}")
|
||||||
|
response = await client.get(
|
||||||
|
url,
|
||||||
|
params={"did": did, "limit": limit}
|
||||||
|
)
|
||||||
|
if response.status_code == 200:
|
||||||
|
cards = response.json()
|
||||||
|
return {
|
||||||
|
"cards": cards,
|
||||||
|
"count": len(cards),
|
||||||
|
"did": did
|
||||||
|
}
|
||||||
|
else:
|
||||||
|
return {"error": f"Failed to get cards: {response.status_code}"}
|
||||||
|
except httpx.ConnectError:
|
||||||
|
return {
|
||||||
|
"error": "ai.card server is not running",
|
||||||
|
"hint": "Please start ai.card server: cd card && ./start_server.sh",
|
||||||
|
"details": "Connection refused to http://localhost:8000"
|
||||||
|
}
|
||||||
|
except Exception as e:
|
||||||
|
return {"error": f"ai.card connection failed: {str(e)}"}
|
||||||
|
|
||||||
|
@self.app.post("/card_draw_card", operation_id="card_draw_card")
|
||||||
|
async def card_draw_card(did: str, is_paid: bool = False) -> Dict[str, Any]:
|
||||||
|
"""Draw a card from gacha system"""
|
||||||
|
try:
|
||||||
|
async with httpx.AsyncClient(timeout=10.0) as client:
|
||||||
|
response = await client.post(
|
||||||
|
f"http://localhost:8000/draw_card?did={did}&is_paid={is_paid}"
|
||||||
|
)
|
||||||
|
if response.status_code == 200:
|
||||||
|
return response.json()
|
||||||
|
else:
|
||||||
|
return {"error": f"Failed to draw card: {response.status_code}"}
|
||||||
|
except httpx.ConnectError:
|
||||||
|
return {
|
||||||
|
"error": "ai.card server is not running",
|
||||||
|
"hint": "Please start ai.card server: cd card && ./start_server.sh",
|
||||||
|
"details": "Connection refused to http://localhost:8000"
|
||||||
|
}
|
||||||
|
except Exception as e:
|
||||||
|
return {"error": f"ai.card connection failed: {str(e)}"}
|
||||||
|
|

        @self.app.get("/card_get_card_details", operation_id="card_get_card_details")
        async def card_get_card_details(card_id: int) -> Dict[str, Any]:
            """Get detailed information about a specific card"""
            try:
                async with httpx.AsyncClient(timeout=10.0) as client:
                    response = await client.get(
                        "http://localhost:8000/get_card_details",
                        params={"card_id": card_id}
                    )
                    if response.status_code == 200:
                        return response.json()
                    else:
                        return {"error": f"Failed to get card details: {response.status_code}"}
            except httpx.ConnectError:
                return {
                    "error": "ai.card server is not running",
                    "hint": "Please start ai.card server: cd card && ./start_server.sh",
                    "details": "Connection refused to http://localhost:8000"
                }
            except Exception as e:
                return {"error": f"ai.card connection failed: {str(e)}"}

        @self.app.get("/card_analyze_collection", operation_id="card_analyze_collection")
        async def card_analyze_collection(did: str) -> Dict[str, Any]:
            """Analyze user's card collection statistics"""
            try:
                async with httpx.AsyncClient(timeout=10.0) as client:
                    response = await client.get(
                        "http://localhost:8000/analyze_card_collection",
                        params={"did": did}
                    )
                    if response.status_code == 200:
                        return response.json()
                    else:
                        return {"error": f"Failed to analyze collection: {response.status_code}"}
            except httpx.ConnectError:
                return {
                    "error": "ai.card server is not running",
                    "hint": "Please start ai.card server: cd card && ./start_server.sh",
                    "details": "Connection refused to http://localhost:8000"
                }
            except Exception as e:
                return {"error": f"ai.card connection failed: {str(e)}"}

        @self.app.get("/card_get_gacha_stats", operation_id="card_get_gacha_stats")
        async def card_get_gacha_stats() -> Dict[str, Any]:
            """Get gacha system statistics"""
            try:
                async with httpx.AsyncClient(timeout=10.0) as client:
                    response = await client.get("http://localhost:8000/get_gacha_stats")
                    if response.status_code == 200:
                        return response.json()
                    else:
                        return {"error": f"Failed to get gacha stats: {response.status_code}"}
            except httpx.ConnectError:
                return {
                    "error": "ai.card server is not running",
                    "hint": "Please start ai.card server: cd card && ./start_server.sh",
                    "details": "Connection refused to http://localhost:8000"
                }
            except Exception as e:
                return {"error": f"ai.card connection failed: {str(e)}"}

        @self.app.get("/card_system_status", operation_id="card_system_status")
        async def card_system_status() -> Dict[str, Any]:
            """Check ai.card system status"""
            try:
                async with httpx.AsyncClient(timeout=5.0) as client:
                    response = await client.get("http://localhost:8000/health")
                    if response.status_code == 200:
                        return {
                            "status": "online",
                            "health": response.json(),
                            "card_dir": str(self.card_dir)
                        }
                    else:
                        return {
                            "status": "error",
                            "error": f"Health check failed: {response.status_code}"
                        }
            except Exception as e:
                return {
                    "status": "offline",
                    "error": f"ai.card is not running: {str(e)}",
                    "hint": "Start ai.card with: cd card && ./start_server.sh"
                }

        @self.app.post("/isolated_analysis", operation_id="isolated_analysis")
        async def isolated_analysis(file_path: str, analysis_type: str = "structure", ai_bot_url: str = "http://localhost:8080") -> Dict[str, Any]:
@@ -133,7 +133,15 @@ FORTUNE: {state.fortune.fortune_value}/10
        if context_parts:
            context_prompt += "RELEVANT CONTEXT:\n" + "\n\n".join(context_parts) + "\n\n"

        context_prompt += f"""IMPORTANT: You have access to the following tools:
- Memory tools: get_memories, search_memories, get_contextual_memories
- Relationship tools: get_relationship
- Card game tools: card_get_user_cards, card_draw_card, card_analyze_collection

When asked about cards, collections, or anything card-related, YOU MUST use the card tools.
For "カードコレクションを見せて" or similar requests, use card_get_user_cards with did='{user_id}'.

Respond to this message while staying true to your personality and the established relationship context:

User: {current_message}
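The prompt assembly in the hunk above can be mirrored as a small pure function; `build_context_prompt` is a name introduced here for illustration, not part of the commit:

```python
def build_context_prompt(context_parts: list[str], current_message: str, user_id: str) -> str:
    """Sketch of the prompt assembly shown in the hunk above."""
    prompt = ""
    if context_parts:
        prompt += "RELEVANT CONTEXT:\n" + "\n\n".join(context_parts) + "\n\n"
    prompt += (
        "IMPORTANT: You have access to the following tools:\n"
        "- Memory tools: get_memories, search_memories, get_contextual_memories\n"
        "- Relationship tools: get_relationship\n"
        "- Card game tools: card_get_user_cards, card_draw_card, card_analyze_collection\n\n"
        "When asked about cards, collections, or anything card-related, "
        "YOU MUST use the card tools.\n"
        "For \"カードコレクションを見せて\" or similar requests, "
        f"use card_get_user_cards with did='{user_id}'.\n\n"
        "Respond to this message while staying true to your personality "
        "and the established relationship context:\n\n"
        f"User: {current_message}"
    )
    return prompt
```

The explicit tool list matters: without it, the model has no reason to call `card_get_user_cards` when the user asks to see their collection.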
15
src/aigpt/shared/__init__.py
Normal file
@@ -0,0 +1,15 @@
"""Shared modules for AI ecosystem"""

from .ai_provider import (
    AIProvider,
    OllamaProvider,
    OpenAIProvider,
    create_ai_provider
)

__all__ = [
    'AIProvider',
    'OllamaProvider',
    'OpenAIProvider',
    'create_ai_provider'
]
139
src/aigpt/shared/ai_provider.py
Normal file
@@ -0,0 +1,139 @@
"""Shared AI Provider implementation for ai ecosystem"""

import os
import json
import logging
from typing import Optional, Dict, List, Any, Protocol
from abc import abstractmethod
import httpx
from openai import OpenAI
import ollama


class AIProvider(Protocol):
    """Protocol for AI providers"""

    @abstractmethod
    async def chat(self, prompt: str, system_prompt: Optional[str] = None) -> str:
        """Generate a response based on prompt"""
        pass

class OllamaProvider:
    """Ollama AI provider - shared implementation"""

    def __init__(self, model: str = "qwen3", host: Optional[str] = None, config_system_prompt: Optional[str] = None):
        self.model = model
        # Use environment variable OLLAMA_HOST if available
        self.host = host or os.getenv('OLLAMA_HOST', 'http://127.0.0.1:11434')
        # Ensure proper URL format
        if not self.host.startswith('http'):
            self.host = f'http://{self.host}'
        self.client = ollama.Client(host=self.host, timeout=60.0)
        self.logger = logging.getLogger(__name__)
        self.logger.info(f"OllamaProvider initialized with host: {self.host}, model: {self.model}")
        self.config_system_prompt = config_system_prompt

    async def chat(self, prompt: str, system_prompt: Optional[str] = None) -> str:
        """Simple chat interface"""
        try:
            messages = []
            # Use provided system_prompt, fall back to config_system_prompt
            final_system_prompt = system_prompt or self.config_system_prompt
            if final_system_prompt:
                messages.append({"role": "system", "content": final_system_prompt})
            messages.append({"role": "user", "content": prompt})

            response = self.client.chat(
                model=self.model,
                messages=messages,
                options={
                    "num_predict": 2000,
                    "temperature": 0.7,
                    "top_p": 0.9,
                },
                stream=False
            )
            return self._clean_response(response['message']['content'])
        except Exception as e:
            self.logger.error(f"Ollama chat failed (host: {self.host}): {e}")
            return "I'm having trouble connecting to the AI model."

    def _clean_response(self, response: str) -> str:
        """Clean response by removing think tags and other unwanted content"""
        import re
        # Remove <think></think> tags and their content
        response = re.sub(r'<think>.*?</think>', '', response, flags=re.DOTALL)
        # Remove any remaining whitespace at the beginning/end
        response = response.strip()
        return response
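`_clean_response` exists because reasoning models such as qwen3 emit `<think>…</think>` blocks before the actual answer. A standalone version of the same regex, for illustration:

```python
import re


def clean_response(response: str) -> str:
    """Standalone version of OllamaProvider._clean_response: drop <think> blocks."""
    # re.DOTALL lets .*? match across newlines inside the think block
    response = re.sub(r'<think>.*?</think>', '', response, flags=re.DOTALL)
    return response.strip()


print(clean_response("<think>\nplanning...\n</think>\nHello!"))  # → Hello!
```

Note the non-greedy `.*?`: with a greedy `.*`, two separate think blocks in one response would be merged and the answer between them lost.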

class OpenAIProvider:
    """OpenAI API provider - shared implementation"""

    def __init__(self, model: str = "gpt-4o-mini", api_key: Optional[str] = None,
                 config_system_prompt: Optional[str] = None, mcp_client=None):
        self.model = model
        self.api_key = api_key or os.getenv("OPENAI_API_KEY")
        if not self.api_key:
            raise ValueError("OpenAI API key not provided")
        self.client = OpenAI(api_key=self.api_key)
        self.logger = logging.getLogger(__name__)
        self.config_system_prompt = config_system_prompt
        self.mcp_client = mcp_client

    async def chat(self, prompt: str, system_prompt: Optional[str] = None) -> str:
        """Simple chat interface without MCP tools"""
        try:
            messages = []
            # Use provided system_prompt, fall back to config_system_prompt
            final_system_prompt = system_prompt or self.config_system_prompt
            if final_system_prompt:
                messages.append({"role": "system", "content": final_system_prompt})
            messages.append({"role": "user", "content": prompt})

            response = self.client.chat.completions.create(
                model=self.model,
                messages=messages,
                max_tokens=2000,
                temperature=0.7
            )
            return response.choices[0].message.content
        except Exception as e:
            self.logger.error(f"OpenAI chat failed: {e}")
            return "I'm having trouble connecting to the AI model."

    def _get_mcp_tools(self) -> List[Dict[str, Any]]:
        """Override this method in subclasses to provide MCP tools"""
        return []

    async def chat_with_mcp(self, prompt: str, **kwargs) -> str:
        """Chat interface with MCP function calling support

        This method should be overridden in subclasses to provide
        specific MCP functionality.
        """
        if not self.mcp_client:
            return await self.chat(prompt)

        # Default implementation - subclasses should override
        return await self.chat(prompt)

    async def _execute_mcp_tool(self, tool_call, **kwargs) -> Dict[str, Any]:
        """Execute MCP tool call - override in subclasses"""
        return {"error": "MCP tool execution not implemented"}

def create_ai_provider(provider: str = "ollama", model: Optional[str] = None,
                       config_system_prompt: Optional[str] = None, mcp_client=None, **kwargs) -> AIProvider:
    """Factory function to create AI providers"""
    if provider == "ollama":
        model = model or "qwen3"
        return OllamaProvider(model=model, config_system_prompt=config_system_prompt, **kwargs)
    elif provider == "openai":
        model = model or "gpt-4o-mini"
        return OpenAIProvider(model=model, config_system_prompt=config_system_prompt,
                              mcp_client=mcp_client, **kwargs)
    else:
        raise ValueError(f"Unknown provider: {provider}")
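The factory's dispatch and default-model choice can be exercised in isolation. `select_model` below restates just that logic as a sketch (the function name is invented here; the defaults are the ones in the factory above):

```python
from typing import Optional


def select_model(provider: str, model: Optional[str] = None) -> str:
    """Mirror of create_ai_provider's provider dispatch and default models."""
    defaults = {"ollama": "qwen3", "openai": "gpt-4o-mini"}
    if provider not in defaults:
        raise ValueError(f"Unknown provider: {provider}")
    # An explicit model wins; otherwise fall back to the provider's default
    return model or defaults[provider]


print(select_model("ollama"))            # → qwen3
print(select_model("openai", "gpt-4o"))  # → gpt-4o
```

Note that only the OpenAI branch receives `mcp_client`; an Ollama provider created through this factory cannot do MCP function calling.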