257 Commits

SHA1 Message Date
b344a0a760 fix 2026-01-08 18:28:17 +09:00
d3f08e6db5 fix 2026-01-08 18:26:27 +09:00
68ac88045b fix 2026-01-08 18:07:33 +09:00
bda6b9700d fix 2026-01-08 17:34:14 +09:00
591edf61f8 fix browser 2026-01-08 17:32:00 +09:00
14b7ccd391 start 2026-01-08 17:30:51 +09:00
e577754298 fix 2025-12-30 09:29:29 +09:00
b5914cade5 fix 2025-12-30 09:15:11 +09:00
a07c77d3f1 fix 2025-12-30 09:09:23 +09:00
1700f41e4f fix 2025-12-30 09:05:59 +09:00
cf2708c7f3 fix 2025-12-12 14:02:50 +09:00
db29524307 fix 2025-12-12 13:49:43 +09:00
e5aeffb621 fix 2025-12-11 12:46:42 +09:00
cbf3c424e3 fix 2025-12-09 17:28:48 +09:00
adf2121a8b fix 2025-12-09 17:22:31 +09:00
5ba0b77154 fix 2025-12-08 18:51:00 +09:00
22fda9cb2d fix 2025-12-08 14:13:15 +09:00
b0ca3f9163 fix 2025-12-04 18:48:42 +09:00
08c4a5cd52 fix 2025-12-04 16:45:04 +09:00
c004905d39 fix 2025-12-04 15:19:46 +09:00
a970bc008f fix 2025-12-04 14:35:17 +09:00
678c238ee7 fix 2025-11-29 07:50:48 +09:00
9f9fabd478 fix 2025-11-22 08:05:30 +09:00
acf9cacda0 fix 2025-11-22 07:31:53 +09:00
a88a61f866 fix 2025-11-22 07:22:07 +09:00
afd95636da fix 2025-11-22 07:14:49 +09:00
62ecab5f04 fix 2025-11-21 22:08:48 +09:00
d46502d177 fix 2025-11-21 22:03:38 +09:00
0d742ca1f2 fix 2025-11-21 21:46:34 +09:00
0e820f0746 fix 2025-11-21 05:34:01 +09:00
896734e265 fix 2025-11-21 05:13:23 +09:00
9535e7f08d fix 2025-11-21 05:12:12 +09:00
2d8a565a34 fix 2025-11-20 07:42:48 +09:00
3109b99959 fix css h1 2025-11-20 07:40:34 +09:00
2558fb32fa fix h1 aria-label 2025-11-20 07:28:37 +09:00
7b60de8322 fix 2025-11-20 07:17:43 +09:00
87d313575b fix 2025-11-20 07:15:27 +09:00
6d133f2d34 fix 2025-11-20 04:28:00 +09:00
84d6f7ae25 fix 2025-11-20 04:23:33 +09:00
05df0dcb7d fix 2025-11-19 08:39:28 +09:00
3f88533d2b fix 2025-11-19 06:52:21 +09:00
44d85e1b00 fix 2025-11-19 06:39:10 +09:00
f642944550 fix 2025-11-09 17:45:56 +09:00
ccdfe9cedf fix 2025-11-09 17:43:07 +09:00
6763336b2f fix 2025-11-09 17:41:19 +09:00
0b594e9a2b fix 2025-11-09 17:37:18 +09:00
281729460a fix 2025-11-09 17:33:49 +09:00
8f0e714a05 fix 2025-11-07 16:34:38 +09:00
18898e4ed5 fix 2025-11-07 16:28:51 +09:00
1dc64b9110 fix 2025-11-07 16:28:28 +09:00
8074096447 fix 2025-11-07 16:22:55 +09:00
e4b330ae27 fix 2025-11-06 19:50:00 +09:00
026f7e963b fix 2025-10-28 13:34:06 +09:00
9ecd293693 fix 2025-10-28 13:24:56 +09:00
e41f007e8e fix 2025-10-21 05:08:52 +09:00
01012424e8 fix 2025-10-20 07:27:10 +09:00
f26717c50d fix 2025-10-20 06:38:18 +09:00
edce421464 fix 2025-10-19 20:42:26 +09:00
eaf332e208 fix 2025-10-19 19:45:35 +09:00
0a79bcf73e fix 2025-10-19 19:45:19 +09:00
eb5c7a35d0 fix 2025-10-19 19:44:24 +09:00
ab7dfd90c5 fix 2025-10-19 19:41:03 +09:00
1835688595 fix 2025-10-19 16:49:29 +09:00
0135f6cb7a fix 2025-10-19 12:40:52 +09:00
d448278989 fix 2025-10-19 12:25:45 +09:00
837ff28a3d fix 2025-10-11 07:53:47 +09:00
99a8e4e6bd fix 2025-10-09 01:38:40 +09:00
bd5ef0809c fix 2025-10-09 01:32:04 +09:00
3d7021b446 fix 2025-10-03 20:23:47 +09:00
297381bc0f fix 2025-10-03 19:32:00 +09:00
00e0b301ce fix 2025-09-26 13:45:14 +09:00
331d11f319 fix 2025-09-26 12:16:04 +09:00
c4bb4f1f0f fix 2025-09-25 23:08:09 +09:00
9ff70eb1b0 fix 2025-09-25 18:54:33 +09:00
abfdf1d016 fix 2025-09-25 00:22:55 +09:00
a9338e13a6 fix 2025-09-24 15:00:03 +09:00
96da3eb52b fix 2025-09-24 14:44:19 +09:00
d4a2ea4e24 fix 2025-09-24 13:24:58 +09:00
85c49d4657 fix 2025-09-24 13:21:20 +09:00
8b61f2fa72 fix 2025-09-24 13:10:45 +09:00
92d5c0da32 fix 2025-09-19 11:36:19 +09:00
6a8e160540 fix 2025-09-19 11:30:58 +09:00
9e31a0fb0a fix 2025-09-19 10:57:24 +09:00
1c6cf7d063 fix 2025-09-19 10:55:55 +09:00
dd2e3b4fb9 add post 2025-09-10 20:34:10 +09:00
976107903b fix post-page default tab user-comment 2025-08-10 08:31:44 +09:00
2aa6616209 fix comment record-url 2025-08-10 08:25:12 +09:00
a2ec41bf87 fix post 2025-08-10 07:59:46 +09:00
3af175f4ed Fix: Update cache key to include Cargo.toml hash
This forces cache invalidation when version changes

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-10 04:59:27 +09:00
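The cache-key change this commit describes matches the pattern visible in the workflow diff further down this page: the key includes a hash of Cargo.toml, so a version bump yields a new key and the stale cached binary is not restored. A minimal sketch (step name and path here are illustrative):

```yaml
# Sketch: cache key that changes whenever Cargo.toml changes,
# forcing the cached ailog binary to be refreshed on version bumps.
- name: Cache ailog binary
  uses: actions/cache@v4
  with:
    path: ./bin
    key: ailog-bin-${{ runner.os }}-v${{ hashFiles('Cargo.toml') }}
    restore-keys: |
      ailog-bin-${{ runner.os }}-v
```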
0e1dff12c2 add binary 2025-08-10 04:59:27 +09:00
5708e9c53d fix cargo version gh-actions 2025-08-09 18:37:58 +09:00
824aee7b74 fix temp 2025-08-09 18:07:40 +09:00
aff1b3a7bd add binary 2025-08-09 18:04:20 +09:00
fa524138c6 fix home dir 2025-08-09 18:02:09 +09:00
6047fa4161 fix type md ai 2025-08-09 17:55:13 +09:00
c5a70a433b cleanup 2025-08-09 16:53:25 +09:00
0d90ba21e0 fix listrecord-created-sort 2025-08-09 16:50:47 +09:00
e1eab122c8 fix post title 2025-08-09 11:52:54 +09:00
c3cb3db680 add post, fix profile err 2025-08-09 11:51:58 +09:00
55745ff051 rm log 2025-08-09 11:51:58 +09:00
5aeeba106a add post 2025-07-30 19:30:01 +09:00
f1e76ab31f fix post 2025-07-27 05:04:01 +09:00
3c9ef78696 add binary 2025-07-26 20:54:23 +09:00
ee2d21b0f3 update 2025-07-26 20:00:16 +09:00
0667ac58fb test game 2025-07-26 19:51:55 +09:00
d89855338b fix css 2025-07-18 10:57:42 +09:00
e19170cdff add pds.html 2025-07-18 00:05:04 +09:00
c3e22611f5 fix layout 2025-07-17 23:57:08 +09:00
2943c94ec1 binary 2025-07-17 22:23:14 +09:00
f27997b7e8 rm pds asset 2025-07-17 22:20:25 +09:00
447e4bded9 update 2025-07-17 22:12:06 +09:00
03161a52ca fix oauth-ai-chat 2025-07-17 19:26:40 +09:00
fe9381a860 fix blog post 2025-07-17 19:26:40 +09:00
f0cea89005 fix oauth filter 2025-07-16 22:57:09 +09:00
b059fe1de0 fix comment, rm console.log 2025-07-16 22:53:01 +09:00
07b0b0f702 fix css 2025-07-16 20:58:42 +09:00
ecd69557fe oauth markdown 2025-07-16 20:42:50 +09:00
452a0fda6a fix blog post 2025-07-16 11:47:15 +09:00
a62dd82790 fix config 2025-07-16 11:27:37 +09:00
3faec33bac fix blog post 2025-07-16 11:04:50 +09:00
33402f4a21 add blog post 2025-07-16 11:04:02 +09:00
3e65bc8210 binary 2025-07-16 10:18:03 +09:00
16d724ec25 update 2025-07-16 10:08:43 +09:00
69182a1bf8 update 2025-07-16 09:33:46 +09:00
0110773592 test ai-blog 2025-07-16 09:32:45 +09:00
75f108e7b8 fix blog post link 2025-07-14 15:27:10 +09:00
263189ce72 add blog post 2025-07-14 14:11:55 +09:00
7800a655f3 fix profile 2025-07-13 08:12:40 +09:00
76c797e4d8 add blog post 2025-07-13 07:52:43 +09:00
d1a1c92842 update binary 2025-07-11 13:38:22 +09:00
9da1f87640 fix update version 2025-07-11 13:09:15 +09:00
ddfc43512c add md msg 2025-07-11 08:52:34 +09:00
b3ccd61935 add my-blog msg 2025-07-11 08:51:46 +09:00
a243b6a44e fix post filename 2025-07-05 15:42:36 +09:00
e3c1cf4790 fix build err 2025-07-05 15:31:04 +09:00
a6236661bf post 2025-07-05 15:30:55 +09:00
195a4474c9 fix config.toml 2025-07-01 21:22:48 +09:00
4a34a6ca59 rm my-blog/oauth 2025-07-01 21:20:26 +09:00
4d01fb8507 fix oauth network err 2025-07-01 19:48:49 +09:00
d69c9aa09b update binary 2025-07-01 06:22:15 +09:00
99ee49f76e feat: add server-side image comparison shortcode support
- Add {{< img-compare >}} and [img-compare] shortcode syntax
- Implement server-side shortcode processing in Rust
- Create dedicated shortcode module for extensibility
- Fix image comparison slider display issues
- Remove caption display for cleaner UI
- Update to version 0.2.6

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-07-01 06:09:39 +09:00
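For context, a Hugo-style shortcode like the `{{< img-compare >}}` syntax this commit adds would typically be called from a post's markdown like the following. This is a hypothetical sketch; the attribute names are assumptions, since the shortcode's actual parameters are not shown in this log:

```markdown
<!-- Hypothetical usage; the "before"/"after" attribute names are assumptions -->
{{< img-compare before="/img/old.png" after="/img/new.png" >}}
```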
19c0e28668 add post 2025-07-01 06:02:31 +09:00
bc99eb0814 update img-slider 2025-07-01 06:02:25 +09:00
cf93721bad fix social-app uri 2025-06-26 19:56:13 +09:00
8a8a121f4a fix delete record 2025-06-25 23:14:27 +09:00
be2bcae1d6 fix test ask-AI oauth profile 2025-06-25 23:03:50 +09:00
2c08a4acfb test blog profile 2025-06-25 21:18:13 +09:00
7791399314 fix claude-code proxy 2025-06-24 22:55:16 +09:00
26b1b2cf87 fix mobile css 2025-06-22 01:50:49 +09:00
7eb653f569 fix layout article.article-content 2025-06-22 01:16:59 +09:00
0fc920c844 fix layout 2025-06-22 00:35:54 +09:00
13c05d97d2 add claude-code-mcp-server 2025-06-22 00:26:20 +09:00
71acd44810 fix layout font-size 2025-06-22 00:25:44 +09:00
1b4579d0f1 fix layout font-size 2025-06-22 00:25:04 +09:00
09100f6d99 fix ask-ai prompt userdata 2025-06-22 00:01:27 +09:00
169de9064a fix link github 2025-06-21 19:11:01 +09:00
097c794623 fix oauth bsky button 2025-06-21 18:30:39 +09:00
b652e01dd3 fix oauth loading button 2025-06-21 17:03:23 +09:00
31af524303 fix layout 2025-06-21 15:46:21 +09:00
6be024864d cleanup docs 2025-06-21 00:07:22 +09:00
eef1fdad38 fix layout 2025-06-20 23:26:32 +09:00
b7e411e8b2 add img 2025-06-20 23:26:11 +09:00
8f9d803a94 fix gh-actions 2025-06-20 00:08:23 +09:00
f9b9c2ab52 fix layout 2025-06-20 00:00:51 +09:00
210ce801f1 update binary 2025-06-20 00:00:29 +09:00
6cb46f2ca1 fix token refresh 2025-06-19 23:01:41 +09:00
9406597b82 add post 2025-06-19 22:08:43 +09:00
0dbc3ba67e fix html text 2025-06-19 21:22:01 +09:00
a7e6fc4a1a Release v0.2.4: Complete OAuth system with AI chat and mobile support
- Fixed OAuth authentication with ATProto integration
- Implemented Ask AI functionality with conversation history
- Resolved PDS/web link issues for cross-network compatibility
- Added comprehensive mobile responsive design
- Enhanced comment posting with loading states and auto-refresh
- Improved chat record display with question/answer pairing
- Fixed tab scrolling and layout overflow issues

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-19 20:02:24 +09:00
3adcfdacf5 fix post commnet 2025-06-19 19:58:40 +09:00
004081337c fix ask-ai put 2025-06-19 19:52:31 +09:00
5ce0e0fd7a fix ask-ai 2025-06-19 19:18:50 +09:00
f816abb84f fix mobile css, ask-ai 2025-06-19 19:12:29 +09:00
8541af9293 add binary 2025-06-19 17:26:48 +09:00
68b49d5aaf Fix jetstream monitoring for ai.syui.log collections
- Fixed JetstreamMessage struct to correctly parse collection from commit object
- Fixed user list JSON format to match oauth app expectations (removed metadata field)
- Added monitoring for both ai.syui.log and ai.syui.log.chat.comment collections
- Improved error handling and debug output for stream processing
- Jetstream auto user registration now working correctly

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-19 17:14:11 +09:00
53dab3fd09 fix ailog stream server 2025-06-19 17:13:02 +09:00
5fac689f98 fix hugo callback 2025-06-19 17:13:02 +09:00
293421b7a5 add callback test 2025-06-19 14:47:25 +09:00
1793de40c1 fix rm callback page 2025-06-19 13:17:32 +09:00
30bdd7b633 fix oauth package name 2025-06-19 13:09:37 +09:00
b17ac3d91a v0.2.2: OAuth authentication system improvements
🔧 OAuth Fixes:
- Add transition:generic scope to resolve authentication errors
- Improve Agent creation with session object and dpopFetch fallback
- Fix avatar fetching to use correct public API endpoints
- Proper PDS endpoint selection (bsky.syu.is vs public.api.bsky.app)

🎨 UI Improvements:
- Remove 'Demo' text from loading states
- Environment-based feature toggles (TestUI/Debug)
- Unified padding system (20px 0)
- CSS conflict resolution with oauth- prefix

🚀 Production Ready:
- Automatic feature disable in production build
- Session management improvements
- Error handling enhancements

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-19 11:56:06 +09:00
81f87d0462 fix loading 2025-06-19 11:41:01 +09:00
a020fa24d8 fix gh-actions oauth-session 2025-06-19 11:34:56 +09:00
21c53010b7 test oauth_new gh-actions 2025-06-19 11:13:11 +09:00
4f7834f85c fix npm env production 2025-06-18 19:16:25 +09:00
fecd927b91 fix oauth_new env test 2025-06-18 18:16:37 +09:00
b54e8089ea oauth_new 2025-06-18 17:25:42 +09:00
174cb12d4d test merge 2025-06-18 10:53:48 +09:00
a1186f8185 Merge branch 'test-oauth' 2025-06-18 10:53:31 +09:00
833549756b fix did check 2025-06-17 22:36:33 +09:00
4edde5293a Add oauth_new: Complete OAuth authentication and comment system
- Created new oauth_new directory with clean OAuth implementation
- Added 4-tab interface: Lang, Comment, Collection, User Comments
- Implemented OAuth authentication with @atproto/oauth-client-browser
- Added comment posting functionality with putRecord
- Added proper PDS detection and error handling
- Skipped placeholder users to prevent errors
- Built comprehensive documentation (README.md, DEVELOPMENT.md)

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-17 22:34:03 +09:00
f0fdf678c8 fix oauth plc 2025-06-17 17:43:03 +09:00
820e47f634 update binary 2025-06-17 11:01:42 +09:00
4dac4a83e0 fix atproto web link 2025-06-17 11:00:09 +09:00
fccf75949c v0.2.1: Fix async trait implementation warnings
🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-17 10:42:15 +09:00
6600a9e0cf test pds oauth did 2025-06-17 10:41:22 +09:00
0d79af5aa5 v0.2.0: Unified AI content display and OAuth PDS fixes
Major Changes:
- Unified AI content rendering across all collection types (chat, lang, comment)
- Fixed PDS endpoint detection and usage based on handle configuration
- Removed hardcoded 'yui.syui.ai' references and used environment variables
- Fixed OAuth app 400 errors by adding null checks for API calls
- Improved AI DID resolution to use correct ai.syui.ai account
- Fixed avatar and profile link generation for correct PDS routing
- Enhanced network configuration mapping for different PDS types

OAuth App Improvements:
- Consolidated renderAIContent() function for all AI collections
- Fixed generateProfileUrl() to use PDS-specific web URLs
- Removed duplicate AI content rendering code
- Added proper error handling for API calls

Technical Fixes:
- Updated stream.rs to use correct AI DID defaults
- Improved CORS handling for Ollama localhost connections
- Enhanced PDS detection logic for handle-based routing
- Cleaned up production code (removed console.log statements)

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-17 01:51:11 +09:00
db04af76ab test cleanup 2025-06-17 01:48:30 +09:00
5f0b09b555 add binary 2025-06-16 22:48:38 +09:00
8fa9e474d1 v0.1.9: Production deployment ready
🚀 Production Features
- Console output cleanup: Removed all console.log/warn/error from OAuth app
- Clean UI: Removed debug info divs from production build
- Warning-free builds: Fixed all Rust compilation warnings

🔧 Authentication & Stream Improvements
- Enhanced password authentication with PDS specification support
- Fixed AI DID resolution: Now correctly uses ai.syui.ai (did:plc:6qyecktefllvenje24fcxnie)
- Improved project directory config loading for ailog stream commands
- Added user list initialization commands with proper PDS detection

📚 Documentation
- Complete command reference in docs/commands.md
- Architecture documentation in docs/architecture.md
- Getting started guide in docs/getting-started.md

🛠️ Technical Improvements
- Project-aware AI config loading from config.toml
- Runtime DID resolution for OAuth app
- Proper handle/DID distinction across all components
- Enhanced error handling with clean user feedback

🔐 Generated with Claude Code

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-16 22:29:46 +09:00
5339dd28b0 test scpt 2025-06-16 22:27:20 +09:00
1e83b50e3f test cli stream 2025-06-16 22:09:04 +09:00
889ce8baa1 test oauth pds 2025-06-16 20:45:55 +09:00
286b46c6e6 fix systemd 2025-06-16 12:17:42 +09:00
b780d27ace update binary 2025-06-16 12:17:29 +09:00
831fcb7865 v0.1.8: Enhanced OAuth search bar and configurable AI settings
- Transform auth-section to search bar layout (input left, button right)
- Change atproto button text to "@" symbol
- Add num_predict configuration in config.toml for AI response length
- Improve mobile responsiveness for auth section
- Remove auth-status section for cleaner UI

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-16 11:45:24 +09:00
3f8bbff7c2 fix layout oauth-bar 2025-06-16 11:43:15 +09:00
5cb73a9ed3 add scpt 2025-06-16 10:55:30 +09:00
6ce8d44c4b cleanup 2025-06-16 10:53:42 +09:00
167cfb35f7 fix tab name 2025-06-16 02:29:37 +09:00
c8377ceabf rm auth-status 2025-06-16 02:25:00 +09:00
e917c563f2 update layout 2025-06-16 02:21:26 +09:00
a76933c23b cleanup 2025-06-16 01:16:36 +09:00
8d960b7a40 fix ask-ai enter 2025-06-15 23:33:22 +09:00
d3967c782f rm html 2025-06-15 23:23:12 +09:00
63b6fd5142 fix ai handle 2025-06-15 23:21:15 +09:00
27935324c7 fix mobile css 2025-06-15 22:56:34 +09:00
594d7e7aef v0.1.7: Enhanced UI and accessibility improvements
- Add CSS styling for chat messages with theme color border
- Fix comment form visibility (only show on Comments tab)
- Remove comment form heading for cleaner UI
- Add accessibility attributes (id/name) to all form fields
- Fix Japanese input handling in Ask AI (prevent accidental submission during IME composition)
- Unified CSS classes across all content types (comments, AI chat, translations)
- Fix rkey filtering to handle .html extensions consistently

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-15 22:36:19 +09:00
be86c11e74 fix comment-tab 2025-06-15 22:34:44 +09:00
619675b551 fix post-page rkey 2025-06-15 22:22:01 +09:00
d4d98e2e91 v0.1.6: Major improvements to OAuth display and stream configuration
- Fix AI Chat History display layout and content formatting
- Unify comment layout structure across all comment types
- Remove hardcoded values from stream.rs, add config.toml support
- Optimize AI comment generation with character limits
- Improve translation length limits (3000 characters)
- Add comprehensive AI configuration management

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-15 22:22:01 +09:00
8dac463345 test update json 2025-06-15 22:22:01 +09:00
095f6ec386 v0.1.5: Unify collection configuration under VITE_OAUTH_COLLECTION
- Remove AILOG_OAUTH_COLLECTION backward compatibility
- Update stream.rs to use simplified collection structure
- Fix collection loading from project config.toml
- Resolve compiler warnings with #[allow(dead_code)]
- All systems now use unified VITE_OAUTH_COLLECTION variable

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-15 15:25:52 +09:00
c12d42882c test update 2025-06-15 15:23:32 +09:00
67b241f1e8 rm at-uri add post-url 2025-06-15 13:02:50 +09:00
4206b2195d fix post 2025-06-15 11:30:19 +09:00
b3c1b01e9e fix mobile css 2025-06-15 09:37:49 +09:00
ffa4fa0846 add scpt 2025-06-14 21:55:28 +09:00
0e75d4c0e6 fix comment input 2025-06-14 21:09:10 +09:00
b7f62e729a fix ask-AI 2025-06-14 20:48:17 +09:00
3b2c53fc97 Add GitHub Actions workflows and optimize build performance
- Add release.yml for multi-platform binary builds (Linux, macOS, Windows)
- Add gh-pages-fast.yml for fast deployment using pre-built binaries
- Add build-binary.yml for standalone binary artifact creation
- Optimize Cargo.toml with build profiles and reduced tokio features
- Remove 26MB of unused Font Awesome assets (kept only essential files)
- Font Awesome reduced from 28MB to 1.2MB

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-14 19:52:08 +09:00
13f1785081 update 2025-06-14 15:56:25 +09:00
bb6d51a602 fix css 2025-06-14 13:17:09 +09:00
a4114c5be3 fix theme 2025-06-14 13:17:09 +09:00
5c13dc0a1c fix readme 2025-06-14 13:17:09 +09:00
cef0675a88 add system 2025-06-14 13:17:09 +09:00
fd223290df code layout 2025-06-14 13:17:09 +09:00
5f4382911b fix command build 2025-06-14 13:17:09 +09:00
95cee69482 add github 2025-06-14 13:17:08 +09:00
33c166fa0c fix color 2025-06-14 13:17:08 +09:00
36863e4d9f fix loading 2025-06-14 13:17:08 +09:00
fb0e5107cf add ask AI 2025-06-14 13:17:08 +09:00
962017f922 update readme 2025-06-12 20:04:57 +09:00
5ce03098bd fix stream env 2025-06-12 19:59:19 +09:00
acce1d5af3 fix zsh 2025-06-12 19:23:01 +09:00
bf0b72a52d rm oauth-env 2025-06-12 19:12:55 +09:00
6e6c6e2f53 fix oauth 2025-06-12 19:12:33 +09:00
eb5aa0a2be fix cargo 2025-06-11 18:27:58 +09:00
ad45b151b1 fix env 2025-06-11 17:31:21 +09:00
4775fa7034 fix ui 2025-06-11 17:01:41 +09:00
d396dbd052 fix oauth 2025-06-11 16:24:48 +09:00
ec3e3d1f89 fix run 2025-06-11 15:58:41 +09:00
b2fa06d5fa fix config vite 2025-06-11 15:16:06 +09:00
bebd6a61eb fix xxxcard log 2025-06-11 13:20:01 +09:00
4fe0582c6b fix readme 2025-06-11 12:54:23 +09:00
637028c264 add oauth 2025-06-11 12:53:31 +09:00
c0e4dc63ea update gpt 2025-06-06 03:18:20 +09:00
73 changed files with 3139 additions and 3534 deletions


@@ -1,17 +0,0 @@
{
  "permissions": {
    "allow": [
      "Bash(cargo init:*)",
      "Bash(cargo:*)",
      "Bash(find:*)",
      "Bash(mkdir:*)",
      "Bash(../target/debug/ailog new:*)",
      "Bash(../target/debug/ailog build)",
      "Bash(/Users/syui/ai/log/target/debug/ailog build)",
      "Bash(ls:*)",
      "Bash(curl:*)",
      "Bash(pkill:*)"
    ],
    "deny": []
  }
}


@@ -0,0 +1,123 @@
name: Deploy to Cloudflare Pages
on:
  push:
    branches:
      - main
  workflow_dispatch:
env:
  OAUTH_DIR: oauth
  KEEP_DEPLOYMENTS: 5
jobs:
  deploy:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      deployments: write
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '21'
      - name: Install dependencies
        run: |
          cd ${{ env.OAUTH_DIR }}
          npm install
      - name: Build OAuth app
        run: |
          cd ${{ env.OAUTH_DIR }}
          NODE_ENV=production npm run build
      - name: Copy OAuth build to static
        run: |
          rm -rf my-blog/static/assets
          cp -rf ${{ env.OAUTH_DIR }}/dist/* my-blog/static/
          cp ${{ env.OAUTH_DIR }}/dist/index.html my-blog/templates/oauth-assets.html
      - name: Cache ailog binary
        uses: actions/cache@v4
        with:
          path: ./bin
          key: ailog-bin-${{ runner.os }}
          restore-keys: |
            ailog-bin-${{ runner.os }}
      - name: Setup ailog binary
        run: |
          # Get expected version from Cargo.toml
          EXPECTED_VERSION=$(grep '^version' Cargo.toml | cut -d'"' -f2)
          echo "Expected version from Cargo.toml: $EXPECTED_VERSION"
          # Check current binary version if exists
          if [ -f "./bin/ailog" ]; then
            CURRENT_VERSION=$(./bin/ailog --version 2>/dev/null || echo "unknown")
            echo "Current binary version: $CURRENT_VERSION"
          else
            CURRENT_VERSION="none"
            echo "No binary found"
          fi
          # Check OS
          OS="${{ runner.os }}"
          echo "Runner OS: $OS"
          # Use pre-packaged binary if version matches or extract from tar.gz
          if [ "$CURRENT_VERSION" = "$EXPECTED_VERSION" ]; then
            echo "Binary is up to date"
            chmod +x ./bin/ailog
          elif [ "$OS" = "Linux" ] && [ -f "./bin/ailog-linux-x86_64.tar.gz" ]; then
            echo "Extracting ailog from pre-packaged tar.gz..."
            cd bin
            tar -xzf ailog-linux-x86_64.tar.gz
            chmod +x ailog
            cd ..
            # Verify extracted version
            EXTRACTED_VERSION=$(./bin/ailog --version 2>/dev/null || echo "unknown")
            echo "Extracted binary version: $EXTRACTED_VERSION"
            if [ "$EXTRACTED_VERSION" != "$EXPECTED_VERSION" ]; then
              echo "Warning: Binary version mismatch. Expected $EXPECTED_VERSION but got $EXTRACTED_VERSION"
            fi
          else
            echo "Error: No suitable binary found for OS: $OS"
            exit 1
          fi
      - name: Build site with ailog
        run: |
          cd my-blog
          ../bin/ailog build
      - name: List public directory
        run: |
          ls -la my-blog/public/
      - name: Deploy to Cloudflare Pages
        uses: cloudflare/pages-action@v1
        with:
          apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
          accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
          projectName: ${{ secrets.CLOUDFLARE_PROJECT_NAME }}
          directory: my-blog/public
          wranglerVersion: '3'
  cleanup:
    needs: deploy
    runs-on: ubuntu-latest
    if: success()
    steps:
      - name: Cleanup old deployments
        run: |
          curl -X PATCH \
            "https://api.cloudflare.com/client/v4/accounts/${{ secrets.CLOUDFLARE_ACCOUNT_ID }}/pages/projects/${{ secrets.CLOUDFLARE_PROJECT_NAME }}" \
            -H "Authorization: Bearer ${{ secrets.CLOUDFLARE_API_TOKEN }}" \
            -H "Content-Type: application/json" \
            -d "{ \"deployment_configs\": { \"production\": { \"deployment_retention\": ${{ env.KEEP_DEPLOYMENTS }} } } }"


@@ -0,0 +1,53 @@
name: Deploy to Cloudflare Pages
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Rust
        uses: actions-rs/toolchain@v1
        with:
          toolchain: stable
          override: true
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
      - name: Build ailog
        run: |
          cargo build --release
      - name: Build OAuth app
        run: |
          cd oauth
          npm install
          npm run build
      - name: Copy OAuth assets
        run: |
          cp -r oauth/dist/* my-blog/static/
      - name: Generate site with ailog
        run: |
          ./target/release/ailog generate --input content --output my-blog/public
      - name: Deploy to Cloudflare Pages
        uses: cloudflare/pages-action@v1
        with:
          apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
          accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
          projectName: syui-ai
          directory: my-blog/public
          gitHubToken: ${{ secrets.GITHUB_TOKEN }}


@@ -0,0 +1,28 @@
name: Example ailog usage
on:
  workflow_dispatch: # Manual trigger for testing
jobs:
  build-with-ailog-action:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Build with ailog action
        uses: ai/log@v1 # This will reference this repository
        with:
          content-dir: 'content'
          output-dir: 'public'
          ai-integration: true
          atproto-integration: true
      - name: Deploy to Cloudflare Pages
        uses: cloudflare/pages-action@v1
        with:
          apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
          accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
          projectName: my-blog
          directory: public


@@ -0,0 +1,193 @@
name: Release
on:
  push:
    tags:
      - 'v*'
  workflow_dispatch:
    inputs:
      tag:
        description: 'Release tag (e.g., v1.0.0)'
        required: true
        default: 'v0.1.0'
permissions:
  contents: write
  actions: read
env:
  CARGO_TERM_COLOR: always
  OPENSSL_STATIC: true
  OPENSSL_VENDOR: true
jobs:
  build:
    name: Build ${{ matrix.target }}
    runs-on: ${{ matrix.os }}
    timeout-minutes: 60
    strategy:
      matrix:
        include:
          - target: x86_64-unknown-linux-gnu
            os: ubuntu-latest
            artifact_name: ailog
            asset_name: ailog-linux-x86_64
          - target: aarch64-unknown-linux-gnu
            os: ubuntu-latest
            artifact_name: ailog
            asset_name: ailog-linux-aarch64
          - target: x86_64-apple-darwin
            os: macos-latest
            artifact_name: ailog
            asset_name: ailog-macos-x86_64
          - target: aarch64-apple-darwin
            os: macos-latest
            artifact_name: ailog
            asset_name: ailog-macos-aarch64
    steps:
      - uses: actions/checkout@v4
      - name: Setup Rust
        uses: dtolnay/rust-toolchain@stable
        with:
          targets: ${{ matrix.target }}
      - name: Install cross-compilation tools (Linux)
        if: matrix.os == 'ubuntu-latest' && matrix.target == 'aarch64-unknown-linux-gnu'
        run: |
          sudo apt-get update
          sudo apt-get install -y gcc-aarch64-linux-gnu binutils-aarch64-linux-gnu
      - name: Configure cross-compilation (Linux ARM64)
        if: matrix.target == 'aarch64-unknown-linux-gnu'
        run: |
          echo '[target.aarch64-unknown-linux-gnu]' >> ~/.cargo/config.toml
          echo 'linker = "aarch64-linux-gnu-gcc"' >> ~/.cargo/config.toml
      - name: Cache cargo registry
        uses: actions/cache@v4
        with:
          path: |
            ~/.cargo/registry
            ~/.cargo/git
          key: ${{ runner.os }}-${{ matrix.target }}-cargo-${{ hashFiles('**/Cargo.lock') }}
      - name: Cache target directory
        uses: actions/cache@v4
        with:
          path: target
          key: ${{ runner.os }}-${{ matrix.target }}-target-${{ hashFiles('**/Cargo.lock') }}
      - name: Build
        run: cargo build --release --target ${{ matrix.target }}
      - name: Prepare binary
        shell: bash
        run: |
          cd target/${{ matrix.target }}/release
          # Use appropriate strip command for cross-compilation
          if [[ "${{ matrix.target }}" == "aarch64-unknown-linux-gnu" ]]; then
            aarch64-linux-gnu-strip ${{ matrix.artifact_name }} || echo "Strip failed, continuing..."
          elif [[ "${{ matrix.os }}" == "windows-latest" ]]; then
            strip ${{ matrix.artifact_name }} || echo "Strip failed, continuing..."
          else
            strip ${{ matrix.artifact_name }} || echo "Strip failed, continuing..."
          fi
          # Create archive
          if [[ "${{ matrix.target }}" == *"windows"* ]]; then
            7z a ../../../${{ matrix.asset_name }}.zip ${{ matrix.artifact_name }}
          else
            tar czvf ../../../${{ matrix.asset_name }}.tar.gz ${{ matrix.artifact_name }}
          fi
      - name: Upload binary
        uses: actions/upload-artifact@v4
        with:
          name: ${{ matrix.asset_name }}
          path: ${{ matrix.asset_name }}.tar.gz
  release:
    name: Create Release
    needs: build
    runs-on: ubuntu-latest
    permissions:
      contents: write
      actions: read
    steps:
      - uses: actions/checkout@v4
      - name: Download all artifacts
        uses: actions/download-artifact@v4
        with:
          path: artifacts
      - name: Generate release notes
        run: |
          echo "## What's Changed" > release_notes.md
          echo "" >> release_notes.md
          echo "### Features" >> release_notes.md
          echo "- AI-powered static blog generator" >> release_notes.md
          echo "- AtProto OAuth integration" >> release_notes.md
          echo "- Automatic translation support" >> release_notes.md
          echo "- AI comment system" >> release_notes.md
          echo "" >> release_notes.md
          echo "### Platforms" >> release_notes.md
          echo "- Linux (x86_64, aarch64)" >> release_notes.md
          echo "- macOS (Intel, Apple Silicon)" >> release_notes.md
          echo "" >> release_notes.md
          echo "### Installation" >> release_notes.md
          echo "\`\`\`bash" >> release_notes.md
          echo "# Linux/macOS" >> release_notes.md
          echo "tar -xzf ailog-linux-x86_64.tar.gz" >> release_notes.md
          echo "chmod +x ailog" >> release_notes.md
          echo "sudo mv ailog /usr/local/bin/" >> release_notes.md
          echo "" >> release_notes.md
          echo "\`\`\`" >> release_notes.md
      - name: Get tag name
        id: tag_name
        run: |
          if [[ "${{ github.event_name }}" == "workflow_dispatch" ]]; then
            echo "tag=${{ github.event.inputs.tag }}" >> $GITHUB_OUTPUT
          else
            echo "tag=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
          fi
      - name: Create Release with Gitea API
        run: |
          # Prepare release files
          mkdir -p release
          find artifacts -name "*.tar.gz" -exec cp {} release/ \;
          # Create release via Gitea API
          RELEASE_RESPONSE=$(curl -X POST \
            "${{ github.server_url }}/api/v1/repos/${{ github.repository }}/releases" \
            -H "Authorization: token ${{ github.token }}" \
            -H "Content-Type: application/json" \
            -d '{
              "tag_name": "${{ steps.tag_name.outputs.tag }}",
              "name": "ailog ${{ steps.tag_name.outputs.tag }}",
              "body": "'"$(cat release_notes.md | sed 's/"/\\"/g' | tr '\n' ' ')"'",
              "draft": false,
              "prerelease": '"$(if echo "${{ steps.tag_name.outputs.tag }}" | grep -E "(alpha|beta|rc)"; then echo "true"; else echo "false"; fi)"'
            }')
          # Get release ID
          RELEASE_ID=$(echo "$RELEASE_RESPONSE" | jq -r '.id')
          echo "Created release with ID: $RELEASE_ID"
          # Upload release assets
          for file in release/*.tar.gz; do
            if [ -f "$file" ]; then
              filename=$(basename "$file")
              echo "Uploading $filename..."
              curl -X POST \
                "${{ github.server_url }}/api/v1/repos/${{ github.repository }}/releases/$RELEASE_ID/assets?name=$filename" \
                -H "Authorization: token ${{ github.token }}" \
                -H "Content-Type: application/octet-stream" \
                --data-binary @"$file"
            fi
          done

.github/workflows/cloudflare-pages.yml

@@ -0,0 +1,169 @@
name: Deploy to Cloudflare Pages
on:
  push:
    branches:
      - main
  workflow_dispatch:
env:
  OAUTH_DIR: oauth
  KEEP_DEPLOYMENTS: 5
jobs:
  deploy:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      deployments: write
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '25'
      - name: Install dependencies
        run: |
          cd ${{ env.OAUTH_DIR }}
          npm install
      - name: Build OAuth app
        run: |
          cd ${{ env.OAUTH_DIR }}
          NODE_ENV=production npm run build
      - name: Copy OAuth build to static
        run: |
          rm -rf my-blog/static/assets
          cp -rf ${{ env.OAUTH_DIR }}/dist/* my-blog/static/
          cp ${{ env.OAUTH_DIR }}/dist/index.html my-blog/templates/oauth-assets.html
      - name: Build PDS app
        run: |
          cd pds
          npm install
          npm run build
      - name: Copy PDS build to static
        run: |
          rm -rf my-blog/static/pds
          cp -rf pds/dist my-blog/static/pds
      - name: Cache ailog binary
        uses: actions/cache@v4
        with:
          path: ./bin
          key: ailog-bin-${{ runner.os }}-v${{ hashFiles('Cargo.toml') }}
          restore-keys: |
            ailog-bin-${{ runner.os }}-v
      - name: Setup ailog binary
        run: |
          # Get expected version from Cargo.toml
          EXPECTED_VERSION=$(grep '^version' Cargo.toml | cut -d'"' -f2)
          echo "Expected version from Cargo.toml: $EXPECTED_VERSION"
          # Check current binary version if exists
          if [ -f "./bin/ailog" ]; then
            CURRENT_VERSION=$(./bin/ailog --version 2>/dev/null || echo "unknown")
            echo "Current binary version: $CURRENT_VERSION"
          else
            CURRENT_VERSION="none"
            echo "No binary found"
          fi
          # Check OS
          OS="${{ runner.os }}"
          echo "Runner OS: $OS"
          # Use pre-packaged binary if version matches or extract from tar.gz
          if [ "$CURRENT_VERSION" = "$EXPECTED_VERSION" ]; then
            echo "Binary is up to date"
            chmod +x ./bin/ailog
          elif [ "$OS" = "Linux" ] && [ -f "./bin/ailog-linux-x86_64.tar.gz" ]; then
            echo "Extracting ailog from pre-packaged tar.gz..."
            cd bin
            tar -xzf ailog-linux-x86_64.tar.gz
            chmod +x ailog
            cd ..
            # Verify extracted version
            EXTRACTED_VERSION=$(./bin/ailog --version 2>/dev/null || echo "unknown")
            echo "Extracted binary version: $EXTRACTED_VERSION"
            if [ "$EXTRACTED_VERSION" != "$EXPECTED_VERSION" ]; then
              echo "Warning: Binary version mismatch. Expected $EXPECTED_VERSION but got $EXTRACTED_VERSION"
            fi
          else
            echo "Error: No suitable binary found for OS: $OS"
exit 1
fi
- name: Build site with ailog
run: |
cd my-blog
../bin/ailog build
- name: List public directory
run: |
ls -la my-blog/public/
- name: Deploy to Cloudflare Pages
uses: cloudflare/pages-action@v1
with:
apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
projectName: ${{ secrets.CLOUDFLARE_PROJECT_NAME }}
directory: my-blog/public
gitHubToken: ${{ secrets.GITHUB_TOKEN }}
wranglerVersion: '3'
cleanup:
needs: deploy
runs-on: ubuntu-latest
if: success()
steps:
- name: Cleanup old deployments
run: |
curl -X PATCH \
"https://api.cloudflare.com/client/v4/accounts/${{ secrets.CLOUDFLARE_ACCOUNT_ID }}/pages/projects/${{ secrets.CLOUDFLARE_PROJECT_NAME }}" \
-H "Authorization: Bearer ${{ secrets.CLOUDFLARE_API_TOKEN }}" \
-H "Content-Type: application/json" \
-d "{ \"deployment_configs\": { \"production\": { \"deployment_retention\": ${{ env.KEEP_DEPLOYMENTS }} } } }"
# Get all deployments
DEPLOYMENTS=$(curl -s -X GET \
"https://api.cloudflare.com/client/v4/accounts/${{ secrets.CLOUDFLARE_ACCOUNT_ID }}/pages/projects/${{ secrets.CLOUDFLARE_PROJECT_NAME }}/deployments" \
-H "Authorization: Bearer ${{ secrets.CLOUDFLARE_API_TOKEN }}" \
-H "Content-Type: application/json")
# Extract deployment IDs (skip the latest N deployments)
DEPLOYMENT_IDS=$(echo "$DEPLOYMENTS" | jq -r ".result | sort_by(.created_on) | reverse | .[${{ env.KEEP_DEPLOYMENTS }}:] | .[].id // empty")
if [ -z "$DEPLOYMENT_IDS" ]; then
echo "No old deployments to delete"
exit 0
fi
# Delete old deployments
for ID in $DEPLOYMENT_IDS; do
echo "Deleting deployment: $ID"
RESPONSE=$(curl -s -X DELETE \
"https://api.cloudflare.com/client/v4/accounts/${{ secrets.CLOUDFLARE_ACCOUNT_ID }}/pages/projects/${{ secrets.CLOUDFLARE_PROJECT_NAME }}/deployments/$ID" \
-H "Authorization: Bearer ${{ secrets.CLOUDFLARE_API_TOKEN }}" \
-H "Content-Type: application/json")
SUCCESS=$(echo "$RESPONSE" | jq -r '.success')
if [ "$SUCCESS" = "true" ]; then
echo "Successfully deleted deployment: $ID"
else
echo "Failed to delete deployment: $ID"
echo "$RESPONSE" | jq .
fi
sleep 1 # Rate limiting
done
echo "Cleanup completed!"


@@ -0,0 +1,92 @@
name: github pages (fast)
on:
push:
branches:
- main
paths-ignore:
- 'src/**'
- 'Cargo.toml'
- 'Cargo.lock'
jobs:
build-deploy:
runs-on: ubuntu-latest
permissions:
contents: write
pages: write
id-token: write
steps:
- uses: actions/checkout@v4
- name: Cache ailog binary
uses: actions/cache@v4
with:
path: ./bin
key: ailog-bin-${{ runner.os }}
restore-keys: |
ailog-bin-${{ runner.os }}
- name: Setup ailog binary
run: |
# Get expected version from Cargo.toml
EXPECTED_VERSION=$(grep '^version' Cargo.toml | cut -d'"' -f2)
echo "Expected version from Cargo.toml: $EXPECTED_VERSION"
# Check current binary version if exists
if [ -f "./bin/ailog" ]; then
CURRENT_VERSION=$(./bin/ailog --version 2>/dev/null || echo "unknown")
echo "Current binary version: $CURRENT_VERSION"
else
CURRENT_VERSION="none"
echo "No binary found"
fi
# Check OS
OS="${{ runner.os }}"
echo "Runner OS: $OS"
# Use pre-packaged binary if version matches or extract from tar.gz
if [ "$CURRENT_VERSION" = "$EXPECTED_VERSION" ]; then
echo "Binary is up to date"
chmod +x ./bin/ailog
elif [ "$OS" = "Linux" ] && [ -f "./bin/ailog-linux-x86_64.tar.gz" ]; then
echo "Extracting ailog from pre-packaged tar.gz..."
cd bin
tar -xzf ailog-linux-x86_64.tar.gz
chmod +x ailog
cd ..
# Verify extracted version
EXTRACTED_VERSION=$(./bin/ailog --version 2>/dev/null || echo "unknown")
echo "Extracted binary version: $EXTRACTED_VERSION"
if [ "$EXTRACTED_VERSION" != "$EXPECTED_VERSION" ]; then
echo "Warning: Binary version mismatch. Expected $EXPECTED_VERSION but got $EXTRACTED_VERSION"
fi
else
echo "Error: No suitable binary found for OS: $OS"
exit 1
fi
- name: Setup Hugo
uses: peaceiris/actions-hugo@v3
with:
hugo-version: "0.139.2"
extended: true
- name: Build with ailog
env:
TZ: "Asia/Tokyo"
run: |
# Use pre-built ailog binary instead of cargo build
cd my-blog
../bin/ailog build
touch ./public/.nojekyll
- name: Deploy
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./my-blog/public
publish_branch: gh-pages

.github/workflows/release.yml

@@ -0,0 +1,170 @@
name: Release
on:
push:
tags:
- 'v*'
workflow_dispatch:
inputs:
tag:
description: 'Release tag (e.g., v1.0.0)'
required: true
default: 'v0.1.0'
permissions:
contents: write
actions: read
env:
CARGO_TERM_COLOR: always
OPENSSL_STATIC: true
OPENSSL_VENDOR: true
jobs:
build:
name: Build ${{ matrix.target }}
runs-on: ${{ matrix.os }}
timeout-minutes: 60
strategy:
matrix:
include:
- target: x86_64-unknown-linux-gnu
os: ubuntu-latest
artifact_name: ailog
asset_name: ailog-linux-x86_64
- target: aarch64-unknown-linux-gnu
os: ubuntu-latest
artifact_name: ailog
asset_name: ailog-linux-aarch64
- target: x86_64-apple-darwin
os: macos-latest
artifact_name: ailog
asset_name: ailog-macos-x86_64
- target: aarch64-apple-darwin
os: macos-latest
artifact_name: ailog
asset_name: ailog-macos-aarch64
steps:
- uses: actions/checkout@v4
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
with:
targets: ${{ matrix.target }}
- name: Install cross-compilation tools (Linux)
if: matrix.os == 'ubuntu-latest' && matrix.target == 'aarch64-unknown-linux-gnu'
run: |
sudo apt-get update
sudo apt-get install -y gcc-aarch64-linux-gnu binutils-aarch64-linux-gnu
- name: Configure cross-compilation (Linux ARM64)
if: matrix.target == 'aarch64-unknown-linux-gnu'
run: |
echo '[target.aarch64-unknown-linux-gnu]' >> ~/.cargo/config.toml
echo 'linker = "aarch64-linux-gnu-gcc"' >> ~/.cargo/config.toml
- name: Cache cargo registry
uses: actions/cache@v4
with:
path: |
~/.cargo/registry
~/.cargo/git
key: ${{ runner.os }}-${{ matrix.target }}-cargo-${{ hashFiles('**/Cargo.lock') }}
- name: Cache target directory
uses: actions/cache@v4
with:
path: target
key: ${{ runner.os }}-${{ matrix.target }}-target-${{ hashFiles('**/Cargo.lock') }}
- name: Build
run: cargo build --release --target ${{ matrix.target }}
- name: Prepare binary
shell: bash
run: |
cd target/${{ matrix.target }}/release
# Use appropriate strip command for cross-compilation
if [[ "${{ matrix.target }}" == "aarch64-unknown-linux-gnu" ]]; then
aarch64-linux-gnu-strip ${{ matrix.artifact_name }} || echo "Strip failed, continuing..."
elif [[ "${{ matrix.os }}" == "windows-latest" ]]; then
strip ${{ matrix.artifact_name }} || echo "Strip failed, continuing..."
else
strip ${{ matrix.artifact_name }} || echo "Strip failed, continuing..."
fi
# Create archive
if [[ "${{ matrix.target }}" == *"windows"* ]]; then
7z a ../../../${{ matrix.asset_name }}.zip ${{ matrix.artifact_name }}
else
tar czvf ../../../${{ matrix.asset_name }}.tar.gz ${{ matrix.artifact_name }}
fi
- name: Upload binary
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.asset_name }}
path: ${{ matrix.asset_name }}.tar.gz
release:
name: Create Release
needs: build
runs-on: ubuntu-latest
permissions:
contents: write
actions: read
steps:
- uses: actions/checkout@v4
- name: Download all artifacts
uses: actions/download-artifact@v4
with:
path: artifacts
- name: Generate release notes
run: |
echo "## What's Changed" > release_notes.md
echo "" >> release_notes.md
echo "### Features" >> release_notes.md
echo "- AI-powered static blog generator" >> release_notes.md
echo "- AtProto OAuth integration" >> release_notes.md
echo "- Automatic translation support" >> release_notes.md
echo "- AI comment system" >> release_notes.md
echo "" >> release_notes.md
echo "### Platforms" >> release_notes.md
echo "- Linux (x86_64, aarch64)" >> release_notes.md
echo "- macOS (Intel, Apple Silicon)" >> release_notes.md
echo "" >> release_notes.md
echo "### Installation" >> release_notes.md
echo "\`\`\`bash" >> release_notes.md
echo "# Linux/macOS" >> release_notes.md
echo "tar -xzf ailog-linux-x86_64.tar.gz" >> release_notes.md
echo "chmod +x ailog" >> release_notes.md
echo "sudo mv ailog /usr/local/bin/" >> release_notes.md
echo "" >> release_notes.md
echo "\`\`\`" >> release_notes.md
- name: Get tag name
id: tag_name
run: |
if [[ "${{ github.event_name }}" == "workflow_dispatch" ]]; then
echo "tag=${{ github.event.inputs.tag }}" >> $GITHUB_OUTPUT
else
echo "tag=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
fi
- name: Create Release
uses: softprops/action-gh-release@v1
with:
tag_name: ${{ steps.tag_name.outputs.tag }}
name: ailog ${{ steps.tag_name.outputs.tag }}
body_path: release_notes.md
draft: false
prerelease: ${{ contains(steps.tag_name.outputs.tag, 'alpha') || contains(steps.tag_name.outputs.tag, 'beta') || contains(steps.tag_name.outputs.tag, 'rc') }}
files: artifacts/*/ailog-*.tar.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

.gitignore

@@ -1,7 +1,12 @@
/target
/Cargo.lock
/public
*.swp
*.swo
*~
.DS_Store
/dist
/repos
/pds/dist
.DS_Store
.config
.claude
node_modules
package-lock.json
claude.md


@@ -1,9 +1,9 @@
[package]
name = "ailog"
version = "0.1.0"
version = "0.3.5"
edition = "2021"
authors = ["syui"]
description = "A static blog generator with AI features"
description = "static site generator for atproto"
license = "MIT"
[dependencies]
@@ -11,27 +11,11 @@ clap = { version = "4.5", features = ["derive"] }
pulldown-cmark = "0.11"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
tokio = { version = "1.40", features = ["full"] }
tokio = { version = "1.40", features = ["rt-multi-thread", "macros", "fs"] }
anyhow = "1.0"
toml = "0.8"
reqwest = { version = "0.12", features = ["json", "rustls-tls"], default-features = false }
dirs = "5.0"
chrono = "0.4"
tera = "1.20"
walkdir = "2.5"
gray_matter = "0.2"
fs_extra = "1.3"
colored = "2.1"
serde_yaml = "0.9"
syntect = "5.2"
reqwest = { version = "0.12", features = ["json"] }
rand = "0.8"
sha2 = "0.10"
base64 = "0.22"
uuid = { version = "1.11", features = ["v4"] }
urlencoding = "2.1"
axum = "0.7"
tower = "0.5"
tower-http = { version = "0.5", features = ["cors", "fs"] }
hyper = { version = "1.0", features = ["full"] }
[dev-dependencies]
tempfile = "3.14"
tower-http = { version = "0.5", features = ["fs"] }


@@ -1,87 +1,4 @@
# ai.log
# ailog
A Rust-based static blog generator with AI integration capabilities.
`bundle: ai.syui.log`
## Overview
ai.log is part of the ai ecosystem - a static site generator that creates blogs with built-in AI features for content enhancement and atproto integration.
## Features
- Static blog generation (inspired by Zola)
- AI-powered article editing and enhancement
- Automatic translation (ja → en)
- AI comment system integrated with atproto
- OAuth authentication via atproto accounts
## Installation
```bash
cargo install ailog
```
## Usage
```bash
# Initialize a new blog
ailog init myblog
# Create a new post
ailog new "My First Post"
# Build the blog
ailog build
# Serve locally
ailog serve
# Clean build files
ailog clean
```
## Configuration
Configuration files are stored in `~/.config/syui/ai/log/`
## AI Integration (Planned)
- Automatic content suggestions and corrections
- Multi-language support with AI translation
- AI-generated comments linked to atproto accounts
## atproto Integration (Planned)
Implements OAuth 2.0 for user authentication:
- Users can comment using their atproto accounts
- Comments are stored in atproto collections
- Full data sovereignty for users
## Build & Deploy
Designed for GitHub Actions and Cloudflare Pages deployment. Push to main branch triggers automatic build and deploy.
## Development Status
Currently implemented:
- ✅ Project structure and Cargo.toml setup
- ✅ Basic command-line interface (init, new, build, serve, clean)
- ✅ Configuration system with TOML support
- ✅ Markdown parsing with frontmatter support
- ✅ Template system with Handlebars
- ✅ Static site generation with posts and pages
- ✅ Development server with hot reload
- ✅ AI integration foundation (GPT client, translator, comment system)
- ✅ atproto client with OAuth support
- ✅ MCP server integration for AI tools
- ✅ Test blog with sample content and styling
Planned features:
- AI-powered content enhancement and suggestions
- Automatic translation (ja → en) pipeline
- atproto comment system with OAuth authentication
- Advanced template customization
- Plugin system for extensibility
## License
© syui

claude.md

@@ -1,478 +0,0 @@
# Ecosystem Integration Design Document
## Core Philosophy
- **Existence-particle theory**: the pursuit of the smallest thing in this world (the existence particle, "ai")
- **Uniqueness principle**: guarantee the uniqueness of each real individual across all systems
- **Reflection of reality**: a cyclical influence of reality → game → reality
## System Architecture
```
existence particle (ai) - the smallest unit of consciousness
[ai.moji] character system
[ai.os] + [ai.game device] ← integrated hardware
├── ai.shell (Claude Code-like functionality)
├── ai.gpt (autonomous persona / memory system)
├── ai.log (AI-integrated blog system)
├── ai.ai (personalized, mind-reading AI)
├── ai.card (card game; iOS/Web/API)
└── ai.bot (decentralized SNS integration, card distribution)
[ai.verse] metaverse
├── world system (planetary 3D world)
├── at system (atproto / decentralized SNS)
├── yui system (uniqueness guarantee)
└── ai system (existence attributes)
```
## Naming Conventions
The naming conventions are shared across all projects. An example follows; please adhere to these rules.
Here is the convention illustrated for `ai.os`:
name: ai.os
- **[ "package", "code", "command" ]**: aios
- **[ "dir", "url" ]**: ai/os
- **[ "domain", "json" ]**: ai.os
```sh
$ curl -sL https://git.syui.ai/ai/ai/raw/branch/main/ai.json|jq .ai.os
{ "type": "os" }
```
```json
{
"ai": {
"os":{}
}
}
```
Every other project uses the same convention; for `ai.gpt`, the package name is `aigpt`.
## config (configuration files, env, environment-specific settings)
The location of `config` is unified, using each project's `dir` entry from the naming conventions. For aios, for example, it is `~/.config/syui/ai/os/`. When using Python, build the environment into this package config dir (e.g. with `python -m venv`) and run from there.
Because the domain format is used and I host every project at `git.syui.ai/ai`, the base directory is `~/.config/syui/ai`.
```sh
[syui.ai]
syui/ai
```
```sh
# example
~/.config/syui/ai
├── card
├── gpt
├── os
└── shell
```
## System Details
### ai.gpt - Autonomous Messaging AI
**Purpose**: spontaneous communication based on relationships
**Core concepts**:
- **Persona**: composed of memory (past utterances) and relationship parameters
- **Uniqueness**: 1:1 binding to an atproto account, immutable
- **Autonomous messaging**: messaging unlocks once a relationship crosses a threshold
**Technical components**:
- `MemoryManager`: full log → AI summary → core judgment → selective forgetting
- `RelationshipTracker`: relationship score with time decay and daily limits
- `TransmissionController`: threshold checks and send triggers
- `Persona`: personality variation driven by a random daily AI fortune (1-10)
**Implementation spec**:
```
- Language: Python (fastapi_mcp)
- Storage: JSON/SQLite (selectable)
- Interface: Python CLI (click/typer)
- Scheduling: cron-like autonomous processing
```
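As a rough sketch of how `RelationshipTracker`'s time decay and the transmission threshold could interact (illustrative only; the class shape, parameter names, and numbers are assumptions, not the actual implementation):

```python
import math
import time

class RelationshipTracker:
    """Toy sketch: exponential time decay plus a transmission threshold."""

    def __init__(self, half_life_days=30.0, threshold=10.0):
        # decay constant so the score halves every `half_life_days`
        self.decay = math.log(2) / (half_life_days * 86400)
        self.threshold = threshold
        self.score = 0.0
        self.updated = time.time()

    def _apply_decay(self, now):
        self.score *= math.exp(-self.decay * (now - self.updated))
        self.updated = now

    def add_interaction(self, weight, now=None):
        now = time.time() if now is None else now
        self._apply_decay(now)
        self.score += weight

    def can_transmit(self, now=None):
        now = time.time() if now is None else now
        self._apply_decay(now)
        return self.score >= self.threshold

tracker = RelationshipTracker(half_life_days=30, threshold=10)
tracker.add_interaction(6, now=0)
tracker.add_interaction(6, now=0)
print(tracker.can_transmit(now=0))           # → True
print(tracker.can_transmit(now=90 * 86400))  # → False (score decayed over 3 half-lives)
```

The key design point is that decay is applied lazily on every read/write, so no background job is needed to age the scores.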
### ai.card - Card Game System
**Purpose**: a card game with user data sovereignty, built on atproto
**Current status**:
- Implemented as a feature of ai.bot
- Mentioning with an atproto account yields one card per day
- User management via ai.api (planned MCP server)
**Migration plan**:
- **iOS port**: to be handled by Claude
- **Data storage**: saved in atproto collection records; users own their data
- **Cheat prevention**: OAuth 2.1 scopes (pending implementation) + MCP server
- **Image files**: Cloudflare Pages is the best fit
**yui system applied**:
- Card effects are account-specific
- Tamper resistance preserves game balance
- Future integration with ai.verse will link cards to unique skills
### ai.ai - Mind-Reading AI
**Purpose**: personalized AI for deep understanding
**Relationship to ai.gpt**:
- ai.gpt → ai.ai: handoff from the autonomous messaging AI to the psychological-analysis AI
- Deep analysis of relationship parameters
- Support for identifying the core of a user's thinking
### ai.verse - UE Metaverse
**Purpose**: a 3D world that reflects reality
**yui system implementation**:
- 1:1 binding of character ↔ player
- unique skill: usable only by that player
- Other players cannot use the same skill even with the same character
**Integration points**:
- ai.card: cards as in-game items
- ai.gpt: autonomous AI personas as NPCs
- atproto: in-game profile integration
## Data Flow Design
### Implementing the Uniqueness Guarantee
```
real individual → atproto account (DID) → in-game avatar → unique skill
↑_______________________________| (reflection of reality)
```
### AI-Driven Conversion System
```
play / creative activity → ai.gpt analysis → converted into work output → corporate value
↑________________________| (Play-to-Work)
```
### Card Game / Data Sovereignty Flow
```
user → ai.bot mention → card generation → atproto collection → user-owned
↑ ↓
← iOS app display ← ai.card API ←
```
## Technology Stack Integration
### Core Infrastructure
- **OS**: Rust-based ai.os (Arch Linux base)
- **Container**: Docker image distribution
- **Identity**: self-hosted atproto server + DID management
- **AI**: fastapi_mcp server architecture
- **CLI**: unified Python (click/typer), migrated from Rust
### Game Engine Integration
- **Engine**: Unreal Engine (Blueprint)
- **Data**: atproto → UE → atproto sync
- **Avatar**: decentralized SNS profile → 3D character
- **Streaming**: game screen = broadcast screen
### Mobile/Device
- **iOS**: ai.card port (handled by Claude)
- **Hardware**: ai.game device (future)
- **Interface**: controller-first design
## Implementation Priorities
### Phase 1: Strengthening the AI Foundation (in progress)
- [ ] Complete the ai.gpt memory system
- Memory tiering (full log → summary → core → forgetting)
- Time-decaying relationship parameters
- Personality variation via AI fortune
- [ ] ai.card iOS port
- atproto collection record integration
- Rework ai.api as an MCP server
- [ ] Build the unified fastapi_mcp foundation
### Phase 2: Game Integration
- [ ] Implement the ai.verse yui system
- unique skill feature
- Stronger atproto integration
- [ ] ai.gpt ↔ ai.ai integration
- [ ] Decentralized SNS ↔ game sync
### Phase 3: Metaverse Expansion
- [ ] Integrated VTuber streaming
- [ ] Play-to-Work conversion system
- [ ] ai.game device prototype
## Future Integration Plans
### Inter-System Links (currently independent implementations)
```
ai.gpt (autonomous messaging) ←→ ai.ai (psychological analysis)
ai.card (iOS, Web, API) ←→ ai.verse (UE game world)
```
**Shared foundation**: fastapi_mcp
**Shared philosophy**: yui system (reflection of reality, uniqueness guarantee)
### Anti-Tampering Strategy
- **Short term**: verification via the MCP server
- **Mid term**: awaiting OAuth 2.1 scope implementation
- **Long term**: blockchain-style integrity checks
## Optimizing AI Communication
### Project Requirements Template
```markdown
# [Project Name] Requirements
## Philosophical Background
- Relation to existence-particle theory:
- Scope of the yui system:
- How reality is reflected:
## Technical Requirements
- Technology (unified on fastapi_mcp):
- atproto integration method:
- Data persistence method:
## User Stories
1. When the user does ...
2. The system performs ...
3. As a result, ... is achieved
## Success Metrics
- Technical:
- Philosophical (uniqueness guarantee):
```
### Claude Code Strategy
1. **Start small**: begin by extending ai.gpt's MCP features
2. **Integrate incrementally**: finish each system individually before integrating
3. **Philosophical consistency**: verify alignment with the yui system in every implementation
4. **Reflect reality**: always state how an implementation connects reality and the game
## Development Notes
### MCP Server Design Guidelines
- Each AI (gpt, card, ai, bot) is an independent MCP server
- Unified on the fastapi_mcp foundation
- Authentication and authorization via atproto DID
### Memory and Data Management
- **ai.gpt**: prioritize the irreversibility of relationships
- **ai.card**: prioritize user data sovereignty
- **ai.verse**: prioritize game-world consistency
### Implementing the Uniqueness Guarantee
- 1:1 binding to an atproto account is mandatory
- Immutability guaranteed by hashes and signatures
- Technically prevent reproduction in other systems
## Continuous Improvement
- Refer to this design document in every project
- Check consistency with the yui system when adding features
- Assess impact on other systems in advance
- Plan a phased migration when introducing Claude Code
## ai.gpt Deep Design Philosophy
### Irreversibility of the Persona
- **A destroyed relationship cannot be repaired**: it carries the same weight as a real human relationship
- **Selective forgetting**: unimportant information is forgotten, but core memories persist
- **Time decay**: every parameter decays naturally over time
### AI Fortune System
- A random daily value from 1-10 varies the persona
- Breakthrough conditions triggered by consecutive good or bad fortune
- Personality formation as an environmental factor
### Memory Hierarchy
1. **Full log**: record every conversation
2. **AI summary**: extract and compress the important parts
3. **Core-belief judgment**: identify what is essential to the user
4. **Selective forgetting**: gradually delete low-importance information
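The selective-forgetting tier could be sketched as a pruning pass over scored memories (illustrative only; the `importance` and `core` fields are hypothetical, not the MemoryManager schema):

```python
def forget_selectively(memories, keep_core=True, importance_floor=0.3):
    """Drop low-importance memories; core memories always survive.

    `memories` is a list of dicts with an "importance" score in [0, 1]
    and a boolean "core" flag (made-up fields for illustration).
    """
    return [
        m for m in memories
        if (keep_core and m.get("core")) or m["importance"] >= importance_floor
    ]

memories = [
    {"text": "greeting", "importance": 0.1, "core": False},
    {"text": "core belief", "importance": 0.2, "core": True},
    {"text": "project plan", "importance": 0.8, "core": False},
]
print([m["text"] for m in forget_selectively(memories)])
# → ['core belief', 'project plan']
```

Note that the core flag overrides the importance floor, matching the rule above that core memories persist regardless of score.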
### Key Implementation Decisions
- **Unified language**: Python (fastapi_mcp); the CLI uses click/typer
- **Data format**: JSON/SQLite (selectable)
- **Authentication**: uniqueness guaranteed via atproto DID
- **Phased implementation**: conversation → memory → relationships → messaging, in that order
### Phased Rollout of the Messaging Feature
- **Phase 1**: print output in the CLI (current)
- **Phase 2**: direct posting to atproto
- **Phase 3**: integration with ai.bot (Rust/seahorse)
- **Future**: multi-channel support (SNS, webhooks, etc.)
## ai.gpt Implementation Status (2025/01/06)
### Completed Features
- Hierarchical memory system (MemoryManager)
- Irreversible relationship system (RelationshipTracker)
- AI fortune system (FortuneSystem)
- Integrated persona system (Persona)
- Scheduler (5 task types)
- MCP server (9 tools)
- Configuration management (~/.config/aigpt/)
- All CLI commands implemented
### Next Development Points
- See `ai_gpt/DEVELOPMENT_STATUS.md`
- Autonomous messaging: implement atproto support in transmission.py
- ai.bot integration: create a new bot_connector.py
- Tests: add a tests/ directory
## This Project Is ai.log
This project corresponds to `ai.log`.
Its package and code name is `ailog`.
```sh
$ curl -sL https://git.syui.ai/ai/ai/raw/branch/main/ai.json|jq .ai.log
{
  "type": "blog",
  "text": "The blog is currently built with hugo. Using claude code, build a Rust static blog generator and add AI features: article correction, information enrichment, and auto-translation of lang:ja to generate lang:en (pages), plus a comment section where the AI leaves a one-line comment. The comment section integrates with atproto, so users log in with their atproto accounts via OAuth to post."
}
```
We will build the static blog generator in Rust. `zola` is a useful reference.
- https://github.com/getzola/zola
For the atproto integration, the `ai.card` implementation of `atproto/oauth` is a good reference.
- https://github.com/bluesky-social/atproto/blob/main/packages/api/OAUTH.md
```json
{
"client_id": "https://example.com/client-metadata.json",
"client_name": "Example atproto Browser App",
"client_uri": "https://example.com",
"logo_uri": "https://example.com/logo.png",
"tos_uri": "https://example.com/tos",
"policy_uri": "https://example.com/policy",
"redirect_uris": ["https://example.com/callback"],
"scope": "atproto",
"grant_types": ["authorization_code", "refresh_token"],
"response_types": ["code"],
"token_endpoint_auth_method": "none",
"application_type": "web",
"dpop_bound_access_tokens": true
}
```
```js
// package
import { Agent } from '@atproto/api'
import { BrowserOAuthClient } from '@atproto/oauth-client-browser'
async function main() {
const oauthClient = await BrowserOAuthClient.load({
clientId: '<YOUR_CLIENT_ID>',
handleResolver: 'https://bsky.social/',
})
// TO BE CONTINUED
}
document.addEventListener('DOMContentLoaded', main)
// client
const result = await oauthClient.init()
if (result) {
if ('state' in result) {
console.log('The user was just redirected back from the authorization page')
}
console.log(`The user is currently signed in as ${result.session.did}`)
}
const session = result?.session
// session
if (session) {
const agent = new Agent(session)
const fetchProfile = async () => {
const profile = await agent.getProfile({ actor: agent.did })
return profile.data
}
// Update the user interface
document.body.textContent = `Authenticated as ${agent.did}`
const profileBtn = document.createElement('button')
document.body.appendChild(profileBtn)
profileBtn.textContent = 'Fetch Profile'
profileBtn.onclick = async () => {
const profile = await fetchProfile()
outputPre.textContent = JSON.stringify(profile, null, 2)
}
const logoutBtn = document.createElement('button')
document.body.appendChild(logoutBtn)
logoutBtn.textContent = 'Logout'
logoutBtn.onclick = async () => {
await session.signOut()
window.location.reload()
}
const outputPre = document.createElement('pre')
document.body.appendChild(outputPre)
}
```
For AI integration, see `ai.gpt`.
- https://git.syui.ai/ai/gpt
It contains a `claude.md`; read `../gpt/claude.md`.
### build deploy
The setup assumes `github-actions` and `cloudflare pages`.
Build, deploy, and AI processing all run automatically when an article is pushed.
```yml
# .github/workflows/gh-pages.yml
name: github pages
on:
push:
branches:
- main
jobs:
build-deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Hugo
uses: peaceiris/actions-hugo@v3
with:
hugo-version: "0.139.2"
extended: true
- name: Build
env:
TZ: "Asia/Tokyo"
run: |
hugo version
TZ=Asia/Tokyo hugo
touch ./public/.nojekyll
- name: Deploy
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./public
publish_branch: gh-pages
```
# footer
© syui


@@ -0,0 +1,14 @@
# Test Post (Updated!)
This is a test blog post for ailog.
## Features
- atproto integration
- Static site generation
- at browser support
- Hash-based rkey (TID: 3mbvk36vj2k2y)
Let's see how it renders!
**Updated:** This post was updated to test the mapping.json feature.


@@ -0,0 +1,36 @@
{
"lexicon": 1,
"id": "ai.syui.log.comment",
"defs": {
"main": {
"type": "record",
"description": "Record containing a comment.",
"key": "tid",
"record": {
"type": "object",
"required": ["content", "createdAt", "post"],
"properties": {
"content": {
"type": "string",
"maxLength": 100000,
"maxGraphemes": 10000,
"description": "The content of the comment."
},
"createdAt": {
"type": "string",
"format": "datetime",
"description": "Client-declared timestamp when this comment was originally created."
},
"parent": {
"type": "ref",
"ref": "com.atproto.repo.strongRef"
},
"post": {
"type": "ref",
"ref": "com.atproto.repo.strongRef"
}
}
}
}
}
}
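For illustration, a record conforming to this lexicon might be constructed and sanity-checked like this (a minimal stdlib sketch, not the atproto SDK; the AT URI and CID are placeholders, and treating `maxLength` as a UTF-8 byte count is an assumption):

```python
from datetime import datetime, timezone

# An illustrative ai.syui.log.comment record. The strongRef values
# ("uri", "cid") are made up for the example.
record = {
    "$type": "ai.syui.log.comment",
    "content": "Nice post!",
    "createdAt": datetime.now(timezone.utc).isoformat(),
    "post": {
        "uri": "at://did:plc:example/ai.syui.log.post/3mbvk36vj2k2y",
        "cid": "bafyreihypotheticalcid",
    },
}

def validate(rec):
    """Check the lexicon's required fields and the content length cap."""
    for field in ("content", "createdAt", "post"):
        if field not in rec:
            raise ValueError(f"missing required field: {field}")
    # maxLength is assumed to count UTF-8 bytes here
    if len(rec["content"].encode("utf-8")) > 100000:
        raise ValueError("content exceeds maxLength")
    return True

print(validate(record))  # → True
```

In a real client the record would be written with `com.atproto.repo.createRecord`; this sketch only shows the shape the lexicon describes.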


@@ -0,0 +1,34 @@
{
"lexicon": 1,
"id": "ai.syui.log.post",
"defs": {
"main": {
"type": "record",
"description": "Record containing a blog post.",
"key": "tid",
"record": {
"type": "object",
"required": ["title", "content", "createdAt"],
"properties": {
"title": {
"type": "string",
"maxLength": 3000,
"maxGraphemes": 300,
"description": "The title of the post."
},
"content": {
"type": "string",
"maxLength": 1000000,
"maxGraphemes": 100000,
"description": "The content of the post."
},
"createdAt": {
"type": "string",
"format": "datetime",
"description": "Client-declared timestamp when this post was originally created."
}
}
}
}
}
}
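The lexicon imposes two different caps on each string. In atproto lexicons, `maxLength` is commonly understood as a UTF-8 byte count and `maxGraphemes` as a grapheme-cluster count, which diverge for non-ASCII text; a quick sketch of the distinction (the sample title is hypothetical):

```python
title = "こんにちは世界"  # 7 characters

utf8_len = len(title.encode("utf-8"))  # 21 bytes, compared against maxLength (3000)
codepoints = len(title)                # 7 code points, an approximation of
                                       # maxGraphemes (300); exact grapheme
                                       # counting needs a Unicode segmentation library
assert utf8_len <= 3000
assert codepoints <= 300
print(utf8_len, codepoints)  # → 21 7
```

For purely combining-character-free text, code points and graphemes coincide, so `len()` is a reasonable client-side pre-check.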


@@ -1,142 +0,0 @@
# ai.log MCP Integration Guide
An MCP server setup guide for integrating ai.log with ai.gpt.
## Starting the MCP Server
```bash
# from the ai.log project directory
./target/debug/ailog mcp --port 8002
# or from a subdirectory
./target/debug/ailog mcp --port 8002 --path /path/to/blog
```
## Configuring ai.gpt
Add the following to ai.gpt's config file `~/.config/syui/ai/gpt/config.json`:
```json
{
"mcp": {
"enabled": true,
"servers": {
"ai_gpt": {"base_url": "http://localhost:8001"},
"ai_card": {"base_url": "http://localhost:8000"},
"ai_log": {"base_url": "http://localhost:8002"}
}
}
}
```
## Available MCP Tools
### 1. create_blog_post
Creates a new blog post.
**Parameters**:
- `title` (required): the post title
- `content` (required): the post body in Markdown
- `tags` (optional): an array of tags
- `slug` (optional): a custom URL slug
**Example**:
```python
# example call from ai.gpt
result = await mcp_client.call_tool("create_blog_post", {
"title": "New Possibilities of AI Integration",
"content": "# Overview\n\nBy integrating ai.gpt and ai.log...",
"tags": ["AI", "tech", "blog"]
})
```
### 2. list_blog_posts
Returns a list of existing blog posts.
**Parameters**:
- `limit` (optional): maximum number of posts to return (default: 10)
- `offset` (optional): number of posts to skip (default: 0)
### 3. build_blog
Builds the blog and generates the static files.
**Parameters**:
- `enable_ai` (optional): enable AI features
- `translate` (optional): enable automatic translation
### 4. get_post_content
Returns the content of the post with the given slug.
**Parameters**:
- `slug` (required): the post slug
## Integration Patterns from ai.gpt
### Automatic Posting
```python
# fetch related information from the memory system
memories = get_contextual_memories("blog")
# generate the article with AI
content = generate_blog_content(memories)
# post to ai.log
result = await mcp_client.call_tool("create_blog_post", {
"title": "Today's Notes",
"content": content,
"tags": ["diary", "AI"]
})
# run the build
await mcp_client.call_tool("build_blog", {"enable_ai": True})
```
### Listing and Editing Posts
```python
# fetch the post list
posts = await mcp_client.call_tool("list_blog_posts", {"limit": 5})
# fetch a specific post's content
content = await mcp_client.call_tool("get_post_content", {
"slug": "ai-integration"
})
# post a revised version (overwrite)
updated_content = enhance_content(content)
await mcp_client.call_tool("create_blog_post", {
"title": "New Possibilities of AI Integration (Revised)",
"content": updated_content,
"slug": "ai-integration-revised"
})
```
## Automation Workflows
Combined with ai.gpt's scheduler:
1. **Daily posting**: automatically generate and publish articles from accumulated memories
2. **Post revision**: automatically improve existing articles
3. **Related-post suggestions**: propose new articles based on their relation to past posts
4. **Multi-language support**: reach a global audience via automatic translation
## Error Handling
Errors from MCP tool calls are returned in the following format:
```json
{
"jsonrpc": "2.0",
"id": "request_id",
"error": {
"code": -32000,
"message": "エラーメッセージ",
"data": null
}
}
```
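A client can tell success from failure by checking for the `error` member of the response (a minimal stdlib sketch; the surrounding client code is hypothetical):

```python
import json

# A response shaped like the example above.
response = json.loads(
    '{"jsonrpc": "2.0", "id": "request_id", '
    '"error": {"code": -32000, "message": "error message", "data": null}}'
)

if "error" in response:
    err = response["error"]
    # → tool call failed (-32000): error message
    print(f"tool call failed ({err['code']}): {err['message']}")
else:
    print(response["result"])
```

Per JSON-RPC 2.0, a response carries either `result` or `error`, never both, so this check is sufficient.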
## Security Considerations
- The MCP server listens on localhost only
- Only authenticated requests from ai.gpt are processed
- File access is restricted to the specified blog directory

pds/index.html

@@ -0,0 +1,12 @@
<!DOCTYPE html>
<html lang="ja">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>AT URI Browser - syui.ai</title>
</head>
<body>
<div id="root"></div>
<script type="module" src="/src/main.jsx"></script>
</body>
</html>

pds/package.json

@@ -0,0 +1,27 @@
{
"name": "pds-browser",
"version": "0.3.4",
"description": "AT Protocol browser for ai.log",
"main": "index.js",
"type": "module",
"scripts": {
"dev": "vite",
"build": "vite build",
"preview": "vite preview"
},
"license": "MIT",
"dependencies": {
"@atproto/api": "^0.13.0",
"@atproto/did": "^0.1.0",
"@atproto/lexicon": "^0.4.0",
"@atproto/syntax": "^0.3.0",
"react": "^18.2.0",
"react-dom": "^18.2.0"
},
"devDependencies": {
"@types/react": "^18.0.37",
"@types/react-dom": "^18.0.11",
"@vitejs/plugin-react": "^4.0.0",
"vite": "^5.0.0"
}
}

pds/src/App.css

@@ -0,0 +1,463 @@
body {
font-family: system-ui, -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
margin: 0;
padding: 20px;
background-color: #f5f5f5;
line-height: 1.6;
}
.container {
max-width: 1200px;
margin: 0 auto;
background: white;
padding: 30px;
border-radius: 10px;
box-shadow: 0 2px 10px rgba(0,0,0,0.1);
}
h1 {
color: #333;
margin-bottom: 30px;
border-bottom: 3px solid #007acc;
padding-bottom: 10px;
}
.test-section {
margin-bottom: 30px;
padding: 20px;
background: #f8f9fa;
border-radius: 8px;
border-left: 4px solid #007acc;
}
.test-uris {
background: #fff;
padding: 15px;
border-radius: 5px;
border: 1px solid #ddd;
margin: 15px 0;
}
.at-uri {
font-family: 'Monaco', 'Consolas', monospace;
background: #f4f4f4;
padding: 8px 12px;
border-radius: 4px;
margin: 10px 0;
display: block;
word-break: break-all;
cursor: pointer;
transition: background-color 0.2s;
}
.at-uri:hover {
background: #e8e8e8;
}
.instructions {
background: #e8f4f8;
padding: 15px;
border-radius: 5px;
margin: 15px 0;
}
.instructions ol {
margin: 10px 0;
padding-left: 20px;
}
.back-link {
display: inline-block;
margin-top: 20px;
color: #007acc;
text-decoration: none;
font-weight: bold;
}
.back-link:hover {
text-decoration: underline;
}
/* AT Browser Modal Styles */
.at-uri-modal-overlay {
position: fixed;
top: 0;
left: 0;
right: 0;
bottom: 0;
background-color: rgba(0, 0, 0, 0.5);
display: flex;
align-items: center;
justify-content: center;
z-index: 1000;
}
.at-uri-modal-content {
background-color: white;
border-radius: 8px;
max-width: 800px;
max-height: 600px;
width: 90%;
height: 80%;
overflow: auto;
position: relative;
box-shadow: 0 4px 6px rgba(0, 0, 0, 0.1);
}
.at-uri-modal-close {
position: absolute;
top: 10px;
right: 10px;
background: none;
border: none;
font-size: 20px;
cursor: pointer;
z-index: 1001;
padding: 5px 10px;
}
/* AT URI Link Styles */
[data-at-uri] {
color: #1976d2;
cursor: pointer;
text-decoration: underline;
}
[data-at-uri]:hover {
color: #1565c0;
}
/* Handle Browser Styles */
.handle-browser {
margin-bottom: 30px;
}
.handle-form {
display: flex;
gap: 10px;
margin-bottom: 20px;
}
.handle-input {
flex: 1;
padding: 12px 16px;
font-size: 16px;
border: 2px solid #ddd;
border-radius: 6px;
font-family: 'Monaco', 'Consolas', monospace;
}
.handle-input:focus {
outline: none;
border-color: #007acc;
}
.handle-input:disabled {
background: #f5f5f5;
cursor: not-allowed;
}
.handle-button {
padding: 12px 24px;
font-size: 16px;
background: #007acc;
color: white;
border: none;
border-radius: 6px;
cursor: pointer;
font-weight: 500;
transition: background 0.2s;
}
.handle-button:hover:not(:disabled) {
background: #005a9e;
}
.handle-button:disabled {
background: #ccc;
cursor: not-allowed;
}
.error-message {
background: #fee;
padding: 12px 16px;
border-radius: 6px;
margin-bottom: 20px;
color: #c33;
border-left: 4px solid #c33;
}
.debug-info {
background: #f0f0f0;
padding: 12px 16px;
border-radius: 6px;
margin-bottom: 20px;
border-left: 4px solid #666;
}
.debug-info h3 {
margin-top: 0;
color: #333;
font-size: 14px;
}
.debug-info pre {
background: white;
padding: 8px;
border-radius: 4px;
font-size: 12px;
overflow-x: auto;
margin: 0;
}
.record-item {
display: flex;
justify-content: space-between;
align-items: center;
width: 100%;
padding: 12px;
background: white;
border: none;
cursor: pointer;
text-align: left;
transition: background 0.2s;
border-radius: 4px;
margin: 4px 0;
}
.record-item:hover {
background: #e8f4f8;
}
.record-title {
font-size: 16px;
color: #007acc;
font-weight: 500;
}
.record-date {
color: #666;
font-size: 14px;
}
.record-detail {
background: white;
padding: 20px;
border-radius: 8px;
border: 1px solid #ddd;
}
.back-button {
padding: 8px 16px;
margin-bottom: 16px;
background: #f5f5f5;
border: 1px solid #ddd;
border-radius: 4px;
cursor: pointer;
font-size: 14px;
color: #666;
transition: background 0.2s;
}
.back-button:hover {
background: #e8e8e8;
}
.record-detail h2 {
margin-top: 0;
color: #333;
}
.record-meta {
margin-bottom: 20px;
padding-bottom: 12px;
border-bottom: 1px solid #eee;
}
.record-meta p {
margin: 8px 0;
color: #666;
font-size: 14px;
}
.record-meta code {
background: #f4f4f4;
padding: 2px 6px;
border-radius: 3px;
font-family: 'Monaco', 'Consolas', monospace;
font-size: 12px;
}
.record-content {
line-height: 1.8;
}
.record-content pre {
white-space: pre-wrap;
word-wrap: break-word;
font-family: inherit;
margin: 0;
color: #333;
}
.services-list {
margin-top: 20px;
background: #f8f9fa;
padding: 20px;
border-radius: 8px;
}
.services-list h2 {
margin-top: 0;
margin-bottom: 16px;
color: #333;
font-size: 20px;
}
.services-list ul {
list-style: none;
padding: 0;
margin: 0;
}
.services-list li {
border-bottom: 1px solid #ddd;
}
.services-list li:last-child {
border-bottom: none;
}
.service-item {
display: flex;
align-items: center;
gap: 12px;
width: 100%;
padding: 16px;
background: white;
border: none;
cursor: pointer;
text-align: left;
transition: background 0.2s;
border-radius: 4px;
margin: 4px 0;
}
.service-item:hover {
background: #e8f4f8;
}
.service-icon {
width: 24px;
height: 24px;
border-radius: 4px;
flex-shrink: 0;
}
.service-name {
font-size: 16px;
color: #007acc;
font-weight: 500;
font-family: 'Monaco', 'Consolas', monospace;
flex: 1;
}
.service-count {
color: #666;
font-size: 14px;
background: #e8e8e8;
padding: 4px 12px;
border-radius: 12px;
}
.collections-list {
margin-top: 20px;
background: #f8f9fa;
padding: 20px;
border-radius: 8px;
}
.collections-list h2 {
margin-top: 0;
margin-bottom: 16px;
color: #333;
font-size: 20px;
}
.collections-list ul {
list-style: none;
padding: 0;
margin: 0;
}
.collections-list li {
border-bottom: 1px solid #ddd;
}
.collections-list li:last-child {
border-bottom: none;
}
.collection-item {
display: flex;
justify-content: space-between;
align-items: center;
width: 100%;
padding: 16px;
background: white;
border: none;
cursor: pointer;
text-align: left;
transition: background 0.2s;
border-radius: 4px;
margin: 4px 0;
}
.collection-item:hover {
background: #e8f4f8;
}
.collection-name {
font-size: 16px;
color: #007acc;
font-weight: 500;
font-family: 'Monaco', 'Consolas', monospace;
}
.collection-count {
color: #666;
font-size: 14px;
background: #e8e8e8;
padding: 4px 12px;
border-radius: 12px;
}
.records-view {
margin-top: 20px;
background: white;
padding: 20px;
border-radius: 8px;
border: 1px solid #ddd;
}
.records-view h2 {
margin-top: 0;
margin-bottom: 16px;
color: #333;
font-size: 20px;
}
.records-view .records-list {
list-style: none;
padding: 0;
margin: 0;
}
.records-view .records-list li {
border-bottom: 1px solid #eee;
}
.records-view .records-list li:last-child {
border-bottom: none;
}

pds/src/App.jsx

@@ -0,0 +1,32 @@
import React from 'react'
import { AtUriBrowser } from './components/AtUriBrowser.jsx'
import { HandleBrowser } from './components/HandleBrowser.jsx'
import './App.css'
function App() {
return (
<AtUriBrowser>
<div className="container">
<h1>AT Protocol Browser</h1>
<HandleBrowser />
<div className="test-section">
<h2>About AT URIs</h2>
<p>An AT URI is the uniform resource identifier used by the AT Protocol. This format uniquely identifies content on the decentralized social network.</p>
<h3>Supported PDS environments</h3>
<ul>
<li><strong>bsky.social</strong> - the main Bluesky network</li>
<li><strong>syu.is</strong> - an independent PDS environment</li>
<li><strong>plc.directory</strong> + <strong>plc.syu.is</strong> - DID resolution</li>
</ul>
</div>
<a href="/" className="back-link"> Back to blog</a>
</div>
</AtUriBrowser>
)
}
export default App


@@ -0,0 +1,75 @@
/*
* AT URI Browser Component
* Copyright (c) 2025 ai.log
* MIT License
*/
import React, { useState, useEffect } from 'react'
import { AtUriModal } from './AtUriModal.jsx'
import { isAtUri } from '../lib/atproto.js'
export function AtUriBrowser({ children }) {
const [modalUri, setModalUri] = useState(null)
useEffect(() => {
const handleAtUriClick = (e) => {
const target = e.target
// Check if clicked element has at-uri data attribute
if (target.dataset.atUri) {
e.preventDefault()
setModalUri(target.dataset.atUri)
return
}
// Check if clicked element contains at-uri text
const text = target.textContent
if (text && isAtUri(text)) {
e.preventDefault()
setModalUri(text)
return
}
// Check if parent element has at-uri
const parent = target.parentElement
if (parent && parent.dataset.atUri) {
e.preventDefault()
setModalUri(parent.dataset.atUri)
return
}
}
document.addEventListener('click', handleAtUriClick)
return () => {
document.removeEventListener('click', handleAtUriClick)
}
}, [])
const handleAtUriClick = (uri) => {
setModalUri(uri)
}
const handleCloseModal = () => {
setModalUri(null)
}
return (
<>
{children}
<AtUriModal
uri={modalUri}
onClose={handleCloseModal}
onAtUriClick={handleAtUriClick}
/>
</>
)
}
// Utility function to wrap at-uri text with clickable spans
// Note: returns raw HTML with no escaping; only pass trusted text
export const wrapAtUris = (text) => {
const atUriRegex = /at:\/\/[^\s]+/g
return text.replace(atUriRegex, (match) => {
return `<span data-at-uri="${match}" style="color: blue; cursor: pointer; text-decoration: underline;">${match}</span>`
})
}


@@ -0,0 +1,130 @@
/*
* Based on frontpage/atproto-browser
* Copyright (c) 2025 The Frontpage Authors
* MIT License
*/
import React from 'react'
import { isDid } from '@atproto/did'
import { isAtUri } from '../lib/atproto.js'
const JSONString = ({ data, onAtUriClick }) => {
const handleClick = (uri) => {
if (onAtUriClick) {
onAtUriClick(uri)
}
}
return (
<pre style={{ color: 'darkgreen', margin: 0, display: 'inline' }}>
{isAtUri(data) ? (
<>
&quot;
<span
onClick={() => handleClick(data)}
style={{
color: 'blue',
cursor: 'pointer',
textDecoration: 'underline'
}}
>
{data}
</span>
&quot;
</>
) : isDid(data) ? (
<>
&quot;
<span
onClick={() => handleClick(`at://${data}`)}
style={{
color: 'blue',
cursor: 'pointer',
textDecoration: 'underline'
}}
>
{data}
</span>
&quot;
</>
) : URL.canParse(data) ? (
<>
&quot;
<a href={data} rel="noopener noreferrer ugc" target="_blank">
{data}
</a>
&quot;
</>
) : (
`"${data}"`
)}
</pre>
)
}
const JSONValue = ({ data, onAtUriClick }) => {
if (data === null) {
return <pre style={{ color: 'gray', margin: 0, display: 'inline' }}>null</pre>
}
if (typeof data === 'string') {
return <JSONString data={data} onAtUriClick={onAtUriClick} />
}
if (typeof data === 'number') {
return <pre style={{ color: 'darkorange', margin: 0, display: 'inline' }}>{data}</pre>
}
if (typeof data === 'boolean') {
return <pre style={{ color: 'darkred', margin: 0, display: 'inline' }}>{data.toString()}</pre>
}
if (Array.isArray(data)) {
return (
<div style={{ paddingLeft: '20px' }}>
[
{data.map((item, index) => (
<div key={index} style={{ paddingLeft: '20px' }}>
<JSONValue data={item} onAtUriClick={onAtUriClick} />
{index < data.length - 1 && ','}
</div>
))}
]
</div>
)
}
if (typeof data === 'object') {
return (
<div style={{ paddingLeft: '20px' }}>
{'{'}
{Object.entries(data).map(([key, value], index, entries) => (
<div key={key} style={{ paddingLeft: '20px' }}>
<span style={{ color: 'darkblue' }}>"{key}"</span>: <JSONValue data={value} onAtUriClick={onAtUriClick} />
{index < entries.length - 1 && ','}
</div>
))}
{'}'}
</div>
)
}
return <pre style={{ margin: 0, display: 'inline' }}>{String(data)}</pre>
}
export default function AtUriJson({ data, onAtUriClick }) {
return (
<div style={{
fontFamily: 'monospace',
fontSize: '14px',
padding: '10px',
backgroundColor: '#f5f5f5',
border: '1px solid #ddd',
borderRadius: '4px',
overflow: 'auto',
maxHeight: '400px'
}}>
<JSONValue data={data} onAtUriClick={onAtUriClick} />
</div>
)
}


@@ -0,0 +1,80 @@
/*
* AT URI Modal Component
* Copyright (c) 2025 ai.log
* MIT License
*/
import React, { useEffect } from 'react'
import AtUriViewer from './AtUriViewer.jsx'
export function AtUriModal({ uri, onClose, onAtUriClick }) {
useEffect(() => {
const handleEscape = (e) => {
if (e.key === 'Escape') {
onClose()
}
}
const handleClickOutside = (e) => {
if (e.target.classList.contains('at-uri-modal-overlay')) {
onClose()
}
}
document.addEventListener('keydown', handleEscape)
document.addEventListener('click', handleClickOutside)
return () => {
document.removeEventListener('keydown', handleEscape)
document.removeEventListener('click', handleClickOutside)
}
}, [onClose])
if (!uri) return null
return (
<div className="at-uri-modal-overlay" style={{
position: 'fixed',
top: 0,
left: 0,
right: 0,
bottom: 0,
backgroundColor: 'rgba(0, 0, 0, 0.5)',
display: 'flex',
alignItems: 'center',
justifyContent: 'center',
zIndex: 1000
}}>
<div style={{
backgroundColor: 'white',
borderRadius: '8px',
maxWidth: '800px',
maxHeight: '600px',
width: '90%',
height: '80%',
overflow: 'auto',
position: 'relative',
boxShadow: '0 4px 6px rgba(0, 0, 0, 0.1)'
}}>
<button
onClick={onClose}
style={{
position: 'absolute',
top: '10px',
right: '10px',
background: 'none',
border: 'none',
fontSize: '20px',
cursor: 'pointer',
zIndex: 1001,
padding: '5px 10px'
}}
>
×
</button>
<AtUriViewer uri={uri} onAtUriClick={onAtUriClick} />
</div>
</div>
)
}


@@ -0,0 +1,103 @@
/*
* Based on frontpage/atproto-browser
* Copyright (c) 2025 The Frontpage Authors
* MIT License
*/
import React, { useState, useEffect } from 'react'
import { parseAtUri, getRecord } from '../lib/atproto.js'
import AtUriJson from './AtUriJson.jsx'
export default function AtUriViewer({ uri, onAtUriClick }) {
const [record, setRecord] = useState(null)
const [loading, setLoading] = useState(true)
const [error, setError] = useState(null)
useEffect(() => {
const loadRecord = async () => {
if (!uri) return
setLoading(true)
setError(null)
try {
const atUri = parseAtUri(uri)
if (!atUri) {
throw new Error('Invalid AT URI')
}
const result = await getRecord(atUri.hostname, atUri.collection, atUri.rkey)
if (!result.success) {
throw new Error(result.error)
}
setRecord(result.data)
} catch (err) {
setError(err.message)
} finally {
setLoading(false)
}
}
loadRecord()
}, [uri])
if (loading) {
return (
<div style={{ padding: '20px', textAlign: 'center' }}>
<div>Loading...</div>
</div>
)
}
if (error) {
return (
<div style={{ padding: '20px', color: 'red' }}>
<div><strong>Error:</strong> {error}</div>
<div style={{ marginTop: '10px', fontSize: '12px' }}>
<strong>URI:</strong> {uri}
</div>
<div style={{ marginTop: '10px', fontSize: '12px', color: '#666' }}>
Debug info: this AT URI is invalid, or the record does not exist
</div>
</div>
)
}
if (!record) {
return (
<div style={{ padding: '20px' }}>
<div>No record found</div>
</div>
)
}
const atUri = parseAtUri(uri)
return (
<div style={{ padding: '20px' }}>
<div style={{ marginBottom: '20px' }}>
<h3 style={{ margin: '0 0 10px 0', fontSize: '18px' }}>AT URI Record</h3>
<div style={{
fontSize: '14px',
color: '#666',
fontFamily: 'monospace',
wordBreak: 'break-all'
}}>
{uri}
</div>
<div style={{ fontSize: '12px', color: '#999', marginTop: '5px' }}>
DID: {atUri.hostname} | Collection: {atUri.collection} | RKey: {atUri.rkey}
</div>
</div>
<div>
<h4 style={{ margin: '0 0 10px 0', fontSize: '16px' }}>Record Data</h4>
<AtUriJson data={record} onAtUriClick={onAtUriClick} />
</div>
</div>
)
}


@@ -0,0 +1,247 @@
import React, { useState } from 'react'
import { listAllCollections } from '../lib/atproto.js'
const getServiceIcon = (service) => {
// Known domain mappings
const domainMap = {
'app.bsky': 'bsky.app',
'chat.bsky': 'bsky.app',
'ai.syui': 'syui.ai',
'tools.ozone': 'ozone.tools',
'com.atproto': 'atproto.com'
}
// If in map, use it
if (domainMap[service]) {
return `https://www.google.com/s2/favicons?domain=${domainMap[service]}&sz=32`
}
// Otherwise, try to infer domain from service name
// Format: prefix.domain → domain.tld (e.g., app.bsky → bsky.app)
const parts = service.split('.')
if (parts.length >= 2) {
// Take last 2 parts and reverse
const domain = parts.slice(-2).reverse().join('.')
return `https://www.google.com/s2/favicons?domain=${domain}&sz=32`
}
// Fallback: use service as-is
return `https://www.google.com/s2/favicons?domain=${service}&sz=32`
}
const groupCollectionsByService = (collections) => {
const services = {}
collections.forEach(col => {
const parts = col.collection.split('.')
const service = parts.slice(0, 2).join('.')
if (!services[service]) {
services[service] = []
}
services[service].push(col)
})
return services
}
export function HandleBrowser() {
const [handle, setHandle] = useState('')
const [loading, setLoading] = useState(false)
const [error, setError] = useState(null)
const [collections, setCollections] = useState([])
const [services, setServices] = useState({})
const [expandedService, setExpandedService] = useState(null)
const [expandedCollection, setExpandedCollection] = useState(null)
const [selectedRecord, setSelectedRecord] = useState(null)
const [debugInfo, setDebugInfo] = useState(null)
const handleSubmit = async (e) => {
e.preventDefault()
if (!handle) return
setLoading(true)
setError(null)
setCollections([])
setServices({})
setExpandedService(null)
setExpandedCollection(null)
setSelectedRecord(null)
setDebugInfo(null)
try {
const result = await listAllCollections(handle)
const totalRecords = result.collections?.reduce((sum, c) => sum + c.records.length, 0) || 0
setDebugInfo({
handle,
success: result.success,
pdsUrl: result.pdsUrl,
collectionCount: result.collections?.length || 0,
totalRecords
})
if (!result.success) {
throw new Error(result.error)
}
if (result.collections.length === 0) {
setError('No collections found for this handle')
} else {
setCollections(result.collections)
const grouped = groupCollectionsByService(result.collections)
setServices(grouped)
}
} catch (err) {
setError(`Failed to load: ${err.message}`)
} finally {
setLoading(false)
}
}
const handleServiceClick = (service) => {
setExpandedService(service)
setExpandedCollection(null)
setSelectedRecord(null)
}
const handleBackToServices = () => {
setExpandedService(null)
setExpandedCollection(null)
setSelectedRecord(null)
}
const handleCollectionClick = (collection) => {
setExpandedCollection(collection)
setSelectedRecord(null)
}
const handleBackToCollections = () => {
setExpandedCollection(null)
setSelectedRecord(null)
}
const handleRecordClick = (record) => {
setSelectedRecord(record)
}
const handleBackToRecords = () => {
setSelectedRecord(null)
}
return (
<div className="handle-browser">
<form onSubmit={handleSubmit} className="handle-form">
<input
type="text"
placeholder="handle (e.g., ai.syui.ai)"
value={handle}
onChange={(e) => setHandle(e.target.value)}
className="handle-input"
disabled={loading}
/>
<button type="submit" disabled={loading} className="handle-button">
{loading ? 'Loading...' : 'Browse'}
</button>
</form>
{error && (
<div className="error-message">
<p>Error: {error}</p>
</div>
)}
{debugInfo && (
<div className="debug-info">
<h3>Debug Info</h3>
<pre>{JSON.stringify(debugInfo, null, 2)}</pre>
</div>
)}
{selectedRecord ? (
<div className="record-detail">
<button onClick={handleBackToRecords} className="back-button">
Back to records
</button>
<h2>{selectedRecord.uri.split('/').pop()}</h2>
<div className="record-meta">
<p>URI: <code>{selectedRecord.uri}</code></p>
{selectedRecord.value.createdAt && (
<p>Created: {new Date(selectedRecord.value.createdAt).toLocaleString()}</p>
)}
</div>
<div className="record-content">
<pre>{JSON.stringify(selectedRecord.value, null, 2)}</pre>
</div>
</div>
) : expandedCollection ? (
<div className="records-view">
<button onClick={handleBackToCollections} className="back-button">
Back to collections
</button>
<h2>{expandedCollection.collection} ({expandedCollection.records.length})</h2>
<ul className="records-list">
{expandedCollection.records.map((record) => {
const rkey = record.uri.split('/').pop()
return (
<li key={record.uri}>
<button onClick={() => handleRecordClick(record)} className="record-item">
<span className="record-title">{rkey}</span>
{record.value.createdAt && (
<span className="record-date">
{new Date(record.value.createdAt).toLocaleDateString()}
</span>
)}
</button>
</li>
)
})}
</ul>
</div>
) : expandedService ? (
<div className="collections-list">
<button onClick={handleBackToServices} className="back-button">
Back to services
</button>
<h2>{expandedService} ({services[expandedService].length})</h2>
<ul>
{services[expandedService].map((collectionGroup) => (
<li key={collectionGroup.collection}>
<button
onClick={() => handleCollectionClick(collectionGroup)}
className="collection-item"
>
<span className="collection-name">{collectionGroup.collection}</span>
<span className="collection-count">{collectionGroup.records.length} records</span>
</button>
</li>
))}
</ul>
</div>
) : Object.keys(services).length > 0 ? (
<div className="services-list">
<h2>Services ({Object.keys(services).length})</h2>
<ul>
{Object.keys(services).map((service) => {
const totalRecords = services[service].reduce((sum, col) => sum + col.records.length, 0)
return (
<li key={service}>
<button
onClick={() => handleServiceClick(service)}
className="service-item"
>
<img src={getServiceIcon(service)} alt={service} className="service-icon" />
<span className="service-name">{service}</span>
<span className="service-count">{services[service].length} collections · {totalRecords} records</span>
</button>
</li>
)
})}
</ul>
</div>
) : null}
</div>
)
}

pds/src/config.js

@@ -0,0 +1,33 @@
/*
* AT Protocol Configuration for syu.is environment
*/
export const AT_PROTOCOL_CONFIG = {
// Primary PDS environment (syu.is)
primary: {
pds: 'https://syu.is',
plc: 'https://plc.syu.is',
bsky: 'https://bsky.syu.is',
web: 'https://web.syu.is'
},
// Fallback PDS environment (bsky.social)
fallback: {
pds: 'https://bsky.social',
plc: 'https://plc.directory',
bsky: 'https://public.api.bsky.app',
web: 'https://bsky.app'
}
}
export const getPDSConfig = (pds) => {
// Map PDS URL to appropriate config
if (pds.includes('syu.is')) {
return AT_PROTOCOL_CONFIG.primary
} else if (pds.includes('bsky.social')) {
return AT_PROTOCOL_CONFIG.fallback
}
// Default to primary for unknown PDS
return AT_PROTOCOL_CONFIG.primary
}

pds/src/index.js

@@ -0,0 +1,9 @@
/*
* Based on frontpage/atproto-browser
* Copyright (c) 2025 The Frontpage Authors
* MIT License
*/
export { AtUriBrowser } from './components/AtUriBrowser.jsx'
export { AtUriModal } from './components/AtUriModal.jsx'
export { default as AtUriViewer } from './components/AtUriViewer.jsx'

pds/src/lib/atproto.js

@@ -0,0 +1,251 @@
/*
* Based on frontpage/atproto-browser
* Copyright (c) 2025 The Frontpage Authors
* MIT License
*/
import { AtpBaseClient } from '@atproto/api'
import { AtUri } from '@atproto/syntax'
import { isDid } from '@atproto/did'
import { AT_PROTOCOL_CONFIG } from '../config.js'
// Identity resolution cache
const identityCache = new Map()
// Create AT Protocol client
export const createAtpClient = (pds) => {
return new AtpBaseClient({
service: pds.startsWith('http') ? pds : `https://${pds}`
})
}
// Resolve identity (DID/Handle)
export const resolveIdentity = async (identifier) => {
if (identityCache.has(identifier)) {
return identityCache.get(identifier)
}
try {
let did = identifier
// If it's a handle, resolve to DID
if (!isDid(identifier)) {
// Try syu.is first, then fallback to bsky.social
let resolved = false
try {
const client = createAtpClient(AT_PROTOCOL_CONFIG.primary.pds)
const response = await client.com.atproto.repo.describeRepo({ repo: identifier })
did = response.data.did
resolved = true
} catch (error) {
// ignore: syu.is resolution failed, fall back to bsky.social below
}
if (!resolved) {
try {
const client = createAtpClient(AT_PROTOCOL_CONFIG.fallback.pds)
const response = await client.com.atproto.repo.describeRepo({ repo: identifier })
did = response.data.did
} catch (error) {
throw new Error(`Failed to resolve handle: ${identifier}`)
}
}
}
// Get DID document to find PDS
// Try plc.syu.is first, then fallback to plc.directory
let didDoc = null
let plcResponse = null
try {
plcResponse = await fetch(`${AT_PROTOCOL_CONFIG.primary.plc}/${did}`)
if (plcResponse.ok) {
didDoc = await plcResponse.json()
}
} catch (error) {
// ignore: plc.syu.is unreachable, try plc.directory next
}
// If plc.syu.is fails, try plc.directory
if (!didDoc) {
try {
plcResponse = await fetch(`${AT_PROTOCOL_CONFIG.fallback.plc}/${did}`)
if (plcResponse.ok) {
didDoc = await plcResponse.json()
}
} catch (error) {
// ignore: plc.directory also unreachable, handled by the !didDoc check
}
}
if (!didDoc) {
throw new Error(`Failed to resolve DID document from any PLC server`)
}
// Find PDS service endpoint
const pdsService = didDoc.service?.find(service =>
service.type === 'AtprotoPersonalDataServer' ||
service.id === '#atproto_pds'
)
if (!pdsService) {
throw new Error('No PDS service found in DID document')
}
const result = {
success: true,
didDocument: didDoc,
pdsUrl: pdsService.serviceEndpoint
}
identityCache.set(identifier, result)
return result
} catch (error) {
const result = {
success: false,
error: error.message
}
identityCache.set(identifier, result)
return result
}
}
// Get record from AT Protocol
export const getRecord = async (did, collection, rkey) => {
try {
const identityResult = await resolveIdentity(did)
if (!identityResult.success) {
return { success: false, error: identityResult.error }
}
const pdsUrl = identityResult.pdsUrl
const client = createAtpClient(pdsUrl)
const response = await client.com.atproto.repo.getRecord({
repo: did,
collection,
rkey
})
return {
success: true,
data: response.data,
pdsUrl
}
} catch (error) {
return {
success: false,
error: error.message
}
}
}
// Parse AT URI
export const parseAtUri = (uri) => {
try {
return new AtUri(uri)
} catch (error) {
return null
}
}
// Check if string is AT URI
export const isAtUri = (str) => {
return typeof str === 'string' && str.startsWith('at://') && !str.includes(' ')
}
// List records from AT Protocol
export const listRecords = async (identifier, collection) => {
try {
const identityResult = await resolveIdentity(identifier)
if (!identityResult.success) {
return { success: false, error: identityResult.error }
}
const did = identityResult.didDocument.id
const pdsUrl = identityResult.pdsUrl
const client = createAtpClient(pdsUrl)
const response = await client.com.atproto.repo.listRecords({
repo: did,
collection,
limit: 100
})
return {
success: true,
records: response.data.records || [],
pdsUrl
}
} catch (error) {
return {
success: false,
error: error.message
}
}
}
// List all collections for a user
export const listAllCollections = async (identifier) => {
try {
const identityResult = await resolveIdentity(identifier)
if (!identityResult.success) {
return { success: false, error: identityResult.error }
}
const did = identityResult.didDocument.id
const pdsUrl = identityResult.pdsUrl
const client = createAtpClient(pdsUrl)
// Get collections list from describeRepo
const repoDesc = await client.com.atproto.repo.describeRepo({
repo: did
})
const collections = repoDesc.data.collections || []
if (collections.length === 0) {
return {
success: true,
collections: [],
pdsUrl
}
}
const allRecords = []
for (const collection of collections) {
try {
const response = await client.com.atproto.repo.listRecords({
repo: did,
collection,
limit: 100
})
if (response.data.records && response.data.records.length > 0) {
allRecords.push({
collection,
records: response.data.records
})
}
} catch (err) {
// Collection doesn't exist or is empty, skip
}
}
return {
success: true,
collections: allRecords,
pdsUrl
}
} catch (error) {
return {
success: false,
error: error.message
}
}
}

pds/src/main.jsx

@@ -0,0 +1,9 @@
import React from 'react'
import ReactDOM from 'react-dom/client'
import App from './App.jsx'
ReactDOM.createRoot(document.getElementById('root')).render(
<React.StrictMode>
<App />
</React.StrictMode>,
)

pds/vite.config.js

@@ -0,0 +1,10 @@
import { defineConfig } from 'vite'
import react from '@vitejs/plugin-react'
export default defineConfig({
plugins: [react()],
base: '/pds/',
define: {
'process.env.NODE_ENV': JSON.stringify('production')
}
})


@@ -1,34 +0,0 @@
use anyhow::Result;
use serde::{Deserialize, Serialize};
use crate::ai::gpt_client::GptClient;
use crate::ai::editor::Editor;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct AiComment {
pub content: String,
pub author: String,
pub timestamp: String,
}
pub struct CommentGenerator<'a> {
client: &'a GptClient,
}
impl<'a> CommentGenerator<'a> {
pub fn new(client: &'a GptClient) -> Self {
Self { client }
}
pub async fn generate_comment(&self, post_title: &str, post_content: &str) -> Result<AiComment> {
let editor = Editor::new(self.client);
let comment_content = editor.add_ai_note(post_content, post_title).await?;
let timestamp = chrono::Local::now().format("%Y-%m-%d %H:%M:%S").to_string();
Ok(AiComment {
content: comment_content,
author: "AI (存在子)".to_string(),
timestamp,
})
}
}


@@ -1,63 +0,0 @@
use anyhow::Result;
use crate::ai::gpt_client::GptClient;
pub struct Editor<'a> {
client: &'a GptClient,
}
impl<'a> Editor<'a> {
pub fn new(client: &'a GptClient) -> Self {
Self { client }
}
pub async fn enhance(&self, content: &str, context: &str) -> Result<String> {
let system_prompt = "You are a helpful content editor. Enhance the given content by:
1. Fixing any grammatical errors
2. Improving clarity and readability
3. Adding relevant information if needed
4. Maintaining the original tone and style
5. Preserving all Markdown formatting
Only return the enhanced content without explanations.";
let user_prompt = format!(
"Context: {}\n\nContent to enhance:\n{}",
context, content
);
self.client.chat(system_prompt, &user_prompt).await
}
pub async fn suggest_improvements(&self, content: &str) -> Result<Vec<String>> {
let system_prompt = "You are a content analyzer. Analyze the given content and provide:
1. Suggestions for improving the content
2. Missing information that could be added
3. Potential SEO improvements
Return the suggestions as a JSON array of strings.";
let response = self.client.chat(system_prompt, content).await?;
// Parse JSON response
match serde_json::from_str::<Vec<String>>(&response) {
Ok(suggestions) => Ok(suggestions),
Err(_) => {
// Fallback: split by newlines if not valid JSON
Ok(response.lines()
.filter(|s| !s.trim().is_empty())
.map(|s| s.to_string())
.collect())
}
}
}
pub async fn add_ai_note(&self, content: &str, topic: &str) -> Result<String> {
let system_prompt = format!(
"You are AI (存在子/ai). Add a brief, insightful comment about the topic '{}' \
from your unique perspective. Keep it concise (1-2 sentences) and thoughtful. \
Return only the comment text in Japanese.",
topic
);
self.client.chat(&system_prompt, content).await
}
}


@@ -1,87 +0,0 @@
use anyhow::Result;
use reqwest::Client;
use serde::{Deserialize, Serialize};
use serde_json::json;
#[derive(Clone)]
pub struct GptClient {
api_key: String,
endpoint: String,
client: Client,
}
#[derive(Serialize)]
struct ChatMessage {
role: String,
content: String,
}
#[derive(Deserialize)]
struct ChatResponse {
choices: Vec<Choice>,
}
#[derive(Deserialize)]
struct Choice {
message: MessageContent,
}
#[derive(Deserialize)]
struct MessageContent {
content: String,
}
impl GptClient {
pub fn new(api_key: String, endpoint: Option<String>) -> Self {
let endpoint = endpoint.unwrap_or_else(|| {
"https://api.openai.com/v1/chat/completions".to_string()
});
Self {
api_key,
endpoint,
client: Client::new(),
}
}
pub async fn chat(&self, system_prompt: &str, user_prompt: &str) -> Result<String> {
let messages = vec![
ChatMessage {
role: "system".to_string(),
content: system_prompt.to_string(),
},
ChatMessage {
role: "user".to_string(),
content: user_prompt.to_string(),
},
];
let body = json!({
"model": "gpt-4o-mini",
"messages": messages,
"temperature": 0.7,
"max_tokens": 4000,
});
let response = self.client
.post(&self.endpoint)
.header("Authorization", format!("Bearer {}", self.api_key))
.header("Content-Type", "application/json")
.json(&body)
.send()
.await?;
if !response.status().is_success() {
let error_text = response.text().await?;
anyhow::bail!("GPT API error: {}", error_text);
}
let chat_response: ChatResponse = response.json().await?;
if let Some(choice) = chat_response.choices.first() {
Ok(choice.message.content.clone())
} else {
anyhow::bail!("No response from GPT API")
}
}
}


@@ -1,79 +0,0 @@
pub mod translator;
pub mod editor;
pub mod gpt_client;
pub mod comment;
pub use translator::Translator;
pub use editor::Editor;
pub use gpt_client::GptClient;
pub use comment::{AiComment, CommentGenerator};
use anyhow::Result;
use crate::config::AiConfig;
pub struct AiManager {
config: AiConfig,
gpt_client: Option<GptClient>,
}
impl AiManager {
pub fn new(config: AiConfig) -> Self {
let gpt_client = if config.enabled && config.api_key.is_some() {
Some(GptClient::new(
config.api_key.clone().unwrap(),
config.gpt_endpoint.clone(),
))
} else {
None
};
Self {
config,
gpt_client,
}
}
pub fn is_enabled(&self) -> bool {
self.config.enabled && self.gpt_client.is_some()
}
pub async fn translate(&self, content: &str, from: &str, to: &str) -> Result<String> {
if !self.is_enabled() || !self.config.auto_translate {
return Ok(content.to_string());
}
if let Some(client) = &self.gpt_client {
let translator = Translator::new(client);
translator.translate(content, from, to).await
} else {
Ok(content.to_string())
}
}
pub async fn enhance_content(&self, content: &str, context: &str) -> Result<String> {
if !self.is_enabled() {
return Ok(content.to_string());
}
if let Some(client) = &self.gpt_client {
let editor = Editor::new(client);
editor.enhance(content, context).await
} else {
Ok(content.to_string())
}
}
pub async fn generate_comment(&self, post_title: &str, post_content: &str) -> Result<Option<AiComment>> {
if !self.is_enabled() || !self.config.comment_moderation {
return Ok(None);
}
if let Some(client) = &self.gpt_client {
let generator = CommentGenerator::new(client);
let comment = generator.generate_comment(post_title, post_content).await?;
Ok(Some(comment))
} else {
Ok(None)
}
}
}


@@ -1,33 +0,0 @@
use anyhow::Result;
use crate::ai::gpt_client::GptClient;
pub struct Translator<'a> {
client: &'a GptClient,
}
impl<'a> Translator<'a> {
pub fn new(client: &'a GptClient) -> Self {
Self { client }
}
pub async fn translate(&self, content: &str, from: &str, to: &str) -> Result<String> {
let system_prompt = format!(
"You are a professional translator. Translate the following text from {} to {}. \
Maintain the original formatting, including Markdown syntax. \
Only return the translated text without any explanations.",
from, to
);
self.client.chat(&system_prompt, content).await
}
pub async fn translate_post(&self, title: &str, content: &str, from: &str, to: &str) -> Result<(String, String)> {
// Translate title
let translated_title = self.translate(title, from, to).await?;
// Translate content while preserving markdown structure
let translated_content = self.translate(content, from, to).await?;
Ok((translated_title, translated_content))
}
}


@@ -1,108 +0,0 @@
use anyhow::Result;
use serde::{Deserialize, Serialize};
use reqwest::header::{AUTHORIZATION, CONTENT_TYPE};
#[derive(Debug, Clone)]
pub struct AtprotoClient {
client: reqwest::Client,
handle_resolver: String,
access_token: Option<String>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct CreateRecordRequest {
pub repo: String,
pub collection: String,
pub record: serde_json::Value,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct CreateRecordResponse {
pub uri: String,
pub cid: String,
}
#[derive(Debug, Serialize, Deserialize)]
#[allow(non_snake_case)] // camelCase fields mirror the record's JSON keys
pub struct CommentRecord {
#[serde(rename = "$type")]
pub record_type: String,
pub text: String,
pub createdAt: String,
pub postUri: String,
pub author: AuthorInfo,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct AuthorInfo {
pub did: String,
pub handle: String,
}
impl AtprotoClient {
pub fn new(handle_resolver: String) -> Self {
Self {
client: reqwest::Client::new(),
handle_resolver,
access_token: None,
}
}
pub fn set_access_token(&mut self, token: String) {
self.access_token = Some(token);
}
pub async fn create_comment(&self, did: &str, post_uri: &str, text: &str) -> Result<CreateRecordResponse> {
if self.access_token.is_none() {
anyhow::bail!("Not authenticated");
}
let record = CommentRecord {
record_type: "app.bsky.feed.post".to_string(),
text: text.to_string(),
createdAt: chrono::Utc::now().to_rfc3339(),
postUri: post_uri.to_string(),
author: AuthorInfo {
did: did.to_string(),
handle: "".to_string(), // Will be resolved by the server
},
};
let request = CreateRecordRequest {
repo: did.to_string(),
collection: "app.bsky.feed.post".to_string(),
record: serde_json::to_value(record)?,
};
let response = self.client
.post(format!("{}/xrpc/com.atproto.repo.createRecord", self.handle_resolver))
.header(AUTHORIZATION, format!("Bearer {}", self.access_token.as_ref().unwrap()))
.header(CONTENT_TYPE, "application/json")
.json(&request)
.send()
.await?;
if response.status().is_success() {
let result: CreateRecordResponse = response.json().await?;
Ok(result)
} else {
let error_text = response.text().await?;
anyhow::bail!("Failed to create comment: {}", error_text)
}
}
pub async fn get_profile(&self, did: &str) -> Result<serde_json::Value> {
let response = self.client
.get(format!("{}/xrpc/app.bsky.actor.getProfile", self.handle_resolver))
.query(&[("actor", did)])
.header(AUTHORIZATION, format!("Bearer {}", self.access_token.as_ref().unwrap_or(&String::new())))
.send()
.await?;
if response.status().is_success() {
let profile = response.json().await?;
Ok(profile)
} else {
anyhow::bail!("Failed to get profile")
}
}
}

View File

@@ -1,120 +0,0 @@
use anyhow::Result;
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
use std::fs;
use crate::atproto::client::AtprotoClient;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Comment {
pub id: String,
pub author: String,
pub author_did: String,
pub content: String,
pub timestamp: String,
pub post_slug: String,
pub atproto_uri: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CommentStorage {
pub comments: Vec<Comment>,
}
pub struct CommentSync {
client: AtprotoClient,
storage_path: PathBuf,
}
impl CommentSync {
pub fn new(client: AtprotoClient, base_path: PathBuf) -> Self {
let storage_path = base_path.join("data/comments.json");
Self {
client,
storage_path,
}
}
pub async fn load_comments(&self) -> Result<CommentStorage> {
if self.storage_path.exists() {
let content = fs::read_to_string(&self.storage_path)?;
let storage: CommentStorage = serde_json::from_str(&content)?;
Ok(storage)
} else {
Ok(CommentStorage { comments: vec![] })
}
}
pub async fn save_comments(&self, storage: &CommentStorage) -> Result<()> {
if let Some(parent) = self.storage_path.parent() {
fs::create_dir_all(parent)?;
}
let content = serde_json::to_string_pretty(storage)?;
fs::write(&self.storage_path, content)?;
Ok(())
}
pub async fn add_comment(&mut self, post_slug: &str, author_did: &str, content: &str) -> Result<Comment> {
// Get author profile
let profile = self.client.get_profile(author_did).await?;
let author_handle = profile["handle"].as_str().unwrap_or("unknown").to_string();
// Create comment in atproto
let post_uri = format!("ailog://post/{}", post_slug);
let result = self.client.create_comment(author_did, &post_uri, content).await?;
// Create local comment record
let comment = Comment {
id: uuid::Uuid::new_v4().to_string(),
author: author_handle,
author_did: author_did.to_string(),
content: content.to_string(),
timestamp: chrono::Local::now().format("%Y-%m-%d %H:%M:%S").to_string(),
post_slug: post_slug.to_string(),
atproto_uri: Some(result.uri),
};
// Save to local storage
let mut storage = self.load_comments().await?;
storage.comments.push(comment.clone());
self.save_comments(&storage).await?;
Ok(comment)
}
pub async fn get_comments_for_post(&self, post_slug: &str) -> Result<Vec<Comment>> {
let storage = self.load_comments().await?;
Ok(storage.comments
.into_iter()
.filter(|c| c.post_slug == post_slug)
.collect())
}
}
// Helper to generate comment HTML
pub fn render_comments_html(comments: &[Comment]) -> String {
let mut html = String::from("<div class=\"comments\">\n");
html.push_str(" <h3>コメント</h3>\n");
if comments.is_empty() {
html.push_str(" <p>まだコメントはありません。</p>\n");
} else {
for comment in comments {
html.push_str(&format!(
r#" <div class="comment">
<div class="comment-header">
<span class="author">@{}</span>
<span class="timestamp">{}</span>
</div>
<div class="comment-content">{}</div>
</div>
"#,
comment.author,
comment.timestamp,
comment.content
));
}
}
html.push_str("</div>");
html
}
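One caveat about `render_comments_html`: it interpolates `comment.author` and `comment.content` into the page verbatim, and comment text comes from third-party authors. Escaping HTML-significant characters before interpolation would be prudent. A minimal std-only escape helper (a sketch, not part of the source above) could look like:

```rust
// Minimal HTML escaper for untrusted comment text (sketch; the source
// currently interpolates comment.content without escaping).
fn escape_html(input: &str) -> String {
    let mut out = String::with_capacity(input.len());
    for c in input.chars() {
        match c {
            '&' => out.push_str("&amp;"),
            '<' => out.push_str("&lt;"),
            '>' => out.push_str("&gt;"),
            '"' => out.push_str("&quot;"),
            '\'' => out.push_str("&#x27;"),
            _ => out.push(c),
        }
    }
    out
}

fn main() {
    // A hostile comment body is neutralized before reaching the HTML template.
    let escaped = escape_html("<script>alert(1)</script>");
    assert_eq!(escaped, "&lt;script&gt;alert(1)&lt;/script&gt;");
    println!("{}", escaped);
}
```

Calling `escape_html` on each field just before the `format!` in the loop would be enough to close the gap.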

View File

@@ -1,7 +0,0 @@
pub mod oauth;
pub mod client;
pub mod comment_sync;
pub use oauth::OAuthHandler;
pub use client::AtprotoClient;
pub use comment_sync::CommentSync;

View File

@@ -1,162 +0,0 @@
use anyhow::Result;
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use crate::config::AtprotoConfig;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ClientMetadata {
pub client_id: String,
pub client_name: String,
pub client_uri: String,
pub logo_uri: String,
pub tos_uri: String,
pub policy_uri: String,
pub redirect_uris: Vec<String>,
pub scope: String,
pub grant_types: Vec<String>,
pub response_types: Vec<String>,
pub token_endpoint_auth_method: String,
pub application_type: String,
pub dpop_bound_access_tokens: bool,
}
#[derive(Debug, Clone)]
pub struct OAuthHandler {
config: AtprotoConfig,
client: reqwest::Client,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct AuthorizationRequest {
pub response_type: String,
pub client_id: String,
pub redirect_uri: String,
pub state: String,
pub scope: String,
pub code_challenge: String,
pub code_challenge_method: String,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct TokenResponse {
pub access_token: String,
pub token_type: String,
pub expires_in: u64,
pub refresh_token: Option<String>,
pub scope: String,
}
impl OAuthHandler {
pub fn new(config: AtprotoConfig) -> Self {
Self {
config,
client: reqwest::Client::new(),
}
}
pub fn generate_client_metadata(&self) -> ClientMetadata {
ClientMetadata {
client_id: self.config.client_id.clone(),
client_name: "ailog - AI-powered blog".to_string(),
client_uri: "https://example.com".to_string(),
logo_uri: "https://example.com/logo.png".to_string(),
tos_uri: "https://example.com/tos".to_string(),
policy_uri: "https://example.com/policy".to_string(),
redirect_uris: vec![self.config.redirect_uri.clone()],
scope: "atproto".to_string(),
grant_types: vec!["authorization_code".to_string(), "refresh_token".to_string()],
response_types: vec!["code".to_string()],
token_endpoint_auth_method: "none".to_string(),
application_type: "web".to_string(),
dpop_bound_access_tokens: true,
}
}
pub fn generate_authorization_url(&self, state: &str, code_challenge: &str) -> String {
let params = vec![
("response_type", "code"),
("client_id", &self.config.client_id),
("redirect_uri", &self.config.redirect_uri),
("state", state),
("scope", "atproto"),
("code_challenge", code_challenge),
("code_challenge_method", "S256"),
];
let query = params.into_iter()
.map(|(k, v)| format!("{}={}", k, urlencoding::encode(v)))
.collect::<Vec<_>>()
.join("&");
format!("{}/oauth/authorize?{}", self.config.handle_resolver, query)
}
pub async fn exchange_code(&self, code: &str, code_verifier: &str) -> Result<TokenResponse> {
let params = HashMap::from([
("grant_type", "authorization_code"),
("code", code),
("redirect_uri", &self.config.redirect_uri),
("client_id", &self.config.client_id),
("code_verifier", code_verifier),
]);
let response = self.client
.post(format!("{}/oauth/token", self.config.handle_resolver))
.form(&params)
.send()
.await?;
if response.status().is_success() {
let token: TokenResponse = response.json().await?;
Ok(token)
} else {
anyhow::bail!("Failed to exchange authorization code")
}
}
pub async fn refresh_token(&self, refresh_token: &str) -> Result<TokenResponse> {
let params = HashMap::from([
("grant_type", "refresh_token"),
("refresh_token", refresh_token),
("client_id", &self.config.client_id),
]);
let response = self.client
.post(format!("{}/oauth/token", self.config.handle_resolver))
.form(&params)
.send()
.await?;
if response.status().is_success() {
let token: TokenResponse = response.json().await?;
Ok(token)
} else {
anyhow::bail!("Failed to refresh token")
}
}
}
// PKCE helpers
pub fn generate_code_verifier() -> String {
use rand::Rng;
const CHARSET: &[u8] = b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-._~";
let mut rng = rand::thread_rng();
(0..128)
.map(|_| {
let idx = rng.gen_range(0..CHARSET.len());
CHARSET[idx] as char
})
.collect()
}
pub fn generate_code_challenge(verifier: &str) -> String {
use sha2::{Sha256, Digest};
use base64::{Engine as _, engine::general_purpose};
let mut hasher = Sha256::new();
hasher.update(verifier.as_bytes());
let result = hasher.finalize();
general_purpose::URL_SAFE_NO_PAD.encode(result)
}
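`generate_authorization_url` relies on the `urlencoding` crate to make each parameter value query-safe. The transformation it depends on can be sketched with std only — the `encode` helper below is hypothetical, shown for illustration of what `urlencoding::encode` does (percent-encode everything except the unreserved set):

```rust
// Percent-encode all bytes except RFC 3986 unreserved characters —
// a std-only sketch of the urlencoding::encode call used above.
fn encode(v: &str) -> String {
    v.bytes()
        .map(|b| match b {
            b'A'..=b'Z' | b'a'..=b'z' | b'0'..=b'9' | b'-' | b'.' | b'_' | b'~' => {
                (b as char).to_string()
            }
            _ => format!("%{:02X}", b),
        })
        .collect()
}

fn main() {
    let params = [
        ("response_type", "code"),
        ("redirect_uri", "https://example.com/cb"),
    ];
    let query: String = params
        .iter()
        .map(|(k, v)| format!("{}={}", k, encode(v)))
        .collect::<Vec<_>>()
        .join("&");
    assert_eq!(query, "response_type=code&redirect_uri=https%3A%2F%2Fexample.com%2Fcb");
    println!("{}", query);
}
```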

211
src/build.rs Normal file
View File

@@ -0,0 +1,211 @@
use anyhow::{Context, Result};
use pulldown_cmark::{html, Parser};
use serde::{Deserialize, Serialize};
use std::fs;
use crate::config::Config;
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
struct ListRecordsResponse {
records: Vec<Record>,
cursor: Option<String>,
}
#[derive(Debug, Deserialize, Clone)]
#[allow(dead_code)]
struct Record {
uri: String,
cid: String,
value: PostRecord,
}
#[derive(Debug, Deserialize, Serialize, Clone)]
struct PostRecord {
title: String,
content: String,
#[serde(rename = "createdAt")]
created_at: String,
}
pub async fn execute() -> Result<()> {
let mut config = Config::load()?;
// Refresh session before API calls
crate::refresh::refresh_session(&mut config).await?;
println!("Building static site from atproto records...");
let pds_url = format!("https://{}", config.pds);
let client = reqwest::Client::new();
// List records
let list_url = format!(
"{}/xrpc/com.atproto.repo.listRecords?repo={}&collection=ai.syui.log.post&limit=100",
pds_url, config.did
);
let res: ListRecordsResponse = client
.get(&list_url)
.send()
.await
.context("Failed to list records")?
.json()
.await
.context("Failed to parse listRecords response")?;
println!("Found {} posts", res.records.len());
// Create output directory
fs::create_dir_all("./public")?;
fs::create_dir_all("./public/posts")?;
// Generate index.html
let mut index_html = String::from(
r#"<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>Blog Posts</title>
<style>
body { font-family: sans-serif; max-width: 800px; margin: 0 auto; padding: 2rem; }
.nav { margin-bottom: 2rem; padding: 1rem; background: #f5f5f5; border-radius: 4px; }
.nav a { margin-right: 1rem; color: #0066cc; text-decoration: none; }
.nav a:hover { text-decoration: underline; }
</style>
</head>
<body>
<div class="nav">
<a href="/pds/">🔍 PDS Browser</a>
</div>
<h1>Posts</h1>
<ul>
"#,
);
for record in &res.records {
let rkey = record.uri.split('/').last().unwrap();
index_html.push_str(&format!(
r#" <li><a href="/posts/{}.html">{}</a></li>
"#,
rkey, record.value.title
));
// Generate individual post page
let parser = Parser::new(&record.value.content);
let mut html_output = String::new();
html::push_html(&mut html_output, parser);
let post_html = format!(
r#"<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>{}</title>
</head>
<body>
<h1>{}</h1>
<div>{}</div>
<p><a href="/">← Back to list</a></p>
</body>
</html>"#,
record.value.title, record.value.title, html_output
);
fs::write(format!("./public/posts/{}.html", rkey), post_html)?;
println!(" ✓ Generated: posts/{}.html", rkey);
}
index_html.push_str(
r#" </ul>
</body>
</html>"#,
);
fs::write("./public/index.html", index_html)?;
println!(" ✓ Generated: index.html");
// Build browser app
println!("\nBuilding AT Browser...");
build_browser().await?;
println!("\nDone! Site generated in ./public/");
println!(" - Blog: ./public/index.html");
println!(" - PDS Browser: ./public/pds/index.html");
Ok(())
}
async fn build_browser() -> Result<()> {
use std::process::Command;
let browser_dir = "./pds";
// Check if pds directory exists
if !std::path::Path::new(browser_dir).exists() {
println!(" ⚠ PDS directory not found, skipping");
return Ok(());
}
// Run npm install if node_modules doesn't exist
if !std::path::Path::new(&format!("{}/node_modules", browser_dir)).exists() {
println!(" → Running npm install...");
let status = Command::new("npm")
.arg("install")
.current_dir(browser_dir)
.status()
.context("Failed to run npm install")?;
if !status.success() {
anyhow::bail!("npm install failed");
}
}
// Run npm run build
println!(" → Running npm run build...");
let status = Command::new("npm")
.arg("run")
.arg("build")
.current_dir(browser_dir)
.status()
.context("Failed to run npm run build")?;
if !status.success() {
anyhow::bail!("npm run build failed");
}
// Copy dist to public/pds
let dist_dir = format!("{}/dist", browser_dir);
let target_dir = "./public/pds";
if std::path::Path::new(&dist_dir).exists() {
fs::create_dir_all(target_dir)?;
copy_dir_all(&dist_dir, target_dir)?;
println!(" ✓ PDS browser deployed to ./public/pds/");
} else {
println!(" ⚠ dist directory not found");
}
Ok(())
}
fn copy_dir_all(src: &str, dst: &str) -> Result<()> {
use walkdir::WalkDir;
for entry in WalkDir::new(src) {
let entry = entry?;
let path = entry.path();
let relative = path.strip_prefix(src)?;
let target = std::path::Path::new(dst).join(relative);
if path.is_dir() {
fs::create_dir_all(&target)?;
} else {
if let Some(parent) = target.parent() {
fs::create_dir_all(parent)?;
}
fs::copy(path, &target)?;
}
}
Ok(())
}

View File

@@ -1,22 +0,0 @@
use anyhow::Result;
use colored::Colorize;
use std::path::PathBuf;
use crate::generator::Generator;
use crate::config::Config;
pub async fn execute(path: PathBuf) -> Result<()> {
println!("{}", "Building blog...".green());
// Load configuration
let config = Config::load(&path)?;
// Create generator
let generator = Generator::new(path, config)?;
// Build the site
generator.build().await?;
println!("{}", "Build completed successfully!".green().bold());
Ok(())
}

View File

@@ -1,21 +0,0 @@
use anyhow::Result;
use colored::Colorize;
use std::fs;
use std::path::Path;
pub async fn execute() -> Result<()> {
println!("{}", "Cleaning build artifacts...".yellow());
let public_dir = Path::new("public");
if public_dir.exists() {
fs::remove_dir_all(public_dir)?;
println!("{} public directory", "Removed".cyan());
} else {
println!("No build artifacts to clean");
}
println!("{}", "Clean completed!".green().bold());
Ok(())
}

View File

@@ -1,216 +0,0 @@
use anyhow::Result;
use colored::Colorize;
use std::fs;
use std::path::PathBuf;
pub async fn execute(path: PathBuf) -> Result<()> {
println!("{}", "Initializing new blog...".green());
// Create directory structure
let dirs = vec![
"content",
"content/posts",
"templates",
"static",
"static/css",
"static/js",
"static/images",
"public",
];
for dir in dirs {
let dir_path = path.join(dir);
fs::create_dir_all(&dir_path)?;
println!(" {} {}", "Created".cyan(), dir_path.display());
}
// Create default config
let config_content = r#"[site]
title = "My Blog"
description = "A blog powered by ailog"
base_url = "https://example.com"
language = "ja"
[build]
highlight_code = true
minify = false
[ai]
enabled = false
auto_translate = false
comment_moderation = false
"#;
fs::write(path.join("config.toml"), config_content)?;
println!(" {} config.toml", "Created".cyan());
// Create default template
let base_template = r#"<!DOCTYPE html>
<html lang="{{ config.language }}">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>{% block title %}{{ config.title }}{% endblock %}</title>
<link rel="stylesheet" href="/css/style.css">
</head>
<body>
<header>
<h1><a href="/">{{ config.title }}</a></h1>
<p>{{ config.description }}</p>
</header>
<main>
{% block content %}{% endblock %}
</main>
<footer>
<p>&copy; 2025 {{ config.title }}</p>
</footer>
</body>
</html>"#;
fs::write(path.join("templates/base.html"), base_template)?;
println!(" {} templates/base.html", "Created".cyan());
let index_template = r#"{% extends "base.html" %}
{% block content %}
<h2>Recent Posts</h2>
<ul class="post-list">
{% for post in posts %}
<li>
<a href="{{ post.url }}">{{ post.title }}</a>
<time>{{ post.date }}</time>
</li>
{% endfor %}
</ul>
{% endblock %}"#;
fs::write(path.join("templates/index.html"), index_template)?;
println!(" {} templates/index.html", "Created".cyan());
let post_template = r#"{% extends "base.html" %}
{% block title %}{{ post.title }} - {{ config.title }}{% endblock %}
{% block content %}
<article>
<h1>{{ post.title }}</h1>
<time>{{ post.date }}</time>
<div class="content">
{{ post.content | safe }}
</div>
</article>
{% endblock %}"#;
fs::write(path.join("templates/post.html"), post_template)?;
println!(" {} templates/post.html", "Created".cyan());
// Create default CSS
let css_content = r#"body {
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
line-height: 1.6;
color: #333;
max-width: 800px;
margin: 0 auto;
padding: 20px;
}
header {
margin-bottom: 40px;
border-bottom: 1px solid #eee;
padding-bottom: 20px;
}
header h1 {
margin: 0;
}
header h1 a {
color: #333;
text-decoration: none;
}
.post-list {
list-style: none;
padding: 0;
}
.post-list li {
margin-bottom: 15px;
}
.post-list time {
color: #666;
font-size: 0.9em;
margin-left: 10px;
}
article time {
color: #666;
display: block;
margin-bottom: 20px;
}
pre {
background-color: #f4f4f4;
padding: 15px;
border-radius: 5px;
overflow-x: auto;
}
code {
background-color: #f4f4f4;
padding: 2px 5px;
border-radius: 3px;
font-family: 'Consolas', 'Monaco', monospace;
}"#;
fs::write(path.join("static/css/style.css"), css_content)?;
println!(" {} static/css/style.css", "Created".cyan());
// Create sample post
let sample_post = r#"---
title: "Welcome to ailog"
date: 2025-01-06
tags: ["welcome", "ailog"]
---
# Welcome to ailog
This is your first post powered by **ailog** - a static blog generator with AI features.
## Features
- Fast static site generation
- Markdown support with frontmatter
- AI-powered features (coming soon)
- atproto integration for comments
## Getting Started
Create new posts with:
```bash
ailog new "My New Post"
```
Build your blog with:
```bash
ailog build
```
Happy blogging!"#;
fs::write(path.join("content/posts/welcome.md"), sample_post)?;
println!(" {} content/posts/welcome.md", "Created".cyan());
println!("\n{}", "Blog initialized successfully!".green().bold());
println!("\nNext steps:");
println!(" 1. cd {}", path.display());
println!(" 2. ailog build");
println!(" 3. ailog serve");
Ok(())
}

View File

@@ -1,5 +0,0 @@
pub mod init;
pub mod build;
pub mod new;
pub mod serve;
pub mod clean;

View File

@@ -1,48 +0,0 @@
use anyhow::Result;
use chrono::Local;
use colored::Colorize;
use std::fs;
use std::path::PathBuf;
pub async fn execute(title: String, format: String) -> Result<()> {
println!("{} {}", "Creating new post:".green(), title);
let date = Local::now();
let filename = format!(
"{}-{}.{}",
date.format("%Y-%m-%d"),
title.to_lowercase().replace(' ', "-"),
format
);
let content = format!(
r#"---
title: "{}"
date: {}
tags: []
draft: false
---
# {}
Write your content here...
"#,
title,
date.format("%Y-%m-%d"),
title
);
let post_path = PathBuf::from("content/posts").join(&filename);
// Ensure directory exists
if let Some(parent) = post_path.parent() {
fs::create_dir_all(parent)?;
}
fs::write(&post_path, content)?;
println!("{} {}", "Created:".cyan(), post_path.display());
println!("\nYou can now edit your post at: {}", post_path.display());
Ok(())
}

View File

@@ -1,77 +0,0 @@
use anyhow::Result;
use colored::Colorize;
use std::path::PathBuf;
use tokio::io::{AsyncReadExt, AsyncWriteExt};
use tokio::net::{TcpListener, TcpStream};
pub async fn execute(port: u16) -> Result<()> {
let addr = format!("127.0.0.1:{}", port);
let listener = TcpListener::bind(&addr).await?;
println!("{}", "Starting development server...".green());
println!("Serving at: {}", format!("http://{}", addr).blue().underline());
println!("Press Ctrl+C to stop\n");
loop {
let (stream, _) = listener.accept().await?;
tokio::spawn(handle_connection(stream));
}
}
async fn handle_connection(mut stream: TcpStream) -> Result<()> {
let mut buffer = [0; 1024];
let n = stream.read(&mut buffer).await?;
let request = String::from_utf8_lossy(&buffer[..n]);
let path = parse_request_path(&request);
let (status, content_type, content) = match serve_file(&path).await {
Ok((ct, data)) => ("200 OK", ct, data),
Err(_) => ("404 NOT FOUND", "text/html", b"<h1>404 - Not Found</h1>".to_vec()),
};
let response = format!(
"HTTP/1.1 {}\r\nContent-Type: {}\r\nContent-Length: {}\r\n\r\n",
status,
content_type,
content.len()
);
stream.write_all(response.as_bytes()).await?;
stream.write_all(&content).await?;
stream.flush().await?;
Ok(())
}
fn parse_request_path(request: &str) -> String {
request
.lines()
.next()
.and_then(|line| line.split_whitespace().nth(1))
.unwrap_or("/")
.to_string()
}
async fn serve_file(path: &str) -> Result<(&'static str, Vec<u8>)> {
let file_path = if path == "/" {
PathBuf::from("public/index.html")
} else {
PathBuf::from("public").join(path.trim_start_matches('/'))
};
let content_type = match file_path.extension().and_then(|ext| ext.to_str()) {
Some("html") => "text/html",
Some("css") => "text/css",
Some("js") => "application/javascript",
Some("json") => "application/json",
Some("png") => "image/png",
Some("jpg") | Some("jpeg") => "image/jpeg",
Some("gif") => "image/gif",
Some("svg") => "image/svg+xml",
_ => "text/plain",
};
let content = tokio::fs::read(file_path).await?;
Ok((content_type, content))
}
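The request parsing above takes the second whitespace-separated token of the request's first line (the request target) and falls back to `/`. The same logic, shown standalone against a raw HTTP/1.1 request:

```rust
// Same logic as serve.rs's parse_request_path, isolated for illustration.
fn parse_request_path(request: &str) -> String {
    request
        .lines()
        .next()
        .and_then(|line| line.split_whitespace().nth(1))
        .unwrap_or("/")
        .to_string()
}

fn main() {
    let req = "GET /posts/welcome.html HTTP/1.1\r\nHost: 127.0.0.1:8080\r\n\r\n";
    assert_eq!(parse_request_path(req), "/posts/welcome.html");
    // An empty or malformed request falls back to "/".
    assert_eq!(parse_request_path(""), "/");
    println!("{}", parse_request_path(req));
}
```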

View File

@@ -1,152 +1,71 @@
-use anyhow::Result;
+use anyhow::{Context, Result};
 use serde::{Deserialize, Serialize};
-use std::fs;
-use std::path::{Path, PathBuf};
-use std::env;
+use std::collections::HashMap;
+use std::path::PathBuf;
-#[derive(Debug, Serialize, Deserialize, Clone)]
+#[derive(Debug, Serialize, Deserialize)]
 pub struct Config {
-pub site: SiteConfig,
-pub build: BuildConfig,
-pub ai: Option<AiConfig>,
+pub pds: String,
+pub handle: String,
+pub did: String,
+pub access_jwt: String,
+pub refresh_jwt: String,
 }
 #[derive(Debug, Serialize, Deserialize, Clone)]
-pub struct SiteConfig {
-pub title: String,
-pub description: String,
-pub base_url: String,
-pub language: String,
+pub struct RecordMapping {
+pub rkey: String,
+pub uri: String,
+pub cid: String,
 }
-#[derive(Debug, Serialize, Deserialize, Clone)]
-pub struct BuildConfig {
-pub highlight_code: bool,
-pub minify: bool,
-}
-#[derive(Debug, Serialize, Deserialize, Clone)]
-pub struct AiConfig {
-pub enabled: bool,
-pub auto_translate: bool,
-pub comment_moderation: bool,
-pub api_key: Option<String>,
-pub gpt_endpoint: Option<String>,
-pub atproto_config: Option<AtprotoConfig>,
-}
-#[derive(Debug, Serialize, Deserialize, Clone)]
-pub struct AtprotoConfig {
-pub client_id: String,
-pub redirect_uri: String,
-pub handle_resolver: String,
-}
+pub type Mapping = HashMap<String, RecordMapping>;
 impl Config {
-pub fn load(path: &Path) -> Result<Self> {
-let config_path = path.join("config.toml");
-let content = fs::read_to_string(config_path)?;
-let mut config: Config = toml::from_str(&content)?;
-// Load global config and merge
-if let Ok(global_config) = Self::load_global_config() {
-config = config.merge(global_config);
-}
-// Override with environment variables
-config.override_from_env();
+pub fn config_path() -> Result<PathBuf> {
+let home = dirs::home_dir().context("Failed to get home directory")?;
+let config_dir = home.join(".config/syui/ai/log");
+std::fs::create_dir_all(&config_dir)?;
+Ok(config_dir.join("config.json"))
+}
+pub fn mapping_path() -> Result<PathBuf> {
+let home = dirs::home_dir().context("Failed to get home directory")?;
+let config_dir = home.join(".config/syui/ai/log");
+std::fs::create_dir_all(&config_dir)?;
+Ok(config_dir.join("mapping.json"))
+}
+pub fn load() -> Result<Self> {
+let path = Self::config_path()?;
+let content = std::fs::read_to_string(&path)
+.context("Failed to read config file. Please run 'ailog login' first.")?;
+let config: Config = serde_json::from_str(&content)?;
 Ok(config)
 }
-fn load_global_config() -> Result<Config> {
-let config_dir = Self::global_config_dir();
-let config_path = config_dir.join("config.toml");
-if config_path.exists() {
-let content = fs::read_to_string(config_path)?;
-let config: Config = toml::from_str(&content)?;
-Ok(config)
-} else {
-anyhow::bail!("Global config not found")
-}
-}
+pub fn save(&self) -> Result<()> {
+let path = Self::config_path()?;
+let content = serde_json::to_string_pretty(self)?;
+std::fs::write(&path, content)?;
+println!("Config saved to: {}", path.display());
 Ok(())
 }
-pub fn global_config_dir() -> PathBuf {
-if let Ok(home) = env::var("HOME") {
-PathBuf::from(home).join(".config").join("syui").join("ai").join("log")
-} else {
-PathBuf::from("~/.config/syui/ai/log")
-}
-}
+pub fn load_mapping() -> Result<Mapping> {
+let path = Self::mapping_path()?;
+if !path.exists() {
+return Ok(HashMap::new());
+}
+let content = std::fs::read_to_string(&path)?;
+let mapping: Mapping = serde_json::from_str(&content)?;
+Ok(mapping)
+}
-fn merge(mut self, global: Config) -> Self {
-// Merge AI config
-if let Some(global_ai) = global.ai {
-if let Some(ref mut ai) = self.ai {
-if ai.api_key.is_none() {
-ai.api_key = global_ai.api_key;
-}
-if ai.gpt_endpoint.is_none() {
-ai.gpt_endpoint = global_ai.gpt_endpoint;
-}
-if ai.atproto_config.is_none() {
-ai.atproto_config = global_ai.atproto_config;
-}
-} else {
-self.ai = Some(global_ai);
-}
-}
-self
-}
-fn override_from_env(&mut self) {
-if let Ok(api_key) = env::var("AILOG_API_KEY") {
-if let Some(ref mut ai) = self.ai {
-ai.api_key = Some(api_key);
-}
-}
-if let Ok(endpoint) = env::var("AILOG_GPT_ENDPOINT") {
-if let Some(ref mut ai) = self.ai {
-ai.gpt_endpoint = Some(endpoint);
-}
-}
-}
-pub fn save_global(&self) -> Result<()> {
-let config_dir = Self::global_config_dir();
-fs::create_dir_all(&config_dir)?;
-let config_path = config_dir.join("config.toml");
-let content = toml::to_string_pretty(self)?;
-fs::write(config_path, content)?;
+pub fn save_mapping(mapping: &Mapping) -> Result<()> {
+let path = Self::mapping_path()?;
+let content = serde_json::to_string_pretty(mapping)?;
+std::fs::write(&path, content)?;
 Ok(())
 }
 }
-impl Default for Config {
-fn default() -> Self {
-Self {
-site: SiteConfig {
-title: "My Blog".to_string(),
-description: "A blog powered by ailog".to_string(),
-base_url: "https://example.com".to_string(),
-language: "ja".to_string(),
-},
-build: BuildConfig {
-highlight_code: true,
-minify: false,
-},
-ai: Some(AiConfig {
-enabled: false,
-auto_translate: false,
-comment_moderation: false,
-api_key: None,
-gpt_endpoint: None,
-atproto_config: None,
-}),
-}
-}
-}
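The new `Config::config_path` resolves to `~/.config/syui/ai/log/config.json` via the `dirs` crate. The resolution can be sketched with std only; the sketch below assumes a `HOME` environment variable (which `dirs::home_dir` also consults on Unix) and omits the directory creation the real code performs:

```rust
use std::path::PathBuf;

// Sketch of the path Config::config_path produces; the real code uses
// dirs::home_dir() and also creates the parent directory.
fn config_path() -> Option<PathBuf> {
    std::env::var_os("HOME")
        .map(|home| PathBuf::from(home).join(".config/syui/ai/log/config.json"))
}

fn main() {
    std::env::set_var("HOME", "/tmp/demo-home");
    let path = config_path().expect("HOME not set");
    assert_eq!(
        path,
        PathBuf::from("/tmp/demo-home/.config/syui/ai/log/config.json")
    );
    println!("{}", path.display());
}
```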

89
src/delete.rs Normal file
View File

@@ -0,0 +1,89 @@
use anyhow::{Context, Result};
use serde::{Deserialize, Serialize};
use crate::config::Config;
#[derive(Debug, Serialize)]
struct DeleteRecordRequest {
repo: String,
collection: String,
rkey: String,
}
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
struct ListRecordsResponse {
records: Vec<Record>,
cursor: Option<String>,
}
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
struct Record {
uri: String,
}
pub async fn execute() -> Result<()> {
let mut config = Config::load()?;
// Refresh session before API calls
crate::refresh::refresh_session(&mut config).await?;
let mut mapping = Config::load_mapping()?;
println!("Deleting all records from ai.syui.log.post...");
let pds_url = format!("https://{}", config.pds);
let client = reqwest::Client::new();
// List all records
let list_url = format!(
"{}/xrpc/com.atproto.repo.listRecords?repo={}&collection=ai.syui.log.post&limit=100",
pds_url, config.did
);
let res: ListRecordsResponse = client
.get(&list_url)
.send()
.await
.context("Failed to list records")?
.json()
.await
.context("Failed to parse listRecords response")?;
if res.records.is_empty() {
println!("No records to delete.");
return Ok(());
}
println!("Found {} records to delete", res.records.len());
// Delete each record
for record in &res.records {
let rkey = record.uri.split('/').last().unwrap();
let delete_req = DeleteRecordRequest {
repo: config.did.clone(),
collection: "ai.syui.log.post".to_string(),
rkey: rkey.to_string(),
};
let delete_url = format!("{}/xrpc/com.atproto.repo.deleteRecord", pds_url);
client
.post(&delete_url)
.header("Authorization", format!("Bearer {}", config.access_jwt))
.json(&delete_req)
.send()
.await
.context("Failed to delete record")?;
println!(" ✓ Deleted: {}", rkey);
}
// Clear mapping (all records deleted)
mapping.clear();
Config::save_mapping(&mapping)?;
println!("Mapping cleared.");
println!("Done! All records deleted.");
Ok(())
}
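Both `build` and `delete` pull the rkey out of a record's `at://` URI by taking its last path segment (the `.split('/').last()` calls above). That extraction, isolated as a small helper for illustration:

```rust
// The rkey is the final path segment of an AT URI:
//   at://<did>/<collection>/<rkey>
fn rkey_from_uri(uri: &str) -> Option<&str> {
    uri.rsplit('/').next()
}

fn main() {
    let uri = "at://did:plc:abc123/ai.syui.log.post/3kxyz";
    assert_eq!(rkey_from_uri(uri), Some("3kxyz"));
    println!("{}", rkey_from_uri(uri).unwrap());
}
```

`rsplit('/').next()` avoids walking the whole string the way `split('/').last()` does, but the result is the same for well-formed URIs.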

View File

@@ -1,295 +0,0 @@
use anyhow::Result;
use colored::Colorize;
use std::path::PathBuf;
use walkdir::WalkDir;
use std::fs;
use crate::config::Config;
use crate::markdown::MarkdownProcessor;
use crate::template::TemplateEngine;
use crate::ai::AiManager;
pub struct Generator {
base_path: PathBuf,
config: Config,
markdown_processor: MarkdownProcessor,
template_engine: TemplateEngine,
ai_manager: Option<AiManager>,
}
impl Generator {
pub fn new(base_path: PathBuf, config: Config) -> Result<Self> {
let markdown_processor = MarkdownProcessor::new(config.build.highlight_code);
let template_engine = TemplateEngine::new(base_path.join("templates"))?;
let ai_manager = if let Some(ref ai_config) = config.ai {
if ai_config.enabled {
Some(AiManager::new(ai_config.clone()))
} else {
None
}
} else {
None
};
Ok(Self {
base_path,
config,
markdown_processor,
template_engine,
ai_manager,
})
}
pub async fn build(&self) -> Result<()> {
// Clean public directory
let public_dir = self.base_path.join("public");
if public_dir.exists() {
fs::remove_dir_all(&public_dir)?;
}
fs::create_dir_all(&public_dir)?;
// Copy static files
self.copy_static_files()?;
// Process posts
let posts = self.process_posts().await?;
// Generate index page
self.generate_index(&posts).await?;
// Generate post pages
for post in &posts {
self.generate_post_page(post).await?;
// Generate translation pages
if let Some(ref translations) = post.translations {
for translation in translations {
self.generate_translation_page(post, translation).await?;
}
}
}
println!("{} {} posts", "Generated".cyan(), posts.len());
Ok(())
}
fn copy_static_files(&self) -> Result<()> {
let static_dir = self.base_path.join("static");
let public_dir = self.base_path.join("public");
if static_dir.exists() {
for entry in WalkDir::new(&static_dir).min_depth(1) {
let entry = entry?;
let path = entry.path();
let relative_path = path.strip_prefix(&static_dir)?;
let dest_path = public_dir.join(relative_path);
if path.is_dir() {
fs::create_dir_all(&dest_path)?;
} else {
if let Some(parent) = dest_path.parent() {
fs::create_dir_all(parent)?;
}
fs::copy(path, &dest_path)?;
}
}
println!("{} static files", "Copied".cyan());
}
Ok(())
}
async fn process_posts(&self) -> Result<Vec<Post>> {
let mut posts = Vec::new();
let posts_dir = self.base_path.join("content/posts");
if posts_dir.exists() {
for entry in WalkDir::new(&posts_dir).min_depth(1) {
let entry = entry?;
let path = entry.path();
if path.is_file() && path.extension().map_or(false, |ext| ext == "md") {
match self.process_single_post(path).await {
Ok(post) => posts.push(post),
Err(e) => eprintln!("Error processing {}: {}", path.display(), e),
}
}
}
}
// Sort posts by date (newest first)
posts.sort_by(|a, b| b.date.cmp(&a.date));
Ok(posts)
}
async fn process_single_post(&self, path: &std::path::Path) -> Result<Post> {
let content = fs::read_to_string(path)?;
let (frontmatter, mut content) = self.markdown_processor.parse_frontmatter(&content)?;
// Apply AI enhancements if enabled
if let Some(ref ai_manager) = self.ai_manager {
// Enhance content with AI
let title = frontmatter.get("title")
.and_then(|v| v.as_str())
.unwrap_or("Untitled");
content = ai_manager.enhance_content(&content, title).await
.unwrap_or_else(|e| {
eprintln!("AI enhancement failed: {}", e);
content
});
}
let html_content = self.markdown_processor.render(&content)?;
let slug = path
.file_stem()
.and_then(|s| s.to_str())
.unwrap_or("post")
.to_string();
let mut post = Post {
title: frontmatter.get("title")
.and_then(|v| v.as_str())
.unwrap_or("Untitled")
.to_string(),
date: frontmatter.get("date")
.and_then(|v| v.as_str())
.unwrap_or("")
.to_string(),
content: html_content,
slug: slug.clone(),
url: format!("/posts/{}.html", slug),
tags: frontmatter.get("tags")
.and_then(|v| v.as_array())
.map(|arr| arr.iter()
.filter_map(|v| v.as_str())
.map(|s| s.to_string())
.collect())
.unwrap_or_default(),
translations: None,
ai_comment: None,
};
// Auto-translate if enabled and post is in Japanese
if let Some(ref ai_manager) = self.ai_manager {
if self.config.ai.as_ref().map_or(false, |ai| ai.auto_translate)
&& self.config.site.language == "ja" {
match ai_manager.translate(&content, "ja", "en").await {
Ok(translated_content) => {
let translated_html = self.markdown_processor.render(&translated_content)?;
let translated_title = ai_manager.translate(&post.title, "ja", "en").await
.unwrap_or_else(|_| post.title.clone());
post.translations = Some(vec![Translation {
lang: "en".to_string(),
title: translated_title,
content: translated_html,
url: format!("/posts/{}-en.html", post.slug),
}]);
}
Err(e) => eprintln!("Translation failed: {}", e),
}
}
// Generate AI comment
if self.config.ai.as_ref().map_or(false, |ai| ai.comment_moderation) {
match ai_manager.generate_comment(&post.title, &content).await {
Ok(Some(comment)) => {
post.ai_comment = Some(comment.content);
}
Ok(None) => {}
Err(e) => eprintln!("AI comment generation failed: {}", e),
}
}
}
Ok(post)
}
async fn generate_index(&self, posts: &[Post]) -> Result<()> {
let context = self.template_engine.create_context(&self.config, posts)?;
let html = self.template_engine.render("index.html", &context)?;
let output_path = self.base_path.join("public/index.html");
fs::write(output_path, html)?;
Ok(())
}
async fn generate_post_page(&self, post: &Post) -> Result<()> {
let mut context = tera::Context::new();
context.insert("config", &self.config.site);
context.insert("post", post);
let html = self.template_engine.render_with_context("post.html", &context)?;
let output_dir = self.base_path.join("public/posts");
fs::create_dir_all(&output_dir)?;
let output_path = output_dir.join(format!("{}.html", post.slug));
fs::write(output_path, html)?;
Ok(())
}
async fn generate_translation_page(&self, post: &Post, translation: &Translation) -> Result<()> {
let mut context = tera::Context::new();
context.insert("config", &self.config.site);
context.insert("post", &TranslatedPost {
title: translation.title.clone(),
date: post.date.clone(),
content: translation.content.clone(),
slug: post.slug.clone(),
url: translation.url.clone(),
tags: post.tags.clone(),
original_url: post.url.clone(),
lang: translation.lang.clone(),
});
let html = self.template_engine.render_with_context("post.html", &context)?;
let output_dir = self.base_path.join("public/posts");
fs::create_dir_all(&output_dir)?;
let output_path = output_dir.join(format!("{}-{}.html", post.slug, translation.lang));
fs::write(output_path, html)?;
Ok(())
}
}
#[derive(Debug, Clone, serde::Serialize)]
struct TranslatedPost {
pub title: String,
pub date: String,
pub content: String,
pub slug: String,
pub url: String,
pub tags: Vec<String>,
pub original_url: String,
pub lang: String,
}
#[derive(Debug, Clone, serde::Serialize)]
pub struct Post {
pub title: String,
pub date: String,
pub content: String,
pub slug: String,
pub url: String,
pub tags: Vec<String>,
pub translations: Option<Vec<Translation>>,
pub ai_comment: Option<String>,
}
#[derive(Debug, Clone, serde::Serialize)]
pub struct Translation {
pub lang: String,
pub title: String,
pub content: String,
pub url: String,
}

src/login.rs (new file)
@@ -0,0 +1,83 @@
use anyhow::{Context, Result};
use serde::{Deserialize, Serialize};
use crate::config::Config;
#[derive(Debug, Serialize)]
struct CreateSessionRequest {
identifier: String,
password: String,
}
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
struct CreateSessionResponse {
#[serde(rename = "accessJwt")]
access_jwt: String,
#[serde(rename = "refreshJwt")]
refresh_jwt: String,
handle: String,
did: String,
}
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
struct DescribeRepoResponse {
handle: String,
did: String,
}
pub async fn execute(handle: &str, password: &str, pds: &str) -> Result<()> {
println!("Logging in as {} to {}...", handle, pds);
// Resolve handle to DID
let pds_url = format!("https://{}", pds);
let describe_url = format!(
"{}/xrpc/com.atproto.repo.describeRepo?repo={}",
pds_url, handle
);
let client = reqwest::Client::new();
let describe_res: DescribeRepoResponse = client
.get(&describe_url)
.send()
.await
.context("Failed to resolve handle")?
.error_for_status()
.context("describeRepo returned an error status")?
.json()
.await
.context("Failed to parse describeRepo response")?;
println!("Resolved handle to DID: {}", describe_res.did);
// Create session
let session_url = format!("{}/xrpc/com.atproto.server.createSession", pds_url);
let session_req = CreateSessionRequest {
identifier: handle.to_string(),
password: password.to_string(),
};
let session_res: CreateSessionResponse = client
.post(&session_url)
.json(&session_req)
.send()
.await
.context("Failed to create session")?
.error_for_status()
.context("createSession returned an error status (check handle and password)")?
.json()
.await
.context("Failed to parse createSession response")?;
println!("Successfully authenticated!");
// Save config
let config = Config {
pds: pds.to_string(),
handle: handle.to_string(),
did: session_res.did,
access_jwt: session_res.access_jwt,
refresh_jwt: session_res.refresh_jwt,
};
config.save()?;
Ok(())
}
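Both requests above target endpoints of the form `https://<pds>/xrpc/<nsid>`. As a small illustrative sketch (this helper is not in the source), the endpoint construction can be factored out:

```rust
// Hypothetical helper: builds an XRPC endpoint URL from a PDS host and an NSID.
fn xrpc_url(pds: &str, nsid: &str) -> String {
    format!("https://{}/xrpc/{}", pds, nsid)
}

fn main() {
    // The two endpoints login::execute calls:
    println!("{}", xrpc_url("syu.is", "com.atproto.repo.describeRepo"));
    println!("{}", xrpc_url("syu.is", "com.atproto.server.createSession"));
}
```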

@@ -1,20 +1,17 @@
 use anyhow::Result;
 use clap::{Parser, Subcommand};
-use std::path::PathBuf;
-mod commands;
-mod generator;
-mod markdown;
-mod template;
 mod config;
-mod ai;
-mod atproto;
-mod mcp;
+mod login;
+mod post;
+mod build;
+mod delete;
+mod refresh;
+mod serve;
 #[derive(Parser)]
 #[command(name = "ailog")]
-#[command(about = "A static blog generator with AI features")]
 #[command(version)]
+#[command(about = "A simple static blog generator with atproto integration")]
 struct Cli {
     #[command(subcommand)]
     command: Commands,
@@ -22,43 +19,34 @@ struct Cli {
 #[derive(Subcommand)]
 enum Commands {
-    /// Initialize a new blog
-    Init {
-        /// Path to create the blog
-        #[arg(default_value = ".")]
-        path: PathBuf,
-    },
+    /// Login to atproto PDS
+    #[command(alias = "l")]
+    Login {
+        /// Handle (e.g., ai.syui.ai)
+        handle: String,
+        /// Password
+        #[arg(short, long)]
+        password: String,
+        /// PDS server (e.g., syu.is, bsky.social)
+        #[arg(short = 's', long, default_value = "syu.is")]
+        pds: String,
+    },
-    /// Build the blog
-    Build {
-        /// Path to the blog directory
-        #[arg(default_value = ".")]
-        path: PathBuf,
-    },
-    /// Create a new post
-    New {
-        /// Title of the post
-        title: String,
-        /// Post format
-        #[arg(short, long, default_value = "md")]
-        format: String,
-    },
-    /// Serve the blog locally
+    /// Post markdown files to atproto
+    #[command(alias = "p")]
+    Post,
+    /// Build static site from atproto records
+    #[command(alias = "b")]
+    Build,
+    /// Delete all records from atproto
+    #[command(alias = "d")]
+    Delete,
+    /// Start local preview server
+    #[command(alias = "s")]
     Serve {
-        /// Port to serve on
-        #[arg(short, long, default_value = "8080")]
+        /// Port number
+        #[arg(short, long, default_value = "3000")]
        port: u16,
     },
-    /// Clean build artifacts
-    Clean,
-    /// Start MCP server for ai.gpt integration
-    Mcp {
-        /// Port to serve MCP on
-        #[arg(short, long, default_value = "8002")]
-        port: u16,
-        /// Path to the blog directory
-        #[arg(default_value = ".")]
-        path: PathBuf,
-    },
 }
#[tokio::main]
@@ -66,27 +54,22 @@ async fn main() -> Result<()> {
     let cli = Cli::parse();
     match cli.command {
-        Commands::Init { path } => {
-            commands::init::execute(path).await?;
+        Commands::Login { handle, password, pds } => {
+            login::execute(&handle, &password, &pds).await?;
         }
-        Commands::Build { path } => {
-            commands::build::execute(path).await?;
+        Commands::Post => {
+            post::execute().await?;
         }
-        Commands::New { title, format } => {
-            commands::new::execute(title, format).await?;
+        Commands::Build => {
+            build::execute().await?;
         }
+        Commands::Delete => {
+            delete::execute().await?;
+        }
         Commands::Serve { port } => {
-            commands::serve::execute(port).await?;
-        }
-        Commands::Clean => {
-            commands::clean::execute().await?;
-        }
-        Commands::Mcp { port, path } => {
-            use crate::mcp::McpServer;
-            let server = McpServer::new(path);
-            server.serve(port).await?;
+            serve::execute(port).await?;
         }
     }
     Ok(())
 }

@@ -1,138 +0,0 @@
use anyhow::Result;
use pulldown_cmark::{html, Options, Parser, CodeBlockKind};
use syntect::parsing::SyntaxSet;
use syntect::highlighting::ThemeSet;
use syntect::html::{styled_line_to_highlighted_html, IncludeBackground};
use gray_matter::Matter;
use gray_matter::engine::YAML;
use serde_json::Value;
pub struct MarkdownProcessor {
highlight_code: bool,
syntax_set: SyntaxSet,
theme_set: ThemeSet,
}
impl MarkdownProcessor {
pub fn new(highlight_code: bool) -> Self {
Self {
highlight_code,
syntax_set: SyntaxSet::load_defaults_newlines(),
theme_set: ThemeSet::load_defaults(),
}
}
pub fn parse_frontmatter(&self, content: &str) -> Result<(serde_json::Map<String, Value>, String)> {
let matter = Matter::<YAML>::new();
let result = matter.parse(content);
let frontmatter = result.data
.and_then(|pod| pod.as_hashmap().ok())
.map(|map| {
let mut json_map = serde_json::Map::new();
for (k, v) in map {
// Keys in hashmap are already strings
let value = self.pod_to_json_value(v);
json_map.insert(k, value);
}
json_map
})
.unwrap_or_default();
Ok((frontmatter, result.content))
}
fn pod_to_json_value(&self, pod: gray_matter::Pod) -> Value {
match pod {
gray_matter::Pod::Null => Value::Null,
gray_matter::Pod::Boolean(b) => Value::Bool(b),
gray_matter::Pod::Integer(i) => Value::Number(serde_json::Number::from(i)),
gray_matter::Pod::Float(f) => serde_json::Number::from_f64(f)
.map(Value::Number)
.unwrap_or(Value::Null),
gray_matter::Pod::String(s) => Value::String(s),
gray_matter::Pod::Array(arr) => {
Value::Array(arr.into_iter().map(|p| self.pod_to_json_value(p)).collect())
}
gray_matter::Pod::Hash(map) => {
let mut json_map = serde_json::Map::new();
for (k, v) in map {
json_map.insert(k, self.pod_to_json_value(v));
}
Value::Object(json_map)
}
}
}
pub fn render(&self, content: &str) -> Result<String> {
let mut options = Options::empty();
options.insert(Options::ENABLE_STRIKETHROUGH);
options.insert(Options::ENABLE_TABLES);
options.insert(Options::ENABLE_FOOTNOTES);
options.insert(Options::ENABLE_TASKLISTS);
if self.highlight_code {
self.render_with_syntax_highlighting(content, options)
} else {
let parser = Parser::new_ext(content, options);
let mut html_output = String::new();
html::push_html(&mut html_output, parser);
Ok(html_output)
}
}
fn render_with_syntax_highlighting(&self, content: &str, options: Options) -> Result<String> {
let parser = Parser::new_ext(content, options);
let mut html_output = String::new();
let mut code_block = None;
let theme = &self.theme_set.themes["base16-ocean.dark"];
let mut events = Vec::new();
for event in parser {
match event {
pulldown_cmark::Event::Start(pulldown_cmark::Tag::CodeBlock(kind)) => {
// Track both fenced and indented blocks; an indented block gets an
// empty language token and falls back to plain-text highlighting.
let lang = match &kind {
CodeBlockKind::Fenced(lang) => lang.to_string(),
CodeBlockKind::Indented => String::new(),
};
code_block = Some((String::new(), lang));
}
pulldown_cmark::Event::Text(text) => {
if let Some((ref mut code, _)) = code_block {
code.push_str(&text);
} else {
events.push(pulldown_cmark::Event::Text(text));
}
}
pulldown_cmark::Event::End(pulldown_cmark::TagEnd::CodeBlock) => {
if let Some((code, lang)) = code_block.take() {
let highlighted = self.highlight_code_block(&code, &lang, theme);
events.push(pulldown_cmark::Event::Html(highlighted.into()));
}
}
_ => events.push(event),
}
}
html::push_html(&mut html_output, events.into_iter());
Ok(html_output)
}
fn highlight_code_block(&self, code: &str, lang: &str, theme: &syntect::highlighting::Theme) -> String {
let syntax = self.syntax_set
.find_syntax_by_token(lang)
.unwrap_or_else(|| self.syntax_set.find_syntax_plain_text());
let mut highlighter = syntect::easy::HighlightLines::new(syntax, theme);
let mut output = String::from("<pre><code>");
for line in code.lines() {
let ranges = highlighter.highlight_line(line, &self.syntax_set).unwrap();
let html_line = styled_line_to_highlighted_html(&ranges[..], IncludeBackground::No).unwrap();
output.push_str(&html_line);
output.push('\n');
}
output.push_str("</code></pre>");
output
}
}
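For reference, `parse_frontmatter` above consumes the usual `---`-delimited YAML block. A minimal hand-rolled split behaves like the following (a sketch only; gray_matter also handles CRLF line endings and other edge cases this does not):

```rust
// Sketch of the frontmatter layout parse_frontmatter expects.
// Returns (Some(frontmatter), body) when a leading "---" block exists.
fn split_frontmatter(input: &str) -> (Option<&str>, &str) {
    if let Some(rest) = input.strip_prefix("---\n") {
        if let Some((fm, body)) = rest.split_once("\n---\n") {
            return (Some(fm), body);
        }
    }
    (None, input)
}

fn main() {
    let doc = "---\ntitle: Hello\ndate: 2025-06-06\n---\n# Body";
    println!("{:?}", split_frontmatter(doc));
}
```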

@@ -1,6 +0,0 @@
pub mod server;
pub mod tools;
pub mod types;
pub use server::McpServer;
pub use types::*;

@@ -1,148 +0,0 @@
use anyhow::Result;
use axum::{
extract::State,
http::StatusCode,
response::Json,
routing::{get, post},
Router,
};
use serde_json::{json, Value};
use std::path::PathBuf;
use std::sync::Arc;
use tower_http::cors::CorsLayer;
use crate::mcp::types::*;
use crate::mcp::tools::BlogTools;
#[derive(Clone)]
pub struct AppState {
blog_tools: Arc<BlogTools>,
}
pub struct McpServer {
app_state: AppState,
}
impl McpServer {
pub fn new(base_path: PathBuf) -> Self {
let blog_tools = Arc::new(BlogTools::new(base_path));
let app_state = AppState { blog_tools };
Self { app_state }
}
pub fn create_router(&self) -> Router {
Router::new()
.route("/", get(root_handler))
.route("/mcp/tools/list", get(list_tools))
.route("/mcp/tools/call", post(call_tool))
.route("/health", get(health_check))
.layer(CorsLayer::permissive())
.with_state(self.app_state.clone())
}
pub async fn serve(&self, port: u16) -> Result<()> {
let app = self.create_router();
let listener = tokio::net::TcpListener::bind(format!("0.0.0.0:{}", port)).await?;
println!("ai.log MCP Server listening on port {}", port);
axum::serve(listener, app).await?;
Ok(())
}
}
async fn root_handler() -> Json<Value> {
Json(json!({
"name": "ai.log MCP Server",
"version": "0.1.0",
"description": "AI-powered static blog generator with MCP integration",
"tools": ["create_blog_post", "list_blog_posts", "build_blog", "get_post_content"]
}))
}
async fn health_check() -> Json<Value> {
Json(json!({
"status": "healthy",
"timestamp": chrono::Utc::now().to_rfc3339()
}))
}
async fn list_tools() -> Json<Value> {
let tools = BlogTools::get_tools();
Json(json!({
"tools": tools
}))
}
async fn call_tool(
State(state): State<AppState>,
Json(request): Json<McpRequest>,
) -> Result<Json<McpResponse>, StatusCode> {
let tool_name = request.params
.as_ref()
.and_then(|p| p.get("name"))
.and_then(|v| v.as_str())
.ok_or(StatusCode::BAD_REQUEST)?;
let arguments = request.params
.as_ref()
.and_then(|p| p.get("arguments"))
.cloned()
.unwrap_or(json!({}));
let result = match tool_name {
"create_blog_post" => {
let req: CreatePostRequest = serde_json::from_value(arguments)
.map_err(|_| StatusCode::BAD_REQUEST)?;
state.blog_tools.create_post(req).await
}
"list_blog_posts" => {
let req: ListPostsRequest = serde_json::from_value(arguments)
.map_err(|_| StatusCode::BAD_REQUEST)?;
state.blog_tools.list_posts(req).await
}
"build_blog" => {
let req: BuildRequest = serde_json::from_value(arguments)
.map_err(|_| StatusCode::BAD_REQUEST)?;
state.blog_tools.build_blog(req).await
}
"get_post_content" => {
let slug = arguments.get("slug")
.and_then(|v| v.as_str())
.ok_or(StatusCode::BAD_REQUEST)?;
state.blog_tools.get_post_content(slug).await
}
_ => {
return Ok(Json(McpResponse {
jsonrpc: "2.0".to_string(),
id: request.id,
result: None,
error: Some(McpError {
code: -32601,
message: format!("Method not found: {}", tool_name),
data: None,
}),
}));
}
};
match result {
Ok(tool_result) => Ok(Json(McpResponse {
jsonrpc: "2.0".to_string(),
id: request.id,
result: Some(serde_json::to_value(tool_result).unwrap()),
error: None,
})),
Err(e) => Ok(Json(McpResponse {
jsonrpc: "2.0".to_string(),
id: request.id,
result: None,
error: Some(McpError {
code: -32000,
message: e.to_string(),
data: None,
}),
})),
}
}

@@ -1,299 +0,0 @@
use anyhow::Result;
use serde_json::{json, Value};
use std::path::PathBuf;
use std::fs;
use chrono::Local;
use crate::mcp::types::*;
use crate::generator::Generator;
use crate::config::Config;
pub struct BlogTools {
base_path: PathBuf,
}
impl BlogTools {
pub fn new(base_path: PathBuf) -> Self {
Self { base_path }
}
pub async fn create_post(&self, request: CreatePostRequest) -> Result<ToolResult> {
let posts_dir = self.base_path.join("content/posts");
// Generate slug if not provided
let slug = request.slug.unwrap_or_else(|| {
request.title
.chars()
.map(|c| if c.is_alphanumeric() || c == ' ' { c.to_lowercase().to_string() } else { "".to_string() })
.collect::<String>()
.split_whitespace()
.collect::<Vec<_>>()
.join("-")
});
let date = Local::now().format("%Y-%m-%d").to_string();
let filename = format!("{}-{}.md", date, slug);
let filepath = posts_dir.join(&filename);
// Create frontmatter
let mut frontmatter = format!(
"---\ntitle: {}\ndate: {}\n",
request.title, date
);
if let Some(tags) = request.tags {
if !tags.is_empty() {
frontmatter.push_str(&format!("tags: {:?}\n", tags));
}
}
frontmatter.push_str("---\n\n");
// Create full content
let full_content = format!("{}{}", frontmatter, request.content);
// Ensure directory exists
fs::create_dir_all(&posts_dir)?;
// Write file
fs::write(&filepath, full_content)?;
Ok(ToolResult {
content: vec![Content {
content_type: "text".to_string(),
text: format!("Post created successfully: {}", filename),
}],
is_error: None,
})
}
pub async fn list_posts(&self, request: ListPostsRequest) -> Result<ToolResult> {
let posts_dir = self.base_path.join("content/posts");
if !posts_dir.exists() {
return Ok(ToolResult {
content: vec![Content {
content_type: "text".to_string(),
text: "No posts directory found".to_string(),
}],
is_error: Some(true),
});
}
let mut posts = Vec::new();
for entry in fs::read_dir(&posts_dir)? {
let entry = entry?;
let path = entry.path();
if path.is_file() && path.extension().map_or(false, |ext| ext == "md") {
if let Ok(content) = fs::read_to_string(&path) {
// Parse frontmatter
if let Some(rest) = content.strip_prefix("---\n") {
if let Some((frontmatter_content, _)) = rest.split_once("---\n") {
// Simple YAML parsing for basic fields
let mut title = "Untitled".to_string();
let mut date = "Unknown".to_string();
let mut tags = Vec::new();
for line in frontmatter_content.lines() {
if let Some((key, value)) = line.split_once(':') {
let key = key.trim();
let value = value.trim();
match key {
"title" => title = value.to_string(),
"date" => date = value.to_string(),
"tags" => {
// Simple array parsing
if value.starts_with('[') && value.ends_with(']') {
let tags_str = &value[1..value.len()-1];
tags = tags_str.split(',')
.map(|s| s.trim().trim_matches('"').to_string())
.collect();
}
}
_ => {}
}
}
}
let slug = path.file_stem()
.and_then(|s| s.to_str())
.unwrap_or("unknown")
.to_string();
posts.push(PostInfo {
title,
slug: slug.clone(),
date,
tags,
url: format!("/posts/{}.html", slug),
});
}
}
}
}
}
// Apply pagination (report the total count before slicing)
let offset = request.offset.unwrap_or(0);
let limit = request.limit.unwrap_or(10);
posts.sort_by(|a, b| b.date.cmp(&a.date));
let total = posts.len();
let paginated_posts: Vec<_> = posts.into_iter()
.skip(offset)
.take(limit)
.collect();
let result = json!({
"posts": paginated_posts,
"total": total
});
Ok(ToolResult {
content: vec![Content {
content_type: "text".to_string(),
text: serde_json::to_string_pretty(&result)?,
}],
is_error: None,
})
}
pub async fn build_blog(&self, request: BuildRequest) -> Result<ToolResult> {
// Load configuration
let config = Config::load(&self.base_path)?;
// Create generator
let generator = Generator::new(self.base_path.clone(), config)?;
// Build the blog
generator.build().await?;
let message = if request.enable_ai.unwrap_or(false) {
"Blog built successfully with AI features enabled"
} else {
"Blog built successfully"
};
Ok(ToolResult {
content: vec![Content {
content_type: "text".to_string(),
text: message.to_string(),
}],
is_error: None,
})
}
pub async fn get_post_content(&self, slug: &str) -> Result<ToolResult> {
let posts_dir = self.base_path.join("content/posts");
// Find file by slug
for entry in fs::read_dir(&posts_dir)? {
let entry = entry?;
let path = entry.path();
if path.is_file() && path.extension().map_or(false, |ext| ext == "md") {
if let Some(filename) = path.file_stem().and_then(|s| s.to_str()) {
if filename.contains(slug) {
let content = fs::read_to_string(&path)?;
return Ok(ToolResult {
content: vec![Content {
content_type: "text".to_string(),
text: content,
}],
is_error: None,
});
}
}
}
}
Ok(ToolResult {
content: vec![Content {
content_type: "text".to_string(),
text: format!("Post with slug '{}' not found", slug),
}],
is_error: Some(true),
})
}
pub fn get_tools() -> Vec<Tool> {
vec![
Tool {
name: "create_blog_post".to_string(),
description: "Create a new blog post with title, content, and optional tags".to_string(),
input_schema: json!({
"type": "object",
"properties": {
"title": {
"type": "string",
"description": "The title of the blog post"
},
"content": {
"type": "string",
"description": "The content of the blog post in Markdown format"
},
"tags": {
"type": "array",
"items": {"type": "string"},
"description": "Optional tags for the blog post"
},
"slug": {
"type": "string",
"description": "Optional custom slug for the post URL"
}
},
"required": ["title", "content"]
}),
},
Tool {
name: "list_blog_posts".to_string(),
description: "List existing blog posts with pagination".to_string(),
input_schema: json!({
"type": "object",
"properties": {
"limit": {
"type": "integer",
"description": "Maximum number of posts to return (default: 10)"
},
"offset": {
"type": "integer",
"description": "Number of posts to skip (default: 0)"
}
}
}),
},
Tool {
name: "build_blog".to_string(),
description: "Build the static blog with AI features".to_string(),
input_schema: json!({
"type": "object",
"properties": {
"enable_ai": {
"type": "boolean",
"description": "Enable AI features during build (default: false)"
},
"translate": {
"type": "boolean",
"description": "Enable automatic translation (default: false)"
}
}
}),
},
Tool {
name: "get_post_content".to_string(),
description: "Get the full content of a blog post by slug".to_string(),
input_schema: json!({
"type": "object",
"properties": {
"slug": {
"type": "string",
"description": "The slug of the blog post to retrieve"
}
},
"required": ["slug"]
}),
},
]
}
}
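The slug fallback inside `create_post` above is self-contained enough to lift into a standalone helper, which makes its behavior easy to check:

```rust
// Same logic as the slug fallback in create_post: drop everything except
// alphanumerics and spaces, lowercase, then hyphen-join the words.
fn slugify(title: &str) -> String {
    title
        .chars()
        .map(|c| if c.is_alphanumeric() || c == ' ' {
            c.to_lowercase().to_string()
        } else {
            String::new()
        })
        .collect::<String>()
        .split_whitespace()
        .collect::<Vec<_>>()
        .join("-")
}

fn main() {
    println!("{}", slugify("Hello, World!")); // hello-world
}
```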

@@ -1,79 +0,0 @@
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct McpRequest {
pub jsonrpc: String,
pub id: Option<serde_json::Value>,
pub method: String,
pub params: Option<serde_json::Value>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct McpResponse {
pub jsonrpc: String,
pub id: Option<serde_json::Value>,
#[serde(skip_serializing_if = "Option::is_none")]
pub result: Option<serde_json::Value>,
#[serde(skip_serializing_if = "Option::is_none")]
pub error: Option<McpError>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct McpError {
pub code: i32,
pub message: String,
#[serde(skip_serializing_if = "Option::is_none")]
pub data: Option<serde_json::Value>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Tool {
pub name: String,
pub description: String,
#[serde(rename = "inputSchema")]
pub input_schema: serde_json::Value,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ToolResult {
pub content: Vec<Content>,
#[serde(rename = "isError", skip_serializing_if = "Option::is_none")]
pub is_error: Option<bool>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Content {
#[serde(rename = "type")]
pub content_type: String,
pub text: String,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CreatePostRequest {
pub title: String,
pub content: String,
pub tags: Option<Vec<String>>,
pub slug: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ListPostsRequest {
pub limit: Option<usize>,
pub offset: Option<usize>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PostInfo {
pub title: String,
pub slug: String,
pub date: String,
pub tags: Vec<String>,
pub url: String,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct BuildRequest {
pub enable_ai: Option<bool>,
pub translate: Option<bool>,
}

src/post.rs (new file)
@@ -0,0 +1,172 @@
use anyhow::{Context, Result};
use serde::{Deserialize, Serialize};
use walkdir::WalkDir;
use crate::config::{Config, RecordMapping};
#[derive(Debug, Serialize)]
struct PutRecordRequest {
repo: String,
collection: String,
#[serde(skip_serializing_if = "Option::is_none")]
rkey: Option<String>,
record: PostRecord,
}
#[derive(Debug, Serialize, Clone)]
struct PostRecord {
#[serde(rename = "$type")]
schema_type: String,
title: String,
content: String,
#[serde(rename = "createdAt")]
created_at: String,
}
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
struct PutRecordResponse {
uri: String,
cid: String,
#[serde(default)]
commit: Option<serde_json::Value>,
#[serde(rename = "validationStatus", default)]
validation_status: Option<String>,
}
pub async fn execute() -> Result<()> {
let mut config = Config::load()?;
// Refresh session before API calls
crate::refresh::refresh_session(&mut config).await?;
let mut mapping = Config::load_mapping()?;
println!("Posting markdown files from ./content/post/...");
let pds_url = format!("https://{}", config.pds);
let client = reqwest::Client::new();
// Walk through ./content/post/
for entry in WalkDir::new("./content/post")
.into_iter()
.filter_map(|e| e.ok())
.filter(|e| e.path().extension().and_then(|s| s.to_str()) == Some("md"))
{
let path = entry.path();
let filename = path
.file_name()
.and_then(|s| s.to_str())
.context("Invalid filename")?
.to_string();
println!("Processing: {}", filename);
let content = std::fs::read_to_string(path)?;
// Use filename as title (simplified)
let title = path
.file_stem()
.and_then(|s| s.to_str())
.unwrap_or("Untitled");
// Check if this file already has a mapping
let existing_rkey = mapping.get(&filename).map(|m| m.rkey.clone());
// Create record
let record = PostRecord {
schema_type: "ai.syui.log.post".to_string(),
title: title.to_string(),
content,
created_at: chrono::Utc::now().to_rfc3339(),
};
let res: PutRecordResponse = if let Some(rkey) = existing_rkey.clone() {
// Update existing record with putRecord
let put_req = PutRecordRequest {
repo: config.did.clone(),
collection: "ai.syui.log.post".to_string(),
rkey: Some(rkey),
record: record.clone(),
};
let put_url = format!("{}/xrpc/com.atproto.repo.putRecord", pds_url);
let response = client
.post(&put_url)
.header("Authorization", format!("Bearer {}", config.access_jwt))
.json(&put_req)
.send()
.await
.context("Failed to put record")?;
let status = response.status();
let body_text = response.text().await?;
if !status.is_success() {
eprintln!("Error response ({}): {}", status, body_text);
anyhow::bail!("API returned error: {}", body_text);
}
serde_json::from_str(&body_text)
.context(format!("Failed to parse putRecord response. Body: {}", body_text))?
} else {
// Create new record with createRecord (auto-generates TID)
#[derive(Serialize)]
struct CreateRecordRequest {
repo: String,
collection: String,
record: PostRecord,
}
let create_req = CreateRecordRequest {
repo: config.did.clone(),
collection: "ai.syui.log.post".to_string(),
record,
};
let create_url = format!("{}/xrpc/com.atproto.repo.createRecord", pds_url);
let response = client
.post(&create_url)
.header("Authorization", format!("Bearer {}", config.access_jwt))
.json(&create_req)
.send()
.await
.context("Failed to create record")?;
let status = response.status();
let body_text = response.text().await?;
if !status.is_success() {
eprintln!("Error response ({}): {}", status, body_text);
anyhow::bail!("API returned error: {}", body_text);
}
serde_json::from_str(&body_text)
.context(format!("Failed to parse createRecord response. Body: {}", body_text))?
};
// Extract rkey from URI
let rkey = res.uri.split('/').last().unwrap().to_string();
// Update mapping
mapping.insert(
filename.clone(),
RecordMapping {
rkey: rkey.clone(),
uri: res.uri.clone(),
cid: res.cid.clone(),
},
);
if existing_rkey.is_some() {
println!(" ✓ Updated: {} ({})", title, rkey);
} else {
println!(" ✓ Created: {} ({})", title, rkey);
}
}
// Save mapping
Config::save_mapping(&mapping)?;
println!("Mapping saved to: {}", Config::mapping_path()?.display());
println!("Done!");
Ok(())
}
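The rkey extraction in `execute` relies on the AT URI shape `at://<did>/<collection>/<rkey>`. As a standalone sketch (the URI below is made up for illustration):

```rust
// Takes the final path segment of an AT URI as the rkey, mirroring
// `res.uri.split('/').last()` in post.rs.
fn rkey_from_uri(uri: &str) -> Option<&str> {
    uri.rsplit('/').next().filter(|s| !s.is_empty())
}

fn main() {
    // Hypothetical URI for illustration.
    let uri = "at://did:plc:abc123/ai.syui.log.post/3jzfcijpj2z2a";
    println!("{:?}", rkey_from_uri(uri));
}
```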

src/refresh.rs (new file)
@@ -0,0 +1,50 @@
use anyhow::{Context, Result};
use serde::Deserialize;
use crate::config::Config;
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
struct RefreshSessionResponse {
#[serde(rename = "accessJwt")]
access_jwt: String,
#[serde(rename = "refreshJwt")]
refresh_jwt: String,
handle: String,
did: String,
}
pub async fn refresh_session(config: &mut Config) -> Result<()> {
let pds_url = format!("https://{}", config.pds);
let refresh_url = format!("{}/xrpc/com.atproto.server.refreshSession", pds_url);
let client = reqwest::Client::new();
let response = client
.post(&refresh_url)
.header("Authorization", format!("Bearer {}", config.refresh_jwt))
.send()
.await
.context("Failed to refresh session")?;
let status = response.status();
let body_text = response.text().await?;
if !status.is_success() {
eprintln!("Refresh session failed ({}): {}", status, body_text);
anyhow::bail!("Failed to refresh session. Please run 'ailog login' again.");
}
let res: RefreshSessionResponse = serde_json::from_str(&body_text)
.context(format!("Failed to parse refreshSession response. Body: {}", body_text))?;
// Update config with new tokens
config.access_jwt = res.access_jwt;
config.refresh_jwt = res.refresh_jwt;
// Save updated config (silent)
let path = Config::config_path()?;
let content = serde_json::to_string_pretty(config)?;
std::fs::write(&path, content)?;
Ok(())
}

src/serve.rs (new file)
@@ -0,0 +1,29 @@
use anyhow::Result;
use axum::Router;
use std::net::SocketAddr;
use tower_http::services::ServeDir;
pub async fn execute(port: u16) -> Result<()> {
let public_dir = "./public";
// Check if public directory exists
if !std::path::Path::new(public_dir).exists() {
anyhow::bail!("Public directory not found. Run 'ailog build' first.");
}
println!("Starting server...");
println!(" → Serving: {}", public_dir);
println!(" → Address: http://localhost:{}", port);
println!(" → Blog: http://localhost:{}/", port);
println!(" → AT Browser: http://localhost:{}/at/", port);
println!("\nPress Ctrl+C to stop");
let app = Router::new().nest_service("/", ServeDir::new(public_dir));
let addr = SocketAddr::from(([127, 0, 0, 1], port));
let listener = tokio::net::TcpListener::bind(addr).await?;
axum::serve(listener, app).await?;
Ok(())
}

@@ -1,35 +0,0 @@
use anyhow::Result;
use tera::{Tera, Context};
use std::path::PathBuf;
use crate::config::Config;
use crate::generator::Post;
pub struct TemplateEngine {
tera: Tera,
}
impl TemplateEngine {
pub fn new(template_dir: PathBuf) -> Result<Self> {
let pattern = format!("{}/**/*.html", template_dir.display());
let tera = Tera::new(&pattern)?;
Ok(Self { tera })
}
pub fn create_context(&self, config: &Config, posts: &[Post]) -> Result<Context> {
let mut context = Context::new();
context.insert("config", &config.site);
context.insert("posts", posts);
Ok(context)
}
pub fn render(&self, template: &str, context: &Context) -> Result<String> {
let output = self.tera.render(template, context)?;
Ok(output)
}
pub fn render_with_context(&self, template: &str, context: &Context) -> Result<String> {
let output = self.tera.render(template, context)?;
Ok(output)
}
}

@@ -1,21 +0,0 @@
[site]
title = "My Blog"
description = "A blog powered by ailog"
base_url = "https://example.com"
language = "ja"

[build]
highlight_code = true
minify = false

[ai]
enabled = true
auto_translate = true
comment_moderation = true
# api_key = "your-openai-api-key"
# gpt_endpoint = "https://api.openai.com/v1/chat/completions"

# [ai.atproto_config]
# client_id = "https://example.com/client-metadata.json"
# redirect_uri = "https://example.com/callback"
# handle_resolver = "https://bsky.social"

View File

@@ -1,39 +0,0 @@
---
title: Introducing the AI-Integrated Blog System
date: 2025-06-06
tags: [AI, Technology, Blog]
---

# Introducing the AI-Integrated Blog System

ai.log is an innovative system that integrates AI features into a static blog generator. Based on the theory of existents (存在子理論), it aims to guarantee the uniqueness of real individuals in the digital world.

## Main Features

### 1. AI Article Editing and Enhancement

- Automatic correction of grammatical errors
- Improved readability
- Suggestions for related information

### 2. Automatic Translation

Articles written in Japanese are automatically translated into English, reaching a global audience. Translations are natural and preserve Markdown formatting.

### 3. AI Comment System

An AI existent adds commentary to each article from its own perspective, offering readers new insights.

### 4. atproto Integration

Integration with atproto, the decentralized social networking protocol, provides:

- Secure login via OAuth authentication
- Decentralized management of comment data
- User data sovereignty

## Tech Stack

- **Language**: Rust
- **AI**: OpenAI GPT API
- **Auth**: atproto OAuth 2.0
- **Deploy**: GitHub Actions + Cloudflare Pages

## Outlook

ai.log aims to go beyond a simple blogging tool and become a new content platform where AI and humans create together. By guaranteeing uniqueness through the theory of existents, it protects individual identity in the digital world while extending creativity through AI.

View File

@@ -1,32 +0,0 @@
---
title: "Welcome to ailog"
date: 2025-01-06
tags: ["welcome", "ailog"]
---

# Welcome to ailog

This is your first post powered by **ailog** - a static blog generator with AI features.

## Features

- Fast static site generation
- Markdown support with frontmatter
- AI-powered features (coming soon)
- atproto integration for comments

## Getting Started

Create new posts with:

```bash
ailog new "My New Post"
```

Build your blog with:

```bash
ailog build
```

Happy blogging!

View File

@@ -1,58 +0,0 @@
body {
    font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
    line-height: 1.6;
    color: #333;
    max-width: 800px;
    margin: 0 auto;
    padding: 20px;
}

header {
    margin-bottom: 40px;
    border-bottom: 1px solid #eee;
    padding-bottom: 20px;
}

header h1 {
    margin: 0;
}

header h1 a {
    color: #333;
    text-decoration: none;
}

.post-list {
    list-style: none;
    padding: 0;
}

.post-list li {
    margin-bottom: 15px;
}

.post-list time {
    color: #666;
    font-size: 0.9em;
    margin-left: 10px;
}

article time {
    color: #666;
    display: block;
    margin-bottom: 20px;
}

pre {
    background-color: #f4f4f4;
    padding: 15px;
    border-radius: 5px;
    overflow-x: auto;
}

code {
    background-color: #f4f4f4;
    padding: 2px 5px;
    border-radius: 3px;
    font-family: 'Consolas', 'Monaco', monospace;
}

View File

@@ -1,38 +0,0 @@
<!DOCTYPE html>
<html lang="ja">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>My Blog</title>
    <link rel="stylesheet" href="/css/style.css">
</head>
<body>
    <header>
        <h1><a href="/">My Blog</a></h1>
        <p>A blog powered by ailog</p>
    </header>
    <main>
        <h2>Recent Posts</h2>
        <ul class="post-list">
            <li>
                <a href="&#x2F;posts&#x2F;2025-06-06-ai統合ブログシステムの紹介.html">Introducing the AI-Integrated Blog System</a>
                <time>2025-06-06</time>
            </li>
            <li>
                <a href="&#x2F;posts&#x2F;welcome.html">Welcome to ailog</a>
                <time>2025-01-06</time>
            </li>
        </ul>
    </main>
    <footer>
        <p>&copy; 2025 My Blog</p>
    </footer>
</body>
</html>

View File

@@ -1,60 +0,0 @@
<!DOCTYPE html>
<html lang="ja">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Introducing the AI-Integrated Blog System - My Blog</title>
    <link rel="stylesheet" href="/css/style.css">
</head>
<body>
    <header>
        <h1><a href="/">My Blog</a></h1>
        <p>A blog powered by ailog</p>
    </header>
    <main>
        <article>
            <h1>Introducing the AI-Integrated Blog System</h1>
            <time>2025-06-06</time>
            <div class="content">
                <h1>Introducing the AI-Integrated Blog System</h1>
                <p>ai.log is an innovative system that integrates AI features into a static blog generator. Based on the theory of existents, it aims to guarantee the uniqueness of real individuals in the digital world.</p>
                <h2>Main Features</h2>
                <h3>1. AI Article Editing and Enhancement</h3>
                <ul>
                    <li>Automatic correction of grammatical errors</li>
                    <li>Improved readability</li>
                    <li>Suggestions for related information</li>
                </ul>
                <h3>2. Automatic Translation</h3>
                <p>Articles written in Japanese are automatically translated into English, reaching a global audience. Translations are natural and preserve Markdown formatting.</p>
                <h3>3. AI Comment System</h3>
                <p>An AI existent adds commentary to each article from its own perspective, offering readers new insights.</p>
                <h3>4. atproto Integration</h3>
                <p>Integration with atproto, the decentralized social networking protocol, provides:</p>
                <ul>
                    <li>Secure login via OAuth authentication</li>
                    <li>Decentralized management of comment data</li>
                    <li>User data sovereignty</li>
                </ul>
                <h2>Tech Stack</h2>
                <ul>
                    <li><strong>Language</strong>: Rust</li>
                    <li><strong>AI</strong>: OpenAI GPT API</li>
                    <li><strong>Auth</strong>: atproto OAuth 2.0</li>
                    <li><strong>Deploy</strong>: GitHub Actions + Cloudflare Pages</li>
                </ul>
                <h2>Outlook</h2>
                <p>ai.log aims to go beyond a simple blogging tool and become a new content platform where AI and humans create together. By guaranteeing uniqueness through the theory of existents, it protects individual identity in the digital world while extending creativity through AI.</p>
            </div>
        </article>
    </main>
    <footer>
        <p>&copy; 2025 My Blog</p>
    </footer>
</body>
</html>

View File

@@ -1,60 +0,0 @@
<!DOCTYPE html>
<html lang="ja">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Introducing the AI-Integrated Blog System - My Blog</title>
    <link rel="stylesheet" href="/css/style.css">
</head>
<body>
    <header>
        <h1><a href="/">My Blog</a></h1>
        <p>A blog powered by ailog</p>
    </header>
    <main>
        <article>
            <h1>Introducing the AI-Integrated Blog System</h1>
            <time>2025-06-06</time>
            <div class="content">
                <h1>Introducing the AI-Integrated Blog System</h1>
                <p>ai.log is an innovative system that integrates AI features into a static blog generator. Based on the theory of existents, it aims to guarantee the uniqueness of real individuals in the digital world.</p>
                <h2>Main Features</h2>
                <h3>1. AI Article Editing and Enhancement</h3>
                <ul>
                    <li>Automatic correction of grammatical errors</li>
                    <li>Improved readability</li>
                    <li>Suggestions for related information</li>
                </ul>
                <h3>2. Automatic Translation</h3>
                <p>Articles written in Japanese are automatically translated into English, reaching a global audience. Translations are natural and preserve Markdown formatting.</p>
                <h3>3. AI Comment System</h3>
                <p>An AI existent adds commentary to each article from its own perspective, offering readers new insights.</p>
                <h3>4. atproto Integration</h3>
                <p>Integration with atproto, the decentralized social networking protocol, provides:</p>
                <ul>
                    <li>Secure login via OAuth authentication</li>
                    <li>Decentralized management of comment data</li>
                    <li>User data sovereignty</li>
                </ul>
                <h2>Tech Stack</h2>
                <ul>
                    <li><strong>Language</strong>: Rust</li>
                    <li><strong>AI</strong>: OpenAI GPT API</li>
                    <li><strong>Auth</strong>: atproto OAuth 2.0</li>
                    <li><strong>Deploy</strong>: GitHub Actions + Cloudflare Pages</li>
                </ul>
                <h2>Outlook</h2>
                <p>ai.log aims to go beyond a simple blogging tool and become a new content platform where AI and humans create together. By guaranteeing uniqueness through the theory of existents, it protects individual identity in the digital world while extending creativity through AI.</p>
            </div>
        </article>
    </main>
    <footer>
        <p>&copy; 2025 My Blog</p>
    </footer>
</body>
</html>

View File

@@ -1,48 +0,0 @@
<!DOCTYPE html>
<html lang="ja">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Welcome to ailog - My Blog</title>
    <link rel="stylesheet" href="/css/style.css">
</head>
<body>
    <header>
        <h1><a href="/">My Blog</a></h1>
        <p>A blog powered by ailog</p>
    </header>
    <main>
        <article>
            <h1>Welcome to ailog</h1>
            <time>2025-01-06</time>
            <div class="content">
                <h1>Welcome to ailog</h1>
                <p>This is your first post powered by <strong>ailog</strong> - a static blog generator with AI features.</p>
                <h2>Features</h2>
                <ul>
                    <li>Fast static site generation</li>
                    <li>Markdown support with frontmatter</li>
                    <li>AI-powered features (coming soon)</li>
                    <li>atproto integration for comments</li>
                </ul>
                <h2>Getting Started</h2>
                <p>Create new posts with:</p>
<pre><code><span style="color:#8fa1b3;">ailog</span><span style="color:#c0c5ce;"> new &quot;</span><span style="color:#a3be8c;">My New Post</span><span style="color:#c0c5ce;">&quot;</span>
</code></pre>
                <p>Build your blog with:</p>
<pre><code><span style="color:#8fa1b3;">ailog</span><span style="color:#c0c5ce;"> build</span>
</code></pre>
                <p>Happy blogging!</p>
            </div>
        </article>
    </main>
    <footer>
        <p>&copy; 2025 My Blog</p>
    </footer>
</body>
</html>

View File

@@ -1,48 +0,0 @@
<!DOCTYPE html>
<html lang="ja">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Welcome to ailog - My Blog</title>
    <link rel="stylesheet" href="/css/style.css">
</head>
<body>
    <header>
        <h1><a href="/">My Blog</a></h1>
        <p>A blog powered by ailog</p>
    </header>
    <main>
        <article>
            <h1>Welcome to ailog</h1>
            <time>2025-01-06</time>
            <div class="content">
                <h1>Welcome to ailog</h1>
                <p>This is your first post powered by <strong>ailog</strong> - a static blog generator with AI features.</p>
                <h2>Features</h2>
                <ul>
                    <li>Fast static site generation</li>
                    <li>Markdown support with frontmatter</li>
                    <li>AI-powered features (coming soon)</li>
                    <li>atproto integration for comments</li>
                </ul>
                <h2>Getting Started</h2>
                <p>Create new posts with:</p>
<pre><code><span style="color:#8fa1b3;">ailog</span><span style="color:#c0c5ce;"> new &quot;</span><span style="color:#a3be8c;">My New Post</span><span style="color:#c0c5ce;">&quot;</span>
</code></pre>
                <p>Build your blog with:</p>
<pre><code><span style="color:#8fa1b3;">ailog</span><span style="color:#c0c5ce;"> build</span>
</code></pre>
                <p>Happy blogging!</p>
            </div>
        </article>
    </main>
    <footer>
        <p>&copy; 2025 My Blog</p>
    </footer>
</body>
</html>

View File

@@ -1,58 +0,0 @@
body {
    font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
    line-height: 1.6;
    color: #333;
    max-width: 800px;
    margin: 0 auto;
    padding: 20px;
}

header {
    margin-bottom: 40px;
    border-bottom: 1px solid #eee;
    padding-bottom: 20px;
}

header h1 {
    margin: 0;
}

header h1 a {
    color: #333;
    text-decoration: none;
}

.post-list {
    list-style: none;
    padding: 0;
}

.post-list li {
    margin-bottom: 15px;
}

.post-list time {
    color: #666;
    font-size: 0.9em;
    margin-left: 10px;
}

article time {
    color: #666;
    display: block;
    margin-bottom: 20px;
}

pre {
    background-color: #f4f4f4;
    padding: 15px;
    border-radius: 5px;
    overflow-x: auto;
}

code {
    background-color: #f4f4f4;
    padding: 2px 5px;
    border-radius: 3px;
    font-family: 'Consolas', 'Monaco', monospace;
}

View File

@@ -1,23 +0,0 @@
<!DOCTYPE html>
<html lang="{{ config.language }}">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>{% block title %}{{ config.title }}{% endblock %}</title>
    <link rel="stylesheet" href="/css/style.css">
</head>
<body>
    <header>
        <h1><a href="/">{{ config.title }}</a></h1>
        <p>{{ config.description }}</p>
    </header>
    <main>
        {% block content %}{% endblock %}
    </main>
    <footer>
        <p>&copy; 2025 {{ config.title }}</p>
    </footer>
</body>
</html>

View File

@@ -1,13 +0,0 @@
{% extends "base.html" %}

{% block content %}
<h2>Recent Posts</h2>
<ul class="post-list">
    {% for post in posts %}
    <li>
        <a href="{{ post.url }}">{{ post.title }}</a>
        <time>{{ post.date }}</time>
    </li>
    {% endfor %}
</ul>
{% endblock %}

View File

@@ -1,13 +0,0 @@
{% extends "base.html" %}

{% block title %}{{ post.title }} - {{ config.title }}{% endblock %}

{% block content %}
<article>
    <h1>{{ post.title }}</h1>
    <time>{{ post.date }}</time>
    <div class="content">
        {{ post.content | safe }}
    </div>
</article>
{% endblock %}