# Work Log

kortix-ai/suna: Suna - Open Source Generalist AI Agent

## Project Overview
Suna is a fully open-source AI assistant that helps users complete everyday tasks such as research and data analysis through natural conversation. It combines powerful capabilities with an intuitive interface, understanding user needs and delivering results. Its toolkit includes browser automation, file management, web scraping, command-line execution, website deployment, and integrations with various APIs and services; together these let Suna solve complex problems and automate workflows through simple conversation.
## Project Architecture
Suna consists of four main components:
- Backend API: a Python/FastAPI service handling REST endpoints, thread management, and integration with LLMs such as Anthropic's (via LiteLLM).
- Frontend: a Next.js/React application providing a responsive user interface, including the chat view and dashboard.
- Agent Docker: an isolated execution environment for each agent, supporting browser automation, a code interpreter, file-system access, tool integration, and security features.
- Supabase database: data persistence, covering authentication, user management, conversation history, file storage, agent state, analytics, and realtime subscriptions.
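For quick local sanity checks, the components' default endpoints can be captured in a small mapping. The port numbers come from the setup wizard output later in this log (frontend on 3000, backend API on 8000); the helper itself is illustrative and not part of the Suna codebase:

```python
from urllib.parse import urlparse

# Default local endpoints of the docker-compose stack; ports taken from
# the setup wizard output below (frontend 3000, backend API 8000).
SERVICES = {
    "frontend": "http://localhost:3000",
    "backend": "http://localhost:8000/api",
}

def service_port(name: str) -> int:
    """Return the TCP port a named service listens on."""
    return urlparse(SERVICES[name]).port
```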
## Use Cases
The repository documentation lists a number of use cases showing Suna in different scenarios, for example:
- Competitor analysis: analyze the market in a given industry and generate a PDF report.
- Venture capital fund list: gather information on major US venture capital funds.
- Candidate search: find candidates on LinkedIn matching specific criteria.
- Company trip planning: generate itineraries and activity plans for a company trip.
- Excel data processing: set up an Excel spreadsheet and populate it with the relevant data.
- Event speaker sourcing: find qualified AI-ethics speakers and output contact details and talk summaries.
- Scientific paper summarization and cross-referencing: research and compare scientific papers and generate a report.
- Lead research and first outreach: research prospects and generate personalized first-contact emails.
- SEO analysis: generate an SEO analysis report for a website.
- Personal trip planning: generate a detailed itinerary for a personal trip.
- Recently funded startups: screen startups in a given field across multiple platforms and generate a report.
- Forum thread scraping: find information on a specific topic in forums and compile a list.
## Full Log
After more than ten deployment attempts I finally got the project running; the road was full of thorns. The process requires configuring numerous external services and API keys, and every step, from environment setup to dependency installation, from obtaining keys to wiring up services, can hide a trap that needs repeated troubleshooting. A few details remain unresolved, but the project is now basically running.
The full deployment log is recorded below, both for my own later review and as a reference for others who want to avoid the common errors in advance. I will keep debugging and later publish a complete tutorial.
[All API keys shown in the log have since expired (no credit was added); they are kept only for the record. Expired API keys will block the deployment.]
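Step 1 of the wizard below verifies that tools like git, docker, node, and poetry are on PATH. A minimal sketch of such a check (an illustration only, not setup.py's actual implementation):

```python
import shutil

# Tools the wizard's Step 1 looks for, per the log below.
REQUIRED_TOOLS = ["git", "docker", "python3", "poetry", "pip3", "node", "npm"]

def missing_tools(tools=REQUIRED_TOOLS):
    """Return the subset of tools that cannot be found on PATH."""
    return [t for t in tools if shutil.which(t) is None]
```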
Microsoft Windows [Version 10.0.27868.1000]
(c) Microsoft Corporation. All rights reserved.

(.venv) F:\PythonProjects\suna>python setup.py '--admin'

███████╗██╗ ██╗███╗ ██╗ █████╗
██╔════╝██║ ██║████╗ ██║██╔══██╗
███████╗██║ ██║██╔██╗ ██║███████║
╚════██║██║ ██║██║╚██╗██║██╔══██║
███████║╚██████╔╝██║ ╚████║██║ ██║
╚══════╝ ╚═════╝ ╚═╝ ╚═══╝╚═╝ ╚═╝

Setup Wizard
This wizard will guide you through setting up Suna, an open-source generalist AI agent.

Step 1/8: Checking requirements
==================================================
✅ git is installed
✅ docker is installed
✅ python3 is installed
✅ poetry is installed
✅ pip3 is installed
✅ node is installed
✅ npm is installed
✅ Docker is running
✅ Suna repository detected

Step 2/8: Collecting Supabase information
==================================================
ℹ️ You'll need to create a Supabase project before continuing
ℹ️ Visit https://supabase.com/dashboard/projects to create one
ℹ️ After creating your project, visit the project settings -> Data API and you'll need to get the following information:
ℹ️ 1. Supabase Project URL (e.g., https://abcdefg.supabase.co)
ℹ️ 2. Supabase anon key
ℹ️ 3. Supabase service role key
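The anon and service role keys are JWTs. If you are unsure which key you copied, you can decode the payload segment (without verifying the signature) using only the standard library and inspect its `role` claim. This is a generic JWT trick, not Suna code:

```python
import base64
import json

def jwt_payload(token: str) -> dict:
    """Decode the middle (payload) segment of a JWT without verifying it."""
    segment = token.split(".")[1]
    segment += "=" * (-len(segment) % 4)  # restore the stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(segment))

# jwt_payload(anon_key)["role"] should be "anon";
# jwt_payload(service_role_key)["role"] should be "service_role".
```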
Press Enter to continue once you've created your Supabase project...
Enter your Supabase Project URL (e.g., https://abcdefg.supabase.co): https://gcnijvljsutcxwsdfsgcedjz.supabase.co
Enter your Supabase anon key: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9safasfsaf.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6Imdjbmlqdmxqc3V0Y3h3Z2NlZGp6Iiwicm9sZSI6IsdfmFub24iLCJpYXQiOjE3NDg1MjAwNjksImV4cCI6MjA2NDA5NjA2OX0.WkHwZgqXVwVVR6gnjy1BbfPqqTStdx0Tob0iqMQu5TQ
Enter your Supabase service role key: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpcsdfsafsa3MiOiJzdXBhYmFzZSIsInJlZiI6Imdjbmlqdmxqc3V0Y3h3Z2NlZGp6Iiwicm9sZSI6IsdfnNlcnZpY2Vfcm9sZSIsImlhdCI6MTc0ODUyMDA2OSwiZXhwIjoyMDY0MDk2MDY5fQ.SUGg5LWt41NA_E-fKSt1vBLt4jBFw6sEeMAa1xvYbyw

Step 3/8: Collecting Daytona information
==================================================
ℹ️ You'll need to create a Daytona account before continuing
ℹ️ Visit https://app.daytona.io/ to create one
ℹ️ Then, generate an API key from 'Keys' menu
ℹ️ After that, go to Images (https://app.daytona.io/dashboard/images)
ℹ️ Click '+ Create Image'
ℹ️ Enter 'kortix/suna:0.1.2.8' as the image name
ℹ️ Set '/usr/bin/supervisord -n -c /etc/supervisor/conf.d/supervisord.conf' as the Entrypoint
Press Enter to continue once you've completed these steps...
Enter your Daytona API key: dtn_8856676c89b5575977dc9afe69dbe67sdfsfba1d76361c7e5ff537862c98c3827cd2b

Step 4/8: Collecting LLM API keys
==================================================
ℹ️ You need at least one LLM provider API key to use Suna
ℹ️ Available LLM providers: OpenAI, Anthropic, OpenRouter

Select LLM providers to configure:
[1] OpenAI
[2] Anthropic
[3] OpenRouter (access to multiple models)
Enter numbers separated by commas (e.g., 1,2,3)
Select providers (required, at least one): 1,3
ℹ️
Configuring OPENAI
Enter your OpenAI API key: sk-proj-dUUSgK9ysdfsdfsdfsaf1cFHa-f9ImeDrJkiPbE4Ei0Bs87-YT4idKotRaYkMlU61EuT2RxW1yGlm6-6lcRhMmT3BlbkFJp7ZEISV8HsdhWTxORCEvlwZ7Rrsdfsafv568HKuYpU_9dm0WnCelDytNKPkqWrchoFNhUUh-iCIAGfX-oA
Recommended OpenAI models:
[1] openai/gpt-4o
[2] openai/gpt-4o-mini
Select default model (1-4) or press Enter for gpt-4o: 1
ℹ️
Configuring OPENROUTER
Enter your OpenRouter API key: sk-or-v1-5405c9fd3c1f99d9122446sdf6ef81f618sdffad90sdfadf192d77ff17cb65a0d312e621286ee6a
Recommended OpenRouter models:
[1] openrouter/google/gemini-2.5-pro-preview
[2] openrouter/deepseek/deepseek-chat-v3-0324:free
[3] openrouter/openai/gpt-4o-2024-11-20
Select default model (1-3) or press Enter for gemini-2.5-flash: 2
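The model ids above follow LiteLLM's `provider/model` convention, where the first path segment selects the provider. A trivial illustration of that split:

```python
def provider_of(model_id: str) -> str:
    """LiteLLM-style model ids are 'provider/rest-of-name'; return the provider."""
    return model_id.split("/", 1)[0]
```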
✅ Using openrouter/deepseek/deepseek-chat-v3-0324:free as the default model

Step 5/8: Collecting search and web scraping API keys
==================================================
ℹ️ You'll need to obtain API keys for search and web scraping
ℹ️ Visit https://tavily.com/ to get a Tavily API key
ℹ️ Visit https://firecrawl.dev/ to get a Firecrawl API key
Enter your Tavily API key: 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
Enter your Firecrawl API key: fc-1801bsdfsfedf8e2942d4bdf536032f798e03
Are you self-hosting Firecrawl? (y/n): N

Step 6/8: Collecting RapidAPI key
==================================================
ℹ️ To enable API services like LinkedIn, and others, you'll need a RapidAPI key
ℹ️ Each service requires individual activation in your RapidAPI account:
ℹ️ 1. Locate the service's `base_url` in its corresponding file (e.g., https://linkedin-data-scraper.p.rapidapi.com in backend/agent/tools/data_providers/LinkedinProvider.py)
ℹ️ 2. Visit that specific API on the RapidAPI marketplace
ℹ️ 3. Subscribe to the service (many offer free tiers with limited requests)
ℹ️ 4. Once subscribed, the service will be available to your agent through the API Services tool
ℹ️ A RapidAPI key is optional for API services like LinkedIn
ℹ️ Visit https://rapidapi.com/ to get your API key if needed
ℹ️ You can leave this blank and add it later if desired
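Once subscribed, RapidAPI-hosted endpoints are called with two standard request headers. A minimal sketch of building them (the host value is whatever `base_url` the provider file uses; the function name is mine):

```python
def rapidapi_headers(api_key: str, host: str) -> dict:
    """Headers that RapidAPI-hosted endpoints expect on every request."""
    return {
        "x-rapidapi-key": api_key,
        "x-rapidapi-host": host,  # e.g. "linkedin-data-scraper.p.rapidapi.com"
    }
```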
Enter your RapidAPI key (optional, press Enter to skip): 936154e36fmshe98d7e77835be33p1c63e0jsnd737f78eca0b
ℹ️ Setting up Supabase database...
✅ Extracted project reference 'gcnijvljsutcxwgcedjz' from your Supabase URL
ℹ️ Changing to backend directory: F:\PythonProjects\suna\backend
ℹ️ Logging into Supabase CLI...
Hello from Supabase! Press Enter to open browser and login automatically.
Here is your login link in case browser did not open https://supabase.com/dashboard/cli/login?session_id=99b6b3c2-650b-4554-9c86-971ddf5459f1&token_name=cli_AI\love@AI_1748618285&public_key=0423f5ef16356a29c45508ab16157da5afffbe7ced2f713f1258eeb78313524ae557aab83dsdfafedeb19895a1a6f8bd34b1d9d0d38753e5798c5fff7ffad5d8edf4255
Enter your verification code: fd5a5ca0
Token cli_AI\love@AI_17486sdfa18285 created successfully.
You are now logged in. Happy coding!
ℹ️ Linking to Supabase project gcnijvljsutcxwgcedjz...
Enter your database password (or leave blank to skip):
Connecting to remote database...
NOTICE (42P06): schema "supabase_migrations" already exists, skipping
NOTICE (42P07): relation "schema_migrations" already exists, skipping
NOTICE (42701): column "statements" of relation "schema_migrations" already exists, skipping
NOTICE (42701): column "name" of relation "schema_migrations" already exists, skipping
NOTICE (42P06): schema "supabase_migrations" already exists, skipping
NOTICE (42P07): relation "seed_files" already exists, skipping
Finished supabase link.
ℹ️ Pushing database migrations...
Connecting to remote database...
Remote database is up to date.
✅ Supabase database setup completed
⚠️ IMPORTANT: You need to manually expose the 'basejump' schema in Supabase
ℹ️ Go to the Supabase web platform -> choose your project -> Project Settings -> Data API
ℹ️ In the 'Exposed Schema' section, add 'basejump' if not already there
Press Enter once you've completed this step...

Step 8/8: Installing dependencies
==================================================
ℹ️ Installing required dependencies...
ℹ️ Installing frontend dependencies...
npm warn cleanup Failed to remove some directories [
npm warn cleanup [
npm warn cleanup '\\\\?\\F:\\PythonProjects\\suna\\frontend\\node_modules\\next',
npm warn cleanup [Error: EPERM: operation not permitted, rmdir 'F:\PythonProjects\suna\frontend\node_modules\next\dist\client'] {
npm warn cleanup errno: -4048,
npm warn cleanup code: 'EPERM',
npm warn cleanup syscall: 'rmdir',
npm warn cleanup path: 'F:\\PythonProjects\\suna\\frontend\\node_modules\\next\\dist\\client'
npm warn cleanup }
npm warn cleanup ],
npm warn cleanup [
npm warn cleanup 'F:\\PythonProjects\\suna\\frontend\\node_modules\\next',
npm warn cleanup [Error: EPERM: operation not permitted, rmdir 'F:\PythonProjects\suna\frontend\node_modules\next\dist\esm\client\components\react-dev-overlay'] {
npm warn cleanup errno: -4048,
npm warn cleanup code: 'EPERM',
npm warn cleanup syscall: 'rmdir',
npm warn cleanup path: 'F:\\PythonProjects\\suna\\frontend\\node_modules\\next\\dist\\esm\\client\\components\\react-dev-overlay'
npm warn cleanup }
npm warn cleanup ]
npm warn cleanup ]
npm error code 1
npm error path F:\PythonProjects\suna\frontend\node_modules\canvas
npm error command failed
npm error command C:\WINDOWS\system32\cmd.exe /d /s /c prebuild-install -r napi || node-gyp rebuild
npm error Warning: Missing input files:
npm error C:\GTK\bin\libgmodule-2.0-0.dll
npm error C:\GTK\bin\libpangowin32-1.0-0.dll
npm error C:\GTK\bin\libfreetype-6.dll
npm error C:\GTK\bin\libgobject-2.0-0.dll
npm error C:\GTK\bin\libpango-1.0-0.dll
npm error C:\GTK\bin\libpangoft2-1.0-0.dll
npm error C:\GTK\bin\libcairo-2.dll
npm error C:\GTK\bin\libglib-2.0-0.dll
npm error C:\GTK\bin\libintl-8.dll
npm error C:\GTK\bin\libpng14-14.dll
npm error C:\GTK\bin\libgthread-2.0-0.dll
npm error C:\GTK\bin\libfontconfig-1.dll
npm error C:\GTK\bin\libpangocairo-1.0-0.dll
npm error C:\GTK\bin\zlib1.dll
npm error C:\GTK\bin\libexpat-1.dll
npm error
npm error Backend.cc
npm error F:\PythonProjects\suna\frontend\node_modules\canvas\src\backend\Backend.h(3,10): error C1083: 无法打开包括文件: “cairo.h”: No such file or directory [F:\PythonProjects\suna\frontend\node_modules\canvas\build\canvas.vcxproj]
npm error (编译源文件“../src/backend/Backend.cc”)
npm error prebuild-install warn install read ECONNRESET
npm error gyp info it worked if it ends with ok
npm error gyp info using node-gyp@11.0.0
npm error gyp info using node@22.16.0 | win32 | x64
npm error gyp info find Python using Python version 3.12.9 found at "C:\msys64\mingw64\bin\python3.exe"
npm error gyp info find VS using VS2022 (17.12.36106.13) found at:
npm error gyp info find VS "D:\Program Files\Microsoft Visual Studio\2022\Professional"
npm error gyp info find VS run with --verbose for detailed information
npm error gyp info spawn C:\msys64\mingw64\bin\python3.exe
npm error gyp info spawn args [
npm error gyp info spawn args 'C:\\Program Files\\nodejs\\node_modules\\npm\\node_modules\\node-gyp\\gyp\\gyp_main.py',
npm error gyp info spawn args 'binding.gyp',
npm error gyp info spawn args '-f',
npm error gyp info spawn args 'msvs',
npm error gyp info spawn args '-I',
npm error gyp info spawn args 'F:\\PythonProjects\\suna\\frontend\\node_modules\\canvas\\build\\config.gypi',
npm error gyp info spawn args '-I',
npm error gyp info spawn args 'C:\\Program Files\\nodejs\\node_modules\\npm\\node_modules\\node-gyp\\addon.gypi',
npm error gyp info spawn args '-I',
npm error gyp info spawn args 'C:\\Users\\love\\AppData\\Local\\node-gyp\\Cache\\22.16.0\\include\\node\\common.gypi',
npm error gyp info spawn args '-Dlibrary=shared_library',
npm error gyp info spawn args '-Dvisibility=default',
npm error gyp info spawn args '-Dnode_root_dir=C:\\Users\\love\\AppData\\Local\\node-gyp\\Cache\\22.16.0',
npm error gyp info spawn args '-Dnode_gyp_dir=C:\\Program Files\\nodejs\\node_modules\\npm\\node_modules\\node-gyp',
npm error gyp info spawn args '-Dnode_lib_file=C:\\\\Users\\\\love\\\\AppData\\\\Local\\\\node-gyp\\\\Cache\\\\22.16.0\\\\<(target_arch)\\\\node.lib',
npm error gyp info spawn args '-Dmodule_root_dir=F:\\PythonProjects\\suna\\frontend\\node_modules\\canvas',
npm error gyp info spawn args '-Dnode_engine=v8',
npm error gyp info spawn args '--depth=.',
npm error gyp info spawn args '--no-parallel',
npm error gyp info spawn args '--generator-output',
npm error gyp info spawn args 'F:\\PythonProjects\\suna\\frontend\\node_modules\\canvas\\build',
npm error gyp info spawn args '-Goutput_dir=.'
npm error gyp info spawn args ]
npm error gyp info spawn D:\Program Files\Microsoft Visual Studio\2022\Professional\MSBuild\Current\Bin\MSBuild.exe
npm error gyp info spawn args [
npm error gyp info spawn args 'build\\binding.sln',
npm error gyp info spawn args '/clp:Verbosity=minimal',
npm error gyp info spawn args '/nologo',
npm error gyp info spawn args '/p:Configuration=Release;Platform=x64'
npm error gyp info spawn args ]
npm error gyp ERR! build error
npm error gyp ERR! stack Error: `D:\Program Files\Microsoft Visual Studio\2022\Professional\MSBuild\Current\Bin\MSBuild.exe` failed with exit code: 1
npm error gyp ERR! stack at ChildProcess.<anonymous> (C:\Program Files\nodejs\node_modules\npm\node_modules\node-gyp\lib\build.js:216:23)
npm error gyp ERR! stack at ChildProcess.emit (node:events:518:28)
npm error gyp ERR! stack at ChildProcess._handle.onexit (node:internal/child_process:293:12)
npm error gyp ERR! System Windows_NT 10.0.27868
npm error gyp ERR! command "C:\\Program Files\\nodejs\\node.exe" "C:\\Program Files\\nodejs\\node_modules\\npm\\node_modules\\node-gyp\\bin\\node-gyp.js" "rebuild"
npm error gyp ERR! cwd F:\PythonProjects\suna\frontend\node_modules\canvas
npm error gyp ERR! node -v v22.16.0
npm error gyp ERR! node-gyp -v v11.0.0
npm error gyp ERR! not ok
npm error A complete log of this run can be found in: C:\Users\love\AppData\Local\npm-cache\_logs\2025-05-30T15_19_00_254Z-debug-0.log
❌ Failed to install dependencies: Command '['npm', 'install']' returned non-zero exit status 1.
ℹ️ You may need to install them manually.
ℹ️ Configuring environment files...
✅ Backend .env file created at backend\.env
ℹ️ Redis host is set to: redis
ℹ️ RabbitMQ host is set to: rabbitmq
✅ Frontend .env.local file created at frontend\.env.local
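To verify that the generated `backend/.env` and `frontend/.env.local` contain the expected entries (Redis and RabbitMQ hosts, API keys), a tiny stdlib parser for `KEY=VALUE` files is handy. The key names in the test are taken from the wizard output above; check the actual generated files for the full list:

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env
```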
ℹ️ Backend URL is set to: http://localhost:8000/api

Step 8/8: Starting Suna
==================================================
ℹ️ You can start Suna using either Docker Compose or by manually starting the frontend, backend and worker.

How would you like to start Suna?
[1] Docker Compose (recommended, starts all services)
[2] Manual startup (requires Redis, RabbitMQ & separate terminals)
Enter your choice (1 or 2): 1
ℹ️ Starting Suna with Docker Compose...
ℹ️ Building images locally...
Compose can now delegate builds to bake for better performance. To do so, set COMPOSE_BAKE=true.
[+] Building 426.5s (34/34) FINISHED                              docker:desktop-linux
 => [worker internal] load build definition from Dockerfile  0.0s
 => => transferring dockerfile: 1.63kB  0.0s
 => [backend internal] load metadata for docker.io/library/python:3.11-slim  6.3s
 => [worker internal] load .dockerignore  0.0s
 => => transferring context: 2B  0.0s
 => [backend 1/7] FROM docker.io/library/python:3.11-slim@sha256:dbf1de478a55d6763afaa39c2f3d7b54b25230614980276de5cacdde79529d0c  0.1s
 => => resolve docker.io/library/python:3.11-slim@sha256:dbf1de478a55d6763afaa39c2f3d7b54b25230614980276de5cacdde79529d0c  0.0s
 => [worker internal] load build context  0.0s
 => => transferring context: 7.75kB  0.0s
 => CACHED [backend 2/7] WORKDIR /app  0.0s
 => CACHED [backend 3/7] RUN apt-get update && apt-get install -y --no-install-recommends build-essential curl && rm -rf /var/lib/apt/lists/*  0.0s
 => CACHED [backend 4/7] RUN useradd -m -u 1000 appuser && mkdir -p /app/logs && chown -R appuser:appuser /app  0.0s
 => CACHED [worker 5/7] COPY --chown=appuser:appuser requirements.txt .  0.0s
 => [worker 6/7] RUN pip install --no-cache-dir -r requirements.txt gunicorn  110.4s
 => [worker 7/7] COPY --chown=appuser:appuser . .  0.1s
 => [worker] exporting to image  12.7s
 => => exporting layers  9.7s
 => => exporting manifest sha256:a6e63d8f4567dc7ce2dd73de276ab5f62b50ae4991dbfa03f890eea7cc0c9d78  0.0s
 => => exporting config sha256:236895aed0cf64c4db115b31dbfae75bbe84ec6c4d94d3f7f1648a1961435ef8  0.0s
 => => exporting attestation manifest sha256:846935b1db61c8759fc8603810ba0abe08e537d4f5a86f2f678a26d7f96fc6e8  0.0s
 => => exporting manifest list sha256:f9938f968b86a5dfdbbdfd7b4eb8b76a848f2937c4c45eaa13e8f5f924d4fad6  0.0s
 => => naming to docker.io/library/suna-worker:latest  0.0s
 => => unpacking to docker.io/library/suna-worker:latest  2.8s
 => [worker] resolving provenance for metadata file  0.0s
 => [backend internal] load build definition from Dockerfile  0.0s
 => => transferring dockerfile: 1.63kB  0.0s
 => [backend internal] load .dockerignore  0.0s
 => => transferring context: 2B  0.0s
 => [backend internal] load build context  0.0s
 => => transferring context: 5.75kB  0.0s
 => CACHED [backend 5/7] COPY --chown=appuser:appuser requirements.txt .  0.0s
 => CACHED [backend 6/7] RUN pip install --no-cache-dir -r requirements.txt gunicorn  0.0s
 => CACHED [backend 7/7] COPY --chown=appuser:appuser . .  0.0s
 => [backend] exporting to image  0.1s
 => => exporting layers  0.0s
 => => exporting manifest sha256:14fa145bd6eb38ce984f807e8744d0937a4fc107f068d40433d7c14bea4d1476  0.0s
 => => exporting config sha256:d6f08a5c47d5a9ef5e550f4ef620be566ce98db2b10141b4f123874939dcdef8  0.0s
 => => exporting attestation manifest sha256:9aa719d69af0e8c88936163351a6fa4cf448145ec7c25f06833782299e46ed28  0.0s
 => => exporting manifest list sha256:fb06e27847e8b9b247ae01196489d0f75305e6c736b823793bc50850cc55edeb  0.0s
 => => naming to docker.io/library/suna-backend:latest  0.0s
 => => unpacking to docker.io/library/suna-backend:latest  0.0s
 => [backend] resolving provenance for metadata file  0.0s
 => [frontend internal] load build definition from Dockerfile  0.0s
 => => transferring dockerfile: 704B  0.0s
 => [frontend internal] load metadata for docker.io/library/node:20-slim  2.8s
 => [frontend internal] load .dockerignore  0.0s
 => => transferring context: 2B  0.0s
 => [frontend 1/7] FROM docker.io/library/node:20-slim@sha256:cb4abfbba7dfaa78e21ddf2a72a592e5f9ed36ccf98bdc8ad3ff945673d288c2  21.0s
 => => resolve docker.io/library/node:20-slim@sha256:cb4abfbba7dfaa78e21ddf2a72a592e5f9ed36ccf98bdc8ad3ff945673d288c2  0.0s
 => => sha256:d9d139bf2ac215a0d57ef09e790699a8fd5587c00200db6a91446278356b32aa 447B / 447B  12.3s
 => => sha256:b12d1e6fd3ba6067543928fa3e4c9a9307711cf5a4593699d157dba3af3e7d21 1.71MB / 1.71MB  15.3s
 => => sha256:d34dc2c1b56bf7f58faea3b73986ac0a274f2b369cc5f24a5ea26015fdd57e95 41.17MB / 41.17MB  19.2s
 => => sha256:057bf83be68af82a505c30eb852a4b542c264fe429954c8e0c0e204a9c9dd86e 3.31kB / 3.31kB  20.4s
 => => extracting sha256:057bf83be68af82a505c30eb852a4b542c264fe429954c8e0c0e204a9c9dd86e  0.0s
 => => extracting sha256:d34dc2c1b56bf7f58faea3b73986ac0a274f2b369cc5f24a5ea26015fdd57e95  0.4s
 => => extracting sha256:b12d1e6fd3ba6067543928fa3e4c9a9307711cf5a4593699d157dba3af3e7d21  0.0s
 => => extracting sha256:d9d139bf2ac215a0d57ef09e790699a8fd5587c00200db6a91446278356b32aa  0.0s
 => [frontend internal] load build context  0.4s
 => => transferring context: 15.10MB  0.3s
 => [frontend 2/7] WORKDIR /app  0.4s
 => [frontend 3/7] COPY package*.json ./  0.0s
 => [frontend 4/7] RUN apt-get update && apt-get install -y --no-install-recommends python3 make g++ build-essential pkg-config libcairo2-dev libpango1.0-dev libjpeg-dev libgif-dev librsvg2-dev && rm -rf /var/lib/apt/lists/*  95.4s
 => [frontend 5/7] RUN npm install  31.0s
 => [frontend 6/7] COPY . .  0.4s
 => [frontend 7/7] RUN npm run build  96.1s
 => [frontend] exporting to image  42.2s
 => => exporting layers  32.0s
 => => exporting manifest sha256:5aa3bf772b57c08f01051d99a26dd00ca11bd0f6c9964672d854b5a9237ca2cc  0.0s
 => => exporting config sha256:c29ae31e61f62fbc9cc353572cc75685d91c48c4f930fc0e8aba4f785f0a0a33  0.0s
 => => exporting attestation manifest sha256:a228930985cc14ebd9460baadf26c81cc3e51c65f11868d9b576ce2c917604a2  0.0s
 => => exporting manifest list sha256:c905a71017595001e983964ecb5076c266eee581bd54b08a8face117267b8f0e  0.0s
 => => naming to docker.io/library/suna-frontend:latest  0.0s
 => => unpacking to docker.io/library/suna-frontend:latest  10.0s
 => [frontend] resolving provenance for metadata file  0.0s
[+] Running 11/11
 ✔ backend Built  0.0s
 ✔ frontend Built  0.0s
 ✔ worker Built  0.0s
 ✔ Network suna_default Created  0.6s
 ✔ Volume "suna_rabbitmq_data" Created  0.0s
 ✔ Volume "suna_redis_data" Created  0.0s
 ✔ Container suna-rabbitmq-1 Healthy  15.6s
 ✔ Container suna-redis-1 Healthy  15.6s
 ✔ Container suna-worker-1 Started  13.8s
 ✔ Container suna-backend-1 Started  16.1s
 ✔ Container suna-frontend-1 Started  19.0s
ℹ️ Waiting for services to start...
⚠️ Some services might not be running correctly. Check 'docker compose ps' for details.

✨ Suna Setup Complete! ✨

ℹ️ Suna is configured to use openrouter/deepseek/deepseek-chat-v3-0324:free as the default LLM model
ℹ️ Your Suna instance is now running!
ℹ️ Access it at: http://localhost:3000
ℹ️ Create an account using Supabase authentication to start using Suna

Useful Docker commands:
docker compose ps - Check the status of Suna services
docker compose logs - View logs from all services
docker compose logs -f - Follow logs from all services
docker compose down - Stop Suna services
docker compose up -d - Start Suna services (after they've been stopped)

(.venv) F:\PythonProjects\suna>
- Visit http://localhost:3000 (printed in the log above) and register and log in with a Supabase account.
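Given the warning above that some services might not be running correctly, `docker compose ps --format json` is convenient for scripted health checks; Compose v2 prints one JSON object per line. A sketch, assuming the v2 output format (the `Service` and `State` field names should be verified against your Compose version):

```python
import json

def not_running(ps_json_lines: str) -> list:
    """Return service names whose State is not 'running', given
    `docker compose ps --format json` output (one JSON object per line)."""
    bad = []
    for line in ps_json_lines.strip().splitlines():
        svc = json.loads(line)
        if svc.get("State") != "running":
            bad.append(svc.get("Service"))
    return bad
```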