AI-assisted structural engineering workspace for AEC workflows.
Demo video: StructureClaw-demo.mp4
- Conversational engineering workflow from natural language to analysis artifacts
- Unified orchestration loop: draft -> validate -> analyze -> code-check -> report
- Web UI, API backend, and backend-hosted Python analysis runtime in one monorepo
- Regression and contract scripts for repeatable engineering validation
```
frontend (Next.js)
  -> backend (Fastify + Prisma + agent orchestration + analysis runtime host)
     -> backend/src/agent-skills/analysis-execution/python
     -> reports / metrics / artifacts
```
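The draft -> validate -> analyze -> code-check -> report loop above can be pictured as a simple staged pipeline. This is an illustrative sketch only; the stage functions and their return shapes are hypothetical placeholders, not the actual orchestration code.

```python
# Hypothetical sketch of the orchestration loop; stage names mirror the
# pipeline above, but none of these functions exist in the codebase as-is.

def draft(prompt: str) -> dict:
    # In the real system an LLM turns natural language into a model draft.
    return {"prompt": prompt, "members": []}

def validate(model: dict) -> dict:
    model["validated"] = True
    return model

def analyze(model: dict) -> dict:
    model["results"] = {"max_displacement_mm": 0.0}
    return model

def code_check(model: dict) -> dict:
    model["code_check"] = "pass"
    return model

def report(model: dict) -> str:
    return f"Report: validated={model['validated']}, check={model['code_check']}"

def run_pipeline(prompt: str) -> str:
    model = draft(prompt)
    for stage in (validate, analyze, code_check):
        model = stage(model)
    return report(model)
```

The point of the sketch is the ordering guarantee: each stage only ever sees the output of the stage before it.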
Main directories:
- `frontend/`: Next.js 14 application
- `backend/`: Fastify API, agent/chat flows, Prisma integration, and analysis execution host
- `scripts/`: startup helpers and contract/regression checks
- `docs/`: user handbook and protocol references
Recommended local flow:

```
make doctor
make start
make status
```

Notes:
- SQLite is now the default local database. A fresh setup writes to `.runtime/data/structureclaw.db`.
- If your old local `.env` still points `DATABASE_URL` at a local PostgreSQL instance, `make doctor` and `make start` will auto-migrate that data into SQLite, rewrite `.env` to the SQLite default, and keep the original PostgreSQL URL in `POSTGRES_SOURCE_DATABASE_URL`.
- That first auto-migration also creates a local backup file like `.env.pre-sqlite-migration.<timestamp>.bak`.
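The `DATABASE_URL` rewrite described above can be pictured roughly like this. This is a hedged sketch of the behavior, not the real migration code, and the exact SQLite URL string is an assumption.

```python
# Assumed SQLite URL form; the real default written by `make doctor` may differ.
SQLITE_DEFAULT = "file:.runtime/data/structureclaw.db"

def rewrite_env(env: dict) -> dict:
    """Sketch of the described rewrite: move a PostgreSQL DATABASE_URL aside
    and point DATABASE_URL at the SQLite default."""
    url = env.get("DATABASE_URL", "")
    if url.startswith(("postgres://", "postgresql://")):
        env["POSTGRES_SOURCE_DATABASE_URL"] = url
        env["DATABASE_URL"] = SQLITE_DEFAULT
    return env
```

Non-PostgreSQL URLs are left untouched, which matches the note that only old PostgreSQL setups are migrated.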
Useful follow-up commands:

```
make logs
make stop
make backend-regression
make analysis-regression
```

CLI alternative:
```
./sclaw doctor
./sclaw start
./sclaw status
./sclaw logs all --follow
./sclaw stop
```

Windows users can now start the full stack directly with Docker, which is the easiest path for beginners who do not want to install local Node.js, Python, PostgreSQL, and Redis first.
Recommended steps:
- Install and start Docker Desktop.
- If Docker Desktop asks you to enable WSL 2 or required container features on first launch, follow the setup wizard and restart Docker Desktop.
- Create `.env` from `.env.example` in the project root, and at minimum fill in `LLM_PROVIDER`, `LLM_API_KEY`, `LLM_MODEL`, and `LLM_BASE_URL`.
- Start the full stack with Docker Compose:

```
make docker-up
```

If your Windows environment does not have `make`, run:

```
docker compose up --build
```

Once the stack is ready, the main entrypoints are:
- Frontend: http://localhost:30000
- Backend health check: http://localhost:30010/health
- Analysis routes: http://localhost:30010/analyze
- Database status page: http://localhost:30000/console/database
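All four entrypoints are derived from just the two ports, so they can be built from the `PORT`/`FRONTEND_PORT` settings in one place. A small helper sketch, assuming the default ports listed above:

```python
def entrypoints(frontend_port: int = 30000, backend_port: int = 30010) -> dict:
    """Build the local entrypoint URLs listed above from the two port settings."""
    return {
        "frontend": f"http://localhost:{frontend_port}",
        "health": f"http://localhost:{backend_port}/health",
        "analyze": f"http://localhost:{backend_port}/analyze",
        "database_console": f"http://localhost:{frontend_port}/console/database",
    }
```

If you change `PORT` or `FRONTEND_PORT` in `.env`, every URL above shifts accordingly.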
To stop the containers:

```
make docker-down
```

Or:

```
docker compose down
```

Copy and adjust environment variables from `.env.example`.
Key variables include:
- `PORT`, `FRONTEND_PORT`
- `DATABASE_URL`, `POSTGRES_SOURCE_DATABASE_URL`, `REDIS_URL`
- `LLM_PROVIDER`, `LLM_API_KEY`, `LLM_MODEL`, `LLM_BASE_URL`
- `ANALYSIS_PYTHON_BIN`, `ANALYSIS_PYTHON_TIMEOUT_MS`, `ANALYSIS_ENGINE_MANIFEST_PATH`
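A startup check for the four required LLM variables could look like the sketch below. How the actual backend validates its environment is not documented here, so treat this as illustrative only.

```python
import os

# The four variables the Docker setup instructions call out as required.
REQUIRED_LLM_VARS = ("LLM_PROVIDER", "LLM_API_KEY", "LLM_MODEL", "LLM_BASE_URL")

def missing_llm_vars(env: dict) -> list:
    """Return the required LLM variables that are unset or empty."""
    return [name for name in REQUIRED_LLM_VARS if not env.get(name)]
```

Running `missing_llm_vars(dict(os.environ))` before startup gives a clear error list instead of a failure deep inside the first chat request.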
Backend:
- `POST /api/v1/agent/run`
- `POST /api/v1/chat/message`
- `POST /api/v1/chat/stream`
- `POST /api/v1/chat/execute`
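Calling the chat endpoint from a script might look like this. The route and default port come from this README, but the request body field (`message`) is an assumption, since the payload schema is not documented here.

```python
import json
import urllib.request

def chat_request(text: str, base: str = "http://localhost:30010") -> urllib.request.Request:
    """Build (but do not send) a POST to /api/v1/chat/message.
    The "message" field name is an assumed payload shape, not a documented schema."""
    body = json.dumps({"message": text}).encode()
    return urllib.request.Request(
        f"{base}/api/v1/chat/message",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Once the backend is running, send it with `urllib.request.urlopen(chat_request("..."))` and read the response body.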
Backend-hosted analysis:
- `POST /validate`
- `POST /convert`
- `POST /analyze`
- `POST /code-check`
- Skills are enhancement layers, not the only execution path.
- Unmatched selected skills fall back to generic no-skill modeling.
- User-visible content must support both English and Chinese.
- Keep module boundaries explicit across frontend/backend/analysis skills.
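The skill-fallback convention above can be sketched as a one-line resolution rule. The names here are hypothetical; the real skill registry lives under `backend/src/agent-skills` and may resolve differently.

```python
# Hypothetical name for the generic fallback path described above.
GENERIC_MODE = "no-skill-modeling"

def resolve_skill(selected: str, registry: set) -> str:
    """Return the matched skill, or fall back to generic no-skill modeling
    when the selected skill is not in the registry."""
    return selected if selected in registry else GENERIC_MODE
```

The key property is that an unknown skill name never fails the request; execution always has a generic path.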
- English handbook: `docs/handbook.md`
- Chinese handbook: `docs/handbook_CN.md`
- English reference: `docs/reference.md`
- Chinese reference: `docs/reference_CN.md`
- Chinese overview: `README_CN.md`
- Contribution guide: `CONTRIBUTING.md`
Please read CONTRIBUTING.md before opening a PR.
MIT. See LICENSE.