A Python backend that integrates Node-RED with AI capabilities using Pydantic AI.
```
red_node_pimp/
├── db/                          # Database files
│   ├── measurements.db          # SQLite database for measurements
│   ├── init_db.py               # Database initialization script
│   └── test_query.py            # Database test queries
│
├── node_red/                    # Node-RED flow management
│   ├── flows/                   # Individual flow definitions
│   │   ├── base_flow.py         # Base class for all flows
│   │   └── temperature_flow.py  # Temperature measurement flow
│   ├── utils/                   # Node-RED utilities
│   │   └── node_red.py          # Deployment and management functions
│   └── deploy.py                # Main deployment script
│
└── tools/                       # AI tools and integrations
    ├── base_tool.py             # Base class for all tools
    ├── dependencies.py          # Shared dependencies and configurations
    ├── models.py                # Pydantic models for data validation
    └── jokes.py                 # Chuck Norris jokes tool implementation
```
The tools system is built using Pydantic AI and follows a modular architecture:

- **Base Tool** (`tools/base_tool.py`)
  - Abstract base class for all tools
  - Defines the common interface and behaviors
  - Handles error management and logging
- **Dependencies** (`tools/dependencies.py`)
  - Centralized dependency management
  - Environment configuration
  - Shared resources (e.g., API endpoints)
- **Models** (`tools/models.py`)
  - Pydantic models for data validation
  - Request/response schemas
  - Shared data structures
- **Tool Implementations**
  - Each tool inherits from `BaseTool`
  - Example: `jokes.py` for Chuck Norris jokes
  - Follows Pydantic AI best practices
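The contents of `tools/base_tool.py` are not shown here; the following is a minimal sketch of what such a base class could look like, with the names, `safe_run` helper, and error-handling policy all assumed for illustration:

```python
import asyncio
from abc import ABC, abstractmethod
from typing import Any

from pydantic import BaseModel, Field


class BaseTool(BaseModel, ABC):
    """Hypothetical sketch of the shared tool interface."""

    name: str = Field(..., description="Unique tool name")
    description: str = Field(..., description="Human-readable description")

    @abstractmethod
    async def run(self, query: str, **kwargs: Any) -> str:
        """Subclasses implement the actual tool logic here."""

    async def safe_run(self, query: str, **kwargs: Any) -> str:
        # Centralized error handling: a failing tool reports the error
        # instead of crashing the agent loop.
        try:
            return await self.run(query, **kwargs)
        except Exception as exc:
            return f"[{self.name}] error: {exc}"


class EchoTool(BaseTool):
    """Trivial concrete tool used to exercise the base class."""

    name: str = "echo"
    description: str = "Echoes the query back in upper case"

    async def run(self, query: str, **kwargs: Any) -> str:
        return query.upper()


result = asyncio.run(EchoTool().safe_run("hello"))
```

Keeping error handling in the base class is one way to satisfy the "handles error management" responsibility above while leaving subclasses with only `run` to implement.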
The Node-RED integration is organized into reusable components:

- **Base Flow** (`node_red/flows/base_flow.py`)
  - Abstract base class for flows
  - Handles tab creation and ID management
  - Provides the common flow structure
- **Flow Implementations**
  - Each flow inherits from `NodeRedFlow`
  - Example: `temperature_flow.py` for temperature measurements
  - Self-contained flow definitions
- **Utilities** (`node_red/utils/`)
  - Deployment management
  - Node-RED API interaction
  - Backup functionality
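The tab creation and ID management that the base flow is described as handling could look like the sketch below; every name here is an assumption, but the `"tab"` node type and the `z` back-reference are how Node-RED's flow JSON actually groups nodes onto a tab:

```python
import uuid
from abc import ABC, abstractmethod


class NodeRedFlow(ABC):
    """Hypothetical sketch: each flow owns one Node-RED tab."""

    def __init__(self) -> None:
        # The base class manages the tab ID so member nodes can reference it.
        self.tab_id = uuid.uuid4().hex[:16]

    @property
    @abstractmethod
    def name(self) -> str: ...

    @property
    @abstractmethod
    def description(self) -> str: ...

    @abstractmethod
    def nodes(self) -> list:
        """Node definitions belonging to this tab (without the 'z' field)."""

    def to_json(self) -> list:
        # A deployable flow is a 'tab' node followed by its member nodes,
        # each pointing back at the tab via 'z'.
        tab = {"id": self.tab_id, "type": "tab",
               "label": self.name, "info": self.description}
        return [tab] + [{**n, "z": self.tab_id} for n in self.nodes()]


class DemoFlow(NodeRedFlow):
    @property
    def name(self) -> str:
        return "Demo"

    @property
    def description(self) -> str:
        return "Demo flow"

    def nodes(self) -> list:
        return [{"id": "n1", "type": "inject"}]


flow_json = DemoFlow().to_json()
```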
This project uses SciPhi Cloud for RAG capabilities. To configure R2R:

- Create an account on SciPhi Cloud
- Get your API key from the dashboard
- Add it to your `.env` file:

  ```
  SCIPHI_API_KEY=your_api_key_here
  ```

The project will automatically use SciPhi Cloud's managed R2R service instead of running it locally.
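For illustration, a `.env` entry like this can be picked up by a loader such as the minimal sketch below; in practice projects typically use `python-dotenv`, and `load_env` is a hypothetical helper, not part of this codebase:

```python
import os
from pathlib import Path


def load_env(path: str = ".env") -> None:
    """Minimal .env loader sketch; python-dotenv does this more robustly."""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            # Do not overwrite variables already set in the real environment.
            os.environ.setdefault(key.strip(), value.strip().strip('"'))


# Demo: write a sample .env file and load it (illustrative only).
Path("sample.env").write_text('SCIPHI_API_KEY="your_api_key_here"\n')
load_env("sample.env")
key = os.environ["SCIPHI_API_KEY"]
```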
This project uses LangSmith for monitoring and debugging LangChain applications. Required environment variables:

```
LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT="https://api.smith.langchain.com"
LANGSMITH_API_KEY="<your-api-key>"
LANGSMITH_PROJECT="<your-project-name>"
OPENAI_API_KEY="<your-openai-api-key>"
```

- Python 3.11.9 (recommended version for best compatibility with LangChain and LangGraph)
- pyenv-win for Python version management
- Poetry for dependency management
- Node-RED installed and running
- Install pyenv-win:

  ```powershell
  # Run the installation script
  .\install-pyenv-win.ps1
  ```

- Install Python 3.11.9:

  ```powershell
  pyenv install 3.11.9
  ```

- Set the Python version for the project:

  ```powershell
  # This will use Python 3.11.9 for this project
  pyenv local 3.11.9
  ```

- Install dependencies:

  ```powershell
  poetry install
  ```

- Clone the project:

  ```powershell
  git clone <repository-url>
  cd red_node_pimp
  ```

- Install dependencies:

  ```powershell
  # pyenv will automatically install and use Python 3.11.9
  poetry install              # Install Python dependencies
  cd node_red && npm install  # Install Node-RED dependencies
  ```

- Deploy Node-RED flows:

  ```powershell
  cd node_red
  poetry run python deploy.py  # Deploy the flows to Node-RED
  cd ..
  ```
- Start everything with one command:

  ```powershell
  ./start_app.ps1
  ```

  This will:

  - Start Node-RED (available at http://127.0.0.1:1880)
  - Wait for Node-RED to initialize
  - Start the FastAPI server with the chat interface (available at http://127.0.0.1:8000)
To start only the FastAPI backend with the Gradio interface (without Node-RED integration):

- Make sure you have Poetry installed:

  ```powershell
  pip install poetry
  ```

- Install dependencies:

  ```powershell
  poetry install
  ```

- Start the FastAPI server:

  ```powershell
  poetry run start
  ```

This will start the FastAPI server with hot reload enabled. The Gradio interface will be available at http://localhost:8000.
- Node-RED must be running for temperature-related features to work
- The chat interface supports both temperature queries and Chuck Norris jokes
- All chat responses are in French, but documentation remains in English
- Create a new file in `tools/`
- Inherit from `BaseTool`
- Define the required properties:

  ```python
  class NewTool(BaseTool):
      name: str = Field("tool_name", description="...")
      description: str = Field("Tool description", ...)
  ```

- Implement the `run` method
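Putting the steps together, a complete toy tool might look like the sketch below. It inherits from `pydantic.BaseModel` directly so the snippet is self-contained; in the project it would inherit from `BaseTool` instead, and the dice-rolling behaviour is purely illustrative:

```python
import asyncio
import random

from pydantic import BaseModel, Field


class DiceTool(BaseModel):
    """Toy example tool; in the project this would subclass BaseTool."""

    name: str = Field("dice_roll", description="Rolls an n-sided die")
    description: str = Field("Returns a pseudo-random roll of an n-sided die")

    async def run(self, sides: int = 6) -> str:
        # The run method does the actual work and returns a string result.
        roll = random.randint(1, sides)
        return f"rolled {roll} on a d{sides}"


answer = asyncio.run(DiceTool().run(sides=20))
```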
- Create a new file in `node_red/flows/`
- Inherit from `NodeRedFlow`
- Implement the required properties:

  ```python
  class NewFlow(NodeRedFlow):
      @property
      def name(self) -> str:
          return "Flow Name"

      @property
      def description(self) -> str:
          return "Flow description"
  ```

- Add the flow to `deploy.py`
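How `deploy.py` aggregates the registered flows is not shown; one plausible sketch is to concatenate each flow's JSON into a single payload for Node-RED's deployment endpoint. The `build_payload` helper and the stub class below are assumptions made for this example:

```python
class _StubFlow:
    """Stand-in for a NodeRedFlow subclass, just for this sketch."""

    def __init__(self, label: str) -> None:
        self.label = label

    def to_json(self) -> list:
        return [{"type": "tab", "label": self.label}]


def build_payload(flows: list) -> list:
    """Flatten every flow's node list into one deployment payload."""
    payload: list = []
    for flow in flows:
        payload.extend(flow.to_json())
    return payload


payload = build_payload([_StubFlow("Temperature"), _StubFlow("New Flow")])
```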
```python
from tools.jokes import jokes_agent

# Use the tool
result = await jokes_agent.run("Tell me a joke", deps=deps)
```

```powershell
# Deploy all flows
python node_red/deploy.py
```

- `NODE_RED_API`: Node-RED API URL (default: `http://127.0.0.1:1880`)
- `NODE_RED_API_KEY`: Optional API key for Node-RED authentication
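These variables might be read with defaults like the sketch below; `auth_headers` is a hypothetical helper, not part of the documented API:

```python
import os

# Defaults mirror the documented values; the key is optional.
NODE_RED_API = os.environ.get("NODE_RED_API", "http://127.0.0.1:1880")
NODE_RED_API_KEY = os.environ.get("NODE_RED_API_KEY")


def auth_headers() -> dict:
    """Build request headers, adding bearer auth only when a key is set."""
    headers = {"Content-Type": "application/json"}
    if NODE_RED_API_KEY:
        headers["Authorization"] = f"Bearer {NODE_RED_API_KEY}"
    return headers
```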
- Install dependencies:

  ```powershell
  pip install -r requirements.txt
  ```

- Initialize the database:

  ```powershell
  python db/init_db.py
  ```

- Deploy Node-RED flows:

  ```powershell
  python node_red/deploy.py
  ```

- Start the FastAPI server:

  ```powershell
  uvicorn main:app --reload
  ```
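After these steps, a quick way to verify that Node-RED is reachable is a probe like the following sketch (`node_red_up` is a hypothetical helper; any HTTP client works):

```python
import urllib.request


def node_red_up(base: str = "http://127.0.0.1:1880",
                timeout: float = 2.0) -> bool:
    """Return True if Node-RED answers on its editor endpoint."""
    try:
        with urllib.request.urlopen(base, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused or timed out: Node-RED is not up.
        return False
```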
This is a fork of SqueezeAILab/LLMCompiler with improved imports and package structure.