Switch to RLM instead of ReAct

README.md (259 changed lines):

# nanocode

Minimal Claude Code alternative using DSPy RLM! Single Python file, ~390 lines.

Built using Claude Code, then used to build itself.
## Features

- Full agentic loop with tool use via [DSPy RLM](https://dspy.ai/)
- Tools: `read`, `write`, `edit`, `glob`, `grep`, `bash`
- Conversation history with context
- Colored terminal output
- **Modaic Integration**: Push, version, and share as a [Modaic](https://modaic.dev) autoprogram
---

## Prerequisites

Before using nanocode (or any DSPy RLM-based program), you need to install the Deno code interpreter:

```bash
brew install deno
```

This is required for the RLM's code execution capabilities.

---
## Quick Start

### Option 1: Use as a Modaic AutoProgram

Load and run nanocode directly from the Modaic Hub without cloning:

```python
from modaic import AutoProgram

# Load the precompiled nanocode agent from Modaic Hub
agent = AutoProgram.from_precompiled(
    "farouk1/nanocode",
    config={
        "lm": "openrouter/anthropic/claude-3.5-sonnet",
        "max_iters": 20
    }
)

# Run a coding task
result = agent(task="What Python files are in this directory?")
print(result.answer)
print(result.affected_files)
```
### Option 2: Run Locally (Interactive CLI)

```bash
export OPENROUTER_API_KEY="your-key"
python nanocode.py
```

To use a specific model:

```bash
export OPENROUTER_API_KEY="your-key"
export MODEL="openai/gpt-4"
python nanocode.py
```
---

## Configuration

When using as a Modaic AutoProgram, you can configure these options:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `lm` | str | `openrouter/anthropic/claude-3.5-sonnet` | Primary language model |
| `sub_lm` | str | `openrouter/openai/gpt-4.1` | Sub-LM for reasoning steps |
| `max_iters` | int | `20` | Maximum agent iterations |
| `api_base` | str | `https://openrouter.ai/api/v1` | API base URL |
| `max_tokens` | int | `16000` | Maximum tokens per request |
| `max_output_chars` | int | `100000` | Maximum output character limit |
| `verbose` | bool | `False` | Enable verbose logging |
Example with custom configuration:

```python
from modaic import AutoProgram

agent = AutoProgram.from_precompiled(
    "farouk1/nanocode",
    config={
        "lm": "openrouter/openai/gpt-4",
        "sub_lm": "openrouter/openai/gpt-3.5-turbo",
        "max_iters": 30,
        "max_tokens": 8000,
        "verbose": True
    }
)
```
---

## CLI Commands

| Command | Description |
|---------|-------------|
| `/c` | Clear conversation history |
| `/q` or `exit` | Quit the application |

---
## Tools

The agent has access to the following tools:

| Tool | Function | Description |
|------|----------|-------------|
| `readfile` | `read_file(path, offset, limit)` | Read file contents with line numbers |
| `writefile` | `write_file(path, content)` | Write content to a file |
| `editfile` | `edit_file(path, old, new, replace_all)` | Replace text in a file (`old` must be unique unless `replace_all=True`) |
| `globfiles` | `glob_files(pattern, path)` | Find files matching a glob pattern, sorted by modification time |
| `grepfiles` | `grep_files(pattern, path)` | Search files for a regex pattern |
| `runbash` | `run_bash(cmd)` | Run a shell command and return output |
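The `editfile` contract described in the table can be sketched roughly as follows. This is a simplified, hypothetical sketch of the documented behavior, not nanocode's actual implementation:

```python
from pathlib import Path

def edit_file(path: str, old: str, new: str, replace_all: bool = False) -> str:
    """Hypothetical sketch of the `editfile` contract: `old` must occur
    exactly once in the file unless replace_all=True."""
    text = Path(path).read_text()
    count = text.count(old)
    if count == 0:
        return f"error: string not found in {path}"
    if count > 1 and not replace_all:
        return f"error: string occurs {count} times; pass replace_all=True"
    Path(path).write_text(text.replace(old, new))
    return f"replaced {count if replace_all else 1} occurrence(s) in {path}"
```

The uniqueness check is what makes string-replacement edits safe: an ambiguous match is refused instead of silently editing the wrong spot.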
---

## Example Usage

### Interactive CLI

```
────────────────────────────────────────
❯ what files are here?
────────────────────────────────────────

⏺ Thinking...
⏺ globfiles(pattern='**/*', path='.')

⏺ I found the following files:
- nanocode.py
- README.md
- modaic/SKILL.md
```
### Programmatic Usage

```python
from modaic import AutoProgram

agent = AutoProgram.from_precompiled("farouk1/nanocode")

# Read a file
result = agent(task="Read the first 10 lines of nanocode.py")
print(result.answer)

# Search for patterns
result = agent(task="Find all functions that contain 'file' in their name")
print(result.answer)

# Make edits
result = agent(task="Add a comment at the top of README.md")
print(result.affected_files)  # ['README.md']
```

---
## Architecture

### Overview

```
nanocode.py
├── File Operations
│   ├── read_file() - Read with line numbers
│   ├── write_file() - Write content
│   └── edit_file() - Find & replace
├── Search Operations
│   ├── glob_files() - Pattern matching
│   └── grep_files() - Regex search
├── Shell Operations
│   └── run_bash() - Execute commands
├── DSPy Components
│   ├── CodingAssistant (Signature)
│   └── RLMCodingProgram (PrecompiledProgram)
└── Modaic Integration
    └── RLMCodingConfig (PrecompiledConfig)
```
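The `glob_files()` entry in the tree above is described in the Tools section as pattern matching sorted by modification time. A plausible sketch under those assumptions (not nanocode's exact code):

```python
import os
from glob import glob

def glob_files(pattern: str, path: str = ".") -> list[str]:
    # Hypothetical sketch: files matching `pattern` under `path`,
    # most recently modified first, per the Tools table.
    matches = [p for p in glob(os.path.join(path, pattern), recursive=True)
               if os.path.isfile(p)]
    return sorted(matches, key=os.path.getmtime, reverse=True)
```

Sorting by mtime puts recently touched files first, which is usually what a coding agent wants to inspect.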
### Key Classes

#### `RLMCodingConfig`

Configuration class extending `PrecompiledConfig` for experiment-specific parameters.

```python
class RLMCodingConfig(PrecompiledConfig):
    max_iters: int = 20
    lm: str = "openrouter/anthropic/claude-3.5-sonnet"
    sub_lm: str = "openrouter/openai/gpt-4.1"
    api_base: str = "https://openrouter.ai/api/v1"
    max_tokens: int = 16000
    max_output_chars: int = 100000
    verbose: bool = False
```
#### `RLMCodingProgram`

Main program class extending `PrecompiledProgram`. Wraps a DSPy RLM agent with coding tools.

```python
class RLMCodingProgram(PrecompiledProgram):
    config: RLMCodingConfig

    def forward(self, task: str) -> dspy.Prediction:
        # Returns a prediction with .answer and .affected_files
        return self.agent(task=task)
```
#### `CodingAssistant`

DSPy Signature defining the agent's input/output schema.

```python
class CodingAssistant(dspy.Signature):
    task: str = dspy.InputField()
    answer: str = dspy.OutputField()
    affected_files: list[str] = dspy.OutputField()
```

---
## Publishing Your Own Version

If you modify nanocode and want to publish your own version to Modaic Hub:

```python
from nanocode import RLMCodingProgram, RLMCodingConfig

# Create and optionally optimize your program
program = RLMCodingProgram(RLMCodingConfig())

# Push to your Modaic Hub repo
program.push_to_hub(
    "your-username/my-nanocode",
    commit_message="My customized nanocode",
    with_code=True  # Include source code for AutoProgram loading
)
```

---
## Dependencies

- [DSPy](https://dspy.ai/) - Framework for programming language models
- [Modaic](https://modaic.ai/) - Hub for sharing and versioning DSPy programs
- OpenRouter API key (for accessing language models)

Install dependencies:

```bash
pip install dspy modaic
# or with uv
uv add dspy modaic
```

---
## Environment Variables

| Variable | Required | Description |
|----------|----------|-------------|
| `OPENROUTER_API_KEY` | Yes | Your OpenRouter API key |
| `MODEL` | No | Override the default model selection |
| `MODAIC_TOKEN` | For Hub | Required for pushing/loading from Modaic Hub |
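The `MODEL` override in the table amounts to a one-line environment lookup. A hypothetical sketch of how the CLI could pick its model (the helper name is illustrative):

```python
import os

# Default taken from the Configuration table above
DEFAULT_MODEL = "openrouter/anthropic/claude-3.5-sonnet"

def pick_model() -> str:
    # `MODEL`, when set, overrides the default model selection
    return os.environ.get("MODEL", DEFAULT_MODEL)
```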
---

## License

MIT
Other changed files:

```diff
@@ -1,7 +1,7 @@
 {
 "model": null,
 "max_iters": 20,
-"lm": "openai/gpt-5.2-codex",
+"lm": "openrouter/anthropic/claude-3.5-sonnet",
 "sub_lm": "openrouter/openai/gpt-4.1",
 "api_base": "https://openrouter.ai/api/v1",
 "max_tokens": 16000,
```
```diff
@@ -385,6 +385,6 @@ def main():


 if __name__ == "__main__":
-    agent = RLMCodingProgram(RLMCodingConfig(lm="openai/gpt-5.2-codex"))
-    agent.push_to_hub(MODAIC_REPO_PATH, commit_message="Switch to RLM instead of ReAct", tag="v0.0.1")
+    agent = RLMCodingProgram(RLMCodingConfig())
+    agent.push_to_hub(MODAIC_REPO_PATH, commit_message="Switch to RLM instead of ReAct", tag="v0.0.2")
 #main()
```
```diff
@@ -29,7 +29,7 @@
 ]
 },
 "lm": {
-"model": "openai/gpt-5.2-codex",
+"model": "openrouter/anthropic/claude-3.5-sonnet",
 "model_type": "chat",
 "cache": true,
 "num_retries": 3,
@@ -67,7 +67,7 @@
 ]
 },
 "lm": {
-"model": "openai/gpt-5.2-codex",
+"model": "openrouter/anthropic/claude-3.5-sonnet",
 "model_type": "chat",
 "cache": true,
 "num_retries": 3,
```