Minecraft MCP Friend (baseline “AI friend”)

This folder contains a working baseline agent that:

  • Spawns the Minecraft MCP server (@fundamentallabs/minecraft-mcp) over stdio
  • Joins your Minecraft world as a bot
  • Polls readChat and decides what to do using DSPy RLM + Groq via LiteLLM
  • Acts by calling MCP tools like sendChat, mineResource, openInventory, dropItem, etc.

If you just want the quick start: scroll to Run the agent.
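The poll → decide → act loop described above can be sketched as follows. This is an illustrative stand-in, not agent.py's actual code: the readChat/sendChat/mineResource tool names come from the MCP server, but the stubs and the decide() heuristic are assumptions (in the real agent, tool calls go over the MCP stdio session and a DSPy RLM makes the decision).

```python
def read_chat_stub():
    """Stand-in for the MCP readChat tool (normally an MCP call)."""
    return [{"sender": "steve", "message": "hi can you get some wood?"}]

def decide(message):
    """Stand-in for the DSPy RLM: map a chat message to a tool call."""
    if "wood" in message or "logs" in message:
        return ("mineResource", {"resource": "log"})
    return ("sendChat", {"text": "hello!"})

def poll_once():
    """One loop iteration: read chat, decide what to do, queue actions."""
    actions = []
    for entry in read_chat_stub():
        actions.append(decide(entry["message"]))
    return actions
```

In the real agent this runs continuously at the configured poll rate; here `poll_once()` returns `[("mineResource", {"resource": "log"})]` for the stubbed chat.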


Requirements

What you need installed

  • Java Minecraft (the official launcher is fine)
  • Node.js (so npx works)
  • uv (for Python + dependencies)

About Minecraft worlds and ports (important)

This agent joins a world via Mineflayer through the MCP server. Two common gotchas:

  • Open to LAN chooses a port: even if you type 25565, the real port is the one Minecraft prints in chat as “Local game hosted on port ####”.
  • Bots are clients: you don't “reserve a bot port.” The bot connects to your world's host/port like any other client.
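Before pointing the agent at a port, it can help to confirm the port Minecraft printed is actually reachable. A minimal check using only the standard library (the function name is my own, not part of this project):

```python
import socket

def port_is_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds.

    Useful for checking the port printed by "Open to LAN" before
    handing it to the agent.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

For example, `port_is_open("127.0.0.1", 54321)` should be True only while the world is open to LAN on that port.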

Security note (read this)

  • Never commit API keys. This project expects your Groq key in .env (loaded at runtime).
  • If you ever pasted a key into chat/screenshots, treat it as compromised and rotate it.

Setup (uv + Python 3.12)

From the repo root:

# DSPy RLM + MCP SDK need a modern Python.
uv python install 3.12
uv venv --python 3.12
source .venv/bin/activate

uv pip install -r requirements.txt

cp .env.example .env

Now edit .env and set at least:

  • GROQ_API_KEY=...
  • (optional) MAIN_MODEL and SUB_MODEL
  • (optional) BOT_USERNAME
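The actual loading is done by config.py at runtime; the sketch below is just a minimal KEY=VALUE parser illustrating the keys listed above (the values are placeholders, not real credentials):

```python
def parse_env(text):
    """Minimal .env parser: KEY=VALUE lines, blank lines and '#' comments ignored."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip()
    return settings

example = """
# required
GROQ_API_KEY=gsk_replace_me
# optional overrides
BOT_USERNAME=FriendBot
"""
parsed = parse_env(example)
```

Remember the security note above: the real key belongs in .env only, never in the repo.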

Run the agent (join your world)

Step 1: start a world

Option A (easy): Single-player → Open to LAN

  1. Launch Minecraft
  2. Open your single-player world
  3. Choose Open to LAN
  4. In chat, copy the port from the message:
    • “Local game hosted on port #####”

Option B (stable): run a dedicated server (recommended if you want a consistent port)

Step 2: run the agent

In the same terminal (with the venv activated):

python agent.py --host 127.0.0.1 --mc-port <PORT_FROM_MINECRAFT_CHAT>

Notes:

  • Use --host 127.0.0.1 if the bot runs on the same machine as Minecraft.
  • If the bot is on another machine, use your LAN IP (e.g. 192.168.x.y) instead.
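The flags shown above (--host, --mc-port, plus --validate-tools from the section below) suggest a CLI along these lines. This is a sketch of how such a parser could look, not agent.py's actual argument handling, and the defaults are assumptions:

```python
import argparse

def build_parser():
    """Hypothetical CLI matching the flags shown in this README."""
    p = argparse.ArgumentParser(description="Minecraft MCP friend agent")
    p.add_argument("--host", default="127.0.0.1",
                   help="Minecraft host (use your LAN IP if on another machine)")
    p.add_argument("--mc-port", type=int,
                   help="port printed by 'Open to LAN'")
    p.add_argument("--validate-tools", action="store_true",
                   help="list MCP tools and exit without joining")
    return p

args = build_parser().parse_args(["--host", "192.168.1.20", "--mc-port", "54321"])
```

Note that argparse turns `--mc-port` into the attribute `args.mc_port`.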

Step 3: talk to it in Minecraft chat

Try:

  • “hi can you get some wood?”
  • “can you collect a stack of logs for me?”

Validate connectivity (without joining)

This confirms the “MCP → list_tools → DSPy Tool conversion” pipeline:

python agent.py --validate-tools

Troubleshooting

1) ECONNREFUSED (connection refused)

This almost always means you're using the wrong port or your world is no longer open to LAN.

Checklist:

  • Re-open your world to LAN and re-check the port printed in chat.
  • Verify the port is listening:
lsof -nP -iTCP:<PORT> -sTCP:LISTEN
nc -vz 127.0.0.1 <PORT>

2) Unsupported protocol version 'XYZ' (attempted to use 'ABC' data)

This is a Minecraft version mismatch between your client/server and the Mineflayer stack behind the MCP server.

Fastest fix:

  • Run a Minecraft version that matches what the bot stack expects (the “attempted to use” number in the error is the clue).

Alternative:

  • Update the MCP server dependency stack (harder; can move the mismatch around).
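Since the two version strings in this error are the whole diagnosis, a tiny helper can pull them out of a log line. This assumes the error text has exactly the shape quoted above; the function name is my own:

```python
import re

def parse_protocol_mismatch(error_text):
    """Extract the two versions from:
    "Unsupported protocol version 'XYZ' (attempted to use 'ABC' data)".
    Returns (protocol_version, attempted_data_version) or None.
    """
    m = re.search(r"Unsupported protocol version '([^']+)' "
                  r"\(attempted to use '([^']+)' data\)", error_text)
    return m.groups() if m else None
```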

3) “It keeps saying it delivered items, but I didn't get them”

Minecraft item transfer is tricky. In this baseline we treat dropping items near the player (so they can pick them up) as the reliable mechanic. If you're testing “give” behaviors, prefer “drop-to-transfer” semantics.
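The drop-to-transfer idea can be sketched as below. The dropItem and sendChat tool names come from the MCP server, but their argument shapes and the call_tool wrapper are assumptions; here call_tool is a stub that records calls instead of talking to MCP:

```python
def call_tool(name, args, log):
    """Stub for an MCP tool call; records the call instead of sending it."""
    log.append((name, args))
    return "ok"

def deliver_by_dropping(item, count, player, log):
    """Drop `count` of `item` near `player`, then announce it in chat,
    rather than trusting a direct "give" to have worked."""
    call_tool("dropItem", {"item": item, "count": count}, log)
    call_tool("sendChat",
              {"text": f"dropped {count} {item} near you, {player}!"}, log)
    return log
```

The design point is that a drop followed by a chat announcement is observable by the player, whereas a claimed “give” may silently fail.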


What's in this folder

  • agent.py: main loop; joins world; polls chat; calls DSPy RLM
  • config.py: .env settings (models, poll rate, etc.)
  • host_interpreter.py: host-based RLM interpreter (avoids some sandbox/runtime issues)
  • memory_fs.py: local “memory filesystem” (stored under .memory/)
  • mcp_client.py: thin MCP wrapper utilities (useful for debugging)
  • uv.lock: Python deps (pinned to dspy[mcp]==3.1.2)

References

  • DSPy MCP tutorial: https://dspy.ai/tutorials/mcp/?h=mcp
  • DSPy language models: https://dspy.ai/learn/programming/language_models/
  • LiteLLM Groq provider: https://docs.litellm.ai/docs/providers/groq
  • MCP filesystem server (shape inspiration): https://www.npmjs.com/package/@modelcontextprotocol/server-filesystem