2 Commits

| SHA1 | Message | Date |
|------|---------|------|
| 3b154da6e5 | Translate comments to English | 2026-03-04 23:20:04 +01:00 |
| 81025922eb | Import 0.9.0 development baseline | 2026-03-04 22:21:18 +01:00 |
20 changed files with 4683 additions and 726 deletions

17
LICENSE Normal file

@@ -0,0 +1,17 @@
GNU LESSER GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2025 H5N3RG
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Lesser General Public License as published
by the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.

315
README.md

@@ -1,62 +1,299 @@
# LLM Connect
**A Luanti (formerly Minetest) mod that integrates Large Language Models (LLMs) directly into the game with an AI-powered Lua IDE and building assistant.**
![License](https://img.shields.io/badge/license-LGPL--3.0--or--later-blue)
---
## 🌟 Overview
LLM Connect brings modern AI assistance into Luanti worlds.
Players and developers can interact with a Large Language Model directly in-game to:
- ask questions
- generate Lua code
- analyze or refactor scripts
- assist with WorldEdit building tasks
- experiment with sandboxed Lua execution
Built on Luanti's HTTP API, the mod sends each chat message together with relevant in-game context (server info, installed mods, available materials, player position) to an external, OpenAI-compatible API endpoint, which lets the LLM give highly specific answers, e.g. on crafting items or locating resources in-game.
The mod combines an **AI chat interface**, a **Smart Lua IDE**, and **LLM-assisted building tools** into a single integrated workflow, and is structured to be understandable and extendable for new contributors.
---
## ✨ Core Features
### 🤖 AI Chat Interface
Interact with a Large Language Model directly inside Luanti.
Features include:
- in-game chat GUI
- conversation context handling
- player and world information awareness
- configurable prompts and system instructions
- support for OpenAI-compatible APIs
The chat system automatically includes contextual information such as:
- player position
- installed mods
- selected materials
- server environment
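The requests themselves go through the mod's `llm_api` module, which is not part of this diff. As a rough, hypothetical sketch of what an OpenAI-compatible chat request looks like (the helper name, `cfg` fields, and defaults are illustrative; in-game the mod would serialize the payload with `core.write_json()` and send it via `core.request_http_api():fetch()`):

```lua
-- Hypothetical helper: assembles an OpenAI-compatible chat completion
-- request table. Only the table is built here; sending it requires the
-- Luanti HTTP API (secure.http_mods = llm_connect).
local function build_chat_request(cfg, messages)
    return {
        url = cfg.api_url or "https://api.openai.com/v1/chat/completions",
        method = "POST",
        timeout = cfg.timeout or 120,
        extra_headers = {
            "Content-Type: application/json",
            "Authorization: Bearer " .. (cfg.api_key or ""),
        },
        payload = {
            model = cfg.model or "gpt-4",
            messages = messages,  -- {{role="system", ...}, {role="user", ...}, ...}
            -- kept an integer for Ollama/LocalAI compatibility
            max_tokens = math.floor(cfg.max_tokens or 4000),
        },
    }
end

local req = build_chat_request(
    {api_key = "sk-example", model = "gpt-4"},
    {{role = "system", content = "You are a helpful in-game assistant."},
     {role = "user",   content = "How do I craft a furnace?"}})
```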
---
### 💻 Smart Lua IDE
LLM Connect includes a fully integrated **AI-assisted Lua development environment**.
Capabilities include:
- AI code generation from natural language prompts
- semantic code explanation
- automated refactoring
- code analysis
- interactive editing interface
- integration with the game environment
Developers can experiment with Lua snippets directly inside the game.
---
### 🧪 Sandboxed Code Execution
Lua code can be executed inside a controlled environment.
Security features include:
- privilege-based execution access
- sandboxed runtime
- optional whitelist restrictions
- prevention of filesystem access
Execution results are returned to the IDE interface for inspection.
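The core idea behind this sandbox can be sketched in a few lines: compile the snippet, then swap its environment for a table that only exposes whitelisted functions. This is a minimal illustration, not the mod's actual `code_executor` implementation (which additionally copies the shared library tables so sandboxed code cannot mutate them):

```lua
-- Minimal environment-based sandbox sketch. Works on Lua 5.1/LuaJIT
-- (as used by Luanti) via setfenv, with a load() fallback for newer Lua.
local function run_sandboxed(code)
    -- Only these names are visible; io/os/require simply do not exist here.
    local env = {
        print = print, pairs = pairs, ipairs = ipairs,
        tostring = tostring, tonumber = tonumber,
        string = string, table = table, math = math,
    }
    local chunk, err
    if setfenv then
        -- Lua 5.1 / LuaJIT: compile, then swap the chunk's environment
        chunk, err = loadstring(code, "=(sandbox)")
        if chunk then setfenv(chunk, env) end
    else
        -- Lua 5.2+: load() accepts the environment directly
        chunk, err = load(code, "=(sandbox)", "t", env)
    end
    if not chunk then return false, "compile error: " .. tostring(err) end
    return pcall(chunk)  -- catch runtime errors as well
end

local ok, result = run_sandboxed("return 2 + 2")           -- true, 4
local blocked = run_sandboxed("return io.open('/etc/x')")  -- false: `io` is nil in the sandbox
```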
---
### 🏗️ WorldEdit AI Assistant
LLM Connect can assist with building tasks using WorldEdit.
Examples:
- structure generation prompts
- building suggestions
- node/material selection
- architectural transformations
The system can combine:
- player position
- selected materials
- worldedit context
to produce context-aware instructions.
---
### 🧱 Material Selection Tools
The mod includes a **material picker** interface that helps the AI understand:
- available nodes
- building palettes
- player selections
This improves the quality of building-related prompts.
---
## 🔐 Permission System
Access to AI features is controlled through Luanti privileges.
| Privilege | Description |
|-----------|-------------|
| `llm` | Basic AI chat access |
| `llm_ide` | Access to the Smart Lua IDE |
| `llm_dev` | Sandbox Lua execution |
| `llm_root` | Full administrative control |
Server operators should grant privileges carefully.
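For illustration, privilege declarations like these would typically be registered in the mod's init code (the real definitions are not part of this diff; descriptions mirror the table above, and the `core` stub exists only so the snippet runs outside Luanti):

```lua
-- Outside Luanti there is no `core`; fall back to a recording stub so the
-- sketch stays self-contained.
local registered = {}
local core = rawget(_G, "core") or {
    register_privilege = function(name, def) registered[name] = def end,
}

-- Illustrative privilege declarations, matching the table above
core.register_privilege("llm",      {description = "Basic AI chat access"})
core.register_privilege("llm_ide",  {description = "Access to the Smart Lua IDE"})
core.register_privilege("llm_dev",  {description = "Sandboxed Lua code execution"})
core.register_privilege("llm_root", {description = "Full administrative control over LLM Connect"})
```

Operators then grant access per player with the standard chat commands, e.g. `/grant somePlayer llm_ide`, and revoke it with `/revoke`.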
---
## 📋 Requirements
- Luanti server **5.4.0 or newer recommended**
- HTTP API enabled
- Access to a compatible LLM endpoint
Supported providers include:
- OpenAI
- Ollama
- LocalAI
- Open WebUI
- LM Studio
- Mistral AI
- Together AI
- any OpenAI-compatible API
`max_tokens` is sent as an integer by default, which keeps Ollama and LocalAI compatible.
---
## Commands
- `/llm <prompt>` Sends a message to the LLM.
- `/llm_setkey <key> [url] [model]` Sets the API key, URL, and model for the LLM.
- `/llm_setmodel <model>` Sets the LLM model to be used.
- `/llm_set_endpoint <url>` Sets the API endpoint URL.
- `/llm_set_context <count> [player]` Sets the maximum context length for a player or all players.
- `/llm_reset` Resets the conversation history and the cached metadata for the current player.
- `/llm_integer` Forces `max_tokens` to be sent as an integer (default).
- `/llm_float` Sends `max_tokens` as a float (optional, experimental).
**Context control (configurable via in-game settings):**
The following context elements can be individually toggled `ON`/`OFF` in the Luanti main menu:
* Send Server Info (`llm_context_send_server_info`)
* Send Mod List (`llm_context_send_mod_list`)
* Send Commands List (`llm_context_send_commands`)
* Send Player Position (`llm_context_send_player_pos`)
* Send Available Materials (`llm_context_send_materials`)
---
## 🚀 Installation
### ContentDB (recommended)
Install via ContentDB:
```
Content → Mods → LLM Connect
```
---
### Manual Installation
1. Download the repository or release archive
2. Extract into your `mods` folder
3. Ensure the folder name is:
```
llm_connect
```
4. Enable HTTP API in `minetest.conf`
```
secure.http_mods = llm_connect
```
Restart the server.
---
## ⚙️ Configuration
Configuration can be done via:
- `/llm_config` GUI
- `minetest.conf`
Example:
```
llm_api_key = your-api-key
llm_api_url = https://api.openai.com/v1/chat/completions
llm_model = gpt-4
llm_temperature = 0.7
llm_max_tokens = 4000
llm_timeout = 120
```
Context options:
```
llm_context_send_player_pos = true
llm_context_send_mod_list = true
llm_context_send_materials = true
```
---
## 🎮 Commands
| Command | Description |
|-------|-------------|
| `/llm` | Open AI chat |
| `/llm_ide` | Open Smart Lua IDE |
| `/llm_config` | Open configuration interface |
---
## 🔐 Security Notes
LLM Connect includes multiple safety mechanisms:
- privilege-based access control
- sandboxed execution environment
- optional Lua whitelist
- no filesystem access in sandbox mode
Server administrators should still review generated code carefully.
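One cheap pre-check in the spirit of the "optional Lua whitelist" can be sketched as a textual scan that rejects code mentioning globals outside an allowed set. This is an illustration only, not the mod's implementation, and a heuristic at that (string literals and locals can trigger false positives); real enforcement belongs in the sandboxed environment itself:

```lua
-- Globals a snippet is allowed to reference (illustrative set)
local ALLOWED = {
    print = true, pairs = true, ipairs = true, tostring = true,
    tonumber = true, string = true, table = true, math = true,
}

local function whitelist_check(code)
    -- Coarse heuristic: flag any identifier that names an existing global
    -- not on the allow-list.
    for ident in code:gmatch("[%a_][%w_]*") do
        if _G[ident] ~= nil and not ALLOWED[ident] then
            return false, "identifier not whitelisted: " .. ident
        end
    end
    return true
end

local ok = whitelist_check("print(1 + 2)")           -- true
local bad = whitelist_check("os.execute('ls')")      -- false: `os` is not whitelisted
```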
---
## 🧭 Roadmap
See `ROADMAP_090.md` for planned improvements and upcoming features.
---
## 🤝 Contributing
Contributions are welcome.
Typical workflow:
1. fork repository
2. create feature branch
3. implement changes
4. submit pull request
Areas of interest:
- new AI integrations
- UI improvements
- security auditing
- building tools
- documentation
---
## 📜 License
LGPL-3.0-or-later
See `LICENSE`.
---
## 🔗 Links
- ContentDB: https://content.luanti.org/packages/H5N3RG/llm_connect/
- Luanti: https://www.luanti.org/
---
This project is in an early stage and welcomes contributions:
- Even small fixes help, especially with API integration, UI improvements, and performance tuning.
- Contributions from experienced developers are highly welcome.

**LLM Connect: Bringing AI-assisted development into Luanti.**

109
ROADMAP_090.md Normal file

@@ -0,0 +1,109 @@
# LLM Connect 0.9 Roadmap
This document outlines the development goals and planned improvements
for the 0.9 release of LLM Connect.
The focus of this version is stability, improved context handling,
and better integration between AI features and the Luanti environment.
---
## Overview
Version 0.9 aims to refine the existing AI integration and extend
the development tools provided by the Smart Lua IDE.
Key areas include:
- improved context awareness
- better IDE workflow
- enhanced WorldEdit integration
- improved error handling and execution feedback
---
## Completed Features
The following features are already implemented:
- Improved request timeout handling
- Expanded configuration options
- Wider IDE layout and improved UI usability
- Guide checkbox for code generation prompts
- Custom file indexing system replacing `core.get_dir_list`
- Improved proxy and API error handling
---
## In Progress
These features are currently under active development:
### IDE Context System
The IDE will provide additional contextual information to the LLM.
Planned context elements include:
- active mods list
- player position
- currently opened file
- execution output from previous runs
This will allow the LLM to generate more accurate and relevant code.
---
## Planned Improvements
### Execution Feedback Loop
Improve the interaction between generated code and the execution system.
Possible features:
- automatic error detection
- AI-assisted debugging
- improved output visualization
---
### WorldEdit Integration
Further improvements to AI-assisted building tools:
- context-aware structure generation
- material-aware building suggestions
- improved prompt templates
---
### Prompt System Refinements
Improve system prompts used for:
- Lua code generation
- WorldEdit assistance
- general chat interactions
The goal is more consistent and reliable responses.
---
## Future Ideas
Ideas being explored for future versions:
- agent-style AI workflows
- multi-step code generation and correction
- automatic debugging loops
- extended IDE tooling
- improved building automation tools
---
## Long-Term Vision
LLM Connect aims to evolve into a complete AI-assisted development
environment inside Luanti, enabling players and modders to experiment,
prototype, and build complex systems directly within the game.

135
chat_context.lua Normal file

@@ -0,0 +1,135 @@
-- chat_context.lua
-- Collects game and world context for the LLM
-- Uses settings from settingtypes.txt
local core = core
local M = {}
local materials_cache = nil
local materials_hash = nil
-- Computes a cheap change marker (total count of registered nodes and items) to detect registry changes
local function compute_registry_hash()
local count = 0
for _ in pairs(core.registered_nodes) do count = count + 1 end
for _ in pairs(core.registered_items) do count = count + 1 end
return tostring(count)
end
-- Generates a string context of registered materials (nodes, tools, items)
local function get_materials_context()
local current_hash = compute_registry_hash()
if materials_cache and materials_hash == current_hash then
return materials_cache
end
local lines = {}
local categories = {
{list = core.registered_nodes, label = "Nodes"},
{list = core.registered_tools, label = "Tools"},
{list = core.registered_craftitems, label = "Items"}
}
for _, cat in ipairs(categories) do
local count = 0
local items = {}
for name, _ in pairs(cat.list) do
-- Filter out internal engine nodes
if not name:match("^__builtin") and not name:match("^ignore") and not name:match("^air") then
count = count + 1
if count <= 40 then
table.insert(items, name)
end
end
end
table.insert(lines, cat.label .. ": " .. table.concat(items, ", "))
end
materials_cache = table.concat(lines, "\n")
materials_hash = current_hash
return materials_cache
end
-- Returns general information about the server and game state
function M.get_server_info()
local info = {}
local version = core.get_version()
table.insert(info, "Game: " .. (core.get_game_info().name or "Luanti/Minetest"))
table.insert(info, "Engine Version: " .. (version.string or "unknown"))
if core.settings:get_bool("llm_context_send_mod_list") then
local mods = core.get_modnames()
table.sort(mods)
table.insert(info, "Active Mods: " .. table.concat(mods, ", "))
end
local time = core.get_timeofday() * 24000
local hour = math.floor(time / 1000)
local min = math.floor((time % 1000) / 1000 * 60)
table.insert(info, string.format("In-game Time: %02d:%02d", hour, min))
return table.concat(info, "\n")
end
-- Compiles all enabled context categories into a single string
function M.get_context(name)
local ctx = {"--- START CONTEXT ---"}
-- 1. Server Info
if core.settings:get_bool("llm_context_send_server_info") ~= false then
table.insert(ctx, "--- SERVER INFO ---")
table.insert(ctx, M.get_server_info())
end
-- 2. Player Info
if core.settings:get_bool("llm_context_send_player_pos") ~= false then
local player = core.get_player_by_name(name)
if player then
local pos = player:get_pos()
local hp = player:get_hp()
local wielded = player:get_wielded_item():get_name()
table.insert(ctx, string.format("Current Player (%s): HP: %d, Pos: (x=%.1f, y=%.1f, z=%.1f)",
name, hp, pos.x, pos.y, pos.z))
if wielded ~= "" then
table.insert(ctx, "Holding item: " .. wielded)
end
end
end
-- 3. Chat Commands
if core.settings:get_bool("llm_context_send_commands") then
local cmds = {}
for cmd, _ in pairs(core.registered_chatcommands) do
table.insert(cmds, "/" .. cmd)
end
table.sort(cmds)
table.insert(ctx, "Available Commands (Top 50): " .. table.concat(cmds, ", ", 1, math.min(50, #cmds)))
end
-- 4. Materials
if core.settings:get_bool("llm_context_send_materials") then
table.insert(ctx, "--- REGISTERED MATERIALS ---")
table.insert(ctx, get_materials_context())
end
table.insert(ctx, "--- END CONTEXT ---")
return table.concat(ctx, "\n")
end
-- Injects the game context as a system message into the messages table
function M.append_context(messages, name)
local context_str = M.get_context(name)
table.insert(messages, 1, {
role = "system",
content = "You are an AI assistant inside a Luanti (Minetest) game world. " ..
"Use the following game context to answer the user's questions accurately.\n\n" ..
context_str
})
return messages
end
return M

368
chat_gui.lua Normal file

@@ -0,0 +1,368 @@
-- chat_gui.lua
-- LLM Chat Interface v0.8.7
-- Privilege model:
-- llm → Chat only
-- llm_dev → + IDE button
-- llm_worldedit → + WE Single/Loop + Mats + Undo
-- llm_root → Superrole: implies all of the above + Config button
local core = core
local M = {}
local mod_path = core.get_modpath("llm_connect")
local context_ok, chat_context = pcall(dofile, mod_path .. "/chat_context.lua")
if not context_ok then
core.log("error", "[chat_gui] Failed to load chat_context.lua: " .. tostring(chat_context))
chat_context = nil
end
-- material_picker: prefer already-loaded global, fallback to dofile
local material_picker = _G.material_picker
if not material_picker then
local ok, mp = pcall(dofile, mod_path .. "/material_picker.lua")
if ok and mp then
material_picker = mp
else
core.log("warning", "[chat_gui] material_picker not available: " .. tostring(mp))
end
end
local function get_llm_api()
if not _G.llm_api then error("[chat_gui] llm_api not available") end
return _G.llm_api
end
-- ============================================================
-- Privilege helpers
-- llm_root is a super-role: implies llm + llm_dev + llm_worldedit
-- ============================================================
local function raw_priv(name, priv)
local p = core.get_player_privs(name) or {}
return p[priv] == true
end
local function has_priv(name, priv)
if raw_priv(name, "llm_root") then return true end
return raw_priv(name, priv)
end
local function can_chat(name) return has_priv(name, "llm") end
local function can_ide(name) return has_priv(name, "llm_dev") end
local function can_worldedit(name) return has_priv(name, "llm_worldedit") end
local function can_config(name) return raw_priv(name, "llm_root") end -- root only, no implication upward
-- ============================================================
-- Session
-- ============================================================
local sessions = {}
local WE_MODE_LABEL = {chat="Chat", single="WE Single", loop="WE Loop"}
local WE_MODE_COLOR = {chat="#444455", single="#2a4a6a", loop="#4a2a6a"}
local WE_MODE_COLOR_UNAVAIL = "#333333"
local function get_session(name)
if not sessions[name] then
sessions[name] = {history={}, last_input="", we_mode="chat"}
end
return sessions[name]
end
local function we_available()
return type(_G.we_agency) == "table" and _G.we_agency.is_available()
end
local function cycle_we_mode(session, name)
if not we_available() then
core.chat_send_player(name, "[LLM] WorldEdit not available.")
return
end
local cur = session.we_mode
if cur == "chat" then session.we_mode = "single"
elseif cur == "single" then session.we_mode = "loop"
elseif cur == "loop" then session.we_mode = "chat"
end
core.chat_send_player(name, "[LLM] Mode: " .. WE_MODE_LABEL[session.we_mode])
end
-- ============================================================
-- Build Formspec
-- ============================================================
function M.show(name)
if not can_chat(name) then
core.chat_send_player(name, "[LLM] Missing privilege: llm")
return
end
local session = get_session(name)
local text_accum = ""
for _, msg in ipairs(session.history) do
if msg.role ~= "system" then
local content = msg.content or ""
if msg.role == "user" then
text_accum = text_accum .. "You: " .. content .. "\n\n"
else
text_accum = text_accum .. "[LLM]: " .. content .. "\n\n"
end
end
end
if text_accum == "" then
text_accum = "Welcome to LLM Chat!\nType your question below."
end
local W = 16.0
local H = 12.0
local PAD = 0.25
local HEADER_H = 1.8
local INPUT_H = 0.7
local CHAT_H = H - HEADER_H - INPUT_H - (PAD * 6)
local fs = {
"formspec_version[6]",
"size[" .. W .. "," .. H .. "]",
"bgcolor[#0f0f0f;both]",
"style_type[*;bgcolor=#1a1a1a;textcolor=#e0e0e0]",
}
-- Header box
table.insert(fs, "box[0,0;" .. W .. "," .. HEADER_H .. ";#202020]")
table.insert(fs, "label[" .. PAD .. ",0.30;LLM Chat - " .. core.formspec_escape(name) .. "]")
-- ── Header row 1 right side: Config (root) + IDE (dev) ────
local right_x = W - PAD
if can_config(name) then
right_x = right_x - 2.0
table.insert(fs, "style[open_config;bgcolor=#2a2a1a;textcolor=#ffeeaa]")
table.insert(fs, "button[" .. right_x .. ",0.08;2.0,0.65;open_config;Config]")
table.insert(fs, "tooltip[open_config;Open LLM configuration (llm_root only)]")
end
if can_ide(name) then
right_x = right_x - 2.3 - 0.15
table.insert(fs, "style[open_ide;bgcolor=#1a1a2a;textcolor=#aaaaff]")
table.insert(fs, "button[" .. right_x .. ",0.08;2.3,0.65;open_ide;IDE]")
table.insert(fs, "tooltip[open_ide;Open Smart Lua IDE (llm_dev)]")
end
-- Header row 2: three direct WE-Mode buttons side by side
local mode = session.we_mode or "chat"
if can_worldedit(name) then
local we_ok = we_available()
local dim = "#2a2a2a"
local function we_btn(bname, bx, bw, blabel, active, color_on, color_dim, tip)
local bg = we_ok and (active and color_on or color_dim) or dim
local fg = active and "#ffffff" or (we_ok and "#889999" or "#555555")
table.insert(fs, "style[" .. bname .. ";bgcolor=" .. bg .. ";textcolor=" .. fg .. "]")
table.insert(fs, "button[" .. bx .. ",0.95;" .. bw .. ",0.65;" .. bname .. ";" .. blabel .. "]")
table.insert(fs, "tooltip[" .. bname .. ";" .. (we_ok and tip or "WorldEdit not loaded") .. "]")
end
we_btn("we_btn_chat", PAD, 2.6, "Chat", mode=="chat", "#444466", "#1e1e2e", "Normal LLM chat mode")
we_btn("we_btn_single", PAD + 2.7, 2.8, "WE Single", mode=="single", "#2a4a7a", "#151d2a", "WorldEdit: one plan per message")
we_btn("we_btn_loop", PAD + 5.6, 2.6, "WE Loop", mode=="loop", "#4a2a7a", "#1e1228", "WorldEdit: iterative build loop (up to 6 steps)")
if mode == "single" or mode == "loop" then
local mat_count = material_picker and #material_picker.get_materials(name) or 0
local mat_label = mat_count > 0 and ("Mats (" .. mat_count .. ")") or "Mats"
local mat_color = mat_count > 0 and "#1a3a1a" or "#252525"
table.insert(fs, "style[we_materials_open;bgcolor=" .. mat_color .. ";textcolor=#aaffaa]")
table.insert(fs, "button[" .. (PAD + 8.3) .. ",0.95;2.6,0.65;we_materials_open;" .. mat_label .. "]")
table.insert(fs, "tooltip[we_materials_open;Material picker: attach node names to LLM context]")
end
table.insert(fs, "style[we_undo;bgcolor=#3a2020;textcolor=#ffaaaa]")
table.insert(fs, "button[" .. (W - PAD - 2.1) .. ",0.95;2.1,0.65;we_undo;Undo]")
table.insert(fs, "tooltip[we_undo;Undo last WorldEdit agency operation]")
end
-- Chat history
table.insert(fs, "textarea[" .. PAD .. "," .. (HEADER_H + PAD) .. ";"
.. (W - PAD*2) .. "," .. CHAT_H
.. ";history_display;;" .. core.formspec_escape(text_accum) .. "]")
table.insert(fs, "style[history_display;textcolor=#e0e0e0;bgcolor=#1a1a1a;border=false]")
-- Input
local input_y = HEADER_H + PAD + CHAT_H + PAD
table.insert(fs, "field[" .. PAD .. "," .. input_y .. ";"
.. (W - PAD*2 - 2.5) .. "," .. INPUT_H
.. ";input;;" .. core.formspec_escape(session.last_input) .. "]")
table.insert(fs, "button[" .. (W - PAD - 2.2) .. "," .. input_y
.. ";2.2," .. INPUT_H .. ";send;Send]")
table.insert(fs, "field_close_on_enter[input;false]")
-- Toolbar
local tb_y = input_y + INPUT_H + PAD
table.insert(fs, "button[" .. PAD .. "," .. tb_y .. ";2.8,0.75;clear;Clear Chat]")
core.show_formspec(name, "llm_connect:chat", table.concat(fs))
end
-- ============================================================
-- Formspec Handler
-- ============================================================
function M.handle_fields(name, formname, fields)
-- Forward to Material Picker
if formname:match("^llm_connect:material_picker") then
if material_picker then
local result = material_picker.handle_fields(name, formname, fields)
if fields.close_picker or fields.close_and_back or fields.quit then
M.show(name)
end
return result
end
return false
end
if not formname:match("^llm_connect:chat") then return false end
local session = get_session(name)
local updated = false
-- ── WE-Buttons (privilege-checked) ──────────────────────
if fields.we_btn_chat then
if can_worldedit(name) then session.we_mode = "chat"; updated = true end
elseif fields.we_btn_single then
if can_worldedit(name) and we_available() then session.we_mode = "single"; updated = true end
elseif fields.we_btn_loop then
if can_worldedit(name) and we_available() then session.we_mode = "loop"; updated = true end
elseif fields.we_materials_open then
if can_worldedit(name) and material_picker then
material_picker.show(name)
end
return true
elseif fields.we_undo then
if can_worldedit(name) and _G.we_agency then
local res = _G.we_agency.undo(name)
table.insert(session.history, {role="assistant",
content=(res.ok and "Undo: " or "Error: ") .. res.message})
updated = true
end
-- ── IDE / Config (privilege-checked) ────────────────────
elseif fields.open_ide then
if can_ide(name) and _G.ide_gui then
_G.ide_gui.show(name)
end
return true
elseif fields.open_config then
if can_config(name) and _G.config_gui then
_G.config_gui.show(name)
end
return true
-- ── Send ────────────────────────────────────────────────
elseif fields.send or fields.key_enter_field == "input" then
local input = (fields.input or ""):trim()
if input ~= "" then
table.insert(session.history, {role="user", content=input})
session.last_input = ""
-- WE Loop (llm_worldedit only)
if session.we_mode == "loop" and can_worldedit(name) and we_available() then
table.insert(session.history, {role="assistant", content="(starting WE loop...)"})
updated = true
local mat_ctx = material_picker and material_picker.build_material_context(name)
local loop_input = mat_ctx and (input .. "\n\n" .. mat_ctx) or input
_G.we_agency.run_loop(name, loop_input, {
max_iterations = (_G.llm_api and _G.llm_api.config.we_max_iterations or 6),
timeout = (_G.llm_api and _G.llm_api.get_timeout("we") or 90),
on_step = function(i, plan, results)
local lines = {"[WE Loop] Step " .. i .. ": " .. plan}
for _, r in ipairs(results) do
table.insert(lines, " " .. (r.ok and "v" or "x") .. " " .. r.tool .. ": " .. r.message)
end
core.chat_send_player(name, table.concat(lines, "\n"))
end,
}, function(res)
local reply = _G.we_agency.format_loop_results(res)
for i = #session.history, 1, -1 do
if session.history[i].content == "(starting WE loop...)" then
session.history[i].content = reply; break
end
end
M.show(name)
end)
-- WE Single (llm_worldedit only)
elseif session.we_mode == "single" and can_worldedit(name) and we_available() then
table.insert(session.history, {role="assistant", content="(planning WE operations...)"})
updated = true
local mat_ctx = material_picker and material_picker.build_material_context(name)
local single_input = mat_ctx and (input .. "\n\n" .. mat_ctx) or input
_G.we_agency.request(name, single_input, function(res)
local reply = not res.ok
and ("Error: " .. (res.error or "unknown"))
or _G.we_agency.format_results(res.plan, res.results)
for i = #session.history, 1, -1 do
if session.history[i].content == "(planning WE operations...)" then
session.history[i].content = reply; break
end
end
M.show(name)
end)
-- Normal Chat (always allowed if llm privilege is present)
else
-- Reset WE-mode if player lacks privilege
if session.we_mode ~= "chat" and not can_worldedit(name) then
session.we_mode = "chat"
end
local messages = {}
local context_added = false
if chat_context then
messages = chat_context.append_context(messages, name)
if #messages > 0 and messages[1].role == "system" then context_added = true end
end
if not context_added then
table.insert(messages, 1, {role="system",
content="You are a helpful assistant in the Luanti/Minetest game."})
end
for _, msg in ipairs(session.history) do table.insert(messages, msg) end
table.insert(session.history, {role="assistant", content="(thinking...)"})
updated = true
local llm_api = get_llm_api()
llm_api.request(messages, function(result)
local content = result.success and result.content
or "Error: " .. (result.error or "Unknown error")
for i = #session.history, 1, -1 do
if session.history[i].content == "(thinking...)" then
session.history[i].content = content; break
end
end
M.show(name)
end, {timeout = (_G.llm_api and _G.llm_api.get_timeout("chat") or 180)})
end
end
-- ── Clear ───────────────────────────────────────────────
elseif fields.clear then
session.history = {}
session.last_input = ""
updated = true
elseif fields.quit then
return true
end
if updated then M.show(name) end
return true
end
core.register_on_leaveplayer(function(player)
sessions[player:get_player_name()] = nil
end)
return M

298
code_executor.lua Normal file

@@ -0,0 +1,298 @@
-- code_executor.lua
-- Secure Lua code execution for LLM-Connect / Smart Lua IDE
-- Privileges:
-- llm_dev → Sandbox + Whitelist, no persistent registrations
-- llm_root → Unrestricted execution + persistent registrations possible
local core = core
local M = {}
M.execution_history = {} -- per player: {timestamp, code_snippet, success, output/error}
local STARTUP_FILE = core.get_worldpath() .. DIR_DELIM .. "llm_startup.lua"
-- =============================================================
-- Helper functions
-- =============================================================
local function player_has_priv(name, priv)
local privs = core.get_player_privs(name) or {}
return privs[priv] == true
end
-- llm_root is a super-role: implies llm_dev and all others
local function has_llm_priv(name, priv)
if player_has_priv(name, "llm_root") then return true end
return player_has_priv(name, priv)
end
local function is_llm_root(name)
return player_has_priv(name, "llm_root")
end
-- =============================================================
-- Sandbox environment (for normal llm_dev / llm users)
-- =============================================================
local function create_sandbox_env(player_name)
local safe_core = {
-- Logging & Chat
log = core.log,
chat_send_player = core.chat_send_player,
-- Secure read access
get_node = core.get_node,
get_node_or_nil = core.get_node_or_nil,
find_node_near = core.find_node_near,
find_nodes_in_area = core.find_nodes_in_area,
get_meta = core.get_meta,
get_player_by_name = core.get_player_by_name,
get_connected_players = core.get_connected_players,
}
-- Block registration functions (require restart)
local function blocked_registration(name)
return function(...)
core.log("warning", ("[code_executor] Blocked registration call: %s by %s"):format(name, player_name))
core.chat_send_player(player_name, "Registrations are forbidden in sandbox mode.\nOnly llm_root may execute these persistently.")
return nil
end
end
safe_core.register_node = blocked_registration("register_node")
safe_core.register_craftitem = blocked_registration("register_craftitem")
safe_core.register_tool = blocked_registration("register_tool")
safe_core.register_craft = blocked_registration("register_craft")
safe_core.register_entity = blocked_registration("register_entity")
-- Allowed dynamic registrations (very restricted)
safe_core.register_chatcommand = core.register_chatcommand
safe_core.register_on_chat_message = core.register_on_chat_message
-- Safe standard libraries (without dangerous functions)
local env = {
-- Lua basics
assert = assert,
error = error,
pairs = pairs,
ipairs = ipairs,
next = next,
select = select,
type = type,
tostring = tostring,
tonumber = tonumber,
unpack = table.unpack or unpack,
-- Safe string/table/math functions
string = { byte=string.byte, char=string.char, find=string.find, format=string.format,
gmatch=string.gmatch, gsub=string.gsub, len=string.len, lower=string.lower,
match=string.match, rep=string.rep, reverse=string.reverse, sub=string.sub,
upper=string.upper },
table = { concat=table.concat, insert=table.insert, remove=table.remove, sort=table.sort },
math = math,
-- Minetest-safe API
core = safe_core,
-- Redirect print
print = function(...) end, -- will be overwritten later
}
-- Output buffer with limit
local output_buffer = {}
local output_size = 0
local MAX_OUTPUT = 100000 -- ~100 KB
env.print = function(...)
local parts = {}
for i = 1, select("#", ...) do
parts[i] = tostring(select(i, ...))
end
local line = table.concat(parts, "\t")
if output_size + #line > MAX_OUTPUT then
table.insert(output_buffer, "\n[OUTPUT TRUNCATED 100 KB limit reached]")
return
end
table.insert(output_buffer, line)
output_size = output_size + #line
end
return env, output_buffer
end
-- =============================================================
-- Append persistent startup code (llm_root only)
-- =============================================================
local function append_to_startup(code, player_name)
local f, err = io.open(STARTUP_FILE, "a")
if not f then
core.log("error", ("[code_executor] Cannot open startup file: %s"):format(tostring(err)))
return false, err
end
f:write(("\n-- Added by %s at %s\n"):format(player_name, os.date("%Y-%m-%d %H:%M:%S")))
f:write(code)
f:write("\n\n")
f:close()
core.log("action", ("[code_executor] Appended code to %s by %s"):format(STARTUP_FILE, player_name))
return true
end
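--[[ Sketch of what llm_startup.lua accumulates over time (player name,
timestamp, and node name are illustrative; the file is presumably loaded via
dofile at server start, outside this module):

-- Added by alice at 2026-01-01 12:00:00
core.register_node("llm_connect:example", {
    description = "Example",
    tiles = {"default_stone.png"},
})
]]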
-- =============================================================
-- Main execution function
-- =============================================================
function M.execute(player_name, code, options)
options = options or {}
local result = { success = false }
if type(code) ~= "string" or code:trim() == "" then
result.error = "No or empty code provided"
return result
end
local is_root = is_llm_root(player_name)
local use_sandbox = options.sandbox ~= false
local allow_persist = options.allow_persist or is_root
-- Check whether the player has execution rights at all
if not has_llm_priv(player_name, "llm_dev") then
result.error = "Missing privilege: llm_dev (or llm_root)"
return result
end
-- =============================================
-- 1. Compile
-- =============================================
local func, compile_err = loadstring(code, "=(llm_ide)")
if not func then
result.error = "Compile error: " .. tostring(compile_err)
core.log("warning", ("[code_executor] Compile failed for %s: %s"):format(player_name, result.error))
return result
end
-- =============================================
-- 2. Prepare environment & print redirection
-- =============================================
local output_buffer = {}
local env
local old_print = print -- hoisted to this scope so it can be restored after execution
if use_sandbox then
    env, output_buffer = create_sandbox_env(player_name)
    setfenv(func, env) -- Lua 5.1 compatibility (Luanti mostly uses LuaJIT)
else
    -- Unrestricted mode → careful!
    if not is_root then
        result.error = "Unrestricted execution only allowed for llm_root"
        return result
    end
    -- Redirect the global print so output is captured in the buffer
    print = function(...)
        local parts = {}
        for i = 1, select("#", ...) do parts[#parts+1] = tostring(select(i, ...)) end
        local line = table.concat(parts, "\t")
        table.insert(output_buffer, line)
    end
end
-- =============================================
-- 3. Execute (with instruction limit)
-- =============================================
local ok, exec_res = pcall(function()
-- Instruction limit could be added here later (currently dummy)
return func()
end)
-- Reset print (if unrestricted)
if not use_sandbox then
print = old_print
end
-- =============================================
-- 4. Process result
-- =============================================
result.output = table.concat(output_buffer, "\n")
if ok then
result.success = true
result.return_value = exec_res
core.log("action", ("[code_executor] Success by %s (sandbox=%s)"):format(player_name, tostring(use_sandbox)))
else
result.error = "Runtime error: " .. tostring(exec_res)
core.log("warning", ("[code_executor] Execution failed for %s: %s"):format(player_name, result.error))
end
-- =============================================
-- 5. Check for registrations → Persistence?
-- =============================================
local has_registration = code:match("register_node%s*%(") or
code:match("register_tool%s*%(") or
code:match("register_craftitem%s*%(") or
code:match("register_entity%s*%(") or
code:match("register_craft%s*%(")
if has_registration then
if allow_persist and is_root then
local saved, save_err = append_to_startup(code, player_name)
if saved then
local msg = "Code with registrations saved to llm_startup.lua.\nWill be active after server restart."
core.chat_send_player(player_name, msg)
result.output = (result.output or "") .. "\n\n" .. msg
result.persisted = true
else
result.error = (result.error or "") .. "\nPersistence failed: " .. tostring(save_err)
end
else
local msg = "Code contains registrations (node/tool/...).\nOnly llm_root can execute these persistently (restart required)."
core.chat_send_player(player_name, msg)
result.error = (result.error or "") .. "\n" .. msg
result.success = false -- even if execution was ok
end
end
-- Save history
M.execution_history[player_name] = M.execution_history[player_name] or {}
table.insert(M.execution_history[player_name], {
timestamp = os.time(),
code = code:sub(1, 200) .. (code:len() > 200 and "..." or ""),
success = result.success,
output = result.output,
error = result.error,
})
return result
end
-- =============================================================
-- History functions
-- =============================================================
function M.get_history(player_name, limit)
limit = limit or 10
local hist = M.execution_history[player_name] or {}
local res = {}
local start = math.max(1, #hist - limit + 1)
for i = start, #hist do
res[#res+1] = hist[i]
end
return res
end
function M.clear_history(player_name)
M.execution_history[player_name] = nil
end
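--[[ Illustrative call sequence (the player name is hypothetical and assumed
to hold the llm_dev privilege, otherwise M.execute refuses):

local res = M.execute("alice", 'print("hi")', { sandbox = true })
-- res.success is true and res.output contains "hi"
local last = M.get_history("alice", 1)[1]
-- last.code holds at most the first 200 characters of the snippet
]]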
-- Cleanup
core.register_on_leaveplayer(function(player)
local name = player:get_player_name()
M.execution_history[name] = nil
end)
return M

255
config_gui.lua Normal file
View File

@@ -0,0 +1,255 @@
-- config_gui.lua
-- LLM API Configuration GUI (llm_root only)
-- v0.8.1: Added timeout field for better control
local core = core
local M = {}
local function has_priv(name, priv)
local p = core.get_player_privs(name) or {}
return p[priv] == true
end
local function get_llm_api()
if not _G.llm_api then
error("[config_gui] llm_api not available")
end
return _G.llm_api
end
function M.show(name)
if not has_priv(name, "llm_root") then
core.chat_send_player(name, "Missing privilege: llm_root")
return
end
local llm_api = get_llm_api()
local cfg = llm_api.config
local W, H = 14.0, 14.5
local PAD = 0.3
local HEADER_H = 0.8
local FIELD_H = 0.8
local BTN_H = 0.9
local fs = {
"formspec_version[6]",
"size[" .. W .. "," .. H .. "]",
"bgcolor[#0f0f0f;both]",
"style_type[*;bgcolor=#1a1a1a;textcolor=#e0e0e0;font=mono]",
}
-- Header
table.insert(fs, "box[0,0;" .. W .. "," .. HEADER_H .. ";#202020]")
table.insert(fs, "label[" .. PAD .. "," .. (HEADER_H/2 - 0.2) .. ";LLM Configuration (llm_root only)]")
table.insert(fs, "label[" .. (W - 4) .. "," .. (HEADER_H/2 - 0.2) .. ";" .. os.date("%H:%M") .. "]")
local y = HEADER_H + PAD * 2
-- API Key
table.insert(fs, "label[" .. PAD .. "," .. y .. ";API Key:]")
y = y + 0.5
table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. (W - PAD*2) .. "," .. FIELD_H .. ";api_key;;" .. core.formspec_escape(cfg.api_key or "") .. "]")
table.insert(fs, "style[api_key;bgcolor=#1e1e1e]")
y = y + FIELD_H + PAD
-- API URL
table.insert(fs, "label[" .. PAD .. "," .. y .. ";API URL:]")
y = y + 0.5
table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. (W - PAD*2) .. "," .. FIELD_H .. ";api_url;;" .. core.formspec_escape(cfg.api_url or "") .. "]")
table.insert(fs, "style[api_url;bgcolor=#1e1e1e]")
y = y + FIELD_H + PAD
-- Model
table.insert(fs, "label[" .. PAD .. "," .. y .. ";Model:]")
y = y + 0.5
table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. (W - PAD*2) .. "," .. FIELD_H .. ";model;;" .. core.formspec_escape(cfg.model or "") .. "]")
table.insert(fs, "style[model;bgcolor=#1e1e1e]")
y = y + FIELD_H + PAD
-- Max Tokens & Temperature (side by side)
table.insert(fs, "label[" .. PAD .. "," .. y .. ";Max Tokens:]")
table.insert(fs, "label[" .. (W/2 + PAD) .. "," .. y .. ";Temperature:]")
y = y + 0.5
local half_w = (W - PAD*3) / 2
table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. half_w .. "," .. FIELD_H .. ";max_tokens;;" .. tostring(cfg.max_tokens or 4000) .. "]")
table.insert(fs, "style[max_tokens;bgcolor=#1e1e1e]")
table.insert(fs, "field[" .. (W/2 + PAD) .. "," .. y .. ";" .. half_w .. "," .. FIELD_H .. ";temperature;;" .. tostring(cfg.temperature or 0.7) .. "]")
table.insert(fs, "style[temperature;bgcolor=#1e1e1e]")
y = y + FIELD_H + PAD
-- Timeout field (new in v0.8.1)
table.insert(fs, "label[" .. PAD .. "," .. y .. ";Timeout (seconds):]")
y = y + 0.5
table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. half_w .. "," .. FIELD_H .. ";timeout;;" .. tostring(cfg.timeout or 120) .. "]")
table.insert(fs, "style[timeout;bgcolor=#1e1e1e]")
table.insert(fs, "tooltip[timeout;Global fallback timeout (30-600s). Per-mode settings below take precedence.]")
y = y + FIELD_H + PAD
-- Per-mode timeout overrides
table.insert(fs, "label[" .. PAD .. "," .. y .. ";Per-mode timeout overrides (0 = use global):]")
y = y + 0.5
local third_w = (W - PAD * 2 - 0.2 * 2) / 3
local function tx(i) return PAD + i * (third_w + 0.2) end
table.insert(fs, "label[" .. tx(0) .. "," .. y .. ";Chat:]")
table.insert(fs, "label[" .. tx(1) .. "," .. y .. ";IDE:]")
table.insert(fs, "label[" .. tx(2) .. "," .. y .. ";WorldEdit:]")
y = y + 0.45
table.insert(fs, "field[" .. string.format("%.2f", tx(0)) .. "," .. y .. ";" .. string.format("%.2f", third_w) .. "," .. FIELD_H .. ";timeout_chat;;" .. tostring(cfg.timeout_chat or 0) .. "]")
table.insert(fs, "style[timeout_chat;bgcolor=#1e1e1e]")
table.insert(fs, "tooltip[timeout_chat;Chat mode timeout (0 = global)]")
table.insert(fs, "field[" .. string.format("%.2f", tx(1)) .. "," .. y .. ";" .. string.format("%.2f", third_w) .. "," .. FIELD_H .. ";timeout_ide;;" .. tostring(cfg.timeout_ide or 0) .. "]")
table.insert(fs, "style[timeout_ide;bgcolor=#1e1e1e]")
table.insert(fs, "tooltip[timeout_ide;IDE mode timeout (0 = global)]")
table.insert(fs, "field[" .. string.format("%.2f", tx(2)) .. "," .. y .. ";" .. string.format("%.2f", third_w) .. "," .. FIELD_H .. ";timeout_we;;" .. tostring(cfg.timeout_we or 0) .. "]")
table.insert(fs, "style[timeout_we;bgcolor=#1e1e1e]")
table.insert(fs, "tooltip[timeout_we;WorldEdit mode timeout (0 = global)]")
y = y + FIELD_H + PAD * 2
-- WEA toggle + separator
table.insert(fs, "box[" .. PAD .. "," .. y .. ";" .. (W - PAD*2) .. ",0.02;#333333]")
y = y + 0.18
local wea_val = core.settings:get_bool("llm_worldedit_additions", true)
local wea_label = "Enable WorldEditAdditions tools (torus, ellipsoid, erode, convolve...)"
local wea_is_installed = type(worldeditadditions) == "table"
if not wea_is_installed then
wea_label = wea_label .. " [WEA mod not detected]"
end
table.insert(fs, "checkbox[" .. PAD .. "," .. y .. ";wea_enabled;" .. core.formspec_escape(wea_label) .. ";" .. (wea_val and "true" or "false") .. "]")
y = y + 0.55 + PAD
-- 4 buttons evenly distributed: Save, Reload, Test, Close
local btn_count = 4
local btn_spacing = 0.2
local btn_w = (W - PAD * 2 - btn_spacing * (btn_count - 1)) / btn_count
local function bx(i) return PAD + i * (btn_w + btn_spacing) end
table.insert(fs, "button[" .. string.format("%.2f", bx(0)) .. "," .. y .. ";" .. string.format("%.2f", btn_w) .. "," .. BTN_H .. ";save;Save Config]")
table.insert(fs, "button[" .. string.format("%.2f", bx(1)) .. "," .. y .. ";" .. string.format("%.2f", btn_w) .. "," .. BTN_H .. ";reload;Reload]")
table.insert(fs, "button[" .. string.format("%.2f", bx(2)) .. "," .. y .. ";" .. string.format("%.2f", btn_w) .. "," .. BTN_H .. ";test;Test Connection]")
table.insert(fs, "style[close;bgcolor=#3a1a1a;textcolor=#ffaaaa]")
table.insert(fs, "button[" .. string.format("%.2f", bx(3)) .. "," .. y .. ";" .. string.format("%.2f", btn_w) .. "," .. BTN_H .. ";close;✕ Close]")
y = y + BTN_H + PAD
-- Info label
table.insert(fs, "label[" .. PAD .. "," .. y .. ";Note: changes apply to the running server only. Edit minetest.conf to persist them.]")
core.show_formspec(name, "llm_connect:config", table.concat(fs))
end
function M.handle_fields(name, formname, fields)
if not formname:match("^llm_connect:config") then
return false
end
if not has_priv(name, "llm_root") then
return true
end
local llm_api = get_llm_api()
-- WEA checkbox: instant toggle (no Save needed)
if fields.wea_enabled ~= nil then
local val = fields.wea_enabled == "true"
core.settings:set_bool("llm_worldedit_additions", val)
core.chat_send_player(name, "[LLM] WorldEditAdditions tools: " .. (val and "enabled" or "disabled"))
M.show(name)
return true
end
if fields.save then
-- Validation
local max_tokens = tonumber(fields.max_tokens)
local temperature = tonumber(fields.temperature)
local timeout = tonumber(fields.timeout)
if not max_tokens or max_tokens < 1 or max_tokens > 100000 then
core.chat_send_player(name, "[LLM] Error: max_tokens must be between 1 and 100000")
return true
end
if not temperature or temperature < 0 or temperature > 2 then
core.chat_send_player(name, "[LLM] Error: temperature must be between 0 and 2")
return true
end
if not timeout or timeout < 30 or timeout > 600 then
core.chat_send_player(name, "[LLM] Error: timeout must be between 30 and 600 seconds")
return true
end
local timeout_chat = tonumber(fields.timeout_chat) or 0
local timeout_ide = tonumber(fields.timeout_ide) or 0
local timeout_we = tonumber(fields.timeout_we) or 0
for _, t in ipairs({timeout_chat, timeout_ide, timeout_we}) do
if t ~= 0 and (t < 30 or t > 600) then
core.chat_send_player(name, "[LLM] Error: per-mode timeouts must be 0 (use global) or between 30 and 600 seconds")
return true
end
end
llm_api.set_config({
api_key = fields.api_key or "",
api_url = fields.api_url or "",
model = fields.model or "",
max_tokens = max_tokens,
temperature = temperature,
timeout = timeout,
timeout_chat = timeout_chat,
timeout_ide = timeout_ide,
timeout_we = timeout_we,
})
core.chat_send_player(name, "[LLM] Configuration updated (runtime only)")
core.log("action", "[llm_connect] Config updated by " .. name)
M.show(name)
return true
elseif fields.reload then
llm_api.reload_config()
core.chat_send_player(name, "[LLM] Configuration reloaded from settings")
core.log("action", "[llm_connect] Config reloaded by " .. name)
M.show(name)
return true
elseif fields.test then
-- Test LLM connection with a simple request
core.chat_send_player(name, "[LLM] Testing connection...")
local messages = {
{role = "user", content = "Reply with just the word 'OK' if you can read this."}
}
llm_api.request(messages, function(result)
if result.success then
core.chat_send_player(name, "[LLM] ✓ Connection test successful!")
core.chat_send_player(name, "[LLM] Response: " .. (result.content or "No content"))
else
core.chat_send_player(name, "[LLM] ✗ Connection test failed!")
core.chat_send_player(name, "[LLM] Error: " .. (result.error or "Unknown error"))
end
end, {timeout = 30})
return true
elseif fields.close or fields.quit then
-- Return to chat_gui
if _G.chat_gui then
_G.chat_gui.show(name)
else
core.close_formspec(name, "llm_connect:config")
end
return true
end
return true
end
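-- Illustrative helper (not part of this mod; it sketches how a consumer such
-- as llm_api could resolve the effective timeout from the values saved above,
-- where a per-mode value of 0 means "fall back to the global timeout"):
--
-- local function effective_timeout(cfg, mode)
--     local override = tonumber(cfg["timeout_" .. mode]) or 0 -- e.g. cfg.timeout_chat
--     if override > 0 then return override end
--     return cfg.timeout or 120
-- end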
return M

671
ide_gui.lua Normal file
View File

@@ -0,0 +1,671 @@
-- ide_gui.lua
-- Smart Lua IDE interface for LLM-Connect
-- v0.9.0: File manager with dropdown, save/load from dedicated snippets folder
local core = core
local M = {}
-- ======================================================
-- File Storage
-- ======================================================
-- Resolve paths at load time (like sethome/init.lua does) NOT lazily at runtime.
-- Under mod security, io.open works reliably when called with paths
-- resolved during the mod loading phase.
local SNIPPETS_DIR = (core.get_worldpath or minetest.get_worldpath)() .. "/llm_snippets"
local MKDIR_FN = core.mkdir or minetest.mkdir
-- Create snippets dir immediately at load time
if MKDIR_FN then
MKDIR_FN(SNIPPETS_DIR)
else
core.log("warning", "[ide_gui] mkdir not available; snippets dir may not exist")
end
core.log("action", "[ide_gui] snippets dir: " .. SNIPPETS_DIR)
local function get_snippets_dir()
return SNIPPETS_DIR
end
local function ensure_snippets_dir()
-- Dir was already created at load time; this is now a no-op that just returns the path
return SNIPPETS_DIR
end
-- Index file tracks all saved snippets (avoids core.get_dir_list which is unreliable under mod security)
local INDEX_PATH = SNIPPETS_DIR .. "/_index.txt"
local function get_index_path()
return INDEX_PATH
end
local function read_index()
local path = get_index_path()
local f = io.open(path, "r")
if not f then return {} end
local files = {}
for line in f:lines() do
line = line:match("^%s*(.-)%s*$")
if line ~= "" then
table.insert(files, line)
end
end
f:close()
table.sort(files)
return files
end
local function write_index(files)
local path = get_index_path()
local sorted = {}
for _, v in ipairs(files) do table.insert(sorted, v) end
table.sort(sorted)
-- deduplicate
local seen = {}
local deduped = {}
for _, v in ipairs(sorted) do
if not seen[v] then seen[v] = true; table.insert(deduped, v) end
end
local ok = core.safe_file_write(path, table.concat(deduped, "\n"))
return ok
end
local function index_add(filename)
local files = read_index()
local exists = false
for _, v in ipairs(files) do
if v == filename then exists = true; break end
end
if not exists then
table.insert(files, filename)
write_index(files)
end
end
local function index_remove(filename)
local files = read_index()
local new = {}
for _, v in ipairs(files) do
if v ~= filename then table.insert(new, v) end
end
write_index(new)
end
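--[[ Index round-trip sketch (filenames illustrative):

index_add("tower.lua")     -- _index.txt now lists tower.lua
index_add("bridge.lua")    -- entries are kept sorted and de-duplicated
index_remove("tower.lua")  -- only bridge.lua remains
local files = read_index() -- { "bridge.lua" }
]]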
-- One-time migration: if index is empty, probe known filenames via io.open
-- and rebuild the index from whatever is actually on disk.
-- Luanti doesn't give us reliable directory listing under mod security,
-- so we use a best-effort scan of any names we can discover.
local migration_done = false
local function maybe_migrate()
if migration_done then return end
migration_done = true
local idx = read_index()
if #idx > 0 then return end -- index already populated, nothing to do
-- We can't list the directory, but we can check for files the user
-- might have saved under common names in older versions.
local dir = ensure_snippets_dir()
local candidates = {"untitled.lua", "colorstones.lua", "test.lua", "init.lua", "startup.lua"}
local found = {}
for _, name in ipairs(candidates) do
local f = io.open(dir .. "/" .. name, "r")
if f then f:close(); table.insert(found, name) end
end
if #found > 0 then
write_index(found)
core.log("action", "[ide_gui] Migration: added " .. #found .. " existing snippets to index")
end
end
-- Public: returns sorted list of snippet filenames
local function list_snippet_files()
maybe_migrate()
return read_index()
end
local function read_file(filepath)
local f, err = io.open(filepath, "r")
if not f then
core.log("warning", ("[ide_gui] read_file failed: %s (%s)"):format(tostring(filepath), tostring(err)))
return nil, err
end
local content = f:read("*a")
f:close()
return content
end
local function write_file(filepath, content)
-- core.safe_file_write does atomic write, preferred for snippets
local ok = core.safe_file_write(filepath, content)
if not ok then
-- fallback to io.open
local f, err = io.open(filepath, "w")
if not f then return false, err end
f:write(content)
f:close()
end
return true
end
-- ======================================================
-- Module helpers
-- ======================================================
local function get_executor()
if not _G.executor then
error("[ide_gui] executor not available - init.lua failed?")
end
return _G.executor
end
local function get_llm_api()
if not _G.llm_api then
error("[ide_gui] llm_api not available - init.lua failed?")
end
return _G.llm_api
end
local prompts
local function get_prompts()
if not prompts then
local ok, p = pcall(dofile, core.get_modpath("llm_connect") .. "/ide_system_prompts.lua")
if not ok then
core.log("error", "[ide_gui] Failed to load prompts: " .. tostring(p))
prompts = {
SYNTAX_FIXER = "Fix syntax errors in this Lua/Minetest code. Return raw Lua only.",
SEMANTIC_ANALYZER = "Analyze this Minetest Lua code for logic errors.",
CODE_EXPLAINER = "Explain this Minetest Lua code simply.",
CODE_GENERATOR = "Generate clean Minetest Lua code based on the user request.",
}
else
prompts = p
end
end
return prompts
end
-- Session data per player
local sessions = {}
local DEFAULT_CODE = [[-- Welcome to Smart Lua IDE!
-- Write your Luanti mod code here.
core.register_node("example:test_node", {
description = "Test Node",
tiles = {"default_stone.png"},
groups = {cracky = 3},
})
]]
local function get_session(name)
if not sessions[name] then
sessions[name] = {
code = DEFAULT_CODE,
output = "Ready!\nUse the toolbar buttons or type a prompt and click Generate.",
guiding_active = false, -- Naming guide toggle (off by default)
filename = "untitled.lua",
pending_proposal = nil,
last_prompt = "",
last_modified = os.time(),
file_list = {},
selected_file = "",
}
sessions[name].file_list = list_snippet_files()
end
return sessions[name]
end
local function has_priv(name, priv)
local p = core.get_player_privs(name) or {}
return p[priv] == true
end
local function can_use_ide(name)
return has_priv(name, "llm_ide") or has_priv(name, "llm_dev") or has_priv(name, "llm_root")
end
local function can_execute(name)
return has_priv(name, "llm_dev") or has_priv(name, "llm_root")
end
local function is_root(name)
return has_priv(name, "llm_root")
end
-- ======================================================
-- Main Formspec
-- ======================================================
function M.show(name)
if not can_use_ide(name) then
core.chat_send_player(name, "Missing privilege: llm_ide (or higher)")
return
end
local session = get_session(name)
-- Refresh file list on every render
session.file_list = list_snippet_files()
local code_esc = core.formspec_escape(session.code or "")
local output_esc = core.formspec_escape(session.output or "")
local fn_esc = core.formspec_escape(session.filename or "untitled.lua")
local prompt_esc = core.formspec_escape(session.last_prompt or "")
local W, H = 19.2, 13.0
local PAD = 0.2
local HEADER_H = 0.8
local TOOL_H = 0.9
local FILE_H = 0.9
local PROMPT_H = 0.8
local STATUS_H = 0.6
local tool_y = HEADER_H + PAD
local file_y = tool_y + TOOL_H + PAD
local prompt_y = file_y + FILE_H + PAD
local work_y = prompt_y + PROMPT_H + PAD
local work_h = H - work_y - STATUS_H - PAD * 2
local col_w = (W - PAD * 3) / 2
local fs = {
"formspec_version[6]",
"size[" .. W .. "," .. H .. "]",
"bgcolor[#0f0f0f;both]",
"style_type[*;bgcolor=#1a1a1a;textcolor=#e8e8e8;font=mono]",
}
-- ── Header ───────────────────────────────────────────────
table.insert(fs, "box[0,0;" .. W .. "," .. HEADER_H .. ";#1e1e1e]")
table.insert(fs, "label[" .. PAD .. "," .. (HEADER_H/2 - 0.15) .. ";Smart Lua IDE | " .. fn_esc .. "]")
table.insert(fs, "label[" .. (W - 6.2) .. "," .. (HEADER_H/2 - 0.15) .. ";" .. os.date("%H:%M") .. "]")
table.insert(fs, "style[close_ide;bgcolor=#3a1a1a;textcolor=#ffaaaa]")
table.insert(fs, "button[" .. (W - PAD - 2.0) .. ",0.08;2.0,0.65;close_ide;✕ Close]")
-- ── Toolbar ───────────────────────────────────────────────
local bw = 1.85
local bp = 0.12
local bh = TOOL_H - 0.05
local x = PAD
local function add_btn(id, label, tip, enabled)
if not enabled then
table.insert(fs, "style[" .. id .. ";bgcolor=#444444;textcolor=#888888]")
end
table.insert(fs, "button[" .. x .. "," .. tool_y .. ";" .. bw .. "," .. bh .. ";" .. id .. ";" .. label .. "]")
if tip then table.insert(fs, "tooltip[" .. id .. ";" .. tip .. "]") end
x = x + bw + bp
end
add_btn("syntax", "Syntax", "Local syntax check + AI fix if errors found", true)
add_btn("analyze", "Analyze", "AI: find logic & API issues", true)
add_btn("explain", "Explain", "AI: explain the code in plain language", true)
add_btn("run", "▶ Run", can_execute(name) and "Execute in sandbox" or "Execute (needs llm_dev)", can_execute(name))
if session.pending_proposal then
table.insert(fs, "style[apply;bgcolor=#2a6a2a;textcolor=#ffffff]")
add_btn("apply", "✓ Apply", "Apply AI proposal into editor", true)
else
add_btn("apply", "Apply", "No pending proposal yet", false)
end
-- ── File Manager Row ──────────────────────────────────────
-- Layout: [Dropdown (files)] [Load] [Filename field] [Save] [New]
local files = session.file_list
local dd_str = #files > 0 and table.concat(files, ",") or "(no files)"
-- Find index for pre-selection
local dd_idx = 1
if session.selected_file ~= "" then
for i, f in ipairs(files) do
if f == session.selected_file then dd_idx = i; break end
end
end
local DD_W = 4.5
local BTN_SM = 1.4
local FN_W = W - PAD * 6 - DD_W - BTN_SM * 3
local fbh = FILE_H - 0.05
-- Dropdown
table.insert(fs, "dropdown[" .. PAD .. "," .. file_y .. ";" .. DD_W .. "," .. fbh
.. ";file_dropdown;" .. dd_str .. ";" .. dd_idx .. ";false]")
table.insert(fs, "tooltip[file_dropdown;Select a saved snippet]")
local fx = PAD + DD_W + PAD
-- Load button
table.insert(fs, "button[" .. fx .. "," .. file_y .. ";" .. BTN_SM .. "," .. fbh .. ";file_load;Load]")
table.insert(fs, "tooltip[file_load;Load selected file into editor]")
fx = fx + BTN_SM + PAD
-- Filename input
table.insert(fs, "field[" .. fx .. "," .. file_y .. ";" .. FN_W .. "," .. fbh .. ";filename_input;;" .. fn_esc .. "]")
table.insert(fs, "field_close_on_enter[filename_input;false]")
table.insert(fs, "style[filename_input;bgcolor=#1e1e1e;textcolor=#e8e8e8]")
table.insert(fs, "tooltip[filename_input;Filename to save as (auto-appends .lua)]")
fx = fx + FN_W + PAD
-- Save / New (root only)
if is_root(name) then
table.insert(fs, "style[file_save;bgcolor=#2a4a6a;textcolor=#ffffff]")
table.insert(fs, "button[" .. fx .. "," .. file_y .. ";" .. BTN_SM .. "," .. fbh .. ";file_save;Save]")
table.insert(fs, "tooltip[file_save;Save editor content as the given filename]")
fx = fx + BTN_SM + PAD
table.insert(fs, "button[" .. fx .. "," .. file_y .. ";" .. BTN_SM .. "," .. fbh .. ";file_new;New]")
table.insert(fs, "tooltip[file_new;Clear editor for a new file]")
end
-- ── Prompt Row ────────────────────────────────────────────
-- Layout: [Prompt field ............] [☐ Guide] [Generate]
local gen_w = 2.2
local guide_w = 3.2 -- checkbox + label
local pr_w = W - PAD * 4 - guide_w - gen_w
table.insert(fs, "field[" .. PAD .. "," .. prompt_y .. ";" .. pr_w .. "," .. PROMPT_H
.. ";prompt_input;;" .. prompt_esc .. "]")
table.insert(fs, "field_close_on_enter[prompt_input;false]")
table.insert(fs, "style[prompt_input;bgcolor=#1e1e1e;textcolor=#e8e8e8]")
table.insert(fs, "tooltip[prompt_input;Describe what code to generate, then click Generate]")
-- Naming guide toggle checkbox
local guide_on = session.guiding_active == true
local cx = PAD + pr_w + PAD
local guide_color = guide_on and "#1a3a1a" or "#252525"
table.insert(fs, "style[guide_toggle;bgcolor=" .. guide_color .. ";textcolor=#aaffaa]")
table.insert(fs, "checkbox[" .. cx .. "," .. (prompt_y + 0.15) .. ";guide_toggle;llm_connect: guide;" .. (guide_on and "true" or "false") .. "]")
table.insert(fs, "tooltip[guide_toggle;Inject naming convention guide into Generate calls.\nTeaches the LLM to use the llm_connect: prefix for registrations.]")
local gx = cx + guide_w + PAD
if can_execute(name) then
table.insert(fs, "style[generate;bgcolor=#2a4a6a;textcolor=#ffffff]")
else
table.insert(fs, "style[generate;bgcolor=#444444;textcolor=#888888]")
end
table.insert(fs, "button[" .. gx .. "," .. prompt_y .. ";" .. gen_w .. "," .. PROMPT_H
.. ";generate;Generate]")
table.insert(fs, "tooltip[generate;"
.. (can_execute(name) and "AI: generate code from your prompt" or "Generate (needs llm_dev)")
.. "]")
-- ── Editor & Output ───────────────────────────────────────
table.insert(fs, "style[code;bgcolor=#1e1e1e;textcolor=#e8e8e8;border=true]")
table.insert(fs, "textarea[" .. PAD .. "," .. work_y .. ";" .. (col_w - PAD) .. "," .. work_h
.. ";code;;" .. code_esc .. "]")
table.insert(fs, "style[output;bgcolor=#181818;textcolor=#cccccc;border=true]")
table.insert(fs, "textarea[" .. (PAD + col_w + PAD) .. "," .. work_y .. ";" .. (col_w - PAD) .. "," .. work_h
.. ";output;;" .. output_esc .. "]")
-- ── Status Bar ────────────────────────────────────────────
local sy = H - STATUS_H - PAD
table.insert(fs, "box[0," .. sy .. ";" .. W .. "," .. STATUS_H .. ";#1e1e1e]")
local status = "File: " .. fn_esc .. " | Modified: " .. os.date("%H:%M", session.last_modified)
if session.pending_proposal then
status = status .. " | ★ PROPOSAL READY: click Apply"
end
table.insert(fs, "label[" .. PAD .. "," .. (sy + 0.22) .. ";" .. status .. "]")
core.show_formspec(name, "llm_connect:ide", table.concat(fs))
end
-- ======================================================
-- Formspec Handler
-- ======================================================
function M.handle_fields(name, formname, fields)
if not formname:match("^llm_connect:ide") then return false end
if not can_use_ide(name) then return true end
local session = get_session(name)
local updated = false
-- Capture live editor/field state
if fields.code then session.code = fields.code; session.last_modified = os.time() end
if fields.prompt_input then session.last_prompt = fields.prompt_input end
if fields.guide_toggle ~= nil then
session.guiding_active = (fields.guide_toggle == "true")
M.show(name)
return true
end
if fields.filename_input and fields.filename_input ~= "" then
local fn = fields.filename_input:match("^%s*(.-)%s*$")
if fn ~= "" then
if not fn:match("%.lua$") then fn = fn .. ".lua" end
session.filename = fn
end
end
-- Dropdown: track selection
if fields.file_dropdown then
local val = fields.file_dropdown
if val ~= "(no files)" and val ~= "" then
-- index_event=false → val is the filename directly
-- Fallback: if val is a number string, resolve via file_list index
local as_num = tonumber(val)
if as_num and session.file_list and session.file_list[as_num] then
val = session.file_list[as_num]
end
session.selected_file = val
end
updated = true
end
-- ── File operations ───────────────────────────────────────
if fields.file_load then
local target = session.selected_file
if target == "" or target == "(no files)" then
session.output = "Please select a file in the dropdown first."
else
local path = ensure_snippets_dir() .. DIR_DELIM .. target
local content, read_err = read_file(path)
if content then
session.code = content
session.filename = target
session.last_modified = os.time()
session.output = "✓ Loaded: " .. target
else
session.output = "✗ Could not read: " .. target
.. "\nPath: " .. path
.. (read_err and ("\nError: " .. tostring(read_err)) or "")
-- Remove from index if file is gone
index_remove(target)
session.file_list = list_snippet_files()
end
end
updated = true
elseif fields.file_save and is_root(name) then
local fn = session.filename
if fn == "" then fn = "untitled.lua" end
if not fn:match("%.lua$") then fn = fn .. ".lua" end
fn = fn:match("([^/\\]+)$") or fn -- prevent path traversal
session.filename = fn
local path = ensure_snippets_dir() .. DIR_DELIM .. fn
local ok, err = write_file(path, session.code)
if ok then
index_add(fn)
session.output = "✓ Saved: " .. fn
session.last_modified = os.time()
session.file_list = list_snippet_files()
session.selected_file = fn
else
session.output = "✗ Save failed: " .. tostring(err)
end
updated = true
elseif fields.file_new and is_root(name) then
session.code = DEFAULT_CODE
session.filename = "untitled.lua"
session.last_modified = os.time()
session.pending_proposal = nil
session.output = "New file ready. Write code and save."
updated = true
-- ── Toolbar actions ───────────────────────────────────────
elseif fields.syntax then
M.check_syntax(name); return true
elseif fields.analyze then
M.analyze_code(name); return true
elseif fields.explain then
M.explain_code(name); return true
elseif fields.generate and can_execute(name) then
M.generate_code(name); return true
elseif fields.run and can_execute(name) then
M.run_code(name); return true
elseif fields.apply then
if session.pending_proposal then
session.code = session.pending_proposal
session.pending_proposal = nil
session.last_modified = os.time()
session.output = "✓ Applied proposal to editor."
else
session.output = "No pending proposal to apply."
end
updated = true
elseif fields.close_ide or fields.quit then
if _G.chat_gui then _G.chat_gui.show(name) end
return true
end
if updated then M.show(name) end
return true
end
-- ======================================================
-- Actions (AI)
-- ======================================================
function M.check_syntax(name)
local session = get_session(name)
local func, err = loadstring(session.code)
if func then
session.output = "✓ Syntax OK: no errors found."
M.show(name)
return
end
session.output = "✗ Syntax error:\n" .. tostring(err) .. "\n\nAsking AI to fix…"
M.show(name)
local p = get_prompts()
get_llm_api().code(p.SYNTAX_FIXER, session.code, function(result)
if result.success then
local fixed = result.content
fixed = fixed:match("```lua\n(.-)```") or fixed:match("```\n(.-)```") or fixed
session.pending_proposal = fixed
session.output = "AI fix proposal:\n\n" .. fixed .. "\n\n→ Press [Apply] to use."
else
session.output = "Syntax error:\n" .. tostring(err)
.. "\n\nAI fix failed: " .. (result.error or "?")
end
M.show(name)
end)
end
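The fence-stripping `match` chain above is reused by every AI action. A runnable sketch of the same pattern (the backticks are built at runtime so the sample itself can live inside a fenced block; the reply string is illustrative):

```lua
-- Sketch of the fence-stripping applied to AI replies (sample reply is made up):
local fence = string.rep(string.char(96), 3)      -- "```"
local reply = fence .. "lua\nprint('hi')\n" .. fence
local code = reply:match(fence .. "lua\n(.-)" .. fence)
    or reply:match(fence .. "\n(.-)" .. fence)
    or reply
print(code) -- print('hi')
```

Note the lazy `(.-)` capture keeps the trailing newline of the code body, which is harmless for `loadstring`.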
function M.analyze_code(name)
local session = get_session(name)
session.output = "Analyzing code… (please wait)"
M.show(name)
local p = get_prompts()
get_llm_api().code(p.SEMANTIC_ANALYZER, session.code, function(result)
if result.success then
local content = result.content
local code_part = content:match("```lua\n(.-)```") or content:match("```\n(.-)```")
local analysis = content:match("%-%-%[%[(.-)%]%]") or content
if code_part then
session.pending_proposal = code_part
session.output = "Analysis:\n" .. analysis .. "\n\n→ Improved code ready. Press [Apply]."
else
session.output = "Analysis:\n" .. content
end
else
session.output = "Error: " .. (result.error or "No response")
end
M.show(name)
end)
end
function M.explain_code(name)
local session = get_session(name)
session.output = "Explaining code… (please wait)"
M.show(name)
local p = get_prompts()
get_llm_api().code(p.CODE_EXPLAINER, session.code, function(result)
session.output = result.success and result.content or ("Error: " .. (result.error or "?"))
M.show(name)
end)
end
function M.generate_code(name)
local session = get_session(name)
local user_req = (session.last_prompt or ""):match("^%s*(.-)%s*$")
if user_req == "" then
session.output = "Please enter a prompt in the field above first."
M.show(name)
return
end
session.output = "Generating code… (please wait)"
M.show(name)
local p = get_prompts()
-- Append naming guide if toggle is active in session
local guide_addendum = ""
if session.guiding_active and p.NAMING_GUIDE then
guide_addendum = p.NAMING_GUIDE
end
local sys_msg = p.CODE_GENERATOR .. guide_addendum .. "\n\nUser request: " .. user_req
get_llm_api().code(sys_msg, session.code, function(result)
if result.success and result.content then
local gen = result.content
gen = gen:match("```lua\n(.-)```") or gen:match("```\n(.-)```") or gen
session.pending_proposal = gen
session.output = "Generated code proposal:\n\n" .. gen
.. "\n\n→ Press [Apply] to insert into editor."
else
session.output = "Generation failed: " .. (result.error or "No response")
end
M.show(name)
end)
end
function M.run_code(name)
local session = get_session(name)
local executor = get_executor()
session.output = "Executing… (please wait)"
M.show(name)
local res = executor.execute(name, session.code, {sandbox = true})
if res.success then
local out = "✓ Execution successful.\n\nOutput:\n"
.. (res.output ~= "" and res.output or "(no output)")
if res.return_value then out = out .. "\n\nReturn: " .. tostring(res.return_value) end
if res.persisted then out = out .. "\n\n→ Startup file updated (restart needed)" end
session.output = out
else
session.output = "✗ Execution failed:\n" .. (res.error or "Unknown error")
if res.output and res.output ~= "" then
session.output = session.output .. "\n\nOutput before error:\n" .. res.output
end
end
M.show(name)
end
-- Cleanup
core.register_on_leaveplayer(function(player)
sessions[player:get_player_name()] = nil
end)
return M

40
ide_languages.lua Normal file

@@ -0,0 +1,40 @@
local LANGUAGE_NAMES = {
en = "English",
de = "German",
es = "Spanish",
fr = "French",
it = "Italian",
pt = "Portuguese",
ru = "Russian",
zh = "Chinese",
ja = "Japanese",
ko = "Korean",
ar = "Arabic",
hi = "Hindi",
tr = "Turkish",
nl = "Dutch",
pl = "Polish",
sv = "Swedish",
da = "Danish",
no = "Norwegian",
fi = "Finnish",
cs = "Czech",
hu = "Hungarian",
ro = "Romanian",
el = "Greek",
th = "Thai",
vi = "Vietnamese",
id = "Indonesian",
ms = "Malay",
he = "Hebrew",
bn = "Bengali",
uk = "Ukrainian",
}
function get_language_name(code)
return LANGUAGE_NAMES[code] or "English"
end
return {
get_language_name = get_language_name
}
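The lookup-with-fallback above can be exercised standalone. A minimal sketch (table truncated; the real module maps about 30 ISO 639-1 codes):

```lua
-- Truncated copy of the lookup-with-fallback pattern from ide_languages.lua:
local LANGUAGE_NAMES = { en = "English", de = "German", ja = "Japanese" }

local function get_language_name(code)
    return LANGUAGE_NAMES[code] or "English"
end

print(get_language_name("ja")) -- Japanese
print(get_language_name("xx")) -- English (unknown codes fall back)
```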

120
ide_system_prompts.lua Normal file

@@ -0,0 +1,120 @@
-- smart_lua_ide/prompts.lua
-- System prompts for different AI assistant modes
local prompts = {}
prompts.SYNTAX_FIXER = [[You are a Lua syntax corrector specialized in Minetest/Luanti mod development.
Your task: Fix ONLY syntax errors in the provided code.
Rules:
1. Return ONLY the corrected Lua code
2. NO explanations, NO markdown blocks, NO comments
3. Preserve the original logic and structure
4. Fix: missing 'end', unmatched parentheses, typos in keywords, etc.
5. Do NOT refactor or optimize - only fix syntax
6. Do NOT add any filesystem/network/system access
Output format: Raw Lua code only.]]
prompts.SEMANTIC_ANALYZER = [=[You are a Lua code analyzer for Minetest/Luanti mods.
Your task: Analyze code for logic errors, API misuse, and improvements.
Context:
- Minetest Lua API version 5.x
- Common APIs: core.register_node, core.register_tool, core.register_chatcommand
- Deprecated functions should be flagged
Security rules:
- Do NOT introduce os/io/debug/require/dofile/loadfile/package
- Do NOT introduce core.request_http_api or core.request_insecure_environment
Output format:
1. First, provide the CORRECTED CODE
2. Then, add a comment block explaining:
- What was wrong
- What was changed
- Why it matters
Example format:
-- [CORRECTED CODE HERE]
--[[ ANALYSIS:
- ...
]]
]=]
prompts.CODE_EXPLAINER = [[You are a Minetest/Luanti mod development tutor.
Your task: Explain the provided Lua code in simple terms.
Focus on:
1. What the code does (high-level)
2. Key Minetest API calls and their purpose
3. Potential issues or improvements
4. Best practices being followed/violated
Be concise but educational.]]
prompts.CODE_GENERATOR = [[You are a Minetest/Luanti mod code generator.
Your task: Generate clean, functional Lua code based on the user's request.
Requirements:
1. Use modern Minetest API (5.x+)
2. Include error handling where appropriate
3. Add brief inline comments for complex logic
4. Follow Minetest coding conventions
5. Return ONLY executable Lua code
Security requirements (important):
- Do NOT use os/io/debug/package/require/dofile/loadfile
- Do NOT use core.request_http_api or core.request_insecure_environment
- Avoid privilege/auth manipulation APIs
Output: Raw Lua code ready to execute.]]
prompts.REFACTORER = [[You are a code refactoring expert for Minetest/Luanti mods.
Your task: Improve code quality without changing functionality.
Improvements:
1. Better variable names
2. Extract repeated code into functions
3. Optimize performance (e.g., caching, avoiding repeated lookups)
4. Improve readability and structure
5. Add helpful comments
Security requirements:
- Do NOT add os/io/debug/package/require/dofile/loadfile
- Do NOT add core.request_http_api or core.request_insecure_environment
Output:
1. Refactored code
2. Brief comment explaining major changes]]
-- ============================================================
-- Naming Convention Guide (opt-in, injected when guide_toggle is active)
-- Appended to CODE_GENERATOR when llm_connect: prefix guide is enabled.
-- ============================================================
prompts.NAMING_GUIDE = [[
IMPORTANT Luanti/Minetest Naming Conventions for this IDE:
This code runs inside the "llm_connect" mod context.
REGISTRATIONS always use "llm_connect:" prefix:
Correct: core.register_node("llm_connect:my_stone", { ... })
Correct: core.register_craftitem("llm_connect:magic_dust", { ... })
Incorrect: core.register_node("mymod:my_stone", { ... }) -- fails
Incorrect: core.register_node("default:my_stone", { ... }) -- fails
LUA STDLIB: Luanti uses LuaJIT, not standard Lua:
No string:capitalize(); use str:sub(1,1):upper() .. str:sub(2)
No string:split(); use string.gmatch or manual parsing
READING other mods is always fine:
core.get_node(pos) -- ok
core.registered_nodes["default:stone"] -- ok
]]
return prompts

582
init.lua

@@ -1,422 +1,226 @@
-- ===========================================================================
-- LLM Connect Init v0.7.8
-- LLM Connect Init v0.9.0-dev
-- author: H5N3RG
-- license: LGPL-3.0-or-later
-- Fix: max_tokens type handling, fully configurable, robust JSON
-- Added: metadata for in-game commands
-- Enhancement: Dynamic metadata handling, player name in prompts
-- NEW: Automatic API endpoint completion for compatibility.
-- UPDATE: Configurable context sending
-- ===========================================================================
local core = core
local mod_dir = core.get_modpath("llm_connect")
-- Load HTTP API
-- === HTTP API ===
local http = core.request_http_api()
if not http then
core.log("error", "[llm_connect] HTTP API not available! Add 'llm_connect' to secure.http_mods in minetest.conf!")
return
end
-- === Load settings from menu / settingtypes.txt ===
local api_key = core.settings:get("llm_api_key") or ""
local api_url = core.settings:get("llm_api_url") or ""
local model_name = core.settings:get("llm_model") or ""
-- NEW Context Settings
local send_server_info = core.settings:get_bool("llm_context_send_server_info")
local send_mod_list = core.settings:get_bool("llm_context_send_mod_list")
local send_commands = core.settings:get_bool("llm_context_send_commands")
local send_player_pos = core.settings:get_bool("llm_context_send_player_pos")
local send_materials = core.settings:get_bool("llm_context_send_materials")
-- NEW: Function to check and complete the API endpoint
local function finalize_api_url(url)
if not url or url == "" then
return ""
end
-- 1. Remove trailing slash if present
local clean_url = url:gsub("/$", "")
-- Check whether the URL already contains a path component. Any '/' after
-- the host:port part is treated as a user-defined path and left untouched.
-- Find the end of the host/port part: the first '/' after the '://' slashes.
local protocol_end = clean_url:find("://")
local host_end = 0
if protocol_end then
host_end = clean_url:find("/", protocol_end + 3) -- Find the first slash after '://'
end
-- If no further slash is found (host_end is nil), it means only the base address (host:port) is present.
if not host_end then
-- Append the default OpenAI-compatible path
return clean_url .. "/v1/chat/completions"
end
-- If a path is found, use the URL as is.
return url
end
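The completion behavior is easiest to see with concrete inputs. A runnable sketch (logic copied from `finalize_api_url` above; the sample URLs are illustrative):

```lua
-- Runnable copy of finalize_api_url for illustration (sample URLs are made up):
local function finalize_api_url(url)
    if not url or url == "" then return "" end
    local clean_url = url:gsub("/$", "")
    local protocol_end = clean_url:find("://")
    local host_end = 0
    if protocol_end then
        host_end = clean_url:find("/", protocol_end + 3)
    end
    if not host_end then
        -- Bare host:port - append the default OpenAI-compatible path
        return clean_url .. "/v1/chat/completions"
    end
    return url
end

print(finalize_api_url("http://localhost:11434"))
-- http://localhost:11434/v1/chat/completions
print(finalize_api_url("http://localhost:11434/api/chat"))
-- http://localhost:11434/api/chat (user-defined path is kept)
```

Note the `host_end = 0` initialization means URLs without a `://` protocol prefix are returned unchanged, since `0` is truthy in Lua.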
-- Apply the auto-completion/finalization logic
api_url = finalize_api_url(api_url)
-- max_tokens type: default integer, override via settings
local setting_val = core.settings:get_bool("llm_max_tokens_integer")
local max_tokens_type = "integer"
if setting_val == false then
max_tokens_type = "float"
end
-- Storage for conversation history per player
local history = {}
local max_history = { ["default"] = 10 }
local metadata_cache = {} -- Cache for metadata to detect changes
-- Helper functions
local function get_history(name)
history[name] = history[name] or {}
return history[name]
end
local function get_max_history(name)
return max_history[name] or max_history["default"]
end
local function string_split(str, delim)
local res = {}
local i = 1
local str_len = #str
local delim_len = #delim
while i <= str_len do
local pos = string.find(str, delim, i, true)
if pos then
table.insert(res, string.sub(str, i, pos - 1))
i = pos + delim_len
else
table.insert(res, string.sub(str, i))
break
end
end
return res
end
-- Load optional context files
local mod_dir = core.get_modpath("llm_connect")
local llm_materials_context = nil
pcall(function()
llm_materials_context = dofile(mod_dir .. "/llm_materials_context.lua")
end)
local function read_file_content(filepath)
local f = io.open(filepath, "r")
if not f then return nil end
local content = f:read("*a")
f:close()
return content
end
local system_prompt_content = read_file_content(mod_dir .. "/system_prompt.txt") or ""
-- === Privileges ===
core.register_privilege("llm", { description = "Can chat with the LLM model", give_to_singleplayer=true, give_to_admin=true })
core.register_privilege("llm_root", { description = "Can configure the LLM API key, model, and endpoint", give_to_singleplayer=true, give_to_admin=true })
core.register_privilege("llm", {
description = "LLM Connect: /llm chat interface (chat mode only)",
give_to_singleplayer = true,
give_to_admin = true,
})
-- === Metadata Functions ===
local meta_data_functions = {}
local function get_username(player_name) return player_name or "Unknown Player" end
local function get_installed_mods()
local mods = {}
-- NOTE: core.get_mods is undocumented. Using core.get_modnames (documented) instead.
if core.get_modnames then
-- core.get_modnames returns a table of mod names, already sorted alphabetically.
mods = core.get_modnames()
core.register_privilege("llm_dev", {
description = "LLM Connect: Smart Lua IDE + sandbox code execution (whitelist limited)",
give_to_singleplayer = false,
give_to_admin = false,
})
core.register_privilege("llm_worldedit", {
description = "LLM Connect: WorldEdit agency (WE Single + WE Loop + material picker)",
give_to_singleplayer = false,
give_to_admin = false,
})
core.register_privilege("llm_root", {
description = "LLM Connect: Full access (implies llm + llm_dev + llm_worldedit). Config, unrestricted execution, persistent code.",
give_to_singleplayer = false,
give_to_admin = true,
})
-- === Load central LLM API module ===
local llm_api_ok, llm_api = pcall(dofile, mod_dir .. "/llm_api.lua")
if not llm_api_ok or not llm_api then
core.log("error", "[llm_connect] Failed to load llm_api.lua: " .. tostring(llm_api))
return
end
if not llm_api.init(http) then
core.log("error", "[llm_connect] Failed to initialize llm_api")
return
end
-- === Load code executor ===
local executor_ok, executor = pcall(dofile, mod_dir .. "/code_executor.lua")
if not executor_ok or not executor then
core.log("error", "[llm_connect] Failed to load code_executor.lua: " .. tostring(executor))
return
end
-- === Load GUI modules ===
local chat_gui_ok, chat_gui = pcall(dofile, mod_dir .. "/chat_gui.lua")
if not chat_gui_ok then
core.log("error", "[llm_connect] Failed to load chat_gui.lua: " .. tostring(chat_gui))
return
end
local ide_gui_ok, ide_gui = pcall(dofile, mod_dir .. "/ide_gui.lua")
if not ide_gui_ok then
core.log("error", "[llm_connect] Failed to load ide_gui.lua: " .. tostring(ide_gui))
return
end
local config_gui_ok, config_gui = pcall(dofile, mod_dir .. "/config_gui.lua")
if not config_gui_ok then
core.log("error", "[llm_connect] Failed to load config_gui.lua: " .. tostring(config_gui))
return
end
-- === Load helpers ===
local chat_context_ok, chat_context = pcall(dofile, mod_dir .. "/chat_context.lua")
if not chat_context_ok then
core.log("warning", "[llm_connect] chat_context.lua not loaded: " .. tostring(chat_context))
chat_context = nil
end
-- === Load WorldEdit agency module (optional dependency) ===
local we_agency_ok, we_agency = pcall(dofile, mod_dir .. "/llm_worldedit.lua")
if not we_agency_ok then
core.log("warning", "[llm_connect] llm_worldedit.lua failed to load: " .. tostring(we_agency))
we_agency = nil
elseif not we_agency.is_available() then
core.log("warning", "[llm_connect] WorldEdit not detected at load time; agency mode disabled")
core.log("warning", "[llm_connect] worldedit global type: " .. type(worldedit))
-- NOTE: we_agency is still set as a global; is_available() checks at runtime,
-- so WE buttons may still appear if worldedit loads later (should not happen with optional_depends)
end
-- === Load material picker ===
local picker_ok, material_picker = pcall(dofile, mod_dir .. "/material_picker.lua")
if not picker_ok then
core.log("warning", "[llm_connect] material_picker.lua not loaded: " .. tostring(material_picker))
material_picker = nil
end
-- === Make modules globally available ===
_G.chat_gui = chat_gui
_G.llm_api = llm_api
_G.executor = executor
_G.we_agency = we_agency
_G.material_picker = material_picker
_G.ide_gui = ide_gui
_G.config_gui = config_gui
-- === Startup code loader ===
local startup_file = core.get_worldpath() .. "/llm_startup.lua"
local function load_startup_code()
local f = io.open(startup_file, "r")
if f then
f:close()
core.log("action", "[llm_connect] Loading startup code from " .. startup_file)
local ok, err = pcall(dofile, startup_file)
if not ok then
core.log("error", "[llm_connect] Startup code error: " .. tostring(err))
core.log("error", "[llm_connect] Fix the error in llm_startup.lua and restart the server")
else
-- Fallback for extremely old versions
table.insert(mods,"Mod list not available (core.get_modnames missing)")
core.log("action", "[llm_connect] Startup code loaded successfully")
end
return mods
end
-- Function to collect chat commands
local function get_installed_commands()
local commands = {}
if core.chatcommands then
for name, cmd in pairs(core.chatcommands) do
if not name:match("^__builtin:") then
local desc = cmd.description or "No description"
table.insert(commands, "/" .. name .. " " .. (cmd.params or "") .. " - " .. desc)
end
end
table.sort(commands)
else
table.insert(commands, "Command list not available.")
core.log("action", "[llm_connect] No llm_startup.lua found (this is normal on first run)")
end
return commands
end
local function get_server_settings()
local settings = {
server_name = core.settings:get("server_name") or "Unnamed Server",
server_description= core.settings:get("server_description") or "No description",
motd = core.settings:get("motd") or "No MOTD set",
port = core.settings:get("port") or "Unknown",
gameid = (core.get_game_info and core.get_game_info().id) or core.settings:get("gameid") or "Unknown",
game_name = (core.get_game_info and core.get_game_info().name) or "Unknown",
worldpath = core.get_worldpath() or "Unknown",
mapgen = core.get_mapgen_setting("mg_name") or "Unknown",
}
return settings
end
function meta_data_functions.gather_context(player_name)
local context = {}
context.player = get_username(player_name)
context.installed_mods = get_installed_mods()
context.installed_commands = get_installed_commands()
context.server_settings = get_server_settings()
-- Add dynamic player data (e.g., position)
local player = core.get_player_by_name(player_name)
if player then
local pos = player:get_pos()
context.player_position = string.format("x=%.2f, y=%.2f, z=%.2f", pos.x, pos.y, pos.z)
else
context.player_position = "Unknown"
end
return context
end
-- Compute a simple hash for metadata to detect changes
local function compute_metadata_hash(context)
-- Hash calculation now depends on which fields are active to avoid unnecessary cache busts
local str = context.player
if send_server_info then str = str .. context.server_settings.server_name .. context.server_settings.worldpath end
if send_mod_list then str = str .. table.concat(context.installed_mods, ",") end
if send_commands then str = str .. table.concat(context.installed_commands, ",") end
if send_player_pos then str = str .. context.player_position end
-- Material context has its own hash in llm_materials_context.lua, so we don't include it here
return core.sha1(str)
end
load_startup_code()
-- === Chat Commands ===
core.register_chatcommand("llm_setkey", {
params = "<key> [url] [model]",
description = "Sets the API key, URL, and model for the LLM.",
privs = {llm_root=true},
func = function(name,param)
if not core.check_player_privs(name,{llm_root=true}) then return false,"No permission!" end
local parts = string_split(param," ")
if #parts==0 then return false,"Please provide API key!" end
api_key = parts[1]
if parts[2] then api_url = finalize_api_url(parts[2]) end -- Apply finalization here too
if parts[3] then model_name = parts[3] end
core.chat_send_player(name,"[LLM] API key, URL and model set. (URL auto-corrected if only host:port was provided.)")
return true
end,
})
core.register_chatcommand("llm_setmodel", {
params = "<model>",
description = "Sets the LLM model.",
privs = {llm_root=true},
func = function(name,param)
if not core.check_player_privs(name,{llm_root=true}) then return false,"No permission!" end
if param=="" then return false,"Provide a model name!" end
model_name = param
core.chat_send_player(name,"[LLM] Model set to '"..model_name.."'.")
return true
end,
})
core.register_chatcommand("llm_set_endpoint", {
params = "<url>",
description = "Sets the API endpoint URL.",
privs = {llm_root=true},
func = function(name,param)
if not core.check_player_privs(name,{llm_root=true}) then return false,"No permission!" end
if param=="" then return false,"Provide URL!" end
api_url = finalize_api_url(param) -- Apply finalization here
core.chat_send_player(name,"[LLM] API endpoint set to "..api_url.." (URL auto-corrected if only host:port was provided.)")
return true
end,
})
core.register_chatcommand("llm_set_context", {
params = "<count> [player]",
description = "Sets the max context length.",
privs = {llm_root=true},
func = function(name,param)
if not core.check_player_privs(name,{llm_root=true}) then return false,"No permission!" end
local parts = string_split(param," ")
local count = tonumber(parts[1])
local target_player = parts[2]
if not count or count<1 then return false,"Provide number > 0!" end
if target_player and target_player~="" then max_history[target_player]=count
else max_history["default"]=count end
core.chat_send_player(name,"[LLM] Context length set.")
return true
end,
})
core.register_chatcommand("llm_float", {
description = "Set max_tokens as float",
privs = {llm_root=true},
func = function(name)
max_tokens_type="float"
core.chat_send_player(name,"[LLM] max_tokens now sent as float.")
return true
end,
})
core.register_chatcommand("llm_integer", {
description = "Set max_tokens as integer",
privs = {llm_root=true},
func = function(name)
max_tokens_type="integer"
core.chat_send_player(name,"[LLM] max_tokens now sent as integer.")
return true
end,
})
core.register_chatcommand("llm_reset", {
description = "Resets conversation and context.",
privs = {llm=true},
func = function(name)
history[name] = {}
metadata_cache[name] = nil -- Reset metadata cache
core.chat_send_player(name,"[LLM] Conversation and metadata reset.")
end,
})
-- === Main Chat Command ===
core.register_chatcommand("llm", {
params = "<prompt>",
description = "Sends prompt to the LLM",
description = "Opens the LLM chat interface",
privs = {llm = true},
func = function(name)
chat_gui.show(name)
return true, "Opening LLM chat..."
end,
})
core.register_chatcommand("llm_msg", {
params = "<message>",
description = "Send a direct message to the LLM (text-only, no GUI)",
privs = {llm = true},
func = function(name, param)
if not core.check_player_privs(name,{llm=true}) then return false,"No permission!" end
if param=="" then return false,"Provide a prompt!" end
if api_key=="" or api_url=="" or model_name=="" then
return false,"[LLM] API key, URL, or Model not set! Check mod settings."
if not param or param == "" then
return false, "Usage: /llm_msg <your question>"
end
local player_history = get_history(name)
local max_hist = get_max_history(name)
-- Add player name to prompt for clarity
local user_prompt = "Player " .. name .. ": " .. param
table.insert(player_history,{role="user",content=user_prompt})
while #player_history>max_hist do table.remove(player_history,1) end
-- Gather and cache metadata
local context_data = meta_data_functions.gather_context(name)
local current_metadata_hash = compute_metadata_hash(context_data)
local needs_metadata_update = not metadata_cache[name] or metadata_cache[name].hash ~= current_metadata_hash
local messages = {}
-- Build dynamic system prompt with metadata
local dynamic_system_prompt = system_prompt_content
if needs_metadata_update then
local metadata_string = "\n\n--- METADATA ---\n" ..
"Player: " .. context_data.player .. "\n"
-- Conditional Player Position
if send_player_pos then
metadata_string = metadata_string .. "Player Position: " .. context_data.player_position .. "\n"
end
-- Conditional Server Info
if send_server_info then
metadata_string = metadata_string ..
"Server Name: " .. context_data.server_settings.server_name .. "\n" ..
"Server Description: " .. context_data.server_settings.server_description .. "\n" ..
"MOTD: " .. context_data.server_settings.motd .. "\n" ..
"Game: " .. context_data.server_settings.game_name .. " (" .. context_data.server_settings.gameid .. ")\n" ..
"Mapgen: " .. context_data.server_settings.mapgen .. "\n" ..
"World Path: " .. context_data.server_settings.worldpath .. "\n" ..
"Port: " .. context_data.server_settings.port .. "\n"
end
-- Conditional Mod List
if send_mod_list then
local mods_list_str = table.concat(context_data.installed_mods,", ")
if #context_data.installed_mods>10 then mods_list_str="(More than 10 installed mods: "..#context_data.installed_mods..")" end
metadata_string = metadata_string ..
"Installed Mods (" .. #context_data.installed_mods .. "): " .. mods_list_str .. "\n"
end
-- Conditional Command List
if send_commands then
local commands_list_str = table.concat(context_data.installed_commands, "\n")
metadata_string = metadata_string ..
"Available Commands:\n" .. commands_list_str .. "\n"
end
-- Conditional Materials Context
if send_materials and llm_materials_context and llm_materials_context.get_available_materials then
metadata_string = metadata_string ..
"\n--- AVAILABLE MATERIALS ---\n" .. llm_materials_context.get_available_materials()
end
dynamic_system_prompt = system_prompt_content .. metadata_string
metadata_cache[name] = { hash = current_metadata_hash, metadata = metadata_string }
local messages = {{role = "user", content = param}}
llm_api.request(messages, function(result)
if result.success then
core.chat_send_player(name, "[LLM] " .. (result.content or "(no response)"))
else
dynamic_system_prompt = system_prompt_content .. metadata_cache[name].metadata
core.chat_send_player(name, "[LLM] Error: " .. (result.error or "unknown error"))
end
table.insert(messages,{role="system",content=dynamic_system_prompt})
for _,msg in ipairs(player_history) do table.insert(messages,msg) end
-- === max_tokens handling with final JSON fix ===
local max_tokens_value = 2000
if max_tokens_type == "integer" then
max_tokens_value = math.floor(max_tokens_value)
else
max_tokens_value = tonumber(max_tokens_value)
end
local body = core.write_json({ model=model_name, messages=messages, max_tokens=max_tokens_value })
-- Force integer in JSON string if needed (important for Go backends)
if max_tokens_type == "integer" then
body = body:gsub('"max_tokens"%s*:%s*(%d+)%.0', '"max_tokens": %1')
end
core.log("action", "[llm_connect DEBUG] max_tokens_type = " .. max_tokens_type)
core.log("action", "[llm_connect DEBUG] max_tokens_value = " .. tostring(max_tokens_value))
core.log("action", "[llm_connect DEBUG] API URL used: " .. api_url) -- Log the final URL
-- Send HTTP request
http.fetch({
url = api_url,
post_data = body,
method = "POST",
extra_headers = {
"Content-Type: application/json",
"Authorization: Bearer " .. api_key
},
timeout = 90,
}, function(result)
if result.succeeded then
local response = core.parse_json(result.data)
local text = "(no answer)"
if response and response.choices and response.choices[1] and response.choices[1].message then
text = response.choices[1].message.content
table.insert(player_history,{role="assistant",content=text})
elseif response and response.message and response.message.content then
text = response.message.content
end
core.chat_send_player(name,"[LLM] "..text)
else
core.chat_send_player(name,"[LLM] Request failed: "..(result.error or "Unknown error"))
end
end)
end, {timeout = llm_api.get_timeout("chat")})
return true, "Request sent..."
end,
})
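The `max_tokens` integer fix in the old command body above works by patching the serialized JSON string after `core.write_json`. A standalone sketch of that repair (the sample body string is illustrative; `core.write_json` may emit Lua numbers as floats such as `2000.0`, which some strict backends reject for integer fields):

```lua
-- Illustrative sketch of the max_tokens JSON repair (sample body is made up):
local body = '{"model":"test","max_tokens": 2000.0}'
body = body:gsub('"max_tokens"%s*:%s*(%d+)%.0', '"max_tokens": %1')
print(body) -- {"model":"test","max_tokens": 2000}
```

The pattern only rewrites a literal `.0` suffix, so bodies that already carry an integer value pass through unchanged.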
core.register_chatcommand("llm_undo", {
description = "Undo the last WorldEdit agency operation",
privs = {llm = true},
func = function(name)
if not _G.we_agency then
return false, "WorldEdit agency module not loaded"
end
local res = _G.we_agency.undo(name)
return res.ok, "[LLM] " .. res.message
end,
})
core.register_chatcommand("llm_reload_startup", {
description = "Reload llm_startup.lua (WARNING: Cannot register new items!)",
privs = {llm_root = true},
func = function(name)
core.log("action", "[llm_connect] Manual startup reload triggered by " .. name)
core.chat_send_player(name, "[LLM] WARNING: Reloading startup code at runtime")
core.chat_send_player(name, "[LLM] New registrations will FAIL. Restart server for registrations.")
local f = io.open(startup_file, "r")
if f then
f:close()
local ok, err = pcall(dofile, startup_file)
if not ok then
core.chat_send_player(name, "[LLM] x Reload failed: " .. tostring(err))
return false, "Reload failed"
else
core.chat_send_player(name, "[LLM] Reloaded (restart needed for registrations)")
return true, "Code reloaded"
end
else
core.chat_send_player(name, "[LLM] x No llm_startup.lua found")
return false, "File not found"
end
end,
})
-- === Central formspec handler ===
core.register_on_player_receive_fields(function(player, formname, fields)
if not player then return false end
local name = player:get_player_name()
if formname:match("^llm_connect:chat") or formname:match("^llm_connect:material_picker") then
return chat_gui.handle_fields(name, formname, fields)
elseif formname:match("^llm_connect:ide") then
return ide_gui.handle_fields(name, formname, fields)
elseif formname:match("^llm_connect:config") then
return config_gui.handle_fields(name, formname, fields)
end
return false
end)
-- === Logging ===
core.log("action", "[llm_connect] LLM Connect v0.9.0 loaded")
if llm_api.is_configured() then
core.log("action", "[llm_connect] LLM API ready - model: " .. tostring(llm_api.config.model))
else
core.log("warning", "[llm_connect] LLM API not configured yet - use /llm and open Config button")
end


@@ -1,168 +0,0 @@
GNU LESSER GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
This version of the GNU Lesser General Public License incorporates
the terms and conditions of version 3 of the GNU General Public
License, supplemented by the additional permissions listed below.
0. Additional Definitions.
"This License" refers to version 3 of the GNU Lesser General Public
License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Library" refers to a covered work governed by this License,
other than an Application or a Combined Work as defined below.
An "Application" is any work that makes use of an interface provided
by the Library, but which is not otherwise based on the Library.
Defining a subclass of a class defined by the Library is deemed a mode
of using an interface provided by the Library.
A "Combined Work" is a work produced by combining or linking an
Application with the Library. The particular version of the Library
with which the Combined Work was made is also called the "Linked
Version".
The "Minimal Corresponding Source" for a Combined Work means the
Corresponding Source for the Combined Work, excluding any source code
for portions of the Combined Work that, considered in isolation, are
based on the Application, and not on the Linked Version.
The "Corresponding Application Code" for a Combined Work means the
object code and/or source code for the Application, including any data
and utility programs needed for reproducing the Combined Work from the
Application, but excluding the System Libraries of the Combined Work.
1. Exception to Section 3 of the GNU GPL.
You may convey a covered work under sections 3 and 4 of this License
without being bound by section 3 of the GNU GPL.
2. Conveying Modified Versions.
If you modify a copy of the Library, and, in your modifications, a
facility refers to a function or data to be supplied by an Application
that uses the facility (other than as an argument passed when the
facility is invoked), then you may convey a copy of the modified
version:
a) under this License, provided that you make a good faith effort to
ensure that, in the event an Application does not supply the
function or data, the facility still operates, and performs
whatever part of its purpose remains meaningful, or
b) under the GNU GPL, with none of the additional permissions of
this License applicable to that copy.
3. Object Code Incorporating Material from Library Header Files.
The object code form of an Application may incorporate material from
a header file that is part of the Library. You may convey such object
code under terms of your choice, provided that, if the incorporated
material is not limited to numerical parameters, data structure
layouts and accessors, or small macros, inline functions and templates
(ten or fewer lines in length), you do both of the following:
a) Give prominent notice with each copy of the object code that the
Library is used in it and that the Library and its use are
covered by this License.
b) Accompany the object code with a copy of the GNU GPL and this license
document.
4. Combined Works.
You may convey a Combined Work under terms of your choice that,
taken together, effectively do not restrict modification of the
portions of the Library contained in the Combined Work and reverse
engineering for debugging such modifications, if you also do each of
the following:
a) Give prominent notice with each copy of the Combined Work that
the Library is used in it and that the Library and its use are
covered by this License.
b) Accompany the Combined Work with a copy of the GNU GPL and this license
document.
c) For a Combined Work that displays copyright notices during
execution, include the copyright notice for the Library among
these notices, as well as a reference directing the user to the
copies of the GNU GPL and this license document.
d) Do one of the following:
0) Convey the Minimal Corresponding Source under the terms of this
License, and the Corresponding Application Code in a form
suitable for, and under terms that permit, the user to
recombine or relink the Application with a modified version of
the Linked Version to produce a modified Combined Work, in the
manner specified by section 6 of the GNU GPL for conveying
Corresponding Source.
1) Use a suitable shared library mechanism for linking with the
Library. A suitable mechanism is one that (a) uses at run time
a copy of the Library already present on the user's computer
system, and (b) will operate properly with a modified version
of the Library that is interface-compatible with the Linked
Version.
e) Provide Installation Information, but only if you would otherwise
be required to provide such information under section 6 of the GNU
GPL, and only to the extent that such information is
necessary to install and execute a modified version of the
Combined Work produced by recombining or relinking the
Application with a modified version of the Linked Version. (If
you use option 4d0, the Installation Information must accompany
the Minimal Corresponding Source and Corresponding Application
Code. If you use option 4d1, you must provide the Installation
Information in the manner specified by section 6 of the GNU GPL
for conveying Corresponding Source.)
5. Combined Libraries.
You may place library facilities that are a work based on the
Library side by side in a single library together with other library
facilities that are not Applications and are not covered by this
License, and convey such a combined library under terms of your
choice, if you do both of the following:
a) Accompany the combined library with a copy of the same work based
on the Library, uncombined with any other library facilities,
conveyed under the terms of this License.
b) Give prominent notice with the combined library that part of it
is a work based on the Library, and explaining where to find the
accompanying uncombined form of the same work.
6. Revised Versions of the GNU Lesser General Public License.
The Free Software Foundation may publish revised and/or new versions
of the GNU Lesser General Public License from time to time. Such new
versions will be similar in spirit to the present version, but may
differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the
Library as you received it specifies that a certain numbered version
of the GNU Lesser General Public License "or any later version"
applies to it, you have the option of following the terms and
conditions either of that published version or of any later version
published by the Free Software Foundation. If the Library as you
received it does not specify a version number of the GNU Lesser
General Public License, you may choose any version of the GNU Lesser
General Public License ever published by the Free Software Foundation.
If the Library as you received it specifies that a proxy can decide
whether future versions of the GNU Lesser General Public License shall
apply, that proxy's public statement of acceptance of any version is
permanent authorization for you to choose that version for the
Library.
314
llm_api.lua Normal file
@@ -0,0 +1,314 @@
-- llm_api.lua
-- Central LLM API interface for LLM-Connect (v0.8+)
local core = core
local M = {}
-- Internal states
M.http = nil
M.config = {
api_key = "",
api_url = "",
model = "",
max_tokens = 4000,
max_tokens_integer = true,
temperature = 0.7,
top_p = 0.9,
presence_penalty = 0.0,
frequency_penalty = 0.0,
timeout = 120, -- global fallback
timeout_chat = 0, -- 0 = use global
timeout_ide = 0,
timeout_we = 0,
language = "en",
language_repeat = 1,
-- context
context_max_history = 20,
-- ide
ide_naming_guide = true,
ide_include_run_output = true,
ide_context_mod_list = true,
ide_context_node_sample = true,
ide_max_code_context = 300,
-- worldedit
we_max_iterations = 6,
we_snapshot = true,
}
local language_instruction_cache = nil
-- ============================================================
-- Initialization
-- ============================================================
function M.init(http_api)
if not http_api then
core.log("error", "[llm_api] No HTTP API provided")
return false
end
M.http = http_api
-- Load settings once
M.reload_config()
return true
end
-- ============================================================
-- Configuration loading / updating
-- ============================================================
function M.reload_config()
-- Read exact keys from settingtypes.txt
M.config.api_key = core.settings:get("llm_api_key") or ""
M.config.api_url = core.settings:get("llm_api_url") or ""
M.config.model = core.settings:get("llm_model") or ""
M.config.max_tokens = tonumber(core.settings:get("llm_max_tokens")) or 4000
M.config.max_tokens_integer = core.settings:get_bool("llm_max_tokens_integer", true)
M.config.temperature = tonumber(core.settings:get("llm_temperature")) or 0.7
M.config.top_p = tonumber(core.settings:get("llm_top_p")) or 0.9
M.config.presence_penalty = tonumber(core.settings:get("llm_presence_penalty")) or 0.0
M.config.frequency_penalty = tonumber(core.settings:get("llm_frequency_penalty")) or 0.0
M.config.timeout = tonumber(core.settings:get("llm_timeout")) or 120
M.config.timeout_chat = tonumber(core.settings:get("llm_timeout_chat")) or 0
M.config.timeout_ide = tonumber(core.settings:get("llm_timeout_ide")) or 0
M.config.timeout_we = tonumber(core.settings:get("llm_timeout_we")) or 0
M.config.language = core.settings:get("llm_language") or "en"
M.config.language_repeat = tonumber(core.settings:get("llm_language_instruction_repeat")) or 1
M.config.context_max_history = tonumber(core.settings:get("llm_context_max_history")) or 20
M.config.ide_naming_guide = core.settings:get_bool("llm_ide_naming_guide", true)
M.config.ide_include_run_output = core.settings:get_bool("llm_ide_include_run_output", true)
M.config.ide_context_mod_list = core.settings:get_bool("llm_ide_context_mod_list", true)
M.config.ide_context_node_sample = core.settings:get_bool("llm_ide_context_node_sample", true)
M.config.ide_max_code_context = tonumber(core.settings:get("llm_ide_max_code_context")) or 300
M.config.we_max_iterations = tonumber(core.settings:get("llm_we_max_iterations")) or 6
M.config.we_snapshot = core.settings:get_bool("llm_we_snapshot_before_exec", true)
-- Invalidate cache
language_instruction_cache = nil
end
-- Returns the effective timeout for a given mode ("chat", "ide", "we").
-- Uses per-mode override if > 0, otherwise falls back to global llm_timeout.
function M.get_timeout(mode)
local override = 0
if mode == "chat" then override = M.config.timeout_chat
elseif mode == "ide" then override = M.config.timeout_ide
elseif mode == "we" then override = M.config.timeout_we
end
if override and override > 0 then return override end
return M.config.timeout
end
function M.set_config(updates)
for k, v in pairs(updates) do
if M.config[k] ~= nil then
M.config[k] = v
end
end
language_instruction_cache = nil
end
function M.is_configured()
return M.config.api_key ~= "" and
M.config.api_url ~= "" and
M.config.model ~= ""
end
-- ============================================================
-- Language instruction (cached)
-- ============================================================
local function get_language_instruction()
if language_instruction_cache then
return language_instruction_cache
end
local lang = M.config.language
local repeat_count = math.max(0, M.config.language_repeat or 1)
if lang == "en" or repeat_count == 0 then
language_instruction_cache = ""
return ""
end
local lang_name = "English"
local lang_mod_path = core.get_modpath("llm_connect") .. "/ide_languages.lua"
local ok, lang_mod = pcall(dofile, lang_mod_path)
if ok and lang_mod and lang_mod.get_language_name then
lang_name = lang_mod.get_language_name(lang) or lang_name
end
local instr = "Important: Answer exclusively in " .. lang_name .. "!\n" ..
"All explanations, code, comments, output and any text you generate must be in " .. lang_name .. "."
local parts = {}
for _ = 1, repeat_count do
table.insert(parts, instr)
end
language_instruction_cache = table.concat(parts, "\n\n") .. "\n\n"
return language_instruction_cache
end
-- ============================================================
-- Request Function
-- ============================================================
function M.request(messages, callback, options)
if not M.is_configured() then
callback({ success = false, error = "LLM API not configured (Check API Key/URL/Model)" })
return
end
options = options or {}
local cfg = M.config
local lang_instr = get_language_instruction()
if lang_instr ~= "" and (not messages[1] or messages[1].role ~= "system") then
table.insert(messages, 1, { role = "system", content = lang_instr })
end
local body_table = {
model = options.model or cfg.model,
messages = messages,
max_tokens = options.max_tokens or cfg.max_tokens,
temperature = options.temperature or cfg.temperature,
top_p = options.top_p or cfg.top_p,
presence_penalty = options.presence_penalty or cfg.presence_penalty,
frequency_penalty = options.frequency_penalty or cfg.frequency_penalty,
stream = options.stream == true,
}
if options.tools then
body_table.tools = options.tools
body_table.tool_choice = options.tool_choice or "auto"
end
local max_t = body_table.max_tokens
if cfg.max_tokens_integer then
body_table.max_tokens = math.floor(max_t)
else
body_table.max_tokens = tonumber(max_t)
end
local body = core.write_json(body_table)
if cfg.max_tokens_integer then
body = body:gsub('"max_tokens"%s*:%s*(%d+)%.0', '"max_tokens": %1')
end
if core.settings:get_bool("llm_debug") then
core.log("action", "[llm_api] Requesting " .. cfg.model .. " at " .. cfg.api_url)
end
M.http.fetch({
url = cfg.api_url,
method = "POST",
data = body,
timeout = options.timeout or cfg.timeout,
extra_headers = {
"Content-Type: application/json",
"Authorization: Bearer " .. cfg.api_key,
},
}, function(result)
if not result.succeeded then
local err = "HTTP request failed"
if result.timeout then
err = "Request timed out (limit: " .. tostring(options.timeout or cfg.timeout) .. "s)"
elseif result.code then
err = "HTTP " .. tostring(result.code)
elseif result.error then
-- Proxy-level errors (Envoy overflow, connection reset, etc.)
local raw = tostring(result.error)
if raw:find("overflow") or raw:find("reset") or raw:find("upstream") then
err = "Proxy/upstream error (possibly Mistral overload or rate limit). Retry in a moment."
else
err = raw
end
end
callback({ success = false, error = err, code = result.code })
return
end
-- Handle non-JSON responses (proxy errors often return plain text)
local raw_data = tostring(result.data or "")
if raw_data:find("upstream connect error") or raw_data:find("reset reason") then
callback({ success = false, error = "Proxy/upstream error: " .. raw_data:sub(1, 80) .. " possibly Mistral overload, retry in a moment." })
return
end
local response = core.parse_json(result.data)
if not response or type(response) ~= "table" then
local raw_preview = raw_data:sub(1, 120)
callback({ success = false, error = "Invalid JSON response: " .. raw_preview })
return
end
if response.error then
callback({
success = false,
error = response.error.message or "API error",
error_type = response.error.type,
code = response.error.code
})
return
end
local content = nil
-- Guard against providers that omit choices[1].message entirely
if response.choices and response.choices[1] and response.choices[1].message then
content = response.choices[1].message.content
elseif response.message and response.message.content then
content = response.message.content
end
local ret = {
success = content ~= nil,
content = content,
raw = response,
finish_reason = response.choices and response.choices[1] and response.choices[1].finish_reason,
usage = response.usage,
}
if response.choices and response.choices[1] and response.choices[1].message and response.choices[1].message.tool_calls then
ret.tool_calls = response.choices[1].message.tool_calls
end
if core.settings:get_bool("llm_debug") then
core.log("action", "[llm_api DEBUG] Raw response: " .. tostring(result.data or "no data"))
core.log("action", "[llm_api DEBUG] Parsed: " .. core.write_json(response or {}, true))
end
callback(ret)
end)
end
-- ============================================================
-- Helper Wrappers
-- ============================================================
function M.chat(messages, callback, options)
M.request(messages, callback, options)
end
function M.ask(system_prompt, user_message, callback, options)
local messages = {
{ role = "system", content = system_prompt },
{ role = "user", content = user_message },
}
M.request(messages, callback, options)
end
function M.code(system_prompt, code_block, callback, options)
local user_msg = "```lua\n" .. code_block .. "\n```"
M.ask(system_prompt, user_msg, callback, options)
end
return M
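For reference, a minimal sketch of how an `init.lua` might wire this module up. This is an illustration only, not the mod's actual bootstrap code: `core.request_http_api()` must be called at load time, and the server needs `secure.http_mods = llm_connect` in `minetest.conf` for it to return a table. The callback shape follows `M.request` above.

```lua
-- Hypothetical init.lua excerpt (illustration only): obtain the HTTP API
-- at load time and forward a single prompt through llm_api.lua.
local http = core.request_http_api and core.request_http_api()
local llm_api = dofile(core.get_modpath("llm_connect") .. "/llm_api.lua")

if llm_api.init(http) then
	llm_api.ask(
		"You are a helpful Luanti assistant.",   -- system prompt
		"How do I craft a furnace?",             -- user message
		function(result)
			if result.success then
				core.log("action", "[llm_connect] " .. result.content)
			else
				core.log("warning", "[llm_connect] " .. (result.error or "unknown error"))
			end
		end)
end
```

If `secure.http_mods` is not set, `core.request_http_api()` returns nil and `M.init` logs an error and returns false, so the guard above fails closed.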
llm_materials_context.lua (deleted)
@@ -1,79 +0,0 @@
-- mods/llm_connect/llm_materials_context.lua
local M = {}
-- Cache for materials to avoid recomputation
local materials_cache = nil
local materials_cache_hash = nil
-- Compute a hash for registered items to detect changes
local function compute_materials_hash()
local str = ""
for name, _ in pairs(core.registered_nodes) do str = str .. name end
for name, _ in pairs(core.registered_craftitems) do str = str .. name end
for name, _ in pairs(core.registered_tools) do str = str .. name end
for name, _ in pairs(core.registered_entities) do str = str .. name end
return core.sha1(str)
end
-- Function to collect available materials
function M.get_available_materials()
local current_hash = compute_materials_hash()
if materials_cache and materials_cache_hash == current_hash then
return materials_cache
end
local materials_info = {}
local current_mod_name = core.get_current_modname()
-- Collect nodes
for name, def in pairs(core.registered_nodes) do
if not name:match("^__builtin:") and not name:match("^ignore$") and not name:match("^air$") then
table.insert(materials_info, "Node: " .. name)
end
end
-- Collect craftitems
for name, def in pairs(core.registered_craftitems) do
if not name:match("^__builtin:") then
table.insert(materials_info, "Craftitem: " .. name)
end
end
-- Collect tools
for name, def in pairs(core.registered_tools) do
if not name:match("^__builtin:") then
table.insert(materials_info, "Tool: " .. name)
end
end
-- Collect entities
for name, def in pairs(core.registered_entities) do
if not name:match("^__builtin:") then
table.insert(materials_info, "Entity: " .. name)
end
end
-- Limit the output
local max_items_to_list = 50 -- Reduced for token efficiency
local total_items = #materials_info
local output_string = ""
if total_items > 0 then
output_string = "Registered materials (" .. total_items .. " in total):\n"
for i = 1, math.min(total_items, max_items_to_list) do
output_string = output_string .. " - " .. materials_info[i] .. "\n"
end
if total_items > max_items_to_list then
output_string = output_string .. " ... and " .. (total_items - max_items_to_list) .. " more materials (truncated).\n"
end
else
output_string = "No registered materials found.\n"
end
materials_cache = output_string
materials_cache_hash = current_hash
return output_string
end
return M
1315
llm_worldedit.lua Normal file
File diff suppressed because it is too large
371
material_picker.lua Normal file
@@ -0,0 +1,371 @@
-- material_picker.lua v2.0
-- Inventory-style material selection for the LLM WorldEdit context
--
-- UI: Tiles with item icons (item_image) + colored highlight when active
-- Search filter at the top, Toggle-All button, Remove-All button
-- Tiles are buttons → click toggles selection
--
-- PUBLIC API (used by chat_gui.lua / llm_worldedit.lua):
-- M.get_materials(player_name) → sorted list of node name strings
-- M.has_materials(player_name) → bool
-- M.build_material_context(player_name) → string for LLM system prompt
-- M.show(player_name) → open formspec
-- M.handle_fields(player_name, formname, fields) → bool
local core = core
local M = {}
-- ============================================================
-- Configuration
-- ============================================================
local COLS = 8 -- tiles per row
local TILE_SIZE = 1.4 -- tile width/height in formspec units
local TILE_PAD = 0.08 -- spacing between tiles
local MAX_NODES = 128 -- max candidates to render
-- ============================================================
-- Session state
-- ============================================================
local sessions = {}
local function get_session(name)
if not sessions[name] then
sessions[name] = {
materials = {}, -- [node_name] = true
filter = "",
page = 1, -- current page (pagination)
}
end
return sessions[name]
end
core.register_on_leaveplayer(function(player)
sessions[player:get_player_name()] = nil
end)
-- ============================================================
-- PUBLIC API
-- ============================================================
function M.get_materials(player_name)
local sess = get_session(player_name)
local list = {}
for node in pairs(sess.materials) do
table.insert(list, node)
end
table.sort(list)
return list
end
function M.has_materials(player_name)
local sess = get_session(player_name)
for _ in pairs(sess.materials) do return true end
return false
end
function M.build_material_context(player_name)
local mats = M.get_materials(player_name)
if #mats == 0 then return nil end
return table.concat({
"--- PLAYER-SELECTED BUILD MATERIALS ---",
"The player has explicitly chosen the following node(s) for this build.",
"Prefer these exact node names when generating tool_calls.",
"Nodes: " .. table.concat(mats, ", "),
"--- END MATERIALS ---",
}, "\n")
end
-- ============================================================
-- Registry filter
-- ============================================================
local function build_candidate_list(filter)
filter = (filter or ""):lower():trim()
local candidates = {}
for name, def in pairs(core.registered_nodes) do
if not name:match("^__builtin")
and name ~= "air" and name ~= "ignore"
then
if filter == ""
or name:lower():find(filter, 1, true)
or (def.description and def.description:lower():find(filter, 1, true))
then
table.insert(candidates, name)
end
end
end
table.sort(candidates)
return candidates
end
-- ============================================================
-- Formspec builder
-- ============================================================
-- Calculates page count
local function get_page_info(total, per_page, current_page)
local total_pages = math.max(1, math.ceil(total / per_page))
current_page = math.max(1, math.min(current_page, total_pages))
local first = (current_page - 1) * per_page + 1
local last = math.min(total, current_page * per_page)
return current_page, total_pages, first, last
end
local ITEMS_PER_PAGE = COLS * 6 -- 6 rows = 48 tiles per page
function M.show(player_name)
local sess = get_session(player_name)
local filter = sess.filter or ""
local candidates = build_candidate_list(filter)
local total = #candidates
local page, total_pages, first, last =
get_page_info(total, ITEMS_PER_PAGE, sess.page)
sess.page = page -- write corrected page back
local selected_count = 0
for _ in pairs(sess.materials) do selected_count = selected_count + 1 end
-- ── Dimensions ────────────────────────────────────────
local W = COLS * (TILE_SIZE + TILE_PAD) + 0.5
local HDR_H = 0.9
local SRCH_H = 0.7
local INFO_H = 0.4
local GRID_H = 6 * (TILE_SIZE + TILE_PAD)
local NAV_H = 0.7
local BTN_H = 0.75
local PAD = 0.25
local H = HDR_H + PAD + SRCH_H + PAD + INFO_H + PAD + GRID_H + PAD + NAV_H + PAD + BTN_H + PAD
local fs = {
"formspec_version[6]",
"size[" .. string.format("%.2f", W) .. "," .. string.format("%.2f", H) .. "]",
"bgcolor[#0d0d0d;both]",
"style_type[*;bgcolor=#181818;textcolor=#e0e0e0]",
}
-- ── Header ─────────────────────────────────────────────
table.insert(fs, "box[0,0;" .. string.format("%.2f", W) .. "," .. HDR_H .. ";#1e1e2e]")
table.insert(fs, "label[" .. PAD .. ",0.35;⚙ Build Materials — "
.. core.formspec_escape(player_name)
.. " (" .. selected_count .. " selected)]")
table.insert(fs, "style[close_picker;bgcolor=#3a1a1a;textcolor=#ffaaaa]")
table.insert(fs, "button[" .. string.format("%.2f", W - PAD - 2.0) .. ",0.12;2.0,0.65;close_picker;✕ Close]")
local y = HDR_H + PAD
-- ── Search field ────────────────────────────────────────
local field_w = W - PAD * 2 - 2.6
table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. string.format("%.2f", field_w) .. "," .. SRCH_H
.. ";filter;;" .. core.formspec_escape(filter) .. "]")
table.insert(fs, "style[filter;bgcolor=#111122;textcolor=#ccccff]")
table.insert(fs, "field_close_on_enter[filter;false]")
table.insert(fs, "style[do_filter;bgcolor=#1a1a3a;textcolor=#aaaaff]")
table.insert(fs, "button[" .. string.format("%.2f", PAD + field_w + 0.1) .. "," .. y
.. ";2.4," .. SRCH_H .. ";do_filter;⟳ Search]")
y = y + SRCH_H + PAD
-- ── Info row + Toggle-All ─────────────────────────────
local info_str
if total == 0 then
info_str = "No nodes found"
else
info_str = string.format("%d node(s) — page %d/%d", total, page, total_pages)
end
table.insert(fs, "label[" .. PAD .. "," .. (y + 0.05) .. ";" .. core.formspec_escape(info_str) .. "]")
-- Toggle-All button (select/deselect all on this page)
local page_nodes = {}
for i = first, last do
table.insert(page_nodes, candidates[i])
end
local page_all_selected = #page_nodes > 0
for _, n in ipairs(page_nodes) do
if not sess.materials[n] then page_all_selected = false; break end
end
local toggle_label = page_all_selected and "☑ Deselect Page" or "☐ Select Page"
local toggle_color = page_all_selected and "#2a4a2a" or "#333344"
table.insert(fs, "style[toggle_page;bgcolor=" .. toggle_color .. ";textcolor=#ccffcc]")
table.insert(fs, "button[" .. string.format("%.2f", W - PAD - 3.5) .. "," .. (y - 0.05)
.. ";3.5," .. INFO_H+0.1 .. ";toggle_page;" .. toggle_label .. "]")
y = y + INFO_H + PAD
-- ── Tile grid ────────────────────────────────────────
-- Each tile = item_image_button (icon) + colored background when active
local col = 0
local row = 0
local IMG = TILE_SIZE - 0.25
local STEP = TILE_SIZE + TILE_PAD
for idx, node_name in ipairs(page_nodes) do
local tx = PAD + col * STEP
local ty = y + row * STEP
local is_sel = sess.materials[node_name] == true
-- Background box: green if selected, dark if not
local bg_color = is_sel and "#1a3a1a" or "#1a1a1a"
table.insert(fs, "box[" .. string.format("%.2f,%.2f;%.2f,%.2f", tx, ty, TILE_SIZE, TILE_SIZE)
.. ";" .. bg_color .. "]")
-- Item image button (clickable, shows icon)
-- button name encodes the candidate index: "tile_N"
local btn_name = "tile_" .. tostring((page - 1) * ITEMS_PER_PAGE + idx)
-- item_image_button[x,y;w,h;item;name;label]
table.insert(fs, "item_image_button["
.. string.format("%.2f,%.2f;%.2f,%.2f", tx + 0.05, ty + 0.05, IMG, IMG)
.. ";" .. core.formspec_escape(node_name)
.. ";" .. btn_name .. ";]")
-- Checkmark label top-right when selected
if is_sel then
table.insert(fs, "label["
.. string.format("%.2f,%.2f", tx + TILE_SIZE - 0.38, ty + 0.18)
.. ";" .. core.colorize("#00ff00", "✔") .. "]")
end
-- Tooltip: node name
local def = core.registered_nodes[node_name]
local desc = (def and def.description and def.description ~= "")
and def.description or node_name
table.insert(fs, "tooltip[" .. btn_name .. ";"
.. core.formspec_escape(desc .. "\n" .. node_name) .. "]")
col = col + 1
if col >= COLS then
col = 0
row = row + 1
end
end
y = y + GRID_H + PAD
-- ── Navigation ─────────────────────────────────────────
local nav_btn_w = 2.2
if total_pages > 1 then
table.insert(fs, "style[page_prev;bgcolor=#222233;textcolor=#aaaaff]")
table.insert(fs, "button[" .. PAD .. "," .. y .. ";" .. nav_btn_w .. "," .. NAV_H .. ";page_prev;◀ Prev]")
table.insert(fs, "style[page_next;bgcolor=#222233;textcolor=#aaaaff]")
table.insert(fs, "button[" .. string.format("%.2f", W - PAD - nav_btn_w) .. "," .. y
.. ";" .. nav_btn_w .. "," .. NAV_H .. ";page_next;Next ▶]")
end
y = y + NAV_H + PAD
-- ── Bottom buttons ──────────────────────────────────────
local b_w = (W - PAD * 3) / 2
table.insert(fs, "style[clear_all;bgcolor=#3a1a1a;textcolor=#ff8888]")
table.insert(fs, "button[" .. PAD .. "," .. y .. ";" .. string.format("%.2f", b_w) .. "," .. BTN_H
.. ";clear_all;✕ Clear All Selected]")
table.insert(fs, "style[close_and_back;bgcolor=#1a2a1a;textcolor=#aaffaa]")
table.insert(fs, "button[" .. string.format("%.2f", PAD * 2 + b_w) .. "," .. y
.. ";" .. string.format("%.2f", b_w) .. "," .. BTN_H .. ";close_and_back;✓ Done]")
core.show_formspec(player_name, "llm_connect:material_picker", table.concat(fs))
end
-- ============================================================
-- Formspec handler
-- ============================================================
function M.handle_fields(player_name, formname, fields)
if not formname:match("^llm_connect:material_picker") then
return false
end
local sess = get_session(player_name)
local candidates = build_candidate_list(sess.filter)
local total = #candidates
-- Update filter (live)
if fields.filter ~= nil then
sess.filter = fields.filter
end
-- ── Search / filter ──────────────────────────────────
if fields.do_filter or fields.key_enter_field == "filter" then
sess.page = 1
M.show(player_name)
return true
end
-- ── Pagination ──────────────────────────────────────
local page, total_pages = get_page_info(total, ITEMS_PER_PAGE, sess.page)
if fields.page_prev then
sess.page = math.max(1, page - 1)
M.show(player_name)
return true
end
if fields.page_next then
sess.page = math.min(total_pages, page + 1)
M.show(player_name)
return true
end
-- ── Toggle page ──────────────────────────────────────
if fields.toggle_page then
local _, _, first, last = get_page_info(total, ITEMS_PER_PAGE, sess.page)
-- Check if all are selected
local all_sel = true
for i = first, last do
if not sess.materials[candidates[i]] then all_sel = false; break end
end
-- Toggle
for i = first, last do
if all_sel then
sess.materials[candidates[i]] = nil
else
sess.materials[candidates[i]] = true
end
end
M.show(player_name)
return true
end
-- ── Clear all ────────────────────────────────────────
if fields.clear_all then
sess.materials = {}
M.show(player_name)
return true
end
-- ── Close / done ─────────────────────────────────────
if fields.close_picker or fields.close_and_back or fields.quit then
-- Signal to chat_gui: picker closed → re-open chat GUI
-- (handled in handle_fields of chat_gui.lua / init.lua)
return true
end
-- ── Tile buttons: tile_N ────────────────────────────
-- Format: tile_<global_index> (1-based across all pages)
for field_name, _ in pairs(fields) do
local global_idx = field_name:match("^tile_(%d+)$")
if global_idx then
global_idx = tonumber(global_idx)
local node = candidates[global_idx]
if node then
if sess.materials[node] then
sess.materials[node] = nil
core.chat_send_player(player_name, "[LLM] ✕ " .. node)
else
sess.materials[node] = true
core.chat_send_player(player_name, "[LLM] ✓ " .. node)
end
end
M.show(player_name)
return true
end
end
return true
end
-- ============================================================
core.log("action", "[llm_connect] material_picker.lua v2.0 loaded")
return M
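The clamping arithmetic in `get_page_info` above is easy to sanity-check outside the engine. A stand-alone copy of the same function, runnable in plain Lua:

```lua
-- Stand-alone copy of the picker's page arithmetic: clamp the requested
-- page into range, then derive the 1-based slice bounds for that page.
local function get_page_info(total, per_page, current_page)
	local total_pages = math.max(1, math.ceil(total / per_page))
	current_page = math.max(1, math.min(current_page, total_pages))
	local first = (current_page - 1) * per_page + 1
	local last = math.min(total, current_page * per_page)
	return current_page, total_pages, first, last
end

-- 130 nodes at 48 tiles per page: a request for page 5 is clamped to the
-- last page (3), which holds items 97..130.
print(get_page_info(130, 48, 5))  --> 3  3  97  130
```

Note the `math.max(1, ...)` on `total_pages`: even with zero candidates the picker always has a valid "page 1", which is why the formspec builder never has to special-case an empty result set.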
mod.conf
@@ -1,26 +1,7 @@
name = llm_connect
description = Connects your Luanti/Minetest server to an LLM (Large Language Model) using an OpenAI-compatible API endpoint.
description = Connects your Luanti server to an LLM + integrated AI-powered Lua IDE
title = LLM Connect
author = H5N3RG
license = LGPL-3.0-or-later
media_license = LGPL-3.0-or-later
forum =
depends =
optional_depends =
version = 0.7.8
# === Default Settings ===
llm_max_tokens_integer = true
llm_api_key =
llm_api_url =
llm_model =
# === Default Context Settings ===
llm_context_send_server_info = true
llm_context_send_mod_list = false
llm_context_send_commands = true
llm_context_send_player_pos = true
llm_context_send_materials = false
version = 0.9.0-dev
optional_depends = worldedit, worldeditadditions
settingtypes.txt
@@ -1,36 +1,92 @@
# ===========================================================================
# LLM Connect - Luanti/Minetest mod settings
# LLM Connect - Luanti/Minetest Mod Settings
# ===========================================================================
# Configure the LLM connection and behavior in the in-game menu
# Internal name, (Label), Type, Default, [min max for int/float]
# LLM Connection, Chat, IDE, WorldEdit & Prompt Behavior
# Version: 0.9.0
# ===========================================================================
# Determines whether max_tokens is sent as integer (true) or float (false)
# === LLM API Base Settings ===
llm_api_key (API Key) string
llm_api_url (API URL OpenAI compatible) string
llm_model (Model Name) string
llm_max_tokens (Max Tokens Response length) int 4000 500 16384
llm_max_tokens_integer (Send max_tokens as integer) bool true
# Your API key for the LLM endpoint
llm_api_key (API Key) string
llm_temperature (Temperature Creativity 0..2) float 0.7 0.0 2.0
llm_top_p (Top P Nucleus Sampling 0..1) float 0.9 0.0 1.0
llm_presence_penalty (Presence Penalty -2..2) float 0.0 -2.0 2.0
llm_frequency_penalty (Frequency Penalty -2..2) float 0.0 -2.0 2.0
# The URL of the OpenAI-compatible LLM endpoint
llm_api_url (API URL) string
# Global timeout for ALL LLM requests (chat, IDE, WorldEdit).
# Per-mode overrides (llm_timeout_chat/ide/we) take precedence if set > 0.
# Default: 120 seconds. Range: 30-600.
llm_timeout (Global Request Timeout in seconds) int 120 30 600
# The model to use for the LLM (leave empty for none)
llm_model (Model) string
# Per-mode timeout overrides. Set to 0 to use the global llm_timeout.
llm_timeout_chat (Chat mode timeout override, 0=global) int 0 0 600
llm_timeout_ide (IDE mode timeout override, 0=global) int 0 0 600
llm_timeout_we (WorldEdit mode timeout override, 0=global) int 0 0 600
# === Context Configuration ===
llm_debug (Enable debug logging) bool false
# Send server name, description, motd, gameid, port, worldpath, mapgen
llm_context_send_server_info (Send Server Info) bool true
# Send the list of all installed mods
llm_context_send_mod_list (Send Mod List) bool false
# === Chat Context ===
# Send the list of all available chat commands
llm_context_send_commands (Send Commands List) bool true
llm_context_send_server_info (Send server info to LLM) bool true
llm_context_send_mod_list (Send list of active mods) bool false
llm_context_send_commands (Send available chat commands) bool true
llm_context_send_player_pos (Send player position and HP) bool true
llm_context_send_materials (Send node/item/tool registry sample) bool false
# Send the player's current position (x,y,z)
llm_context_send_player_pos (Send Player Position) bool true
# Max chat history messages sent per LLM request. Oldest dropped first.
llm_context_max_history (Max chat history messages sent) int 20 2 100
# Send the list of registered nodes, craftitems, tools, and entities
llm_context_send_materials (Send Available Materials) bool false
# === Language ===
llm_language (Response language) enum en en,de,es,fr,it,pt,ru,zh,ja,ko,ar,hi,tr,nl,pl,sv,da,no,fi,cs,hu,ro,el,th,vi,id,ms,he,bn,uk
llm_language_instruction_repeat (Repeat language instruction) int 1 0 5
# === IDE Behavior ===
llm_ide_hot_reload (Hot-reload world after execution) bool true
llm_ide_auto_save (Auto-save code buffer) bool true
llm_ide_live_suggestions (Live suggestions not yet implemented) bool false
llm_ide_whitelist_enabled (Sandbox security whitelist) bool true
# Send last run output to LLM so it can self-correct after a failed execution.
llm_ide_include_run_output (Send last run output for self-correction) bool true
# Max lines of code sent as context. Prevents token overflow. 0 = no limit.
llm_ide_max_code_context (Max code lines sent to LLM, 0=unlimited) int 300 0 2000
# === IDE Guiding Prompts ===
# Inject naming-convention guide into IDE LLM calls.
# Teaches the model that registrations must use the "llm_connect:" prefix.
llm_ide_naming_guide (Inject naming convention guide) bool true
# Inject context about active mods and nodes into Generate calls.
llm_ide_context_mod_list (Send mod list in IDE context) bool true
llm_ide_context_node_sample (Send node sample in IDE context) bool true
# === WorldEdit ===
llm_worldedit_additions (Enable WorldEditAdditions tools) bool true
llm_we_max_iterations (Max iterations in WE Loop mode) int 6 1 20
llm_we_snapshot_before_exec (Snapshot before each WE execution) bool true
# ===========================================================================
# Notes:
# - llm_timeout_*: 0 = inherit global llm_timeout
# - llm_language "en" = no language instruction injected (saves tokens)
# - llm_ide_* settings only affect the Smart Lua IDE
# - llm_we_* settings only affect WorldEdit agency mode
# - Timeout/config changes take effect after /llm_config reload or restart
# ===========================================================================
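The timeout precedence described in the notes (a per-mode override wins when set greater than 0, otherwise the global `llm_timeout` applies) can be sketched in Lua. The helper name `effective_timeout` and its injected `settings` parameter are illustrative assumptions, not part of the mod's actual API:

```lua
-- Illustrative sketch: resolve the effective timeout for a request mode.
-- `settings` is any object with a :get(name) method (e.g. minetest.settings);
-- `mode` is one of "chat", "ide", "we".
local function effective_timeout(settings, mode)
    local override = tonumber(settings:get("llm_timeout_" .. mode) or "") or 0
    if override > 0 then
        return override -- per-mode override wins when set > 0
    end
    return tonumber(settings:get("llm_timeout") or "") or 120
end
```

Note that unset settings come back as `nil` and zero-valued overrides fall through to the global value, matching the `0=global` convention in the labels above.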


@@ -1 +0,0 @@
YOU ARE A HELPFUL, KNOWLEDGEABLE, AND FRIENDLY AI ASSISTANT FOR THE LUANTI (FORMERLY MINETEST) GAME PLATFORM.


@@ -0,0 +1,114 @@
-- worldedit_system_prompts.lua
-- System prompts for LLM WorldEdit agency mode
-- Used by llm_worldedit.lua
local P = {}
-- ============================================================
-- Base prompt (single-shot and loop mode)
-- ============================================================
P.SYSTEM_PROMPT = [[You are a WorldEdit agent inside a Luanti (Minetest) voxel game.
Your job is to translate the player's natural language building request into a sequence of WorldEdit tool calls.
You will receive:
- The player's current position (x, y, z)
- Their current WorldEdit selection (pos1, pos2) if any
- A coarse sample of nearby nodes
- The list of available tools
Respond ONLY with a JSON object:
{
"plan": "<one-sentence description of what you will do>",
"tool_calls": [
{"tool": "<tool_name>", "args": { ... }},
...
]
}
Do NOT add explanation text outside the JSON.
Do NOT invent tool names not in the available list.
Use "air" to remove/clear nodes.
Coordinates must be integers.
Example response:
{
"plan": "Place a 5x3x5 stone platform 2 blocks below the player.",
"tool_calls": [
{"tool": "set_pos1", "args": {"x": -12, "y": 63, "z": 44}},
{"tool": "set_pos2", "args": {"x": -8, "y": 65, "z": 48}},
{"tool": "set_region", "args": {"node": "default:stone"}}
]
}
]]
-- ============================================================
-- Loop mode addendum (appended to SYSTEM_PROMPT for run_loop)
-- ============================================================
P.LOOP_ADDENDUM = [[
ADDITIONAL RULES FOR ITERATIVE MODE:
STEP 1 ONLY. On the very first step (when you receive only "Goal: ..."):
- First write a short OVERALL PLAN as your "plan" field describing ALL steps you intend to take.
- Then execute only the FIRST part of that plan in tool_calls.
- Example: Goal is "build a house" → plan = "Step 1/4: Place 10x5x10 stone floor. Then: hollow walls, add roof, add door."
SUBSEQUENT STEPS. You receive "Completed steps so far:" plus your original goal:
- Your "plan" field should say which step of your overall plan this is (e.g. "Step 2/4: Hollow out walls")
- Only execute the CURRENT step, not the whole plan at once.
- If a previous step failed, note it and adapt. Never repeat a failing call unchanged.
DONE SIGNAL:
- Set "done": true only when the entire structure is complete.
- Set "done": true also if you are stuck after a failure.
- Always set "done": false if there are more steps remaining.
COORDINATE DISCIPLINE:
- Always use absolute integer coordinates.
- pos arguments for sphere/dome/cylinder/pyramid/cube must be {x,y,z} — never a string.
- pos1 and pos2 define the region for set_region, replace, copy, move, stack, flip, rotate.
Response format (strict JSON, no extra text):
{
"plan": "<step N/total: what this step does>",
"tool_calls": [ {"tool": "...", "args": {...}}, ... ],
"done": false,
"reason": ""
}
]]
-- ============================================================
-- WorldEditAdditions addendum (appended when WEA is available)
-- ============================================================
P.WEA_ADDENDUM = [[
When WorldEditAdditions (WEA) tools are available, you may use them alongside standard WorldEdit tools.
WEA tools require pos1 to be set (torus, ellipsoid, floodfill) or both pos1+pos2 (overlay, replacemix, layers, erode, convolve).
WEA tool examples:
- Torus: {"tool": "torus", "args": {"radius_major": 10, "radius_minor": 3, "node": "default:stone"}}
- Ellipsoid: {"tool": "ellipsoid", "args": {"rx": 8, "ry": 5, "rz": 8, "node": "default:dirt"}}
- Overlay: {"tool": "overlay", "args": {"node": "default:dirt_with_grass"}}
- Layers: {"tool": "layers", "args": {"layers": [{"node": "default:dirt_with_grass", "depth": 1}, {"node": "default:dirt", "depth": 3}]}}
- Erode: {"tool": "erode", "args": {"algorithm": "snowballs", "iterations": 2}}
- Convolve: {"tool": "convolve", "args": {"kernel": "gaussian", "size": 5}}
- Replacemix: {"tool": "replacemix", "args": {"target": "default:stone", "replacements": [{"node": "default:cobble", "chance": 2}, {"node": "default:mossy_cobble", "chance": 1}]}}
]]
-- ============================================================
-- Convenience: build full system prompt strings
-- ============================================================
-- Single-shot prompt (with optional WEA addendum)
function P.build_single(wea)
return P.SYSTEM_PROMPT .. (wea and P.WEA_ADDENDUM or "")
end
-- Loop prompt (with optional WEA addendum)
function P.build_loop(wea)
return P.SYSTEM_PROMPT .. P.LOOP_ADDENDUM .. (wea and P.WEA_ADDENDUM or "")
end
return P
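The strict JSON contract in `P.LOOP_ADDENDUM` invites a structural check before any tool call is executed. A minimal validator sketch follows; the function name and the caller-side decode step are assumptions for illustration, not part of this module:

```lua
-- Validate a decoded loop-mode response against the strict format mandated
-- by the loop addendum. The caller is assumed to have already JSON-decoded
-- the model output (e.g. with minetest.parse_json).
local function is_valid_loop_response(r)
    if type(r) ~= "table" then return false end
    if type(r.plan) ~= "string" then return false end
    if type(r.tool_calls) ~= "table" then return false end
    if type(r.done) ~= "boolean" then return false end
    for _, call in ipairs(r.tool_calls) do
        if type(call) ~= "table"
            or type(call.tool) ~= "string"
            or type(call.args) ~= "table" then
            return false
        end
    end
    return true
end
```

Rejecting malformed responses up front lets the loop re-prompt the model instead of crashing mid-build, which matches the addendum's rule of never repeating a failing call unchanged.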