diff --git a/LICENSE b/LICENSE
new file mode 100644
index 0000000..b8ed5b3
--- /dev/null
+++ b/LICENSE
@@ -0,0 +1,17 @@
+GNU LESSER GENERAL PUBLIC LICENSE
+Version 3, 29 June 2007
+
+Copyright (C) 2025 H5N3RG
+
+This program is free software: you can redistribute it and/or modify
+it under the terms of the GNU Lesser General Public License as published
+by the Free Software Foundation, either version 3 of the License, or
+(at your option) any later version.
+
+This program is distributed in the hope that it will be useful,
+but WITHOUT ANY WARRANTY; without even the implied warranty of
+MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+GNU Lesser General Public License for more details.
+
+You should have received a copy of the GNU Lesser General Public License
+along with this program. If not, see <https://www.gnu.org/licenses/>.
diff --git a/README.md b/README.md
index 64cd446..c575682 100644
--- a/README.md
+++ b/README.md
@@ -1,62 +1,299 @@
+
# LLM Connect
-A Luanti (formerly Minetest) mod that connects the game to a Large Language Model (LLM) using an OpenAI-compatible API endpoint.
+**A Luanti (formerly Minetest) mod that integrates Large Language Models (LLMs) directly into the game with an AI-powered Lua IDE and building assistant.**
-## Purpose
+
-This mod allows players to interact with an LLM directly within the Luanti chat.
-It sends the player's chat message along with relevant in-game context—such as server info, installed mods, and available materials—to a remote API endpoint.
-This enables the LLM to provide highly specific and helpful answers, e.g., on crafting items or locating resources in-game.
+---
-## Features
+## Overview
-- **In-game AI Chat:** Send prompts to the LLM using a simple chat command.
-- **Context-Aware & Granular:** Automatically includes crucial server and material data in the prompts. **The inclusion of context elements (server info, mod list, materials, position) is now fully configurable via settings.**
-- **Configurable:** API key, endpoint URL, model, and all **context components** can be set via chat commands or the in-game menu.
-- **Conversation History:** Maintains short-term conversation history for more relevant responses.
-- **Robust Token Handling:** Supports sending `max_tokens` as an integer to avoid floating-point issues; optionally configurable via `settingtypes.txt` or chat commands.
+LLM Connect brings modern AI assistance into Luanti worlds.
+Players and developers can interact with a Large Language Model directly in-game to:
-## Implementation
+- ask questions
+- generate Lua code
+- analyze or refactor scripts
+- assist with WorldEdit building tasks
+- experiment with sandboxed Lua execution
-- Built with Luanti's HTTP API for sending requests to an external, OpenAI-compatible endpoint.
-- Structured to be understandable and extendable for new contributors.
-- Version: **0.7.8**
+The mod combines an **AI chat interface**, a **Smart Lua IDE**, and **LLM-assisted building tools** into a single integrated workflow.
-## Requirements
+---
-- A running Luanti server.
-- An API key from a supported service.
-
-## Supported API Endpoints
+## Core Features
-Successfully tested with:
+### AI Chat Interface
-- Open WebUI
+Interact with a Large Language Model directly inside Luanti.
+
+Features include:
+
+- In-game chat GUI
+- conversation context handling
+- player and world information awareness
+- configurable prompts and system instructions
+- support for OpenAI-compatible APIs
+
+The chat system automatically includes contextual information such as:
+
+- player position
+- installed mods
+- selected materials
+- server environment
+
+---
+
+### Smart Lua IDE
+
+LLM Connect includes a fully integrated **AI-assisted Lua development environment**.
+
+Capabilities include:
+
+- AI code generation from natural language prompts
+- semantic code explanation
+- automated refactoring
+- code analysis
+- interactive editing interface
+- integration with the game environment
+
+Developers can experiment with Lua snippets directly inside the game.
+
+---
+
+### Sandboxed Code Execution
+
+Lua code can be executed inside a controlled environment.
+
+Security features include:
+
+- privilege-based execution access
+- sandboxed runtime
+- optional whitelist restrictions
+- prevention of filesystem access
+
+Execution results are returned to the IDE interface for inspection.
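+
+Conceptually this follows the standard Lua pattern of running user code inside a restricted environment table. A simplified sketch (Lua 5.1/LuaJIT style, as used by the engine; not the mod's full implementation):
+
+```lua
+-- Expose only a vetted set of globals to the user's code
+local env = { print = print, math = math, string = string }
+local chunk, err = loadstring(user_code, "llm_sandbox")
+if not chunk then return nil, err end
+setfenv(chunk, env)  -- the chunk can only see what env provides
+local ok, result = pcall(chunk)
+```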
+
+---
+
+### WorldEdit AI Assistant
+
+LLM Connect can assist with building tasks using WorldEdit.
+
+Examples:
+
+- structure generation prompts
+- building suggestions
+- node/material selection
+- architectural transformations
+
+The system can combine:
+
+- player position
+- selected materials
+- worldedit context
+
+to produce context-aware instructions.
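+
+For instance, with WE Single mode active a player might send a prompt like (a hypothetical example):
+
+```
+Build a 5x5 sandstone watchtower with a hollow interior at my position.
+```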
+
+---
+
+### Material Selection Tools
+
+The mod includes a **material picker** interface that helps the AI understand:
+
+- available nodes
+- building palettes
+- player selections
+
+This improves the quality of building-related prompts.
+
+---
+
+## Permission System
+
+Access to AI features is controlled through Luanti privileges.
+
+| Privilege | Description |
+|-----------|-------------|
+| `llm` | Basic AI chat access |
+| `llm_ide` | Access to the Smart Lua IDE |
+| `llm_dev` | Sandbox Lua execution |
+| `llm_root` | Full administrative control |
+
+Server operators should grant privileges carefully.
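+
+For example, chat access for a single player and full control for an admin can be granted with Luanti's standard `/grant` command (player names are placeholders):
+
+```
+/grant singleplayer llm
+/grant admin llm_root
+```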
+
+---
+
+## Requirements
+
+- Luanti server **5.4.0 or newer** (recommended)
+- HTTP API enabled
+- Access to a compatible LLM endpoint
+
+Supported providers include:
+
+- OpenAI
+- Ollama
- LM Studio
-- Mistral AI
-- OpenAI API
-- Ollama and LocalAI (integer `max_tokens` ensures compatibility)
+- LocalAI
+- Open WebUI
+- Mistral
+- Together AI
+- any OpenAI-compatible API
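+
+As an example, pointing the mod at a local Ollama instance (default port 11434; the model name is a placeholder, and the key can be any non-empty value for servers that ignore it) might look like:
+
+```
+llm_api_url = http://127.0.0.1:11434/v1/chat/completions
+llm_model = llama3
+llm_api_key = ollama
+```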
-## Commands
+---
-- `/llm_setkey <key> [url] [model]` – Sets the API key, URL, and model for the LLM.
-- `/llm_setmodel <model>` – Sets the LLM model to be used.
-- `/llm_set_endpoint <url>` – Sets the API endpoint URL.
-- `/llm_set_context <length> [player]` – Sets the maximum context length for a player or all players.
-- `/llm_reset` – Resets the conversation history and the cached metadata for the current player.
-- `/llm <message>` – Sends a message to the LLM.
-- `/llm_integer` – Forces `max_tokens` to be sent as an integer (default).
-- `/llm_float` – Sends `max_tokens` as a float (optional, experimental).
+## Installation
-**Context Control (Configurable via In-Game Settings):**
-The following context elements can be individually toggled `ON`/`OFF` in the Luanti main menu:
-* Send Server Info (`llm_context_send_server_info`)
-* Send Mod List (`llm_context_send_mod_list`)
-* Send Commands List (`llm_context_send_commands`)
-* Send Player Position (`llm_context_send_player_pos`)
-* Send Available Materials (`llm_context_send_materials`)
+### ContentDB (recommended)
-## Potential for Expansion
+Install via ContentDB:
-- Add support for more API endpoints.
-- Integrate with additional in-game events or data sources (player inventory, world data).
-- Improve error handling and performance.
-- Create a graphical user interface (formspec) for configuration instead of relying solely on chat commands.
+```
-## Contributing
+Content → Mods → LLM Connect
+
+```
+
+---
+
+### Manual Installation
+
+1. Download the repository or release archive
+2. Extract into your `mods` folder
+3. Ensure the folder name is:
+
+```
+
+llm_connect
+
+```
+
+4. Enable HTTP API in `minetest.conf`
+
+```
+
+secure.http_mods = llm_connect
+
+```
+
+Restart the server.
+
+---
+
+## Configuration
+
+Configuration can be done via:
+
+- `/llm_config` GUI
+- `minetest.conf`
+
+Example:
+
+```
+
+llm_api_key = your-api-key
+llm_api_url = https://api.openai.com/v1/chat/completions
+llm_model = gpt-4
+
+llm_temperature = 0.7
+llm_max_tokens = 4000
+llm_timeout = 120
+
+```
+
+Context options:
+
+```
+
+llm_context_send_player_pos = true
+llm_context_send_mod_list = true
+llm_context_send_materials = true
+
+```
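+
+Internally these flags are read with Luanti's settings API. A minimal sketch of how a context element is gated (mirroring the mod's `chat_context.lua`):
+
+```lua
+-- Treat the element as enabled unless it is explicitly set to false
+if core.settings:get_bool("llm_context_send_player_pos") ~= false then
+    -- attach the player's position to the prompt context
+end
+```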
+
+---
+
+## Commands
+
+| Command | Description |
+|-------|-------------|
+| `/llm` | Open AI chat |
+| `/llm_ide` | Open Smart Lua IDE |
+| `/llm_config` | Open configuration interface |
+
+---
+
+## Security Notes
+
+LLM Connect includes multiple safety mechanisms:
+
+- privilege-based access control
+- sandboxed execution environment
+- optional Lua whitelist
+- no filesystem access in sandbox mode
+
+Server administrators should still review generated code carefully.
+
+---
+
+## Roadmap
+
+See `ROADMAP_090.md` for planned improvements and upcoming features.
+
+---
+
+## Contributing
+
+Contributions are welcome.
+
+Typical workflow:
+
+1. fork repository
+2. create feature branch
+3. implement changes
+4. submit pull request
+
+Areas of interest:
+
+- new AI integrations
+- UI improvements
+- security auditing
+- building tools
+- documentation
+
+---
+
+## License
+
+LGPL-3.0-or-later
+
+See `LICENSE`.
+
+---
+
+## Links
+
+- ContentDB: https://content.luanti.org/packages/H5N3RG/llm_connect/
+- Luanti: https://www.luanti.org/
+
+---
+
+**LLM Connect – Bringing AI-assisted development into Luanti.**
-This project is in an early stage and welcomes contributions:
-- Even small fixes help, especially with API integration, UI improvements, and performance tuning.
-- Contributions from experienced developers are highly welcome.
diff --git a/ROADMAP_090.md b/ROADMAP_090.md
new file mode 100644
index 0000000..74de33e
--- /dev/null
+++ b/ROADMAP_090.md
@@ -0,0 +1,109 @@
+# LLM Connect 0.9 Roadmap
+
+This document outlines the development goals and planned improvements
+for the 0.9 release of LLM Connect.
+
+The focus of this version is stability, improved context handling,
+and better integration between AI features and the Luanti environment.
+
+---
+
+## Overview
+
+Version 0.9 aims to refine the existing AI integration and extend
+the development tools provided by the Smart Lua IDE.
+
+Key areas include:
+
+- improved context awareness
+- better IDE workflow
+- enhanced WorldEdit integration
+- improved error handling and execution feedback
+
+---
+
+## Completed Features
+
+The following features are already implemented:
+
+- Improved request timeout handling
+- Expanded configuration options
+- Wider IDE layout and improved UI usability
+- Guide checkbox for code generation prompts
+- Custom file indexing system replacing `core.get_dir_list`
+- Improved proxy and API error handling
+
+---
+
+## In Progress
+
+These features are currently under active development:
+
+### IDE Context System
+
+The IDE will provide additional contextual information to the LLM.
+
+Planned context elements include:
+
+- active mods list
+- player position
+- currently opened file
+- execution output from previous runs
+
+This will allow the LLM to generate more accurate and relevant code.
+
+---
+
+## Planned Improvements
+
+### Execution Feedback Loop
+
+Improve the interaction between generated code and the execution system.
+
+Possible features:
+
+- automatic error detection
+- AI-assisted debugging
+- improved output visualization
+
+---
+
+### WorldEdit Integration
+
+Further improvements to AI-assisted building tools:
+
+- context-aware structure generation
+- material-aware building suggestions
+- improved prompt templates
+
+---
+
+### Prompt System Refinements
+
+Improve system prompts used for:
+
+- Lua code generation
+- WorldEdit assistance
+- general chat interactions
+
+The goal is more consistent and reliable responses.
+
+---
+
+## Future Ideas
+
+Ideas being explored for future versions:
+
+- agent-style AI workflows
+- multi-step code generation and correction
+- automatic debugging loops
+- extended IDE tooling
+- improved building automation tools
+
+---
+
+## Long-Term Vision
+
+LLM Connect aims to evolve into a complete AI-assisted development
+environment inside Luanti, enabling players and modders to experiment,
+prototype, and build complex systems directly within the game.
diff --git a/chat_context.lua b/chat_context.lua
new file mode 100644
index 0000000..fa94d1e
--- /dev/null
+++ b/chat_context.lua
@@ -0,0 +1,135 @@
+-- chat_context.lua
+-- Collects game and world context for the LLM
+-- Uses settings from settingtypes.txt
+
+local core = core
+local M = {}
+
+local materials_cache = nil
+local materials_hash = nil
+
+-- Computes a hash of the registry to detect changes in nodes or items
+local function compute_registry_hash()
+ local count = 0
+ for _ in pairs(core.registered_nodes) do count = count + 1 end
+ for _ in pairs(core.registered_items) do count = count + 1 end
+ return tostring(count)
+end
+
+-- Generates a string context of registered materials (nodes, tools, items)
+local function get_materials_context()
+ local current_hash = compute_registry_hash()
+ if materials_cache and materials_hash == current_hash then
+ return materials_cache
+ end
+
+ local lines = {}
+ local categories = {
+ {list = core.registered_nodes, label = "Nodes"},
+ {list = core.registered_tools, label = "Tools"},
+ {list = core.registered_craftitems, label = "Items"}
+ }
+
+ for _, cat in ipairs(categories) do
+ local count = 0
+ local items = {}
+ for name, _ in pairs(cat.list) do
+ -- Filter out internal engine nodes
+ if not name:match("^__builtin") and not name:match("^ignore") and not name:match("^air") then
+ count = count + 1
+ if count <= 40 then
+ table.insert(items, name)
+ end
+ end
+ end
+ table.insert(lines, cat.label .. ": " .. table.concat(items, ", "))
+ end
+
+ materials_cache = table.concat(lines, "\n")
+ materials_hash = current_hash
+ return materials_cache
+end
+
+-- Returns general information about the server and game state
+function M.get_server_info()
+ local info = {}
+ local version = core.get_version()
+ table.insert(info, "Game: " .. (core.get_game_info().name or "Luanti/Minetest"))
+ table.insert(info, "Engine Version: " .. (version.project or "unknown"))
+
+ if core.settings:get_bool("llm_context_send_mod_list") then
+ local mods = core.get_modnames()
+ table.sort(mods)
+ table.insert(info, "Active Mods: " .. table.concat(mods, ", "))
+ end
+
+ local time = core.get_timeofday() * 24000
+ local hour = math.floor(time / 1000)
+ local min = math.floor((time % 1000) / 1000 * 60)
+ table.insert(info, string.format("In-game Time: %02d:%02d", hour, min))
+
+ return table.concat(info, "\n")
+end
+
+-- Compiles all enabled context categories into a single string
+function M.get_context(name)
+ local ctx = {"--- START CONTEXT ---"}
+
+ -- 1. Server Info
+ if core.settings:get_bool("llm_context_send_server_info") ~= false then
+ table.insert(ctx, "--- SERVER INFO ---")
+ table.insert(ctx, M.get_server_info())
+ end
+
+ -- 2. Player Info
+ if core.settings:get_bool("llm_context_send_player_pos") ~= false then
+ local player = core.get_player_by_name(name)
+ if player then
+ local pos = player:get_pos()
+ local hp = player:get_hp()
+ local wielded = player:get_wielded_item():get_name()
+
+ table.insert(ctx, string.format("Current Player (%s): HP: %d, Pos: (x=%.1f, y=%.1f, z=%.1f)",
+ name, hp, pos.x, pos.y, pos.z))
+
+ if wielded ~= "" then
+ table.insert(ctx, "Holding item: " .. wielded)
+ end
+ end
+ end
+
+ -- 3. Chat Commands
+ if core.settings:get_bool("llm_context_send_commands") then
+ local cmds = {}
+ for cmd, _ in pairs(core.registered_chatcommands) do
+ table.insert(cmds, "/" .. cmd)
+ end
+ table.sort(cmds)
+ table.insert(ctx, "Available Commands (Top 50): " .. table.concat(cmds, ", ", 1, math.min(50, #cmds)))
+ end
+
+ -- 4. Materials
+ if core.settings:get_bool("llm_context_send_materials") then
+ table.insert(ctx, "--- REGISTERED MATERIALS ---")
+ table.insert(ctx, get_materials_context())
+ end
+
+ table.insert(ctx, "--- END CONTEXT ---")
+ return table.concat(ctx, "\n")
+end
+
+-- Injects the game context as a system message into the messages table
+function M.append_context(messages, name)
+ local context_str = M.get_context(name)
+
+ table.insert(messages, 1, {
+ role = "system",
+ content = "You are an AI assistant inside a Luanti (Minetest) game world. " ..
+ "Use the following game context to answer the user's questions accurately.\n\n" ..
+ context_str
+ })
+
+ return messages
+end
+
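+-- Example usage (as done in chat_gui.lua):
+--   local messages = chat_context.append_context({}, player_name)
+--   -- messages[1] now holds a system message containing the context block
+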
+return M
diff --git a/chat_gui.lua b/chat_gui.lua
new file mode 100644
index 0000000..c293ff2
--- /dev/null
+++ b/chat_gui.lua
@@ -0,0 +1,368 @@
+-- chat_gui.lua
+-- LLM Chat Interface v0.8.7
+-- Privilege model:
+--   llm           → Chat only
+--   llm_dev       → + IDE button
+--   llm_worldedit → + WE Single/Loop + Mats + Undo
+--   llm_root      → superrole: implies all of the above + Config button
+
+local core = core
+local M = {}
+
+local mod_path = core.get_modpath("llm_connect")
+
+local context_ok, chat_context = pcall(dofile, mod_path .. "/chat_context.lua")
+if not context_ok then
+ core.log("error", "[chat_gui] Failed to load chat_context.lua: " .. tostring(chat_context))
+ chat_context = nil
+end
+
+-- material_picker: prefer already-loaded global, fallback to dofile
+local material_picker = _G.material_picker
+if not material_picker then
+ local ok, mp = pcall(dofile, mod_path .. "/material_picker.lua")
+ if ok and mp then
+ material_picker = mp
+ else
+ core.log("warning", "[chat_gui] material_picker not available: " .. tostring(mp))
+ end
+end
+
+local function get_llm_api()
+ if not _G.llm_api then error("[chat_gui] llm_api not available") end
+ return _G.llm_api
+end
+
+-- ============================================================
+-- Privilege helpers
+-- llm_root is a superrole: implies llm + llm_dev + llm_worldedit
+-- ============================================================
+
+local function raw_priv(name, priv)
+ local p = core.get_player_privs(name) or {}
+ return p[priv] == true
+end
+
+local function has_priv(name, priv)
+ if raw_priv(name, "llm_root") then return true end
+ return raw_priv(name, priv)
+end
+
+local function can_chat(name) return has_priv(name, "llm") end
+local function can_ide(name) return has_priv(name, "llm_dev") end
+local function can_worldedit(name) return has_priv(name, "llm_worldedit") end
+local function can_config(name) return raw_priv(name, "llm_root") end -- root only, no implication upward
+
+-- ============================================================
+-- Session
+-- ============================================================
+
+local sessions = {}
+
+local WE_MODE_LABEL = {chat="Chat", single="WE Single", loop="WE Loop"}
+local WE_MODE_COLOR = {chat="#444455", single="#2a4a6a", loop="#4a2a6a"}
+local WE_MODE_COLOR_UNAVAIL = "#333333"
+
+local function get_session(name)
+ if not sessions[name] then
+ sessions[name] = {history={}, last_input="", we_mode="chat"}
+ end
+ return sessions[name]
+end
+
+local function we_available()
+ return type(_G.we_agency) == "table" and _G.we_agency.is_available()
+end
+
+local function cycle_we_mode(session, name)
+ if not we_available() then
+ core.chat_send_player(name, "[LLM] WorldEdit not available.")
+ return
+ end
+ local cur = session.we_mode
+ if cur == "chat" then session.we_mode = "single"
+ elseif cur == "single" then session.we_mode = "loop"
+ elseif cur == "loop" then session.we_mode = "chat"
+ end
+ core.chat_send_player(name, "[LLM] Mode: " .. WE_MODE_LABEL[session.we_mode])
+end
+
+-- ============================================================
+-- Build Formspec
+-- ============================================================
+
+function M.show(name)
+ if not can_chat(name) then
+ core.chat_send_player(name, "[LLM] Missing privilege: llm")
+ return
+ end
+
+ local session = get_session(name)
+ local text_accum = ""
+
+ for _, msg in ipairs(session.history) do
+ if msg.role ~= "system" then
+ local content = msg.content or ""
+ if msg.role == "user" then
+ text_accum = text_accum .. "You: " .. content .. "\n\n"
+ else
+ text_accum = text_accum .. "[LLM]: " .. content .. "\n\n"
+ end
+ end
+ end
+ if text_accum == "" then
+ text_accum = "Welcome to LLM Chat!\nType your question below."
+ end
+
+ local W = 16.0
+ local H = 12.0
+ local PAD = 0.25
+ local HEADER_H = 1.8
+ local INPUT_H = 0.7
+ local CHAT_H = H - HEADER_H - INPUT_H - (PAD * 6)
+
+ local fs = {
+ "formspec_version[6]",
+ "size[" .. W .. "," .. H .. "]",
+ "bgcolor[#0f0f0f;both]",
+ "style_type[*;bgcolor=#1a1a1a;textcolor=#e0e0e0]",
+ }
+
+ -- Header box
+ table.insert(fs, "box[0,0;" .. W .. "," .. HEADER_H .. ";#202020]")
+ table.insert(fs, "label[" .. PAD .. ",0.30;LLM Chat - " .. core.formspec_escape(name) .. "]")
+
+ -- ── Header row 1, right: Config (root) + IDE (dev) ────
+ local right_x = W - PAD
+ if can_config(name) then
+ right_x = right_x - 2.0
+ table.insert(fs, "style[open_config;bgcolor=#2a2a1a;textcolor=#ffeeaa]")
+ table.insert(fs, "button[" .. right_x .. ",0.08;2.0,0.65;open_config;Config]")
+ table.insert(fs, "tooltip[open_config;Open LLM configuration (llm_root only)]")
+ end
+ if can_ide(name) then
+ right_x = right_x - 2.3 - 0.15
+ table.insert(fs, "style[open_ide;bgcolor=#1a1a2a;textcolor=#aaaaff]")
+ table.insert(fs, "button[" .. right_x .. ",0.08;2.3,0.65;open_ide;IDE]")
+ table.insert(fs, "tooltip[open_ide;Open Smart Lua IDE (llm_dev)]")
+ end
+
+ -- Header row 2: three direct WE mode buttons side by side
+ local mode = session.we_mode or "chat"
+ if can_worldedit(name) then
+ local we_ok = we_available()
+ local dim = "#2a2a2a"
+
+ local function we_btn(bname, bx, bw, blabel, active, color_on, color_dim, tip)
+ local bg = we_ok and (active and color_on or color_dim) or dim
+ local fg = active and "#ffffff" or (we_ok and "#889999" or "#555555")
+ table.insert(fs, "style[" .. bname .. ";bgcolor=" .. bg .. ";textcolor=" .. fg .. "]")
+ table.insert(fs, "button[" .. bx .. ",0.95;" .. bw .. ",0.65;" .. bname .. ";" .. blabel .. "]")
+ table.insert(fs, "tooltip[" .. bname .. ";" .. (we_ok and tip or "WorldEdit not loaded") .. "]")
+ end
+
+ we_btn("we_btn_chat", PAD, 2.6, "Chat", mode=="chat", "#444466", "#1e1e2e", "Normal LLM chat mode")
+ we_btn("we_btn_single", PAD + 2.7, 2.8, "WE Single", mode=="single", "#2a4a7a", "#151d2a", "WorldEdit: one plan per message")
+ we_btn("we_btn_loop", PAD + 5.6, 2.6, "WE Loop", mode=="loop", "#4a2a7a", "#1e1228", "WorldEdit: iterative build loop (up to 6 steps)")
+
+ if mode == "single" or mode == "loop" then
+ local mat_count = material_picker and #material_picker.get_materials(name) or 0
+ local mat_label = mat_count > 0 and ("Mats (" .. mat_count .. ")") or "Mats"
+ local mat_color = mat_count > 0 and "#1a3a1a" or "#252525"
+ table.insert(fs, "style[we_materials_open;bgcolor=" .. mat_color .. ";textcolor=#aaffaa]")
+ table.insert(fs, "button[" .. (PAD + 8.3) .. ",0.95;2.6,0.65;we_materials_open;" .. mat_label .. "]")
+ table.insert(fs, "tooltip[we_materials_open;Material picker: attach node names to LLM context]")
+ end
+
+ table.insert(fs, "style[we_undo;bgcolor=#3a2020;textcolor=#ffaaaa]")
+ table.insert(fs, "button[" .. (W - PAD - 2.1) .. ",0.95;2.1,0.65;we_undo;Undo]")
+ table.insert(fs, "tooltip[we_undo;Undo last WorldEdit agency operation]")
+ end
+
+ -- Chat history
+ table.insert(fs, "textarea[" .. PAD .. "," .. (HEADER_H + PAD) .. ";"
+ .. (W - PAD*2) .. "," .. CHAT_H
+ .. ";history_display;;" .. core.formspec_escape(text_accum) .. "]")
+ table.insert(fs, "style[history_display;textcolor=#e0e0e0;bgcolor=#1a1a1a;border=false]")
+
+ -- Input
+ local input_y = HEADER_H + PAD + CHAT_H + PAD
+ table.insert(fs, "field[" .. PAD .. "," .. input_y .. ";"
+ .. (W - PAD*2 - 2.5) .. "," .. INPUT_H
+ .. ";input;;" .. core.formspec_escape(session.last_input) .. "]")
+ table.insert(fs, "button[" .. (W - PAD - 2.2) .. "," .. input_y
+ .. ";2.2," .. INPUT_H .. ";send;Send]")
+ table.insert(fs, "field_close_on_enter[input;false]")
+
+ -- Toolbar
+ local tb_y = input_y + INPUT_H + PAD
+ table.insert(fs, "button[" .. PAD .. "," .. tb_y .. ";2.8,0.75;clear;Clear Chat]")
+
+ core.show_formspec(name, "llm_connect:chat", table.concat(fs))
+end
+
+-- ============================================================
+-- Formspec Handler
+-- ============================================================
+
+function M.handle_fields(name, formname, fields)
+ -- Forward events to the material picker
+ if formname:match("^llm_connect:material_picker") then
+ if material_picker then
+ local result = material_picker.handle_fields(name, formname, fields)
+ if fields.close_picker or fields.close_and_back or fields.quit then
+ M.show(name)
+ end
+ return result
+ end
+ return false
+ end
+
+ if not formname:match("^llm_connect:chat") then return false end
+
+ local session = get_session(name)
+ local updated = false
+
+ -- ── WE buttons (privilege-checked) ──────────────────────
+
+ if fields.we_btn_chat then
+ if can_worldedit(name) then session.we_mode = "chat"; updated = true end
+ elseif fields.we_btn_single then
+ if can_worldedit(name) and we_available() then session.we_mode = "single"; updated = true end
+ elseif fields.we_btn_loop then
+ if can_worldedit(name) and we_available() then session.we_mode = "loop"; updated = true end
+
+ elseif fields.we_materials_open then
+ if can_worldedit(name) and material_picker then
+ material_picker.show(name)
+ end
+ return true
+
+ elseif fields.we_undo then
+ if can_worldedit(name) and _G.we_agency then
+ local res = _G.we_agency.undo(name)
+ table.insert(session.history, {role="assistant",
+ content=(res.ok and "Undo: " or "Error: ") .. res.message})
+ updated = true
+ end
+
+ -- ── IDE / Config (privilege-checked) ────────────────────
+
+ elseif fields.open_ide then
+ if can_ide(name) and _G.ide_gui then
+ _G.ide_gui.show(name)
+ end
+ return true
+
+ elseif fields.open_config then
+ if can_config(name) and _G.config_gui then
+ _G.config_gui.show(name)
+ end
+ return true
+
+ -- ── Send ────────────────────────────────────────────────
+
+ elseif fields.send or fields.key_enter_field == "input" then
+ local input = (fields.input or ""):trim()
+
+ if input ~= "" then
+ table.insert(session.history, {role="user", content=input})
+ session.last_input = ""
+
+ -- WE Loop (llm_worldedit only)
+ if session.we_mode == "loop" and can_worldedit(name) and we_available() then
+ table.insert(session.history, {role="assistant", content="(starting WE loop...)"})
+ updated = true
+ local mat_ctx = material_picker and material_picker.build_material_context(name)
+ local loop_input = mat_ctx and (input .. "\n\n" .. mat_ctx) or input
+ _G.we_agency.run_loop(name, loop_input, {
+ max_iterations = (_G.llm_api and _G.llm_api.config.we_max_iterations or 6),
+ timeout = (_G.llm_api and _G.llm_api.get_timeout("we") or 90),
+ on_step = function(i, plan, results)
+ local lines = {"[WE Loop] Step " .. i .. ": " .. plan}
+ for _, r in ipairs(results) do
+ table.insert(lines, " " .. (r.ok and "v" or "x") .. " " .. r.tool .. ": " .. r.message)
+ end
+ core.chat_send_player(name, table.concat(lines, "\n"))
+ end,
+ }, function(res)
+ local reply = _G.we_agency.format_loop_results(res)
+ for i = #session.history, 1, -1 do
+ if session.history[i].content == "(starting WE loop...)" then
+ session.history[i].content = reply; break
+ end
+ end
+ M.show(name)
+ end)
+
+ -- WE Single (llm_worldedit only)
+ elseif session.we_mode == "single" and can_worldedit(name) and we_available() then
+ table.insert(session.history, {role="assistant", content="(planning WE operations...)"})
+ updated = true
+ local mat_ctx = material_picker and material_picker.build_material_context(name)
+ local single_input = mat_ctx and (input .. "\n\n" .. mat_ctx) or input
+ _G.we_agency.request(name, single_input, function(res)
+ local reply = not res.ok
+ and ("Error: " .. (res.error or "unknown"))
+ or _G.we_agency.format_results(res.plan, res.results)
+ for i = #session.history, 1, -1 do
+ if session.history[i].content == "(planning WE operations...)" then
+ session.history[i].content = reply; break
+ end
+ end
+ M.show(name)
+ end)
+
+ -- Normal chat (always allowed with the llm privilege)
+ else
+ -- Reset WE mode if the privilege is missing
+ if session.we_mode ~= "chat" and not can_worldedit(name) then
+ session.we_mode = "chat"
+ end
+ local messages = {}
+ local context_added = false
+ if chat_context then
+ messages = chat_context.append_context(messages, name)
+ if #messages > 0 and messages[1].role == "system" then context_added = true end
+ end
+ if not context_added then
+ table.insert(messages, 1, {role="system",
+ content="You are a helpful assistant in the Luanti/Minetest game."})
+ end
+ for _, msg in ipairs(session.history) do table.insert(messages, msg) end
+ table.insert(session.history, {role="assistant", content="(thinking...)"})
+ updated = true
+ local llm_api = get_llm_api()
+ llm_api.request(messages, function(result)
+ local content = result.success and result.content
+ or "Error: " .. (result.error or "Unknown error")
+ for i = #session.history, 1, -1 do
+ if session.history[i].content == "(thinking...)" then
+ session.history[i].content = content; break
+ end
+ end
+ M.show(name)
+ end, {timeout = (_G.llm_api and _G.llm_api.get_timeout("chat") or 180)})
+ end
+ end
+
+ -- ── Clear ───────────────────────────────────────────────
+
+ elseif fields.clear then
+ session.history = {}
+ session.last_input = ""
+ updated = true
+
+ elseif fields.quit then
+ return true
+ end
+
+ if updated then M.show(name) end
+ return true
+end
+
+core.register_on_leaveplayer(function(player)
+ sessions[player:get_player_name()] = nil
+end)
+
+return M
diff --git a/code_executor.lua b/code_executor.lua
new file mode 100644
index 0000000..1b1b4e2
--- /dev/null
+++ b/code_executor.lua
@@ -0,0 +1,298 @@
+-- code_executor.lua
+-- Secure Lua code execution for LLM-Connect / Smart Lua IDE
+-- Privileges:
+--   llm_dev  → sandbox + whitelist, no persistent registrations
+--   llm_root → unrestricted execution + persistent registrations possible
+
+local core = core
+local M = {}
+
+M.execution_history = {} -- per player: {timestamp, code_snippet, success, output/error}
+
+local STARTUP_FILE = core.get_worldpath() .. DIR_DELIM .. "llm_startup.lua"
+
+-- =============================================================
+-- Helper functions
+-- =============================================================
+
+local function player_has_priv(name, priv)
+ local privs = core.get_player_privs(name) or {}
+ return privs[priv] == true
+end
+
+-- llm_root is a superrole: it implies llm_dev and all other llm privileges
+local function has_llm_priv(name, priv)
+ if player_has_priv(name, "llm_root") then return true end
+ return player_has_priv(name, priv)
+end
+
+local function is_llm_root(name)
+ return player_has_priv(name, "llm_root")
+end
+
+-- =============================================================
+-- Sandbox environment (for normal llm_dev / llm users)
+-- =============================================================
+
+local function create_sandbox_env(player_name)
+ local safe_core = {
+ -- Logging & Chat
+ log = core.log,
+ chat_send_player = core.chat_send_player,
+
+ -- Secure read access
+ get_node = core.get_node,
+ get_node_or_nil = core.get_node_or_nil,
+ find_node_near = core.find_node_near,
+ find_nodes_in_area = core.find_nodes_in_area,
+ get_meta = core.get_meta,
+ get_player_by_name = core.get_player_by_name,
+ get_connected_players = core.get_connected_players,
+ }
+
+ -- Block registration functions (require restart)
+ local function blocked_registration(name)
+ return function(...)
+ core.log("warning", ("[code_executor] Blocked registration call: %s by %s"):format(name, player_name))
+ core.chat_send_player(player_name, "Registrations are forbidden in sandbox mode.\nOnly llm_root may execute these persistently.")
+ return nil
+ end
+ end
+
+ safe_core.register_node = blocked_registration("register_node")
+ safe_core.register_craftitem = blocked_registration("register_craftitem")
+ safe_core.register_tool = blocked_registration("register_tool")
+ safe_core.register_craft = blocked_registration("register_craft")
+ safe_core.register_entity = blocked_registration("register_entity")
+
+ -- Allowed dynamic registrations (very restricted)
+ safe_core.register_chatcommand = core.register_chatcommand
+ safe_core.register_on_chat_message = core.register_on_chat_message
+
+ -- Safe standard libraries (without dangerous functions)
+ local env = {
+ -- Lua basics
+ assert = assert,
+ error = error,
+ pairs = pairs,
+ ipairs = ipairs,
+ next = next,
+ select = select,
+ type = type,
+ tostring = tostring,
+ tonumber = tonumber,
+ unpack = table.unpack or unpack,
+
+ -- Safe string/table/math functions
+ string = { byte=string.byte, char=string.char, find=string.find, format=string.format,
+ gmatch=string.gmatch, gsub=string.gsub, len=string.len, lower=string.lower,
+ match=string.match, rep=string.rep, reverse=string.reverse, sub=string.sub,
+ upper=string.upper },
+ table = { concat=table.concat, insert=table.insert, remove=table.remove, sort=table.sort },
+ math = math,
+
+ -- Minetest-safe API
+ core = safe_core,
+
+ -- Redirect print
+ print = function(...) end, -- will be overwritten later
+ }
+
+ -- Output buffer with limit
+ local output_buffer = {}
+ local output_size = 0
+ local MAX_OUTPUT = 100000 -- ~100 KB
+
+ env.print = function(...)
+ local parts = {}
+ for i = 1, select("#", ...) do
+ parts[i] = tostring(select(i, ...))
+ end
+ local line = table.concat(parts, "\t")
+
+ if output_size + #line > MAX_OUTPUT then
+ table.insert(output_buffer, "\n[OUTPUT TRUNCATED - 100 KB limit reached]")
+ return
+ end
+
+ table.insert(output_buffer, line)
+ output_size = output_size + #line
+ end
+
+ return env, output_buffer
+end
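+
+-- Behavior sketch (illustrative only, not executed): what sandboxed code sees.
+-- Reads work, print() is captured into the output buffer, and registrations
+-- are blocked with a warning. For example:
+--
+--   local env, buf = create_sandbox_env("singleplayer")
+--   local f = loadstring("print('hi') core.register_node('x:y', {})")
+--   setfenv(f, env)
+--   pcall(f)
+--   -- buf now contains {"hi"}; the register_node call was blocked and logged.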
+
+-- =============================================================
+-- Append persistent startup code (llm_root only)
+-- =============================================================
+
+local function append_to_startup(code, player_name)
+ local f, err = io.open(STARTUP_FILE, "a")
+ if not f then
+ core.log("error", ("[code_executor] Cannot open startup file: %s"):format(tostring(err)))
+ return false, err
+ end
+
+ f:write(("\n-- Added by %s at %s\n"):format(player_name, os.date("%Y-%m-%d %H:%M:%S")))
+ f:write(code)
+ f:write("\n\n")
+ f:close()
+
+ core.log("action", ("[code_executor] Appended code to %s by %s"):format(STARTUP_FILE, player_name))
+ return true
+end
+
+-- =============================================================
+-- Main execution function
+-- =============================================================
+
+function M.execute(player_name, code, options)
+ options = options or {}
+ local result = { success = false }
+
+ if type(code) ~= "string" or code:match("^%s*$") then -- Lua strings have no trim(); use a pattern
+ result.error = "No or empty code provided"
+ return result
+ end
+
+ local is_root = is_llm_root(player_name)
+ local use_sandbox = options.sandbox ~= false
+ local allow_persist = options.allow_persist or is_root
+
+ -- Check whether the player has execution rights at all
+ if not has_llm_priv(player_name, "llm_dev") then
+ result.error = "Missing privilege: llm_dev (or llm_root)"
+ return result
+ end
+
+ -- =============================================
+ -- 1. Compile
+ -- =============================================
+ local func, compile_err = loadstring(code, "=(llm_ide)")
+ if not func then
+ result.error = "Compile error: " .. tostring(compile_err)
+ core.log("warning", ("[code_executor] Compile failed for %s: %s"):format(player_name, result.error))
+ return result
+ end
+
+ -- =============================================
+ -- 2. Prepare environment & print redirection
+ -- =============================================
+ local output_buffer = {}
+ local env
+ local old_print -- hoisted so the reset after pcall can reach it
+
+ if use_sandbox then
+ env, output_buffer = create_sandbox_env(player_name)
+ setfenv(func, env) -- Lua 5.1 compatibility (Luanti mostly uses LuaJIT)
+ else
+ -- Unrestricted mode - use with care!
+ if not is_root then
+ result.error = "Unrestricted execution only allowed for llm_root"
+ return result
+ end
+
+ -- Temporarily redirect the global print; restored after execution
+ old_print = print
+ print = function(...)
+ local parts = {}
+ for i = 1, select("#", ...) do parts[#parts+1] = tostring(select(i, ...)) end
+ local line = table.concat(parts, "\t")
+ table.insert(output_buffer, line)
+ end
+ end
+
+ -- =============================================
+ -- 3. Execute (with instruction limit)
+ -- =============================================
+ local ok, exec_res = pcall(function()
+ -- TODO: an instruction-count limit (e.g. via debug.sethook) could be added here
+ return func()
+ end)
+
+ -- Reset print (if unrestricted)
+ if not use_sandbox then
+ print = old_print
+ end
+
+ -- =============================================
+ -- 4. Process result
+ -- =============================================
+ result.output = table.concat(output_buffer, "\n")
+
+ if ok then
+ result.success = true
+ result.return_value = exec_res
+
+ core.log("action", ("[code_executor] Success by %s (sandbox=%s)"):format(player_name, tostring(use_sandbox)))
+ else
+ result.error = "Runtime error: " .. tostring(exec_res)
+ core.log("warning", ("[code_executor] Execution failed for %s: %s"):format(player_name, result.error))
+ end
+
+ -- =============================================
+-- 5. Check for registrations - persist if allowed
+ -- =============================================
+ local has_registration = code:match("register_node%s*%(") or
+ code:match("register_tool%s*%(") or
+ code:match("register_craftitem%s*%(") or
+ code:match("register_entity%s*%(") or
+ code:match("register_craft%s*%(")
+
+ if has_registration then
+ if allow_persist and is_root then
+ local saved, save_err = append_to_startup(code, player_name)
+ if saved then
+ local msg = "Code with registrations saved to llm_startup.lua.\nWill be active after server restart."
+ core.chat_send_player(player_name, msg)
+ result.output = (result.output or "") .. "\n\n" .. msg
+ result.persisted = true
+ else
+ result.error = (result.error or "") .. "\nPersistence failed: " .. tostring(save_err)
+ end
+ else
+ local msg = "Code contains registrations (node/tool/...). \nOnly llm_root can execute these persistently (restart required)."
+ core.chat_send_player(player_name, msg)
+ result.error = (result.error or "") .. "\n" .. msg
+ result.success = false -- even if execution was ok
+ end
+ end
+
+ -- Save history
+ M.execution_history[player_name] = M.execution_history[player_name] or {}
+ table.insert(M.execution_history[player_name], {
+ timestamp = os.time(),
+ code = code:sub(1, 200) .. (code:len() > 200 and "..." or ""),
+ success = result.success,
+ output = result.output,
+ error = result.error,
+ })
+
+ return result
+end
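+
+-- Usage sketch (assumption - this command is not part of the mod; it is shown
+-- only to illustrate the M.execute contract):
+--
+--   core.register_chatcommand("lua_sandbox", {
+--       params = "<code>",
+--       privs = { llm_dev = true },
+--       func = function(name, param)
+--           local res = M.execute(name, param, { sandbox = true })
+--           if res.success then
+--               return true, res.output ~= "" and res.output or "OK (no output)"
+--           end
+--           return false, res.error or "execution failed"
+--       end,
+--   })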
+
+-- =============================================================
+-- History functions
+-- =============================================================
+
+function M.get_history(player_name, limit)
+ limit = limit or 10
+ local hist = M.execution_history[player_name] or {}
+ local res = {}
+ local start = math.max(1, #hist - limit + 1)
+ for i = start, #hist do
+ res[#res+1] = hist[i]
+ end
+ return res
+end
+
+function M.clear_history(player_name)
+ M.execution_history[player_name] = nil
+end
+
+-- Cleanup
+core.register_on_leaveplayer(function(player)
+ local name = player:get_player_name()
+ M.execution_history[name] = nil
+end)
+
+return M
diff --git a/config_gui.lua b/config_gui.lua
new file mode 100644
index 0000000..df6adee
--- /dev/null
+++ b/config_gui.lua
@@ -0,0 +1,255 @@
+-- config_gui.lua
+-- LLM API Configuration GUI (llm_root only)
+-- v0.8.1: Added timeout field for better control
+
+local core = core
+local M = {}
+
+local function has_priv(name, priv)
+ local p = core.get_player_privs(name) or {}
+ return p[priv] == true
+end
+
+local function get_llm_api()
+ if not _G.llm_api then
+ error("[config_gui] llm_api not available")
+ end
+ return _G.llm_api
+end
+
+function M.show(name)
+ if not has_priv(name, "llm_root") then
+ core.chat_send_player(name, "Missing privilege: llm_root")
+ return
+ end
+
+ local llm_api = get_llm_api()
+ local cfg = llm_api.config
+
+ local W, H = 14.0, 14.5
+ local PAD = 0.3
+ local HEADER_H = 0.8
+ local FIELD_H = 0.8
+ local BTN_H = 0.9
+
+ local fs = {
+ "formspec_version[6]",
+ "size[" .. W .. "," .. H .. "]",
+ "bgcolor[#0f0f0f;both]",
+ "style_type[*;bgcolor=#1a1a1a;textcolor=#e0e0e0;font=mono]",
+ }
+
+ -- Header
+ table.insert(fs, "box[0,0;" .. W .. "," .. HEADER_H .. ";#202020]")
+ table.insert(fs, "label[" .. PAD .. "," .. (HEADER_H/2 - 0.2) .. ";LLM Configuration (llm_root only)]")
+ table.insert(fs, "label[" .. (W - 4) .. "," .. (HEADER_H/2 - 0.2) .. ";" .. os.date("%H:%M") .. "]")
+
+ local y = HEADER_H + PAD * 2
+
+ -- API Key
+ table.insert(fs, "label[" .. PAD .. "," .. y .. ";API Key:]")
+ y = y + 0.5
+ table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. (W - PAD*2) .. "," .. FIELD_H .. ";api_key;;" .. core.formspec_escape(cfg.api_key or "") .. "]")
+ table.insert(fs, "style[api_key;bgcolor=#1e1e1e]")
+ y = y + FIELD_H + PAD
+
+ -- API URL
+ table.insert(fs, "label[" .. PAD .. "," .. y .. ";API URL:]")
+ y = y + 0.5
+ table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. (W - PAD*2) .. "," .. FIELD_H .. ";api_url;;" .. core.formspec_escape(cfg.api_url or "") .. "]")
+ table.insert(fs, "style[api_url;bgcolor=#1e1e1e]")
+ y = y + FIELD_H + PAD
+
+ -- Model
+ table.insert(fs, "label[" .. PAD .. "," .. y .. ";Model:]")
+ y = y + 0.5
+ table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. (W - PAD*2) .. "," .. FIELD_H .. ";model;;" .. core.formspec_escape(cfg.model or "") .. "]")
+ table.insert(fs, "style[model;bgcolor=#1e1e1e]")
+ y = y + FIELD_H + PAD
+
+ -- Max Tokens & Temperature (side by side)
+ table.insert(fs, "label[" .. PAD .. "," .. y .. ";Max Tokens:]")
+ table.insert(fs, "label[" .. (W/2 + PAD) .. "," .. y .. ";Temperature:]")
+ y = y + 0.5
+
+ local half_w = (W - PAD*3) / 2
+ table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. half_w .. "," .. FIELD_H .. ";max_tokens;;" .. tostring(cfg.max_tokens or 4000) .. "]")
+ table.insert(fs, "style[max_tokens;bgcolor=#1e1e1e]")
+
+ table.insert(fs, "field[" .. (W/2 + PAD) .. "," .. y .. ";" .. half_w .. "," .. FIELD_H .. ";temperature;;" .. tostring(cfg.temperature or 0.7) .. "]")
+ table.insert(fs, "style[temperature;bgcolor=#1e1e1e]")
+ y = y + FIELD_H + PAD
+
+ -- Timeout field (new in v0.8.1)
+ table.insert(fs, "label[" .. PAD .. "," .. y .. ";Timeout (seconds):]")
+ y = y + 0.5
+ table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. half_w .. "," .. FIELD_H .. ";timeout;;" .. tostring(cfg.timeout or 120) .. "]")
+ table.insert(fs, "style[timeout;bgcolor=#1e1e1e]")
+ table.insert(fs, "tooltip[timeout;Global fallback timeout (30-600s). Per-mode overrides below override this.]")
+ y = y + FIELD_H + PAD
+
+ -- Per-mode timeout overrides
+ table.insert(fs, "label[" .. PAD .. "," .. y .. ";Per-mode timeout overrides (0 = use global):]")
+ y = y + 0.5
+ local third_w = (W - PAD * 2 - 0.2 * 2) / 3
+ local function tx(i) return PAD + i * (third_w + 0.2) end
+
+ table.insert(fs, "label[" .. tx(0) .. "," .. y .. ";Chat:]")
+ table.insert(fs, "label[" .. tx(1) .. "," .. y .. ";IDE:]")
+ table.insert(fs, "label[" .. tx(2) .. "," .. y .. ";WorldEdit:]")
+ y = y + 0.45
+
+ table.insert(fs, "field[" .. string.format("%.2f", tx(0)) .. "," .. y .. ";" .. string.format("%.2f", third_w) .. "," .. FIELD_H .. ";timeout_chat;;" .. tostring(cfg.timeout_chat or 0) .. "]")
+ table.insert(fs, "style[timeout_chat;bgcolor=#1e1e1e]")
+ table.insert(fs, "tooltip[timeout_chat;Chat mode timeout (0 = global)]")
+
+ table.insert(fs, "field[" .. string.format("%.2f", tx(1)) .. "," .. y .. ";" .. string.format("%.2f", third_w) .. "," .. FIELD_H .. ";timeout_ide;;" .. tostring(cfg.timeout_ide or 0) .. "]")
+ table.insert(fs, "style[timeout_ide;bgcolor=#1e1e1e]")
+ table.insert(fs, "tooltip[timeout_ide;IDE mode timeout (0 = global)]")
+
+ table.insert(fs, "field[" .. string.format("%.2f", tx(2)) .. "," .. y .. ";" .. string.format("%.2f", third_w) .. "," .. FIELD_H .. ";timeout_we;;" .. tostring(cfg.timeout_we or 0) .. "]")
+ table.insert(fs, "style[timeout_we;bgcolor=#1e1e1e]")
+ table.insert(fs, "tooltip[timeout_we;WorldEdit mode timeout (0 = global)]")
+ y = y + FIELD_H + PAD * 2
+
+ -- WEA toggle + separator
+ table.insert(fs, "box[" .. PAD .. "," .. y .. ";" .. (W - PAD*2) .. ",0.02;#333333]")
+ y = y + 0.18
+ local wea_val = core.settings:get_bool("llm_worldedit_additions", true)
+ local wea_label = "Enable WorldEditAdditions tools (torus, ellipsoid, erode, convolve...)"
+ local wea_is_installed = type(worldeditadditions) == "table"
+ if not wea_is_installed then
+ wea_label = wea_label .. " [WEA mod not detected]"
+ end
+ table.insert(fs, "checkbox[" .. PAD .. "," .. y .. ";wea_enabled;" .. core.formspec_escape(wea_label) .. ";" .. (wea_val and "true" or "false") .. "]")
+ y = y + 0.55 + PAD
+
+ -- Four buttons evenly spaced: Save, Reload, Test, Close
+ local btn_count = 4
+ local btn_spacing = 0.2
+ local btn_w = (W - PAD * 2 - btn_spacing * (btn_count - 1)) / btn_count
+ local function bx(i) return PAD + i * (btn_w + btn_spacing) end
+
+ table.insert(fs, "button[" .. string.format("%.2f", bx(0)) .. "," .. y .. ";" .. string.format("%.2f", btn_w) .. "," .. BTN_H .. ";save;Save Config]")
+ table.insert(fs, "button[" .. string.format("%.2f", bx(1)) .. "," .. y .. ";" .. string.format("%.2f", btn_w) .. "," .. BTN_H .. ";reload;Reload]")
+ table.insert(fs, "button[" .. string.format("%.2f", bx(2)) .. "," .. y .. ";" .. string.format("%.2f", btn_w) .. "," .. BTN_H .. ";test;Test Connection]")
+ table.insert(fs, "style[close;bgcolor=#3a1a1a;textcolor=#ffaaaa]")
+ table.insert(fs, "button[" .. string.format("%.2f", bx(3)) .. "," .. y .. ";" .. string.format("%.2f", btn_w) .. "," .. BTN_H .. ";close;โ Close]")
+ y = y + BTN_H + PAD
+
+ -- Info label
+ table.insert(fs, "label[" .. PAD .. "," .. y .. ";Note: Runtime changes. Edit minetest.conf for persistence.]")
+
+ core.show_formspec(name, "llm_connect:config", table.concat(fs))
+end
+
+function M.handle_fields(name, formname, fields)
+ if not formname:match("^llm_connect:config") then
+ return false
+ end
+
+ if not has_priv(name, "llm_root") then
+ return true
+ end
+
+ local llm_api = get_llm_api()
+
+ -- WEA checkbox: instant toggle (no Save needed)
+ if fields.wea_enabled ~= nil then
+ local val = fields.wea_enabled == "true"
+ core.settings:set_bool("llm_worldedit_additions", val)
+ core.chat_send_player(name, "[LLM] WorldEditAdditions tools: " .. (val and "enabled" or "disabled"))
+ M.show(name)
+ return true
+ end
+
+ if fields.save then
+ -- Validation
+ local max_tokens = tonumber(fields.max_tokens)
+ local temperature = tonumber(fields.temperature)
+ local timeout = tonumber(fields.timeout)
+
+ if not max_tokens or max_tokens < 1 or max_tokens > 100000 then
+ core.chat_send_player(name, "[LLM] Error: max_tokens must be between 1 and 100000")
+ return true
+ end
+
+ if not temperature or temperature < 0 or temperature > 2 then
+ core.chat_send_player(name, "[LLM] Error: temperature must be between 0 and 2")
+ return true
+ end
+
+ if not timeout or timeout < 30 or timeout > 600 then
+ core.chat_send_player(name, "[LLM] Error: timeout must be between 30 and 600 seconds")
+ return true
+ end
+
+ local timeout_chat = tonumber(fields.timeout_chat) or 0
+ local timeout_ide = tonumber(fields.timeout_ide) or 0
+ local timeout_we = tonumber(fields.timeout_we) or 0
+
+ for _, t in ipairs({timeout_chat, timeout_ide, timeout_we}) do
+ if t ~= 0 and (t < 30 or t > 600) then
+ core.chat_send_player(name, "[LLM] Error: per-mode timeouts must be 0 or between 30-600")
+ return true
+ end
+ end
+
+ llm_api.set_config({
+ api_key = fields.api_key or "",
+ api_url = fields.api_url or "",
+ model = fields.model or "",
+ max_tokens = max_tokens,
+ temperature = temperature,
+ timeout = timeout,
+ timeout_chat = timeout_chat,
+ timeout_ide = timeout_ide,
+ timeout_we = timeout_we,
+ })
+
+ core.chat_send_player(name, "[LLM] Configuration updated (runtime only)")
+ core.log("action", "[llm_connect] Config updated by " .. name)
+ M.show(name)
+ return true
+
+ elseif fields.reload then
+ llm_api.reload_config()
+ core.chat_send_player(name, "[LLM] Configuration reloaded from settings")
+ core.log("action", "[llm_connect] Config reloaded by " .. name)
+ M.show(name)
+ return true
+
+ elseif fields.test then
+ -- Test LLM connection with a simple request
+ core.chat_send_player(name, "[LLM] Testing connection...")
+
+ local messages = {
+ {role = "user", content = "Reply with just the word 'OK' if you can read this."}
+ }
+
+ llm_api.request(messages, function(result)
+ if result.success then
+ core.chat_send_player(name, "[LLM] โ Connection test successful!")
+ core.chat_send_player(name, "[LLM] Response: " .. (result.content or "No content"))
+ else
+ core.chat_send_player(name, "[LLM] โ Connection test failed!")
+ core.chat_send_player(name, "[LLM] Error: " .. (result.error or "Unknown error"))
+ end
+ end, {timeout = 30})
+
+ return true
+
+ elseif fields.close or fields.quit then
+ -- Back to the chat GUI
+ if _G.chat_gui then
+ _G.chat_gui.show(name)
+ else
+ core.close_formspec(name, "llm_connect:config")
+ end
+ return true
+ end
+
+ return true
+end
+
+return M
diff --git a/ide_gui.lua b/ide_gui.lua
new file mode 100644
index 0000000..6d55795
--- /dev/null
+++ b/ide_gui.lua
@@ -0,0 +1,671 @@
+-- ide_gui.lua
+-- Smart Lua IDE interface for LLM-Connect
+-- v0.9.0: File manager with dropdown, save/load from dedicated snippets folder
+
+local core = core
+local M = {}
+
+-- ======================================================
+-- File Storage
+-- ======================================================
+
+-- Resolve paths at load time (like sethome/init.lua does) - NOT lazily at runtime.
+-- Under mod security, io.open works reliably when called with paths
+-- resolved during the mod loading phase.
+local SNIPPETS_DIR = (core.get_worldpath or minetest.get_worldpath)() .. "/" .. "llm_snippets"
+local MKDIR_FN = core.mkdir or minetest.mkdir
+
+-- Create snippets dir immediately at load time
+if MKDIR_FN then
+ MKDIR_FN(SNIPPETS_DIR)
+else
+ core.log("warning", "[ide_gui] mkdir not available โ snippets dir may not exist")
+end
+
+core.log("action", "[ide_gui] snippets dir: " .. SNIPPETS_DIR)
+
+local function get_snippets_dir()
+ return SNIPPETS_DIR
+end
+
+local function ensure_snippets_dir()
+ -- Dir was already created at load time; this is now a no-op that just returns the path
+ return SNIPPETS_DIR
+end
+
+-- Index file tracks all saved snippets (avoids core.get_dir_list which is unreliable under mod security)
+local INDEX_PATH = SNIPPETS_DIR .. "/_index.txt"
+
+local function get_index_path()
+ return INDEX_PATH
+end
+
+local function read_index()
+ local path = get_index_path()
+ local f = io.open(path, "r")
+ if not f then return {} end
+ local files = {}
+ for line in f:lines() do
+ line = line:match("^%s*(.-)%s*$")
+ if line ~= "" then
+ table.insert(files, line)
+ end
+ end
+ f:close()
+ table.sort(files)
+ return files
+end
+
+local function write_index(files)
+ local path = get_index_path()
+ local sorted = {}
+ for _, v in ipairs(files) do table.insert(sorted, v) end
+ table.sort(sorted)
+ -- deduplicate
+ local seen = {}
+ local deduped = {}
+ for _, v in ipairs(sorted) do
+ if not seen[v] then seen[v] = true; table.insert(deduped, v) end
+ end
+ local ok = core.safe_file_write(path, table.concat(deduped, "\n"))
+ return ok
+end
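+
+-- On-disk index format (for reference): plain text, one snippet filename per
+-- line, kept sorted and deduplicated by write_index. Example _index.txt:
+--
+--   colorstones.lua
+--   untitled.lua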
+
+local function index_add(filename)
+ local files = read_index()
+ local exists = false
+ for _, v in ipairs(files) do
+ if v == filename then exists = true; break end
+ end
+ if not exists then
+ table.insert(files, filename)
+ write_index(files)
+ end
+end
+
+local function index_remove(filename)
+ local files = read_index()
+ local new = {}
+ for _, v in ipairs(files) do
+ if v ~= filename then table.insert(new, v) end
+ end
+ write_index(new)
+end
+
+-- One-time migration: if index is empty, probe known filenames via io.open
+-- and rebuild the index from whatever is actually on disk.
+-- Luanti doesn't give us reliable directory listing under mod security,
+-- so we use a best-effort scan of any names we can discover.
+local migration_done = false
+local function maybe_migrate()
+ if migration_done then return end
+ migration_done = true
+ local idx = read_index()
+ if #idx > 0 then return end -- index already populated, nothing to do
+
+ -- We can't list the directory, but we can check for files the user
+ -- might have saved under common names in older versions.
+ local dir = ensure_snippets_dir()
+ local candidates = {"untitled.lua", "colorstones.lua", "test.lua", "init.lua", "startup.lua"}
+ local found = {}
+ for _, name in ipairs(candidates) do
+ local f = io.open(dir .. "/" .. name, "r")
+ if f then f:close(); table.insert(found, name) end
+ end
+ if #found > 0 then
+ write_index(found)
+ core.log("action", "[ide_gui] Migration: added " .. #found .. " existing snippets to index")
+ end
+end
+
+-- Public: returns sorted list of snippet filenames
+local function list_snippet_files()
+ maybe_migrate()
+ return read_index()
+end
+
+local function read_file(filepath)
+ local f, err = io.open(filepath, "r")
+ if not f then
+ core.log("warning", "[ide_gui] read_file failed: " .. tostring(filepath) .. " โ " .. tostring(err))
+ return nil, err
+ end
+ local content = f:read("*a")
+ f:close()
+ return content
+end
+
+local function write_file(filepath, content)
+ -- core.safe_file_write does atomic write, preferred for snippets
+ local ok = core.safe_file_write(filepath, content)
+ if not ok then
+ -- fallback to io.open
+ local f, err = io.open(filepath, "w")
+ if not f then return false, err end
+ f:write(content)
+ f:close()
+ end
+ return true
+end
+
+-- ======================================================
+-- Module helpers
+-- ======================================================
+
+local function get_executor()
+ if not _G.executor then
+ error("[ide_gui] executor not available - init.lua failed?")
+ end
+ return _G.executor
+end
+
+local function get_llm_api()
+ if not _G.llm_api then
+ error("[ide_gui] llm_api not available - init.lua failed?")
+ end
+ return _G.llm_api
+end
+
+local prompts
+local function get_prompts()
+ if not prompts then
+ local ok, p = pcall(dofile, core.get_modpath("llm_connect") .. "/ide_system_prompts.lua")
+ if not ok then
+ core.log("error", "[ide_gui] Failed to load prompts: " .. tostring(p))
+ prompts = {
+ SYNTAX_FIXER = "Fix syntax errors in this Lua/Minetest code. Return raw Lua only.",
+ SEMANTIC_ANALYZER = "Analyze this Minetest Lua code for logic errors.",
+ CODE_EXPLAINER = "Explain this Minetest Lua code simply.",
+ CODE_GENERATOR = "Generate clean Minetest Lua code based on the user request.",
+ }
+ else
+ prompts = p
+ end
+ end
+ return prompts
+end
+
+-- Session data per player
+local sessions = {}
+
+local DEFAULT_CODE = [[-- Welcome to Smart Lua IDE!
+-- Write your Luanti mod code here.
+
+core.register_node("example:test_node", {
+ description = "Test Node",
+ tiles = {"default_stone.png"},
+ groups = {cracky = 3},
+})
+]]
+
+local function get_session(name)
+ if not sessions[name] then
+ sessions[name] = {
+ code = DEFAULT_CODE,
+ output = "Ready!\nUse the toolbar buttons or type a prompt and click Generate.",
+ guiding_active = false, -- Naming guide toggle (off by default)
+ filename = "untitled.lua",
+ pending_proposal = nil,
+ last_prompt = "",
+ last_modified = os.time(),
+ file_list = {},
+ selected_file = "",
+ }
+ sessions[name].file_list = list_snippet_files()
+ end
+ return sessions[name]
+end
+
+local function has_priv(name, priv)
+ local p = core.get_player_privs(name) or {}
+ return p[priv] == true
+end
+
+local function can_use_ide(name)
+ return has_priv(name, "llm_ide") or has_priv(name, "llm_dev") or has_priv(name, "llm_root")
+end
+
+local function can_execute(name)
+ return has_priv(name, "llm_dev") or has_priv(name, "llm_root")
+end
+
+local function is_root(name)
+ return has_priv(name, "llm_root")
+end
+
+-- ======================================================
+-- Main Formspec
+-- ======================================================
+
+function M.show(name)
+ if not can_use_ide(name) then
+ core.chat_send_player(name, "Missing privilege: llm_ide (or higher)")
+ return
+ end
+
+ local session = get_session(name)
+ -- Refresh file list on every render
+ session.file_list = list_snippet_files()
+
+ local code_esc = core.formspec_escape(session.code or "")
+ local output_esc = core.formspec_escape(session.output or "")
+ local fn_esc = core.formspec_escape(session.filename or "untitled.lua")
+ local prompt_esc = core.formspec_escape(session.last_prompt or "")
+
+ local W, H = 19.2, 13.0
+ local PAD = 0.2
+ local HEADER_H = 0.8
+ local TOOL_H = 0.9
+ local FILE_H = 0.9
+ local PROMPT_H = 0.8
+ local STATUS_H = 0.6
+
+ local tool_y = HEADER_H + PAD
+ local file_y = tool_y + TOOL_H + PAD
+ local prompt_y = file_y + FILE_H + PAD
+ local work_y = prompt_y + PROMPT_H + PAD
+ local work_h = H - work_y - STATUS_H - PAD * 2
+ local col_w = (W - PAD * 3) / 2
+
+ local fs = {
+ "formspec_version[6]",
+ "size[" .. W .. "," .. H .. "]",
+ "bgcolor[#0f0f0f;both]",
+ "style_type[*;bgcolor=#1a1a1a;textcolor=#e8e8e8;font=mono]",
+ }
+
+ -- ── Header ──────────────────────────────────────────────
+ table.insert(fs, "box[0,0;" .. W .. "," .. HEADER_H .. ";#1e1e1e]")
+ table.insert(fs, "label[" .. PAD .. "," .. (HEADER_H/2 - 0.15) .. ";Smart Lua IDE | " .. fn_esc .. "]")
+ table.insert(fs, "label[" .. (W - 6.2) .. "," .. (HEADER_H/2 - 0.15) .. ";" .. os.date("%H:%M") .. "]")
+ table.insert(fs, "style[close_ide;bgcolor=#3a1a1a;textcolor=#ffaaaa]")
+ table.insert(fs, "button[" .. (W - PAD - 2.0) .. ",0.08;2.0,0.65;close_ide;x Close]")
+
+ -- ── Toolbar ─────────────────────────────────────────────
+ local bw = 1.85
+ local bp = 0.12
+ local bh = TOOL_H - 0.05
+ local x = PAD
+
+ local function add_btn(id, label, tip, enabled)
+ if not enabled then
+ table.insert(fs, "style[" .. id .. ";bgcolor=#444444;textcolor=#888888]")
+ end
+ table.insert(fs, "button[" .. x .. "," .. tool_y .. ";" .. bw .. "," .. bh .. ";" .. id .. ";" .. label .. "]")
+ if tip then table.insert(fs, "tooltip[" .. id .. ";" .. tip .. "]") end
+ x = x + bw + bp
+ end
+
+ add_btn("syntax", "Syntax", "Local syntax check + AI fix if errors found", true)
+ add_btn("analyze", "Analyze", "AI: find logic & API issues", true)
+ add_btn("explain", "Explain", "AI: explain the code in plain language", true)
+ add_btn("run", "โถ Run", can_execute(name) and "Execute in sandbox" or "Execute (needs llm_dev)", can_execute(name))
+
+ if session.pending_proposal then
+ table.insert(fs, "style[apply;bgcolor=#2a6a2a;textcolor=#ffffff]")
+ add_btn("apply", "โ Apply", "Apply AI proposal into editor", true)
+ else
+ add_btn("apply", "Apply", "No pending proposal yet", false)
+ end
+
+ -- ── File Manager Row ────────────────────────────────────
+ -- Layout: [Dropdown (files)] [Load] [Filename field] [Save] [New]
+ local files = session.file_list
+ local dd_str = #files > 0 and table.concat(files, ",") or "(no files)"
+ -- Find index for pre-selection
+ local dd_idx = 1
+ if session.selected_file ~= "" then
+ for i, f in ipairs(files) do
+ if f == session.selected_file then dd_idx = i; break end
+ end
+ end
+
+ local DD_W = 4.5
+ local BTN_SM = 1.4
+ local FN_W = W - PAD * 6 - DD_W - BTN_SM * 3
+ local fbh = FILE_H - 0.05
+
+ -- Dropdown
+ table.insert(fs, "dropdown[" .. PAD .. "," .. file_y .. ";" .. DD_W .. "," .. fbh
+ .. ";file_dropdown;" .. dd_str .. ";" .. dd_idx .. ";false]")
+ table.insert(fs, "tooltip[file_dropdown;Select a saved snippet]")
+
+ local fx = PAD + DD_W + PAD
+
+ -- Load button
+ table.insert(fs, "button[" .. fx .. "," .. file_y .. ";" .. BTN_SM .. "," .. fbh .. ";file_load;Load]")
+ table.insert(fs, "tooltip[file_load;Load selected file into editor]")
+ fx = fx + BTN_SM + PAD
+
+ -- Filename input
+ table.insert(fs, "field[" .. fx .. "," .. file_y .. ";" .. FN_W .. "," .. fbh .. ";filename_input;;" .. fn_esc .. "]")
+ table.insert(fs, "field_close_on_enter[filename_input;false]")
+ table.insert(fs, "style[filename_input;bgcolor=#1e1e1e;textcolor=#e8e8e8]")
+ table.insert(fs, "tooltip[filename_input;Filename to save as (auto-appends .lua)]")
+ fx = fx + FN_W + PAD
+
+ -- Save / New (root only)
+ if is_root(name) then
+ table.insert(fs, "style[file_save;bgcolor=#2a4a6a;textcolor=#ffffff]")
+ table.insert(fs, "button[" .. fx .. "," .. file_y .. ";" .. BTN_SM .. "," .. fbh .. ";file_save;Save]")
+ table.insert(fs, "tooltip[file_save;Save editor content as the given filename]")
+ fx = fx + BTN_SM + PAD
+ table.insert(fs, "button[" .. fx .. "," .. file_y .. ";" .. BTN_SM .. "," .. fbh .. ";file_new;New]")
+ table.insert(fs, "tooltip[file_new;Clear editor for a new file]")
+ end
+
+ -- ── Prompt Row ──────────────────────────────────────────
+ -- Layout: [Prompt field ............] [โ Guide] [Generate]
+ local gen_w = 2.2
+ local guide_w = 3.2 -- checkbox + label
+ local pr_w = W - PAD * 4 - guide_w - gen_w
+
+ table.insert(fs, "field[" .. PAD .. "," .. prompt_y .. ";" .. pr_w .. "," .. PROMPT_H
+ .. ";prompt_input;;" .. prompt_esc .. "]")
+ table.insert(fs, "field_close_on_enter[prompt_input;false]")
+ table.insert(fs, "style[prompt_input;bgcolor=#1e1e1e;textcolor=#e8e8e8]")
+ table.insert(fs, "tooltip[prompt_input;Describe what code to generate, then click Generate]")
+
+ -- Naming guide toggle checkbox
+ local guide_on = session.guiding_active == true
+ local cx = PAD + pr_w + PAD
+ local guide_color = guide_on and "#1a3a1a" or "#252525"
+ table.insert(fs, "style[guide_toggle;bgcolor=" .. guide_color .. ";textcolor=#aaffaa]")
+ table.insert(fs, "checkbox[" .. cx .. "," .. (prompt_y + 0.15) .. ";guide_toggle;llm_connect: guide;" .. (guide_on and "true" or "false") .. "]")
+ table.insert(fs, "tooltip[guide_toggle;Inject naming convention guide into Generate calls.\nTeaches the LLM to use the llm_connect: prefix for registrations.]")
+
+ local gx = cx + guide_w + PAD
+ if can_execute(name) then
+ table.insert(fs, "style[generate;bgcolor=#2a4a6a;textcolor=#ffffff]")
+ else
+ table.insert(fs, "style[generate;bgcolor=#444444;textcolor=#888888]")
+ end
+ table.insert(fs, "button[" .. gx .. "," .. prompt_y .. ";" .. gen_w .. "," .. PROMPT_H
+ .. ";generate;Generate]")
+ table.insert(fs, "tooltip[generate;"
+ .. (can_execute(name) and "AI: generate code from your prompt" or "Generate (needs llm_dev)")
+ .. "]")
+
+ -- ── Editor & Output ─────────────────────────────────────
+ table.insert(fs, "style[code;bgcolor=#1e1e1e;textcolor=#e8e8e8;border=true]")
+ table.insert(fs, "textarea[" .. PAD .. "," .. work_y .. ";" .. (col_w - PAD) .. "," .. work_h
+ .. ";code;;" .. code_esc .. "]")
+
+ table.insert(fs, "style[output;bgcolor=#181818;textcolor=#cccccc;border=true]")
+ table.insert(fs, "textarea[" .. (PAD + col_w + PAD) .. "," .. work_y .. ";" .. (col_w - PAD) .. "," .. work_h
+ .. ";output;;" .. output_esc .. "]")
+
+ -- ── Status Bar ──────────────────────────────────────────
+ local sy = H - STATUS_H - PAD
+ table.insert(fs, "box[0," .. sy .. ";" .. W .. "," .. STATUS_H .. ";#1e1e1e]")
+ local status = "File: " .. fn_esc .. " | Modified: " .. os.date("%H:%M", session.last_modified)
+ if session.pending_proposal then
+ status = status .. " | ✅ PROPOSAL READY → click Apply"
+ end
+ table.insert(fs, "label[" .. PAD .. "," .. (sy + 0.22) .. ";" .. status .. "]")
+
+ core.show_formspec(name, "llm_connect:ide", table.concat(fs))
+end
+
+-- ======================================================
+-- Formspec Handler
+-- ======================================================
+
+function M.handle_fields(name, formname, fields)
+ if not formname:match("^llm_connect:ide") then return false end
+ if not can_use_ide(name) then return true end
+
+ local session = get_session(name)
+ local updated = false
+
+ -- Capture live editor/field state
+ if fields.code then session.code = fields.code; session.last_modified = os.time() end
+ if fields.prompt_input then session.last_prompt = fields.prompt_input end
+ if fields.guide_toggle ~= nil then
+ session.guiding_active = (fields.guide_toggle == "true")
+ M.show(name)
+ return true
+ end
+ if fields.filename_input and fields.filename_input ~= "" then
+ local fn = fields.filename_input:match("^%s*(.-)%s*$")
+ if fn ~= "" then
+ if not fn:match("%.lua$") then fn = fn .. ".lua" end
+ session.filename = fn
+ end
+ end
+
+ -- Dropdown: track selection
+ if fields.file_dropdown then
+ local val = fields.file_dropdown
+ if val ~= "(no files)" and val ~= "" then
+ -- index_event=false โ val is the filename directly
+ -- Fallback: if val is a number string, resolve via file_list index
+ local as_num = tonumber(val)
+ if as_num and session.file_list and session.file_list[as_num] then
+ val = session.file_list[as_num]
+ end
+ session.selected_file = val
+ end
+ updated = true
+ end
+
+ -- ── File operations ──────────────────────────────────────
+ if fields.file_load then
+ local target = session.selected_file
+ if target == "" or target == "(no files)" then
+ session.output = "Please select a file in the dropdown first."
+ else
+ local path = ensure_snippets_dir() .. DIR_DELIM .. target
+ local content, read_err = read_file(path)
+ if content then
+ session.code = content
+ session.filename = target
+ session.last_modified = os.time()
+ session.output = "✓ Loaded: " .. target
+ else
+ session.output = "✗ Could not read: " .. target
+ .. "\nPath: " .. path
+ .. (read_err and ("\nError: " .. tostring(read_err)) or "")
+ -- Remove from index if file is gone
+ index_remove(target)
+ session.file_list = list_snippet_files()
+ end
+ end
+ updated = true
+
+ elseif fields.file_save and is_root(name) then
+ local fn = session.filename
+ if fn == "" then fn = "untitled.lua" end
+ if not fn:match("%.lua$") then fn = fn .. ".lua" end
+ fn = fn:match("([^/\\]+)$") or fn -- prevent path traversal
+ session.filename = fn
+
+ local path = ensure_snippets_dir() .. DIR_DELIM .. fn
+ local ok, err = write_file(path, session.code)
+ if ok then
+ index_add(fn)
+ session.output = "✓ Saved: " .. fn
+ session.last_modified = os.time()
+ session.file_list = list_snippet_files()
+ session.selected_file = fn
+ else
+ session.output = "✗ Save failed: " .. tostring(err)
+ end
+ updated = true
+
+ elseif fields.file_new and is_root(name) then
+ session.code = DEFAULT_CODE
+ session.filename = "untitled.lua"
+ session.last_modified = os.time()
+ session.pending_proposal = nil
+ session.output = "New file ready. Write code and save."
+ updated = true
+
+ -- ── Toolbar actions ──────────────────────────────────────
+ elseif fields.syntax then
+ M.check_syntax(name); return true
+
+ elseif fields.analyze then
+ M.analyze_code(name); return true
+
+ elseif fields.explain then
+ M.explain_code(name); return true
+
+ elseif fields.generate and can_execute(name) then
+ M.generate_code(name); return true
+
+ elseif fields.run and can_execute(name) then
+ M.run_code(name); return true
+
+ elseif fields.apply then
+ if session.pending_proposal then
+ session.code = session.pending_proposal
+ session.pending_proposal = nil
+ session.last_modified = os.time()
+ session.output = "✓ Applied proposal to editor."
+ else
+ session.output = "No pending proposal to apply."
+ end
+ updated = true
+
+ elseif fields.close_ide or fields.quit then
+ if _G.chat_gui then _G.chat_gui.show(name) end
+ return true
+ end
+
+ if updated then M.show(name) end
+ return true
+end
+
+-- ======================================================
+-- Actions (AI)
+-- ======================================================
+
+function M.check_syntax(name)
+ local session = get_session(name)
+ local func, err = loadstring(session.code)
+ if func then
+ session.output = "✓ Syntax OK: no errors found."
+ M.show(name)
+ return
+ end
+
+ session.output = "✗ Syntax error:\n" .. tostring(err) .. "\n\nAsking AI to fix…"
+ M.show(name)
+
+ local p = get_prompts()
+ get_llm_api().code(p.SYNTAX_FIXER, session.code, function(result)
+ if result.success then
+ local fixed = result.content
+ fixed = fixed:match("```lua\n(.-)```") or fixed:match("```\n(.-)```") or fixed
+ session.pending_proposal = fixed
+ session.output = "AI fix proposal:\n\n" .. fixed .. "\n\n→ Press [Apply] to use."
+ else
+ session.output = "Syntax error:\n" .. tostring(err)
+ .. "\n\nAI fix failed: " .. (result.error or "?")
+ end
+ M.show(name)
+ end)
+end
+
+function M.analyze_code(name)
+ local session = get_session(name)
+ session.output = "Analyzing code… (please wait)"
+ M.show(name)
+
+ local p = get_prompts()
+ get_llm_api().code(p.SEMANTIC_ANALYZER, session.code, function(result)
+ if result.success then
+ local content = result.content
+ local code_part = content:match("```lua\n(.-)```") or content:match("```\n(.-)```")
+ local analysis = content:match("%-%-%[%[(.-)%]%]") or content
+ if code_part then
+ session.pending_proposal = code_part
+ session.output = "Analysis:\n" .. analysis .. "\n\n→ Improved code ready. Press [Apply]."
+ else
+ session.output = "Analysis:\n" .. content
+ end
+ else
+ session.output = "Error: " .. (result.error or "No response")
+ end
+ M.show(name)
+ end)
+end
+
+function M.explain_code(name)
+ local session = get_session(name)
+ session.output = "Explaining code… (please wait)"
+ M.show(name)
+
+ local p = get_prompts()
+ get_llm_api().code(p.CODE_EXPLAINER, session.code, function(result)
+ session.output = result.success and result.content or ("Error: " .. (result.error or "?"))
+ M.show(name)
+ end)
+end
+
+function M.generate_code(name)
+ local session = get_session(name)
+ local user_req = (session.last_prompt or ""):match("^%s*(.-)%s*$")
+
+ if user_req == "" then
+ session.output = "Please enter a prompt in the field above first."
+ M.show(name)
+ return
+ end
+
+ session.output = "Generating code… (please wait)"
+ M.show(name)
+
+ local p = get_prompts()
+
+ -- Append naming guide if toggle is active in session
+ local guide_addendum = ""
+ if session.guiding_active and p.NAMING_GUIDE then
+ guide_addendum = p.NAMING_GUIDE
+ end
+
+ local sys_msg = p.CODE_GENERATOR .. guide_addendum .. "\n\nUser request: " .. user_req
+ get_llm_api().code(sys_msg, session.code, function(result)
+ if result.success and result.content then
+ local gen = result.content
+ gen = gen:match("```lua\n(.-)```") or gen:match("```\n(.-)```") or gen
+ session.pending_proposal = gen
+ session.output = "Generated code proposal:\n\n" .. gen
+ .. "\n\nโ Press [Apply] to insert into editor."
+ else
+ session.output = "Generation failed: " .. (result.error or "No response")
+ end
+ M.show(name)
+ end)
+end
+
+function M.run_code(name)
+ local session = get_session(name)
+ local executor = get_executor()
+
+ session.output = "Executing… (please wait)"
+ M.show(name)
+
+ local res = executor.execute(name, session.code, {sandbox = true})
+ if res.success then
+ local out = "✓ Execution successful.\n\nOutput:\n"
+ .. (res.output ~= "" and res.output or "(no output)")
+ if res.return_value then out = out .. "\n\nReturn: " .. tostring(res.return_value) end
+ if res.persisted then out = out .. "\n\n⚠ Startup file updated (restart needed)" end
+ session.output = out
+ else
+ session.output = "✗ Execution failed:\n" .. (res.error or "Unknown error")
+ if res.output and res.output ~= "" then
+ session.output = session.output .. "\n\nOutput before error:\n" .. res.output
+ end
+ end
+ M.show(name)
+end
+
+-- Cleanup
+core.register_on_leaveplayer(function(player)
+ sessions[player:get_player_name()] = nil
+end)
+
+return M
diff --git a/ide_languages.lua b/ide_languages.lua
new file mode 100644
index 0000000..aa2da21
--- /dev/null
+++ b/ide_languages.lua
@@ -0,0 +1,40 @@
+local LANGUAGE_NAMES = {
+ en = "English",
+ de = "German",
+ es = "Spanish",
+ fr = "French",
+ it = "Italian",
+ pt = "Portuguese",
+ ru = "Russian",
+ zh = "Chinese",
+ ja = "Japanese",
+ ko = "Korean",
+ ar = "Arabic",
+ hi = "Hindi",
+ tr = "Turkish",
+ nl = "Dutch",
+ pl = "Polish",
+ sv = "Swedish",
+ da = "Danish",
+ no = "Norwegian",
+ fi = "Finnish",
+ cs = "Czech",
+ hu = "Hungarian",
+ ro = "Romanian",
+ el = "Greek",
+ th = "Thai",
+ vi = "Vietnamese",
+ id = "Indonesian",
+ ms = "Malay",
+ he = "Hebrew",
+ bn = "Bengali",
+ uk = "Ukrainian",
+}
+
+local function get_language_name(code)
+ return LANGUAGE_NAMES[code] or "English"
+end
+
+return {
+ get_language_name = get_language_name
+}
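+
+-- Example usage (hypothetical caller; assumes this file is loaded from the
+-- mod directory the same way the other modules are loaded in init.lua):
+--   local langs = dofile(core.get_modpath("llm_connect") .. "/ide_languages.lua")
+--   langs.get_language_name("de")  --> "German"
+--   langs.get_language_name("xx")  --> "English" (fallback for unknown codes)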
diff --git a/ide_system_prompts.lua b/ide_system_prompts.lua
new file mode 100644
index 0000000..2c854b7
--- /dev/null
+++ b/ide_system_prompts.lua
@@ -0,0 +1,120 @@
+-- ide_system_prompts.lua
+-- System prompts for different AI assistant modes
+
+local prompts = {}
+
+prompts.SYNTAX_FIXER = [[You are a Lua syntax corrector specialized in Minetest/Luanti mod development.
+
+Your task: Fix ONLY syntax errors in the provided code.
+
+Rules:
+1. Return ONLY the corrected Lua code
+2. NO explanations, NO markdown blocks, NO comments
+3. Preserve the original logic and structure
+4. Fix: missing 'end', unmatched parentheses, typos in keywords, etc.
+5. Do NOT refactor or optimize - only fix syntax
+6. Do NOT add any filesystem/network/system access
+
+Output format: Raw Lua code only.]]
+
+-- Level-2 long brackets ([==[ ... ]==]) so the example "]]" can appear
+-- inside the prompt without terminating the string early.
+prompts.SEMANTIC_ANALYZER = [==[You are a Lua code analyzer for Minetest/Luanti mods.
+
+Your task: Analyze code for logic errors, API misuse, and improvements.
+
+Context:
+- Minetest Lua API version 5.x
+- Common APIs: core.register_node, core.register_tool, core.register_chatcommand
+- Deprecated functions should be flagged
+
+Security rules:
+- Do NOT introduce os/io/debug/require/dofile/loadfile/package
+- Do NOT introduce core.request_http_api or core.request_insecure_environment
+
+Output format:
+1. First, provide the CORRECTED CODE
+2. Then, add a comment block explaining:
+ - What was wrong
+ - What was changed
+ - Why it matters
+
+Example format:
+-- [CORRECTED CODE HERE]
+--[[ ANALYSIS:
+- ...
+]]
+]==]
+
+prompts.CODE_EXPLAINER = [[You are a Minetest/Luanti mod development tutor.
+
+Your task: Explain the provided Lua code in simple terms.
+
+Focus on:
+1. What the code does (high-level)
+2. Key Minetest API calls and their purpose
+3. Potential issues or improvements
+4. Best practices being followed/violated
+
+Be concise but educational.]]
+
+prompts.CODE_GENERATOR = [[You are a Minetest/Luanti mod code generator.
+
+Your task: Generate clean, functional Lua code based on the user's request.
+
+Requirements:
+1. Use modern Minetest API (5.x+)
+2. Include error handling where appropriate
+3. Add brief inline comments for complex logic
+4. Follow Minetest coding conventions
+5. Return ONLY executable Lua code
+
+Security requirements (important):
+- Do NOT use os/io/debug/package/require/dofile/loadfile
+- Do NOT use core.request_http_api or core.request_insecure_environment
+- Avoid privilege/auth manipulation APIs
+
+Output: Raw Lua code ready to execute.]]
+
+prompts.REFACTORER = [[You are a code refactoring expert for Minetest/Luanti mods.
+
+Your task: Improve code quality without changing functionality.
+
+Improvements:
+1. Better variable names
+2. Extract repeated code into functions
+3. Optimize performance (e.g., caching, avoiding repeated lookups)
+4. Improve readability and structure
+5. Add helpful comments
+
+Security requirements:
+- Do NOT add os/io/debug/package/require/dofile/loadfile
+- Do NOT add core.request_http_api or core.request_insecure_environment
+
+Output:
+1. Refactored code
+2. Brief comment explaining major changes]]
+
+-- ============================================================
+-- Naming Convention Guide (opt-in, injected when guide_toggle is active)
+-- Appended to CODE_GENERATOR when llm_connect: prefix guide is enabled.
+-- ============================================================
+
+prompts.NAMING_GUIDE = [[
+
+IMPORTANT – Luanti/Minetest Naming Conventions for this IDE:
+This code runs inside the "llm_connect" mod context.
+
+REGISTRATIONS – always use "llm_connect:" prefix:
+ Correct: core.register_node("llm_connect:my_stone", { ... })
+ Correct: core.register_craftitem("llm_connect:magic_dust", { ... })
+ Incorrect: core.register_node("mymod:my_stone", { ... }) -- fails
+ Incorrect: core.register_node("default:my_stone", { ... }) -- fails
+
+LUA STDLIB – Luanti runs LuaJIT (Lua 5.1):
+ No string:capitalize() → use: (str:sub(1,1):upper() .. str:sub(2))
+ string.split(str, delim) → available as a Luanti builtin helper (not plain Lua)
+
+READING other mods is always fine:
+ core.get_node(pos) -- ok
+ core.registered_nodes["default:stone"] -- ok
+]]
+
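+-- How the guide is consumed (sketch; mirrors generate_code in ide_gui.lua,
+-- where the guide is appended only when the player's toggle is active):
+--   local sys_msg = prompts.CODE_GENERATOR
+--   if session.guiding_active then
+--       sys_msg = sys_msg .. prompts.NAMING_GUIDE
+--   end
+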
+return prompts
diff --git a/init.lua b/init.lua
index 51a345a..87b55ab 100644
--- a/init.lua
+++ b/init.lua
@@ -1,422 +1,226 @@
-- ===========================================================================
--- LLM Connect Init v0.7.8
+-- LLM Connect Init v0.9.0-dev
-- author: H5N3RG
-- license: LGPL-3.0-or-later
--- Fix: max_tokens type handling, fully configurable, robust JSON
--- Added: metadata for ingame-commads
--- Enhancement: Dynamic metadata handling, player name in prompts
--- NEW: Automatic API endpoint completion for compatibility.
--- UPDATE: Configurable context sending
-- ===========================================================================
local core = core
+local mod_dir = core.get_modpath("llm_connect")
--- Load HTTP API
+-- === HTTP API ===
local http = core.request_http_api()
if not http then
core.log("error", "[llm_connect] HTTP API not available! Add 'llm_connect' to secure.http_mods in minetest.conf!")
return
end
--- === Load settings from menu / settingtypes.txt ===
-local api_key = core.settings:get("llm_api_key") or ""
-local api_url = core.settings:get("llm_api_url") or ""
-local model_name = core.settings:get("llm_model") or ""
-
--- NEW Context Settings
-local send_server_info = core.settings:get_bool("llm_context_send_server_info")
-local send_mod_list = core.settings:get_bool("llm_context_send_mod_list")
-local send_commands = core.settings:get_bool("llm_context_send_commands")
-local send_player_pos = core.settings:get_bool("llm_context_send_player_pos")
-local send_materials = core.settings:get_bool("llm_context_send_materials")
-
--- NEW: Function to check and complete the API endpoint
-local function finalize_api_url(url)
- if not url or url == "" then
- return ""
- end
-
- -- 1. Remove trailing slash if present
- local clean_url = url:gsub("/$", "")
-
- -- Check if the URL contains a path component (everything after the host:port)
- -- We assume any '/' after the protocol part (e.g., 'http://') or the host:port
- -- indicates a user-defined path, which should not be overwritten.
- -- Simple check: if the URL contains more than two slashes (e.g. 'http://host')
- -- or if it contains any character after the host:port that is not part of the port number.
-
- -- Attempt to find the end of the host/port part (first '/' after the protocol slashes)
- local protocol_end = clean_url:find("://")
- local host_end = 0
-
- if protocol_end then
- host_end = clean_url:find("/", protocol_end + 3) -- Find the first slash after '://'
- end
-
- -- If no further slash is found (host_end is nil), it means only the base address (host:port) is present.
- if not host_end then
- -- Append the default OpenAI-compatible path
- return clean_url .. "/v1/chat/completions"
- end
-
- -- If a path is found, use the URL as is.
- return url
-end
-
--- Apply the auto-completion/finalization logic
-api_url = finalize_api_url(api_url)
-
-
--- max_tokens type: default integer, override via settings
-local setting_val = core.settings:get_bool("llm_max_tokens_integer")
-local max_tokens_type = "integer"
-if setting_val == false then
- max_tokens_type = "float"
-end
-
--- Storage for conversation history per player
-local history = {}
-local max_history = { ["default"] = 10 }
-local metadata_cache = {} -- Cache for metadata to detect changes
-
--- Helper functions
-local function get_history(name)
- history[name] = history[name] or {}
- return history[name]
-end
-
-local function get_max_history(name)
- return max_history[name] or max_history["default"]
-end
-
-local function string_split(str, delim)
- local res = {}
- local i = 1
- local str_len = #str
- local delim_len = #delim
- while i <= str_len do
- local pos = string.find(str, delim, i, true)
- if pos then
- table.insert(res, string.sub(str, i, pos - 1))
- i = pos + delim_len
- else
- table.insert(res, string.sub(str, i))
- break
- end
- end
- return res
-end
-
--- Load optional context files
-local mod_dir = core.get_modpath("llm_connect")
-local llm_materials_context = nil
-pcall(function()
- llm_materials_context = dofile(mod_dir .. "/llm_materials_context.lua")
-end)
-
-local function read_file_content(filepath)
- local f = io.open(filepath, "r")
- if not f then return nil end
- local content = f:read("*a")
- f:close()
- return content
-end
-
-local system_prompt_content = read_file_content(mod_dir .. "/system_prompt.txt") or ""
-
-- === Privileges ===
-core.register_privilege("llm", { description = "Can chat with the LLM model", give_to_singleplayer=true, give_to_admin=true })
-core.register_privilege("llm_root", { description = "Can configure the LLM API key, model, and endpoint", give_to_singleplayer=true, give_to_admin=true })
+core.register_privilege("llm", {
+ description = "LLM Connect: /llm chat interface (chat mode only)",
+ give_to_singleplayer = true,
+ give_to_admin = true,
+})
--- === Metadata Functions ===
-local meta_data_functions = {}
-local function get_username(player_name) return player_name or "Unknown Player" end
-local function get_installed_mods()
- local mods = {}
- -- NOTE: core.get_mods is undocumented. Using core.get_modnames (documented) instead.
- if core.get_modnames then
- -- core.get_modnames returns a table of mod names, already sorted alphabetically.
- mods = core.get_modnames()
- else
- -- Fallback for extremely old versions
- table.insert(mods,"Mod list not available (core.get_modnames missing)")
- end
- return mods
+core.register_privilege("llm_dev", {
+ description = "LLM Connect: Smart Lua IDE + sandbox code execution (whitelist limited)",
+ give_to_singleplayer = false,
+ give_to_admin = false,
+})
+
+core.register_privilege("llm_worldedit", {
+ description = "LLM Connect: WorldEdit agency (WE Single + WE Loop + material picker)",
+ give_to_singleplayer = false,
+ give_to_admin = false,
+})
+
+core.register_privilege("llm_root", {
+ description = "LLM Connect: Full access (implies llm + llm_dev + llm_worldedit). Config, unrestricted execution, persistent code.",
+ give_to_singleplayer = false,
+ give_to_admin = true,
+})
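+
+-- The tiers are cumulative by convention (llm_root implies the other three
+-- in the mod's permission checks). Privileges are granted with the standard
+-- commands; player names below are illustrative:
+--   /grant somebody llm_dev
+--   /grant somebody llm_worldedit
+--   /revoke somebody llm_dev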
+
+-- === Load central LLM API module ===
+local llm_api_ok, llm_api = pcall(dofile, mod_dir .. "/llm_api.lua")
+if not llm_api_ok or not llm_api then
+ core.log("error", "[llm_connect] Failed to load llm_api.lua: " .. tostring(llm_api))
+ return
+end
+if not llm_api.init(http) then
+ core.log("error", "[llm_connect] Failed to initialize llm_api")
+ return
end
--- Function to collect chat commands
-local function get_installed_commands()
- local commands = {}
- if core.chatcommands then
- for name, cmd in pairs(core.chatcommands) do
- if not name:match("^__builtin:") then
- local desc = cmd.description or "No description"
- table.insert(commands, "/" .. name .. " " .. (cmd.params or "") .. " - " .. desc)
- end
+-- === Load code executor ===
+local executor_ok, executor = pcall(dofile, mod_dir .. "/code_executor.lua")
+if not executor_ok or not executor then
+ core.log("error", "[llm_connect] Failed to load code_executor.lua: " .. tostring(executor))
+ return
+end
+
+-- === Load GUI modules ===
+local chat_gui_ok, chat_gui = pcall(dofile, mod_dir .. "/chat_gui.lua")
+if not chat_gui_ok then
+ core.log("error", "[llm_connect] Failed to load chat_gui.lua: " .. tostring(chat_gui))
+ return
+end
+
+local ide_gui_ok, ide_gui = pcall(dofile, mod_dir .. "/ide_gui.lua")
+if not ide_gui_ok then
+ core.log("error", "[llm_connect] Failed to load ide_gui.lua: " .. tostring(ide_gui))
+ return
+end
+
+local config_gui_ok, config_gui = pcall(dofile, mod_dir .. "/config_gui.lua")
+if not config_gui_ok then
+ core.log("error", "[llm_connect] Failed to load config_gui.lua: " .. tostring(config_gui))
+ return
+end
+
+-- === Load helpers ===
+local chat_context_ok, chat_context = pcall(dofile, mod_dir .. "/chat_context.lua")
+if not chat_context_ok then
+ core.log("warning", "[llm_connect] chat_context.lua not loaded: " .. tostring(chat_context))
+ chat_context = nil
+end
+
+-- === Load WorldEdit agency module (optional dependency) ===
+local we_agency_ok, we_agency = pcall(dofile, mod_dir .. "/llm_worldedit.lua")
+if not we_agency_ok then
+ core.log("warning", "[llm_connect] llm_worldedit.lua failed to load: " .. tostring(we_agency))
+ we_agency = nil
+elseif not we_agency.is_available() then
+ core.log("warning", "[llm_connect] WorldEdit not detected at load time โ agency mode disabled")
+ core.log("warning", "[llm_connect] worldedit global type: " .. type(worldedit))
+ -- NOTE: we_agency is still set as global; is_available() checks at runtime,
+ -- so WE buttons may still appear if worldedit loads later (should not happen with optional_depends)
+end
+
+-- === Load material picker ===
+local picker_ok, material_picker = pcall(dofile, mod_dir .. "/material_picker.lua")
+if not picker_ok then
+ core.log("warning", "[llm_connect] material_picker.lua not loaded: " .. tostring(material_picker))
+ material_picker = nil
+end
+
+-- === Make modules globally available ===
+_G.chat_gui = chat_gui
+_G.llm_api = llm_api
+_G.executor = executor
+_G.we_agency = we_agency
+_G.material_picker = material_picker
+_G.ide_gui = ide_gui
+_G.config_gui = config_gui
+
+-- === Startup code loader ===
+local startup_file = core.get_worldpath() .. "/llm_startup.lua"
+
+local function load_startup_code()
+ local f = io.open(startup_file, "r")
+ if f then
+ f:close()
+ core.log("action", "[llm_connect] Loading startup code from " .. startup_file)
+ local ok, err = pcall(dofile, startup_file)
+ if not ok then
+ core.log("error", "[llm_connect] Startup code error: " .. tostring(err))
+ core.log("error", "[llm_connect] Fix the error in llm_startup.lua and restart the server")
+ else
+ core.log("action", "[llm_connect] Startup code loaded successfully")
end
- table.sort(commands)
else
- table.insert(commands, "Command list not available.")
+ core.log("action", "[llm_connect] No llm_startup.lua found (this is normal on first run)")
end
- return commands
end
-local function get_server_settings()
- local settings = {
- server_name = core.settings:get("server_name") or "Unnamed Server",
- server_description= core.settings:get("server_description") or "No description",
- motd = core.settings:get("motd") or "No MOTD set",
- port = core.settings:get("port") or "Unknown",
- gameid = (core.get_game_info and core.get_game_info().id) or core.settings:get("gameid") or "Unknown",
- game_name = (core.get_game_info and core.get_game_info().name) or "Unknown",
- worldpath = core.get_worldpath() or "Unknown",
- mapgen = core.get_mapgen_setting("mg_name") or "Unknown",
- }
- return settings
-end
-
-function meta_data_functions.gather_context(player_name)
- local context = {}
- context.player = get_username(player_name)
- context.installed_mods = get_installed_mods()
- context.installed_commands = get_installed_commands()
- context.server_settings = get_server_settings()
- -- Add dynamic player data (e.g., position)
- local player = core.get_player_by_name(player_name)
- if player then
- local pos = player:get_pos()
- context.player_position = string.format("x=%.2f, y=%.2f, z=%.2f", pos.x, pos.y, pos.z)
- else
- context.player_position = "Unknown"
- end
- return context
-end
-
--- Compute a simple hash for metadata to detect changes
-local function compute_metadata_hash(context)
- -- Hash calculation now depends on which fields are active to avoid unnecessary cache busts
- local str = context.player
- if send_server_info then str = str .. context.server_settings.server_name .. context.server_settings.worldpath end
- if send_mod_list then str = str .. table.concat(context.installed_mods, ",") end
- if send_commands then str = str .. table.concat(context.installed_commands, ",") end
- if send_player_pos then str = str .. context.player_position end
- -- Material context has its own hash in llm_materials_context.lua, so we don't include it here
- return core.sha1(str)
-end
+load_startup_code()
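+
+-- A minimal llm_startup.lua might contain ordinary runtime code, for example
+-- (illustrative only; registrations work at this point because the file is
+-- loaded during mod init, but NOT when reloaded via /llm_reload_startup):
+--   core.register_chatcommand("llm_hello", {
+--       description = "Test command from startup code",
+--       func = function(name) return true, "Hello, " .. name end,
+--   })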
-- === Chat Commands ===
-core.register_chatcommand("llm_setkey", {
- params = " [url] [model]",
- description = "Sets the API key, URL, and model for the LLM.",
- privs = {llm_root=true},
- func = function(name,param)
- if not core.check_player_privs(name,{llm_root=true}) then return false,"No permission!" end
- local parts = string_split(param," ")
- if #parts==0 then return false,"Please provide API key!" end
- api_key = parts[1]
- if parts[2] then api_url = finalize_api_url(parts[2]) end -- Apply finalization here too
- if parts[3] then model_name = parts[3] end
- core.chat_send_player(name,"[LLM] API key, URL and model set. (URL auto-corrected if only host:port was provided.)")
- return true
- end,
-})
-core.register_chatcommand("llm_setmodel", {
- params = "",
- description = "Sets the LLM model.",
- privs = {llm_root=true},
- func = function(name,param)
- if not core.check_player_privs(name,{llm_root=true}) then return false,"No permission!" end
- if param=="" then return false,"Provide a model name!" end
- model_name = param
- core.chat_send_player(name,"[LLM] Model set to '"..model_name.."'.")
- return true
- end,
-})
-
-core.register_chatcommand("llm_set_endpoint", {
- params = "",
- description = "Sets the API endpoint URL.",
- privs = {llm_root=true},
- func = function(name,param)
- if not core.check_player_privs(name,{llm_root=true}) then return false,"No permission!" end
- if param=="" then return false,"Provide URL!" end
- api_url = finalize_api_url(param) -- Apply finalization here
- core.chat_send_player(name,"[LLM] API endpoint set to "..api_url.." (URL auto-corrected if only host:port was provided.)")
- return true
- end,
-})
-
-core.register_chatcommand("llm_set_context", {
- params = " [player]",
- description = "Sets the max context length.",
- privs = {llm_root=true},
- func = function(name,param)
- if not core.check_player_privs(name,{llm_root=true}) then return false,"No permission!" end
- local parts = string_split(param," ")
- local count = tonumber(parts[1])
- local target_player = parts[2]
- if not count or count<1 then return false,"Provide number > 0!" end
- if target_player and target_player~="" then max_history[target_player]=count
- else max_history["default"]=count end
- core.chat_send_player(name,"[LLM] Context length set.")
- return true
- end,
-})
-
-core.register_chatcommand("llm_float", {
- description = "Set max_tokens as float",
- privs = {llm_root=true},
- func = function(name)
- max_tokens_type="float"
- core.chat_send_player(name,"[LLM] max_tokens now sent as float.")
- return true
- end,
-})
-
-core.register_chatcommand("llm_integer", {
- description = "Set max_tokens as integer",
- privs = {llm_root=true},
- func = function(name)
- max_tokens_type="integer"
- core.chat_send_player(name,"[LLM] max_tokens now sent as integer.")
- return true
- end,
-})
-
-core.register_chatcommand("llm_reset", {
- description = "Resets conversation and context.",
- privs = {llm=true},
- func = function(name)
- history[name] = {}
- metadata_cache[name] = nil -- Reset metadata cache
- core.chat_send_player(name,"[LLM] Conversation and metadata reset.")
- end,
-})
-
--- === Main Chat Command ===
core.register_chatcommand("llm", {
- params = "",
- description = "Sends prompt to the LLM",
- privs = {llm=true},
- func = function(name,param)
- if not core.check_player_privs(name,{llm=true}) then return false,"No permission!" end
- if param=="" then return false,"Provide a prompt!" end
- if api_key=="" or api_url=="" or model_name=="" then
- return false,"[LLM] API key, URL, or Model not set! Check mod settings."
- end
-
- local player_history = get_history(name)
- local max_hist = get_max_history(name)
- -- Add player name to prompt for clarity
- local user_prompt = "Player " .. name .. ": " .. param
- table.insert(player_history,{role="user",content=user_prompt})
- while #player_history>max_hist do table.remove(player_history,1) end
-
- -- Gather and cache metadata
- local context_data = meta_data_functions.gather_context(name)
- local current_metadata_hash = compute_metadata_hash(context_data)
- local needs_metadata_update = not metadata_cache[name] or metadata_cache[name].hash ~= current_metadata_hash
-
- local messages = {}
- -- Build dynamic system prompt with metadata
- local dynamic_system_prompt = system_prompt_content
- if needs_metadata_update then
- local metadata_string = "\n\n--- METADATA ---\n" ..
- "Player: " .. context_data.player .. "\n"
-
- -- Conditional Player Position
- if send_player_pos then
- metadata_string = metadata_string .. "Player Position: " .. context_data.player_position .. "\n"
- end
-
- -- Conditional Server Info
- if send_server_info then
- metadata_string = metadata_string ..
- "Server Name: " .. context_data.server_settings.server_name .. "\n" ..
- "Server Description: " .. context_data.server_settings.server_description .. "\n" ..
- "MOTD: " .. context_data.server_settings.motd .. "\n" ..
- "Game: " .. context_data.server_settings.game_name .. " (" .. context_data.server_settings.gameid .. ")\n" ..
- "Mapgen: " .. context_data.server_settings.mapgen .. "\n" ..
- "World Path: " .. context_data.server_settings.worldpath .. "\n" ..
- "Port: " .. context_data.server_settings.port .. "\n"
- end
-
- -- Conditional Mod List
- if send_mod_list then
- local mods_list_str = table.concat(context_data.installed_mods,", ")
- if #context_data.installed_mods>10 then mods_list_str="(More than 10 installed mods: "..#context_data.installed_mods..")" end
- metadata_string = metadata_string ..
- "Installed Mods (" .. #context_data.installed_mods .. "): " .. mods_list_str .. "\n"
- end
-
- -- Conditional Command List
- if send_commands then
- local commands_list_str = table.concat(context_data.installed_commands, "\n")
- metadata_string = metadata_string ..
- "Available Commands:\n" .. commands_list_str .. "\n"
- end
-
- -- Conditional Materials Context
- if send_materials and llm_materials_context and llm_materials_context.get_available_materials then
- metadata_string = metadata_string ..
- "\n--- AVAILABLE MATERIALS ---\n" .. llm_materials_context.get_available_materials()
- end
-
- dynamic_system_prompt = system_prompt_content .. metadata_string
- metadata_cache[name] = { hash = current_metadata_hash, metadata = metadata_string }
- else
- dynamic_system_prompt = system_prompt_content .. metadata_cache[name].metadata
- end
-
- table.insert(messages,{role="system",content=dynamic_system_prompt})
- for _,msg in ipairs(player_history) do table.insert(messages,msg) end
-
- -- === max_tokens handling with final JSON fix ===
- local max_tokens_value = 2000
- if max_tokens_type == "integer" then
- max_tokens_value = math.floor(max_tokens_value)
- else
- max_tokens_value = tonumber(max_tokens_value)
- end
-
- local body = core.write_json({ model=model_name, messages=messages, max_tokens=max_tokens_value })
-
- -- Force integer in JSON string if needed (important for Go backends)
- if max_tokens_type == "integer" then
- body = body:gsub('"max_tokens"%s*:%s*(%d+)%.0', '"max_tokens": %1')
- end
-
- core.log("action", "[llm_connect DEBUG] max_tokens_type = " .. max_tokens_type)
- core.log("action", "[llm_connect DEBUG] max_tokens_value = " .. tostring(max_tokens_value))
- core.log("action", "[llm_connect DEBUG] API URL used: " .. api_url) -- Log the final URL
-
- -- Send HTTP request
- http.fetch({
- url = api_url,
- post_data = body,
- method = "POST",
- extra_headers = {
- "Content-Type: application/json",
- "Authorization: Bearer " .. api_key
- },
- timeout = 90,
- }, function(result)
- if result.succeeded then
- local response = core.parse_json(result.data)
- local text = "(no answer)"
- if response and response.choices and response.choices[1] and response.choices[1].message then
- text = response.choices[1].message.content
- table.insert(player_history,{role="assistant",content=text})
- elseif response and response.message and response.message.content then
- text = response.message.content
- end
- core.chat_send_player(name,"[LLM] "..text)
- else
- core.chat_send_player(name,"[LLM] Request failed: "..(result.error or "Unknown error"))
- end
- end)
-
- return true,"Request sent..."
+ description = "Opens the LLM chat interface",
+ privs = {llm = true},
+ func = function(name)
+ chat_gui.show(name)
+ return true, "Opening LLM chat..."
end,
})
+
+core.register_chatcommand("llm_msg", {
+ params = "",
+ description = "Send a direct message to the LLM (text-only, no GUI)",
+ privs = {llm = true},
+ func = function(name, param)
+ if not param or param == "" then
+ return false, "Usage: /llm_msg "
+ end
+ local messages = {{role = "user", content = param}}
+ llm_api.request(messages, function(result)
+ if result.success then
+ core.chat_send_player(name, "[LLM] " .. (result.content or "(no response)"))
+ else
+ core.chat_send_player(name, "[LLM] Error: " .. (result.error or "unknown error"))
+ end
+ end, {timeout = llm_api.get_timeout("chat")})
+ return true, "Request sent..."
+ end,
+})
+
+core.register_chatcommand("llm_undo", {
+ description = "Undo the last WorldEdit agency operation",
+ privs = {llm = true},
+ func = function(name)
+ if not _G.we_agency then
+ return false, "WorldEdit agency module not loaded"
+ end
+ local res = _G.we_agency.undo(name)
+ return res.ok, "[LLM] " .. res.message
+ end,
+})
+
+core.register_chatcommand("llm_reload_startup", {
+ description = "Reload llm_startup.lua (WARNING: Cannot register new items!)",
+ privs = {llm_root = true},
+ func = function(name)
+ core.log("action", "[llm_connect] Manual startup reload triggered by " .. name)
+ core.chat_send_player(name, "[LLM] WARNING: Reloading startup code at runtime")
+ core.chat_send_player(name, "[LLM] New registrations will FAIL. Restart server for registrations.")
+ local f = io.open(startup_file, "r")
+ if f then
+ f:close()
+ local ok, err = pcall(dofile, startup_file)
+ if not ok then
+ core.chat_send_player(name, "[LLM] x Reload failed: " .. tostring(err))
+ return false, "Reload failed"
+ else
+ core.chat_send_player(name, "[LLM] Reloaded (restart needed for registrations)")
+ return true, "Code reloaded"
+ end
+ else
+ core.chat_send_player(name, "[LLM] x No llm_startup.lua found")
+ return false, "File not found"
+ end
+ end,
+})
+
+-- === Central formspec handler ===
+core.register_on_player_receive_fields(function(player, formname, fields)
+ if not player then return false end
+ local name = player:get_player_name()
+
+ if formname:match("^llm_connect:chat") or formname:match("^llm_connect:material_picker") then
+ return chat_gui.handle_fields(name, formname, fields)
+ elseif formname:match("^llm_connect:ide") then
+ return ide_gui.handle_fields(name, formname, fields)
+ elseif formname:match("^llm_connect:config") then
+ return config_gui.handle_fields(name, formname, fields)
+ end
+
+ return false
+end)
+
+-- === Logging ===
+core.log("action", "[llm_connect] LLM Connect v0.9.0 loaded")
+if llm_api.is_configured() then
+ core.log("action", "[llm_connect] LLM API ready - model: " .. tostring(llm_api.config.model))
+else
+ core.log("warning", "[llm_connect] LLM API not configured yet - use /llm and open Config button")
+end
diff --git a/license.txt b/license.txt
deleted file mode 100644
index 2689ca6..0000000
--- a/license.txt
+++ /dev/null
@@ -1,168 +0,0 @@
-
-
- GNU LESSER GENERAL PUBLIC LICENSE
- Version 3, 29 June 2007
-
- Copyright (C) 2007 Free Software Foundation, Inc.
- Everyone is permitted to copy and distribute verbatim copies
- of this license document, but changing it is not allowed.
-
-This version of the GNU Lesser General Public License incorporates
-the terms and conditions of version 3 of the GNU General Public
-License, supplemented by the additional permissions listed below.
-
- 0. Additional Definitions.
-
- "This License" refers to version 3 of the GNU Lesser General Public
-License.
-
- "Copyright" also means copyright-like laws that apply to other kinds of
-works, such as semiconductor masks.
-
- "The Library" refers to a covered work governed by this License,
-other than an Application or a Combined Work as defined below.
-
- An "Application" is any work that makes use of an interface provided
-by the Library, but which is not otherwise based on the Library.
-Defining a subclass of a class defined by the Library is deemed a mode
-of using an interface provided by the Library.
-
- A "Combined Work" is a work produced by combining or linking an
-Application with the Library. The particular version of the Library
-with which the Combined Work was made is also called the "Linked
-Version".
-
- The "Minimal Corresponding Source" for a Combined Work means the
-Corresponding Source for the Combined Work, excluding any source code
-for portions of the Combined Work that, considered in isolation, are
-based on the Application, and not on the Linked Version.
-
- The "Corresponding Application Code" for a Combined Work means the
-object code and/or source code for the Application, including any data
-and utility programs needed for reproducing the Combined Work from the
-Application, but excluding the System Libraries of the Combined Work.
-
- 1. Exception to Section 3 of the GNU GPL.
-
- You may convey a covered work under sections 3 and 4 of this License
-without being bound by section 3 of the GNU GPL.
-
- 2. Conveying Modified Versions.
-
- If you modify a copy of the Library, and, in your modifications, a
-facility refers to a function or data to be supplied by an Application
-that uses the facility (other than as an argument passed when the
-facility is invoked), then you may convey a copy of the modified
-version:
-
- a) under this License, provided that you make a good faith effort to
- ensure that, in the event an Application does not supply the
- function or data, the facility still operates, and performs
- whatever part of its purpose remains meaningful, or
-
- b) under the GNU GPL, with none of the additional permissions of
- this License applicable to that copy.
-
- 3. Object Code Incorporating Material from Library Header Files.
-
- The object code form of an Application may incorporate material from
-a header file that is part of the Library. You may convey such object
-code under terms of your choice, provided that, if the incorporated
-material is not limited to numerical parameters, data structure
-layouts and accessors, or small macros, inline functions and templates
-(ten or fewer lines in length), you do both of the following:
-
- a) Give prominent notice with each copy of the object code that the
- Library is used in it and that the Library and its use are
- covered by this License.
-
- b) Accompany the object code with a copy of the GNU GPL and this license
- document.
-
- 4. Combined Works.
-
- You may convey a Combined Work under terms of your choice that,
-taken together, effectively do not restrict modification of the
-portions of the Library contained in the Combined Work and reverse
-engineering for debugging such modifications, if you also do each of
-the following:
-
- a) Give prominent notice with each copy of the Combined Work that
- the Library is used in it and that the Library and its use are
- covered by this License.
-
- b) Accompany the Combined Work with a copy of the GNU GPL and this license
- document.
-
- c) For a Combined Work that displays copyright notices during
- execution, include the copyright notice for the Library among
- these notices, as well as a reference directing the user to the
- copies of the GNU GPL and this license document.
-
- d) Do one of the following:
-
- 0) Convey the Minimal Corresponding Source under the terms of this
- License, and the Corresponding Application Code in a form
- suitable for, and under terms that permit, the user to
- recombine or relink the Application with a modified version of
- the Linked Version to produce a modified Combined Work, in the
- manner specified by section 6 of the GNU GPL for conveying
- Corresponding Source.
-
- 1) Use a suitable shared library mechanism for linking with the
- Library. A suitable mechanism is one that (a) uses at run time
- a copy of the Library already present on the user's computer
- system, and (b) will operate properly with a modified version
- of the Library that is interface-compatible with the Linked
- Version.
-
- e) Provide Installation Information, but only if you would otherwise
- be required to provide such information under section 6 of the GNU
- GPL, and only to the extent that such information is
- necessary to install and execute a modified version of the
- Combined Work produced by recombining or relinking the
- Application with a modified version of the Linked Version. (If
- you use option 4d0, the Installation Information must accompany
- the Minimal Corresponding Source and Corresponding Application
- Code. If you use option 4d1, you must provide the Installation
- Information in the manner specified by section 6 of the GNU GPL
- for conveying Corresponding Source.)
-
- 5. Combined Libraries.
-
- You may place library facilities that are a work based on the
-Library side by side in a single library together with other library
-facilities that are not Applications and are not covered by this
-License, and convey such a combined library under terms of your
-choice, if you do both of the following:
-
- a) Accompany the combined library with a copy of the same work based
- on the Library, uncombined with any other library facilities,
- conveyed under the terms of this License.
-
- b) Give prominent notice with the combined library that part of it
- is a work based on the Library, and explaining where to find the
- accompanying uncombined form of the same work.
-
- 6. Revised Versions of the GNU Lesser General Public License.
-
- The Free Software Foundation may publish revised and/or new versions
-of the GNU Lesser General Public License from time to time. Such new
-versions will be similar in spirit to the present version, but may
-differ in detail to address new problems or concerns.
-
- Each version is given a distinguishing version number. If the
-Library as you received it specifies that a certain numbered version
-of the GNU Lesser General Public License "or any later version"
-applies to it, you have the option of following the terms and
-conditions either of that published version or of any later version
-published by the Free Software Foundation. If the Library as you
-received it does not specify a version number of the GNU Lesser
-General Public License, you may choose any version of the GNU Lesser
-General Public License ever published by the Free Software Foundation.
-
- If the Library as you received it specifies that a proxy can decide
-whether future versions of the GNU Lesser General Public License shall
-apply, that proxy's public statement of acceptance of any version is
-permanent authorization for you to choose that version for the
-Library.
diff --git a/llm_api.lua b/llm_api.lua
new file mode 100644
index 0000000..b20ac0e
--- /dev/null
+++ b/llm_api.lua
@@ -0,0 +1,314 @@
+-- llm_api.lua
+-- Central LLM API interface for LLM-Connect (v0.8+)
+
+local core = core
+local M = {}
+
+-- Internal states
+M.http = nil
+M.config = {
+ api_key = "",
+ api_url = "",
+ model = "",
+ max_tokens = 4000,
+ max_tokens_integer = true,
+ temperature = 0.7,
+ top_p = 0.9,
+ presence_penalty = 0.0,
+ frequency_penalty = 0.0,
+ timeout = 120, -- global fallback
+ timeout_chat = 0, -- 0 = use global
+ timeout_ide = 0,
+ timeout_we = 0,
+ language = "en",
+ language_repeat = 1,
+ -- context
+ context_max_history = 20,
+ -- ide
+ ide_naming_guide = true,
+ ide_include_run_output = true,
+ ide_context_mod_list = true,
+ ide_context_node_sample = true,
+ ide_max_code_context = 300,
+ -- worldedit
+ we_max_iterations = 6,
+ we_snapshot = true,
+}
+
+local language_instruction_cache = nil
+
+-- ============================================================
+-- Initialization
+-- ============================================================
+
+function M.init(http_api)
+ if not http_api then
+ core.log("error", "[llm_api] No HTTP API provided")
+ return false
+ end
+ M.http = http_api
+
+ -- Load settings once
+ M.reload_config()
+ return true
+end
+
+-- ============================================================
+-- Configuration loading / updating
+-- ============================================================
+
+function M.reload_config()
+ -- Read exact keys from settingtypes.txt
+ M.config.api_key = core.settings:get("llm_api_key") or ""
+ M.config.api_url = core.settings:get("llm_api_url") or ""
+ M.config.model = core.settings:get("llm_model") or ""
+
+ M.config.max_tokens = tonumber(core.settings:get("llm_max_tokens")) or 4000
+ M.config.max_tokens_integer = core.settings:get_bool("llm_max_tokens_integer", true)
+
+ M.config.temperature = tonumber(core.settings:get("llm_temperature")) or 0.7
+ M.config.top_p = tonumber(core.settings:get("llm_top_p")) or 0.9
+ M.config.presence_penalty = tonumber(core.settings:get("llm_presence_penalty")) or 0.0
+ M.config.frequency_penalty = tonumber(core.settings:get("llm_frequency_penalty")) or 0.0
+
+ M.config.timeout = tonumber(core.settings:get("llm_timeout")) or 120
+ M.config.timeout_chat = tonumber(core.settings:get("llm_timeout_chat")) or 0
+ M.config.timeout_ide = tonumber(core.settings:get("llm_timeout_ide")) or 0
+ M.config.timeout_we = tonumber(core.settings:get("llm_timeout_we")) or 0
+
+ M.config.language = core.settings:get("llm_language") or "en"
+ M.config.language_repeat = tonumber(core.settings:get("llm_language_instruction_repeat")) or 1
+
+ M.config.context_max_history = tonumber(core.settings:get("llm_context_max_history")) or 20
+
+ M.config.ide_naming_guide = core.settings:get_bool("llm_ide_naming_guide", true)
+ M.config.ide_include_run_output = core.settings:get_bool("llm_ide_include_run_output", true)
+ M.config.ide_context_mod_list = core.settings:get_bool("llm_ide_context_mod_list", true)
+ M.config.ide_context_node_sample = core.settings:get_bool("llm_ide_context_node_sample", true)
+ M.config.ide_max_code_context = tonumber(core.settings:get("llm_ide_max_code_context")) or 300
+
+ M.config.we_max_iterations = tonumber(core.settings:get("llm_we_max_iterations")) or 6
+ M.config.we_snapshot = core.settings:get_bool("llm_we_snapshot_before_exec", true)
+
+ -- Invalidate cache
+ language_instruction_cache = nil
+
+end
+
+-- Returns the effective timeout for a given mode ("chat", "ide", "we").
+-- Uses per-mode override if > 0, otherwise falls back to global llm_timeout.
+function M.get_timeout(mode)
+ local override = 0
+ if mode == "chat" then override = M.config.timeout_chat
+ elseif mode == "ide" then override = M.config.timeout_ide
+ elseif mode == "we" then override = M.config.timeout_we
+ end
+ if override and override > 0 then return override end
+ return M.config.timeout
+end
+
+function M.set_config(updates)
+ for k, v in pairs(updates) do
+ if M.config[k] ~= nil then
+ M.config[k] = v
+ end
+ end
+ language_instruction_cache = nil
+end
+
+function M.is_configured()
+ return M.config.api_key ~= "" and
+ M.config.api_url ~= "" and
+ M.config.model ~= ""
+end
+
+-- ============================================================
+-- Language instruction (cached)
+-- ============================================================
+
+local function get_language_instruction()
+ if language_instruction_cache then
+ return language_instruction_cache
+ end
+
+ local lang = M.config.language
+ local repeat_count = math.max(0, M.config.language_repeat or 1)
+
+ if lang == "en" or repeat_count == 0 then
+ language_instruction_cache = ""
+ return ""
+ end
+
+ local lang_name = "English"
+ local lang_mod_path = core.get_modpath("llm_connect") .. "/ide_languages.lua"
+ local ok, lang_mod = pcall(dofile, lang_mod_path)
+ if ok and lang_mod and lang_mod.get_language_name then
+ lang_name = lang_mod.get_language_name(lang) or lang_name
+ end
+
+ local instr = "Important: Answer exclusively in " .. lang_name .. "!\n" ..
+ "All explanations, code, comments, output and any text you generate must be in " .. lang_name .. "."
+
+ local parts = {}
+ for _ = 1, repeat_count do
+ table.insert(parts, instr)
+ end
+
+ language_instruction_cache = table.concat(parts, "\n\n") .. "\n\n"
+ return language_instruction_cache
+end
+
+-- ============================================================
+-- Request Function
+-- ============================================================
+
+function M.request(messages, callback, options)
+ if not M.is_configured() then
+ callback({ success = false, error = "LLM API not configured (Check API Key/URL/Model)" })
+ return
+ end
+
+ options = options or {}
+ local cfg = M.config
+
+ local lang_instr = get_language_instruction()
+ if lang_instr ~= "" and (not messages[1] or messages[1].role ~= "system") then
+ table.insert(messages, 1, { role = "system", content = lang_instr })
+ end
+
+ local body_table = {
+ model = options.model or cfg.model,
+ messages = messages,
+ max_tokens = options.max_tokens or cfg.max_tokens,
+ temperature = options.temperature or cfg.temperature,
+ top_p = options.top_p or cfg.top_p,
+ presence_penalty = options.presence_penalty or cfg.presence_penalty,
+ frequency_penalty = options.frequency_penalty or cfg.frequency_penalty,
+ stream = options.stream == true,
+ }
+
+ if options.tools then
+ body_table.tools = options.tools
+ body_table.tool_choice = options.tool_choice or "auto"
+ end
+
+ local max_t = body_table.max_tokens
+ if cfg.max_tokens_integer then
+ body_table.max_tokens = math.floor(max_t)
+ else
+ body_table.max_tokens = tonumber(max_t)
+ end
+
+ local body = core.write_json(body_table)
+
+ if cfg.max_tokens_integer then
+ body = body:gsub('"max_tokens"%s*:%s*(%d+)%.0', '"max_tokens": %1')
+ end
+
+ if core.settings:get_bool("llm_debug") then
+ core.log("action", "[llm_api] Requesting " .. cfg.model .. " at " .. cfg.api_url)
+ end
+
+ M.http.fetch({
+ url = cfg.api_url,
+ method = "POST",
+ data = body,
+ timeout = options.timeout or cfg.timeout,
+ extra_headers = {
+ "Content-Type: application/json",
+ "Authorization: Bearer " .. cfg.api_key,
+ },
+ }, function(result)
+ if not result.succeeded then
+ local err = "HTTP request failed"
+ if result.timeout then
+ err = "Request timed out (limit: " .. tostring(options.timeout or cfg.timeout) .. "s)"
+ elseif result.code then
+ err = "HTTP " .. tostring(result.code)
+ elseif result.error then
+ -- Proxy-level errors (Envoy overflow, connection reset, etc.)
+ local raw = tostring(result.error)
+ if raw:find("overflow") or raw:find("reset") or raw:find("upstream") then
+ err = "Proxy/upstream error (possibly Mistral overload or rate limit). Retry in a moment."
+ else
+ err = raw
+ end
+ end
+ callback({ success = false, error = err, code = result.code })
+ return
+ end
+
+ -- Handle non-JSON responses (proxy errors often return plain text)
+ local raw_data = tostring(result.data or "")
+ if raw_data:find("upstream connect error") or raw_data:find("reset reason") then
+            callback({ success = false, error = "Proxy/upstream error: " .. raw_data:sub(1, 80) .. " - possibly Mistral overload, retry in a moment." })
+ return
+ end
+
+ local response = core.parse_json(result.data)
+ if not response or type(response) ~= "table" then
+ local raw_preview = raw_data:sub(1, 120)
+ callback({ success = false, error = "Invalid JSON response: " .. raw_preview })
+ return
+ end
+
+ if response.error then
+ callback({
+ success = false,
+ error = response.error.message or "API error",
+ error_type = response.error.type,
+ code = response.error.code
+ })
+ return
+ end
+
+ local content = nil
+        if response.choices and response.choices[1] and response.choices[1].message then
+            content = response.choices[1].message.content
+ elseif response.message and response.message.content then
+ content = response.message.content
+ end
+
+ local ret = {
+ success = content ~= nil,
+ content = content,
+ raw = response,
+ finish_reason = response.choices and response.choices[1] and response.choices[1].finish_reason,
+ usage = response.usage,
+ }
+
+        local msg = response.choices and response.choices[1] and response.choices[1].message
+        if msg and msg.tool_calls then
+            ret.tool_calls = msg.tool_calls
+        end
+
+ if core.settings:get_bool("llm_debug") then
+ core.log("action", "[llm_api DEBUG] Raw response: " .. tostring(result.data or "no data"))
+ core.log("action", "[llm_api DEBUG] Parsed: " .. core.write_json(response or {}, true))
+ end
+
+ callback(ret)
+ end)
+end
+
+-- ============================================================
+-- Helper Wrappers
+-- ============================================================
+
+function M.chat(messages, callback, options)
+ M.request(messages, callback, options)
+end
+
+function M.ask(system_prompt, user_message, callback, options)
+ local messages = {
+ { role = "system", content = system_prompt },
+ { role = "user", content = user_message },
+ }
+ M.request(messages, callback, options)
+end
+
+function M.code(system_prompt, code_block, callback, options)
+ local user_msg = "```lua\n" .. code_block .. "\n```"
+ M.ask(system_prompt, user_msg, callback, options)
+end
+
+return M
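
> **Editor's note on `M.get_timeout`:** the fallback rule above (a per-mode override of `0` means "use the global `llm_timeout`") is easy to verify in isolation. The sketch below uses hypothetical config values and a table lookup as a compact equivalent of the module's if/elseif chain:

```lua
-- Minimal sketch of M.get_timeout's fallback rule with made-up values.
local config = { timeout = 120, timeout_chat = 0, timeout_ide = 300, timeout_we = 0 }

local function get_timeout(mode)
    -- A per-mode override > 0 wins; 0 (or a missing key) falls back to global.
    local override = config["timeout_" .. mode] or 0
    if override > 0 then return override end
    return config.timeout
end

print(get_timeout("chat")) -- 120: chat override is 0, global applies
print(get_timeout("ide"))  -- 300: explicit per-mode override
```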
diff --git a/llm_materials_context.lua b/llm_materials_context.lua
deleted file mode 100644
index 23cfdf7..0000000
--- a/llm_materials_context.lua
+++ /dev/null
@@ -1,79 +0,0 @@
--- mods/llm_connect/llm_materials_context.lua
-
-local M = {}
-
--- Cache for materials to avoid recomputation
-local materials_cache = nil
-local materials_cache_hash = nil
-
--- Compute a hash for registered items to detect changes
-local function compute_materials_hash()
- local str = ""
- for name, _ in pairs(core.registered_nodes) do str = str .. name end
- for name, _ in pairs(core.registered_craftitems) do str = str .. name end
- for name, _ in pairs(core.registered_tools) do str = str .. name end
- for name, _ in pairs(core.registered_entities) do str = str .. name end
- return core.sha1(str)
-end
-
--- Function to collect available materials
-function M.get_available_materials()
- local current_hash = compute_materials_hash()
- if materials_cache and materials_cache_hash == current_hash then
- return materials_cache
- end
-
- local materials_info = {}
- local current_mod_name = core.get_current_modname()
-
- -- Collect nodes
- for name, def in pairs(core.registered_nodes) do
- if not name:match("^__builtin:") and not name:match("^ignore$") and not name:match("^air$") then
- table.insert(materials_info, "Node: " .. name)
- end
- end
-
- -- Collect craftitems
- for name, def in pairs(core.registered_craftitems) do
- if not name:match("^__builtin:") then
- table.insert(materials_info, "Craftitem: " .. name)
- end
- end
-
- -- Collect tools
- for name, def in pairs(core.registered_tools) do
- if not name:match("^__builtin:") then
- table.insert(materials_info, "Tool: " .. name)
- end
- end
-
- -- Collect entities
- for name, def in pairs(core.registered_entities) do
- if not name:match("^__builtin:") then
- table.insert(materials_info, "Entity: " .. name)
- end
- end
-
- -- Limit the output
- local max_items_to_list = 50 -- Reduced for token efficiency
- local total_items = #materials_info
- local output_string = ""
-
- if total_items > 0 then
- output_string = "Registered materials (" .. total_items .. " in total):\n"
- for i = 1, math.min(total_items, max_items_to_list) do
- output_string = output_string .. " - " .. materials_info[i] .. "\n"
- end
- if total_items > max_items_to_list then
- output_string = output_string .. " ... and " .. (total_items - max_items_to_list) .. " more materials (truncated).\n"
- end
- else
- output_string = "No registered materials found.\n"
- end
-
- materials_cache = output_string
- materials_cache_hash = current_hash
- return output_string
-end
-
-return M
diff --git a/llm_worldedit.lua b/llm_worldedit.lua
new file mode 100644
index 0000000..f067e99
--- /dev/null
+++ b/llm_worldedit.lua
@@ -0,0 +1,1315 @@
+-- llm_worldedit.lua
+-- LLM-Connect Agency Module: WorldEdit Bridge
+-- v0.4.0 - Fixes: make_pos handles nested/string pos, resolve_node auto-corrects liquids,
+-- loop prompt enforces upfront planning, cumulative step history passed to LLM
+--
+-- ARCHITECTURE:
+-- Direct Lua wrapper around worldedit.* API (not chat command parsing).
+-- The LLM receives a compact context and responds with a JSON tool_calls array.
+-- Each tool call is validated and dispatched to the corresponding worldedit function.
+--
+-- SAFETY LIMITS (enforced per-call):
+-- - Primitives: max radius 64, max dimension 128
+-- - All region ops require explicit pos1+pos2
+-- - Chain aborts on any hard error
+-- - Snapshot taken before every chain โ undo available
+--
+-- ROADMAP:
+-- Phase 1 ✅  Skeleton, context builder, pos1/pos2, WE toggle
+-- Phase 2 ✅  All dispatchers: set/replace/copy/move/stack/flip/rotate + primitives
+-- Phase 3 ✅  Snapshot before execution, M.undo(name) for rollback
+-- Phase 4a - Iterative feedback loop: LLM sees step results, re-plans
+-- Phase 4b - Macro mode: LLM builds complex structures over N iterations
+--            with abort condition ("done" signal) and token budget guard
+-- Phase 5  - WorldEditAdditions support (torus, erode, maze, ...)
+
+local core = core
+local M = {}
+
+-- Load system prompts from external file
+local prompts_path = core.get_modpath("llm_connect") .. "/worldedit_system_prompts.lua"
+local prompts_ok, WE_PROMPTS = pcall(dofile, prompts_path)
+if not prompts_ok then
+ core.log("error", "[llm_worldedit] Failed to load worldedit_system_prompts.lua: " .. tostring(WE_PROMPTS))
+ WE_PROMPTS = nil
+end
+
+-- ============================================================
+-- Availability check
+
+local function wea_enabled()
+ if not core.settings:get_bool("llm_worldedit_additions", true) then return false end
+ return type(worldeditadditions) == "table"
+ and type(worldeditadditions.torus) == "function"
+end
+
+-- ============================================================
+
+function M.is_available()
+ if type(worldedit) ~= "table" then return false end
+    -- Extended check - different WorldEdit versions expose different function names
+ return type(worldedit.set) == "function"
+ or type(worldedit.set_node) == "function"
+ or type(worldedit.manip_helpers) == "table"
+        or next(worldedit) ~= nil -- fallback: table is not empty
+end
+
+-- ============================================================
+-- Snapshot / Undo (Phase 3)
+-- Serializes the bounding box of pos1..pos2 before any chain
+-- runs. M.undo(name) restores it. One snapshot per player
+-- (last-write-wins, sufficient for agency use).
+-- ============================================================
+
+local snapshots = {} -- [player_name] = {p1, p2, data}
+
+local function take_snapshot(name)
+ if not M.is_available() then return false end
+ local p1 = worldedit.pos1 and worldedit.pos1[name]
+ local p2 = worldedit.pos2 and worldedit.pos2[name]
+ if not p1 or not p2 then return false end -- nothing to snapshot yet
+
+ -- worldedit.serialize returns a string representation of the region
+ local ok, data = pcall(worldedit.serialize, p1, p2)
+ if not ok or not data then return false end
+
+ snapshots[name] = {
+ p1 = {x=p1.x, y=p1.y, z=p1.z},
+ p2 = {x=p2.x, y=p2.y, z=p2.z},
+ data = data,
+ }
+ return true
+end
+
+-- Public: restore last snapshot for player. Returns {ok, message}.
+function M.undo(name)
+ if not M.is_available() then
+ return {ok=false, message="WorldEdit not available"}
+ end
+ local snap = snapshots[name]
+ if not snap then
+ return {ok=false, message="No snapshot available for " .. name}
+ end
+ local ok, count = pcall(worldedit.deserialize, snap.p1, snap.data)
+ if ok then
+ -- Restore selection markers
+ worldedit.pos1[name] = snap.p1
+ worldedit.pos2[name] = snap.p2
+ if worldedit.mark_pos1 then worldedit.mark_pos1(name) end
+ if worldedit.mark_pos2 then worldedit.mark_pos2(name) end
+ snapshots[name] = nil
+ return {ok=true, message=string.format("Restored %d nodes from snapshot", count or 0)}
+ else
+ return {ok=false, message="Deserialize failed: " .. tostring(count)}
+ end
+end
+
+-- Cleanup snapshots on leave
+core.register_on_leaveplayer(function(player)
+ snapshots[player:get_player_name()] = nil
+end)
+
+-- ============================================================
+-- Context builder
+-- Returns a compact but rich string that the LLM gets instead
+-- of the generic game context when WorldEdit mode is active.
+-- Goal: minimal tokens, maximum actionability.
+-- ============================================================
+
+-- Samples a coarse grid of node names around a position.
+-- Returns a compact summary string to give the LLM spatial awareness.
+-- Defaults (radius=6, step=3) check 5 offsets per axis -> 5^3 = 125 sampled positions.
+local function sample_environment(pos, radius, step)
+ radius = radius or 6
+ step = step or 3
+ local counts = {}
+ local total = 0
+
+ for dx = -radius, radius, step do
+ for dy = -radius, radius, step do
+ for dz = -radius, radius, step do
+ local node = core.get_node({
+ x = math.floor(pos.x + dx),
+ y = math.floor(pos.y + dy),
+ z = math.floor(pos.z + dz),
+ })
+ if node and node.name ~= "air" and node.name ~= "ignore" then
+ counts[node.name] = (counts[node.name] or 0) + 1
+ total = total + 1
+ end
+ end
+ end
+ end
+
+ if total == 0 then return "Surroundings: mostly air" end
+
+ -- Sort by frequency, take top 8
+ local sorted = {}
+ for name, count in pairs(counts) do
+ table.insert(sorted, {name=name, count=count})
+ end
+ table.sort(sorted, function(a,b) return a.count > b.count end)
+
+ local parts = {}
+ for i = 1, math.min(8, #sorted) do
+        parts[i] = sorted[i].name .. "×" .. sorted[i].count
+ end
+ return "Nearby nodes (sampled r=" .. radius .. "): " .. table.concat(parts, ", ")
+end
+
+function M.get_context(player_name)
+ if not M.is_available() then
+ return "WorldEdit: NOT LOADED"
+ end
+
+ local player = core.get_player_by_name(player_name)
+ if not player then return "Player not found" end
+
+ local pos = player:get_pos()
+ local px, py, pz = math.floor(pos.x), math.floor(pos.y), math.floor(pos.z)
+
+ -- Current WE selection for this player
+ local p1 = worldedit.pos1 and worldedit.pos1[player_name]
+ local p2 = worldedit.pos2 and worldedit.pos2[player_name]
+ local sel_str
+ if p1 and p2 then
+ sel_str = string.format("pos1=(%d,%d,%d) pos2=(%d,%d,%d) vol=%d",
+ math.floor(p1.x), math.floor(p1.y), math.floor(p1.z),
+ math.floor(p2.x), math.floor(p2.y), math.floor(p2.z),
+ worldedit.volume(p1, p2))
+ elseif p1 then
+ sel_str = string.format("pos1=(%d,%d,%d) pos2=not set",
+ math.floor(p1.x), math.floor(p1.y), math.floor(p1.z))
+ else
+ sel_str = "no selection"
+ end
+
+ -- Coarse environment scan
+ local env = sample_environment(pos)
+
+ -- WorldEdit API capability list (static, no token bloat)
+ -- This tells the LLM WHAT it can call. Details are in the tool schema.
+ local capabilities = table.concat({
+ "set_region(pos1,pos2,node)",
+ "replace(pos1,pos2,search,replace)",
+ "copy(pos1,pos2,axis,amount)",
+ "move(pos1,pos2,axis,amount)",
+ "stack(pos1,pos2,axis,count)",
+ "flip(pos1,pos2,axis)",
+ "rotate(pos1,pos2,angle)",
+ "sphere(pos,radius,node,hollow)",
+ "dome(pos,radius,node,hollow)",
+ "cylinder(pos,axis,length,radius1,radius2,node,hollow)",
+ "pyramid(pos,axis,height,node,hollow)",
+ "cube(pos,width,height,length,node,hollow)",
+ "set_pos1(pos)",
+ "set_pos2(pos)",
+ "get_selection()",
+ "clear_region(pos1,pos2)",
+ }, " | ")
+
+ local lines = {
+ "=== WorldEdit Agency Mode ===",
+ string.format("Player: %s Pos: (%d,%d,%d)", player_name, px, py, pz),
+ "Selection: " .. sel_str,
+ env,
+ "",
+ "Available tools: " .. capabilities,
+ "",
+ "RULES:",
+ "1. Always think about coordinates relative to the player position.",
+ "2. Set pos1 and pos2 BEFORE calling region operations.",
+ "3. Use 'air' as node name to clear/delete nodes.",
+ "4. Prefer relative offsets from player pos for natural language requests.",
+ "5. Return a JSON tool_calls array. Each call: {tool, args}.",
+ "6. If the request is ambiguous, ask for clarification instead of guessing.",
+ "=== END CONTEXT ===",
+ }
+ return table.concat(lines, "\n")
+end
+
+-- System prompts for agency mode are built by the WE_PROMPTS helper
+-- (see M.request and M.run_loop below); this file only assembles the
+-- dynamic context that is appended to them.
+
+-- ============================================================
+-- Tool schema (for llm_api tool_calls / function calling)
+-- Documents the tools the LLM may call; validation and dispatch
+-- happen in the DISPATCHERS table further down.
+-- ============================================================
+
+M.TOOL_SCHEMA = {
+ -- Region selection
+ {
+ name = "set_pos1",
+ description = "Set WorldEdit position 1 to absolute world coordinates.",
+ parameters = {x="integer", y="integer", z="integer"}
+ },
+ {
+ name = "set_pos2",
+ description = "Set WorldEdit position 2 to absolute world coordinates.",
+ parameters = {x="integer", y="integer", z="integer"}
+ },
+ {
+ name = "get_selection",
+ description = "Return current pos1, pos2 and volume for the player.",
+ parameters = {}
+ },
+ -- Region manipulation
+ {
+ name = "set_region",
+ description = "Fill the current selection with node. Use 'air' to clear.",
+ parameters = {node="string"}
+ },
+ {
+ name = "clear_region",
+ description = "Fill the current selection with air (delete nodes).",
+ parameters = {}
+ },
+ {
+ name = "replace",
+ description = "Replace all instances of search_node with replace_node in selection.",
+ parameters = {search_node="string", replace_node="string"}
+ },
+ {
+ name = "copy",
+ description = "Copy the selection along an axis by amount nodes.",
+ parameters = {axis="string (x|y|z)", amount="integer"}
+ },
+ {
+ name = "move",
+ description = "Move the selection along an axis by amount nodes.",
+ parameters = {axis="string (x|y|z)", amount="integer"}
+ },
+ {
+ name = "stack",
+ description = "Stack (duplicate) the selection along an axis count times.",
+ parameters = {axis="string (x|y|z)", count="integer"}
+ },
+ {
+ name = "flip",
+ description = "Flip the selection along the given axis.",
+ parameters = {axis="string (x|y|z)"}
+ },
+ {
+ name = "rotate",
+ description = "Rotate the selection by angle degrees (90/180/270) around Y axis.",
+ parameters = {angle="integer (90|180|270)"}
+ },
+ -- Primitives (absolute pos, no selection needed)
+ {
+ name = "sphere",
+ description = "Generate a sphere at pos with given radius and node.",
+ parameters = {x="integer", y="integer", z="integer",
+ radius="integer", node="string", hollow="boolean (optional)"}
+ },
+ {
+ name = "dome",
+ description = "Generate a dome (half-sphere) at pos.",
+ parameters = {x="integer", y="integer", z="integer",
+ radius="integer", node="string", hollow="boolean (optional)"}
+ },
+ {
+ name = "cylinder",
+ description = "Generate a cylinder at pos along axis.",
+ parameters = {x="integer", y="integer", z="integer",
+ axis="string (x|y|z)", length="integer",
+ radius="integer", node="string", hollow="boolean (optional)"}
+ },
+ {
+ name = "pyramid",
+ description = "Generate a pyramid at pos along axis.",
+ parameters = {x="integer", y="integer", z="integer",
+ axis="string (x|y|z)", height="integer",
+ node="string", hollow="boolean (optional)"}
+ },
+ {
+ name = "cube",
+ description = "Generate a cube/box centered at pos.",
+ parameters = {x="integer", y="integer", z="integer",
+ width="integer", height="integer", length="integer",
+ node="string", hollow="boolean (optional)"}
+ },
+}
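+
+-- Example: a tool_calls array the LLM might emit against this schema
+-- (coordinates and node names are illustrative, not from a real run):
+-- [
+--   {"tool": "set_pos1", "args": {"x": 10, "y": 8, "z": -3}},
+--   {"tool": "set_pos2", "args": {"x": 20, "y": 12, "z": 7}},
+--   {"tool": "set_region", "args": {"node": "default:stonebrick"}}
+-- ]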
+
+-- ============================================================
+-- Tool dispatcher โ PHASE 2 STUB
+-- Currently: validates structure, logs, returns not_implemented.
+-- Phase 2: actually calls worldedit.* and reports results.
+-- ============================================================
+
+-- Resolves pos from args: handles both flat {x,y,z} and nested {pos={x,y,z}}.
+-- Also falls back to player's pos1 if args.pos == "pos1" etc.
+local function make_pos(args, player_name)
+ -- Nested: {"pos": {"x":1,"y":2,"z":3}}
+ if type(args.pos) == "table" then
+ return {
+ x = tonumber(args.pos.x) or 0,
+ y = tonumber(args.pos.y) or 0,
+ z = tonumber(args.pos.z) or 0,
+ }
+ end
+ -- String alias: "pos1" or "pos2" โ use current selection
+ if type(args.pos) == "string" then
+ if player_name then
+ if args.pos == "pos1" and worldedit.pos1[player_name] then
+ return worldedit.pos1[player_name]
+ end
+ if args.pos == "pos2" and worldedit.pos2[player_name] then
+ return worldedit.pos2[player_name]
+ end
+ end
+ end
+ -- Flat: {x=1, y=2, z=3}
+ return {
+ x = tonumber(args.x) or 0,
+ y = tonumber(args.y) or 0,
+ z = tonumber(args.z) or 0,
+ }
+end
+
+-- Validates a node name: returns the corrected name, or nil plus an error message.
+-- Tries common suffixes (_source, _flowing) if the base name is unknown.
+local function resolve_node(node_str)
+ if not node_str or node_str == "" then
+ return nil, "node name is empty"
+ end
+ if node_str == "air" then return "air", nil end
+ if core.registered_nodes[node_str] then return node_str, nil end
+ -- Try _source suffix (liquids)
+ local with_source = node_str .. "_source"
+ if core.registered_nodes[with_source] then
+ return with_source, nil
+ end
+ -- Try stripping _flowing
+ local base = node_str:gsub("_flowing$", "_source")
+ if core.registered_nodes[base] then
+ return base, nil
+ end
+ -- Unknown but passed through; worldedit will handle it
+ core.log("warning", ("[llm_worldedit] unknown node '%s', passing through"):format(node_str))
+ return node_str, nil
+end
+
+-- Map tool names to executor functions
+-- Each returns {ok=bool, message=string, nodes_affected=int|nil}
+local DISPATCHERS = {
+
+ set_pos1 = function(name, args)
+ local pos = make_pos(args)
+ worldedit.pos1[name] = pos
+ if worldedit.mark_pos1 then worldedit.mark_pos1(name) end
+ return {ok=true, message=string.format("pos1 set to (%d,%d,%d)", pos.x, pos.y, pos.z)}
+ end,
+
+ set_pos2 = function(name, args)
+ local pos = make_pos(args)
+ worldedit.pos2[name] = pos
+ if worldedit.mark_pos2 then worldedit.mark_pos2(name) end
+ return {ok=true, message=string.format("pos2 set to (%d,%d,%d)", pos.x, pos.y, pos.z)}
+ end,
+
+ get_selection = function(name, _args)
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then
+ return {ok=true, message="No selection set."}
+ end
+ return {ok=true, message=string.format(
+ "pos1=(%d,%d,%d) pos2=(%d,%d,%d) vol=%d",
+ p1.x,p1.y,p1.z, p2.x,p2.y,p2.z,
+ worldedit.volume(p1,p2)
+ )}
+ end,
+
+ -- ============================================================
+ -- Region operations (require pos1 + pos2 to be set first)
+ -- ============================================================
+
+ set_region = function(name, args)
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then
+ return {ok=false, message="No selection: use set_pos1/set_pos2 first"}
+ end
+ local node, err = resolve_node(tostring(args.node or "air"))
+ if not node then return {ok=false, message="set_region: " .. err} end
+ local count = worldedit.set(p1, p2, node)
+ return {ok=true, message=string.format("Set %d nodes to %s", count or 0, node), nodes=count}
+ end,
+
+ clear_region = function(name, args)
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then
+ return {ok=false, message="No selection: use set_pos1/set_pos2 first"}
+ end
+ local count = worldedit.set(p1, p2, "air")
+ return {ok=true, message=string.format("Cleared %d nodes", count or 0), nodes=count}
+ end,
+
+ replace = function(name, args)
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then
+ return {ok=false, message="No selection: use set_pos1/set_pos2 first"}
+ end
+ local search, e1 = resolve_node(tostring(args.search_node or ""))
+ local replace_, e2 = resolve_node(tostring(args.replace_node or "air"))
+ if not search then return {ok=false, message="replace: search_node: " .. (e1 or "?")} end
+ if not replace_ then return {ok=false, message="replace: replace_node: " .. (e2 or "?")} end
+ local count = worldedit.replace(p1, p2, search, replace_)
+ return {ok=true,
+ message=string.format("Replaced %d nodes: %s → %s", count or 0, search, replace_),
+ nodes=count}
+ end,
+
+ -- ── Transforms (operate on current selection, move markers too) ──────────
+
+ copy = function(name, args)
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then
+ return {ok=false, message="No selection: use set_pos1/set_pos2 first"}
+ end
+ local axis = tostring(args.axis or "y")
+ local amount = tonumber(args.amount) or 1 -- fall back to 1 if the LLM sent a non-numeric value
+ if not ({x=1,y=1,z=1})[axis] then
+ return {ok=false, message="copy: axis must be x, y or z"}
+ end
+ local count = worldedit.copy(p1, p2, axis, amount)
+ return {ok=true,
+ message=string.format("Copied %d nodes along %s by %d", count or 0, axis, amount),
+ nodes=count}
+ end,
+
+ move = function(name, args)
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then
+ return {ok=false, message="No selection: use set_pos1/set_pos2 first"}
+ end
+ local axis = tostring(args.axis or "y")
+ local amount = tonumber(args.amount) or 1
+ if not ({x=1,y=1,z=1})[axis] then
+ return {ok=false, message="move: axis must be x, y or z"}
+ end
+ -- worldedit.move also updates pos1/pos2 in place
+ local count, newp1, newp2 = worldedit.move(p1, p2, axis, amount)
+ if newp1 then worldedit.pos1[name] = newp1 end
+ if newp2 then worldedit.pos2[name] = newp2 end
+ return {ok=true,
+ message=string.format("Moved %d nodes along %s by %d", count or 0, axis, amount),
+ nodes=count}
+ end,
+
+ stack = function(name, args)
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then
+ return {ok=false, message="No selection: use set_pos1/set_pos2 first"}
+ end
+ local axis = tostring(args.axis or "y")
+ local count = tonumber(args.count) or 1
+ if not ({x=1,y=1,z=1})[axis] then
+ return {ok=false, message="stack: axis must be x, y or z"}
+ end
+ local nodes = worldedit.stack(p1, p2, axis, count)
+ return {ok=true,
+ message=string.format("Stacked region %d times along %s (%d nodes)", count, axis, nodes or 0),
+ nodes=nodes}
+ end,
+
+ flip = function(name, args)
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then
+ return {ok=false, message="No selection: use set_pos1/set_pos2 first"}
+ end
+ local axis = tostring(args.axis or "y")
+ if not ({x=1,y=1,z=1})[axis] then
+ return {ok=false, message="flip: axis must be x, y or z"}
+ end
+ local count = worldedit.flip(p1, p2, axis)
+ return {ok=true,
+ message=string.format("Flipped region along %s (%d nodes)", axis, count or 0),
+ nodes=count}
+ end,
+
+ rotate = function(name, args)
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then
+ return {ok=false, message="No selection: use set_pos1/set_pos2 first"}
+ end
+ local angle = tonumber(args.angle) or 90
+ -- worldedit.rotate(pos1, pos2, axis, angle) โ axis is always "y" for our schema
+ local axis = tostring(args.axis or "y")
+ if not ({x=1,y=1,z=1})[axis] then
+ return {ok=false, message="rotate: axis must be x, y or z"}
+ end
+ if not ({[90]=1,[180]=1,[270]=1,[-90]=1})[angle] then
+ return {ok=false, message="rotate: angle must be 90, 180, 270 or -90"}
+ end
+ local count, newp1, newp2 = worldedit.rotate(p1, p2, axis, angle)
+ if newp1 then worldedit.pos1[name] = newp1 end
+ if newp2 then worldedit.pos2[name] = newp2 end
+ return {ok=true,
+ message=string.format("Rotated %d nodes by %d° around %s", count or 0, angle, axis),
+ nodes=count}
+ end,
+
+ -- ── Primitives (standalone, no selection needed) ─────────────────────────
+ -- All take absolute world coordinates from args.
+
+ sphere = function(name, args)
+ local pos = make_pos(args, name)
+ local radius = tonumber(args.radius) or 5
+ local node,err = resolve_node(tostring(args.node or "default:stone"))
+ if not node then return {ok=false, message="sphere: " .. err} end
+ local hollow = args.hollow == true or args.hollow == "true"
+ if radius < 1 or radius > 64 then
+ return {ok=false, message="sphere: radius must be 1–64"}
+ end
+ local count = worldedit.sphere(pos, radius, node, hollow)
+ return {ok=true,
+ message=string.format("Sphere r=%d %s at (%d,%d,%d): %d nodes",
+ radius, node, pos.x, pos.y, pos.z, count or 0),
+ nodes=count}
+ end,
+
+ dome = function(name, args)
+ local pos = make_pos(args, name)
+ local radius = tonumber(args.radius) or 5
+ local node,err = resolve_node(tostring(args.node or "default:stone"))
+ if not node then return {ok=false, message="dome: " .. err} end
+ local hollow = args.hollow == true or args.hollow == "true"
+ if radius < 1 or radius > 64 then
+ return {ok=false, message="dome: radius must be 1–64"}
+ end
+ local count = worldedit.dome(pos, radius, node, hollow)
+ return {ok=true,
+ message=string.format("Dome r=%d %s at (%d,%d,%d): %d nodes",
+ radius, node, pos.x, pos.y, pos.z, count or 0),
+ nodes=count}
+ end,
+
+ cylinder = function(name, args)
+ local pos = make_pos(args, name)
+ local axis = tostring(args.axis or "y")
+ local length = tonumber(args.length) or 5
+ local r1 = tonumber(args.radius or args.radius1) or 3
+ local r2 = tonumber(args.radius2) or r1
+ local node,err = resolve_node(tostring(args.node or "default:stone"))
+ if not node then return {ok=false, message="cylinder: " .. err} end
+ local hollow = args.hollow == true or args.hollow == "true"
+ if not ({x=1,y=1,z=1})[axis] then
+ return {ok=false, message="cylinder: axis must be x, y or z"}
+ end
+ if length < 1 or length > 128 then
+ return {ok=false, message="cylinder: length must be 1–128"}
+ end
+ local count = worldedit.cylinder(pos, axis, length, r1, r2, node, hollow)
+ return {ok=true,
+ message=string.format("Cylinder axis=%s len=%d r=%d %s at (%d,%d,%d): %d nodes",
+ axis, length, r1, node, pos.x, pos.y, pos.z, count or 0),
+ nodes=count}
+ end,
+
+ pyramid = function(name, args)
+ local pos = make_pos(args, name)
+ local axis = tostring(args.axis or "y")
+ local height = tonumber(args.height) or 5
+ local node,err = resolve_node(tostring(args.node or "default:stone"))
+ if not node then return {ok=false, message="pyramid: " .. err} end
+ local hollow = args.hollow == true or args.hollow == "true"
+ if not ({x=1,y=1,z=1})[axis] then
+ return {ok=false, message="pyramid: axis must be x, y or z"}
+ end
+ if height < 1 or height > 64 then
+ return {ok=false, message="pyramid: height must be 1–64"}
+ end
+ local count = worldedit.pyramid(pos, axis, height, node, hollow)
+ return {ok=true,
+ message=string.format("Pyramid axis=%s h=%d %s at (%d,%d,%d): %d nodes",
+ axis, height, node, pos.x, pos.y, pos.z, count or 0),
+ nodes=count}
+ end,
+
+ cube = function(name, args)
+ local pos = make_pos(args, name)
+ local w = tonumber(args.width or args.w) or 5
+ local h = tonumber(args.height or args.h) or 5
+ local l = tonumber(args.length or args.l) or 5
+ local node,err = resolve_node(tostring(args.node or "default:stone"))
+ if not node then return {ok=false, message="cube: " .. err} end
+ local hollow = args.hollow == true or args.hollow == "true"
+ if w < 1 or h < 1 or l < 1 or w > 128 or h > 128 or l > 128 then
+ return {ok=false, message="cube: dimensions must be 1–128"}
+ end
+ local count = worldedit.cube(pos, w, h, l, node, hollow)
+ return {ok=true,
+ message=string.format("Cube %dx%dx%d %s at (%d,%d,%d): %d nodes",
+ w, h, l, node, pos.x, pos.y, pos.z, count or 0),
+ nodes=count}
+ end,
+}
+
+-- Executes a list of tool calls (from LLM JSON response).
+-- Returns a results table: { {tool, ok, message}, ... }
+function M.execute_tool_calls(player_name, tool_calls)
+ if not M.is_available() then
+ return {{tool="*", ok=false, message="WorldEdit not available"}}
+ end
+
+ -- Snapshot current region before any changes (Phase 3)
+ -- Best-effort: if no selection yet, snapshot is skipped silently.
+ take_snapshot(player_name)
+
+ local results = {}
+ for i, call in ipairs(tool_calls) do
+ local tool_name = call.tool
+ local args = call.args or {}
+ -- Core dispatchers first; fall back to WEA dispatchers if registered
+ local dispatcher = DISPATCHERS[tool_name]
+ or (M.wea_dispatchers and M.wea_dispatchers[tool_name])
+
+ if not dispatcher then
+ table.insert(results, {
+ tool = tool_name,
+ ok = false,
+ message = "Unknown tool: " .. tostring(tool_name)
+ })
+ else
+ local ok, res = pcall(dispatcher, player_name, args)
+ if ok then
+ table.insert(results, {tool=tool_name, ok=res.ok, message=res.message})
+ else
+ table.insert(results, {
+ tool = tool_name,
+ ok = false,
+ message = "Dispatcher error: " .. tostring(res)
+ })
+ end
+ end
+
+ -- Abort the chain as soon as any step reports failure.
+ if not results[#results].ok then
+ table.insert(results, {tool="*", ok=false,
+ message="Chain aborted at step " .. i .. "."})
+ break
+ end
+ end
+ return results
+end
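+
+-- Usage sketch (player name and coordinates are hypothetical):
+--   local results = M.execute_tool_calls("singleplayer", {
+--     {tool="set_pos1", args={x=0, y=10, z=0}},
+--     {tool="set_pos2", args={x=4, y=12, z=4}},
+--     {tool="set_region", args={node="default:glass"}},
+--   })
+--   -- each entry has the shape {tool=..., ok=true/false, message=...}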
+
+-- ============================================================
+-- Iterative Macro Loop (Phase 4b)
+--
+-- M.run_loop(player_name, goal, options, callback)
+--
+-- How it works:
+-- 1. Send goal + full context to LLM
+-- 2. LLM responds with {plan, tool_calls, done, reason}
+-- - done=true → LLM signals it's finished
+-- - done=false → execute tool_calls, update context, iterate
+-- 3. After executing, rebuild context (new pos, new env scan)
+-- and send the step results back to LLM as the next user turn
+-- 4. Repeat until: done=true, max_iterations reached, or hard error
+--
+-- Token budget strategy:
+-- Each iteration gets a fresh context (NOT the full chat history).
+-- Only the goal plus a one-line summary per completed step is sent,
+-- so per-request token usage stays small instead of accumulating
+-- full transcripts.
+--
+-- options:
+-- max_iterations int Max loop iterations (default: 6)
+-- timeout int Per-request timeout in seconds (default: 90)
+-- on_step func Called after each step: on_step(i, plan, results)
+-- Use this to stream progress to the GUI.
+-- ============================================================
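+
+-- Illustrative single-iteration response from the LLM (values are hypothetical):
+--   {
+--     "plan": "Lay a 10x10 stone floor at the player's feet",
+--     "tool_calls": [
+--       {"tool": "set_pos1", "args": {"x": 0, "y": 7, "z": 0}},
+--       {"tool": "set_pos2", "args": {"x": 9, "y": 7, "z": 9}},
+--       {"tool": "set_region", "args": {"node": "default:stone"}}
+--     ],
+--     "done": false,
+--     "reason": ""
+--   }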
+
+-- The loop-mode system prompt (WE_PROMPTS.build_loop) tells the LLM about the done signal.
+
+function M.run_loop(player_name, goal, options, callback)
+ if not M.is_available() then
+ callback({ok=false, error="WorldEdit not available"})
+ return
+ end
+
+ local llm_api = _G.llm_api
+ if not llm_api then
+ callback({ok=false, error="llm_api not available"})
+ return
+ end
+
+ options = options or {}
+ local cfg = llm_api.config or {} -- llm_api is verified non-nil above
+ local max_iter = options.max_iterations or cfg.we_max_iterations or 6
+ local timeout = options.timeout or (llm_api.get_timeout and llm_api.get_timeout("we")) or 90
+ local on_step = options.on_step -- optional progress callback
+
+ local iteration = 0
+ local all_results = {}
+ local step_history = {} -- compact log of all steps so far
+
+ local function make_user_msg()
+ if iteration == 1 then
+ return "Goal: " .. goal
+ end
+ -- Compact history: just plan + ok/fail per step, not full node counts
+ local hist_lines = {"Goal: " .. goal, "", "Completed steps so far:"}
+ for i, s in ipairs(step_history) do
+ local ok_count = 0
+ local err_count = 0
+ for _, r in ipairs(s.results or {}) do
+ if r.ok then ok_count = ok_count + 1 else err_count = err_count + 1 end
+ end
+ table.insert(hist_lines, string.format(" Step %d: %s [✓%d ✗%d]",
+ i, s.plan, ok_count, err_count))
+ end
+ table.insert(hist_lines, "")
+ table.insert(hist_lines, "Continue with the next step of your plan. Set done=true when finished.")
+ return table.concat(hist_lines, "\n")
+ end
+
+ local function do_iteration()
+ iteration = iteration + 1
+
+ if iteration > max_iter then
+ callback({ok=true, finished=false,
+ reason="Reached max iterations (" .. max_iter .. ")",
+ steps=all_results})
+ return
+ end
+
+ local context = M.get_context(player_name)
+ local messages = {
+ {role="system", content=(WE_PROMPTS and WE_PROMPTS.build_loop(wea_enabled()) or "") .. context},
+ {role="user", content=make_user_msg()},
+ }
+
+ llm_api.request(messages, function(result)
+ if not result.success then
+ callback({ok=false, error=result.error or "LLM request failed", steps=all_results})
+ return
+ end
+
+ local raw = result.content or ""
+ raw = raw:match("```json\n(.-)```") or raw:match("```\n(.-)```") or raw
+ raw = raw:match("^%s*(.-)%s*$")
+
+ local parsed = core.parse_json(raw)
+ if not parsed or type(parsed) ~= "table" then
+ callback({ok=false, error="Invalid JSON on iteration " .. iteration,
+ raw=result.content, steps=all_results})
+ return
+ end
+
+ local plan = parsed.plan or "(no plan)"
+ local tool_calls = parsed.tool_calls or {}
+ local done = parsed.done == true
+ local reason = parsed.reason or ""
+
+ local exec_results = {}
+ if #tool_calls > 0 then
+ exec_results = M.execute_tool_calls(player_name, tool_calls)
+ end
+
+ local step = {iteration=iteration, plan=plan, results=exec_results, done=done}
+ table.insert(all_results, step)
+ table.insert(step_history, step)
+
+ if on_step then pcall(on_step, iteration, plan, exec_results) end
+
+ local had_hard_error = false
+ for _, r in ipairs(exec_results) do
+ if r.tool == "*" and not r.ok then had_hard_error = true; break end
+ end
+
+ if done or had_hard_error then
+ callback({
+ ok = true,
+ finished = done and not had_hard_error,
+ reason = had_hard_error
+ and ("Aborted: hard error on iteration " .. iteration)
+ or reason,
+ steps = all_results,
+ })
+ else
+ do_iteration()
+ end
+ end, {timeout=timeout})
+ end
+
+ -- Kick off
+ do_iteration()
+end
+
+-- ============================================================
+-- Format loop results for display
+-- ============================================================
+
+function M.format_loop_results(res)
+ if not res.ok then
+ return "✗ Loop error: " .. (res.error or "?")
+ end
+ local lines = {}
+ local icon = res.finished and "✓" or "⚠"
+ table.insert(lines, icon .. " Loop finished after "
+ .. #(res.steps or {}) .. " step(s). " .. (res.reason or ""))
+ for _, step in ipairs(res.steps or {}) do
+ table.insert(lines, string.format("\n→ Step %d: %s", step.iteration, step.plan))
+ for _, r in ipairs(step.results or {}) do
+ table.insert(lines, string.format(" %s %s โ %s",
+ r.ok and "✓" or "✗", r.tool, r.message))
+ end
+ end
+ return table.concat(lines, "\n")
+end
+
+-- ============================================================
+-- LLM request wrapper for agency mode (single-shot)
+-- Sends the WorldEdit context + user request to the LLM,
+-- parses the JSON response, executes tool calls.
+-- ============================================================
+
+function M.request(player_name, user_message, callback)
+ if not M.is_available() then
+ callback({ok=false, error="WorldEdit is not installed or not loaded."})
+ return
+ end
+
+ local llm_api = _G.llm_api
+ if not llm_api then
+ callback({ok=false, error="llm_api not available"})
+ return
+ end
+
+ local context = M.get_context(player_name)
+
+ local messages = {
+ {role = "system", content = (WE_PROMPTS and WE_PROMPTS.build_single(wea_enabled()) or "") .. context},
+ {role = "user", content = user_message},
+ }
+
+ llm_api.request(messages, function(result)
+ if not result.success then
+ callback({ok=false, error=result.error or "LLM request failed"})
+ return
+ end
+
+ -- Parse JSON response from LLM
+ local raw = result.content or ""
+
+ -- Strip markdown fences if present
+ raw = raw:match("```json\n(.-)```") or raw:match("```\n(.-)```") or raw
+ raw = raw:match("^%s*(.-)%s*$") -- trim
+
+ local parsed = core.parse_json(raw)
+ if not parsed or type(parsed) ~= "table" then
+ callback({
+ ok = false,
+ error = "LLM returned invalid JSON",
+ raw = result.content,
+ })
+ return
+ end
+
+ local plan = parsed.plan or "(no plan)"
+ local tool_calls = parsed.tool_calls or {}
+
+ if type(tool_calls) ~= "table" or #tool_calls == 0 then
+ callback({ok=true, plan=plan, results={}, message="No tool calls in response."})
+ return
+ end
+
+ -- Execute
+ local exec_results = M.execute_tool_calls(player_name, tool_calls)
+
+ callback({
+ ok = true,
+ plan = plan,
+ results = exec_results,
+ raw = result.content,
+ })
+ end, {
+ timeout = (llm_api.get_timeout and llm_api.get_timeout("we")) or 60,
+ -- Note: NOT using native tool_calls API here because local models
+ -- (Ollama) often don't support it reliably. We use JSON-in-text instead.
+ })
+end
+
+-- ============================================================
+-- Format results for display in chat_gui
+-- ============================================================
+
+function M.format_results(plan, results)
+ local lines = {"Plan: " .. (plan or "?")}
+ for i, r in ipairs(results) do
+ local icon = r.ok and "✓" or "✗"
+ table.insert(lines, string.format(" %s [%d] %s → %s",
+ icon, i, r.tool, r.message))
+ end
+ return table.concat(lines, "\n")
+end
+
+core.log("action", "[llm_worldedit] Agency module loaded (Phase 2 – all dispatchers active)")
+
+-- ============================================================
+-- Phase 5: WorldEditAdditions (WEA) Integration
+-- ============================================================
+-- WEA exposes its operations via the global `worldeditadditions`
+-- table AND as registered chat commands ("//torus" etc.).
+-- We prefer the direct Lua API where available (worldeditadditions.*),
+-- and fall back to chat-command dispatch where needed.
+--
+-- DETECTION: worldeditadditions table + at least one known function
+-- SETTING: llm_worldedit_additions (bool, default true)
+-- ============================================================
+
+local function wea_enabled()
+ if not core.settings:get_bool("llm_worldedit_additions", true) then
+ return false
+ end
+ return type(worldeditadditions) == "table"
+ and type(worldeditadditions.torus) == "function"
+end
+
+-- Internal: dispatch a WEA chat command by calling its registered
+-- handler directly. Returns {ok, message, nodes}.
+-- This is used for commands where WEA's Lua API differs from the
+-- chat command (erode, convolve, overlay, replacemix, layers).
+local function wea_cmd(player_name, cmd_name, params_str)
+ -- WEA registers commands as "/<name>" (single slash, WEA convention);
+ -- the chat command itself is registered by worldedit.register_command
+ local full_name = cmd_name -- e.g. "convolve", "erode"
+ -- Try: registered_chatcommands["/convolve"] first
+ local handler = core.registered_chatcommands["/" .. full_name]
+ if not handler then
+ -- Some WEA versions use double-slash: "//convolve"
+ handler = core.registered_chatcommands["//" .. full_name]
+ end
+ if not handler or not handler.func then
+ return {ok=false, message="WEA command '" .. full_name .. "' not registered"}
+ end
+ -- Capture both returns so the handler's message is not dropped
+ local ok, success, msg = pcall(handler.func, player_name, params_str or "")
+ if not ok then
+ return {ok=false, message="WEA dispatch error: " .. tostring(success)}
+ end
+ -- WEA command handlers return (bool, message) or just bool
+ if type(success) == "string" then
+ return {ok=true, message=success}
+ elseif type(success) == "boolean" then
+ return {ok=success, message=msg or (success and "OK" or "WEA command returned false")}
+ end
+ return {ok=true, message="OK"}
+end
+
+-- ── WEA DISPATCHERS ────────────────────────────────────────
+
+local WEA_DISPATCHERS = {
+
+ -- torus(pos1, radius_major, radius_minor, node, hollow)
+ -- pos1 must be set first; torus is placed at pos1
+ torus = function(name, args)
+ if not wea_enabled() then return {ok=false, message="WEA not available"} end
+ local p1 = worldedit.pos1[name]
+ if not p1 then return {ok=false, message="torus: set pos1 first"} end
+ local r_major = tonumber(args.radius_major or args.radius) or 10
+ local r_minor = tonumber(args.radius_minor or args.tube_radius) or 3
+ local node, err = resolve_node(tostring(args.node or "default:stone"))
+ if not node then return {ok=false, message="torus: " .. err} end
+ local hollow = args.hollow == true or args.hollow == "true"
+ if r_major < 1 or r_major > 64 then return {ok=false, message="torus: radius_major 1–64"} end
+ if r_minor < 1 or r_minor > 32 then return {ok=false, message="torus: radius_minor 1–32"} end
+ -- Direct Lua API: worldeditadditions.torus(pos, radius_major, radius_minor, node, hollow)
+ local ok, count = pcall(worldeditadditions.torus, p1, r_major, r_minor, node, hollow)
+ if not ok then
+ -- Fallback: chat command "//torus <r_major> <r_minor> <node> [hollow]"
+ local params = r_major .. " " .. r_minor .. " " .. node .. (hollow and " hollow" or "")
+ return wea_cmd(name, "torus", params)
+ end
+ return {ok=true,
+ message=string.format("Torus r_major=%d r_minor=%d %s at (%d,%d,%d): %d nodes",
+ r_major, r_minor, node, p1.x, p1.y, p1.z, count or 0),
+ nodes=count}
+ end,
+
+ hollowtorus = function(name, args)
+ args.hollow = true
+ args.radius_major = args.radius_major or args.radius or 10
+ return WEA_DISPATCHERS.torus(name, args)
+ end,
+
+ -- ellipsoid(pos1, rx, ry, rz, node, hollow)
+ ellipsoid = function(name, args)
+ if not wea_enabled() then return {ok=false, message="WEA not available"} end
+ local p1 = worldedit.pos1[name]
+ if not p1 then return {ok=false, message="ellipsoid: set pos1 first"} end
+ local rx = tonumber(args.rx or args.radius_x or args.radius) or 5
+ local ry = tonumber(args.ry or args.radius_y) or rx
+ local rz = tonumber(args.rz or args.radius_z) or rx
+ local node, err = resolve_node(tostring(args.node or "default:stone"))
+ if not node then return {ok=false, message="ellipsoid: " .. err} end
+ local hollow = args.hollow == true or args.hollow == "true"
+ for _, r in ipairs({rx, ry, rz}) do
+ if r < 1 or r > 64 then return {ok=false, message="ellipsoid: radii must be 1–64"} end
+ end
+ local ok, count = pcall(worldeditadditions.ellipsoid, p1, rx, ry, rz, node, hollow)
+ if not ok then
+ local params = rx .. " " .. ry .. " " .. rz .. " " .. node .. (hollow and " hollow" or "")
+ return wea_cmd(name, "ellipsoid", params)
+ end
+ return {ok=true,
+ message=string.format("Ellipsoid rx=%d ry=%d rz=%d %s at (%d,%d,%d): %d nodes",
+ rx, ry, rz, node, p1.x, p1.y, p1.z, count or 0),
+ nodes=count}
+ end,
+
+ hollowellipsoid = function(name, args)
+ args.hollow = true
+ return WEA_DISPATCHERS.ellipsoid(name, args)
+ end,
+
+ -- floodfill(pos1, node, radius)
+ -- Fills from pos1 outward, replacing all air
+ floodfill = function(name, args)
+ if not wea_enabled() then return {ok=false, message="WEA not available"} end
+ local p1 = worldedit.pos1[name]
+ if not p1 then return {ok=false, message="floodfill: set pos1 first"} end
+ local node, err = resolve_node(tostring(args.node or "default:stone"))
+ if not node then return {ok=false, message="floodfill: " .. err} end
+ local radius = tonumber(args.radius) or 10
+ if radius < 1 or radius > 50 then return {ok=false, message="floodfill: radius 1–50"} end
+ local ok, count = pcall(worldeditadditions.floodfill, p1, node, radius)
+ if not ok then
+ return wea_cmd(name, "floodfill", node .. " " .. radius)
+ end
+ return {ok=true,
+ message=string.format("Floodfill %s r=%d from (%d,%d,%d): %d nodes",
+ node, radius, p1.x, p1.y, p1.z, count or 0),
+ nodes=count}
+ end,
+
+ -- overlay(pos1, pos2, node)
+ -- Places node on top of every surface column in the selection
+ overlay = function(name, args)
+ if not wea_enabled() then return {ok=false, message="WEA not available"} end
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then return {ok=false, message="overlay: set pos1 and pos2 first"} end
+ local node, err = resolve_node(tostring(args.node or "default:dirt_with_grass"))
+ if not node then return {ok=false, message="overlay: " .. err} end
+ local ok, count = pcall(worldeditadditions.overlay, p1, p2, node)
+ if not ok then
+ return wea_cmd(name, "overlay", node)
+ end
+ return {ok=true,
+ message=string.format("Overlay %s on selection: %d nodes", node, count or 0),
+ nodes=count}
+ end,
+
+ -- replacemix(pos1, pos2, target_node, {node=chance, ...})
+ -- Replaces target with a weighted mix of replacement nodes
+ -- args: target, replacements = [{node, chance}, ...]
+ replacemix = function(name, args)
+ if not wea_enabled() then return {ok=false, message="WEA not available"} end
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then return {ok=false, message="replacemix: set pos1/pos2 first"} end
+ local target, err = resolve_node(tostring(args.target or args.search_node or "default:stone"))
+ if not target then return {ok=false, message="replacemix: " .. err} end
+ -- Build replacement list: [{node, chance}] or flat string for chat fallback
+ local replacements = args.replacements or {}
+ if #replacements == 0 then
+ -- Simple single replacement
+ local rnode = resolve_node(tostring(args.replace_node or args.node or "default:dirt"))
+ replacements = {{node=rnode, chance=1}}
+ end
+ -- Build chat params string as fallback: "<target> <nodeA> [chanceA] <nodeB> [chanceB] ..."
+ local param_parts = {target}
+ for _, r in ipairs(replacements) do
+ table.insert(param_parts, tostring(r.node or "default:dirt"))
+ if r.chance and r.chance ~= 1 then
+ table.insert(param_parts, tostring(r.chance))
+ end
+ end
+ -- Try direct API first
+ local ok, count = pcall(worldeditadditions.replacemix, p1, p2, target, replacements)
+ if not ok then
+ return wea_cmd(name, "replacemix", table.concat(param_parts, " "))
+ end
+ return {ok=true,
+ message=string.format("Replacemix %s → %d replacement(s): %d nodes affected",
+ target, #replacements, count or 0),
+ nodes=count}
+ end,
+
+ -- layers(pos1, pos2, layers_def)
+ -- Applies terrain layers from top down. layers_def is a list of {node, depth}.
+ -- E.g. layers_def = [{node="default:dirt_with_grass", depth=1}, {node="default:dirt", depth=3}]
+ layers = function(name, args)
+ if not wea_enabled() then return {ok=false, message="WEA not available"} end
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then return {ok=false, message="layers: set pos1/pos2 first"} end
+ local layers_def = args.layers or {}
+ if #layers_def == 0 then
+ -- Simple single-layer shortcut
+ local node = resolve_node(tostring(args.node or "default:dirt"))
+ local depth = tonumber(args.depth or 1)
+ layers_def = {{node=node, depth=depth}}
+ end
+ -- Build chat command params: "<node1> <depth1> [<node2> <depth2> ...]"
+ local param_parts = {}
+ for _, layer in ipairs(layers_def) do
+ table.insert(param_parts, tostring(layer.node or "default:dirt"))
+ table.insert(param_parts, tostring(layer.depth or 1))
+ end
+ local ok, count = pcall(worldeditadditions.layers, p1, p2, layers_def)
+ if not ok then
+ return wea_cmd(name, "layers", table.concat(param_parts, " "))
+ end
+ return {ok=true,
+ message=string.format("Layers (%d layer defs) on selection: %d nodes",
+ #layers_def, count or 0),
+ nodes=count}
+ end,
+
+ -- erode(pos1, pos2, [algorithm], [iterations])
+ -- Applies erosion simulation to terrain in selection
+ -- algorithm: "snowballs" (default) | "river" | "wind"
+ erode = function(name, args)
+ if not wea_enabled() then return {ok=false, message="WEA not available"} end
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then return {ok=false, message="erode: set pos1/pos2 first"} end
+ local algorithm = tostring(args.algorithm or "snowballs")
+ local iterations = tonumber(args.iterations or 1)
+ local valid_algos = {snowballs=true, river=true, wind=true}
+ if not valid_algos[algorithm] then algorithm = "snowballs" end
+ if iterations < 1 or iterations > 10 then iterations = 1 end
+ -- Build param string for chat fallback
+ local params = algorithm .. " " .. iterations
+ -- erode Lua API: worldeditadditions.erode(pos1, pos2, params_table)
+ local ok, count = pcall(worldeditadditions.erode, p1, p2,
+ {algorithm=algorithm, iterations=iterations})
+ if not ok then
+ return wea_cmd(name, "erode", params)
+ end
+ return {ok=true,
+ message=string.format("Erode algorithm=%s iterations=%d: %d nodes",
+ algorithm, iterations, count or 0),
+ nodes=count}
+ end,
+
+ -- convolve(pos1, pos2, [kernel], [size])
+ -- Smooths terrain with a convolution kernel
+ -- kernel: "gaussian" (default) | "box" | "pascal"
+ convolve = function(name, args)
+ if not wea_enabled() then return {ok=false, message="WEA not available"} end
+ local p1 = worldedit.pos1[name]
+ local p2 = worldedit.pos2[name]
+ if not p1 or not p2 then return {ok=false, message="convolve: set pos1/pos2 first"} end
+ local kernel = tostring(args.kernel or "gaussian")
+ local size = tonumber(args.size or 5)
+ local valid_kernels = {gaussian=true, box=true, pascal=true}
+ if not valid_kernels[kernel] then kernel = "gaussian" end
+ if size < 3 then size = 3 end
+ if size > 15 then size = 15 end
+ if size % 2 == 0 then size = size + 1 end -- must be odd
+ local params = kernel .. " " .. size
+ local ok, count = pcall(worldeditadditions.convolve, p1, p2,
+ {kernel=kernel, size=size})
+ if not ok then
+ return wea_cmd(name, "convolve", params)
+ end
+ return {ok=true,
+ message=string.format("Convolve kernel=%s size=%d: %d nodes",
+ kernel, size, count or 0),
+ nodes=count}
+ end,
+}
+
+-- ── WEA Integration in get_context() ──────────────────────
+-- Extend the context string with WEA capabilities when available
+
+local original_get_context = M.get_context
+function M.get_context(player_name)
+ local base = original_get_context(player_name)
+ if not wea_enabled() then return base end
+
+ local wea_capabilities = table.concat({
+ "torus(radius_major, radius_minor, node, hollow)",
+ "hollowtorus(radius_major, radius_minor, node)",
+ "ellipsoid(rx, ry, rz, node, hollow)",
+ "hollowellipsoid(rx, ry, rz, node)",
+ "floodfill(node, radius)",
+ "overlay(node)",
+ "replacemix(target, replacements=[{node,chance}])",
+ "layers(layers=[{node,depth}])",
+ "erode([algorithm=snowballs|river|wind], [iterations=1])",
+ "convolve([kernel=gaussian|box|pascal], [size=5])",
+ }, " | ")
+
+ local wea_block = table.concat({
+ "",
+ "=== WorldEditAdditions Tools ===",
+ "Available WEA tools (set pos1/pos2 first for region ops):",
+ wea_capabilities,
+ "torus/ellipsoid/floodfill use pos1 as center.",
+ "overlay/replacemix/layers/erode/convolve operate on pos1..pos2 region.",
+ "=== END WEA ===",
+ }, "\n")
+
+ -- Insert before "=== END CONTEXT ==="
+ return base:gsub("=== END CONTEXT ===", wea_block .. "\n=== END CONTEXT ===")
+end
+
+-- ── WEA Integration in SYSTEM_PROMPT ──────────────────────
+-- (the WEA prompt text itself ships in worldedit_system_prompts.lua
+-- as P.WEA_ADDENDUM)
+
+-- ── WEA TOOL_SCHEMA extension ──────────────────────────────
+
+M.WEA_TOOL_SCHEMA = {
+ {name="torus", description="Generate a torus at pos1. Set pos1 first.",
+ parameters={radius_major="integer (1–64)", radius_minor="integer (1–32)", node="string", hollow="boolean (optional)"}},
+ {name="hollowtorus", description="Generate a hollow torus at pos1.",
+ parameters={radius_major="integer", radius_minor="integer", node="string"}},
+ {name="ellipsoid", description="Generate an ellipsoid at pos1 with per-axis radii.",
+ parameters={rx="integer (1–64)", ry="integer (1–64)", rz="integer (1–64)", node="string", hollow="boolean (optional)"}},
+ {name="hollowellipsoid",description="Generate a hollow ellipsoid at pos1.",
+ parameters={rx="integer", ry="integer", rz="integer", node="string"}},
+ {name="floodfill", description="Flood-fill from pos1 outward, replacing air with node.",
+ parameters={node="string", radius="integer (1–50)"}},
+ {name="overlay", description="Place node on top of every surface column in pos1..pos2.",
+ parameters={node="string"}},
+ {name="replacemix", description="Replace target node with a weighted mix of nodes in selection.",
+ parameters={target="string", replacements="array of {node, chance}"}},
+ {name="layers", description="Apply terrain layers top-down in selection.",
+ parameters={layers="array of {node, depth}"}},
+ {name="erode", description="Apply erosion to terrain in selection.",
+ parameters={algorithm="string (snowballs|river|wind, optional)", iterations="integer 1–10 (optional)"}},
+ {name="convolve", description="Smooth terrain with convolution kernel in selection.",
+ parameters={kernel="string (gaussian|box|pascal, optional)", size="odd integer 3–15 (optional)"}},
+}
+
+-- ── WEA dispatcher registration ────────────────────────────
+-- Expose WEA_DISPATCHERS so the main dispatch system (set up during
+-- request/run_loop execution) can route WEA tool names to these
+-- handlers before falling back to standard WorldEdit dispatch.
+M.wea_dispatchers = WEA_DISPATCHERS
+
+-- ── wea_is_available() helper for external use ─────────────
+function M.wea_is_available()
+ return wea_enabled()
+end
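As a sketch of the contract these dispatchers share, a hypothetical caller could route one parsed LLM tool call like this (illustrative only, not part of the diff; `player_name` and `call` are assumed locals):

```lua
-- Illustrative routing sketch: every handler in M.wea_dispatchers takes
-- (player_name, args) and returns {ok=boolean, message=string, nodes=integer|nil}.
local call = {tool = "torus",
              args = {radius_major = 10, radius_minor = 3, node = "default:stone"}}
local handler = M.wea_dispatchers[call.tool]
if handler then
    local result = handler(player_name, call.args)
    core.chat_send_player(player_name, "[WEA] " .. result.message)
end
```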
+
+core.log("action", "[llm_worldedit] Phase 5: WEA integration loaded (wea_enabled=" .. tostring(wea_enabled()) .. ")")
+
+return M
diff --git a/material_picker.lua b/material_picker.lua
new file mode 100644
index 0000000..ed2df26
--- /dev/null
+++ b/material_picker.lua
@@ -0,0 +1,371 @@
+-- material_picker.lua v2.0
+-- Inventory-style material selection for the LLM WorldEdit context
+--
+-- UI: tiles with item icons (item_image) + colored highlight when active;
+-- search filter at the top, toggle-all button, remove-all button.
+-- Tiles are buttons: clicking toggles the selection.
+--
+-- PUBLIC API (used by chat_gui.lua / llm_worldedit.lua):
+-- M.get_materials(player_name) -> sorted list of node strings
+-- M.has_materials(player_name) -> bool
+-- M.build_material_context(player_name) -> string for the LLM system prompt
+-- M.show(player_name) -> open the formspec
+-- M.handle_fields(player_name, formname, fields) -> bool
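A sketch of how a consumer such as chat_gui.lua might use this public API (illustrative only, not part of the diff; only the function names listed above come from the module):

```lua
-- Illustrative consumer sketch for the PUBLIC API above.
local picker = dofile(core.get_modpath("llm_connect") .. "/material_picker.lua")
if picker.has_materials(player_name) then
    -- Append the player's chosen nodes to the LLM system prompt
    local ctx = picker.build_material_context(player_name)
    system_prompt = system_prompt .. "\n" .. ctx
else
    picker.show(player_name)  -- let the player pick materials first
end
```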
+
+local core = core
+local M = {}
+
+-- ============================================================
+-- Configuration
+-- ============================================================
+
+local COLS = 8 -- tiles per row
+local TILE_SIZE = 1.4 -- width/height of a tile in formspec units
+local TILE_PAD = 0.08 -- spacing between tiles
+local MAX_NODES = 128 -- max. number of candidates rendered
+
+-- ============================================================
+-- Session-State
+-- ============================================================
+
+local sessions = {}
+
+local function get_session(name)
+ if not sessions[name] then
+ sessions[name] = {
+ materials = {}, -- [node_name] = true
+ filter = "",
+ page = 1, -- current page (pagination)
+ }
+ end
+ return sessions[name]
+end
+
+core.register_on_leaveplayer(function(player)
+ sessions[player:get_player_name()] = nil
+end)
+
+-- ============================================================
+-- PUBLIC API
+-- ============================================================
+
+function M.get_materials(player_name)
+ local sess = get_session(player_name)
+ local list = {}
+ for node in pairs(sess.materials) do
+ table.insert(list, node)
+ end
+ table.sort(list)
+ return list
+end
+
+function M.has_materials(player_name)
+ local sess = get_session(player_name)
+ for _ in pairs(sess.materials) do return true end
+ return false
+end
+
+function M.build_material_context(player_name)
+ local mats = M.get_materials(player_name)
+ if #mats == 0 then return nil end
+ return table.concat({
+ "--- PLAYER-SELECTED BUILD MATERIALS ---",
+ "The player has explicitly chosen the following node(s) for this build.",
+ "Prefer these exact node names when generating tool_calls.",
+ "Nodes: " .. table.concat(mats, ", "),
+ "--- END MATERIALS ---",
+ }, "\n")
+end
+
+-- ============================================================
+-- Registry-Filter
+-- ============================================================
+
+local function build_candidate_list(filter)
+ filter = (filter or ""):lower():trim()
+ local candidates = {}
+
+ for name, def in pairs(core.registered_nodes) do
+ if not name:match("^__builtin")
+ and name ~= "air" and name ~= "ignore"
+ then
+ if filter == ""
+ or name:lower():find(filter, 1, true)
+ or (def.description and def.description:lower():find(filter, 1, true))
+ then
+ table.insert(candidates, name)
+ end
+ end
+ end
+
+ table.sort(candidates)
+ return candidates
+end
+
+-- ============================================================
+-- Formspec Builder
+-- ============================================================
+
+-- Computes page bounds and clamps the current page
+local function get_page_info(total, per_page, current_page)
+ local total_pages = math.max(1, math.ceil(total / per_page))
+ current_page = math.max(1, math.min(current_page, total_pages))
+ local first = (current_page - 1) * per_page + 1
+ local last = math.min(total, current_page * per_page)
+ return current_page, total_pages, first, last
+end
+
+local ITEMS_PER_PAGE = COLS * 6 -- 6 rows = 48 tiles per page
+
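The clamping logic of the pagination helper above can be checked in plain Lua; the following is a standalone copy of `get_page_info` for illustration, with worked numbers:

```lua
-- Pure-Lua illustration of the page clamping above (standalone copy).
local function get_page_info(total, per_page, current_page)
    local total_pages = math.max(1, math.ceil(total / per_page))
    current_page = math.max(1, math.min(current_page, total_pages))
    local first = (current_page - 1) * per_page + 1
    local last = math.min(total, current_page * per_page)
    return current_page, total_pages, first, last
end

-- 100 candidates at 48 per page -> 3 pages; a stale page 5 clamps to 3.
local page, pages, first, last = get_page_info(100, 48, 5)
assert(page == 3 and pages == 3 and first == 97 and last == 100)

-- An empty result set still yields one (empty) page: first > last.
local p, tp, f, l = get_page_info(0, 48, 1)
assert(p == 1 and tp == 1 and f == 1 and l == 0)
```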
+function M.show(player_name)
+ local sess = get_session(player_name)
+ local filter = sess.filter or ""
+ local candidates = build_candidate_list(filter)
+ local total = #candidates
+
+ local page, total_pages, first, last =
+ get_page_info(total, ITEMS_PER_PAGE, sess.page)
+ sess.page = page -- write the clamped page back
+
+ local selected_count = 0
+ for _ in pairs(sess.materials) do selected_count = selected_count + 1 end
+
+ -- ── Dimensions ─────────────────────────────────────────
+ local W = COLS * (TILE_SIZE + TILE_PAD) + 0.5
+ local HDR_H = 0.9
+ local SRCH_H = 0.7
+ local INFO_H = 0.4
+ local GRID_H = 6 * (TILE_SIZE + TILE_PAD)
+ local NAV_H = 0.7
+ local BTN_H = 0.75
+ local PAD = 0.25
+ local H = HDR_H + PAD + SRCH_H + PAD + INFO_H + PAD + GRID_H + PAD + NAV_H + PAD + BTN_H + PAD
+
+ local fs = {
+ "formspec_version[6]",
+ "size[" .. string.format("%.2f", W) .. "," .. string.format("%.2f", H) .. "]",
+ "bgcolor[#0d0d0d;both]",
+ "style_type[*;bgcolor=#181818;textcolor=#e0e0e0]",
+ }
+
+ -- ── Header ─────────────────────────────────────────────
+ table.insert(fs, "box[0,0;" .. string.format("%.2f", W) .. "," .. HDR_H .. ";#1e1e2e]")
+ table.insert(fs, "label[" .. PAD .. ",0.35;Build Materials – "
+ .. core.formspec_escape(player_name)
+ .. " (" .. selected_count .. " selected)]")
+ table.insert(fs, "style[close_picker;bgcolor=#3a1a1a;textcolor=#ffaaaa]")
+ table.insert(fs, "button[" .. string.format("%.2f", W - PAD - 2.0) .. ",0.12;2.0,0.65;close_picker;✕ Close]")
+
+ local y = HDR_H + PAD
+
+ -- ── Search field ────────────────────────────────────────
+ local field_w = W - PAD * 2 - 2.6
+ table.insert(fs, "field[" .. PAD .. "," .. y .. ";" .. string.format("%.2f", field_w) .. "," .. SRCH_H
+ .. ";filter;;" .. core.formspec_escape(filter) .. "]")
+ table.insert(fs, "style[filter;bgcolor=#111122;textcolor=#ccccff]")
+ table.insert(fs, "field_close_on_enter[filter;false]")
+ table.insert(fs, "style[do_filter;bgcolor=#1a1a3a;textcolor=#aaaaff]")
+ table.insert(fs, "button[" .. string.format("%.2f", PAD + field_w + 0.1) .. "," .. y
+ .. ";2.4," .. SRCH_H .. ";do_filter;Search]")
+ y = y + SRCH_H + PAD
+
+ -- ── Info line + toggle-all ──────────────────────────────
+ local info_str
+ if total == 0 then
+ info_str = "No nodes found"
+ else
+ info_str = string.format("%d node(s) – page %d/%d", total, page, total_pages)
+ end
+ table.insert(fs, "label[" .. PAD .. "," .. (y + 0.05) .. ";" .. core.formspec_escape(info_str) .. "]")
+
+ -- Toggle-all button (select/deselect everything on this page)
+ local page_nodes = {}
+ for i = first, last do
+ table.insert(page_nodes, candidates[i])
+ end
+ local page_all_selected = #page_nodes > 0
+ for _, n in ipairs(page_nodes) do
+ if not sess.materials[n] then page_all_selected = false; break end
+ end
+ local toggle_label = page_all_selected and "☑ Deselect Page" or "☐ Select Page"
+ local toggle_color = page_all_selected and "#2a4a2a" or "#333344"
+ table.insert(fs, "style[toggle_page;bgcolor=" .. toggle_color .. ";textcolor=#ccffcc]")
+ table.insert(fs, "button[" .. string.format("%.2f", W - PAD - 3.5) .. "," .. (y - 0.05)
+ .. ";3.5," .. INFO_H+0.1 .. ";toggle_page;" .. toggle_label .. "]")
+ y = y + INFO_H + PAD
+
+ -- ── Tile grid ───────────────────────────────────────────
+ -- Each tile = item_image_button (icon) + colored background when active
+ local col = 0
+ local row = 0
+ local IMG = TILE_SIZE - 0.25
+ local STEP = TILE_SIZE + TILE_PAD
+
+ for idx, node_name in ipairs(page_nodes) do
+ local tx = PAD + col * STEP
+ local ty = y + row * STEP
+
+ local is_sel = sess.materials[node_name] == true
+
+ -- Background box: green when selected, dark otherwise
+ local bg_color = is_sel and "#1a3a1a" or "#1a1a1a"
+ table.insert(fs, "box[" .. string.format("%.2f,%.2f;%.2f,%.2f", tx, ty, TILE_SIZE, TILE_SIZE)
+ .. ";" .. bg_color .. "]")
+
+ -- Item image button (clickable, shows the icon);
+ -- the button name encodes the global candidate index: "tile_N"
+ local btn_name = "tile_" .. tostring((page - 1) * ITEMS_PER_PAGE + idx)
+ -- item_image_button[x,y;w,h;item;name;label]
+ table.insert(fs, "item_image_button["
+ .. string.format("%.2f,%.2f;%.2f,%.2f", tx + 0.05, ty + 0.05, IMG, IMG)
+ .. ";" .. core.formspec_escape(node_name)
+ .. ";" .. btn_name .. ";]")
+
+ -- Checkmark label in the top-right corner when selected
+ if is_sel then
+ table.insert(fs, "label["
+ .. string.format("%.2f,%.2f", tx + TILE_SIZE - 0.38, ty + 0.18)
+ .. ";" .. core.formspec_escape(core.colorize("#00ff00", "✓")) .. "]")
+ end
+
+ -- Tooltip: node description + technical name
+ local def = core.registered_nodes[node_name]
+ local desc = (def and def.description and def.description ~= "")
+ and def.description or node_name
+ table.insert(fs, "tooltip[" .. btn_name .. ";"
+ .. core.formspec_escape(desc .. "\n" .. node_name) .. "]")
+
+ col = col + 1
+ if col >= COLS then
+ col = 0
+ row = row + 1
+ end
+ end
+
+ y = y + GRID_H + PAD
+
+ -- ── Navigation ──────────────────────────────────────────
+ local nav_btn_w = 2.2
+ if total_pages > 1 then
+ table.insert(fs, "style[page_prev;bgcolor=#222233;textcolor=#aaaaff]")
+ table.insert(fs, "button[" .. PAD .. "," .. y .. ";" .. nav_btn_w .. "," .. NAV_H .. ";page_prev;◀ Prev]")
+ table.insert(fs, "style[page_next;bgcolor=#222233;textcolor=#aaaaff]")
+ table.insert(fs, "button[" .. string.format("%.2f", W - PAD - nav_btn_w) .. "," .. y
+ .. ";" .. nav_btn_w .. "," .. NAV_H .. ";page_next;Next ▶]")
+ end
+ y = y + NAV_H + PAD
+
+ -- ── Bottom buttons ──────────────────────────────────────
+ local b_w = (W - PAD * 3) / 2
+ table.insert(fs, "style[clear_all;bgcolor=#3a1a1a;textcolor=#ff8888]")
+ table.insert(fs, "button[" .. PAD .. "," .. y .. ";" .. string.format("%.2f", b_w) .. "," .. BTN_H
+ .. ";clear_all;✕ Clear All Selected]")
+
+ table.insert(fs, "style[close_and_back;bgcolor=#1a2a1a;textcolor=#aaffaa]")
+ table.insert(fs, "button[" .. string.format("%.2f", PAD * 2 + b_w) .. "," .. y
+ .. ";" .. string.format("%.2f", b_w) .. "," .. BTN_H .. ";close_and_back;✔ Done]")
+
+ core.show_formspec(player_name, "llm_connect:material_picker", table.concat(fs))
+end
+
+-- ============================================================
+-- Formspec Handler
+-- ============================================================
+
+function M.handle_fields(player_name, formname, fields)
+ if not formname:match("^llm_connect:material_picker") then
+ return false
+ end
+
+ local sess = get_session(player_name)
+ local candidates = build_candidate_list(sess.filter)
+ local total = #candidates
+
+ -- Update the filter (live)
+ if fields.filter ~= nil then
+ sess.filter = fields.filter
+ end
+
+ -- ── Search / filter ───────────────────────────────────
+ if fields.do_filter or fields.key_enter_field == "filter" then
+ sess.page = 1
+ M.show(player_name)
+ return true
+ end
+
+ -- ── Pagination ────────────────────────────────────────
+ local page, total_pages = get_page_info(total, ITEMS_PER_PAGE, sess.page)
+
+ if fields.page_prev then
+ sess.page = math.max(1, page - 1)
+ M.show(player_name)
+ return true
+ end
+
+ if fields.page_next then
+ sess.page = math.min(total_pages, page + 1)
+ M.show(player_name)
+ return true
+ end
+
+ -- ── Toggle page ───────────────────────────────────────
+ if fields.toggle_page then
+ local _, _, first, last = get_page_info(total, ITEMS_PER_PAGE, sess.page)
+ -- Check whether everything on this page is selected
+ local all_sel = true
+ for i = first, last do
+ if not sess.materials[candidates[i]] then all_sel = false; break end
+ end
+ -- Toggle
+ for i = first, last do
+ if all_sel then
+ sess.materials[candidates[i]] = nil
+ else
+ sess.materials[candidates[i]] = true
+ end
+ end
+ M.show(player_name)
+ return true
+ end
+
+ -- ── Clear all ─────────────────────────────────────────
+ if fields.clear_all then
+ sess.materials = {}
+ M.show(player_name)
+ return true
+ end
+
+ -- ── Close / Done ──────────────────────────────────────
+ if fields.close_picker or fields.close_and_back or fields.quit then
+ -- Signal to chat_gui: close the picker and reopen the chat GUI
+ -- (handled in handle_fields of chat_gui.lua / init.lua)
+ return true
+ end
+
+ -- ── Tile buttons: tile_N ──────────────────────────────
+ -- Format: tile_<global_index> (1-based across all pages)
+ for field_name, _ in pairs(fields) do
+ local global_idx = field_name:match("^tile_(%d+)$")
+ if global_idx then
+ global_idx = tonumber(global_idx)
+ local node = candidates[global_idx]
+ if node then
+ if sess.materials[node] then
+ sess.materials[node] = nil
+ core.chat_send_player(player_name, "[LLM] ✗ " .. node)
+ else
+ sess.materials[node] = true
+ core.chat_send_player(player_name, "[LLM] ✓ " .. node)
+ end
+ end
+ M.show(player_name)
+ return true
+ end
+ end
+
+ return true
+end
+
+-- ============================================================
+core.log("action", "[llm_connect] material_picker.lua v2.0 loaded")
+return M
diff --git a/mod.conf b/mod.conf
index 8980fb9..939fd11 100644
--- a/mod.conf
+++ b/mod.conf
@@ -1,26 +1,7 @@
name = llm_connect
-description = Connects your Luanti/Minetest server to an LLM (Large Language Model) using an OpenAI-compatible API endpoint.
+description = Connects your Luanti server to an LLM + integrated AI-powered Lua IDE
title = LLM Connect
author = H5N3RG
license = LGPL-3.0-or-later
-media_license = LGPL-3.0-or-later
-forum =
-depends =
-optional_depends =
-version = 0.7.8
-
-# === Default Settings ===
-llm_max_tokens_integer = true
-llm_api_key =
-llm_api_url =
-llm_model =
-
-# === Default Context Settings ===
-llm_context_send_server_info = true
-llm_context_send_mod_list = false
-llm_context_send_commands = true
-llm_context_send_player_pos = true
-llm_context_send_materials = false
-
-
-
+version = 0.9.0-dev
+optional_depends = worldedit, worldeditadditions
diff --git a/settingtypes.txt b/settingtypes.txt
index 43b12fd..bacd18d 100644
--- a/settingtypes.txt
+++ b/settingtypes.txt
@@ -1,36 +1,92 @@
# ===========================================================================
-# LLM Connect - Luanti/Minetest mod settings
+# LLM Connect - Luanti/Minetest Mod Settings
# ===========================================================================
-# Configure the LLM connection and behavior in the in-game menu
-# Internal name, (Label), Type, Default, [min max for int/float]
+# LLM Connection, Chat, IDE, WorldEdit & Prompt Behavior
+# Version: 0.9.0
# ===========================================================================
-# Determines whether max_tokens is sent as integer (true) or float (false)
-llm_max_tokens_integer (Send max_tokens as integer) bool true
+# === LLM API Base Settings ===
-# Your API key for the LLM endpoint
-llm_api_key (API Key) string
+llm_api_key (API Key) string
+llm_api_url (API URL – OpenAI compatible) string
+llm_model (Model Name) string
-# The URL of the OpenAI-compatible LLM endpoint
-llm_api_url (API URL) string
+llm_max_tokens (Max Tokens – Response length) int 4000 500 16384
+llm_max_tokens_integer (Send max_tokens as integer) bool true
-# The model to use for the LLM (leave empty for none)
-llm_model (Model) string
+llm_temperature (Temperature – Creativity 0..2) float 0.7 0.0 2.0
+llm_top_p (Top P – Nucleus Sampling 0..1) float 0.9 0.0 1.0
+llm_presence_penalty (Presence Penalty – -2..2) float 0.0 -2.0 2.0
+llm_frequency_penalty (Frequency Penalty – -2..2) float 0.0 -2.0 2.0
-# === Context Configuration ===
+# Global timeout for ALL LLM requests (chat, IDE, WorldEdit).
+# Per-mode overrides (llm_timeout_chat/ide/we) take precedence if set > 0.
+# Default: 120 seconds. Range: 30–600.
+llm_timeout (Global Request Timeout in seconds) int 120 30 600
-# Send server name, description, motd, gameid, port, worldpath, mapgen
-llm_context_send_server_info (Send Server Info) bool true
+# Per-mode timeout overrides. Set to 0 to use the global llm_timeout.
+llm_timeout_chat (Chat mode timeout override, 0=global) int 0 0 600
+llm_timeout_ide (IDE mode timeout override, 0=global) int 0 0 600
+llm_timeout_we (WorldEdit mode timeout override, 0=global) int 0 0 600
-# Send the list of all installed mods
-llm_context_send_mod_list (Send Mod List) bool false
+llm_debug (Enable debug logging) bool false
-# Send the list of all available chat commands
-llm_context_send_commands (Send Commands List) bool true
-# Send the player's current position (x,y,z)
-llm_context_send_player_pos (Send Player Position) bool true
+# === Chat Context ===
-# Send the list of registered nodes, craftitems, tools, and entities
-llm_context_send_materials (Send Available Materials) bool false
+llm_context_send_server_info (Send server info to LLM) bool true
+llm_context_send_mod_list (Send list of active mods) bool false
+llm_context_send_commands (Send available chat commands) bool true
+llm_context_send_player_pos (Send player position and HP) bool true
+llm_context_send_materials (Send node/item/tool registry sample) bool false
+
+# Max chat history messages sent per LLM request. Oldest dropped first.
+llm_context_max_history (Max chat history messages sent) int 20 2 100
+
+
+# === Language ===
+
+llm_language (Response language) enum en en,de,es,fr,it,pt,ru,zh,ja,ko,ar,hi,tr,nl,pl,sv,da,no,fi,cs,hu,ro,el,th,vi,id,ms,he,bn,uk
+llm_language_instruction_repeat (Repeat language instruction) int 1 0 5
+
+
+# === IDE – Behavior ===
+
+llm_ide_hot_reload (Hot-reload world after execution) bool true
+llm_ide_auto_save (Auto-save code buffer) bool true
+llm_ide_live_suggestions (Live suggestions – not yet implemented) bool false
+llm_ide_whitelist_enabled (Sandbox security whitelist) bool true
+
+# Send last run output to LLM so it can self-correct after a failed execution.
+llm_ide_include_run_output (Send last run output for self-correction) bool true
+
+# Max lines of code sent as context. Prevents token overflow. 0 = no limit.
+llm_ide_max_code_context (Max code lines sent to LLM, 0=unlimited) int 300 0 2000
+
+
+# === IDE – Guiding Prompts ===
+
+# Inject naming-convention guide into IDE LLM calls.
+# Teaches the model that registrations must use the "llm_connect:" prefix.
+llm_ide_naming_guide (Inject naming convention guide) bool true
+
+# Inject context about active mods and nodes into Generate calls.
+llm_ide_context_mod_list (Send mod list in IDE context) bool true
+llm_ide_context_node_sample (Send node sample in IDE context) bool true
+
+
+# === WorldEdit ===
+
+llm_worldedit_additions (Enable WorldEditAdditions tools) bool true
+llm_we_max_iterations (Max iterations in WE Loop mode) int 6 1 20
+llm_we_snapshot_before_exec (Snapshot before each WE execution) bool true
+
+
+# ===========================================================================
+# Notes:
+# - llm_timeout_*: 0 = inherit global llm_timeout
+# - llm_language "en" = no language instruction injected (saves tokens)
+# - llm_ide_* settings only affect the Smart Lua IDE
+# - llm_we_* settings only affect WorldEdit agency mode
+# - Timeout/config changes take effect after /llm_config reload or restart
+# ===========================================================================
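To preseed these settings without the in-game GUI, a server admin can place values in minetest.conf; an illustrative fragment (the endpoint and model names are placeholders, not recommendations):

```ini
# Illustrative minetest.conf fragment for llm_connect
llm_api_url = https://api.example.com/v1/chat/completions
llm_api_key = YOUR_KEY_HERE
llm_model = your-model-name
llm_max_tokens = 4000
llm_timeout = 120
# WorldEdit builds may need longer than chat:
llm_timeout_we = 300
llm_language = en
llm_worldedit_additions = true
```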
diff --git a/system_prompt.txt b/system_prompt.txt
deleted file mode 100644
index a2bcbb3..0000000
--- a/system_prompt.txt
+++ /dev/null
@@ -1 +0,0 @@
-YOU ARE A HELPFUL, KNOWLEDGEABLE, AND FRIENDLY AI ASSISTANT FOR THE LUANTI (FORMERLY MINETEST) GAME PLATFORM.
diff --git a/worldedit_system_prompts.lua b/worldedit_system_prompts.lua
new file mode 100644
index 0000000..51159b5
--- /dev/null
+++ b/worldedit_system_prompts.lua
@@ -0,0 +1,114 @@
+-- worldedit_system_prompts.lua
+-- System prompts for LLM WorldEdit agency mode
+-- Used by llm_worldedit.lua
+
+local P = {}
+
+-- ============================================================
+-- Base prompt (single-shot and loop mode)
+-- ============================================================
+
+P.SYSTEM_PROMPT = [[You are a WorldEdit agent inside a Luanti (Minetest) voxel game.
+Your job is to translate the player's natural language building request into a sequence of WorldEdit tool calls.
+
+You will receive:
+- The player's current position (x, y, z)
+- Their current WorldEdit selection (pos1, pos2) if any
+- A coarse sample of nearby nodes
+- The list of available tools
+
+Respond ONLY with a JSON object:
+{
+ "plan": "<one-sentence description of what you will do>",
+ "tool_calls": [
+ {"tool": "<tool_name>", "args": { ... }},
+ ...
+ ]
+}
+
+Do NOT add explanation text outside the JSON.
+Do NOT invent tool names not in the available list.
+Use "air" to remove/clear nodes.
+Coordinates must be integers.
+
+Example response:
+{
+ "plan": "Place a 5x3x5 stone platform 2 blocks below the player.",
+ "tool_calls": [
+ {"tool": "set_pos1", "args": {"x": -12, "y": 63, "z": 44}},
+ {"tool": "set_pos2", "args": {"x": -8, "y": 65, "z": 48}},
+ {"tool": "set_region", "args": {"node": "default:stone"}}
+ ]
+}
+]]
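The strict-JSON contract above implies the consuming code must validate replies before dispatch; a minimal sketch (assumes Luanti's `core.parse_json`; the function name and `known_tools` table are hypothetical):

```lua
-- Illustrative validator for replies following the contract above.
-- Returns the decoded table, or nil plus an error string.
local function parse_agent_reply(raw, known_tools)
    local data = core.parse_json(raw)
    if type(data) ~= "table" or type(data.tool_calls) ~= "table" then
        return nil, "reply is not the expected JSON object"
    end
    for i, call in ipairs(data.tool_calls) do
        if type(call.tool) ~= "string" or not known_tools[call.tool] then
            return nil, string.format("tool_calls[%d]: unknown tool %q",
                i, tostring(call.tool))
        end
    end
    return data
end
```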
+
+-- ============================================================
+-- Loop mode addendum (appended to SYSTEM_PROMPT for run_loop)
+-- ============================================================
+
+P.LOOP_ADDENDUM = [[
+
+ADDITIONAL RULES FOR ITERATIVE MODE:
+
+STEP 1 ONLY โ On the very first step (when you receive only "Goal: ..."):
+ - First write a short OVERALL PLAN as your "plan" field describing ALL steps you intend to take.
+ - Then execute only the FIRST part of that plan in tool_calls.
+ - Example: Goal is "build a house" → plan = "Step 1/4: Place 10x5x10 stone floor. Then: hollow walls, add roof, add door."
+
+SUBSEQUENT STEPS โ You receive "Completed steps so far:" plus your original goal:
+ - Your "plan" field should say which step of your overall plan this is (e.g. "Step 2/4: Hollow out walls")
+ - Only execute the CURRENT step, not the whole plan at once.
+ - If a previous step failed, note it and adapt. Never repeat a failing call unchanged.
+
+DONE SIGNAL:
+ - Set "done": true only when the entire structure is complete.
+ - Set "done": true also if you are stuck after a failure.
+ - Always set "done": false if there are more steps remaining.
+
+COORDINATE DISCIPLINE:
+ - Always use absolute integer coordinates.
+ - pos arguments for sphere/dome/cylinder/pyramid/cube must be {x,y,z} – never a string.
+ - pos1 and pos2 define the region for set_region, replace, copy, move, stack, flip, rotate.
+
+Response format (strict JSON, no extra text):
+{
+ "plan": "<which step of the overall plan this is>",
+ "tool_calls": [ {"tool": "...", "args": {...}}, ... ],
+ "done": false,
+ "reason": "<brief status, or why you are done/stuck>"
+}
+]]
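The iterative contract above implies a driver loop on the mod side; a minimal sketch (illustrative only: `request_llm`, `execute_tool_calls`, and `goal` are hypothetical, while `llm_we_max_iterations` is the real setting from settingtypes.txt):

```lua
-- Illustrative loop driver honoring the "done" signal above.
-- 'goal' is the player's original natural-language request.
local max_iter = tonumber(core.settings:get("llm_we_max_iterations")) or 6
local completed = {}  -- "Completed steps so far" fed back each iteration
for step = 1, max_iter do
    local reply = request_llm(goal, completed)  -- decoded JSON per the format above
    execute_tool_calls(reply.tool_calls)
    table.insert(completed, reply.plan)
    if reply.done then break end
end
```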
+
+-- ============================================================
+-- WorldEditAdditions addendum (appended when WEA is available)
+-- ============================================================
+
+P.WEA_ADDENDUM = [[
+When WorldEditAdditions (WEA) tools are available, you may use them alongside standard WorldEdit tools.
+WEA tools require pos1 to be set (torus, ellipsoid, floodfill) or both pos1+pos2 (overlay, replacemix, layers, erode, convolve).
+
+WEA tool examples:
+- Torus: {"tool": "torus", "args": {"radius_major": 10, "radius_minor": 3, "node": "default:stone"}}
+- Ellipsoid: {"tool": "ellipsoid", "args": {"rx": 8, "ry": 5, "rz": 8, "node": "default:dirt"}}
+- Overlay: {"tool": "overlay", "args": {"node": "default:dirt_with_grass"}}
+- Layers: {"tool": "layers", "args": {"layers": [{"node": "default:dirt_with_grass", "depth": 1}, {"node": "default:dirt", "depth": 3}]}}
+- Erode: {"tool": "erode", "args": {"algorithm": "snowballs", "iterations": 2}}
+- Convolve: {"tool": "convolve", "args": {"kernel": "gaussian", "size": 5}}
+- Replacemix: {"tool": "replacemix", "args": {"target": "default:stone", "replacements": [{"node": "default:cobble", "chance": 2}, {"node": "default:mossy_cobble", "chance": 1}]}}
+]]
+
+-- ============================================================
+-- Convenience: build full system prompt strings
+-- ============================================================
+
+-- Single-shot prompt (with optional WEA addendum)
+function P.build_single(wea)
+ return P.SYSTEM_PROMPT .. (wea and P.WEA_ADDENDUM or "")
+end
+
+-- Loop prompt (with optional WEA addendum)
+function P.build_loop(wea)
+ return P.SYSTEM_PROMPT .. P.LOOP_ADDENDUM .. (wea and P.WEA_ADDENDUM or "")
+end
+
+return P