Update README.md

Signed-off-by: H5N3RG <janguenni13@web.de>
Committed by GitHub on 2025-09-11 22:02:52 +02:00
parent 45c362d12a
commit dddb389fea


@@ -3,31 +3,33 @@
**A Minetest mod that connects the game to an LLM using an OpenAI-compatible API endpoint.**

### Purpose

This mod allows players to interact with a Large Language Model (LLM) directly within the Minetest chat. It sends the player's chat message along with relevant in-game context (server info, installed mods, and available materials) to a remote API endpoint. This enables the LLM to provide highly specific and helpful answers, for example, on how to craft items or where to find certain materials in the game.
### Features

* **In-game AI Chat**: Use a simple chat command to send prompts to the LLM.
* **Context-Aware**: Automatically includes crucial server and material data in the prompts.
* **Configurable**: The API key, endpoint URL, and model can be set via chat commands and the in-game menu.
* **Conversation History**: Maintains a short-term conversation history for more relevant responses.
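The conversation-history feature can be sketched as a capped per-player message buffer (Python for illustration; the mod itself is written in Lua, and the names and cap size here are hypothetical, not the mod's actual API):

```python
# Hypothetical sketch of a short-term, per-player conversation history.
MAX_HISTORY = 8  # keep only the most recent messages per player (assumed cap)

histories = {}  # player name -> list of {"role": ..., "content": ...} messages


def append_message(player, role, content):
    """Record a chat message and trim the buffer to the newest entries."""
    history = histories.setdefault(player, [])
    history.append({"role": role, "content": content})
    # Drop the oldest entries so the prompt sent to the LLM stays small
    # while still carrying recent context.
    del history[:-MAX_HISTORY]
    return history
```

A `/llm_reset`-style command would then simply clear that player's entry from the table.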
### Implementation

This mod was created with the help of an AI assistant and is currently in its early stages. It utilizes Minetest's HTTP API to send requests to an external, OpenAI-compatible endpoint. The mod is structured to be easily understandable and extendable for new contributors.
### Requirements

* A running Minetest server.
* An API key from a supported service.
* **Access to external AI services (online and/or offline).**
### Supported API Endpoints

This mod has been successfully tested with the following APIs:
* [Open WebUI](https://github.com/open-webui/open-webui)
* [LM Studio](https://lmstudio.ai/)
* [Mistral AI](https://docs.mistral.ai/)
* **OpenAI API**
* **Ollama** and **LocalAI** are also now compatible, as the mod supports sending `max_tokens` as an integer to avoid floating-point issues.
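The floating-point issue likely stems from Lua 5.1 (as used by Minetest) having a single floating-point number type, so a naive serializer can emit `512.0` where strict backends expect the integer `512`. A minimal illustration (Python; the helper name is hypothetical):

```python
import json


def serialize_options(max_tokens):
    # Coerce to int so strict backends (e.g. Ollama, LocalAI) receive a JSON
    # integer such as 512 rather than 512.0.
    return json.dumps({"max_tokens": int(max_tokens)})
```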
### Commands

* `/llm_setkey <key> [url] [model]`
  * Sets the API key, URL, and model.
* `/llm_setmodel <model>`
  * Sets the LLM model to be used.
* `/llm_set_endpoint <url>`
@@ -37,13 +39,13 @@ This mod was created with the help of an AI assistant to generate the core code.
* `/llm_reset`
  * Resets the conversation history for the current player.
* `/llm <prompt>`
  * Sends a message to the LLM.
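Parsing the optional arguments of a command like `/llm_setkey <key> [url] [model]` can be sketched as follows (Python for illustration; the function name is hypothetical and missing optional parts are left unset rather than guessed):

```python
def parse_setkey(param):
    """Parse '<key> [url] [model]'; omitted optional parts come back as None."""
    parts = param.split()
    if not parts:
        return None  # the key is required
    key, url, model = (parts + [None, None])[:3]
    return {"key": key, "url": url, "model": model}
```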
### Potential for Expansion

This project is in an early stage and offers many possibilities for future development:
* Adding support for more API endpoints.
* Integrating with other in-game events and data sources (e.g., player inventory, world data).
* Improving error handling and performance.
* Creating a user interface (formspec) for configuration instead of relying on chat commands.
I am not an experienced programmer, and my primary goal is to make the mod known and have more experienced developers contribute to its further development. If you are a programmer and would like to help improve this mod, your contributions are highly welcome!