# LLM Connect

A Minetest mod that connects the game to a Large Language Model (LLM) using an OpenAI-compatible API endpoint.

## Purpose

This mod allows players to interact with an LLM directly within the Minetest chat. It sends the player's chat message along with relevant in-game context, such as server info, installed mods, and available materials, to a remote API endpoint. This enables the LLM to provide highly specific and helpful answers, e.g., on crafting items or locating resources in-game.
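
As an illustration of the idea, the context could be folded into the system message of an OpenAI-style request roughly like this (a Python sketch with made-up field names, not the mod's actual Lua code):

```python
# Hypothetical sketch: bundle server context into the system message
# of an OpenAI-style "messages" list. Field names are illustrative.
def build_messages(player_prompt, server_info, mods, materials):
    context = (
        f"Server: {server_info}\n"
        f"Installed mods: {', '.join(mods)}\n"
        f"Available materials: {', '.join(materials)}"
    )
    return [
        {"role": "system", "content": "You are an in-game assistant.\n" + context},
        {"role": "user", "content": player_prompt},
    ]
```

Because the context rides along as a system message, the LLM can answer with server-specific detail without any fine-tuning.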

## Features

- **In-game AI Chat:** Send prompts to the LLM using a simple chat command.
- **Context-Aware:** Automatically includes crucial server and material data in the prompts.
- **Configurable:** API key, endpoint URL, and model can be set via chat commands or the in-game menu.
- **Conversation History:** Maintains short-term conversation history for more relevant responses.
- **Robust Token Handling:** Supports sending `max_tokens` as an integer to avoid floating-point issues; optionally configurable via `settingtypes.txt` or chat commands.
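
The `max_tokens` issue is easy to reproduce outside the mod: some endpoints reject `200.0` where they accept `200`. A quick Python illustration of why the mod coerces the value to an integer before serialization:

```python
import json

# Some OpenAI-compatible servers reject a float max_tokens, so an
# integer must reach the wire. JSON keeps the distinction:
print(json.dumps({"max_tokens": 200}))    # {"max_tokens": 200}
print(json.dumps({"max_tokens": 200.0}))  # {"max_tokens": 200.0}

# Coercing before serialization avoids the problem:
payload = {"max_tokens": int(200.0)}
print(json.dumps(payload))                # {"max_tokens": 200}
```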

## Implementation

- Built with Minetest's HTTP API for sending requests to an external, OpenAI-compatible endpoint.
- Structured to be understandable and extendable for new contributors.
- Version: **0.7.5**

## Requirements

- A running Minetest server.
- An API key from a supported service.
- Access to external AI services (online and/or offline).

## Supported API Endpoints

Successfully tested with:

- [Open WebUI](https://github.com/open-webui/open-webui)
- [LM Studio](https://lmstudio.ai/)
- [Mistral AI](https://docs.mistral.ai/)
- OpenAI API
- Ollama and LocalAI (integer `max_tokens` ensures compatibility)
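
All of these services expose the same chat-completions request shape, so switching between them is mostly a matter of changing the base URL. A hedged Python sketch of assembling such a request (illustrative names, not the mod's code; the mod itself performs the call through Minetest's HTTP API):

```python
import json

def build_chat_request(base_url, api_key, model, messages, max_tokens=256):
    """Assemble URL, headers, and JSON body for an OpenAI-compatible endpoint."""
    url = base_url.rstrip("/") + "/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    body = json.dumps({
        "model": model,
        "messages": messages,
        "max_tokens": int(max_tokens),  # integer, not float (see Features)
    })
    return url, headers, body
```

Only the request assembly is shown here; sending it and parsing the `choices[0].message.content` field of the response is left out for brevity.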

## Commands

- `/llm_setkey <key> [url] [model]` – Sets the API key, endpoint URL, and model.
- `/llm_setmodel <model>` – Sets the LLM model to be used.
- `/llm_set_endpoint <url>` – Sets the API endpoint URL.
- `/llm_set_context <count> [player]` – Sets the maximum context length for a player or all players.
- `/llm_reset` – Resets the conversation history for the current player.
- `/llm <prompt>` – Sends a message to the LLM.
- `/llm_integer` – Forces `max_tokens` to be sent as an integer (default).
- `/llm_float` – Sends `max_tokens` as a float (optional, experimental).
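
The effect of `/llm_set_context` and `/llm_reset` on the stored conversation can be pictured as a simple cap on recent exchanges (a Python sketch with hypothetical names, not the mod's implementation):

```python
def trim_history(history, max_len):
    """Keep only the most recent max_len messages.
    A max_len of 0 clears everything, comparable to /llm_reset."""
    if max_len <= 0:
        return []
    return history[-max_len:]
```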

## Potential for Expansion

- Add support for more API endpoints.
- Integrate with additional in-game events or data sources (player inventory, world data).
- Improve error handling and performance.
- Create a graphical user interface (formspec) for configuration instead of relying solely on chat commands.

## Contributing

This project is in an early stage and welcomes contributions:

- Even small fixes help, especially with API integration, UI improvements, and performance tuning.
- Contributions from experienced developers are highly welcome.
- The goal is to build a robust, maintainable mod for the Minetest community.