PAR LLAMA is a Terminal User Interface (TUI) application for managing and using Ollama-based Large Language Models (LLMs). Users interact with Ollama models directly from the terminal or command prompt, so they can get work done without constantly switching between applications.
The application is built with Textual and Rich, two popular Python libraries for building TUI applications. These libraries provide complex layouts and polished widgets out of the box, which let the PAR LLAMA developers create an intuitive, easy-to-use interface that works across all major operating systems.
As noted in the GitHub repository (https://github.com/paulrobello/parllama), PAR LLAMA runs on Windows, Windows WSL, macOS, and Linux, so users can work in whichever environment they already know.
Current State of PAR LLAMA
• Initial release with features like model creation, remote instance connection, chat history management, and multi-model conversations.
• Custom prompt library, with import from Fabric.
• Auto-completion for slash commands, input history, and multi-line editing.
• Integration with cloud AI providers (OpenAI, Anthropic, Groq, Google).
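Under the hood, chatting with a local or remote Ollama instance goes through Ollama's documented REST endpoint `/api/chat`. The stdlib-only sketch below is an illustration of that protocol, not PAR LLAMA's actual code; the model names and the fan-out helper are assumptions. It shows how a multi-model conversation can be expressed as the same message history sent to several models:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default port; change for a remote instance

def build_chat_request(model: str, history: list[dict]) -> request.Request:
    """Build a POST request for Ollama's /api/chat endpoint."""
    body = json.dumps({
        "model": model,
        "messages": history,  # [{"role": "user" | "assistant", "content": ...}, ...]
        "stream": False,      # ask for one JSON reply instead of a token stream
    }).encode()
    return request.Request(
        f"{OLLAMA_URL}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
    )

# Multi-model conversation: fan the same history out to several models.
history = [{"role": "user", "content": "Summarize what a TUI is in one sentence."}]
reqs = [build_chat_request(m, history) for m in ("llama3.2", "mistral")]
# Each req would then be sent with request.urlopen(req) against a running server.
```

The same request shape works for a remote instance by pointing `OLLAMA_URL` at another host, which is essentially what the remote-instance connection feature requires.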
Future Developments
• RAG (Retrieval-Augmented Generation) support for local documents and web pages.
• Vision-based LLMs using images.
• Expanded custom prompt import from other tools.
• Improved tool use experience.
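The planned RAG support pairs a retriever over local documents with the model prompt: relevant passages are found first, then prepended to the user's question. The toy sketch below illustrates that idea only; it is not the planned implementation, and the word-overlap scoring is a deliberately crude stand-in for real embeddings:

```python
from collections import Counter

def score(query: str, doc: str) -> int:
    """Crude relevance score: count of shared lowercase words."""
    q = Counter(query.lower().split())
    d = Counter(doc.lower().split())
    return sum((q & d).values())

def build_rag_prompt(query: str, docs: list[str], top_k: int = 1) -> str:
    """Prepend the best-matching local documents to the user's question."""
    ranked = sorted(docs, key=lambda d: score(query, d), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Use this context to answer:\n{context}\n\nQuestion: {query}"

docs = [
    "PAR LLAMA is a TUI for managing Ollama models.",
    "Bananas are rich in potassium.",
]
prompt = build_rag_prompt("What is PAR LLAMA?", docs)
```

A production retriever would chunk documents, embed them, and rank by vector similarity, but the prompt-assembly step at the end stays essentially the same.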