PLLuM is a family of large language models (LLMs) specialized in Polish and other Slavic/Baltic languages, with additional English data incorporated for broader generalization. Developed through an extensive collaboration with various data providers, PLLuM models are built on high-quality text corpora and refined through instruction tuning, preference learning, and advanced alignment techniques. These models are intended to generate contextually coherent text, offer assistance in various tasks (e.g., question answering, summarization), and serve as a foundation for specialized applications such as domain-specific intelligent assistants.
PLLuM models are available at https://huggingface.co/CYFRAGOVPL in several variants, including instruction-tuned (instruct) versions. A chat demo for testing is available at https://pllum.clarin-pl.eu/pllum_8x7b. There is no Ollama release yet, and the weights are published only in bf16 format, but it is still a useful starting point.
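Since the checkpoints are distributed in the standard Hugging Face format, they can be loaded with the transformers library. Below is a minimal sketch, assuming an instruct variant and enough GPU memory for the chosen model; the repository name is an example placeholder, so check the CYFRAGOVPL organization page for the variants that are actually published.

```python
# Minimal sketch: loading a PLLuM instruct model with Hugging Face transformers.
# The model_id below is an assumed example -- verify the exact repo name at
# https://huggingface.co/CYFRAGOVPL before running.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CYFRAGOVPL/Llama-PLLuM-8B-instruct"  # assumed example repository

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are published in bf16
    device_map="auto",           # spread layers across available GPU(s)/CPU
)

# Instruct variants typically ship a chat template; build a chat-style prompt.
messages = [{"role": "user", "content": "Kim jesteś?"}]  # "Who are you?" in Polish
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Until an Ollama or quantized release appears, this bf16 path through transformers is the most direct way to try the models locally.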