Month: December 2024
Repo Prompt and Ollama
In response to a question about using Ollama the same way o1 works (i.e., returning the Whole or Diff format ready for merging), the repository owner gave the following answer: Ollama models…
Repo Prompt
Quite a nice application for working with files and LLMs: Repo Prompt. I have done two tests so far, and I’m sold. The tests were, of course, super simple apps, but I found…
BlenderGPT, MoneyPrinterV2
3D modeling is the next area where AI aids designers. This page offers free access for creating meshes from text or images. Additionally, there are models available on Hugging Face…
Llama 3.3, Ollama structured output
Llama 3.3 70B offers performance similar to the Llama 3.1 405B model while requiring less VRAM; for example, it runs well on an M4 with 64 GB at about 10 tokens/s. https://ollama.com/library/llama3.3 Right…
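As a minimal sketch of Ollama's structured output feature mentioned in the title, the snippet below passes a JSON schema via the format parameter of the Ollama Python client so the model's reply is constrained to that schema. The Country schema and the prompt are illustrative assumptions, not taken from the post.

```python
from ollama import chat
from pydantic import BaseModel


# Hypothetical schema used only for illustration.
class Country(BaseModel):
    name: str
    capital: str
    languages: list[str]


response = chat(
    model="llama3.3",
    messages=[{"role": "user", "content": "Tell me about Canada."}],
    # Passing a JSON schema constrains the model's output to that structure.
    format=Country.model_json_schema(),
)

# The reply is JSON matching the schema, so it can be parsed back into the model.
country = Country.model_validate_json(response.message.content)
print(country)
```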
Chat from LangChain
This is a significantly different kind of RAG from the one we are used to: https://chat.langchain.com/. Take a look at the sample questions and observe how agents built with LangChain and LangGraph use them to construct responses…
Genie 2, DeepThought-8B, snowflake-arctic-embed2
Google has recently released Genie 2, its most advanced large-scale foundation world model, capable of generating consistent, playable worlds for up to a minute. This advancement could be particularly beneficial for game developers, but…