AI chatbots are taking the world by storm. We analyzed thousands of conversations to see what people are really asking them and what topics are most discussed.
For a user without much technical experience, a ready-made GUI like Jan.ai is probably a good start: it downloads models automatically and runs them with the GGML library on consumer-grade hardware such as Apple M-series Macs or inexpensive NVIDIA or AMD GPUs.
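If you are curious what's happening under the hood of tools like that, here is a minimal sketch of loading and prompting a local GGUF model with the llama-cpp-python bindings. The model file name is just a placeholder, not anything a GUI like Jan.ai asks you to type:

```python
# Minimal sketch: running a local GGUF model via the llama-cpp-python bindings
# (pip install llama-cpp-python). The model path is a placeholder; point it at
# any GGUF file you have downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/gemma-2-2b-it-Q4_K_M.gguf",  # placeholder file name
    n_ctx=4096,        # context window size
    n_gpu_layers=-1,   # offload all layers to GPU / Metal if available
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain GGUF in one sentence."}]
)
print(out["choices"][0]["message"]["content"])
```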
For slightly more technically proficient users, Ollama is a great way to host your own OpenAI-like API for local models. I mostly run Gemma 2 or small Llama 3.1 models with it.
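As a rough sketch of what that looks like in practice: Ollama serves an OpenAI-compatible endpoint on localhost, so the standard openai Python client can talk to it. This assumes you have already pulled a model with `ollama pull gemma2`:

```python
# Sketch: calling a locally running Ollama server through its
# OpenAI-compatible endpoint (pip install openai; run `ollama pull gemma2` first).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's default local address
    api_key="ollama",                      # required by the client, ignored by Ollama
)

resp = client.chat.completions.create(
    model="gemma2",
    messages=[{"role": "user", "content": "Give me one sentence about local LLMs."}],
)
print(resp.choices[0].message.content)
```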
I was also kind of blown away by Firefox Nightly, which has a new sidebar with buttons to open ChatGPT if you want. But that's not the impressive part: it also lets you choose other providers such as Hugging Face, so anyone can try the open models and get a feel for them without installing anything.
Very cool.