Search Engines are a dying technology

Nov 15, 2023 AI, code

AI is going to change the whole landscape of how people get information. The days of firing up your browser and going to Google, Bing, DuckDuckGo, or [insert your favorite search engine here] will soon be antiquated behavior.

While big tech is pushing out new tech and jumping on the AI bandwagon, what they do not realize is that AI will be their own biggest competition. Sure, they can try to get ahead of it by offering their own ChatGPT version, trying to impress you with fancy talk and giving the impression that only they have the ability to do it. But in the end, as tech and AI evolve, it will be able to run anywhere, even on the smallest devices. Your phone will soon be able to run an LLM (Large Language Model), if it isn’t already possible. No need to create an account, no need to give out your email or, the latest very annoying trend, your phone number.

What does this mean for big tech? It means people will be able to go offline for their information, much like reading books, which let you learn anything you want. Local, offline AI applications will allow you to search and learn anything you want without accessing the internet (outside of downloading the LLM). This is huge for privacy, and very bad news for the companies that make millions selling our data.

I am already in this future world.

It’s super fresh (less than a week old), but so far I have been able to ditch Google search 99% of the time.

Below is my current setup. It’s a bit techy, so it might not be for the average computer user. But it’s not really that hard, and these open source projects will improve to make it easier for the mainstream. I already have some great ideas about how to do that.

Run Local and Private AI on your computer

For now this project only supports macOS and Linux, but Windows support should be coming soon.

ollama.ai

This tool comes with a CLI for easy server startup (or they have a desktop version). They have a handful of language models referenced on their site, but you can also load custom models! The best part is that they have an API integrated out of the box, so you can interface with it dynamically through a variety of programming languages. The terminal experience is pretty nice and lightning fast! But since it has an API, you can set up or create your own browser interface version.
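As a minimal sketch of what talking to that API looks like, here is how you might hit the server’s generate endpoint from Python. This assumes the Ollama server is running locally on its default port (11434) and that the Mistral model has already been pulled; the prompt itself is just illustrative.

```python
import json
import urllib.request

# Assumes the Ollama server is running locally on its default port 11434
# and that the "mistral" model has already been pulled (ollama pull mistral).
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "mistral",
    "prompt": "Explain what a Large Language Model is in one sentence.",
    "stream": False,  # ask for a single JSON response instead of a stream
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read())

print(result["response"])  # the model's generated text
```

Enter ollama-webui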

ollama-webui

Ollama WebUI setup

I tried out a couple of other UIs; this one is just as fast as the CLI, which for me makes it very, very good. Not only that, looking over the code base I can see it uses IndexedDB instead of Local Storage, which is a plus for size and organization. And I haven’t used it yet, but it has a voice/speech option. Coming from ChatGPT, this UI gives me the same experience, but with way less lag! And none of those stupid captchas I started getting lately, like after 3 chats.

Expand this to other things

So this is great for a local chat bot. But the flexibility of ollama.ai means I can actually run multiple language models separately. So I have my standard chat bot up in a browser tab using the Mistral model. Then in my code editor I have an extension that also communicates with the Ollama server, but it’s using the CodeLlama model.
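To illustrate, here is a rough sketch of how two different tools can talk to the same server while using different models. The helper function and prompts are hypothetical, but the endpoint and payload shape are the same as in the earlier example, and both models are assumed to be pulled already.

```python
import json
import urllib.request

# Hypothetical helper: one Ollama server, a different model per caller.
# Assumes both models have been pulled (ollama pull mistral / codellama).
def generate(model: str, prompt: str) -> str:
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

# The browser tab's chat goes to Mistral...
print(generate("mistral", "What year was the printing press invented?"))

# ...while the editor extension asks CodeLlama for code help.
print(generate("codellama", "Write a Python function that reverses a string."))
```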

So far this setup is sick!

My brain is already spinning with ways I can implement it into other things. The most exciting part is being able to interface with the AI via code. And unlike OpenAI, which has an API but charges for usage, I do not have to pay a thing. It is going to be so easy to build on top of it via the API.
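One example of where that could go: the API can also stream tokens as they are generated, which makes it easy to wire the model into your own tools. A minimal sketch, assuming the same local server as above; when streaming is left on (the default), Ollama sends back one JSON object per line.

```python
import json
import urllib.request

# Sketch: stream tokens from the local Ollama server as they are generated.
# With "stream" enabled (the default), the server sends one JSON object per line.
request = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({"model": "mistral", "prompt": "Why is the sky blue?"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    for line in response:
        chunk = json.loads(line)
        print(chunk.get("response", ""), end="", flush=True)  # print tokens as they arrive
        if chunk.get("done"):
            break
print()
```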