Help support the channel and get a weekly exclusive podcast! patreon.com/thelinuxcast
==== Time Stamps ====
00:00:00 Intro
00:01:29 Installation of Searx
00:01:38 Docker and Docker-Compose
00:04:29 Portainer
00:08:40 Installing Searx in Portainer
00:15:31 Taking a Look at Searx
00:20:16 Wrapping Up
On Ubuntu and other non-rolling distros you can install Docker and Docker Compose from packages in their official repos; there's no need to install from source. After adding your user to the docker group you only need to log out and back in; a reboot is usually unnecessary.
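For anyone who wants the short version, here is a minimal sketch of that on an Ubuntu-family system, assuming the distro's docker.io and docker-compose packages (exact package names can differ between releases):

    sudo apt install docker.io docker-compose   # distro packages, no source build
    sudo usermod -aG docker $USER               # allow running docker without sudo
    # log out and back in (or run: newgrp docker) so the new group membership applies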
Great video. I thought your "how to" was great. You didn't go too fast, and you explained what you were doing and why! That's better than 90% of the other Linux "how to" videos on RUclips.
Thank you Matt! It's funny, I have SearXNG up and running, but I only use it for my development projects (as an API endpoint returning JSON). I had trouble getting it to work behind Traefik, so I haven't used it for my personal search needs. I learned a lot from this video and I'm super excited to start using all the cool customization features!
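For anyone curious about that JSON use case, here is a rough sketch of querying a SearXNG instance for machine-readable results. The host, port, and query here are assumptions, and the json format has to be listed under search.formats in the instance's settings.yml for this to work:

    # ask a local SearXNG instance for results as JSON instead of HTML
    curl 'http://localhost:8080/search?q=linux+distros&format=json'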
Matt doing a HOW-TO video, LOVE IT!!
was waiting for this, great video!
Curious about how it actually works and how it's making use of the crawled indexes of all these engines... Is it just a wrapper sending your query to all the selected engines, and can they still profile you even if you don't see the ads?
SearX should act independently (at least in theory), meaning the upstream search engine only "sees" your SearX instance, and there isn't much to be seen from a server. This should eliminate a lot of the telemetry they normally collect, such as your browser, screen resolution, OS, and other metrics.
The one thing I'm not sure how it handles is, let's say, the Google ID: does Searx always delete it per session so there is no tracking, or does it persist so the search engines, mostly Google, know which queries come from the same place (IP)? That I'm not sure about.
It's funny you did this video, because I was literally just looking at this again not even 5 minutes ago, before I walked into the living room and opened RUclips.
Dang it, I love you Matt! I needed this, at just the right time! Thank you.
Excellent episode! You also gave a walkthrough for configuring a Docker container.
Here's the quandary: if I do a search through a SearXNG container, presumably Google sees the request coming from my WAN IP, just as if I did the search from my host. How does the container anonymize my searches?
Fab, Matt, got it now too. Docker is something I've been meaning to start exploring as well!
Great video, many thanks!
You can install docker-compose on newer versions of Ubuntu/Debian/Mint without needing to compile from source. I just checked on both Ubuntu 22.04 LTS and 24.04 LTS, and the package is in their repos.
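If you want to confirm that before installing, something like this should show whether your release carries it. The package name docker-compose is an assumption; newer releases may ship the Compose v2 plugin under a different name instead:

    apt-cache policy docker-compose    # shows the candidate version available in the repos
    sudo apt install docker-compose    # installs it straight from the distro repos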
Interesting. I might try that. On another note, you are the second YouTuber I've heard calling the slash sign a backslash. Is that an American thing?
I've been using Searx for a year or so, but I don't host it myself. From time to time I hop between instances of my choice. It's not ideal, but it's good enough for me. Maybe I'll host it myself someday too.
I'm in the same boat, man. Maybe I should host my own, but I don't think I have a machine running consistently enough to rely on for hosting my search engine, especially since most of my browsing happens on mobile.
@@Chesemiser I could do that, because I'm mostly on a desktop PC. As a mostly-mobile user, this becomes complicated to self-host, I can see that. I'm thinking of using a low-powered old laptop just for the server side to host an instance, so it doesn't require too much power (money) in the long term if it's on 24 hours a day.
But that means I have to maintain that operating system and its updates too, even if it's a minimal system, plus maintain the Searx instance with updates. I'm still lazy.
@@thingsiplay Yep, that statement of "I'm still lazy" sums up most of the major reasons I haven't set up any form of in-home server using an old system.
This video is too focused on the steps; I have no idea what you're trying to do. Just give a high-level overview first, and then go through the steps. I can't follow this video, and I know how to work a Linux machine and Docker.
This looks great. Is there a plugin to search multiple budget-friendly media sites, like Kodi for Android?
I mean, I tried to search 'labrador retriever' and found nothing.
Searx has no dark mode. It’s blinding
@@AnIndepentThinker uh yes it does.
How is your "ip a" colourful?
I don't know? I use bash and kitty and a color scheme for kitty. AFAIK I didn't do anything special.
@TheLinuxCast Interesting. I found it on the Arch wiki: adding alias ip="ip -color=auto" makes the output of ip commands colorful.
@@Damglador Odd, I don't have that in mine. I have one for man pages. I wonder if it's a feature of oh-my-posh? I do use that for the prompt.
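For anyone who wants to try it, a minimal sketch of that alias in a bash config; iproute2 accepts -color=auto, and ~/.bashrc is the usual (assumed) place to put it:

    # ~/.bashrc — colorize iproute2 output when writing to a terminal
    alias ip='ip -color=auto'
    # reload the shell config to pick it up
    source ~/.bashrc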
Why not just use docker compose directly? So much simpler.
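For reference, a minimal sketch of what that could look like without Portainer. The image name, port, and volume path follow the common searxng/searxng setup and are assumptions, not the exact stack from the video:

    # write a minimal compose file and bring the stack up
    cat > docker-compose.yml <<'EOF'
    services:
      searxng:
        image: searxng/searxng:latest
        ports:
          - "8080:8080"            # web UI on http://localhost:8080
        volumes:
          - ./searxng:/etc/searxng # settings.yml lives here
        restart: unless-stopped
    EOF
    docker compose up -d           # older setups may need: docker-compose up -d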
Web Search Engines are dead, long live LLMs.
If there is no incentive for people to create websites, because of LLMs making it unnecessary to visit websites, then where will future LLM searches get their information?
@@ShaneSemler other llms kek
The internet has been overrun with AI generated content for years already. It's become useless. The only way to find out anything anymore is to go to a specialized community based on that exact thing, where you can ask real people, or use your own locally hosted open source LLM that was created before the training data was contaminated by AI generated content.
Yeah, no, LLMs kinda suck for in-depth research, or trying to get to a website, or trying to find images, or, you know, searching the web...
@@Chesemiser All the images on the internet are fake now too. We're going to have to go back to film.
OMG! "Stick it to Google?" For a guy on a RUclips channel, your concern about Google seems a little disingenuous. Nevertheless, the video is well done and interesting. Not sure who it's for; obviously new users will be glassy-eyed about now.
We're tryna bro it up in da Searchx over here
U dun need to be upset
Podman works too. Also, Searx is in the AUR. Make sure that if you use SearXNG it uses your preferred DNS, since by default it uses its own DNS for stats and other stuff.
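For the Podman route, a rough sketch of running the same image; the image, port, and volume here are assumptions mirroring the Docker setup above:

    podman run -d --name searxng \
      -p 8080:8080 \
      -v ./searxng:/etc/searxng:Z \
      docker.io/searxng/searxng:latest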
I'm using SearXNG daily and it's really good 😁