Not sure if this is the right community, but I didn't see a general one. What search engine do you use? Besides Google increasingly spying on its users, the quality of its search results seems to have gotten significantly worse over the last decade. What search engine(s) do you use?
I wouldn't say that it "blows my mind" or anything, but simply that it seems to work as expected (which is more than what I can say for Google). There's also a "Fediverse" button on Kagi.com, so it can search lemmy.world (and more??).
Let me know if you find one that uses AI to find groupings of my search terms in its catalogues instead of using AI to reduce my search to the nearest common searches made by others, over some arbitrary popularity threshold.
Theoretical search: "slip banana peel 1980s comedy movie"
Expected results in 2010: Pages about people slipping on banana peels, mostly in comedy movies, mostly from the 80s.
Expected results in 2024: More than I ever wanted to know about buying bananas online, the health impacts of eating too many or not enough bananas, and whatever "celebrities" have recently said something about them. Nothing about movies from the 80s.
DuckDuckGo as a default, with Google as a fallback depending on what I'm looking for. For Lemmy, the default search of my instance works well enough, so I haven't tried external engines.
I also spun up my own YaCy instance. It was pretty terrible. It could be good, but you'd need a pretty beefy machine with a lot of storage, and a lot of time for it to index, before it gets anything approaching good.
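If anyone else wants to try it, the least painful route I know of is Docker. A minimal sketch, assuming the yacy/yacy_search_server image from Docker Hub (the volume path is my guess at where the image keeps its index; check the image docs):

```sh
# Run YaCy detached, with its web UI on the default port 8090.
# The named volume persists the crawl index across container restarts.
docker run -d --name yacy \
  -p 8090:8090 \
  -v yacy_data:/opt/yacy_search_server/DATA \
  yacy/yacy_search_server
```

Then open http://localhost:8090 and start a crawl. As said above, expect to throw serious disk space and days of indexing at it before the results are usable.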
A combination of SearXNG and Stract, for the most part. They're definitely not perfect (yet), but they mostly get the job done! And I think they both have a lot of cool filters and refinement settings that I haven't even taken advantage of so far.
For niche stuff it seems like a lot of hyper targeted search engines are popping up, like Sepia Search for PeerTube.
I'm using Kagi. I find that it does a better job of finding "legitimate" sites rather than blogspam and content marketing. However, I'm not sure I'll stick with it long-term. It seems like it has mostly stalled, and the team is getting distracted by making a browser, AI features unrelated to search (I have no problem with the few AI experiments tied to searching), and other side projects. We'll see. I really hope they pull themselves together and focus, or it might not last. But for now they seem like one of the better options available.
Bing's new "Deep Search" where it has some sort of LLM refinement iteration process has also been helpful sometimes. Probably the best AI search product I have seen, but definitely doesn't replace most searches for me.
MetaGer is a metasearch engine focused on protecting users’ privacy. Based in Germany, and hosted as a cooperation between the German NGO ‘SUMA-EV - Association for Free Access to Knowledge’ and the University of Hannover, the system is built on 24 small-scale web crawlers under MetaGer’s own control. In September 2013, MetaGer launched MetaGer.net, an English-language version of their search engine.
You can also hide behind their proxy server just by opening a result anonymously: use "OPEN ANONYMOUSLY", and it applies to the links you follow from there as well.
I'm avoiding the major search engines. If I really need a search engine, I use DuckDuckGo. Most of the time, the search forms of a few specific websites give better results. I've bookmarked the search forms of e.g. Wikipedia, Wiktionary, the Python docs, the Arch Linux wiki, GitHub, dict.leo.org, the bug trackers of software I commonly use (such as Mozilla's), and so on. I'm basically using Firefox's "keyword" search feature the way DuckDuckGo's !bang syntax works.
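For anyone who hasn't set that up: you bookmark a search URL with %s where the query goes, assign the bookmark a keyword, and Firefox substitutes whatever you type after it. A sketch, with a made-up keyword and one example site:

```
Name:    Wiktionary search
URL:     https://en.wiktionary.org/w/index.php?search=%s
Keyword: wt
```

Typing `wt ephemeral` in the address bar then jumps straight to that search, which feels exactly like a DuckDuckGo bang.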
I used DuckDuckGo a couple of years ago, but they added their own blacklist of sites (pretty stupid), and for my language it started returning crappy generated spam sites instead of relevant results. They shouted at the top of their lungs that for my language they simply index results from Yandex, but that's a lie; the results are different.
StartPage gave the best results, but they introduced a captcha that I got on every damn request.
I'm currently using SearXNG, which pulls its results from Google. And those are damn normal results, unlike other search engines that consider themselves the smartest and edit the results.
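For anyone self-hosting it, choosing which engines feed the results is just a settings.yml tweak. A minimal sketch, assuming the stock engine names and the default settings as a base:

```yaml
# settings.yml: inherit the defaults, then pick the engines you trust
use_default_settings: true
engines:
  - name: google
    disabled: false   # keep Google's results in the mix
  - name: duckduckgo
    disabled: true    # example: drop an engine you don't want blended in
```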
Shameless self promo: I was upset by this as well so I’m working now on a curated search engine just for anything related to webdev. It focuses on blogs and docs. No BS, just high quality sources.
I feel like I see this question come up now and then across the communities I'm in, and there's always a debate over search engines lol. Anyway, to answer the question, I use Kagi for its custom rankings (and, more recently, Wolfram|Alpha integration, which I've found more useful than I expected it to be).
If you mean for programming specifically, I... don't, really. At most it would be for a quick sanity check on syntax in a language I don't write often, for which Google is fine. But otherwise I rely on documentation and search features of the various language/tool-specific websites.
- a certain specific private source (maybe Tor). Not reliable, but high quality.
- a large collection of raw links (requires labor and skill, and yields imperfect results)
- searx (mainly to share with friends and as a fallback; it's a pretty great premade metasearch engine when self-hosted)
- ready-to-go FOSS search engine implementations (limited in scope; requires a decent amount of "labor" and time, plus a setup phase and light maintenance; extremely high quality results; optionally invest money in various ways to supercharge it, and perhaps recruit collaborators)
Other stuff:

- creating private collections and bookmarks
- not using the internet, using it rarely, or using it cautiously. Or not using the www.
- focusing on distracting myself with hands-on projects.
What am I actually using at this point? (Nothing is set up currently!)

- Tor: sometimes 70% of the time, sometimes 30%
- very frequently, non-www
- DuckDuckGo when needed, often without visiting the links
- niche sources (2, ..)
- Reddit

This isn't perfect! But overall I don't think I spend much time traveling to websites for info.
Historically:

- searx
- a search engine
- dabbling in scaling
Future:

- self-hosting
- scaling
- further isolation
- IRL
- other stuff, such as creating new solutions
There is certainly room for immediate improvement here. I'm just lazy. I don't need the internet as much as the internet needs me.