Is there a way to reduce the number of AI generated websites that appear in search results?
I keep trying to find things like “making waffles from sourdough discard” and all the sites are the same: long meandering paragraphs full of links to other things on the site, with dubious instructions.
Considering that at this point I can pretty much identify this type of site by looking at it, are there good extensions or search engines that might remove them from search results?
Update: for now it seems DuckDuckGo’s date range filter is kind of a magic bullet for this type of thing. Set the range between 2010 and 2020 and the top results improve dramatically for a lot of temporally agnostic searches.
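If you want to script this rather than click through the UI, a minimal sketch follows. It assumes DuckDuckGo's `df` query parameter accepts a custom `YYYY-MM-DD..YYYY-MM-DD` range (this is the format the date-range UI puts in the URL, but it is undocumented and could change):

```python
from urllib.parse import urlencode

def ddg_date_range_url(query: str, start: str, end: str) -> str:
    """Build a DuckDuckGo search URL restricted to a custom date range.

    Assumption: DDG's undocumented ``df`` parameter takes a
    "YYYY-MM-DD..YYYY-MM-DD" range, as its date-filter UI does.
    """
    params = {"q": query, "df": f"{start}..{end}"}
    return "https://duckduckgo.com/?" + urlencode(params)

# Example: pre-2020 results only, skipping most SEO/AI content farms.
url = ddg_date_range_url("making waffles from sourdough discard",
                         "2010-01-01", "2020-12-31")
print(url)
```

You could wire this into a browser keyword/bookmarklet so every search from the address bar gets the date filter applied automatically.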
I switched to presearch.com long ago. No more tracking.
This is one of the neater concepts for blockchain I’ve seen, though the “PRE” coin is giving me very serious pause… I mean, I guess they have to make money, but still.
EDIT: I take this back. It's a plan for a neat concept that's not implemented yet... Even though the crypto token is.
The UI is pretty nice though.
I don't think you're going back far enough. As others have pointed out, the problem existed thanks to SEO tricks long before any LLM saw common usage, even in advertising rat races. Copy-pasted-and-extended descriptions became a problem around 2015, in my filtering journey. That's where I see real blogs (and just simple websites) stop being the majority of search results.