I suspect that this is the direct result of AI-generated content just overwhelming any real content.
I tried ddg, google, bing, qwant, and none of them really help me find the information I want these days.
Perplexity seems to work, but I don’t like the idea of an AI giving me “facts” since they are mostly based on other AI posts.
ETA: someone suggested SearXNG, and after using it a bit it seems to be much better than ddg and the rest.
I feel like it’s especially bad if you are searching for anything related to a marketable product. I tried searching ddg for information about using a surge protector with halogen bulbs and all I got was pages and pages of listicles on “best halogen lights 2024” full of affiliate links.
I use Brave Search. I can generally find most things. They even have an “Answer with AI” thing that gives some useful stuff when you want a specific quick answer.
I also use ddg.
I wanted to make a joke about my first search engine, MetaCrawler, and then found out it’s still around and still does search. Going down that rabbit hole, it’s changed hands a ton and was only relaunched fairly recently. Is it any good? Nah, probably not.
I guess I’ll just have to rely on my other aggregate search engine, SavvySearch (no, unlike the first one, this search engine does not in fact still exist, much to my disappointment).
Kagi is very good.
The whole internet is in the process of being filled with garbage content. Search engines are bad but also there’s not much good content left to find (in % of the total)
The Internet is dead ™
You know what I miss? Search engines that honored Boolean operators. I am often looking for niche results, and being able to use -, !, and NOT is incredibly useful. But that’s just not a thing anymore. I know part of it is that SEO includes antonym metadata that ruins this, but it would still be helpful on occasion.
I’ve been using Mojeek lately and it looks like their advanced search can do some of that.
https://www.mojeek.com/advanced.html
Reminds me of early Google search.
I gave it a test with some operators from the search bar instead of using the form, and it did exactly what it was supposed to. I’ll keep this on hand. Thank you.
Let us know if you find anything which could be better, too; we’re always looking for ways to improve.
It becomes more and more true every day.
It doesn’t really; it’s just that human activity on the internet is increasingly taking place on platforms without any search indexing. Twenty years ago, internet forums were where you’d go for advice online. Nowadays it’s more and more Discord servers and the like, which just aren’t indexed by web search.
It’s intentional.
Obviously, Google makes money showing ads during search. But they have finally bitten the bullet and started tarpitting users in search in order to show more ads.
A quick, useful, and accurate search means that you’re on their site for the least amount of time, perhaps mere seconds. That’s not what’s best for revenue growth.
PS: Go try Kagi and be reminded what good clean search results look like. I use it because my time has value. It’s very good.
Brave has its own search. There are also metasearch options such as MetaGer, SearX, and Mojeek. I hope more search engines enter the market.
I just use ChatGPT to search now. I have a super-prompt in its memory telling it how to search, to cite sources, and to provide links, and it is so much better than Google even though it’s using AI, too.
The future is now, old men!
There are no search engines besides Google and Bing, because everyone else just uses Bing under the hood.
I think it’s just you. Transformers are pretty good at regurgitating information that’s widely talked about. They fall short when it comes to specific information on niche subjects, but generally that’s only a matter of understanding the jargon you need to plug into a search engine to find what you’re looking for. Paired with uBlock Origin, it’s all typically pretty straightforward, so long as you know which tool to use in which circumstance.
Almost always, I can plug some OS error into an LLM and get specific instructions on how to resolve it.
Additionally, if you understand and learn how to use a model that can parse your own set of user data, it’s easy to feed in documentation to make it subject-specific and get better results.
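For anyone who wants to try that, here’s a rough sketch of the basic idea using the sentence-transformers library: embed your own docs, pull the closest chunks for a question, and paste them into the prompt. The docs/ folder, the paragraph chunking, and the embedding model are placeholders I picked for illustration, not a specific recommendation.

```python
# Rough retrieval sketch: embed your own docs, find the chunks closest to a
# question, and prepend them to the prompt so the model answers from your material.
# Needs `pip install sentence-transformers numpy`; paths and model are placeholders.
from pathlib import Path

import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # small, CPU-friendly embedding model

# Naive chunking: one chunk per paragraph of every .md file in ./docs (hypothetical layout)
chunks = [p for f in Path("docs").glob("*.md")
          for p in f.read_text().split("\n\n") if p.strip()]
chunk_vecs = embedder.encode(chunks, normalize_embeddings=True)

def build_prompt(question: str, k: int = 3) -> str:
    """Return a prompt with the k most relevant doc chunks prepended."""
    q_vec = embedder.encode([question], normalize_embeddings=True)[0]
    scores = chunk_vecs @ q_vec              # cosine similarity, since vectors are normalized
    top = np.argsort(scores)[::-1][:k]
    context = "\n---\n".join(chunks[i] for i in top)
    return (f"Answer using only the context below.\n\n"
            f"Context:\n{context}\n\nQuestion: {question}")

print(build_prompt("How do I configure logging?"))
```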
Honestly, I think the older generation who fail to embrace and learn how to use this tool will be left in the dust, as confused as the pensioners who don’t know how to write an email.
I don’t use Perplexity, but AI is generally 60-80% effective with a larger-than-average open-weights offline model running on your own hardware.
DDG offers the ability to use some of these. I still use a modified Mistral model, even though its base model(s) are Llama 2. Llama 3 can be better in some respects, but it has terrible alignment bias. The primary entity in the underlying model structure is idiotic in alignment strength and incapable of reasoning about edge cases like creative writing for sci-fi futurism. The alignment bleeds over. If you get on DDG and use Mixtral 8x7B, it is pretty good.

The thing with models is not to talk to them like humans. Everything must be explicitly described. Humans rely on a lot of implied context, where we assume people understand what we are talking about. Talking to an AI is like appearing in court before a judge; every word matters. The LLM is basically a reflection of all of human language, too. If the majority of humans are wrong about something, so is the AI.
If you ask something simple, like a single bare question, you’re not going to get very far into what the model knows. Models have a very limited scope of focus. If you do not build prompt momentum by describing a lot of details, the scope of focus is broad but the depth is shallow. The more momentum you build by describing what you are asking in detail, the more the scope narrows and the deeper the connections that can be made.
It is hard to tell what a model really knows unless you can observe the perplexity output. This is more advanced, but the perplexity score for each generated token is how you can infer that the model does not actually know something.
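To make that concrete, here’s a minimal sketch of scoring per-token perplexity yourself with the Hugging Face transformers API; the model name and the example sentence are just placeholders for whatever you run locally.

```python
# Rough per-token perplexity sketch with Hugging Face transformers.
# "gpt2" is a small stand-in for whatever local model you actually run.
# A high score on a token means the model did not expect it, i.e. it is
# probably guessing rather than "knowing".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder model
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).eval()

text = "Paris is the capital of France."
ids = tok(text, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(ids).logits

# Logits at position i predict token i+1, so shift by one before scoring.
log_probs = torch.log_softmax(logits[:, :-1], dim=-1)
token_lp = log_probs.gather(2, ids[:, 1:].unsqueeze(-1)).squeeze(-1)
per_token_ppl = torch.exp(-token_lp)[0]

for token, ppl in zip(tok.convert_ids_to_tokens(ids[0, 1:]), per_token_ppl.tolist()):
    print(f"{token!r:>14}  perplexity ~ {ppl:.1f}")
```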
Search sucks because it is a monopoly. There are only two relevant web crawlers: M$ and the Goo. All search queries go through these, either directly or indirectly. No search provider is deterministic any more; your results are uniquely packaged to manipulate you. They are also obfuscated to block others from using them to train better or competing models. Then there is the US government antitrust case and all of that, which makes obfuscating their market position, and temporarily pushing people onto other platforms, their best path forward. Criminal manipulators are going to manipulate.
DDG and Qwant are basically Bing.
“Perplexity AI”
Yeah 100% agree. Especially for the type of search where you’re googling for an answer. This feels like what searches used to be when Google was young and forums still existed.