For me, Google Search just doesn't seem to be getting better over the years; if anything, it's getting worse. I honestly feel like it's hard to get what I want half the time with all the SEO spam, and most of the time I have to add "inurl:reddit.com" just to get good results.

At the same time, ChatGPT has frequently impressed me. Not with everything (my expectations are reasonably low), but it has done some impressive work for me (typing out form letters, converting code between languages).

For what it's worth, I wouldn't use ChatGPT for search the way I do with Google, but it has taken away time I would otherwise spend Googling things like "how to write X form letter". I expect that as it matures, it will take away even more of that Googling time.

All these takes underestimate the following:

1) How quickly ChatGPT and its ilk will advance to pick off relatively low-hanging fruit like "ChatGPT is wrong about this one thing". The delta, how fast these models improve from release to release, is extremely important here.

2) How slowly the Google bureaucracy will grind when releasing anything remotely like ChatGPT. All the committees and burdensome processes in place at Google will keep this new technology locked up for years and ensure that the final result is a camel (a horse designed by committee). It doesn't matter if they have superior technology if they never use or release it.

3) How much Search means to Google: because it's their core business, they will treat any product changes to it extremely carefully, while Microsoft will be willing to experiment with Bing the way they have with GitHub and Copilot.

Personally, I wouldn't go long on search engines that don't have a strong ML component to them in the future.

> Personally, I wouldn't go long on search engines that don't have a strong ML component to them in the future.

Arguably, it's the ML that made Google useless for some people. For some time now, Google seems to have been curating its results to address searches phrased as questions. In the past we were searching for occurrences of our keywords in webpages, but today Google seems to be trying to be an answer machine. Unfortunately it's not very good at it and is just about as inaccurate as ChatGPT.

>In the past we were searching for occurrences of our keywords in webpages

I mean that died decades ago when spammers just made pages with your word repeated over and over again. Spam makes everything worse.

Completely agree. I wish Google would fix spam instead of trying to be something other than a search engine.

The cynic in me thinks that Google is doing it because it's more profitable. If the results are crap, maybe ads start to look like better content? It's not like you're going to switch to Bing.

I don't even understand why Google doesn't let you block certain sites in search results. Paid ads I understand; those generate profits (although even Facebook still lets me hide ads I don't want to see). But unpaid SEO rubbish, how does it benefit Google at all? If anything, these parasites bog down the quality of search results.

They had a feature to block domains from search results in the past (like, 10 years ago). It was removed. I don't know why, but it feels like exactly the kind of feature that sounds great on paper but doesn't actually survive contact with real users.

First, I'd bet that very few people are actually interested in doing that kind of manual curation or engaging with power-user features. How large a percentage of users needs to interact with this for the feature to be worth maintaining (in all the backends and frontends)? How many of them actually would?

Second, the task of blocking spam is adversarial and Sisyphean. Trying to deal with web spam through domain blocking (with an individual blocklist) would be like trying to deal with email spam with your own blocklist of spam words. The results will be worse than whatever can be done centrally, where much more information is available both about the sites and about how users actually interact with those domains. And even if you managed to build a good blocklist at one point in time, your job is not done: tens of thousands of new domains will have popped up by next week.

(The dream here of course would be to use the block decisions from individual users to drive the centralized protections. But unless legit users are actually using this in very significant numbers, it'll quickly become just another abuse surface. E.g. brigading, "downrank your competitor in the results" as a service, etc.)

Third, some people will probably block domains they shouldn't have blocked, and then have a bad experience in future searches because the sites with the genuinely best results are blocked. At that point you're left with only bad options: ignoring the users' stated preferences, which they'll hate, or serving bad results, which they'll also hate.

Can the feature work for a different search engine? Sure. For example, what if you have a paid search engine used only by power users, and you're looking for a simple-to-explain feature that people think they want, to entice them to sign up? It'll be great for that. And if your entire user base actually loves and uses the feature? Well, then it becomes a feature worth maintaining and expanding; it'll actually be a high-quality ranking signal rather than something that's trivially gameable; etc.

You can use uBlacklist[1] and subscribe to custom-made blacklists[2] for specific kinds of content.

1. https://github.com/iorate/ublacklist
2. https://github.com/rjaus/awesome-ublacklist
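
For what it's worth, a uBlacklist subscription is just a plain text file of rules, one per line, mostly match patterns (it also accepts regular expressions between slashes, if memory serves). The domains below are made-up placeholders for the kind of SEO farms people typically block, not real recommendations:

    *://*.example-content-farm.com/*
    *://*.seo-listicle-site.net/*
    /example-scraper-mirror\d+\.(com|net)/

As far as I know, you add the raw URL of a file like that as a subscription in the extension's options, and it then hides matching results on the search engines it supports.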