One of the biggest criticisms of Mastodon's technical capabilities involves how limited its search system is. With a new update, everything has changed.
It's hard to overstate how big a deal it is that Mastodon is adopting this kind of search functionality. Mastodon still makes up a vast portion of the Fediverse.
While other platforms have supported this for way longer, buy-in from the biggest player in the space will probably have a huge effect on baseline expectations moving forward.
It also adds even more deployment complexity. Just from memory, to run Mastodon you need:
any number of Rails web servers (horizontally scalable)
any number of Sidekiq worker processes (horizontally scalable)
a PostgreSQL database for persistent storage (vertically scalable modulo sharding)
a Redis server for caching and Sidekiq (vertically scalable modulo sharding)
an Elasticsearch server for full text search (vertically scalable modulo sharding)
So that's at least 5 different server processes to manage. In reality, for almost all deployments, Redis and Elasticsearch are unnecessary; the database can handle both background jobs and full text search. Further, it could even be SQLite for all but the largest instances.
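As a rough illustration of that last point, here's a minimal sketch of Postgres standing in for Elasticsearch using its built-in full-text search. The database name, table, and column are made up for the example and are not Mastodon's actual schema.

```python
# Minimal sketch: Postgres full-text search instead of a separate Elasticsearch server.
# Assumes a hypothetical "mastodon_sketch" database with a statuses(id, text) table.
import psycopg2

conn = psycopg2.connect("dbname=mastodon_sketch")
with conn, conn.cursor() as cur:
    # A GIN index on to_tsvector('english', text) keeps this fast enough
    # for small and mid-sized instances.
    cur.execute(
        """
        SELECT id, text
        FROM statuses
        WHERE to_tsvector('english', text) @@ websearch_to_tsquery('english', %s)
        LIMIT 20
        """,
        ("decentralized search",),
    )
    for status_id, text in cur.fetchall():
        print(status_id, text[:80])
```

One less server process to run, at the cost of some query-time flexibility compared to a dedicated search engine.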
The deployment story for Mastodon is a nightmare, and a substitute like Pleroma, or better yet something written in Rust, is sorely needed.
Did they, though? A bunch of other Fediverse platforms have supported this for literally years, to the point that Mastodon was the butt of jokes for breaking basic search functionality.
Having standard search that just works is a huge deal, and helps address the content discovery problem inherent to a decentralized network.
There is absolutely a segment of Mastodon users who behave like they do, and tbh I think there should always be small instances, but I also think this will be a great improvement.