You've clearly never lived with a cat. Your metaphor is crushed by the Kitty Expansion Theory: No piece of furniture is large enough for a cat and any other additional being.
Just like the human eye can only register 60fps and no more, your computer can only register 4gb of RAM and no more. Anything more than that is just marketing.
My current 4-year-old laptop with 128GB of ECC RAM is wonderful and gets used all the time for simulations, LLMs, ML modelling, and the real heavy lifter: Google Chrome.
About 10 years ago I was like "FINE, clearly 512MB of memory isn't enough to avoid swapping hell, I'll get 1 GB of extra memory." ...and that was that!
These days I'm like "4 GB on a single board computer? Oh that's fine. You may need that much to run a browser. And who's going to run a browser regularly on a SBC? ...oh I've done it a lot of times and it's... fine."
The thing I learned is that you can run a whole bunch of SHIT HOT server software on a system with less than a gigabyte of memory. The moment you run a web browser? FUCK ALL THAT.
And that's basically what I found out long ago. I had a laptop that had like 32 megs of memory. Could be a perfectly productive person with that. Emacs. Darcs. SSH over a weird USB Wi-Fi dongle. But running a web browser? Can't do Firefox. Opera kinda worked. Wouldn't work nowadays, no. But Emacs probably still would.
I remember building my gaming machine in 2008 and putting 4GB (2x2) in; then RAM prices tanked 6 months later so I added another 4GB. I remember having lots of conversations where I was like "yeah, 8GB is overkill", but what I didn't expect was that it was such overkill that when I built my next machine in 2012, I still only put 8GB in it.
It wasn't until 2019 that I built a machine and put 16GB in it. I ran on 8GB for over a decade. Pretty impressive for gaming.
The other day I got a Mini PC to use as a home server (including as media server with Kodi).
It has 8GB of RAM and came with some Windows (10 or 11); I didn't even try it and wiped it out, put Lubuntu on it and a bunch of services along with Kodi.
Even though it's running X in order to have Kodi there and Firefox is open and everything, it's using slightly over 2GB of RAM.
I keep wanting to upgrade it to 16 GB, because, you know, I just like mucking about with hardware and there's the whole new-toy feeling. But I look at the memory usage and just can't bring myself to do it for fun, as it would be a completely useless upgrade, and not even bright-eyed, uh, shiny me can convince adult me to waste 60 bucks on something so utterly, completely useless.
Much like a cat can stretch out and somehow occupy an entire queen-sized bed, Linux will happily cache your file system as long as there is available memory.
Yesterday, the cat was a Rimworld mod with a hefty memory leak: 32 GB was full in seconds. But that gave me enough time to find the culprit and kill Rimworld without trashing my session every time.
"Free" memory is actually usually used for cache. So instead of waiting to get data from the disk, the system can just read it directly from RAM after the first access. The more RAM you have, the more free space you'll have to use for cache. My machine often has over 20GB of RAM used as cache. You can see this with free -m. IIRC both Gnome's and KDE's system monitors also show that now.
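If you want to see the free-vs-available distinction programmatically, here's a minimal sketch that reads /proc/meminfo directly (Linux-only, same source free -m uses); the field names are the kernel's own:

```python
# Sketch: read /proc/meminfo (Linux-only) to see how much "used" memory
# is really just page cache the kernel will hand back on demand.

def meminfo_mib():
    """Return /proc/meminfo fields, converted from kB to MiB."""
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            fields[key] = int(value.split()[0]) // 1024  # kB -> MiB
    return fields

if __name__ == "__main__":
    m = meminfo_mib()
    # MemAvailable includes reclaimable cache; MemFree does not,
    # which is why "free" often looks alarmingly small.
    print(f"total:     {m['MemTotal']} MiB")
    print(f"free:      {m['MemFree']} MiB")
    print(f"available: {m['MemAvailable']} MiB")
    print(f"cached:    {m['Cached']} MiB")
```

On a busy machine MemAvailable is typically far larger than MemFree, with the gap mostly being page cache.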
I just took a Core i5, 6 GB RAM laptop from 2011, reinstalled it with Linux Mint, and put in a 1 TB SSD. The difference between that and its previous setup of Ubuntu 23.10 on a 750 GB 5400 RPM drive was like night and day.
I feel like recently developed games and apps expect the user to have a "modern" amount of RAM, meaning that the devs don't give a crap about optimizing RAM usage.
In a similar fashion I got my son's old netbook. It has 32GB flash as its storage medium. 27GB were in use by Windows, Office, and Firefox; the user files were negligible in size. Then it ran into problems because it wanted to download an 8GB update.
Now it runs Kubuntu, which uses about 4GB with LibreOffice and a load of other things.
Transcoding an HDR Blu-ray to h265 filled it up pretty quickly, and I'm about to start dabbling with game development/3D modeling.
I've also filled it up pretty quickly learning how fast various data structures are in which situations. You don't really see a difference in speed until you get into the billions of items, at least for Python.
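For anyone wanting to run this kind of comparison themselves, here's a minimal timeit sketch comparing list and set membership (the sizes and repeat counts are arbitrary; for worst-case lookups the gap shows up well before billions of items, since list membership is O(n) and set membership is O(1) on average):

```python
# Sketch: timing membership tests in a list vs. a set with timeit.
# n is kept small so the demo runs quickly; scale it up to watch
# the gap between the two containers widen.
import timeit

n = 100_000
data_list = list(range(n))
data_set = set(data_list)
missing = -1  # worst case for the list: it must scan every element

list_time = timeit.timeit(lambda: missing in data_list, number=100)
set_time = timeit.timeit(lambda: missing in data_set, number=100)

print(f"list: {list_time:.4f}s  set: {set_time:.4f}s")
```

Filling memory this way is easy too: a billion-element Python list of ints is tens of gigabytes, since each element is a full object, not a raw machine word.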
It's great that the system is so efficient. But things do come up. I once worked with an LSP server that was so hungry that I had to upgrade from 32 to 64gb to stop the OOM crashes. (Tbf I only ran out of memory when running the LSP server and compiler at the same time - but hey, I have work to do!) But now since I'm working in a different area I'm just way over-RAMed.
Add that to the 3000 browser tabs I have open, the two instances of VS Code, the multithreaded Python app I'm running and developing, and the several-gigabyte dataset that's active in memory.
I ran with 8gb ram for 7 years because zram would shove my swap into what little ram I had available and it actually worked well enough that I didn't feel like upgrading until this year lol.
I was running out of RAM on my 16GB system for years (just doing normal work tasks), so I finally upgraded to a new laptop with 64GB of RAM. Now I never run out of memory.
I got 32 just so I could hoard more browser tabs. I have a more minimal setup on my laptop that goes with me places and any tabs I anticipate not needing for a couple weeks or more go to the desktop with more ram.
It really depends on what you are doing with your system...
On my main PC I want the full Linux Desktop experience, including some Gnome tools that require webkit - and since I am running Gentoo, installing/updating webkit takes a lot of RAM - I would recommend 32 GiB at least.
My laptop on the other hand is an MNT Reform, powered by a Banana Pi CM4 with merely 4 GiB of memory. There I am putting in some effort to keep the system lightweight, and that seems to work well for me up to now. As long as I can avoid installing webkit or compiling the Rust compiler from source, I am perfectly happy with 4 GiB. So happy actually, that I currently don't feel the need to upgrade the Reform to the newly released RK3588 processor module, despite it being a lot faster and it having 32 GiB of memory.
Oh, and last, but not least, my work PC... I'm doing Unreal game development at work, and there the 64 GiB main memory and 8 GiB VRAM I have are the absolute bare minimum. If it were an option, I would prefer to have 128 GiB of RAM, and 16 GiB of VRAM, to prevent swapping and to prevent spilling of VRAM into main memory...