Thanks to dust I deleted a 70 gig file on my drive
Dust is a rewrite of du (in Rust, obviously) that visualizes your directory tree and what percentage of space each file takes up. It only prints as many entries as fit in your terminal height, so you see just the largest files. It's been a better experience than du, which isn't always easy to navigate when you're hunting for big files (or at least I'm not good at it).
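If anyone wants to try it, here's roughly how I run it; the cargo crate name is from memory, so double-check it (dust may also be in your distro's repos):

    # install via cargo (the crate is named du-dust, if I remember right)
    cargo install du-dust

    # run it on your home directory; output is a tree sorted by size,
    # truncated to what fits on screen
    dust ~

    # roughly the same question asked with plain GNU du:
    # summarize two levels deep, sort the human-readable sizes, keep the 20 largest
    du -h --max-depth=2 ~ | sort -h | tail -n 20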
Anyway, I found a log file at .local/state/nvim/log that was 70 GB. I deleted it. Hope it doesn't bite me. I'd been pushing around 95% of disk space for a while, so this was a huge win 👍
So I found out that qbittorrent generates errors in a log whenever it tries to write to a disk that is full...
Every time my disk was full I would clear out some old torrents, then all the pending log entries would get written and the disk would be full again. The log was well over 50 GB by the time I figured out that I'm an idiot. Hooray for having dedicated machines.
I once did something even dumber. When I was new to Linux and the CLI, I added a recursive line to my shell config that would append itself to the shell config. So I pretty much had exponential growth of my shell config, and my shell took ~20 seconds to start up before I found the broken code snippet.
If you have ideas please let me know. I'm preparing to hop distros so I'm very tempted to ignore the problem, blame the old distro, and hope it doesn't happen again :)
I would have to look at the log file. Some plugin probably has an issue and writes massive amounts of data to the log every time you use Neovim. Monitor the growth of the log file and contact me via DM if it goes crazy again, I'm gonna try to figure out what's going on.
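A low-effort way to keep an eye on it (the path is taken from the comment above; adjust if your setup differs):

    # re-check the log size every 60 seconds
    watch -n 60 'du -h ~/.local/state/nvim/log'

    # or watch what is actually being written to it
    tail -f ~/.local/state/nvim/log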
But did he even look at the log file? They don't get that big when things are running properly, so it was probably warning him about something. Like "Warning: Whatever you do, don't delete this file. It contains the protocol conversion you will need to interface with the alien computers to prevent their takeover."
MINI GUIDE TO FREEING UP DISK SPACE (by a datahoarder idiot who runs on 5 gigs free space on 4 TB)
You will find more trash with a combination of four tools: Czkawka (duplicates and big files), Dupeguru (logs), VideoDuplicateFinder by 0x90d, and tune2fs.
VDF finds duplicates by comparing multiple frames of each video, even when one copy is reversed, and you can set the similarity percentage and the duration of videos to compare. It is the best tool of its kind, with nothing to match it, and it uses ffmpeg as its backend.
A certain amount of disk space on ext partitions is reserved for root and privileged processes, but users who create a separate /home partition do not need this reserved space there. 5% is reserved by default, no matter whether your disk is 1 TB, 2 TB or 4 TB. To change this, use sudo tune2fs -m N (where N is the percentage you want to reserve; it can be set to 0% for /home, but NEVER touch root, swap or other partitions; use GParted to check which partition is which).
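A rough sketch of what that looks like in practice (the device name below is a placeholder; check which partition is actually /home first):

    # find out which device holds /home (or use GParted)
    lsblk -f
    df -h /home

    # set the reserved space on the /home partition to 0%
    # /dev/sdb1 is a placeholder; do NOT run this on the root partition
    sudo tune2fs -m 0 /dev/sdb1

    # verify the change
    sudo tune2fs -l /dev/sdb1 | grep -i 'reserved block count'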
Regular junk cleaning on Linux can be done with BleachBit. Wipe free disk space at least once every 3-6 months.
On Windows, use PrivaZer instead of BleachBit.
Since all of these are GUI tools (except tune2fs, which still requires no command-line hackerman knowledge), this guide is targeted at the tech literacy level of users who can at least replace cracked EXEs in pirated games on Windows.
I admin around three hundred Linux servers and this is one of my most common tasks - although I use -shc as I like the total too, and I don't bother with less, since it's only the biggest files and dirs that I'm interested in and they show up last, so there's no need to scroll back.
When managing a lot of servers, the storage required for installing extra software is never trivial. (Although our storage does do very clever compression and it might recognise the duplication of the file even across many VM filesystems, I'm never quite sure that works as advertised on small files.)
Check out dua. I usually use it in interactive mode, which lets you navigate through the file system with visual representations of total dir/file size.
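If I remember the CLI right (the crate is dua-cli; treat the exact subcommands as an assumption and check dua --help), it looks like this:

    # install and launch interactive mode in the current directory
    cargo install dua-cli
    dua interactive        # short form: dua i

    # non-interactive: just aggregate sizes for a path
    dua ~/Downloads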
Here is a screenshot randomly found from the github issues
I also recently found this GUI program called k4dirstat buried in the repos. There are a few more modern options, but this one blows them all out of the park.
Screenshot from the github repo:
Too bad they used such an ugly configuration for the screenshot. It lets you modify the visualization to look better and display information differently. Anyway, just thought I'd share, since the project is old and little known.
thanks for sharing a screenshot of ncdu, should help others discover it
For the visualization itself, IMHO Disk Usage Analyzer gives aesthetically pleasing results. I'm not a fan of the UX, but it works well enough to efficiently identify large files or directories.
Maybe other tools support this too but one thing I like about xdiskusage is that you can pipe regular du output into it. That means that I can run du on some remote host that doesn't have anything fancy installed, scp it back to my desktop and analyze it there. I can also pre-process the du output before feeding it into xdiskusage.
I also often work with textual du output directly, just sorting it by size is very often all I need to see.
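Roughly, that workflow looks like this (the host and paths are placeholders, and I'm going from memory that xdiskusage accepts a saved du file as an argument):

    # on the remote host: dump raw du output (sizes in 1K blocks) and capture it locally
    ssh user@remote-host 'du /srv/data' > remote-du.txt
    # (or run du remotely to a file, then: scp user@remote-host:remote-du.txt .)

    # feed the saved output to xdiskusage on the desktop
    xdiskusage remote-du.txt

    # or skip the GUI entirely and just sort the text by size
    sort -n remote-du.txt | tail -n 20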
For anyone who doesn't want to install extra software for this, or who manages a great many servers, "du -shc * | sort -h" will show the largest files and directories at the bottom, so you don't need to scroll back for the results.
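One caveat: the * glob skips dotfiles, so if the space is hiding somewhere like ~/.local (as it was for the OP), point du at the parent directory instead:

    # largest entries one level down, hidden directories included
    du -h --max-depth=1 ~ | sort -h | tail -n 20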