What is the best duplicate file finder, that preserves my source of truth?
Having been so meticulous about taking backups, I’ve perhaps not been as careful about where I stored them, so I now have loads of duplicate files in various places. I’ve tried various tools (fdupes, Czkawka, etc.), but none seems to do what I want. I need a tool that lets me say which folder (and its subfolders) is the source of truth, then look for anything else, anywhere else, that’s a duplicate, and give me the option to move or delete it. Seems simple enough, but I have found nothing that allows me to do that. Does anyone know of anything?
Thanks - I think I tried that, but at the time it had no concept of a source (location) of truth to preserve and find duplicates against. Has that changed? They don’t seem to mention that specific capability at that link.
Thanks @speculatrix - I wish I had your confidence in scripting, hence I’m hoping to find something that does all that clever stuff for me. The key thing for me is to be able to say something like: multimedia/photos/ is the source of truth; anything found elsewhere is a duplicate.
I’d like to find something with that capability, so I can say multimedia/photos/ is the source of truth and anything identical found elsewhere is a duplicate. I hoped this would be an easy thing to do, as the ask is simply to ignore any duplicates inside a particular folder hierarchy.
The duff utility reports clusters of duplicates in the specified files and/or directories. In the default mode, duff prints a customizable header, followed by the names of all the files in the cluster. In excess mode, duff does not print a header, but instead for each cluster prints the names of all but the first of the files it includes.

If no files are specified as arguments, duff reads file names from stdin.
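As a sketch of how that excess mode could be used (this assumes duff is installed; the path is hypothetical, so substitute your own and check duff's man page before deleting anything):

```shell
# -r recurses into directories; -e (excess mode) prints every file in a
# cluster except the first, i.e. exactly the copies you could remove.
duff -r -e /volume1/old-backups

# To actually remove the excess copies, pipe the list to rm. If your
# build of duff supports -0 (null-terminated names), that is safer for
# file names containing spaces or newlines:
# duff -0 -r -e /volume1/old-backups | xargs -0 rm --
```

Note the caveat for this thread: duff keeps whichever file it happens to list first in each cluster, not necessarily the copy in your source-of-truth folder.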
In this screenshot, for example, I added 3 folders and marked the first folder as the reference folder (the checkmark behind it). It will now look for files from this folder in the other folders and delete all identical files found in the non-reference folders (it will of course first list all of them and ask you to confirm before deleting).
I use DoubleKiller on Windows and rmlint on Linux. With rmlint you can tag directories and tell it to keep everything in the tagged tree and only match against the tagged files. It has a lot of options, but no GUI.
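That tagged-directory behaviour maps directly onto the "source of truth" ask. A sketch of the invocation (the paths are hypothetical placeholders; review rmlint's documentation and its generated script before running anything destructive):

```shell
# Everything BEFORE the '//' separator is untagged (fair game);
# everything AFTER it is tagged (the source of truth).
#   -k / --keep-all-tagged   : never remove files from the tagged tree
#   -m / --must-match-tagged : only report files that duplicate a tagged file
rmlint /volume1/downloads /volume1/old-backups // /volume1/multimedia/photos \
    --keep-all-tagged --must-match-tagged
```

rmlint does not delete anything by itself; it writes an rmlint.sh script listing the proposed removals, which you can review and then execute.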
I think you’re asking for a duplicate finder that can tell where a file came from (the source of truth?).
Most duplicate finders work by hashing the files and looking for matches. If the file itself recorded where it came from, it would have a different hash and would no longer be found to be a duplicate.
So I don’t think what you’re asking for can be done. But I’m not sure I understand what you’re asking.
Thanks @CrappyTan69 - I’d ideally need this to run on my NAS and, if possible, be open source/free. It looks like DoubleKiller would cost £15/$20 for what I’d need it to do - maybe an option as a last resort.
I don't think there is a good way to tell which of two duplicate files was "first" other than checking the creation date, but if this is Linux that attribute may not be enabled in your filesystem type.
The closest thing I've seen is a Python dedup script, but after it identifies all the dups it deletes all but one of them and then puts hard links to that real file where all the deleted dups were.
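A minimal sketch of that technique (not the script mentioned above, just an illustration; try it on a copy of your data first, and note that hard links only work within one filesystem):

```python
import hashlib
import os
from typing import Dict


def hash_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()


def dedup_to_hardlinks(root: str) -> int:
    """Replace duplicate files under root with hard links to the first
    copy encountered. Returns the number of files replaced."""
    first_seen: Dict[str, str] = {}
    replaced = 0
    for dirpath, _dirs, names in os.walk(root):
        for name in sorted(names):
            path = os.path.join(dirpath, name)
            digest = hash_file(path)
            if digest in first_seen:
                os.remove(path)
                # The path now shares an inode with the first copy.
                os.link(first_seen[digest], path)
                replaced += 1
            else:
                first_seen[digest] = path
    return replaced
```

The upside of the hard-link approach is that every path keeps "working" while the data is stored only once; the downside is that editing the file through any path edits it for all of them.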
Hi @xewgranodius - I’m not actually worried about which came first; the key thing for me is which one is located in the directory (source) of truth. If it’s not in there, then it’s fair game and can be moved/deleted.
I’ll have to reinstall it to remind myself what it was. If I recall correctly, it was not easy to work out what I needed to do, as I simply wanted to say: scan everything for duplicates of files that are in the directory hierarchy (e.g. multimedia/photos/) I have deemed to be the source of truth.
If you're 100% sure that the dupes only exist between your source of truth and "everything else", you can run fdupes and then grep -v /path/to/source/of/truth/root on the output - all the file paths that remain are duplicate files outside your source of truth, which can be deleted.
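The same idea - treat one tree as the truth and report matching content found anywhere else - can also be done in a short script, without the assumption that every cluster touches the truth tree. A sketch, assuming content equality is what matters (file names are ignored):

```python
import hashlib
import os
from typing import Iterator, List, Set


def hash_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()


def walk_files(root: str) -> Iterator[str]:
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            yield os.path.join(dirpath, name)


def duplicates_outside_source(source_root: str, search_root: str) -> List[str]:
    """List files under search_root whose content matches some file under
    source_root (the source of truth). Files inside source_root itself are
    never reported, so the truth tree stays untouched."""
    source_root = os.path.abspath(source_root)
    truth: Set[str] = {hash_file(p) for p in walk_files(source_root)}
    dupes: List[str] = []
    for path in walk_files(search_root):
        if os.path.abspath(path).startswith(source_root + os.sep):
            continue  # never flag files inside the source of truth
        if hash_file(path) in truth:
            dupes.append(path)
    return dupes
```

This only lists the duplicates; moving or deleting them is then a deliberate second step you can review first.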