Feature requests.
Posted: Fri Sep 04, 2020 6:43 am
Feature requests:
(1) Folder selection pane: Add "expand all" and "collapse all" options.
(1.1) Buttons on the top or bottom to "expand all" and "collapse all".
(1.2) Right-click any folder and select "expand all subfolders" and "collapse all subfolders".
(2) Folder selection pane: Add a search function to display all paths that contain a string.
ex. I would like to enter "photos" to see every path containing "photos" and add those paths to my search.
Then I would repeat for "pictures", "DCIM", and "camera" to try to find all of the places I have stored good photos. I don't want to simply delete all dupes, because I want to delete the old backups of unsorted folders rather than the later ones that are sorted. The problem with a global photo search is that I would then also be searching photos that were downloaded, supplied by programs, etc.
ex. I am cleaning up years of old backups, so deduping everything at once is overload. I want to see all of my "finance" folders, look through them, and then move on to another subject.
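The path filter described above only needs a case-insensitive substring match; a minimal sketch in Python (the function name and sample paths are made up for illustration; I don't know Duplicate Cleaner's internals):

```python
def find_matching_paths(paths, terms):
    """Return every path whose text contains any search term (case-insensitive)."""
    terms = [t.lower() for t in terms]
    return [p for p in paths if any(t in p.lower() for t in terms)]

paths = [
    r"D:\Backup2015\Photos\Trip",
    r"D:\Backup2015\Documents",
    r"E:\Old\DCIM\100CANON",
]
# Matches the first and third paths; "Documents" contains no search term.
print(find_matching_paths(paths, ["photos", "dcim"]))
```

The same filter could then feed the matched paths straight into the folder selection pane.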
(3) Add a folder tree view of the results, with % duplicates, # duplicate files, and size.
If I knew that folder A contains 90% duplicates and folder B is 30% duplicates, then it helps me decide that I should delete duplicates from folder A instead of folder B.
[I found a program that shows the % similarity of similar folders; it's way out of my price range at 189 Euro (intended for servers), but you can see how they present the data. https://www.jam-software.com/spaceobser ... ders.shtml ]
(3.1) Add a toggle to show the % duplicates in the results list: (could have option to show %, #, GB)
Ex. In the case below it would be clear that I should delete all of FolderC, and probably all dups in FolderB, rather than deleting from FolderZ.
file1.txt D:\FolderA(20%)\FolderB(80%)\FolderC(100%)
file1.txt F:\FolderX(2%)\FolderY(10%)\FolderZ(5%)
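For illustration, the per-folder statistics above could be computed roughly like this (a Python sketch; `folder_duplicate_stats` is a made-up name, it only counts a file's immediate folder, and real code would also roll counts up to ancestor folders as in the example paths):

```python
from collections import defaultdict
import ntpath  # handles Windows-style backslash paths on any platform

def folder_duplicate_stats(all_files, duplicate_files):
    """Per folder: (duplicate count, total count, percent duplicates)."""
    totals = defaultdict(int)
    dups = defaultdict(int)
    for f in all_files:
        totals[ntpath.dirname(f)] += 1
    for f in duplicate_files:
        dups[ntpath.dirname(f)] += 1
    return {
        folder: (dups[folder], n, round(100 * dups[folder] / n))
        for folder, n in totals.items()
    }

all_files = [r"D:\FolderA\a.txt", r"D:\FolderA\b.txt", r"D:\FolderB\c.txt"]
dup_files = [r"D:\FolderA\a.txt"]
# FolderA: 1 of 2 files duplicated (50%); FolderB: 0 of 1 (0%).
print(folder_duplicate_stats(all_files, dup_files))
```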
(4) Duplicate folders view:
(4.1) Allow the user to browse upwards in the right detail pane. There's ".." but it doesn't go up.
(4.2) Add the right-click "parents" selection option.
Why: Often there are many duplicate folders within a higher folder, so instead of scrolling through a long list to select every subfolder, it's best for me to select all duplicates in a higher folder. To do this, I currently right-click "open in explorer", look at the higher folders, copy the folder path, go to the files tab, choose select-by-path, paste the path, and then shorten it (typically looking at the higher folders of both duplicate subfolders).
(5) Enable multiple instances (as noted in the forum). Ideally, launch one from the current session:
ex. I do a fast check by filename only and see a few large folders that have duplicates. I then want to perform a contents check on them. So I would like to select the two folders that I want to compare and right-click "open in a new compare session", where I can change the comparison parameters and run it without losing the other session, which I can continue to look through manually while the new session is comparing.
(6) Files and Folders results: Enable multiple selection:
ex. When I want to look closer at two folders (or files), I currently need to select each one individually and right-click "open in explorer" or "view file"; I would prefer to select several folders and open them all in one click. (Could limit to a max of 10 items selected.)
(6.1) Even Better: Be able to open two or three folders in a side-by-side explorer window. If there's any free folder compare program or multi-tab browser that could accept a command, then you could pass the task to that program (if it's not worth creating yourself, but I expect that it's easily done with your existing modules).
(7) File/Folder results: Exclude/protect files/folders.
ex. There are a lot of duplicates in a folder, but I decide that I don't want to touch that folder, so I would like to right-click "exclude" or right-click "protect" so that I don't accidentally delete from it. (Obviously, exclude and protect are very different actions: protect would still show the files, but marked, perhaps in bold, to indicate protection, whereas exclude would also drop from the list any duplicate whose only companion was in the excluded folder.)
(8) "Refresh existence" of the files in the results list without repeating the compare. i.e. Only refresh the existence of the files.
Ex. If I find files/folders that I want to manually delete, move or rename then I want to refresh to update the results without having to repeat an entire long comparison. Obviously, these changed files would simply be removed from the list.
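The refresh itself should be cheap compared with a re-compare, since it is only an existence check per listed file. A sketch of the idea (plain Python; `refresh_results` is a hypothetical name):

```python
import os
import tempfile

def refresh_results(results):
    """Keep only entries whose file still exists on disk; no re-compare needed."""
    return [r for r in results if os.path.exists(r)]

# demo: one real file and one path that no longer exists
with tempfile.NamedTemporaryFile(delete=False) as f:
    real = f.name
gone = real + "_deleted"

# Only the surviving file remains in the results list.
print(refresh_results([real, gone]))
```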
(9) Add an option to remove file errors from the list, or otherwise deal with saved html pages that have an html file + folder.
When deleting, I get an error that the files in the "-files" folder cannot be found, presumably because Windows treats the files folder as part of the html file while Duplicate Cleaner Pro treats them as separate. Simply having the option to remove any file that has a "file not found" error would be nice. (Obviously this cannot be automatic, because the same error appears if a drive was disconnected or asleep.)
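Spotting the paired folder from the html filename is straightforward; a sketch (Python, names invented; on English Windows the companion suffix is typically "_files", but it varies by system language, so the suffix list here is an assumption):

```python
import ntpath

# Companion-folder suffixes vary by Windows language/version; these two are
# assumptions based on common saved-page naming.
COMPANION_SUFFIXES = ("_files", "-files")

def companion_folders(html_path):
    """For an .htm/.html file, return the candidate saved-page companion
    folder paths; None for any other file type."""
    root, ext = ntpath.splitext(html_path)
    if ext.lower() not in (".htm", ".html"):
        return None
    return [root + s for s in COMPANION_SUFFIXES]

print(companion_folders(r"D:\Saved\article.html"))
```

A tool could use a check like this to treat the html file and its folder as one unit when marking or deleting.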
(10) Filename/path too long when moving files.
Is there a way to predict too-long names and ask to adjust them?
Moving files is far faster with the Windows Shell, but if path+filename is too long it simply fails. Then I need to use the built-in move, which waits and gives all errors at the end, and is slower. Not a big deal, but a prediction would be nice to prevent failure.
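Predicting the failure up front only needs a length check on the would-be destination path against the classic 260-character MAX_PATH limit (a sketch assuming long-path support is off and files land directly in the destination folder; names are made up):

```python
MAX_PATH = 260  # classic Windows limit, counted including the terminating NUL

def predict_too_long(src_paths, dest_folder):
    """Return the destination paths that would exceed MAX_PATH after a move."""
    too_long = []
    for src in src_paths:
        name = src.rsplit("\\", 1)[-1]          # keep the original filename
        dest = dest_folder.rstrip("\\") + "\\" + name
        if len(dest) + 1 > MAX_PATH:            # +1 for the NUL terminator
            too_long.append(dest)
    return too_long

dest = "D:\\Dupes\\" + "Very" * 55 + "Long"     # a deliberately deep target
print(predict_too_long(["C:\\a\\report_final_version_2020.txt"], dest))
```

The tool could run this check before the fast Shell move and prompt to shorten only the offenders.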
(11) UNDO LAST STEP! When marking items.
Ex. In the Duplicate File window I had marked quite a few files for deletion, with no group having all of its duplicates marked. Then I selected a parent folder to "mark all" and, boom, I had many, many groups with all duplicates marked. Ideally, I could undo that last marking step. Since there is no parent > Unmark, I must copy the path and use a text pattern to unmark that folder, or select all of the "group has all duplicates" entries and unmark all selected. Both also unmark things that I had already marked deliberately, so I then need to go back and repeat my marking. (Since there is no "undo", there's always a trade-off between deleting files frequently vs. trying to get more marked before taking the time to delete them.)
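The undo only needs each bulk action to remember exactly which items it changed, so reverting it leaves earlier manual marks intact. A minimal sketch (Python; the class and its API are invented for illustration):

```python
class MarkUndo:
    """Undo stack for marking: each action records the items it newly marked."""

    def __init__(self):
        self.marked = set()
        self.history = []          # one set of newly-marked items per action

    def mark(self, items):
        changed = {i for i in items if i not in self.marked}
        self.marked |= changed
        self.history.append(changed)

    def undo(self):
        if self.history:
            self.marked -= self.history.pop()  # revert only that action's marks

m = MarkUndo()
m.mark(["a.txt", "b.txt"])            # careful manual marks
m.mark(["b.txt", "c.txt", "d.txt"])   # accidental bulk "mark all"
m.undo()                              # reverts only c.txt and d.txt
print(sorted(m.marked))
```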
(12) Remove Empty Directories
I am always left with a lot of empty directories even though I have checked that Duplicate Cleaner should delete empty directories. I haven't found any exclusions that would cause this. I always have to use the free RED (Remove Empty Directories) software to remove the empty directories.
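For what it's worth, the sweep RED performs is conceptually just a bottom-up walk that removes any directory found empty (a minimal sketch in Python; the demo tree is made up):

```python
import os
import tempfile

def remove_empty_dirs(root):
    """Delete empty directories under root, bottom-up, so a folder emptied
    during this run is itself removed; returns the removed paths."""
    removed = []
    for dirpath, _, _ in os.walk(root, topdown=False):
        if not os.listdir(dirpath):   # fresh check: children may be gone now
            os.rmdir(dirpath)
            removed.append(dirpath)
    return removed

# demo: only the branch containing no files should disappear
base = tempfile.mkdtemp()
os.makedirs(os.path.join(base, "empty", "deeper"))
os.makedirs(os.path.join(base, "kept"))
open(os.path.join(base, "kept", "file.txt"), "w").close()
print(remove_empty_dirs(base))   # removes "deeper", then the emptied "empty"
```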