Feature requests.

wwcanoer
Posts: 49
Joined: Wed Aug 19, 2020 5:49 am

Feature requests.

Post by wwcanoer »

Feature requests:
(1) Folder selection pane: Add "expand all" and "collapse all" options.
(1.1) Buttons on the top or bottom to "expand all" and "collapse all".
(1.2) Right-click any folder and select "expand all subfolders" and "collapse all subfolders".

(2) Folder selection pane: Add a search function to display all paths that contain a string.
ex. I would like to enter "photos" to see every path that has photos and add them to my search.
Then I would repeat for "pictures", "DCIM" and "camera" to try to find all of the places I have stored good photos. I don't want to simply delete all dupes, because I want to delete the old backups of unsorted folders rather than the later ones that are sorted. The problem with a global photo search is that I am then also searching photos that were downloaded, supplied by programs, etc.
ex. I am cleaning up years of old backups, so deduping everything at once is overload. I want to see all of my "finance" folders, look through them, and then move on to another subject.

(3) Add a folder tree view of the results, with % duplicates, # duplicate files, and size.
If I knew that folder A contains 90% duplicates and folder B is 30% duplicates, then it helps me decide that I should delete duplicates from folder A instead of folder B.
[I found a program that shows similar folders and their % similarity, but it is way out of my price range at 189 Euro (intended for servers); still, you can see how they present the data. https://www.jam-software.com/spaceobser ... ders.shtml ]

(3.1) Add a toggle to show the % duplicates in the results list: (could have option to show %, #, GB)
Ex. In the below case it would be clear that I should delete all of FolderC and probably all dups in FolderB rather than deleting from FolderZ.
file1.txt D:\FolderA(20%)\FolderB(80%)\FolderC(100%)
file1.txt F:\FolderX(2%)\FolderY(10%)\FolderZ(5%)
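For what it's worth, the per-folder figure I have in mind is just (duplicate files under the folder) divided by (total files under the folder). A minimal sketch of the calculation, assuming the results can be exported as a flat set of duplicate file paths; the names duplicate_paths and scan_root are only illustrative, not anything Duplicate Cleaner exposes:
[code]
import os
from collections import Counter

def folder_duplicate_percentages(scan_root, duplicate_paths):
    """Sketch: % of files under each folder that are known duplicates.

    duplicate_paths is assumed to be a set of full paths taken from an
    exported results list; scan_root is the folder that was scanned.
    """
    dup_counts = Counter()
    total_counts = Counter()
    root = os.path.normcase(os.path.normpath(scan_root))
    for dirpath, _dirnames, filenames in os.walk(scan_root):
        for name in filenames:
            is_dup = os.path.join(dirpath, name) in duplicate_paths
            # Credit the file to this folder and every ancestor up to scan_root.
            folder = dirpath
            while True:
                total_counts[folder] += 1
                if is_dup:
                    dup_counts[folder] += 1
                if os.path.normcase(os.path.normpath(folder)) == root:
                    break
                folder = os.path.dirname(folder)
    return {folder: 100.0 * dup_counts[folder] / total
            for folder, total in total_counts.items()}

# e.g. print the folders that are mostly duplicates, biggest offenders first:
# for folder, pct in sorted(results.items(), key=lambda kv: -kv[1]):
#     print(f"{pct:5.1f}%  {folder}")
[/code]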

(4) Duplicate folders view:
(4.1) Allow the user to browse upwards in the right detail pane. There's ".." but it doesn't go up.
(4.2) Add the right-click "parents" selection option.
Why: Often there are many duplicate folders within a higher folder, so instead of scrolling through a long list to select every subfolder, it's best for me to select all duplicates in a higher folder. To do this today, I first right-click "open in explorer", look at the higher folders, copy the folder path, then go to the files tab, select by path, paste that path and shorten it. (Typically I look at the higher folders of both duplicate subfolders.)

(5) Enable multiple instances (as noted in the forum). Ideally, launch a new one from the current session:
ex. I do a fast check with filename only and see a few large folders that have duplicates. I then want to perform a contents check on them, so I would like to select the two folders that I want to compare and right-click "open in a new compare session", where I can change the comparison parameters and run it without losing the other session, which I can continue to look through manually while the new session is comparing.

(6) Files and Folders results: Enable multiple selection:
ex. When I want to look more closely at two folders (or files), I need to individually select and right-click "open in explorer" or "view file", but I would prefer to select several folders and open them in one click. (Could limit to a max of 10 items selected.)

(6.1) Even better: be able to open two or three folders in a side-by-side explorer window. If there's any free folder-compare program or multi-tab browser that could accept a command, then you could pass the task to that program, as in the sketch below (if it's not worth creating yourself, though I expect it's easily done with your existing modules).
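A rough sketch of what I mean by handing the task off, assuming the selected folders are available as a list and that the external tool (if any) accepts two folder paths as plain arguments; the tool path and its command line are assumptions, not anything the program provides today:
[code]
import subprocess

# Hypothetical: folders picked in the results pane
selected = [r"D:\FolderA\FolderB", r"F:\FolderX\FolderY"]

MAX_OPEN = 10    # the "max 10 items" limit suggested in (6)
COMPARER = None  # optional path to a folder-compare tool that takes two folder arguments (assumption)

if COMPARER and len(selected) == 2:
    # Assumed command line: tool.exe <folder1> <folder2>
    subprocess.Popen([COMPARER, selected[0], selected[1]])
else:
    for folder in selected[:MAX_OPEN]:
        # explorer.exe opens the given folder in its own window
        subprocess.Popen(["explorer", folder])
[/code]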

(7) File/Folder results: Exclude/protect files/folders.
ex. There are a lot of duplicates in a folder but I decide that I don't want to touch that folder, so I would like to right-click "exclude" or right-click "protect" so that I don't accidentally delete from that folder. (Obviously, exclude and protect are very different actions: protect would still show the files, but marked in bold(?) to indicate they are protected, whereas exclude would also drop from the list any companion duplicates left with no other match.)

(8) "Refresh existence" of the files in the results list without repeating the compare. i.e. Only refresh the existence of the files.
Ex. If I find files/folders that I want to manually delete, move or rename then I want to refresh to update the results without having to repeat an entire long comparison. Obviously, these changed files would simply be removed from the list.

(9) Add an option to remove file errors from the list, or otherwise deal with saved html pages that consist of an html file + folder.
When deleting, I get an error that the files in the "-files" folder cannot be found, presumably because Windows treats the files folder as part of the html file but Duplicate Cleaner Pro treats them as separate. Simply having the option to remove any file that has a "file not found" error would be nice. (Obviously this cannot be automatic, because the same error appears if a drive was disconnected or asleep.)

(10) Filename/path too long when moving files.
Is there a way to predict too-long names and ask to adjust them?
Moving files is far faster with the Windows Shell, but if path+filename is too long it simply fails. Then I need to use the built-in move, which waits and gives all the errors at the end, which is slower. Not a big deal, but a prediction would be nice to prevent the failure.
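The check itself is simple: classic Windows file APIs are limited to MAX_PATH (260 characters, including the terminator) unless long-path support is enabled. A minimal sketch of pre-checking planned moves, where planned_moves is only an illustrative list of (source file, destination folder) pairs, not Duplicate Cleaner's own data structure:
[code]
import os

MAX_PATH = 260  # classic Windows limit, counting the terminating NUL

def too_long_destinations(planned_moves):
    """Return planned moves whose destination path would hit MAX_PATH.

    planned_moves: list of (source_file, destination_folder) pairs.
    """
    offenders = []
    for src, dest_dir in planned_moves:
        dest_path = os.path.join(dest_dir, os.path.basename(src))
        if len(dest_path) >= MAX_PATH:
            offenders.append((src, dest_path, len(dest_path)))
    return offenders

# Anything reported here could be renamed or rerouted up front,
# instead of failing partway through the fast shell move.
[/code]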

(11) UNDO LAST STEP! When marking items.
Ex. In the Duplicate File window I marked quite a few files for deletion, with no group having all duplicates marked. Then I selected a parent folder to "mark all" and boom, I had many, many groups with "all duplicates" marked. Ideally, I could undo that last marking step. Since there is no parent > Unmark, I must copy the path and use the text pattern to unmark that folder, or select all of the "group has all duplicates" entries and unmark all selected. Both approaches also unmark things that I had already marked, so I need to go back and repeat my marking. (Since there is no "undo", there's always a trade-off between deleting files frequently vs. trying to get more marked before taking the time to delete them.)

(12) Remove Empty Directories
I am always left with a lot of empty directories, even though I have checked the option for Duplicate Cleaner to delete empty directories. I haven't found any exclusions that would cause this, and I always have to use the free RED (Remove Empty Directories) tool to remove them.
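In the meantime the clean-up can be scripted; a minimal sketch of what RED does, assuming you really do want every empty directory under a root removed (try it on a test tree first):
[code]
import os

def remove_empty_dirs(root):
    """Delete every empty directory under root, deepest first (sketch only)."""
    removed = []
    root = os.path.normpath(root)
    # topdown=False visits children before parents, so a folder that only
    # contained empty folders becomes empty in time to be removed itself.
    for dirpath, _dirnames, _filenames in os.walk(root, topdown=False):
        if os.path.normpath(dirpath) == root:
            continue  # leave the scan root itself alone
        if not os.listdir(dirpath):  # re-check on disk; children may have just been removed
            os.rmdir(dirpath)        # rmdir only succeeds on a truly empty directory
            removed.append(dirpath)
    return removed
[/code]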
DigitalVolcano
Site Admin
Posts: 1717
Joined: Thu Jun 09, 2011 10:04 am

Re: Feature requests.

Post by DigitalVolcano »

Thanks for the suggestions - all useful for current development.

Note -
-You can refresh to remove deleted files using F5.

-There is an undo button in the Selection Assistant.
DickyJean
Posts: 5
Joined: Sun Sep 13, 2020 10:11 pm

Re: Feature requests.

Post by DickyJean »

In the file removal tool, I like the option to keep the folder structure at the destination. One thing that would be helpful for my use case is being able to specify which part of the folder structure to keep. Example: moving from c:\users\username\documents\folder\subfolder\[folder structure to keep] to \\server\share\ and keeping the folder structure gives me \\server\share plus everything after c:, which in this case turns out to be \\server\share\users\username\documents\folder\subfolder\[folder structure to keep].

I would like to specify that only the structure after "subfolder" is kept, and get \\server\share\[folder structure to keep].
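To make the request concrete, the transformation I'm after is roughly this sketch (the function name and arguments are only illustrative; "subfolder" stands for whatever folder the user picks):
[code]
import os

def rebase_after(source_path, keep_after, new_root):
    r"""Keep only the structure below the `keep_after` folder.

    rebase_after(r"C:\Users\username\Documents\folder\subfolder\a\b.txt",
                 "subfolder", r"\\server\share")
    -> "\\server\share\a\b.txt"
    """
    parts = os.path.normpath(source_path).split(os.sep)
    idx = parts.index(keep_after)  # ValueError if the marker folder isn't in the path
    return os.path.join(new_root, *parts[idx + 1:])
[/code]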

If I'm just missing some option, please point me to it. If not, I appreciate your consideration.
cstern
Posts: 6
Joined: Thu Apr 09, 2020 6:57 pm

Re: Feature requests.

Post by cstern »

One more:
When comparing a protected folder to another, and e.g. there is a match between two or more files in the protected folder and a file in the unprotected one, it will list them as two "bold" files and some "normal" ones (I mean the font typography). If I use "Drop selected folder from list" on the unprotected file, it disappears (as it should), along with other files from the same unprotected folder, from the comparison list.
Here comes the catch: if the files from the protected folder each have only one match in the "dropped" folder, they too disappear from the comparison. But if there are two or more matches in the protected folder, they stay even when the matching, unprotected files/folders are dropped.
The behavior I would prefer: if only protected files are left in a group after the "drop", they should also disappear from the list.
JimG
Posts: 4
Joined: Sun Aug 22, 2021 7:32 am

Re: Feature requests.

Post by JimG »

This is a required, critical feature. I'm on day 2 of a trial and I recognize that this is absolutely required. I have 4 million duplicate files and 22k folders. Two primary paths are mirror duplicates of photos (with tons of subfolders). They are meant to be duplicated, and to be protected at ALL costs. However, there are tons of (source) paths that don't need to be around anymore (thanks for finding those).

I'd like to PROTECT the true paths and all their subpaths, so I can't possibly delete them accidentally, but they need to stay in the results so that duplicates OF those files can be identified and considered for deletion.

After a 21-hour scan, I added a couple of subfolders of the scanned folders as protected. However, the results tabs don't show any changes, and those folders still get marked along with any wildcard patterns I specify. How do I know they are protected? Does it require rescanning everything just to NOW protect paths that it already scanned?

What is the correct approach here?
JimG
Posts: 4
Joined: Sun Aug 22, 2021 7:32 am

Re: Feature requests.

Post by JimG »

Also, please do put the UI and file/folder marking on a separate thread. With 4 million files, clicking a single folder and moving on to the next folder to click is blocked by the "Please Wait" progress dialog, which takes 15 seconds to mark a single folder. With 22k folders, even if I were FLASH with no time to think, I would still waste 91 hours waiting for a dialog box alone. This isn't scalable.
DigitalVolcano
Site Admin
Posts: 1717
Joined: Thu Jun 09, 2011 10:04 am

Re: Feature requests.

Post by DigitalVolcano »

JimG wrote:
I added a couple of subfolders of the scanned folders as protected. However, the results tabs don't show any changes, and those folders still get marked along with any wildcard patterns I specify. How do I know they are protected? Does it require rescanning everything just to NOW protect paths that it already scanned?
How did you protect them? They should show a red padlock instead of a checkbox if protected, and shouldn't be able to be marked.
JimG
Posts: 4
Joined: Sun Aug 22, 2021 7:32 am

Re: Feature requests.

Post by JimG »

DigitalVolcano wrote: Mon Aug 23, 2021 10:39 am
JimG wrote:
I added a couple of subfolders of the scanned folders as protected. However, the results tabs don't show any changes, and those folders still get marked along with any wildcard patterns I specify. How do I know they are protected? Does it require rescanning everything just to NOW protect paths that it already scanned?
How did you protect them? They should show a red padlock instead of a checkbox if protected, and shouldn't be able to be marked.
I had scanned (20+ hours), then recognized folders I wanted to protect, went back to the Folders tab, added the protected folders (nested under the scanned folders, of course), and clicked the now-RED lock next to the subfolders, but (without rescanning for ? hours again) it still allows checks on all folders in the results tab.
DigitalVolcano
Site Admin
Posts: 1717
Joined: Thu Jun 09, 2011 10:04 am

Re: Feature requests.

Post by DigitalVolcano »

Changing settings in the Scan Location tab won't affect completed scan results. You need to right click on a folder in the results tab and select 'Protect folder tree'.