Some suggestions

Daniel

Some suggestions

Post by Daniel »

Hi,

Your program is very handy and I want to help improve it, so I have compiled some suggestions.

1. There is a small problem with removing empty folders: Duplicate Cleaner only removes directories that it has emptied itself. This can leave behind empty parent directories, which the program keeps.
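To illustrate the desired behaviour, here is a minimal Python sketch (an assumption about what a smarter directory remover might do, not Duplicate Cleaner's actual code) that walks upward from a freshly emptied directory and also removes any parents that are now empty, stopping at the scan root:

```python
import os

def remove_empty_parents(path, stop_at):
    """After deleting files in `path`, remove it and any ancestor
    directories that are now empty, never going above `stop_at`.
    Illustrative sketch only, not Duplicate Cleaner's implementation."""
    path = os.path.abspath(path)
    stop_at = os.path.abspath(stop_at)
    while path != stop_at and not os.listdir(path):
        os.rmdir(path)                 # only succeeds on an empty directory
        path = os.path.dirname(path)   # climb one level and re-check
```

The loop stops as soon as it meets a non-empty directory or the scan root, so unrelated files are never touched.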

2. The warning message shown when you try to delete "protected" files could be improved. The first time I read it, I thought it referred to the operating system's file write protection. It could instead read something like: "The files you want to move/delete are protected according to the option settings."

3. During my tests of duplicate-finder programs I came across a special situation.
Imagine many thousands of directories containing the following folder structure:

\Test1
├── test_with_name_changed_1.css
├── \Test1a
│   ├── test_with_name_changed_2.css
│   └── other.file
└── \Test1b
    └── \Test1aa
        └── test_with_name_changed_3.css

\Test2
└── \Test2a
    ├── test.css
    └── other.file

test.css, test_with_name_changed_1, test_with_name_changed_2, and test_with_name_changed_3 are all the same file, and I ONLY want to remove the direct copy of \Test2\Test2a\test.css inside \Test1, presumably \Test1\Test1a\test_with_name_changed_2.css.
There is no way to automatically mark only test_with_name_changed_2.css in Duplicate Cleaner, and with thousands of files it is a real pain to find such constellations.

My first suggestion is to add an alternative "duplicate folder view", where the Cleaner compares whole folders by binary content instead of individual files.
This should be relatively simple if you treat a directory as one big special file in all calculations; the difficulty lies more in the visualization.
You could use a TreeView where the highest matching directory is the root and every node (file or folder) can be marked with a checkbox.
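The "directory as one big special file" idea could be sketched roughly as follows (a hypothetical Python illustration, not Duplicate Cleaner's algorithm): hash every file's bytes, sort the per-file hashes so that file names and positions inside the folder don't matter, and hash the sorted list. Two folders with identical binary content then share a digest even if every file was renamed:

```python
import hashlib
import os

def dir_digest(root):
    """Digest a directory tree by binary content only.
    Per-file SHA-256 hashes are sorted before combining, so renames
    and relocations within the tree do not change the result.
    Illustrative sketch only, not Duplicate Cleaner's algorithm."""
    hashes = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            with open(os.path.join(dirpath, name), "rb") as f:
                hashes.append(hashlib.sha256(f.read()).hexdigest())
    # Hash the sorted list of content hashes as the folder's fingerprint.
    return hashlib.sha256("".join(sorted(hashes)).encode()).hexdigest()
```

Folders with equal digests could then be grouped in the proposed "duplicate folder view" just as files are grouped today.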
My second suggestion is simpler, but it is not a full solution: improve the "Select by Master Path" selection mode by specially marking every group that has more files in a slave path than in the master path (or the other way around), so the user knows that no direct mapping is possible.
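As a rough illustration of that marking, a hypothetical helper could count how many copies of each duplicate group sit under the master path versus everywhere else, and flag the groups where the counts differ (simplified to a single prefix check; a real implementation would need proper path normalization and per-slave-path counts):

```python
def flag_unmappable_groups(groups, master_path):
    """Return the duplicate groups (lists of file paths) where the
    number of copies under `master_path` differs from the number
    elsewhere, i.e. where no one-to-one master/slave mapping exists.
    Hypothetical helper illustrating the suggestion, not DC's API."""
    flagged = []
    for group in groups:
        in_master = sum(1 for p in group if p.startswith(master_path))
        in_slaves = len(group) - in_master
        if in_master != in_slaves:   # unbalanced: mark for the user
            flagged.append(group)
    return flagged
```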

Greetings,
Daniel
Daniel

Post by Daniel »

It seems there is a character limit.
Daniel

Post by Daniel »

OK, last attempt
Daniel

Post by Daniel »

I tried to post with Firefox and Opera, to no avail.
How shall I proceed?
I could send my comment via an anonymous email service (e.g. www.mytrashmail.com).
DV

Post by DV »

Hi
Sorry you are having posting problems. The forum software is still in beta, so there are a few issues to iron out. If you email the comment to software AT digitalvolcano.co.uk, I'll have a look (and will try to post it and fix the forum).

thanks
Daniel

Post by Daniel »

Hi again,

I sent the comment via an anonymous remailer service, so it will take a while. Please post here if it doesn't arrive by tomorrow evening.

greets
DV

Post by DV »

I've repaired the comment above. It had saved OK, but the text you pasted in contained tabs, which confused the data file when it was read back. I will fix this issue at some point and answer your comments later.
Daniel

Post by Daniel »

The depiction lost some tabs during your repair. Please replace it with the one from my e-mail (it contains no tabs).
repost

Post by repost »

1. There is a small problem with removing empty folders: Duplicate Cleaner only removes directories that it has emptied itself. This can leave behind empty parent directories, which the program keeps.

2. The warning message shown when you try to delete "protected" files could be improved. The first time I read it, I thought it referred to the operating system's file write protection. It could instead read something like: "The files you want to move/delete are protected according to the option settings."

3. During my tests of duplicate-finder programs I came across a special situation.
Imagine many thousands of directories containing the following folder structure:

\Test1
....test_with_name_changed_1.css
....\Test1a
........test_with_name_changed_2.css
........other.file
....\Test1b
........\Test1aa
............test_with_name_changed_3.css
\Test2
....\Test2a
........test.css
........other.file

test.css, test_with_name_changed_1, test_with_name_changed_2, and test_with_name_changed_3 are all the same file, and I ONLY want to remove the direct copy of \Test2\Test2a\test.css inside \Test1, presumably \Test1\Test1a\test_with_name_changed_2.css.
There is no way to automatically mark only test_with_name_changed_2.css in Duplicate Cleaner, and with thousands of files it is a real pain to find such constellations.

My first suggestion is to add an alternative "duplicate folder view", where the Cleaner compares whole folders by binary content instead of individual files.
This should be relatively simple if you treat a directory as one big special file in all calculations; the difficulty lies more in the visualization.
You could use a TreeView where the highest matching directory is the root and every node (file or folder) can be marked with a checkbox.
My second suggestion is simpler, but it is not a real solution: improve the "Select by Master Path" selection mode by specially marking every group that has more files in a slave path than in the master path (or the other way around), so the user knows that no direct mapping is possible.
DV

Post by DV »

Responses!

1 - Thanks for letting me know. I'll see if I can make the directory remover a bit smarter in the next point update.

2 - Yes, this does seem to cause a bit of confusion. Will update the English version.

3 - Some good points. Directory comparison is a feature lacking in DC and would really take it to the next level. I hope to address this in the next major generation of DC (version 2.0).

Thanks for your input!
