Hi,
Your program is very handy and I want to help improve it, so I have compiled some suggestions.
1. There is a small problem with removing empty folders: Duplicate Cleaner only removes directories it has emptied itself. This can leave behind empty parent directories, which the program keeps.
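The smarter behaviour could look something like the following sketch (a hypothetical helper, not Duplicate Cleaner's actual code): after deleting a folder's contents, walk upward and remove every parent that has become empty, stopping at the scan root.

```python
import os

def prune_empty_parents(path, stop_at):
    """After emptying `path`, remove it and every parent directory
    that becomes empty as a result, never going above `stop_at`.
    A minimal sketch; the real program's deletion logic is unknown."""
    current = os.path.abspath(path)
    stop_at = os.path.abspath(stop_at)
    while current != stop_at and not os.listdir(current):
        parent = os.path.dirname(current)
        os.rmdir(current)  # rmdir only succeeds on empty directories
        current = parent
```

Called with the deepest deleted folder and the scan root, this removes the whole now-empty chain instead of only the directly emptied directory.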
2. The warning message shown when you want to delete "protected" files could be improved. The first time I read it, I thought it had something to do with the OS file write protection. It could instead read something like: "The files you want to move/delete are protected according to the option settings."
3. During my tests of duplicate finder programs I came across a special situation. Imagine many thousands of directories with the following folder structure:
\Test1
....test_with_name_changed_1.css
....\Test1a
........test_with_name_changed_2.css
........other.file
....\Test1b
........\Test1aa
............test_with_name_changed_3.css
\Test2
....\Test2a
........test.css
........other.file
test.css, test_with_name_changed_1, test_with_name_changed_2, and
test_with_name_changed_3 are all the same file, and I ONLY
want to remove the direct copy of \Test2\Test2a\test.css inside \Test1,
which is probably \Test1\Test1a\test_with_name_changed_2.css.
There is no way to automatically mark only test_with_name_changed_2.css in Duplicate Cleaner, and with thousands of files it is a real pain to find such constellations by hand.
My first suggestion is to add an alternative "duplicate folder view", where the Cleaner compares whole folders by binary content instead of individual files.
This should be relatively simple if you treat a directory as one big special file in all calculations. The difficulty lies more in the visualization.
You could use a TreeView where the highest matching directory is the root and all nodes (files and folders) can be marked with checkboxes.
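The "directory as one big special file" idea could be sketched like this (a hypothetical helper, not DC's actual code): fingerprint a folder by the sorted content hashes of every file below it, so renamed copies such as test_with_name_changed_2.css still match test.css.

```python
import hashlib
import os

def dir_fingerprint(root):
    """Fingerprint a folder by the sorted content hashes of every file
    below it. File names and positions are ignored, so renamed
    duplicates in a different sub-layout still produce the same
    fingerprint. A sketch only; a real tool would hash in chunks."""
    digests = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            with open(os.path.join(dirpath, name), "rb") as f:
                digests.append(hashlib.sha256(f.read()).hexdigest())
    digests.sort()  # make the result independent of traversal order
    return hashlib.sha256("".join(digests).encode("ascii")).hexdigest()
```

Two folders with equal fingerprints could then be shown as one entry in the proposed duplicate folder view, with the TreeView expanding into their contents.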
My second suggestion is simpler, but only a partial solution:
improve the "Select by Master Path" selection mode by specially marking all groups that contain more files in a slave path than in the master path, or vice versa.
That way the user knows no direct one-to-one mapping is possible.
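The check behind that marking could be sketched as follows (an illustrative helper with hypothetical names, not the program's real API): count the copies in each duplicate group that fall under the master path and flag the groups where that count differs from the number of copies elsewhere.

```python
def flag_unbalanced_groups(groups, master_path):
    """Return the duplicate groups whose copy count under the master
    path differs from the count in the slave paths, i.e. the groups
    where no direct 1:1 mapping between master and slave copies exists.

    `groups` is a list of duplicate groups, each a list of file paths.
    A simple string-prefix check stands in for real path comparison.
    """
    flagged = []
    for group in groups:
        in_master = sum(1 for path in group if path.startswith(master_path))
        in_slaves = len(group) - in_master
        if in_master != in_slaves:
            flagged.append(group)
    return flagged
```

The flagged groups are exactly the ones the user would otherwise have to hunt for by hand.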
Greetings,
Daniel
Some suggestions
I tried to post this with Firefox and Opera, to no avail. How shall I proceed? I could send my comment through an anonymous email service (e.g. www.mytrashmail.com).
Responses!
1 - Thanks for letting me know. I'll see if I can make the directory remover a bit smarter in the next point update.
2 - Yes, this does seem to cause a bit of confusion. Will update the English version.
3 - Some good points. Directory comparison is a feature lacking in DC and would really take it to the next level. I hope to address this in the next major generation of DC (version 2.0).
Thanks for your input!