
Some suggestions

Posted: Mon Sep 21, 2009 5:27 pm
by Daniel
Hi,

Your program is very handy and I want to help improve it, so I have compiled some suggestions.

1. There is a little problem with removing empty folders:
Duplicate Cleaner only removes directories it has directly emptied itself.
This can leave behind newly empty parent directories, which the program keeps.
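
The fix could be a simple bottom-up pass over the scanned tree once the deleting is done. A rough Python sketch of the idea (just an illustration, not how Duplicate Cleaner works internally; prune_empty_dirs is a made-up name):

import os

def prune_empty_dirs(root):
    # Walk bottom-up so children are removed before their parents are checked.
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        if dirpath == root:
            continue  # never remove the scan root itself
        try:
            os.rmdir(dirpath)  # rmdir only succeeds if the directory is empty
        except OSError:
            pass  # not empty (or in use) - keep it

Because the walk is bottom-up, a parent that only becomes empty after its children are removed is caught in the same pass.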

2. The warning message shown when you try to delete "protected" files could be improved. The first time I read it, I thought it had something to do with the OS file write protection. It could instead read: "The files you want to move/delete are protected according to the option settings." Something like that.

3. During my tests of duplicate finder programs I came across a special situation.
Imagine many thousands of directories containing the following folder structure:

\Test1
....test_with_name_changed_1.css
....\Test1a
........test_with_name_changed_2.css
........other.file
....\Test1b
........\Test1aa
............test_with_name_changed_3.css
\Test2
....\Test2a
........test.css
........other.file

test.css, test_with_name_changed_1, test_with_name_changed_2 and
test_with_name_changed_3 are all the same file, and I ONLY
want to remove the direct copy of \Test2\Test2a\test.css in \Test1,
presumably \Test1\Test1a\test_with_name_changed_2.css.
There is no way to automatically mark only test_with_name_changed_2.css in Duplicate Cleaner, and if you have thousands of files it's a real pain to find such constellations.

My first suggestion is to add an alternative "duplicate folder view", where the Cleaner compares whole folders by binary content instead of individual files.
This should be relatively simple if you regard a directory as one big special file in all calculations; the difficulty lies more in the visualization.
You could use a TreeView where the highest matching directory is the root and all nodes (files and folders) can be marked with checkboxes.
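
To illustrate the "one big special file" idea, here is a rough Python sketch (mine, not the program's actual method; file_hash and folder_hash are made-up names): a folder's fingerprint is built from the content hashes of everything inside it, with names ignored and the children sorted, so two folders with identical contents match even if every file was renamed.

import hashlib, os

def file_hash(path):
    # Content hash of one file, read in chunks to keep memory flat.
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(65536), b''):
            h.update(chunk)
    return h.hexdigest()

def folder_hash(path):
    # A folder's fingerprint is the hash of its children's fingerprints,
    # sorted so that file names and listing order do not matter.
    child_hashes = []
    for entry in os.listdir(path):
        full = os.path.join(path, entry)
        if os.path.isdir(full):
            child_hashes.append(folder_hash(full))
        else:
            child_hashes.append(file_hash(full))
    h = hashlib.md5()
    for ch in sorted(child_hashes):
        h.update(ch.encode('ascii'))
    return h.hexdigest()

In my example above, folder_hash would return the same value for \Test1\Test1a and \Test2\Test2a, which is exactly the pair I want grouped together.
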
My second suggestion is simpler, but not a full solution:
improve the "Select by Master Path" selection mode by specially marking all groups that contain more files in a slave path than in the master path, or the other way around,
so the user knows that no direct mapping is possible.
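
Again just a rough sketch of the check I mean (Python; has_direct_mapping is a made-up name, and I assume the duplicate groups are already available as lists of file paths):

import os

def has_direct_mapping(group, master, slave):
    # group: one duplicate group, i.e. a list of paths to identical files.
    # A direct 1:1 mapping between master and slave is only possible if
    # both paths hold the same number of copies from the group.
    def under(path, root):
        prefix = os.path.normcase(os.path.join(root, ''))  # root with trailing separator
        return os.path.normcase(path).startswith(prefix)
    n_master = sum(1 for p in group if under(p, master))
    n_slave = sum(1 for p in group if under(p, slave))
    return n_master == n_slave

Every group where this returns False for some slave path would get the special marking.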

Greetings,
Daniel

Posted: Mon Sep 21, 2009 5:32 pm
by Daniel
Seems there is a character limitation

Posted: Mon Sep 21, 2009 5:36 pm
by Daniel
OK, last attempt

Posted: Mon Sep 21, 2009 5:46 pm
by Daniel
Tried to post with Firefox and Opera, to no avail.
How shall I proceed?
I could send my comment via an anonymous email service (e.g. www.mytrashmail.com).

Posted: Mon Sep 21, 2009 8:26 pm
by DV
Hi
Sorry you are having posting problems. The forum software is still in beta, so it has a few issues to iron out. If you email the comment to software AT digitalvolcano.co.uk, I'll have a look (and will try to post it/fix the forum).

thanks

Posted: Mon Sep 21, 2009 10:22 pm
by Daniel
Hi again,

I sent the comment via an anonymous remailer service, so it will take a while. Please post here if it hasn't arrived by tomorrow evening.

greets

Posted: Tue Sep 22, 2009 9:10 am
by DV
I've repaired the comment above. It had saved OK, but the text you pasted in contained tabs, which confused the data file when reading it back. I'll fix this issue at some point and will answer your comments later.

Posted: Tue Sep 22, 2009 11:33 am
by Daniel
The diagram lost some tabs during your repair. Please replace it with the one from my e-mail (it's without tabs).

Posted: Thu Sep 24, 2009 10:26 am
by DV
Responses!

1 - Thanks for letting me know. I'll see if I can make the directory remover a bit smarter in the next point update.

2 - Yes, this does seem to cause a bit of confusion. Will update the English version.

3 - Some good points. Directory comparison is a feature lacking in DC and would really take it to the next level. I hope to address this in the next major generation of DC (version 2.0).

Thanks for your input!