Found a bug in 3.0

robzyb
Posts: 2
Joined: Sun Feb 12, 2012 12:13 am

Found a bug in 3.0

Post by robzyb »

Hi there,

I did a scan on roughly 800,000 files, using Duplicate Cleaner Pro 3.0.3 trial on Windows 7 64-bit, with the following settings:

- Regular Mode
- Ignore Content
- Same File Name Only
- Same Size
- Don't Scan System Files/Folders
- Don't Follow NTFS Mountpoints and Junctions
- Included *.*
- Any size
- Any Date

The scan included four partitions, but not my C:\ partition where my Windows install was.

Unfortunately, when the scan was finished, the following error message appeared and the GUI was unusable...

http://i.imgur.com/JmbXe.png

Out of curiosity, if this bug were to be fixed, would an update be released immediately or would I have to wait until the next planned release?

Thanks.
DV
Posts: 78
Joined: Fri Jun 10, 2011 9:00 am

Re: Found a bug in 3.0

Post by DV »

Was the error message blank, or did you hide its contents in the screenshot?
If fixed, we'd put out an update straight away. Thanks for reporting this!
robzyb
Posts: 2
Joined: Sun Feb 12, 2012 12:13 am

Re: Found a bug in 3.0

Post by robzyb »

Sorry, I should've mentioned: that is the actual error message. It was entirely blank.

The only editing I did to the screenshot was to black-out my files.

[edit]: In an attempt to recreate the other error I was getting yesterday, something along the lines of an "Out of Memory" error, I ran Duplicate Cleaner again. This time I left one drive off the scan, bringing it down to about 500k files.

This resulted in it locking up entirely at the end, with the "Scan Complete" window stuck open and everything unresponsive. In Task Manager I can see that it's using 0% CPU and 1,115 MB of RAM.

Rob.
abrasion
Posts: 34
Joined: Sun Mar 18, 2012 9:16 am

Re: Found a bug in 3.0

Post by abrasion »

Not a good sign that it struggles with really large datasets. I'm working with 6 disks and 12 TB of data (admittedly I try to keep my scans to only 1 or 2 disks and maybe 2 TB of data), but even so I could easily be in excess of 200,000 files.

Love the program so far but I think it might need a little code cleanup.
DigitalVolcano
Site Admin
Posts: 1731
Joined: Thu Jun 09, 2011 10:04 am

Re: Found a bug in 3.0

Post by DigitalVolcano »

We are still working on this; some people run into problems around the 500,000-file mark. We hope to have an update out for it in the coming month, but for now you'll need to split your scans up.