Found a bug in 3.0

Hi there,
I ran a scan on roughly 800,000 files using the Duplicate Cleaner Pro 3.0.3 trial on Windows 7 64-bit, with the following settings:
- Regular Mode
- Ignore Content
- Same File Name Only
- Same Size
- Don't Scan System Files/Folders
- Don't Follow NTFS Mountpoints and Junctions
- Included *.*
- Any size
- Any Date
The scan included four partitions, but not my C:\ partition where my Windows install was.
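For anyone unfamiliar with that combination of settings: as I understand it, "Same File Name Only" plus "Same Size" with "Ignore Content" amounts to grouping files as duplicates purely on matching name and size, without ever reading file contents. A minimal sketch of that grouping logic in Python (the drive letters are hypothetical stand-ins for my four partitions):

    import os
    from collections import defaultdict

    # Hypothetical stand-ins for the four scanned partitions.
    roots = ["D:\\", "E:\\", "F:\\", "G:\\"]

    groups = defaultdict(list)
    for root in roots:
        for dirpath, dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    size = os.path.getsize(path)
                except OSError:
                    continue  # skip unreadable files
                # Key on (name, size) only; lower-case the name because
                # Windows file names are case-insensitive. Content is never read.
                groups[(name.lower(), size)].append(path)

    # Any key that collected more than one path is a duplicate group.
    dupes = {key: paths for key, paths in groups.items() if len(paths) > 1}
    print(len(dupes), "duplicate groups found")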
Unfortunately, when the scan finished, the following error message appeared and the GUI became unusable:
http://i.imgur.com/JmbXe.png
Out of curiosity, if this bug were to be fixed, would an update be released immediately or would I have to wait until the next planned release?
Thanks.
Re: Found a bug in 3.0
Was the error message blank, or have you hidden it?
If fixed, we'd put out an update straight away. Thanks for reporting this!
Re: Found a bug in 3.0
Sorry, I should've mentioned, that was the actual error. Entirely blank.
The only editing I did to the screenshot was to black-out my files.
[edit]: In an attempt to recreate the other error message I was getting yesterday, something along the lines of an "Out of Memory" error, I ran Duplicate Cleaner again. This time I left a drive off the scan, bringing it down to about 500k files.
This resulted in it locking up entirely at the end, with the "Scan Complete" window stuck open and everything unresponsive. In Task Manager I can see that it's using 0% CPU and 1,115MB of RAM.
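(If that memory is mostly per-file bookkeeping, 1,115MB over ~500k files works out to roughly 2.3KB per file, so my original 800,000-file scan would have needed on the order of 1.8GB, uncomfortably close to the 2GB that a 32-bit process gets by default.)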
Rob.
Re: Found a bug in 3.0
Not a good sign when it gets to really large datasets. I'm working with 6 disks and 12TB of data (admittedly I try to keep my scans to only 1 or 2 disks and maybe 2TB), but even so I could easily be in excess of 200,000 files.
Love the program so far but I think it might need a little code cleanup.
- DigitalVolcano
- Site Admin
Re: Found a bug in 3.0
We are still working on this; some people have problems around the 500,000-file mark. We hope to have an update out in the coming month, but for now you'll need to split your scans up.
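If it helps in the meantime, one rough way to split things up is to count the files under each top-level folder and pack folders into batches that stay under the problem threshold, then run one scan per batch. A quick sketch (Python; the 400,000 cap and folder names are just placeholders):

    import os

    CAP = 400000  # stay comfortably under the ~500,000-file mark

    def count_files(folder):
        # Rough recursive file count for one top-level folder.
        return sum(len(files) for _, _, files in os.walk(folder))

    def batch_folders(folders, cap=CAP):
        # Greedily pack folders into scan batches of roughly `cap` files
        # (a single oversized folder still gets a batch of its own).
        batches, current, total = [], [], 0
        for folder in folders:
            n = count_files(folder)
            if current and total + n > cap:
                batches.append(current)
                current, total = [], 0
            current.append(folder)
            total += n
        if current:
            batches.append(current)
        return batches

    for i, batch in enumerate(batch_folders(["D:\\data", "E:\\media"]), 1):
        print("Scan", i, ":", batch)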