Portability makes it better. But that's to be expected, I'd say, isn't it? To obtain the best possible performance, at a logical level really, you will need to use the usual ReFS volume for your synthetic full backup (weekly or so), and then, once your synthetic file is created, that is the file you want to apply dedupe to.
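A rough sketch of that workflow, assuming the synthetic fulls land on a volume where Windows Server Data Deduplication is available (the drive letter, the "Backup" usage type, and driving PowerShell from Python are illustrative assumptions, not details taken from the comment above):

```python
# Hypothetical sketch: after the weekly synthetic full is written to the ReFS
# volume, run a Windows Data Deduplication optimization job on that volume.
import subprocess

def run_ps(command: str) -> None:
    """Run a PowerShell command and raise if it fails."""
    subprocess.run(["powershell", "-NoProfile", "-Command", command], check=True)

backup_volume = "E:"  # ReFS volume holding the synthetic fulls (assumed)

# One-time setup: enable dedup on the backup volume with the 'Backup' profile.
run_ps(f'Enable-DedupVolume -Volume "{backup_volume}" -UsageType Backup')

# After each synthetic full is created (e.g. weekly), optimize the volume so
# the new full file is deduplicated against blocks from the earlier ones.
run_ps(f'Start-DedupJob -Volume "{backup_volume}" -Type Optimization -Wait')
```

Running the optimization job after each synthetic full is one way to read the advice above: the freshly created full is then deduplicated against what is already stored on the volume.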
I've noticed that this utility also fools Task Manager. In the processes listing, I've got nothing running at more than 3%, and then only one thing. But in the performance tab, one of my cores is running at over 50%. When I shut down FastCopy, the "phantom" disappears. Maybe there's a more comprehensive task manager utility? Anyway, this utility is, on the whole, really good and very useful.
I've been using this to get some data back from bad DVDR media where the last 300MB generally produces CRC errors a few months after burning (pretty much ALL DVDR media!). First I run FastCopy at full speed; then, if a disc throws CRC errors, I slow it down. If I run FastCopy at 30% or less, it just makes the mouse "stupid" but doesn't kill it entirely (USB Wacom tablet). Using VNC to drive another computer doing this works fine, but running it directly on my main PC, the mouse is brain-dead the whole time and frequently goes into a coma, requiring that I unplug and re-plug the cable. I've always thought USB was really fecal technology - software-based and vulnerable, awkward plug, no latch, CPU-bound! I mean, we had a dedicated plug with its own register that DIDN'T ride on the processor, but nooooo.

It compares file size and date but NOT the actual content. For example, it can happen that you have a file with the same age and size but different/updated content (some applications are able to generate such containers, with constant dimension and an untouched last-modified date). After several years, a file like the one described above suffered a sort of corruption. Not a big problem, I thought, and I overwrote it with the backed-up copy. I discovered later that it had never been updated; the file was just as I had copied it the first time, in its original state, let's say. Also, since the file was overwritten, there was no way to recover even the faulty one. The author specifies the rules correctly: size and date. Furthermore, FC is able to verify the backed-up files' hash, but it doesn't verify in any way the actual files in the source/destination folder or partition. So, if you have such files, be careful and use another app, like WinMerge or similar.
It's fast and solid, but it doesn't check file content.
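To make the size-and-date point concrete, here is a minimal Python sketch (not FastCopy's own code; the paths are hypothetical) contrasting a size+timestamp check with a content hash:

```python
import hashlib
import os

def same_by_size_and_mtime(a: str, b: str) -> bool:
    """Comparison in the spirit of a size+date rule: cheap, but blind to content."""
    sa, sb = os.stat(a), os.stat(b)
    return sa.st_size == sb.st_size and int(sa.st_mtime) == int(sb.st_mtime)

def same_by_content(a: str, b: str, chunk: int = 1 << 20) -> bool:
    """Hash both files and compare digests; slower, but catches silent changes."""
    def digest(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()
    return digest(a) == digest(b)

# Example (paths are hypothetical): a container file whose size and modified
# date never change can fool the first check but not the second.
src, dst = r"C:\data\container.dat", r"D:\backup\container.dat"
if same_by_size_and_mtime(src, dst) and not same_by_content(src, dst):
    print("size/date say 'identical', but the bytes differ - re-copy or investigate")
```

The second check is what catches the "same age and size, different content" case described above; a tool like WinMerge performs a comparable content-level comparison.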