I’ve recently started using CrashPlan to back up a rather large file server. The backup was crashing repeatedly at around ~1.1 TB and ~300k files.
The error message we were seeing on our remote host was “target lost”, which led to many hours of troubleshooting disk performance and network connectivity. After attaching a “local” disk to the VM for local backups and waiting the ~12–14 hours for the initial backup to reach the same spot – and then fail – it became apparent that the problem was something more systemic.
Contacting CrashPlan support yielded this very helpful response:
CrashPlan Rep Response:
It appears that the CrashPlan backup engine is running out of memory.
Using Notepad or any other text editor run as an Administrator, edit the CrashPlan engine’s CrashPlanService.ini file to allow it to use more Java memory:
1. Stop the backup engine: http://support.crashplan.com/doku.php/how_to/stop_and_start_engine
2. Locate the Notepad program, right-click and Launch as Administrator
3. Go to File > Open, and navigate to C:\Program Files\CrashPlan\CrashPlanService.ini
4. Find the following line in the file:
-Xmx512m
5. Edit to something larger such as 640, 768, 896, or 1024. E.g.:
-Xmx1024m
This sets the maximum amount of memory that CrashPlan can use. CrashPlan will not use that much until it needs it. I would recommend starting by setting it to 768, and going higher only if you continue experiencing problems. You can set it as high as 2048 on 32-bit systems, or even higher on 64-bit systems.
6. Start the backup engine.
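If you’d rather not do the Notepad dance on every server, here’s a minimal Python sketch of the same edit (my own illustration, not anything CrashPlan ships). It assumes the default install path from the steps above and that the .ini contains exactly one -Xmx setting; run it from an elevated prompt with the backup engine stopped.

import re
from pathlib import Path

# Default install path from the support steps; adjust if CrashPlan lives elsewhere.
INI_PATH = Path(r"C:\Program Files\CrashPlan\CrashPlanService.ini")
NEW_HEAP_MB = 1024  # e.g. 640, 768, 896, or 1024, per the rep's suggestion

text = INI_PATH.read_text()
# Swap the existing -Xmx<N>m token for the new maximum heap size.
patched, count = re.subn(r"-Xmx\d+m", f"-Xmx{NEW_HEAP_MB}m", text)
if count != 1:
    raise SystemExit(f"Expected one -Xmx entry, found {count}; not writing changes.")
INI_PATH.write_text(patched)
print(f"Max heap set to {NEW_HEAP_MB} MB; start the backup engine to pick it up.")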
Outcome:
We set it to -Xmx1024m and also increased the VM’s memory allocation by 1GB. The server is running like a top and backups are consistently completing successfully.
Troubleshooting backups, especially multi-TB datasets, can be a huge pain since they take so long to redo and reproduce. Props to CrashPlan for getting back to me within two hours on our free trial, which has since been converted to their family unlimited plan for two years. *thumbs up*
–Nat
Nope. Still crashes…