Eventually I have to load 35GB of data into an ArangoDB instance.
So far I've tried these approaches to load just 5GB of it (and failed):
- Loading via Gremlin: it worked, but it took something like 3 days; this is not an option.
- The bulkimport feature's `import` API endpoint, but I got the following error (a chunked-upload workaround I'm considering is sketched after this list):

  `...[1] WARNING maximal body size is 536870912, request body size is -2032123904`
- The `arangoimp` command (my rough invocation is also sketched after this list), but I ended up with two different errors:
  - With no/small `--batch-size` it fires `import file is too big. please increase the value of --batch-size`
  - With a bigger `--batch-size` it returns the same error as the bulkimport.
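To work around the 512MB body limit (536870912 bytes; the negative request body size in the warning looks like a 32-bit overflow on a body over 2GB), I'm considering splitting the dump and POSTing each piece to the import endpoint separately. A minimal sketch, assuming a JSON-lines file `data.jsonl` and a collection named `mycollection` (both placeholders for my real names):

```
# Split the dump into pieces small enough to stay under the 512MB body limit;
# 500000 lines per piece is a guess, tune it to the actual document size.
split -l 500000 data.jsonl chunk_

# POST each piece to the bulk import API; type=documents expects one JSON
# document per line. Host and credentials are placeholders.
for f in chunk_*; do
  curl --fail -X POST \
       --user root:password \
       --data-binary @"$f" \
       "http://localhost:8529/_api/import?type=documents&collection=mycollection"
done
```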
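For completeness, this is roughly the `arangoimp` invocation I've been using (file, collection and database names are placeholders; `--batch-size` is in bytes, as far as I understand):

```
# Fails with "import file is too big" at the default/small batch size,
# and with the 512MB body error once --batch-size is raised further.
arangoimp --file data.jsonl \
          --type json \
          --collection mycollection \
          --server.database mydb \
          --batch-size 134217728
```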
Could someone tell me how to fix these commands, or suggest another way to actually load this data?
Thanks
Edit for @DavidThomas, here are the specs:
- RAM: 128G
- CPU: 2x Intel(R) Xeon(R) CPU E5-2420 0 @ 1.90GHz
- OS: Linux (ubuntu) sneezy 3.13.0-86-generic
- HDD: classic spinning disk (non-SSD)