Based on what I have seen in review comments, we need more detail in your answers. See below:
cbtransfer & CSVs: http://www.couchbase.com/issues/browse/MB-7102
Size limits for import/export
As far as I know, there is no hard limit.
> What are the limitations of using this given limited RAM, CPU and disk space on node/cluster/machine? What can break if you upload something 2x RAM on node, for instance?
>How should someone prepare their cluster and machines export/import in terms of RAM, CPU, Disk space, RAM quota, etc?
Performance consequences, performance compared to other methods
Since cbtransfer reuses the same transfer engine and code base as the other tools (cbbackup, cbrestore), there is essentially no performance difference compared to those methods.
Errors that can occur
1. When the CSV file is not well formatted.
>What is the exact error message that you will see?
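To make the "not well formatted" case concrete, here is a minimal pre-validation sketch in Python. It assumes the CSV uses the `id,flags,expiration,cas,value` column layout that cbtransfer's CSV export is understood to produce; that header, and the `validate_csv` helper itself, are assumptions for illustration, not part of the shipped tool. Verify the header against an actual export before relying on it.

```python
import csv
import io

# Assumed column layout of a cbtransfer CSV export; confirm this header
# against a real export before using it (illustration only).
EXPECTED_HEADER = ["id", "flags", "expiration", "cas", "value"]

def validate_csv(stream):
    """Return a list of problems found in a cbtransfer-style CSV stream."""
    problems = []
    reader = csv.reader(stream)
    header = next(reader, None)
    if header != EXPECTED_HEADER:
        problems.append("unexpected header: %r" % (header,))
    for lineno, row in enumerate(reader, start=2):
        if len(row) != len(EXPECTED_HEADER):
            problems.append("line %d: expected %d fields, got %d"
                            % (lineno, len(EXPECTED_HEADER), len(row)))
    return problems

# Example: a file whose third line is missing fields.
sample = io.StringIO(
    "id,flags,expiration,cas,value\n"
    'key1,0,0,123,{"n": 1}\n'
    "key2,0,0\n"
)
for problem in validate_csv(sample):
    print(problem)
```

Running a check like this before import gives a clearer message than whatever the tool prints, which is exactly the detail the reviewers are asking us to document.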
How to monitor
1. For CSV export, progress messages appear in the command window.
2. For CSV import, you can monitor from the admin console, as with the other restore tools.
> For import, where exactly in admin console do you need to look? What UI element do you look at? What should you see?
>For import and monitoring with 'other restore tools' which tools are these? What should you see?
When to use
1. When you need to transfer data between Couchbase and other database systems; CSV is a well-known interchange format.
Expected data formats
1. Standard CSV data format.
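For the documentation, it may help to show what "standard CSV data format" looks like in practice. The sketch below writes a small CSV using the `id,flags,expiration,cas,value` column layout that cbtransfer's export is assumed to use; the header and the sample rows are hypothetical and should be checked against a real export.

```python
import csv
import io

# Hypothetical rows in the column layout cbtransfer's CSV export is
# assumed to use; confirm the exact header against an actual export.
rows = [
    ("beer-1", 0, 0, 0, '{"name": "IPA"}'),
    ("beer-2", 0, 0, 0, '{"name": "Stout"}'),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["id", "flags", "expiration", "cas", "value"])
writer.writerows(rows)

print(buf.getvalue())
```

Note that Python's csv writer quotes any field containing quote characters and doubles embedded quotes, so JSON values survive a round trip through `csv.reader` intact.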
Sample of command running and output
A sample command run and its output are documented already.
Where should command be run from?
cbtransfer sits in the same bin directory as cbbackup and cbrestore.
>People want to know what machine you run it on (node in cluster, machine outside the cluster, etc.)
We will need this by COB Thursday, June 20 in order to add it to the documentation by release.