Details
- Type: Bug
- Resolution: Fixed
- Priority: Major
- Affects Versions: 7.1.0, 7.1.1, 7.1.2, 7.1.3, 7.1.4
- Triage: Untriaged
- 0
- Yes
Description
Workaround
Since we are still writing out completely valid JSON, the file can be pre-processed before re-importing using a tool such as jq.
$ cat test.json
{
  "key1": "value1"
}
{
  "key2": "value2"
}
$ cat test.json | jq -c | sponge test.json
$ cat test.json
{"key1":"value1"}
{"key2":"value2"}
What's the issue?
We have an optimization in cbexport which skips marshaling JSON unless required.
This means cbexport will potentially write out valid JSON data without re-encoding it, which is fine unless the original JSON data is pretty-printed.
When it is, we export a multi-line JSON document in what should be the newline-delimited format.
As a result, documents exported by cbexport may be incompatible with cbimport when using the same lines format.
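To illustrate the incompatibility, here is a small sketch (in Python, purely for demonstration; the tools themselves are not written in Python) of a reader for the newline-delimited "lines" format. A compact export parses cleanly, while a pretty-printed document, spread across several lines, fails on the very first line.

```python
import json

# Two exports of the same pair of documents: compact (the expected
# "lines" format) and pretty-printed (what the skipped re-marshal
# can pass through verbatim).
compact = '{"key1":"value1"}\n{"key2":"value2"}\n'
pretty = '{\n  "key1": "value1"\n}\n{\n  "key2": "value2"\n}\n'

def read_lines_format(data):
    """Parse newline-delimited JSON: exactly one document per line."""
    docs = []
    for line in data.splitlines():
        if line.strip():
            docs.append(json.loads(line))
    return docs

print(read_lines_format(compact))  # parses into two documents

try:
    read_lines_format(pretty)
except json.JSONDecodeError as exc:
    # The first line is just "{", which is not a complete JSON value.
    print("pretty-printed export is not valid lines format:", exc)
```

The jq workaround above works precisely because `jq -c` re-encodes each document onto a single compact line before cbimport reads it.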
Steps to reproduce
- Import the provided 'one.json' file using 'cbimport json --format list -c 172.20.1.1:8091 -u Administrator -p asdasd -d file://one.json -b beer-sample -g '#UUID#''
- Export the data using 'cbexport json -c 172.20.1.1 -u Administrator -p asdasd -b beer-sample -f lines -o two.json'
- You should see the export contains the document in pretty-printed format
- Repeat with the '--include-key key' flag
- You should see the export contains the document on one line
What's the fix?
We should remove this optimization (at least for the lines format) so that each document is always re-encoded onto a single line before being written.
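A minimal sketch of what removing the optimization amounts to, again in Python for illustration only (the function name and shape are assumptions, not the actual cbexport code): always re-marshal each document compactly before writing it, instead of passing the stored bytes through untouched.

```python
import json

def write_lines(raw_docs, out):
    """Write documents in newline-delimited JSON format.

    Each document is unconditionally decoded and re-encoded with
    compact separators, guaranteeing one document per output line
    regardless of how the value was stored (pretty-printed or not).
    """
    for raw in raw_docs:
        doc = json.loads(raw)  # accept the value however it was stored
        out.write(json.dumps(doc, separators=(",", ":")) + "\n")
```

With this approach, even a pretty-printed stored value such as `'{\n  "key1": "value1"\n}'` comes out as the single line `{"key1":"value1"}`, matching what cbimport expects for the lines format. The cost is the extra decode/encode pass the original optimization avoided.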