MongoDB import of GeoJSON data fails

I'm trying to import some GeoJSON data into MongoDB. The entire file is about 24 MB, so in theory the 16 MB per-document limit shouldn't be exceeded, but it looks like the server is complaining about the size. I have tried the solutions offered here, but none of them works. I type the command:
mongoimport -d userdata -c countries < countries.geojson
and I get
2017-11-17T01:09:29.561+0400 connected to: localhost
2017-11-17T01:09:31.055+0400 num failures: 1
2017-11-17T01:09:31.055+0400 Failed: lost connection to server
2017-11-17T01:09:31.055+0400 imported 0 documents
and the mongod logs show (after backtrace):
2017-11-17T01:09:31.055+0400 I - [conn153] AssertionException handling request, closing client connection: 10334 BSONObj size: 17756597 (0x10EF1B5) is invalid. Size must be between 0 and 16793600(16MB) First element: insert: "countries"
2017-11-17T01:09:31.055+0400 I - [conn153] end connection 127.0.0.1:61806 (2 connections now open)
I have tried
mongoimport -d userdata -c countries < countries.geojson --batchSize 1
and
mongoimport -d userdata -c countries -j 4 < countries.geojson
based on other similar answers but got the same result, with the same response and logs.
Does anyone have a clue as to what's going on here? Should I break the GeoJSON into two files and give that a shot? I thought the 16 MB limit applied to individual documents, not to collections or collection imports.
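One likely explanation, consistent with the single 17 MB insert shown in the mongod log, is that the file is one big FeatureCollection object, so mongoimport sends the whole collection as a single document and --batchSize cannot help. A rough Python sketch of the usual workaround, writing each feature on its own line so every feature becomes its own document (the output file name is just an example):

import json

# Load the whole FeatureCollection (about 24 MB, so it fits in memory).
with open("countries.geojson") as src:
    collection = json.load(src)

# Write newline-delimited JSON: one feature per line, one document per feature.
with open("countries.ndjson", "w") as dst:
    for feature in collection.get("features", []):
        dst.write(json.dumps(feature) + "\n")

Then import the converted file as before: mongoimport -d userdata -c countries < countries.ndjson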

Related

How to read data from socket in Lua until no more data is available?

I can't manage to read the data from a LuaSocket socket. If I try to read more than the available data, the call blocks until the client decides to close the connection.
https://github.com/StringManolo/LuaServer/blob/main/tmpServer.lua#L216
line, errorStr = clientObj:receive("*a")
I'm using this command to test:
$ curl -X POST -d "a=b" http://localhost:1337 -v
I got the same problem when using Chrome to send a request to the Lua server.
I have tried reading byte by byte, line by line, reading everything at once, etc.

mongoimport fails due to invalid character in massive file, possibly an issue with the character encoding

When I run the following command:
mongoimport -v -d ntsb -c data xml_results.json --jsonArray
I get this error:
2020-07-15T22:51:41.267-0400 using write concern: &{majority false 0}
2020-07-15T22:51:41.270-0400 filesize: 68564556 bytes
2020-07-15T22:51:41.270-0400 using fields:
2020-07-15T22:51:41.270-0400 connected to: mongodb://localhost/
2020-07-15T22:51:41.270-0400 ns: ntsb.data
2020-07-15T22:51:41.271-0400 connected to node type: standalone
2020-07-15T22:51:41.271-0400 Failed: error processing document #1: invalid character '}' looking for beginning of object key string
2020-07-15T22:51:41.271-0400 0 document(s) imported successfully. 0 document(s) failed to import.
I have tried all the solutions suggested here and nothing worked. My JSON file is around 60 MB in size, so it would be really hard to go through it and find the bracket issue. Maybe it is a problem with the UTF-8 encoding? I take an XML file I downloaded from the internet and convert it into JSON with a Python script. When I try the --jsonArray flag, it gives the same error. Any ideas? Thanks!
It turns out there were a few unnecessary commas within this massive file. I was able to use Python's built-in JSON parsing to jump to the lines with errors and remove them manually. As far as I can tell, the invalid character had nothing to do with the } itself; the stray comma before it made the parser expect another value before the closing bracket.
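For anyone else hitting the same error, here is a minimal sketch of that approach, assuming the file name from the command above; the json module reports the line and column where parsing gave up:

import json

try:
    with open("xml_results.json", encoding="utf-8") as f:
        json.load(f)
    print("JSON parsed cleanly")
except json.JSONDecodeError as err:
    # lineno and colno point at the spot where parsing failed,
    # e.g. the '}' that followed a stray comma.
    print(f"{err.msg} at line {err.lineno}, column {err.colno}")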
After solving this, I was still unable to import successfully because the file was too large. The trick to get around this was to surround all the JSON objects with array brackets [] and use the following command:
mongoimport -v -d ntsb -c data xml_results.json --batchSize 1 --jsonArray
After a few seconds the data imported successfully into Mongo.

MongoDB: The handle is invalid

I am trying to import a JSON file into MongoDB using mongoimport. It throws the following error: Failed: error processing document #1: read C:\Users\mbryant2\Documents\primer-dataset.json: The handle is invalid.
Here is my cmd:
$ mongoimport --db tempTestDb --collection restaurants --drop --file C:/Users/mbryant2/Documents/primer-dataset.json
and response:
2018-09-14T12:17:36.337-0600 connected to: localhost
2018-09-14T12:17:36.338-0600 dropping: tempTestDb.restaurants
2018-09-14T12:17:36.339-0600 Failed: error processing document #1: read C:\Users\mbryant2\Documents\primer-dataset.json: The handle is invalid.
2018-09-14T12:17:36.339-0600 imported 0 documents
Does anyone have any ideas on what I am missing? Does it need login credentials or something like that?
If the data is represented as a JSON array, rather than individual lines of JSON text, you will need to add the --jsonArray parameter to mongoimport.
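If you are not sure which format the file uses, here is a rough heuristic sketch (reusing the path from the question) that peeks at the first line to decide whether --jsonArray is needed:

import json

path = "C:/Users/mbryant2/Documents/primer-dataset.json"

with open(path, encoding="utf-8") as f:
    first_line = f.readline().strip()

if first_line.startswith("["):
    print("Looks like one JSON array: add --jsonArray to mongoimport")
else:
    # A newline-delimited file has a complete JSON document on every line;
    # this raises json.JSONDecodeError if the first line is not one.
    json.loads(first_line)
    print("Looks like newline-delimited JSON: mongoimport's default mode applies")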

Couchbase: cbimport only importing the last row

Task: Importing a JSON file into a local Couchbase DB. This is from the tutorial CB110 on learn.couchbase.com.
Issue: Only the last row, out of thousands of JSON rows, gets imported.
Command:
$cbimport json -c couchbase://127.0.0.1 -u Administrator -p abcd -b couchmusic2 -f lines -d file://C:/Users/Deep_Kulshreshtha/Downloads/CB110-Data/couchmusic2-userprofiles.json -t 1 -g %type%::%username%
Result:
Json file://C:/Users/Deep_Kulshreshtha/Downloads/CB110-Data/couchmusic2-userprofiles.json imported to http://127.0.0.1:8091 successfully
Couchbase Admin Screen:
You can see that only a single row is imported. It is the last row of the import file, which contains 50,000 records.
Any pointers or help would be much appreciated! Thanks.
Deep
Got the solution.
Windows users: please use ^ as the escape character in your key-generation parameters.
Following worked for me:
-g %type%::^%username^%
Thanks.

Zabbix Trapper: Cannot get data from orabbix

I am using Orabbix to monitor my DB. The data from the queries executed on this DB by Orabbix is sent to the Zabbix server. However, I am not able to see the data reaching Zabbix.
On my zabbix web console, I see this message on the triggers added - "Trigger expression updated. No status update so far."
Any ideas?
My update interval for the trigger is set to 30 sec.
Based on the screenshots you posted, your host is named "wfc1dev1" and you have items with keys "WFC_WFS_SYS_001" and "WFC_WFS_SYS_002". However, based on the Orabbix XML that it sends to Zabbix, the hostname and item keys are different. Here is the XML:
<req><host>V0ZDMURFVg==</host><key>V0ZDX0xFQUZfU1lTXzAwMg==</key><data>MA==</data></req>
From this, we can deduce the host:
$ echo V0ZDMURFVg== | base64 -d
WFC1DEV
The key:
$ echo V0ZDX0xFQUZfU1lTXzAwMg== | base64 -d
WFC_LEAF_SYS_002
The data:
$ echo MA== | base64 -d
0
Neither the host name nor the item key matches what is configured on the Zabbix server. Once you fix that, it should work.