Unknown users in /home/gke on Google Compute Engine - google-compute-engine

I've noticed that new system users are being created on my Google Cloud instances.
The format is: /home/gke-xxxxxxxxxx
These users appear on Debian-based Linux instances running Google Container Engine.
For example:
root@node1:/home# ls -lh
total 24K
drwxr-xr-x 3 gke-34cf46593ebc10a5beb5 gke-34cf46593ebc10a5beb5 4.0K Sep 29 04:18 gke-34cf46593ebc10a5beb5
drwxr-xr-x 3 gke-b230f34ceeb7c905fdb6 gke-b230f34ceeb7c905fdb6 4.0K Sep 29 04:18 gke-b230f34ceeb7c905fdb6
root@node1:/etc# cat /etc/passwd | grep gke
gke-34cf46593ebc10a5beb5:x:1021:1022::/home/gke-34cf46593ebc10a5beb5:/bin/bash
gke-b230f34ceeb7c905fdb6:x:1022:1023::/home/gke-b230f34ceeb7c905fdb6:/bin/bash
root@node1:/etc# cat /etc/group | grep gke
adm:x:4:gke-34cf46593ebc10a5beb5,gke-b230f34ceeb7c905fdb6
dip:x:30:gke-34cf46593ebc10a5beb5,gke-b230f34ceeb7c905fdb6
video:x:44:gke-34cf46593ebc10a5beb5,gke-b230f34ceeb7c905fdb6
plugdev:x:46:gke-34cf46593ebc10a5beb5,gke-b230f34ceeb7c905fdb6
google-sudoers:x:1000:gke-34cf46593ebc10a5beb5,gke-b230f34ceeb7c905fdb6
gke-34cf46593ebc10a5beb5:x:1022:
gke-b230f34ceeb7c905fdb6:x:1023:
This is a snippet of the log from /var/log/auth.log:
Sep 29 04:18:57 node1 useradd[11226]: new group: name=gke-34cf46593ebc10a5beb5, GID=1022
Sep 29 04:18:57 node1 useradd[11226]: new user: name=gke-34cf46593ebc10a5beb5, UID=1021, GID=1022, home=/home/gke-34cf46593ebc10a5beb5, shell=/bin/bash
Sep 29 04:18:57 node1 usermod[11231]: add 'gke-34cf46593ebc10a5beb5' to group 'adm'
Sep 29 04:18:57 node1 usermod[11231]: add 'gke-34cf46593ebc10a5beb5' to group 'dip'
Sep 29 04:18:57 node1 usermod[11231]: add 'gke-34cf46593ebc10a5beb5' to group 'video'
Sep 29 04:18:57 node1 usermod[11231]: add 'gke-34cf46593ebc10a5beb5' to group 'plugdev'
Sep 29 04:18:57 node1 usermod[11231]: add 'gke-34cf46593ebc10a5beb5' to group 'google-sudoers'
Sep 29 04:18:57 node1 usermod[11231]: add 'gke-34cf46593ebc10a5beb5' to shadow group 'adm'
Sep 29 04:18:57 node1 usermod[11231]: add 'gke-34cf46593ebc10a5beb5' to shadow group 'dip'
Sep 29 04:18:57 node1 usermod[11231]: add 'gke-34cf46593ebc10a5beb5' to shadow group 'video'
Sep 29 04:18:57 node1 usermod[11231]: add 'gke-34cf46593ebc10a5beb5' to shadow group 'plugdev'
Sep 29 04:18:57 node1 usermod[11231]: add 'gke-34cf46593ebc10a5beb5' to shadow group 'google-sudoers'
Sep 29 04:18:57 node1 useradd[11236]: new group: name=gke-b230f34ceeb7c905fdb6, GID=1023
Sep 29 04:18:57 node1 useradd[11236]: new user: name=gke-b230f34ceeb7c905fdb6, UID=1022, GID=1023, home=/home/gke-b230f34ceeb7c905fdb6, shell=/bin/bash
Sep 29 04:18:57 node1 usermod[11241]: add 'gke-b230f34ceeb7c905fdb6' to group 'adm'
Sep 29 04:18:57 node1 usermod[11241]: add 'gke-b230f34ceeb7c905fdb6' to group 'dip'
Sep 29 04:18:57 node1 usermod[11241]: add 'gke-b230f34ceeb7c905fdb6' to group 'video'
Sep 29 04:18:57 node1 usermod[11241]: add 'gke-b230f34ceeb7c905fdb6' to group 'plugdev'
Sep 29 04:18:57 node1 usermod[11241]: add 'gke-b230f34ceeb7c905fdb6' to group 'google-sudoers'
Sep 29 04:18:57 node1 usermod[11241]: add 'gke-b230f34ceeb7c905fdb6' to shadow group 'adm'
Sep 29 04:18:57 node1 usermod[11241]: add 'gke-b230f34ceeb7c905fdb6' to shadow group 'dip'
Sep 29 04:18:57 node1 usermod[11241]: add 'gke-b230f34ceeb7c905fdb6' to shadow group 'video'
Sep 29 04:18:57 node1 usermod[11241]: add 'gke-b230f34ceeb7c905fdb6' to shadow group 'plugdev'
Sep 29 04:18:57 node1 usermod[11241]: add 'gke-b230f34ceeb7c905fdb6' to shadow group 'google-sudoers'
I suspect this is something internal to Google Cloud.
The firewall does not allow SSH connections from outside my authorized IPs.
Why do these users appear?
Thanks.

When a GKE cluster is created in your project, an SSH key associated with it is also added to the project metadata. These SSH keys can be displayed by going to your Google Cloud Console -> Compute Engine -> Metadata -> SSH keys.
Project-wide SSH keys, like the ones created during the deployment of GKE clusters, are propagated to all the instances in your project unless an instance is configured to use only instance-specific keys. The keys are copied into the home directory of each user on each VM (/home/user/.ssh). When you delete a GKE deployment, its SSH key is removed from the metadata, and keys removed from the metadata are also deleted from /home/user/.ssh/authorized_keys. Nevertheless, the users' home directories are not deleted on the VMs.
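To see which of these accounts remain on a node (the matching keys live in the project metadata, which you can inspect with `gcloud compute project-info describe`), a quick local check against /etc/passwd, sketched from the question's own grep:

```shell
# List gke-* accounts and their home directories, mirroring
# the "cat /etc/passwd | grep gke" output in the question.
awk -F: '$1 ~ /^gke-/ {print $1, $6}' /etc/passwd
```

Any account printed here whose key is no longer in the metadata is just a leftover home directory, as described above.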

Related

Unable to start mysql on EC2 instance, I keep receiving apparmor error

I am unable to start mysql on my EC2 instance. I receive the following errors:
Oct 28 05:21:08 ip-172-30-1-66 kernel: [ 841.115678] init: mysql main process (19029) terminated with status 7
Oct 28 05:21:08 ip-172-30-1-66 kernel: [ 841.115687] init: mysql main process ended, respawning
Oct 28 05:21:09 ip-172-30-1-66 kernel: [ 842.011098] init: mysql post-start process (19030) terminated with status 1
Oct 28 05:21:09 ip-172-30-1-66 kernel: [ 842.016704] type=1400 audit(1540704069.482:278): apparmor="STATUS" operation="profile_replace" profile="unconfined" name="/usr/sbin/mysqld" pid=19084 comm="apparmor_parser"
This is on an Ubuntu system:
# uname -a
Linux ip-172-30-1-66 3.13.0-91-generic #138-Ubuntu SMP Fri Jun 24 17:00:34 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux

Server-sent event received after 3-4 seconds

While reading about Server-Sent Events from this page, I got confused about the timing of events. Basically, the example shown has a PHP script sending the system time to the web page:
<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
$time = date('r');
echo "data: The server time is: {$time}\n\n";
flush();
?>
While the page receives and renders it:
if (typeof(EventSource) !== "undefined") {
    var source = new EventSource("demo_sse.php");
    source.onmessage = function(event) {
        document.getElementById("result").innerHTML += event.data + "<br>";
    };
} else {
    document.getElementById("result").innerHTML = "Sorry, your browser does not support server-sent events...";
}
My confusion is that the server seems to be sending messages every 3-4 seconds:
The server time is: Tue, 20 Sep 2016 11:55:12 -0400
The server time is: Tue, 20 Sep 2016 11:55:16 -0400
The server time is: Tue, 20 Sep 2016 11:55:20 -0400
The server time is: Tue, 20 Sep 2016 11:55:23 -0400
The server time is: Tue, 20 Sep 2016 11:55:28 -0400
The server time is: Tue, 20 Sep 2016 11:55:32 -0400
The server time is: Tue, 20 Sep 2016 11:55:35 -0400
The server time is: Tue, 20 Sep 2016 11:55:39 -0400
The server time is: Tue, 20 Sep 2016 11:55:43 -0400
The server time is: Tue, 20 Sep 2016 11:55:46 -0400
The server time is: Tue, 20 Sep 2016 11:55:50 -0400
The server time is: Tue, 20 Sep 2016 11:55:53 -0400
The server time is: Tue, 20 Sep 2016 11:55:57 -0400
The server time is: Tue, 20 Sep 2016 11:56:01 -0400
The server time is: Tue, 20 Sep 2016 11:56:04 -0400
The server time is: Tue, 20 Sep 2016 11:56:08 -0400
The server time is: Tue, 20 Sep 2016 11:56:12 -0400
The server time is: Tue, 20 Sep 2016 11:56:15 -0400
However, I don't see this delay either at the server end or at the client end. Is it the network lag between the website's server and my browser? Or is it something else?
The default retry interval is 3 seconds, as shown at http://www.html5rocks.com/en/tutorials/eventsource/basics/
Check the "Controlling the Reconnection-timeout" section at that link. Because the PHP script sends a single message and then exits, the browser reconnects after each retry interval, which is the 3-4 second gap you are seeing.
You can customize it by having the server send a line retry: 100
to enforce a retry interval of just 100 ms.
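The retry field is just another line in the event-stream body, sent before the data line. A sketch of the bytes the server would emit (here generated with printf; the timestamp format is illustrative, not the PHP script's exact output):

```shell
# Emit an SSE message preceded by a "retry:" field (value in milliseconds).
# A blank line terminates the event.
printf 'retry: 100\ndata: The server time is: %s\n\n' "$(date)"
```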

MongoDB mongoimport gives error when importing large json file

I have a large JSON-formatted file (254MB) which I am trying to import into MongoDB in the Cloud9 IDE using mongoimport. The JSON file consists of MongoDB documents, one per line, separated by newline characters. The file looks like this:
...
{"t":"1358836264","p":"1.33470"}
{"t":"1358836265","p":"1.33475"}
{"t":"1358836271","p":"1.33477"}
{"t":"1358836272","p":"1.33481"}
{"t":"1358836274","p":"1.33475"}
{"t":"1358836275","p":"1.33478"}
{"t":"1358836288","p":"1.33480"}
{"t":"1358836291","p":"1.33481"}
{"t":"1358836294","p":"1.33481"}
{"t":"1358836295","p":"1.33478"}
...
I have tried:
mongoimport -d woption -c eurusd eurusddata.json
This gave me the following output with error:
connected to: 127.0.0.1
Sun Oct 5 16:51:15.094 Progress: 1636800/26345472 6%
Sun Oct 5 16:51:15.094 49600 16533/second
Sun Oct 5 16:51:18.088 Progress: 3375900/26345472 12%
Sun Oct 5 16:51:18.088 102300 17050/second
Sun Oct 5 16:51:21.089 Progress: 4867500/26345472 18%
Sun Oct 5 16:51:21.089 147500 16388/second
Sun Oct 5 16:51:24.103 Progress: 7728600/26345472 29%
Sun Oct 5 16:51:24.103 234200 19516/second
Sun Oct 5 16:51:27.093 Progress: 10467600/26345472 39%
Sun Oct 5 16:51:27.093 317200 21146/second
Sun Oct 5 16:51:30.094 Progress: 13312200/26345472 50%
Sun Oct 5 16:51:30.094 403400 22411/second
Sun Oct 5 16:51:33.302 Progress: 16038000/26345472 60%
Sun Oct 5 16:51:33.303 486000 23142/second
Sun Oct 5 16:51:36.088 Progress: 17341500/26345472 65%
Sun Oct 5 16:51:36.088 525500 21895/second
Sun Oct 5 16:51:39.004 Progress: 18526200/26345472 70%
Sun Oct 5 16:51:39.004 561400 20792/second
Sun Oct 5 16:51:42.032 Progress: 19067400/26345472 72%
Sun Oct 5 16:51:42.032 577800 19260/second
Sun Oct 5 16:51:45.088 Progress: 20829600/26345472 79%
Sun Oct 5 16:51:45.088 631200 19127/second
Sun Oct 5 16:51:48.071 Progress: 23007600/26345472 87%
Sun Oct 5 16:51:48.071 697200 19366/second
Sun Oct 5 16:51:51.914 Progress: 23443200/26345472 88%
Sun Oct 5 16:51:51.914 710400 18215/second
Sun Oct 5 16:51:54.103 Progress: 23611500/26345472 89%
Sun Oct 5 16:51:54.104 715500 17035/second
Sun Oct 5 16:51:57.709 Progress: 23967900/26345472 90%
Sun Oct 5 16:51:57.709 726300 16140/second
Sun Oct 5 16:52:00.096 Progress: 24538800/26345472 93%
Sun Oct 5 16:52:00.096 743600 15491/second
Sun Oct 5 16:52:03.088 Progress: 25548600/26345472 96%
Sun Oct 5 16:52:03.088 774200 15180/second
Sun Oct 5 16:52:04.644 exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Expecting ':': offset:21
Sun Oct 5 16:52:04.644
Sun Oct 5 16:52:04.644 check 9 798347
Sun Oct 5 16:52:05.061 imported 798347 objects
Sun Oct 5 16:52:05.061 ERROR: encountered 1 error(s)
Then I tried adding --jsonArray to the end of the previous command:
mongoimport -d woption -c eurusd eurusddata.json --jsonArray
This gave me the following output with error:
connected to: 127.0.0.1
Sun Oct 5 16:54:20.343 exception:JSONArray file too large
Sun Oct 5 16:54:20.444 warning: log line attempted (16384k) over max size(10k), printing beginning and end ... {"t":"1357070402","p":"1.32041"}
{"t":"1357070424","p":"1.32040"}
{"t":"1357070447","p":"1.32038"}
{"t":"1357070457","p":"1.32034"}
{"t":"1357070463","p":"1.32039"}
{"t":"1357070464","p":"1.32038"}
{"t":"1357070470","p":"1.32034"}
{"t":"1357070485","p":"1.32035"}
{"t":"1357070491","p":"1.32046"}
{"t":"1357070520","p":"1.32050"}
{"t":"1357070522","p":"1.32049"}
{"t":"1357070535","p":"1.32050"}
{"t":"1357070581","p":"1.32049"}
{"t":"1357070582","p":"1.32050"}
{"t":"1357070587","p":"1.32035"}
{"t":"1357070589","p":"1.32034"}
{"t":"1357070593","p":"1.32033"}
{"t":"1357070594","p":"1.32024"}
{"t":"1357070595","p":"1.32025"}
{"t":"1357070599","p":"1.32009"}
{"t":"1357070602","p":"1.32024"}
{"t":"1357070630","p":"1.32023"}
{"t":"1357070637","p":"1.32025"}
{"t":"1357070656","p":"1.32023"}
{"t":"1357070691","p":"1.32025"}
{"t":"1357070702","p":"1.32027"}
{"t":"1357070703","p":"1.32028"}
{"t":"1357070706","p":"1.32027"}
{"t":"1357070707","p":"1.32026"}
{"t":"1357070709","p":"1.32027"}
{"t":"1357070710","p":"1.32026"}
{"t":"1357070721","p":"1.32027"}
{"t":"1357070723","p":"1.32026"}
{"t":"1357070761","p":"1.32025"}
{"t":"1357070767","p":"1.32027"}
{"t":"1357070768","p":"1.32025"}
{"t":"1357070784","p":"1.32026"}
{"t":"1357070798","p":"1.32027"}
{"t":"1357070799","p":"1.32026"}
{"t":"1357070804","p":"1.32036"}
{"t":"1357070819","p":"1.32034"}
{"t":"1357070842","p":"1.32035"}
{"t":"1357070880","p":"1.32030"}
{"t":"1357070881","p":"1.32035"}
{"t":"1357070966","p":"1.32053"}
{"t":"1357070967","p":"1.32063"}
{"t":"1357070973","p":"1.32048"}
{"t":"1357070974","p":"1.32046"}
{"t":"1357070977","p":"1.32066"}
{"t":"1357070978","p":"1.32065"}
{"t":"1357070984","p":"1.32064"}
{"t":"1357070987","p":"1.32063"}
{"t":"1357070988","p":"1.32064"}
{"t":"1357070993","p":"1.32065"}
{"t":"1357070997","p":"1.32065"}
{"t":"1357070998","p":"1.32063"}
{"t":"1357071000","p":"1.32064"}
{"t":"1357071003","p":"1.32064"}
{"t":"1357071011","p":"1.32065"}
{"t":"1357071012","p":"1.32069"}
{"t":"1357071013","p":"1.32069"}
{"t":"1357071023","p":"1.32068"}
{"t":"1357071024","p":"1.32068"}
{"t":"1357071026","p":"1.32072"}
{"t":"1357071027","p":"1.32070"}
{"t":"1357071028","p":"1.32069"}
{"t":"1357071047","p":"1.32068"}
{"t":"1357071048","p":"1.32069"}
{"t":"1357071070","p":"1.32072"}
{"t":"1357071071","p":"1.32069"}
{"t":"1357071077","p":"1.32068"}
{"t":"1357071078","p":"1.32069"}
{"t":"1357071085","p":"1.32068"}
{"t":"1357071086","p":"1.32069"}
{"t":"1357071095","p":"1.32072"}
{"t":"1357071096","p":"1.32069"}
{"t":"1357071097","p":"1.32070"}
{"t":"1357071104","p":"1.32069"}
{"t":"1357071105","p":"1.32069"}
{"t":"1357071114","p":"1.32068"}
{"t":"1357071125","p":"1.32068"}
{"t":"1357071132","p":"1.32072"}
{"t":"1357071133","p":"1.32068"}
{"t":"1357071134","p":"1.32069"}
{"t":"1357071136","p":"1.32043"}
{"t":"1357071137","p":"1.32049"}
{"t":"1357071139","p":"1.32052"}
{"t":"1357071140","p":"1.32058"}
{"t":"1357071156","p":"1.32052"}
{"t":"1357071163","p":"1.32050"}
{"t":"1357071174","p":"1.32052"}
{"t":"1357071176","p":"1.32055"}
{"t":"1357071178","p":"1.32053"}
{"t":"1357071185","p":"1.32054"}
{"t":"1357071187","p":"1.32053"}
{"t":"1357071189","p":"1.32025"}
{"t":"1357071190","p":"1.32023"}
{"t":"1357071191","p":"1.32045"}
{"t":"1357071192","p":"1.32044"}
{"t":"1357071193","p":"1.32045"}
{"t":"1357071197","p":"1.32044"}
{"t":"1357071198","p":"1.32043"}
{"t":"1357071199","p":"1.32004"}
{"t":"13570712 .......... ":"1358836127","p":"1.33455"}
{"t":"1358836128","p":"1.33454"}
{"t":"1358836129","p":"1.33455"}
{"t":"1358836130","p":"1.33458"}
{"t":"1358836131","p":"1.33463"}
{"t":"1358836132","p":"1.33462"}
{"t":"1358836133","p":"1.33460"}
{"t":"1358836134","p":"1.33459"}
{"t":"1358836135","p":"1.33464"}
{"t":"1358836137","p":"1.33463"}
{"t":"1358836139","p":"1.33461"}
{"t":"1358836140","p":"1.33462"}
{"t":"1358836142","p":"1.33463"}
{"t":"1358836143","p":"1.33459"}
{"t":"1358836144","p":"1.33461"}
{"t":"1358836146","p":"1.33462"}
{"t":"1358836147","p":"1.33466"}
{"t":"1358836148","p":"1.33473"}
{"t":"1358836149","p":"1.33475"}
{"t":"1358836151","p":"1.33479"}
{"t":"1358836152","p":"1.33484"}
{"t":"1358836153","p":"1.33492"}
{"t":"1358836154","p":"1.33488"}
{"t":"1358836155","p":"1.33492"}
{"t":"1358836156","p":"1.33489"}
{"t":"1358836157","p":"1.33482"}
{"t":"1358836158","p":"1.33483"}
{"t":"1358836159","p":"1.33480"}
{"t":"1358836160","p":"1.33479"}
{"t":"1358836161","p":"1.33482"}
{"t":"1358836162","p":"1.33481"}
{"t":"1358836163","p":"1.33481"}
{"t":"1358836166","p":"1.33480"}
{"t":"1358836167","p":"1.33480"}
{"t":"1358836168","p":"1.33475"}
{"t":"1358836169","p":"1.33454"}
{"t":"1358836170","p":"1.33454"}
{"t":"1358836172","p":"1.33459"}
{"t":"1358836173","p":"1.33456"}
{"t":"1358836174","p":"1.33455"}
{"t":"1358836175","p":"1.33457"}
{"t":"1358836176","p":"1.33459"}
{"t":"1358836177","p":"1.33460"}
{"t":"1358836178","p":"1.33462"}
{"t":"1358836179","p":"1.33458"}
{"t":"1358836180","p":"1.33459"}
{"t":"1358836181","p":"1.33456"}
{"t":"1358836182","p":"1.33458"}
{"t":"1358836183","p":"1.33457"}
{"t":"1358836184","p":"1.33459"}
{"t":"1358836185","p":"1.33460"}
{"t":"1358836186","p":"1.33467"}
{"t":"1358836189","p":"1.33466"}
{"t":"1358836190","p":"1.33472"}
{"t":"1358836191","p":"1.33470"}
{"t":"1358836192","p":"1.33470"}
{"t":"1358836193","p":"1.33469"}
{"t":"1358836194","p":"1.33465"}
{"t":"1358836195","p":"1.33466"}
{"t":"1358836196","p":"1.33461"}
{"t":"1358836197","p":"1.33460"}
{"t":"1358836198","p":"1.33466"}
{"t":"1358836199","p":"1.33465"}
{"t":"1358836200","p":"1.33466"}
{"t":"1358836201","p":"1.33458"}
{"t":"1358836202","p":"1.33457"}
{"t":"1358836203","p":"1.33453"}
{"t":"1358836204","p":"1.33454"}
{"t":"1358836205","p":"1.33456"}
{"t":"1358836210","p":"1.33455"}
{"t":"1358836211","p":"1.33450"}
{"t":"1358836214","p":"1.33449"}
{"t":"1358836216","p":"1.33451"}
{"t":"1358836217","p":"1.33452"}
{"t":"1358836218","p":"1.33457"}
{"t":"1358836220","p":"1.33458"}
{"t":"1358836221","p":"1.33457"}
{"t":"1358836222","p":"1.33456"}
{"t":"1358836223","p":"1.33457"}
{"t":"1358836226","p":"1.33460"}
{"t":"1358836229","p":"1.33459"}
{"t":"1358836230","p":"1.33458"}
{"t":"1358836236","p":"1.33461"}
{"t":"1358836237","p":"1.33466"}
{"t":"1358836240","p":"1.33467"}
{"t":"1358836241","p":"1.33467"}
{"t":"1358836242","p":"1.33463"}
{"t":"1358836243","p":"1.33466"}
{"t":"1358836244","p":"1.33467"}
{"t":"1358836245","p":"1.33468"}
{"t":"1358836250","p":"1.33468"}
{"t":"1358836256","p":"1.33469"}
{"t":"1358836263","p":"1.33472"}
{"t":"1358836264","p":"1.33470"}
{"t":"1358836265","p":"1.33475"}
{"t":"1358836271","p":"1.33477"}
{"t":"1358836272","p":"1.33481"}
{"t":"1358836274","p":"1.33475"}
{"t":"1358836275","p":"1.33478"}
{"t":"1358836288","p":"1.33480"}
{"t":"1358836291","p":"1.33481"}
{"t":"1358836294","p":"1.33481"}
{"t":"1358836295","p":"1.33478"}
{"t":"1358836296
Sun Oct 5 16:54:20.445 check 0 0
Sun Oct 5 16:54:20.445 imported 0 objects
Sun Oct 5 16:54:20.445 ERROR: encountered 1 error(s)
I then tried something like this:
mongoimport -d woption -c eurusd < /home/ubuntu/workspace/eurusddata.json
This gave me the following output with error:
connected to: 127.0.0.1
Sun Oct 5 16:58:08.107 77400 25800/second
Sun Oct 5 16:58:11.096 136300 22716/second
Sun Oct 5 16:58:14.107 177700 19744/second
Sun Oct 5 16:58:17.093 232900 19408/second
Sun Oct 5 16:58:20.125 280400 18693/second
Sun Oct 5 16:58:23.103 343500 19083/second
Sun Oct 5 16:58:26.091 424800 20228/second
Sun Oct 5 16:58:29.095 482800 20116/second
Sun Oct 5 16:58:32.097 517900 19181/second
Sun Oct 5 16:58:35.088 566700 18890/second
Sun Oct 5 16:58:38.091 608700 18445/second
Sun Oct 5 16:58:41.098 664700 18463/second
Sun Oct 5 16:58:44.088 718300 18417/second
Sun Oct 5 16:58:47.089 776800 18495/second
Sun Oct 5 16:58:47.848 exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Expecting ':': offset:21
Sun Oct 5 16:58:47.848
Sun Oct 5 16:58:47.848 check 9 798347
Sun Oct 5 16:58:48.852 imported 798347 objects
Sun Oct 5 16:58:48.852 ERROR: encountered 1 error(s)
I also tried the previous command with --jsonArray:
mongoimport -d woption -c eurusd < /home/ubuntu/workspace/eurusddata.json --jsonArray
This gave me the following output with error:
connected to: 127.0.0.1
Sun Oct 5 16:59:35.063 exception:JSONArray file too large
Sun Oct 5 16:59:35.101 warning: log line attempted (16384k) over max size(10k), printing beginning and end ... (same truncated document dump as in the first --jsonArray attempt above)
Sun Oct 5 16:59:35.101 check 0 0
Sun Oct 5 16:59:35.102 imported 0 objects
Sun Oct 5 16:59:35.102 ERROR: encountered 1 error(s)
What am I doing wrong or what am I missing here? How can I import large json files to MongoDB?
According to the message:
Sun Oct 5 16:52:03.088 Progress: 25548600/26345472 96%
Sun Oct 5 16:52:03.088 774200 15180/second
Sun Oct 5 16:52:04.644 exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Expecting ':': offset:21
I think there is a syntax mistake near the end of your import file.
For {"t":"1358836291","p":"1.33481"}, the offset of the second : is 21, so check the syntax of the documents near the end (96% ~ 100%) of the file.
By the way, --jsonArray is not appropriate here because it has a 16MB limit.
Change your command from mongoimport -d woption -c eurusd eurusddata.json --jsonArray to mongoimport -d woption -c eurusd --file eurusddata.json --jsonArray and this should work. Hope this helps; update if it doesn't.
Try the --batchSize 1 option, like this:
mongoimport -d woption -c eurusd eurusddata.json --batchSize 1
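Before re-running the import, it may help to find the malformed row itself. Since every document in this file has the same simple shape, a grep for lines that do not match it will flag the offender (the parser error "Expecting ':' ... offset:21" suggests a document near the end of the file whose second colon is missing or misplaced). This is a sketch that assumes all valid rows really do follow the {"t":"...","p":"..."} pattern shown in the question:

```shell
# Print (with line numbers) every line that does NOT match the
# expected one-document-per-line shape.
grep -vnE '^\{"t":"[0-9]+","p":"[0-9.]+"\}$' eurusddata.json
```

Fix or delete the reported lines, then re-run the plain newline-delimited import (without --jsonArray).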

SOLR DataImport Error "Unable to Execute Query"

I have a website running on an Amazon EC2 Instance, and I'm trying to get Solr to interface and work with the database I'm using. I'm able to use the admin interface and have gotten the example xml files indexed, but whenever I try to import one of my database tables, I get the error
SEVERE: Exception while processing: gamelydb document : SolrInputDocument[{}]:org.apache.solr.handler.dataimport.DataImportHandlerException:
Unable to execute query: SELECT * FROM league Processing Document # 1
Here's my data-config.xml file. I also added the dataimporthandler to the solrconfig.xml file.
<dataConfig>
  <dataSource type="JdbcDataSource"
              driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://www.mysite.com/mydb"
              user="root"
              password="mypassword"/>
  <document>
    <entity name="mydb"
            query="SELECT * FROM league">
      <field column="id" name="id" />
      <field column="leaguename" name="leaguename" />
    </entity>
  </document>
</dataConfig>
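For reference, a JDBC MySQL URL takes the form jdbc:mysql://host:port/database with no http:// scheme. A dataSource with an explicit host and port (the hostname, port, and credentials here are placeholders, not values from the question) would look like:

```xml
<dataSource type="JdbcDataSource"
            driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://db.example.com:3306/mydb"
            user="root"
            password="mypassword"/>
```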
Any idea why this is happening? And just let me know if I need to clarify anything.
Update: I changed the URL a bit, removing the 'http://', and that error seemed to go away. However, none of the information seems to be searchable through the admin. There are now a few files in the data directory (_1.fnm, _1.frq, ...). Here is what Solr prints to the log:
Jul 13, 2011 1:19:45 PM org.apache.solr.core.SolrCore execute
INFO: [] webapp=/solr path=/dataimport params={} status=0 QTime=4
Jul 13, 2011 1:19:48 PM org.apache.solr.core.SolrCore execute
INFO: [] webapp=/solr path=/dataimport params={command=full-import} status=0 QTime=4
Jul 13, 2011 1:19:48 PM org.apache.solr.handler.dataimport.DataImporter doFullImport
INFO: Starting Full Import
Jul 13, 2011 1:19:48 PM org.apache.solr.handler.dataimport.SolrWriter readIndexerProperties
INFO: Read dataimport.properties
Jul 13, 2011 1:19:48 PM org.apache.solr.update.DirectUpdateHandler2 deleteAll
INFO: [] REMOVING ALL DOCUMENTS FROM INDEX
Jul 13, 2011 1:19:48 PM org.apache.solr.core.SolrDeletionPolicy onInit
INFO: SolrDeletionPolicy.onInit: commits:num=1
commit{dir=/home/ec2-user/public_html/solr/example/solr/data/index,segFN=segments_2,version=1310405039852,generation=2,filenames=[_0.tis, _0.nrm, _0.fnm, _0.tii, _0.frq, segments_2, _0.fdx, _0.fdt]
Jul 13, 2011 1:19:48 PM org.apache.solr.core.SolrDeletionPolicy updateCommits
INFO: newest commit = 1310405039852
Jul 13, 2011 1:19:48 PM org.apache.solr.handler.dataimport.JdbcDataSource$1 call
INFO: Creating a connection for entity gamelydb with URL: jdbc:mysql://www.gamely.us/gamelydb
Jul 13, 2011 1:19:49 PM org.apache.solr.handler.dataimport.JdbcDataSource$1 call
INFO: Time taken for getConnection(): 667
Jul 13, 2011 1:19:49 PM org.apache.solr.handler.dataimport.DocBuilder finish
INFO: Import completed successfully
Jul 13, 2011 1:19:49 PM org.apache.solr.update.DirectUpdateHandler2 commit
INFO: start commit(optimize=true,waitFlush=false,waitSearcher=true,expungeDeletes=false)
Jul 13, 2011 1:19:49 PM org.apache.solr.core.SolrDeletionPolicy onCommit
INFO: SolrDeletionPolicy.onCommit: commits:num=2
commit{dir=/home/ec2-user/public_html/solr/example/solr/data/index,segFN=segments_2,version=1310405039852,generation=2,filenames=[_0.tis, _0.nrm, _0.fnm, _0.tii, _0.frq, segments_2, _0.fdx, _0.fdt]
commit{dir=/home/ec2-user/public_html/solr/example/solr/data/index,segFN=segments_3,version=1310405039855,generation=3,filenames=[_1.fdx, _1.tis, _1.frq, _1.fdt, _1.tii, _1.fnm, _1.nrm, segments_3]
Jul 13, 2011 1:19:49 PM org.apache.solr.core.SolrDeletionPolicy updateCommits
INFO: newest commit = 1310405039855
Jul 13, 2011 1:19:49 PM org.apache.solr.search.SolrIndexSearcher <init>
INFO: Opening Searcher#1c4795e main
Jul 13, 2011 1:19:49 PM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming Searcher#1c4795e main from Searcher#1d38b87 main
fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
Jul 13, 2011 1:19:49 PM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming result for Searcher#1c4795e main
fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
Jul 13, 2011 1:19:49 PM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming Searcher#1c4795e main from Searcher#1d38b87 main
filterCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
Jul 13, 2011 1:19:49 PM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming result for Searcher#1c4795e main
filterCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
Jul 13, 2011 1:19:49 PM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming Searcher#1c4795e main from Searcher#1d38b87 main
queryResultCache{lookups=0,hits=0,hitratio=0.00,inserts=1,evictions=0,size=1,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
Jul 13, 2011 1:19:49 PM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming result for Searcher#1c4795e main
queryResultCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
Jul 13, 2011 1:19:49 PM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming Searcher#1c4795e main from Searcher#1d38b87 main
documentCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
Jul 13, 2011 1:19:49 PM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming result for Searcher#1c4795e main
documentCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
Jul 13, 2011 1:19:49 PM org.apache.solr.core.QuerySenderListener newSearcher
INFO: QuerySenderListener sending requests to Searcher#1c4795e main
Jul 13, 2011 1:19:49 PM org.apache.solr.core.QuerySenderListener newSearcher
INFO: QuerySenderListener done.
Jul 13, 2011 1:19:49 PM org.apache.solr.update.DirectUpdateHandler2 commit
INFO: end_commit_flush
Jul 13, 2011 1:19:49 PM org.apache.solr.core.SolrCore registerSearcher
INFO: [] Registered new searcher Searcher#1c4795e main
Jul 13, 2011 1:19:49 PM org.apache.solr.search.SolrIndexSearcher close
INFO: Closing Searcher#1d38b87 main
fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
filterCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
queryResultCache{lookups=0,hits=0,hitratio=0.00,inserts=1,evictions=0,size=1,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
documentCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
Jul 13, 2011 1:19:49 PM org.apache.solr.handler.dataimport.SolrWriter readIndexerProperties
INFO: Read dataimport.properties
Jul 13, 2011 1:19:49 PM org.apache.solr.handler.dataimport.SolrWriter persist
INFO: Wrote last indexed time to /home/ec2-user/public_html/solr/example/solr/./conf/dataimport.properties
Jul 13, 2011 1:19:49 PM org.apache.solr.update.processor.LogUpdateProcessor finish
INFO: {deleteByQuery=*:*,add=[9, 10, 11, 12, 13, 14],optimize=} 0 4
Jul 13, 2011 1:19:49 PM org.apache.solr.handler.dataimport.DocBuilder execute
INFO: Time taken = 0:0:1.66
I think the problem is with the url parameter.
If the MySQL database is on the same machine, then use url="jdbc:mysql://localhost/mydb"
If it's on www.mysite.com, then use url="jdbc:mysql://www.mysite.com/mydb"
Also, your log files may have more details about the error - please go through the logs and post the relevant entries here.
I faced a similar problem. My database is on the same machine.
In data-config.xml, I changed the line:
url="jdbc:mysql://localhost/mydb"
to
url="jdbc:mysql://127.0.0.1/mydb"
and then things worked. Strange are the ways of Solr/Lucene!
You can try one of the following troubleshooting steps:
Use url="jdbc:mysql://localhost:3306/mydb" in case the MySQL database was installed on the same machine (3306 is MySQL's default port).
Check the Tomcat logs for errors (the catalina.out file).
Enable auditing at the database level and check its logs.
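Putting the suggestions above together, the dataSource entry in data-config.xml might look like the sketch below. The driver class, user, and password values are assumptions for illustration, not taken from the question:

```xml
<!-- data-config.xml: sketch of a JdbcDataSource entry for a local MySQL database.
     user/password are placeholder assumptions. -->
<dataConfig>
  <dataSource type="JdbcDataSource"
              driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://127.0.0.1:3306/mydb"
              user="solr_user"
              password="secret"/>
  <!-- document/entity definitions go here -->
</dataConfig>
```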

Array items sorting and editing with MXML/AS3 (a practical case)?

I have an array containing many items with the same names, like
CloudObserverCMSStub edited
CloudObserverCMSStub edited
CloudObserverCMSStub created
CloudObserverCMSStub2 edited
CloudObserverCMSStub2 edited
CloudObserverCMSStub2 created
and each item has an associated date, in this format:
Wed, 17 Mar 2010 22:32:09 GMT
Wed, 17 Mar 2010 22:32:07 GMT
Wed, 17 Mar 2010 22:32:02 GMT
Wed, 17 Mar 2010 22:31:02 GMT
Wed, 17 Mar 2010 21:32:02 GMT
Wed, 15 Mar 2009 22:32:02 GMT
I want to reduce them so that I keep only the latest entry for each name, in this format (without the "edited" or "created" suffixes):
CloudObserverCMSStub | Wed, 17 Mar 2010 22:32:09 GMT
CloudObserverCMSStub2 | Wed, 17 Mar 2010 22:31:02 GMT
So from, say, 6 items I want a new array of 2. How can I do such a thing?
You create an object, store the common names as the keys, and use the items as values. Then you compare the dates and replace the stored item if the new one is more recent. E.g.:
var obj:Object = {};
for each (var element:Object in array)
{
    // Keep this element if the name is new, or if its date
    // is more recent than the one already stored.
    if (obj[element.name] == null || element.date > obj[element.name].date)
    {
        obj[element.name] = element;
    }
}
// Note: if element.date is the raw string, convert it first,
// e.g. with Date.parse(element.date), so the comparison is chronological.
Then just enumerate all the elements in obj.
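The same idea in runnable JavaScript, for anyone who wants to try it outside Flex. The item names and date strings come from the question; the object/field names here are my own illustration:

```javascript
// Deduplicate entries by name, keeping only the most recent date per name.
function latestByName(items) {
  var latest = {};
  for (var i = 0; i < items.length; i++) {
    var item = items[i];
    var existing = latest[item.name];
    // Replace the stored entry when this one is more recent.
    if (!existing || Date.parse(item.date) > Date.parse(existing.date)) {
      latest[item.name] = item;
    }
  }
  // Collect the surviving entries into a new array.
  var result = [];
  for (var name in latest) {
    result.push(latest[name]);
  }
  return result;
}

var entries = [
  { name: "CloudObserverCMSStub",  date: "Wed, 17 Mar 2010 22:32:09 GMT" },
  { name: "CloudObserverCMSStub",  date: "Wed, 17 Mar 2010 22:32:07 GMT" },
  { name: "CloudObserverCMSStub2", date: "Wed, 17 Mar 2010 22:31:02 GMT" },
  { name: "CloudObserverCMSStub2", date: "Sun, 15 Mar 2009 22:32:02 GMT" }
];
console.log(latestByName(entries));
```

This collapses the 4 sample entries into 2, one per name, each carrying its most recent date.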