Unable to import data from CouchDB - json
I am trying to import data and then export it to a remote machine. Here is the schema of the database; it is just a document that I got in the form of JSON.
{"docs":[
{"id":"702decba698fea7df3fa46fdd9000fa4","key":"702decba698fea7df3fa46fdd9000fa4","value":{"rev":"1-f8c63611d5bc7354cac42d2a697ad57a"},"doc":{"_id":"702decba698fea7df3fa46fdd9000fa4","_rev":"1-f8c63611d5bc7354cac42d2a697ad57a","contributors":null,"truncated":false,"text":"RT #Whistlepodu4Csk: : First time since 1987 World Cup no Asian teams in the WC final\nThis had occurred in 1975, 1979, 1987 and now #CWC15\nā¦","in_reply_to_status_id":null,"in_reply_to_user_id":null,"id":583090814735155201,"favorite_count":0,"author":{"py/object":"tweepy.models.User","py/state":{"follow_request_sent":false,"profile_use_background_image":true,"profile_text_color":"333333","id":3102321084,"verified":false,"profile_location":null,"profile_image_url_https":"https://pbs.twimg.com/profile_images/579460416977252352/weSzVnPF_normal.jpg","profile_sidebar_fill_color":"DDEEF6","is_translator":false,"geo_enabled":false,"entities":{"description":{"urls":[]}},"followers_count":1,"profile_sidebar_border_color":"C0DEED","id_str":"3102321084","default_profile_image":false,"location":"Chennai","is_translation_enabled":false,"utc_offset":null,"statuses_count":9,"description":"12/11","friends_count":23,"profile_link_color":"0084B4","profile_image_url":"http://pbs.twimg.com/profile_images/579460416977252352/weSzVnPF_normal.jpg","notifications":false,"profile_background_image_url_https":"https://abs.twimg.com/images/themes/theme1/bg.png","profile_background_color":"C0DEED","profile_background_image_url":"http://abs.twimg.com/images/themes/theme1/bg.png","name":"charandevaa","lang":"en","profile_background_tile":false,"favourites_count":7,"screen_name":"charandevaarg","url":null,"created_at":{"py/object":"datetime.datetime","__reduce__":[{"py/type":"datetime.datetime"},["B98DFgEtLgAAAA=="]]},"contributors_enabled":false,"time_zone":null,"protected":false,"default_profile":true,"following":false,"listed_count":0}},"retweeted":false,"coordinates":null,"entities":{"symbols":[],"user_mentions":[{"indices":
[3,19],"id_str":"570379002","screen_name":"Whistlepodu4Csk","name":"Chennai Super Kings","id":570379002}],"hashtags":[{"indices":[132,138],"text":"CWC15"},{"indices":[139,140],"text":"IndvsAus"}],"urls":[]},"in_reply_to_screen_name":null,"id_str":"583090814735155201","retweet_count":9,"metadata":{"iso_language_code":"en","result_type":"recent"},"favorited":false,"retweeted_status":{"py/object":"tweepy.models.Status","py/state":{"contributors":null,"truncated":false,"text":": First time since 1987 World Cup no Asian teams in the WC final\nThis had occurred in 1975, 1979, 1987 and now #CWC15\n#IndvsAus\"","in_reply_to_status_id":null,"in_reply_to_user_id":null,"id":581059988317073409,"favorite_count":6,"author":{"py/object":"tweepy.models.User","py/state":{"follow_request_sent":false,"profile_use_background_image":true,"profile_text_color":"333333","id":570379002,"verified":false,"profile_location":null,"profile_image_url_https":"https://pbs.twimg.com/profile_images/460329225124188160/FgnIhlVM_normal.jpeg","profile_sidebar_fill_color":"DDEEF6","is_translator":false,"geo_enabled":false,"entities":{"url":{"urls":[{"indices":[0,22],"url":"http://t.co/Kx3erXpkEJ","expanded_url":"http://chennaisuperkings.com","display_url":"chennaisuperkings.com"}]},"description":{"urls":[{"indices":[138,160],"url":"http://t.co/yfitkkfz5D","expanded_url":"http://www.facebook.com/chennaisuperkingsofficialfansclub","display_url":"facebook.com/chennaisuperkiā¦"}]}},"followers_count":13604,"profile_sidebar_border_color":"000000","id_str":"570379002","default_profile_image":false,"location":"Chennai","is_translation_enabled":false,"utc_offset":19800,"statuses_count":13107,"description":"Chennai super kings fans club:All about Mahi, Raina,Mccullum,Aswhin,Bravo. 
Updates about Suriya: Beleive in CSK: Whistlepodu!Suriya Rocks http://t.co/yfitkkfz5D","friends_count":11962,"profile_link_color":"CCC200","profile_image_url":"http://pbs.twimg.com/profile_images/460329225124188160/FgnIhlVM_normal.jpeg","notifications":false,"profile_background_image_url_https":"https://pbs.twimg.com/profile_background_images/518467484358164480/yUXQYv3m.jpeg","profile_background_color":"FFF04D","profile_banner_url":"https://pbs.twimg.com/profile_banners/570379002/1370113848","profile_background_image_url":"http://pbs.twimg.com/profile_background_images/518467484358164480/yUXQYv3m.jpeg","name":"Chennai Super Kings","lang":"en","profile_background_tile":true,"favourites_count":283,"screen_name":"Whistlepodu4Csk","url":"http://t.co/Kx3erXpkEJ","created_at":{"py/object":"datetime.datetime","__reduce__":[{"py/type":"datetime.datetime"},["B9wFAxUWFAAAAA=="]]},"contributors_enabled":false,"time_zone":"Chennai","protected":false,"default_profile":false,"following":false,"listed_count":23}},"retweeted":false,"coordinates":null,"entities":{"symbols":[],"user_mentions":[],"hashtags":[{"indices":[111,117],"text":"CWC15"},{"indices":[118,127],"text":"IndvsAus"}],"urls":[]},"in_reply_to_screen_name":null,"id_str":"581059988317073409","retweet_count":9,"metadata":{"iso_language_code":"en","result_type":"recent"},"favorited":false,"source_url":"http://twitter.com/download/android","user":{"py/id":13},"geo":null,"in_reply_to_user_id_str":null,"lang":"en","created_at":{"py/object":"datetime.datetime","__reduce__":[{"py/type":"datetime.datetime"},["B98DGgsvMwAAAA=="]]},"in_reply_to_status_id_str":null,"place":null,"source":"Twitter for Android"}},"source_url":"http://www.twitter.com","user":{"py/id":1},"geo":null,"in_reply_to_user_id_str":null,"lang":"en","doc_type":"tweet","created_at":{"py/object":"datetime.datetime","__reduce__":[{"py/type":"datetime.datetime"},["B98EAQIRJgAAAA=="]]},"in_reply_to_status_id_str":null,"place":null,"source":"Twitter for Windows 
Phone"}}]}
Approach 1:
Here is the command:
curl -d @db.json -H "Content-Type: application/json" -X POST http://127.0.0.1:5984/cwc15/_bulk_docs
I get the following error:
{"error":"not_found","reason":"no_db_file"}
I did follow the post below before posting this problem:
http://stackoverflow.com/questions/26264647/couchdb-exported-file-wont-import-back-into-database
It did not help. The last answer I can see there is from way back in 2012, and it didn't get me any further. Could someone please help me out? It would be a life saver for me.
Approach 2:
curl -H 'Content-Type: application/json' -X POST http://localhost:5984/_replicate -d ' {"source": "http://example.com:5984/dbname/", "target": "http://localhost#:5984/dbname/"}'
I gave my source and the target I wanted to copy to. For the target I gave the IP address of that machine, followed by the port number and database name.
Got the error: "Connection Timedout"
Approach 3:
Exported the CouchDB database as a file named cwc15.couch.
Stored it on a flash drive.
Logged in as root and went to the location where the file is stored.
Command: cp cwc15.couch /var/lib/couchdb
I get this error:
Error:{{case_clause,{{badmatch,
{error,eaccess}},
[{couch_file,init,1,
[{file,"couch_file.erl"},{line,314}]},
{gen_server,init_it,6,
[{file,"gen_server.erl"},{line,304}]},
{proc_lib,init_p_do_apply,3,
[{file,"proc_lib.erl"},{line,
239}]}]}},
[{couch_server,handle_info,2,
[{file,"couch_server.erl"},{line,442}]},
{gen_server,handle_msg,5,
[{file,"gen_server.erl"},{line,604}]},
{proc_lib,init_p_do_apply,3,
[{file,"proc_lib.erl"},{line,239}]}]}
{gen_server,call,
[couch_server,
{open,<<"cwc15">>,
[{user_ctx,
{user_ctx,null,
[<<"_admin">>],
<<"{couch_httpd_auth,
default_authentication_handler}">>}}]},
infinity]}
{"error":"not_found","reason":"no_db_file"} means the database doesn't exist; you need to create it first. Also, don't use curl's -d option for uploading files: that argument sends data in text mode, while binary mode (-T or --data-binary) is what you really want. JSON is indeed a text format, but Unicode data may play a devil's role here.
The "Connection Timedout" error happened because the source or target database isn't reachable at the URLs you specified. I'm not sure what they really were, but localhost#:5984 doesn't look like a valid one. Also, you didn't create the target database here either, so the initial error may occur again.
The {error,eaccess} in your logs means bad file permissions, which you accidentally broke by copying the file as root. Follow the install instructions to restore the correct ownership (typically the couchdb user) and ensure that nothing else is broken.
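Putting the answer's fixes together: create the database first (e.g. curl -X PUT http://127.0.0.1:5984/cwc15), then POST the file with --data-binary. One more wrinkle with a dump shaped like the one above: _bulk_docs expects plain documents under "docs", and carrying over each document's _rev from the old database typically makes the import fail with conflicts. A minimal Python sketch (the function name and dump shape handling are my own, not from the answer) that unwraps the rows and strips _rev:

```python
import json

def dump_to_bulk_docs(dump: dict) -> dict:
    """Convert an _all_docs-style dump ({"rows": [...]}, or {"docs": [...]}
    as in the file above) into a _bulk_docs request body.
    Dropping _rev lets the docs import cleanly into a fresh database."""
    rows = dump.get("rows") or dump.get("docs") or []
    docs = []
    for row in rows:
        doc = dict(row.get("doc") or row)  # each row wraps the real doc
        doc.pop("_rev", None)              # revision belongs to the old db
        docs.append(doc)
    return {"docs": docs}

# Toy example shaped like the dump above (values shortened):
dump = {"docs": [{"id": "702dec", "key": "702dec",
                  "value": {"rev": "1-f8c6"},
                  "doc": {"_id": "702dec", "_rev": "1-f8c6", "text": "RT ..."}}]}
payload = dump_to_bulk_docs(dump)
print(json.dumps(payload))
```

Write the result to a file and POST it with curl --data-binary as the answer suggests.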
Related
AOSP How to verify OTA updates by their metadata
I'm building an OTA update for my custom Android 10 build as follows:
./build/make/tools/releasetools/ota_from_target_files \
  --output_metadata_path metadata.txt \
  target-files.zip \
  ota.zip
The resulting ota.zip can be applied by extracting the payload.bin and payload_properties.txt according to the Android documentation for update_engine_client.
update_engine_client --payload=file:///<wherever>/payload.bin \
  --update \
  --headers=<Contents of payload_properties.txt>
This all works, so I'm pretty sure from this result that I've created the OTA correctly. However, I'd like to be able to download just the metadata and verify that the payload can be applied before having the client download the entire payload. Looking at the update_engine_client --help options, it appears one can verify the metadata as follows:
update_engine_client --verify --metadata=<path to metadata.txt from above>
This is where I'm failing to achieve the desired result. I get an error saying it failed to parse the payload header. It's failing with kDownloadInvalidMetadataMagicString, which, when I read the source, appears to refer to the first 4 bytes of the metadata. Apparently the metadata.txt I created isn't what the verification tool expects. I'm hoping someone can point me in the right direction to either generate the metadata correctly or use the tool correctly.
Turns out the metadata generated by the OTA tool is in human-readable format. The verify method expects a binary file. That file is not part of the zip contents as a unique file; instead, it's prepended to payload.bin. So the first bytes of payload.bin are actually payload_metadata.bin, and those bytes will work correctly with the verify method of update_engine_client to determine if the payload is applicable. I'm extracting payload_metadata.bin in a makefile as follows:
$(DEST)/%.meta: $(DEST)/%.zip
	unzip $< -d /tmp META-INF/com/android/metadata
	python -c 'import re; meta=open("/tmp/META-INF/com/android/metadata").read(); \
	m=re.match(".*payload_metadata.bin:([0-9]*):([0-9]*)", meta); \
	s=int(m.groups()[0]); l=int(m.groups()[1]); \
	z=open("$<","rb").read(); \
	open("$@","wb").write(z[s:s+l])'
	rm -rf /tmp/META-INF
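The offset arithmetic in that one-liner is easy to get wrong, so here is the same extraction step as a standalone Python sketch (the function name is mine): it parses the payload_metadata.bin:<offset>:<length> entry from the OTA metadata property file and slices those bytes out of the archive.

```python
import re

def extract_payload_metadata(zip_bytes: bytes, metadata_text: str) -> bytes:
    """The OTA metadata property file records where payload_metadata.bin
    sits inside the zip as 'payload_metadata.bin:<offset>:<length>'.
    Return exactly those bytes of the archive."""
    m = re.search(r"payload_metadata\.bin:(\d+):(\d+)", metadata_text)
    if m is None:
        raise ValueError("no payload_metadata.bin entry in metadata")
    start, length = int(m.group(1)), int(m.group(2))
    return zip_bytes[start:start + length]

# Toy example: a fake 'zip' where the offsets point at b'META'
fake_zip = b"xxxxMETAyyyy"
meta = "ota-property-files=payload_metadata.bin:4:4,payload.bin:8:4"
print(extract_payload_metadata(fake_zip, meta))  # b'META'
```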
OpenShift Online API responds with single white space and HTTP 406 "not acceptable" to any request
According to the OpenShift docs, the following should return a result: curl -X GET https://openshift.redhat.com/broker/rest/api However, any call to the API actually just returns a single whitespace, including calls made with username & password. I've confirmed the issue from several different machines across the globe. What might be the reason?
Inspired by a similar question regarding Twitter, I added .json at the end of the URL and it worked: curl -X GET https://openshift.redhat.com/broker/rest/api.json It's kind of disappointing that RedHat never replied to any inquiry regarding this issue.
The issue is logged here: https://bugzilla.redhat.com/show_bug.cgi?id=1324208 You can add yourself to the "cc" if you would like to be notified as the fix gets rolled out to production. FYI, asking for help on a public forum and giving it only 4 hours for a response from the company seems a bit much. In the future it may be worth it for you to either check for open bugs, or to ask the company directly instead.
M2 - Decrypt Data Block command failed
I am facing a problem with DUKPT decryption. I am sending the Decrypt Data Block (M2) command as per the Thales HSM manual, but I am getting the error response: 0000M315. Please find the command below; could you help me figure out what's wrong with it?
COMMAND: 0000M20011009U3BEE6C2C1850D691299B843984177A9A609FFFFFF8500000600016200E0beb0297d81e42bf9e07b1948dfaba7f8f032622173f61d2bacf6f485fa0a9babaf58637184b5e459cbae55f2b53ff9c356e4817f2efa9d70e740b27e2e089ccf42fefa56ee38c58d49f89206f9709c31e7ec616767f7638e3f853dde45af94e7cdb06502017a16c44ab472c3ce03260e
Thanks, Nazir
It's working now; I just changed the data to upper case. Thanks, Nazir
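Since the fix here was simply sending the hex data in upper case, it's worth normalizing hex fields before the host command is assembled. A small Python helper for that (the function name and approach are mine, not from the answer):

```python
def normalize_hex(field: str) -> str:
    """Validate a hex field and return it upper-cased, so the
    assembled host command never carries lowercase hex digits."""
    field = field.strip()
    int(field, 16)  # raises ValueError on non-hex input
    return field.upper()

# e.g. the start of the encrypted data block from the question:
print(normalize_hex("beb0297d81e42bf9"))  # BEB0297D81E42BF9
```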
How to Use RCurl or RMongo via HTTP with Authentication and Self Signed SSL to Read in JSON Data
I am using R to write a program and perform some analyses. The data is being captured by an outside vendor with MongoDB in JSON format. They are providing it to me via a URI on port 443, which they want me to query using cURL. They have authentication in place and a self-signed SSL certificate. I can authenticate and dump the data via curl on Windows, but to create a sustainable long-term solution it all needs to be done within R. The vendor says RCurl "should" work, but they aren't providing any support, and they basically just don't like the idea of using RMongo and have no comment on it (but if we could make that work it would be awesome, in my opinion). I have the following packages loaded: ggplot2, DBI, rjson, RJSONIO (I sometimes don't load this one if I'm using rjson, or vice versa), RMongo, rstudio, RCurl.
The self-signed certificate caused issues even with curl, but those were resolved by editing settings in Ruby and then launching a cmd shell with Ruby and using curl that way. I'm not sure if the problems in R are related. When trying the RCurl route I end up with commands/errors like this:
x <- getURL("https://xxx.xx.xxx.xxx:443/db/_authenticate", userpwd="xxxx:xxxxx")
Error in function (type, msg, asError = TRUE) : couldn't connect to host
and when trying to use RMongo I'm even more clueless...
> mongo <- mongoDbConnect("xxx.xx.xxx.xxx")
username = "xxxx"
password = "xxxxxxxxxxxxx"
authenticated <- dbAuthenticate(mongo, username, password)
Feb 25, 2013 4:00:09 PM com.mongodb.DBTCPConnector fetchMaxBsonObjectSize
WARNING: Exception determining maxBSON size using 0
java.io.IOException: couldn't connect to [/127.0.0.1:27017] bc:java.net.ConnectException: Connection refused: connect
at com.mongodb.DBPort.open(DBPort.java:224)
at com.mongodb.DBPort.go(DBPort.java:101)
at com.mongodb.DBPort.go(DBPort.java:82)
at com.mongodb.DBPort.findOne(DBPort.java:142)
at com.mongodb.DBPort.runCommand(DBPort.java:151)
at com.mongodb.DBTCPConnector.fetchMaxBsonObjectSize(DBTCPConnector.java:429)
at com.mongodb.DBTCPConnector.checkMaster(DBTCPConnector.java:416)
at com.mongodb.DBTCPConnector.call(DBTCPConnector.java:193)
at com.mongodb.DBApiLayer$MyCollection._find(DBApiLayer.java:303)
at com.mongodb.DB.command(DB.java:159)
at com.mongodb.DB.command(DB.java:144)
at com.mongodb.DB._doauth(DB.java:503)
at com.mongodb.DB.authenticate(DB.java:440)
at rmongo.RMongo.dbAuthenticate(RMongo.scala:24)
Error in .jcall(rmongo.object@javaMongo, "Z", "dbAuthenticate", username, : com.mongodb.MongoException$Network: can't call something
Feb 25, 2013 4:00:10 PM com.mongodb.DBPortPool gotError
WARNING: emptying DBPortPool to 127.0.0.1:27017 b/c of error
java.io.IOException: couldn't connect to [/127.0.0.1:27017] bc:java.net.ConnectException: Connection refused: connect
at com.mongodb.DBPort._open(DBPort.java:224)
at com.mongodb.DBPort.go(DBPort.java:101)
at com.mongodb.DBPort.go(DBPort.java:82)
at com.mongodb.DBPort.call(DBPort.java:72)
at com.mongodb.DBTCPConnector.call(DBTCPConnector.java:202)
at com.mongodb.DBApiLayer$MyCollection.__find(DBApiLayer.java:303)
at com.mongodb.DB.command(DB.java:159)
at com.mongodb.DB.command(DB.java:144)
at com.mongodb.DB._doauth(DB.java:503)
at com.mongodb.DB.authenticate(DB.java:440)
at rmongo.RMongo.dbAuthenticate(RMongo.scala:24)
Any help would be greatly appreciated!
I had an issue in the past with RCurl where I needed to explicitly point it toward the security certificates to get it to work. I ended up needing something like this:
out <- postForm("https://url.org/api/", token="IMATOKEN", .opts=curlOptions(cainfo="C:/path/aaa.crt"))
I had manually exported the certificate I needed to get that working. Also, it kind of looks like you should be doing a POST request given that URI, not a GET. Try the postForm() command, maybe?
EDITED TO ADD: Okay, I think things might be a little clearer if we stepped back a second. Is your goal to get some file from a specific URL (basically doing a wget, but from within R)? Or is your goal to submit a form that subsequently returns the data you need? If you are just trying to get something that is behind basic (and also fairly INSECURE) HTTP authentication, you should do two things:
Tell your data provider to use a more secure option.
Use the getURL() option as shown (using the www.omegahat.org example you posted about):
getURL("http://www.omegahat.org/RCurl/testPassword/", .opts=list(userpwd="bob:welcome"))
OR
getURL("http://bob:welcome@www.omegahat.org/RCurl/testPassword/")
Now, if you need to submit a form to get the data, you would generally pass authentication tokens, etc., as parameters (so, in the example above, `token='.
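For comparison outside R, the same two ideas, passing basic-auth credentials explicitly and dealing with a self-signed certificate, look like this with the Python standard library (the URL and credentials below are placeholders, not the poster's real endpoint):

```python
import base64
import ssl
import urllib.request

def basic_auth_request(url: str, user: str, password: str) -> urllib.request.Request:
    """Build a GET request carrying an explicit HTTP Basic auth header,
    the stdlib analogue of RCurl's userpwd option."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Basic {token}")
    return req

# For a self-signed certificate you would also need a relaxed SSL context
# (or, better, point a default context at the server's CA file, as with
# RCurl's cainfo option):
ctx = ssl.create_default_context()
ctx.check_hostname = False          # must be disabled before CERT_NONE
ctx.verify_mode = ssl.CERT_NONE     # skips verification: test use only
# urllib.request.urlopen(basic_auth_request("https://host:443/db/_authenticate",
#                                           "user", "pass"), context=ctx)

req = basic_auth_request("https://example.org/api", "bob", "welcome")
print(req.get_header("Authorization"))
```

Disabling verification mirrors curl's -k flag; supplying the exported certificate via ssl.create_default_context(cafile=...) is the safer equivalent of the cainfo approach above.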
Why do I get DBIx "No such relationship" error on one of the two clone instances of perl Catalyst?
I replicated a Perl Catalyst web application on a new server, making sure Catalyst, MySQL, and all required Perl modules have the same versions across both servers. But I keep getting a strange DBIx error message when I try to log in on one server. This is the error:
[error] DBIx::Class::ResultSet::search(): No such relationship committee_members on Committee at /mnt/data/www/apps/org/script/../lib/org/Controller/Users.pm line 57
[debug] Response Code: 500; Content-Type: text/html; charset=utf-8; Content-Length: 204782
The relationship clearly exists in the database. Has anyone else had this issue? Any help is appreciated.
This was the problem: the module DBIx::Class::Schema::Loader was not up to date. The original versions of Catalyst and related modules worked fine with the code as it was (with the relationship/table nomenclature of olden times), but some modules, not all, were updated along the way, breaking backward compatibility with the module above. When it was updated too, the warnings were reported and the relationship names were automatically resolved (I still need to look into the details). It was able to resolve some backward-compatibility issues that were left out of the intermediate release. See http://metacpan.org/pod/DBIx::Class::Schema::Loader::Manual::UpgradingFromV4