How can I solve "Model failed to converge with max|grad|"? - lme4

I'm using glmer from the lme4 package. My formula looks like this:
glmer1 <- glmer(answer ~ (1|subject) + target*major, data=data_identification_analysis, family=binomial)
Then I got this message:
Warning message:
In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv, :
Model failed to converge with max|grad| = 0.010091 (tol = 0.001, component 1)
My data set is here:
https://www.dropbox.com/s/lbbq5bnhpo1i17v/ask_help.xlsx?dl=0
Could anyone help me?


Lua script can't connect to MySQL database

I'm following the Lua part of this tutorial: http://wiki.dragino.com/index.php?title=Save_Data_to_MySQL. In particular, this code:
require "luasql.mysql"
env = assert (luasql.mysql())
con = assert (env:connect"nkt_development",'db_user','db_passwordL','172.31.10.60',3306)
Unfortunately I get an error that I can't fix:
lua: mysql_test.lua:7: attempt to index global 'luasql' (a nil value)
stack traceback:
mysql_test.lua:7: in main chunk
[C]: ?
I am working on a Dragino gateway / Arduino Yún.
I found the answer myself: instead of just requiring the library in the first line, I had to assign it to a variable so it can be used later in the code.
luasql = require "luasql.mysql"
This is what my final code looks like:
luasql = require "luasql.mysql"
value=arg[1]
current_time=os.date("%Y-%m-%d %H:%M:%S")
env = luasql.mysql()
con = assert (env:connect('development', 'DBUSER', 'PASSWORD', 'HOSTIP','3306'))
res = assert (con:execute('INSERT INTO record(time,value) VALUES("'..current_time..'",'..value..')'))

Failing to decrypt blob passwords only once in a while using Amazon KMS

import os, sys

AWS_DIRECTORY = '/home/jenkins/.aws'
certificates_folder = 'my_folder'
SUCCESS = 'success'

class AmazonKMS(object):
    def __init__(self):
        # making sure boto3 has the certificates and region files
        result = os.system('mkdir -p ' + AWS_DIRECTORY)
        self._check_os_result(result)
        result = os.system('cp ' + certificates_folder + 'kms_config ' + AWS_DIRECTORY + '/config')
        self._check_os_result(result)
        result = os.system('cp ' + certificates_folder + 'kms_credentials ' + AWS_DIRECTORY + '/credentials')
        self._check_os_result(result)
        # boto3 is the amazon client package
        import boto3
        self.kms_client = boto3.client('kms', region_name='us-east-1')
        self.global_key_alias = 'alias/global'
        self.global_key_id = None

    def _check_os_result(self, result):
        if result != 0 and raise_on_copy_error:
            raise FAILED_COPY

    def decrypt_text(self, encrypted_text):
        response = self.kms_client.decrypt(
            CiphertextBlob = encrypted_text
        )
        return response['Plaintext']
When using it:
    amazon_kms = AmazonKMS()
    amazon_kms.decrypt_text(blob_password)
I'm getting:
E ClientError: An error occurred (AccessDeniedException) when calling the Decrypt operation: The ciphertext refers to a customer master key that does not exist, does not exist in this region, or you are not allowed to access.
The stack trace is:
../keys_management/amazon_kms.py:77: in decrypt_text
CiphertextBlob = encrypted_text
/home/jenkins/.virtualenvs/global_tests/local/lib/python2.7/site-packages/botocore/client.py:253: in _api_call
return self._make_api_call(operation_name, kwargs)
/home/jenkins/.virtualenvs/global_tests/local/lib/python2.7/site-packages/botocore/client.py:557: in _make_api_call
raise error_class(parsed_response, operation_name)
This happens in a script that runs once an hour. It only fails 2-3 times a day, and after a retry it succeeds. I tried upgrading from boto3 1.2.3 to 1.4.4.
What is the possible cause of this behavior?
My guess is that the issue is not in anything you described here. Most likely the login tokens time out, or something along those lines. To investigate this further, a closer look at the way the login works here would probably help.
How does this code run? Is it running inside AWS, e.g. on Lambda or EC2? Do you run it from your own server (it looks like it runs on Jenkins)? How is the login access established? What are those kms_credentials used for, and what do they look like? Do you do something like assuming a role (which would probably work through access tokens that stop working after some time)?
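Since the questioner notes that a manual retry succeeds, one pragmatic mitigation while the root cause is investigated is to wrap the decrypt call in a retry with exponential backoff. The helper below is a generic sketch; the name `with_retries` and the backoff parameters are illustrative and not part of boto3:

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0, retriable=(Exception,)):
    """Call fn(); on a retriable exception, back off exponentially and try again."""
    for attempt in range(attempts):
        try:
            return fn()
        except retriable:
            if attempt == attempts - 1:
                raise  # out of attempts, re-raise the last error
            time.sleep(base_delay * (2 ** attempt))

# hypothetical usage with the class above:
# plaintext = with_retries(lambda: amazon_kms.decrypt_text(blob_password))
```

With boto3 you would likely pass `retriable=(ClientError,)` (from `botocore.exceptions`) so that only AWS-side errors are retried, not programming bugs.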

Using the LDAvis package in R to create a gist file of the result

I'm using LDAvis for topic modeling and trying to use the as.gist option to create a gist. When serVis executes there is a timeout in curl::curl_fetch_memory after about 10 seconds. If I immediately execute serVis again I get a different error 'Problems parsing JSON' and from then on whenever serVis is run that same error recurs.
If I start all over with a fresh workspace the same behavior occurs. The first time serVis is run, curl::curl_fetch_memory times out after about 10 seconds. Subsequent executions return 'Problems parsing JSON'.
If I don't use the as.gist option it works fine, but of course doesn't create a gist.
Very rarely, it works and a gist is created. If I change parameters to reduce the size of the JSON object it usually works, which makes me think it may be related to size.
I have explored the various RCurlOptions timeout settings. Currently they are set as follows:
options(RCurlOptions = list(cainfo = system.file("CurlSSL", "cacert.pem",
package = "RCurl"),
connecttimeout = 300, timeout = 3000,
followlocation = TRUE, dns.cache.timeout = 300))
Below is a console listing with debug set on curl::curl_fetch_memory.
> json <- createJSON(phi = cases$phi,
+ theta = cases$theta,
+ doc.len .... [TRUNCATED]
> serVis(json, open.browser = TRUE, as.gist = TRUE, description = 'APM Community')
debugging in: curl::curl_fetch_memory(url, handle = handle)
debug: {
output <- .Call(R_curl_fetch_memory, url, handle)
res <- handle_response_data(handle)
res$content <- output
res
}
Browse[2]> output <- .Call(R_curl_fetch_memory, url, handle)
Error: Timeout was reached
Browse[2]> output <- .Call(R_curl_fetch_memory, url, handle)
Browse[2]> rawToChar(output)
[1] "{\"message\":\"Problems parsing JSON\",\"documentation_url\":\"https://developer.github.com/v3\"}"
Browse[2]>
.
.
exiting from: curl::curl_fetch_memory(url, handle = handle)
Error: Problems parsing JSON
Any hints on how to debug this problem?

R/httr - errors when adding a progress bar for json request

When I make a json POST request I am trying to incorporate a progress bar to monitor progress. However, I only see the progress bar when the task has completed (it shows up at 100%) so that's not very useful...
I also get the following warnings:
> response <- POST(url,config=progress(),body=data2)
|==============================================================================================================================================================================================================================| 100%
There were 50 or more warnings (use warnings() to see the first 50)
> warnings()
Warning messages:
1: In curl::curl_fetch_memory(url, handle = handle) :
progress callback must return boolean
2: In curl::curl_fetch_memory(url, handle = handle) :
progress callback must return boolean
3: In curl::curl_fetch_memory(url, handle = handle) :
progress callback must return boolean
etc.
I found this conversation about httr on GitHub, which seems to imply the problem has been solved, but I'm still getting the warnings. Any ideas how I can make this progress bar work? Thanks!
Per a comment below I tried
devtools::install_github("hadley/httr")
And got the following error:
Reloading installed httr
Error in get(method, envir = home) :
lazy-load database 'C:/Users/.../R/win-library/3.2/httr/R/httr.rdb' is corrupt
In addition: Warning message:
In get(method, envir = home) : internal error -3 in R_decompress1
> response <- POST(url,config=progress(),body=data2)
Error: could not find function "POST"
> library(httr)
Error in get(method, envir = home) :
lazy-load database 'C:/Users.../R/win-library/3.2/httr/R/httr.rdb' is corrupt
And when I tried calling library(httr),
In addition: Warning messages:
1: In .registerS3method(fin[i, 1], fin[i, 2], fin[i, 3], fin[i, 4], :
restarting interrupted promise evaluation
2: In get(method, envir = home) :
restarting interrupted promise evaluation
3: In get(method, envir = home) : internal error -3 in R_decompress1
Error: package or namespace load failed for ‘httr’.

Error in fromJSON(paste(raw.data, collapse = "")) : unclosed string

I am using the R package rjson to download weather data from Wunderground.com. Often I leave the program running and there are no problems, with the data being collected fine. Sometimes, however, the program stops running and I get the following error message:
Error in fromJSON(paste(raw.data, collapse = "")) : unclosed string
In addition: Warning message:
In readLines(conn, n = -1L, ok = TRUE) :
incomplete final line found on 'http://api.wunderground.com/api/[my_API_code]/history_20121214pws:1/q/pws:IBIRMING7.json'
Does anyone know what this means, and how I can avoid it since it stops my program from collecting data as I would like?
Many thanks,
Ben
I can recreate your error message using the rjson package.
Here's an example that works.
rjson::fromJSON('{"x":"a string"}')
# $x
# [1] "a string"
If we omit the closing double quote from the value of x, then we get the same error message.
rjson::fromJSON('{"x":"a string}')
# Error in rjson::fromJSON("{\"x\":\"a string}") : unclosed string
The RJSONIO package behaves slightly differently. Rather than throwing an error, it silently returns a NULL value.
RJSONIO::fromJSON('{"x":"a string}')
# $x
# NULL
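The "incomplete final line" warning suggests the download was cut off mid-response, which is exactly what produces an unclosed string. The same failure mode is easy to reproduce outside R; here is a sketch using Python's standard json module, purely as an illustration (the sample JSON is made up):

```python
import json

complete = '{"temp_c": "7.2"}'
truncated = complete[:-4]  # simulates a download cut off mid-string: '{"temp_c": "7'

print(json.loads(complete))  # parses fine
try:
    json.loads(truncated)
except json.JSONDecodeError as exc:
    # the parser hits end-of-input while still inside a string literal
    print("parse failed:", exc.msg)
```

A common guard in the R script would be to wrap the readLines/fromJSON pair in tryCatch and re-request the URL when parsing fails, rather than letting the collection loop die.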