Using the LDAvis package in R to create a gist file of the result - json

I'm using LDAvis for topic modeling and trying to use the as.gist option to create a gist. When serVis executes, there is a timeout in curl::curl_fetch_memory after about 10 seconds. If I immediately execute serVis again, I get a different error, 'Problems parsing JSON', and from then on that same error recurs whenever serVis is run.
If I start over with a fresh workspace, the same behavior occurs: the first time serVis is run, curl::curl_fetch_memory times out after about 10 seconds, and subsequent executions return 'Problems parsing JSON'.
If I don't use the as.gist option it works fine, but of course doesn't create a gist.
Very rarely, it works and a gist is created. If I change parameters to reduce the size of the JSON object it usually works, which makes me think it may be related to size.
I have explored the various RCurlOptions timeout settings. Currently, they are set as
options(RCurlOptions = list(
  cainfo = system.file("CurlSSL", "cacert.pem", package = "RCurl"),
  connecttimeout = 300,
  timeout = 3000,
  followlocation = TRUE,
  dns.cache.timeout = 300
))
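Note, though, that the traceback below is in curl::curl_fetch_memory, which belongs to the curl package used by httr (which gistr, and therefore serVis's as.gist path, builds on), and that stack ignores RCurlOptions entirely. A sketch of the equivalent timeout on the httr side, assuming the gist upload does go through httr (the 300-second value is just an example):
library(httr)
# Apply a longer request timeout to every request made through httr,
# including the gist creation triggered by serVis(..., as.gist = TRUE)
set_config(timeout(300))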
Below is a console listing with debug set on curl::curl_fetch_memory.
> json <- createJSON(phi = cases$phi,
+ theta = cases$theta,
+ doc.len .... [TRUNCATED]
> serVis(json, open.browser = TRUE, as.gist = TRUE, description = 'APM Community')
debugging in: curl::curl_fetch_memory(url, handle = handle)
debug: {
output <- .Call(R_curl_fetch_memory, url, handle)
res <- handle_response_data(handle)
res$content <- output
res
}
Browse[2]> output <- .Call(R_curl_fetch_memory, url, handle)
Error: Timeout was reached
Browse[2]> output <- .Call(R_curl_fetch_memory, url, handle)
Browse[2]> rawToChar(output)
[1] "{\"message\":\"Problems parsing JSON\",\"documentation_url\":\"https://developer.github.com/v3\"}"
Browse[2]>
.
.
exiting from: curl::curl_fetch_memory(url, handle = handle)
Error: Problems parsing JSON
Any hints on how to debug this problem?
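One way to isolate the failure (a sketch, assuming LDAvis hands gist creation to the gistr package, which the as.gist option is built on) is to write the JSON to a file and post it directly, so you can vary the payload size without re-running createJSON:
library(gistr)
# "lda.json" is a hypothetical file name; write the string returned by createJSON to disk
writeLines(json, "lda.json")
# post it as a gist directly, bypassing serVis
g <- gist_create(files = "lda.json", description = "APM Community", browse = FALSE)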

Related

MySQL Connection Error Handling with tryCatch

I have an R Shiny application that uses MySQL as a datasource. When the application loads and the user logs into the app with their username and password, a database connection interface opens up where the user inputs their MySQL credentials. To prevent the app from crashing when the user enters the wrong MySQL connection credentials, I am trying to use the following error handling.
# run when Load Data button is clicked
datapms <- eventReactive(input$pull_data, {
  req(input$db_user, input$db_password, input$db_name, input$db_host, input$db_port)
  progress <- Progress$new(session, min = 1, max = 15)
  on.exit(progress$close())
  progress$set(message = 'Pulling data from database',
               detail = 'This message will disappear once completed.')
  # establish a database connection
  tryCatch({
    con <- RMySQL::dbConnect(
      RMySQL::MySQL(),
      user = input$db_user,
      password = input$db_password,
      dbname = input$db_name,
      host = input$db_host
    )
  }, error = function(e) {
    debug_msg(e$message)
  })
  # construct the SQL statement
  sql <- "SELECT * FROM pmsanalytics;"
  # Fetch data
  pmsanalytics <- tryCatch({
    pmsanalytics <- dbGetQuery(conn = con, sql)
  }, error = function(e) {
    debug_msg(e$message)
  })
### display debugging message in R (if local)
### and in the console log (if running in shiny)
debug_msg <- function(...) {
  is_local <- Sys.getenv('SHINY_PORT') == ""
  in_shiny <- !is.null(shiny::getDefaultReactiveDomain())
  txt <- toString(list(...))
  if (is_local) message(txt)
  if (in_shiny) shinyjs::runjs(sprintf("console.debug(\"%s\")", text))
}
Initially this code was working, and the app was not crashing. Now, however, when for example one enters the wrong connection credentials, I get the following error message:
Warning: Error in as.character: cannot coerce type 'closure' to vector of type 'character'
138: sprintf
136: debug_msg [C:\PMSAnalytics/app.R#107]
135: value[[3L]] [C:\PMSAnalytics/app.R#211]
134: tryCatchOne
133: tryCatchList
132: tryCatch
131: eventReactiveValueFunc [C:\PMSAnalytics/app.R#202]
Basically, the app crashes because the data it expects never arrives; in other words, the reactive datapms() data source it depends on comes back empty.
Kindly assist in reviewing my code to prevent the app from crashing.
Regards,
Chris
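One thing worth checking: the traceback ends in sprintf inside debug_msg, and in the listing above that call interpolates text (base R's graphics function, i.e. a closure) instead of the local variable txt, which would produce exactly this coercion error. A corrected sketch of the helper:
debug_msg <- function(...) {
  is_local <- Sys.getenv('SHINY_PORT') == ""
  in_shiny <- !is.null(shiny::getDefaultReactiveDomain())
  txt <- toString(list(...))
  if (is_local) message(txt)
  if (in_shiny) shinyjs::runjs(sprintf("console.debug(\"%s\")", txt))  # txt, not text
}
Downstream consumers of datapms() can additionally guard with req(datapms()), so an empty result stops the reactive chain instead of crashing the app.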

tryCatch to Prevent R Shiny App Crashing on MySQL Connection Error

My Shiny app was crashing when wrong connection credentials were passed to the connection string. I then put my connection string inside a tryCatch as follows:
ConnectToDb <- function(){
  con <- tryCatch({
    dbConnect(MySQL(),
              user = input$db_user,
              password = input$db_password,
              dbname = input$db_name,
              host = input$db_host,
              port = input$db_port)
    print("Connection made")
    ####
    sql <- "SELECT * FROM PMSAnalytics;"
    data <- dbGetQuery(con, sql)
    # Disconnect from the DB
    dbDisconnect(con)
    # Convert to data.frame
    data <- data.frame(data)
    data$timestamp <- as_datetime(now())
    data
    ####
  }, error = function(e) {
    message('Please confirm your login details')
    print(e)
  },
  warning = function(w){
    message('A warning has occurred')
    print(w)
    return(NA)
  })
}
Now the application does not crash, but no error messages or warnings are shown when wrong credentials are used, and neither do I get a connection success message. I have checked this site for similar questions but cannot seem to find any. Kindly assist with polishing the code.
Regards,
Chris
I work with showNotification, which directly shows a notification in the Shiny UI; you could also use it for the connection success. Useful options are duration = 60 (show the notification for 60 seconds) and closeButton = FALSE.
For example, with the question's connection block wrapped in tryCatch:
con <- tryCatch({
  dbConnect(MySQL(), user = input$db_user, password = input$db_password,
            dbname = input$db_name, host = input$db_host, port = input$db_port)
}, error = function(e) {
  showNotification(paste0(e), type = 'error')
}, warning = function(w) {
  showNotification(paste0(w), type = 'warning')
  return(NA)
})
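To surface the success case in the UI as well, a sketch using the options mentioned above:
showNotification("Connection made", type = 'message',
                 duration = 60, closeButton = FALSE)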

ReadProcessMemory() doesn't read pages with specific AllocationProtect values

I'm building a memory scanner, and with some error handling I noticed that ReadProcessMemory() reads about 90% of the process's pages, but on the ones whose mbi.Protect value is 1 or 260 it fails with ERROR 299 (ERROR_PARTIAL_COPY) and the BytesRead output is 0.
I run it as admin, set the debug privilege, and open the process with PROCESS_VM_READ, but exactly these pages with mbi.Protect == 260 or 1 are unreadable. Is it normal that it can't read all pages, or am I doing something wrong? Here is the code. (To reproduce it, you also need the ctypes setup that I import into the main code: https://pastebin.com/hMxLej5k. Then open Python, import the code below, and call main(pid), where pid is the PID of the process you want to read.)
from ctypes import *
from ctypes import wintypes
import win32security
from setup_apis import *

def setDebugPriv():
    token_handle = wintypes.HANDLE()
    if not OpenProcessToken(
        GetCurrentProcess(),
        TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY,
        byref(token_handle),
    ):
        print("Error:", kernel32.GetLastError())
        return False
    # look up the locally unique ID of the debug privilege
    # (win32security.LookupPrivilegeValue raises on failure, so no return check is needed)
    luidvalue = win32security.LookupPrivilegeValue(None, win32security.SE_DEBUG_NAME)
    se_debug_name_value = LUID(luidvalue)  # local value of the debug privilege
    LAA = LUID_AND_ATTRIBUTES(
        se_debug_name_value,
        SE_PRIVILEGE_ENABLED,
    )
    tkp = TOKEN_PRIVILEGES(
        1,    # DWORD PrivilegeCount
        LAA,  # LUID_AND_ATTRIBUTES
    )
    if not AdjustTokenPrivileges(
        token_handle,
        False,
        byref(tkp),
        sizeof(tkp),
        None,
        None,
    ):
        print("Error:", kernel32.GetLastError())
        CloseHandle(token_handle)
        return False
    return True
#################################
def main(pid=None):
    setDebugPriv()
    process = OpenProcess(
        PROCESS_VM_READ | PROCESS_QUERY_INFORMATION,
        False,
        pid,
    )
    system_info = SYSTEM_INFO()
    GetSystemInfo(byref(system_info))
    MaxAppAddress = system_info.lpMaximumApplicationAdress  # field name as defined in setup_apis
    VirtualQueryEx = VirtualQueryEx64
    mbi = MEMORY_BASIC_INFORMATION64()
    memset(byref(mbi), 0, sizeof(mbi))
    Address = 0
    BytesRead = c_size_t(0)
    # walk the whole address space region by region
    while MaxAppAddress > Address:
        VirtualQueryEx(
            process,
            Address,
            byref(mbi),
            sizeof(mbi),
        )
        if mbi.State == MEM_COMMIT:
            try:
                ContentsBuffer = create_string_buffer(mbi.RegionSize)
            except MemoryError:
                # could not allocate a buffer this large; skip the region
                Address += mbi.RegionSize
                continue
            if not ReadProcessMemory(
                process,
                Address,
                ContentsBuffer,
                mbi.RegionSize,
                byref(BytesRead),
            ):
                print("Can't read, error: %i, protect state: %i" % (kernel32.GetLastError(), mbi.Protect))
                print("BytesRead:", BytesRead)
                Address += mbi.RegionSize
                continue
        Address += mbi.RegionSize
See the Memory Protection Constants (260 = 0x104 = PAGE_GUARD | PAGE_READWRITE; 1 = PAGE_NOACCESS). No-access and page-guard regions cause exceptions. You can't access a PAGE_NOACCESS page, and you don't want to fire PAGE_GUARD exceptions, as they are meant to warn a process that a stack needs to grow and commit more pages. Don't attempt to read them.
PAGE_NOACCESS (0x01): Disables all access to the committed region of pages. An attempt to read from, write to, or execute the committed region results in an access violation. This flag is not supported by the CreateFileMapping function.
PAGE_READWRITE (0x04): Enables read-only or read/write access to the committed region of pages. If Data Execution Prevention is enabled, attempting to execute code in the committed region results in an access violation.
PAGE_GUARD (0x100): Pages in the region become guard pages. Any attempt to access a guard page causes the system to raise a STATUS_GUARD_PAGE_VIOLATION exception and turn off the guard page status; guard pages thus act as a one-time access alarm. For more information, see Creating Guard Pages. When an access attempt leads the system to turn off guard page status, the underlying page protection takes over. If a guard page exception occurs during a system service, the service typically returns a failure status indicator. This value cannot be used with PAGE_NOACCESS. This flag is not supported by the CreateFileMapping function.
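A minimal way to fold that into the question's loop (a sketch; the constant values come from the table above, and MEM_COMMIT comes from the same setup_apis module the question imports):
PAGE_NOACCESS = 0x01
PAGE_GUARD = 0x100

def is_readable(mbi):
    # committed, and neither no-access nor guard-protected
    return mbi.State == MEM_COMMIT and not (mbi.Protect & (PAGE_NOACCESS | PAGE_GUARD))
In main(), call ReadProcessMemory only when is_readable(mbi) is true; otherwise just advance Address by mbi.RegionSize.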

Get only part of a JSON when using an API on NodeMCU

I am using http.get() to fetch JSON from an API, but it's not getting the data. I suspect that this JSON is too big for the NodeMCU. I only need the information in the subpart "stats:". Is it possible to http.get() only that part of the JSON?
EDIT:
This is my code:
function getstats()
    http.get("https://api.aeon-pool.com/v1/stats_address?address=WmsGUrXTR7sgKmHEqRNLgPLndWKSvjFXcd4soHnaxVjY3aBWW4kncTrRcBJJgUkeGwcHfzuZABk6XK6qAp8VmSci2AyGHcUit", nil, function(code, pool)
        if (code < 0) then
            print("can't get stats")
        else
            h = cjson.decode(pool)
            hashrate = h[1]["hashrate"]
            print(hashrate)
            dofile('update_display.lua')
        end
    end)
end
I also have another function above getstats() that gets data from a different API:
function getaeonrate()
    http.get("https://api.coinmarketcap.com/v1/ticker/aeon/?convert=EUR", nil, function(code, dataaeon)
        if (code < 0) then
            print("can't get aeon")
        else
            -- Decode JSON data
            m = cjson.decode(dataaeon)
            -- Extract AEON/EUR price from decoded JSON
            aeonrate = string.format("%f", m[1]["price_eur"]);
            aeonchange = "24h " .. m[1]["percent_change_24h"] .. "% 7d " .. m[1]["percent_change_7d"] .. "%"
            dofile('update_display.lua')
        end
    end)
end
But now the weird thing is: when I access 'pool' from getstats(), I get the JSON data from getaeonrate(). So "hashrate" isn't even in the JSON, because I am getting the JSON from the other function.
I tried making a new project with only getstats(), and that doesn't work at all; I always get errors like this:
HTTP client: Disconnected with error: -9
HTTP client: Connection timeout
HTTP client: Connection timeout
Yesterday I thought that the response from api.aeon-pool.com was too big. If you look at the JSON in your web browser, you can see that the top entry is 'stats:', and I only need that, none of the other stuff. So if the response is too big, it would be nice to http.get() only that part of the JSON, hence my original question. At the moment I am not even sure what is not working correctly. I read that the NodeMCU firmware generally had problems with http.get() and that it didn't work correctly for a long time, but getting data from api.coinmarketcap.com works fine in the original project.
The problems with the HTTP module are almost certainly related to https://github.com/nodemcu/nodemcu-firmware/issues/1707 (SSL and HTTP are problematic).
Therefore, I tried the more bare-bones TLS module on the current master branch. This means you need to manually parse the HTTP response, including all headers, looking for the JSON content. Besides, you seem to be on an older NodeMCU version, as you're still using CJSON; I used SJSON below.
Current NodeMCU master branch
function getstats()
    buffer = nil
    counter = 0
    local srv = tls.createConnection()
    srv:on("receive", function(sck, payload)
        print("[stats] received data, " .. string.len(payload))
        if buffer == nil then
            buffer = payload
        else
            buffer = buffer .. payload
        end
        counter = counter + 1
        -- not getting HTTP content-length header back -> poor man's checking for complete response
        if counter == 2 then
            print("[stats] done, processing payload")
            local beginJsonString = buffer:find("{")
            local jsonString = buffer:sub(beginJsonString)
            local hashrate = sjson.decode(jsonString)["stats"]["hashrate"]
            print("[stats] hashrate from aeon-pool.com: " .. hashrate)
        end
    end)
    srv:on("connection", function(sck, c)
        sck:send("GET /v1/stats_address?address=WmsGUrXTR7sgKmHEqRNLgPLndWKSvjFXcd4soHnaxVjY3aBWW4kncTrRcBJJgUkeGwcHfzuZABk6XK6qAp8VmSci2AyGHcUit HTTP/1.1\r\nHost: api.aeon-pool.com\r\nConnection: close\r\nAccept: */*\r\n\r\n")
    end)
    srv:connect(443, "api.aeon-pool.com")
end
Note that the receive event is fired for every network frame: https://nodemcu.readthedocs.io/en/latest/en/modules/net/#netsocketon
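Since counting frames is fragile, a more robust variant (a sketch, assuming the socket's disconnection event documented in the same socket API linked above) is to accumulate in receive and parse once the server closes the connection, which it will here because the request sends Connection: close:
srv:on("receive", function(sck, payload)
    buffer = (buffer or "") .. payload
end)
srv:on("disconnection", function(sck, err)
    -- server closed the connection; the full response is now buffered
    local beginJsonString = buffer:find("{")
    local stats = sjson.decode(buffer:sub(beginJsonString))["stats"]
    print("[stats] hashrate from aeon-pool.com: " .. stats["hashrate"])
end)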
NodeMCU fails to establish a connection to api.coinmarketcap.com due to a TLS handshake failure; I'm not sure why that is. Otherwise, your getaeonrate() could be implemented likewise.
Frozen 1.5.4 branch
With the old branch the net module can connect to coinmarketcap.com.
function getaeonrate()
    local srv = net.createConnection(net.TCP, 1)
    srv:on("receive", function(sck, payload)
        print("[aeon rate] received data, " .. string.len(payload))
        local beginJsonString = payload:find("%[")
        local jsonString = payload:sub(beginJsonString)
        local json = cjson.decode(jsonString)
        local aeonrate = string.format("%f", json[1]["price_eur"]);
        local aeonchange = "24h " .. json[1]["percent_change_24h"] .. "% 7d " .. json[1]["percent_change_7d"] .. "%"
        print("[aeon rate] aeonrate from coinmarketcap.com: " .. aeonrate)
        print("[aeon rate] aeonchange from coinmarketcap.com: " .. aeonchange)
    end)
    srv:on("connection", function(sck, c)
        sck:send("GET /v1/ticker/aeon/?convert=EUR HTTP/1.1\r\nHost: api.coinmarketcap.com\r\nConnection: close\r\nAccept: */*\r\n\r\n")
    end)
    srv:connect(443, "api.coinmarketcap.com")
end
Conclusion
The HTTP module and TLS seem a no-go for your APIs due to a bug in the firmware (issue 1707).
The net/TLS module of the current master branch manages to connect to api.aeon-pool.com but not to api.coinmarketcap.com.
With the old, frozen 1.5.4 branch it's exactly the other way around.
There may (also) be issues with cipher suites that don't match between the firmware and the API provider(s).
-> :( no fun like that

I am getting this error in my R code: "Error in curl::curl_fetch_memory(url, handle = handle) : Failure when receiving data from the peer"

I wrote code in R to get data from BigQuery and write it into MySQL. My code had been working properly for the last few months, but now I am getting this error:
Error in curl::curl_fetch_memory(url, handle = handle) :
Failure when receiving data from the peer
It writes incomplete data into MySQL. I had written this code:
sql <- 'SELECT * FROM [XXXXXX-XXXXXX:catalog_output.catalog_dynamic];'
catalog_dynamic <- query_exec(sql, project,
                              destination_table = "XXXXXXXX.XXXXXX",
                              default_dataset = NULL,
                              page_size = 100,
                              max_pages = Inf,
                              warn = TRUE,
                              create_disposition = "CREATE_IF_NEEDED",
                              write_disposition = "WRITE_TRUNCATE")
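"Failure when receiving data from the peer" usually means the connection dropped mid-transfer, so one pragmatic mitigation (a sketch, not from the thread; it wraps the same query_exec() call as above, and the back-off timing is arbitrary) is to retry the extract before writing anything to MySQL:
# Retry the BigQuery extract a few times before giving up,
# so a transient network drop doesn't leave MySQL half-written
run_extract <- function(attempts = 3) {
  for (i in seq_len(attempts)) {
    result <- tryCatch(
      query_exec(sql, project, page_size = 100, max_pages = Inf),
      error = function(e) {
        message("Attempt ", i, " failed: ", conditionMessage(e))
        NULL
      }
    )
    if (!is.null(result)) return(result)
    Sys.sleep(2 ^ i)  # back off before retrying
  }
  stop("BigQuery extract failed after ", attempts, " attempts")
}
catalog_dynamic <- run_extract()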