quantmod: modifying getSymbols.MySQL

I have already studied the question
Quantmod: Error loading symbols from MySQL DB
and tried to fix the getSymbols.MySQL function in R accordingly.
However, I found that my database table only contains
date, open, high, low, close, volume (without the adjusted-close column).
So, if I want to further modify the getSymbols.MySQL function, what can I do?
I tried using fix(getSymbols.MySQL) to edit the function, but when I connect to my database it returns:
Error in `colnames<-`(`*tmp*`, value = c("H0001.Open", "H0001.High", "H0001.Low", : length of 'dimnames' [2] not equal to array extent
function (Symbols, env, return.class = "xts", db.fields = c("date",
    "o", "h", "l", "c", "v", "a"), field.names = NULL, user = NULL,
    password = NULL, dbname = NULL, host = "localhost", port = 3306,
    ...)
{
    importDefaults("getSymbols.MySQL")
    this.env <- environment()
    for (var in names(list(...))) {
        assign(var, list(...)[[var]], this.env)
    }
    if (!hasArg(verbose))
        verbose <- FALSE
    if (!hasArg(auto.assign))
        auto.assign <- TRUE
    if (!requireNamespace("DBI", quietly = TRUE))
        stop("package:", dQuote("DBI"), "cannot be loaded.")
    if (!requireNamespace("RMySQL", quietly = TRUE))
        stop("package:", dQuote("RMySQL"), "cannot be loaded.")
    if (is.null(user) || is.null(password) || is.null(dbname)) {
        stop(paste("At least one connection argument (", sQuote("user"),
            sQuote("password"), sQuote("dbname"), ") is not set"))
    }
    con <- DBI::dbConnect("MySQL", user = user, password = password,
        dbname = dbname, host = host, port = port)
    db.Symbols <- DBI::dbListTables(con)
    if (length(Symbols) != sum(Symbols %in% db.Symbols)) {
        missing.db.symbol <- Symbols[!Symbols %in% db.Symbols]
        warning(paste("could not load symbol(s): ", paste(missing.db.symbol,
            collapse = ", ")))
        Symbols <- Symbols[Symbols %in% db.Symbols]
    }
    for (i in 1:length(Symbols)) {
        if (verbose) {
            cat(paste("Loading ", Symbols[[i]], paste(rep(".",
                10 - nchar(Symbols[[i]])), collapse = ""), sep = ""))
        }
        query <- paste("SELECT ", paste(db.fields, collapse = ","),
            " FROM ", Symbols[[i]], " ORDER BY date")
        rs <- DBI::dbSendQuery(con, query)
        fr <- DBI::fetch(rs, n = -1)
        fr <- xts(as.matrix(fr[, -1]), order.by = as.Date(fr[,
            1], origin = "1970-01-01"), src = dbname, updated = Sys.time())
        colnames(fr) <- paste(Symbols[[i]], c("Open", "High",
            "Low", "Close", "Volume", "Adjusted"), sep = ".")
        fr <- convert.time.series(fr = fr, return.class = return.class)
        if (auto.assign)
            assign(Symbols[[i]], fr, env)
        if (verbose)
            cat("done\n")
    }
    DBI::dbDisconnect(con)
    if (auto.assign)
        return(Symbols)
    return(fr)
}
I think the problem is that the function was designed to read seven columns of data rather than six. I hope someone can help.

Here's a patch that should allow you to do what you want. I'm unable to test because I don't have a MySQL installation to test against. Please let me know whether or not it works.
diff --git a/R/getSymbols.R b/R/getSymbols.R
index 0a2e814..7a9be66 100644
--- a/R/getSymbols.R
+++ b/R/getSymbols.R
@@ -634,9 +634,9 @@ function(Symbols,env,return.class='xts',
fr <- xts(as.matrix(fr[,-1]),
order.by=as.Date(fr[,1],origin='1970-01-01'),
src=dbname,updated=Sys.time())
- colnames(fr) <- paste(Symbols[[i]],
- c('Open','High','Low','Close','Volume','Adjusted'),
- sep='.')
+ if(is.null(field.names))
+ field.names <- c('Open','High','Low','Close','Volume','Adjusted')
+ colnames(fr) <- paste(Symbols[[i]], field.names, sep='.')
fr <- convert.time.series(fr=fr,return.class=return.class)
if(auto.assign)
assign(Symbols[[i]],fr,env)
Then your function call should be (note that field.names is applied after the date column is dropped, so it should list only the five data columns):
getSymbols.MySQL("H0001", env, return.class = 'xts',
    db.fields = c("date", "open", "high", "low", "close", "volume"),
    field.names = c("Open", "High", "Low", "Close", "Volume"),
    user = 'xxxx', password = 'xxxx', host = 'xxxx', dbname = 'xxxx')
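The core idea of the patch, independent of R, is to derive the output column names from whatever fields were actually queried, instead of hardcoding seven of them. A minimal Python sketch of that logic (the helper and field lists are illustrative, not part of quantmod):

```python
# Sketch of the patch's core idea: build column names from the fields
# actually queried, instead of assuming a fixed seven-column layout.
# The mapping below is illustrative, not quantmod's real defaults.

def make_colnames(symbol, db_fields, field_names=None):
    """Map queried DB fields (minus the date column) to display names."""
    display_defaults = {
        "o": "Open", "h": "High", "l": "Low",
        "c": "Close", "v": "Volume", "a": "Adjusted",
        "open": "Open", "high": "High", "low": "Low",
        "close": "Close", "volume": "Volume",
    }
    data_fields = [f for f in db_fields if f != "date"]
    if field_names is None:
        field_names = [display_defaults.get(f, f) for f in data_fields]
    if len(field_names) != len(data_fields):
        raise ValueError("field.names length must match the non-date fields")
    return [f"{symbol}.{name}" for name in field_names]

# A six-field table now yields five data-column names instead of
# failing on a hardcoded list of six display names.
print(make_colnames("H0001", ["date", "open", "high", "low", "close", "volume"]))
```

This is exactly why the original error mentions 'dimnames' [2]: the hardcoded name vector had a different length than the matrix had columns.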


Convert txt to json file

I have a text file that I need to convert into JSON. Please help.
RULE_NAME{
RULE 1
SOURCE
DESTINAION
PORT
POROTCOL
RULE 2
SOURCE
DESTINAION
PORT
POROTCOL
RULE 3
....
}
I need the JSON format to be like this:
Local{
01
SOURCE : SAG
DESTINAION : any
PORT :02
POROTCOL: icmp
04
SOURCE : SAG
DESTINAION :any
PORT: any
POROTCOL : tcp
}
bk1-2-internal{
02
SOURCE : any
DESTINAION : SoftLY
PORT :any
POROTCOL: any
28
SOURCE : 119.111.126.115/18
DESTINAION :129.37.164.74/30
PORT: 112
POROTCOL : udpt
}
My text file looks like this:
>Local = 01 : SAG = any = 02 = tcp
>Local = 04 : SAG = any = any = tcp
>bk1-2-internal = 2 : any = SoftLY = any = any
>bk1-2-internal = 28 : 119.111.126.115/18 = 129.37.164.74/30 = 112 = udpt
It goes on for about 200 more lines in the same format.
I tried the code below, but it does not produce the expected structure. Please check it and help me get the expected output.
# filename = '/home/AY.txt'
import json

# the file to be converted
filename = '/home/ay/txt'
# resultant dictionary
dict1 = {}
# fields in the sample file
fields = ['name', 'source', 'destination', 'port', 'protocol']

with open(filename) as fh:
    # count variable for record id creation
    l = 1
    for line in fh:
        # reading line by line from the text file
        description = list(line.strip().split(None, 4))
        # for output see below
        print(description)
        # for automatic creation of an id for each record
        sno = ' ' + str(l)
        # loop variable
        i = 0
        # intermediate dictionary
        dict2 = {}
        while i < len(fields):
            # creating a dictionary for each record
            dict2[fields[i]] = description[i]
            i = i + 1
        # appending each record to the main dictionary
        dict1[sno] = dict2
        l = l + 1

# creating the json file
out_file = open("/home/task.json", "w")
json.dump(dict1, out_file, indent=4)
out_file.close()
Code
import os
import json

def json_from_log(path):
    '''
    Convert a log file to a JSON text file as follows.
    Steps:
      Create a dictionary from the log file.
      Serialize the dictionary to JSON.
      Output file has the same name as the input but with a .json suffix.
    '''
    outfile = os.path.splitext(path)[0] + '.json'  # same as input but with .json extension
    with open(path, 'r') as fread, open(outfile, 'w') as outfile:
        result = {}
        for line in fread:
            line = line.strip()
            rule, address = line.split(':')
            rule = rule[1:-1]           # skip '>' and drop the trailing space at the end of the string
            address = address.lstrip()  # drop the leading space
            rule_id, rule_number = rule.split(' = ')
            source, destination, port, protocol = address.split(' = ')
            port = int(port) if port.isnumeric() else port
            # Add nested dictionaries if they don't exist for the key
            result.setdefault(rule_id, {})
            result[rule_id].setdefault(rule_number, {})
            # Place in dictionary
            result[rule_id][rule_number].update({"SOURCE": source,
                                                 "DESTINAION": destination,
                                                 "PORT": port,
                                                 "PROTOCOL": protocol})
        json.dump(result, outfile, indent=1)  # indent=1 allows "pretty" output
Usage
json_from_log('test3.txt')
Input File: test3.txt
>Local = 01 : SAG = any = 02 = tcp
>Local = 04 : SAG = any = any = tcp
>bk1-2-internal = 2 : any = SoftLY = any = any
>bk1-2-internal = 28 : 119.111.126.115/18 = 129.37.164.74/30 = 112 = udpt
Output File: test3.json
{
"Local": {
"01": {
"SOURCE": "SAG",
"DESTINAION": "any",
"PORT": 2,
"PROTOCOL": "tcp"
},
"04": {
"SOURCE": "SAG",
"DESTINAION": "any",
"PORT": "any",
"PROTOCOL": "tcp"
}
},
"bk1-2-internal": {
"2": {
"SOURCE": "any",
"DESTINAION": "SoftLY",
"PORT": "any",
"PROTOCOL": "any"
},
"28": {
"SOURCE": "119.111.126.115/18",
"DESTINAION": "129.37.164.74/30",
"PORT": 112,
"PROTOCOL": "udpt"
}
}
}
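The split-on-':'-then-' = ' parsing used above can be exercised without touching the filesystem. A self-contained sketch using two of the sample lines:

```python
# Standalone sketch of the same parsing: split each line on ':' to get the
# rule header, then on ' = ' for the fields. Sample lines from the question.
lines = [
    ">Local = 01 : SAG = any = 02 = tcp",
    ">bk1-2-internal = 28 : 119.111.126.115/18 = 129.37.164.74/30 = 112 = udpt",
]

result = {}
for line in lines:
    rule, address = line.split(":", 1)
    rule_id, rule_number = rule[1:].strip().split(" = ")
    source, destination, port, protocol = address.strip().split(" = ")
    result.setdefault(rule_id, {})[rule_number] = {
        "SOURCE": source,
        "DESTINAION": destination,  # spelling kept to match the requested output
        "PORT": int(port) if port.isnumeric() else port,
        "PROTOCOL": protocol,
    }

print(result["Local"]["01"]["PORT"])  # -> 2
```

Note that numeric ports come out as integers ("02" becomes 2), matching the answer's output file.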

JSON to Lua with multiple strings, backslashes and dots

Hello, I'm trying to use the JSON from my washer with Lua, for visualizing the Samsung washer in Domoticz.
Part of the JSON I get from https://api.smartthings.com/v1/devices/abcd-1234-abcd is:
"main": {
"washerJobState": {
"value": "wash"
},
"mnhw": {
"value": "1.0"
},
"data": {
"value": "{
\"payload\":{
\"x.com.samsung.da.state\":\"Run\",\"x.com.samsung.da.delayEndTime\":\"00:00:00\",\"x.com.samsung.da.remainingTime\":\"01:34:00\",\"if\":[\"oic.if.baseline\",\"oic.if.a\"],\"x.com.samsung.da.progressPercentage\":\"2\",\"x.com.samsung.da.supportedProgress\":[\"None\",\"Wash\",\"Rinse\",\"Spin\",\"Finish\"],\"x.com.samsung.da.progress\":\"Wash\",\"rt\":[\"x.com.samsung.da.operation\"]}}"
},
"washerRinseCycles": {
"value": "3"
},
"switch": {
"value": "on"
},
If I use this in my script:
local switch = item.json.main.switch.value
I get the value on or off and can use it to show the status of the washer.
I'm trying to find out how to get the "data" value in my script; there are more items with dots and backslashes:
local remainingTime = rt.data.value.payload['x.com.samsung.da.remainingTime']
or
local remainingTime = rt.data.value['\payload']['\x.com.samsung.da.remainingTime']
I tried some more options with ', //, and "", but I always get a nil value.
Can someone explain how to get:
\"x.com.samsung.da.remainingTime\":\"01:34:00\"
\"x.com.samsung.da.progressPercentage\":\"2\",
All the ", \, and x. are confusing me.
Below is my script to test, where I only left the JSON logging (dzVents Lua based). I get an error:
dzVents/generated_scripts/Samsung_v3.lua:53: attempt to index a nil value (global 'json')
I don't have any idea how to use/adjust my code to decode the string.
local json = require"json" -- the JSON library
local outer = json.decode(your_JSON_string)
local rt = outer.main
local inner = json.decode(rt.data.value)
local remainingTime = inner.payload['x.com.samsung.da.remainingTime']

local API = 'API'
local Device = 'Device'
local LOGGING = true

-- Define dz switches
local WM_STATUS = 'WM Status' -- Domoticz virtual switch ON/Off state washer

return
{
    on =
    {
        timer =
        {
            'every 1 minutes', -- just an example to trigger the request
        },
        httpResponses =
        {
            'trigger', -- must match with the callback passed to the openURL command
        },
    },
    logging =
    {
        level = domoticz.LOG_DEBUG,
    },
    execute = function(dz, item)
        local wm_status = dz.devices(WM_STATUS)
        if item.isTimer then
            dz.openURL({
                url = 'https://api.smartthings.com/v1/devices/' .. Device .. '/states',
                headers = { ['Authorization'] = 'Bearer ' .. API },
                method = 'GET',
                callback = 'trigger', -- see httpResponses above.
            })
        end
        if (item.isHTTPResponse) then
            if item.ok then
                if (item.isJSON) then
                    rt = item.json.main
                    -- outer = json.decode'{"payload":{"x.com.samsung.da.state":"Run","x.com.samsung.da.delayEndTime":"00:00:00","x.com.samsung.da.remainingTime":"00:40:00","if":["oic.if.baseline","oic.if.a"],"x.com.samsung.da.progressPercentage":"81","x.com.samsung.da.supportedProgress":["None","Weightsensing","Wash","Rinse","Spin","Finish"],"x.com.samsung.da.progress":"Rinse","rt":["x.com.samsung.da.operation"]}}
                    inner = json.decode(rt.data.value)
                    -- local remainingTime = inner.payload['x.com.samsung.da.remainingTime']
                    dz.utils.dumpTable(rt) -- this will show how the table is structured
                    -- dz.utils.dumpTable(inner)
                    local washerSpinLevel = rt.washerSpinLevel.value
                    -- local remainingTime = inner.payload['x.com.samsung.da.remainingTime']
                    dz.log('Debuggg washerSpinLevel:' .. washerSpinLevel, dz.LOG_DEBUG)
                    dz.log('Debuggg remainingTime:' .. remainingTime, dz.LOG_DEBUG)
                    -- dz.log('Resterende tijd:' .. remainingTime, dz.LOG_INFO)
                    -- dz.log(dz.utils.fromJSON(item.data))
                -- end
                elseif LOGGING == true then
                    dz.log('There was a problem handling the request', dz.LOG_ERROR)
                    dz.log(item, dz.LOG_ERROR)
                end
            end
        end
    end
}
This is a weird construction: a serialized JSON string embedded inside normal JSON.
This means you have to invoke deserialization twice:
local json = require"json" -- the JSON library
local outer = json.decode(your_JSON_string)
local rt = outer.main
local inner = json.decode(rt.data.value)
local remainingTime = inner.payload['x.com.samsung.da.remainingTime']
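The same two-step decode, sketched in Python for clarity (a trimmed-down stand-in for the SmartThings payload; the outer document stores a string whose content is itself JSON):

```python
import json

# Outer document: "data.value" holds a *string* that is itself JSON.
# This is a trimmed illustration of the washer payload, not the full API response.
outer_text = '''{
  "main": {
    "data": {
      "value": "{\\"payload\\":{\\"x.com.samsung.da.remainingTime\\":\\"01:34:00\\"}}"
    }
  }
}'''

outer = json.loads(outer_text)                       # first decode: the normal JSON
inner = json.loads(outer["main"]["data"]["value"])   # second decode: the embedded string
print(inner["payload"]["x.com.samsung.da.remainingTime"])  # -> 01:34:00
```

The dots in keys like x.com.samsung.da.remainingTime are just characters in the key name, which is why bracket-style indexing is needed in Lua (and why dot-chaining fails).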

R package monitoR error on dbUploadTemplate

I'm using the R package monitoR and getting an error message that I can't figure out.
I'm trying to upload a correlation template list ("bithTemps") to a MySQL database ("noh") using the dbUploadTemplate command.
dbUploadTemplate(templates = bithTemps,
uid = "root",
pwd = "****",
db.name = "noh",
analyst = 1,
locationID = "2",
date.recorded = "2017/09/07",
recording.equip = "Unknown",
species.code = "BITH",
type = "COR")
This returns:
Error: $ operator is invalid for atomic vectors
I have confirmed the ODBC connection is working, that the template list is functional (i.e., it works when called to other arguments in the package), and that the SQL database has the required entries for analyst, location, and species code.
It seems that this error was actually triggered by a non-functional ODBC connection. This part of the dbUploadTemplate function
species <- RODBC::sqlQuery(dbCon, paste("SELECT `pkSpeciesID`, `fldSpeciesCode` FROM `tblSpecies` WHERE `fldSpeciesCode` = '",
    paste(species.code, sep = "", collapse = "' OR `fldSpeciesCode` = '"),
    "'", sep = ""))
queries a table in the SQL database and returns an object called 'species'. If the query fails (e.g., because RODBC can't connect), then 'species' is empty, and the following operation
speciesID <- NULL
for (i in 1:length(species.code)) {
    speciesID[i] <- species$pkSpeciesID[species$fldSpeciesCode ==
        species.code[i]]
}
triggers the error. Fixing the ODBC connection resolves the error.
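The failure mode generalizes beyond R: a query helper silently returns an empty result, and the code that indexes into it is what actually blows up. A hedged Python sketch of the same pitfall and a guard for it (the row structure is illustrative, not monitoR's actual schema):

```python
# Sketch: indexing into an empty query result is what fails, not the
# query itself. 'rows' stands in for the data frame RODBC would return.

def species_ids(rows, species_codes):
    """Look up IDs, failing loudly if the 'query' came back empty."""
    if not rows:  # analogous to 'species' being empty when the ODBC query fails
        raise RuntimeError("species query returned no rows; check the DB connection")
    lookup = {r["fldSpeciesCode"]: r["pkSpeciesID"] for r in rows}
    return [lookup[code] for code in species_codes]

rows = [{"pkSpeciesID": 7, "fldSpeciesCode": "BITH"}]
print(species_ids(rows, ["BITH"]))  # -> [7]
```

Checking the query result before indexing turns the cryptic "$ operator is invalid for atomic vectors" into an error message that points at the real problem.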

could not find function "eventReactive"

When I run the app on Ubuntu it works perfectly, but when I run it on Mac OS X, things (like buttons) are not aligned, and after a while I get the following error:
> shiny::runApp()
Loading required package: shiny
Listening on http://127.0.0.1:7240
Loading required package: lattice
Loading required package: ggplot2
data.table 1.8.10 For help type: help("data.table")
Error in (structure(function (input, output) :
could not find function "eventReactive"
ERROR: [on_request_read] connection reset by peer
Here's some part of code:
trainres <- eventReactive(input$buttontrain, {
    thisfds = list(); singtrain = NULL; singtest = NULL
    thiskfkds = list(); multtrain = NULL; multtest = NULL
    yvectr = NULL; yvects = NULL; predvectr = NULL; predvects = NULL
    tim = 0.0
    if (input$dbterm == "Multi table") {
        thiskfkds = append(thiskfkds, KFKD(EntCol=input$fk1, AttCol=input$pk1, UseFK=input$usefk1))
        if (!is.null(input$fk2)) {
            thiskfkds = append(thiskfkds, KFKD(EntCol=input$fk2, AttCol=input$pk2, UseFK=input$usefk2))
        }
        if (!is.null(input$fk3)) {
            thiskfkds = append(thiskfkds, KFKD(EntCol=input$fk3, AttCol=input$pk3, UseFK=input$usefk3))
        }
        cat("KFKDs:\n")
        print(thiskfkds)
        multtrain = switch(input$dataset,
            "Walmart" = MultData(Target=as.data.frame(WStr[,1]), EntTable=WStr[,-1], AttTables=list(WR1, WR2), KFKDs=thiskfkds),
            "Walmart (R)" = MultData(Target=as.data.frame(DWStr[,1]), EntTable=DWStr[,-1], AttTables=list(DWR1, DWR2), KFKDs=thiskfkds),
            "Yelp" = MultData(Target=as.data.frame(YStr[,1]), EntTable=YStr[,-1], AttTables=list(YR1, YR2), KFKDs=thiskfkds),
            "Yelp (R)" = MultData(Target=as.data.frame(DYStr[,1]), EntTable=DYStr[,-1], AttTables=list(DYR1, DYR2), KFKDs=thiskfkds),
            "Expedia" = MultData(Target=as.data.frame(EStr[,1]), EntTable=EStr[,-1], AttTables=list(ER1, ER2), KFKDs=thiskfkds),
            "Expedia (R)" = MultData(Target=as.data.frame(DEStr[,1]), EntTable=DEStr[,-1], AttTables=list(DER1, DER2), KFKDs=thiskfkds),
            "Flights" = MultData(Target=as.data.frame(FStr[,1]), EntTable=FStr[,-1], AttTables=list(FR1, FR2, FR3), KFKDs=thiskfkds),
            "Flights (R)" = MultData(Target=as.data.frame(DFStr[,1]), EntTable=DFStr[,-1], AttTables=list(DFR1, DFR2, DFR3), KFKDs=thiskfkds)
        )
Here's how the app looks after running:
Here's the code in ui.R:
library(shiny)
library(caret)

shinyUI(fluidPage(
    list(tags$head(HTML('<h4><table><tr><td rowspan="2"><img src="http://umark.wisc.edu/brand/templates-and-downloads/downloads/print/UWCrest_4c.jpg"
        border="0" style="padding-right:10px" width="34" height="40" alt="UW-Madison Database Group"/>
        </td><td><b>Santoku</b></td></tr><tr><td>University of Wisconsin-Madison Database Group</td></tr></table></h4>'))),
    sidebarLayout(
        sidebarPanel(width = 6,
            wellPanel(fluidRow(column(6, radioButtons("dbterm", "Database Type", c("Multi table", "Single table"))),
                    column(6, selectInput("dataset", "Load Dataset", c("Walmart", "Walmart (R)", "Yelp", "Yelp (R)", "Expedia",
                        "Expedia (R)", "Flights", "Flights (R)")))),
                uiOutput("uideps")),
            wellPanel(fluidRow(column(6, radioButtons("mlalgo", "ML Model:", c("Logistic Regression" = "lr", "Naive Bayes" = "nb",
                        "TAN" = "tan", "Decision Tree" = "dt"))),
                    column(6, uiOutput("uimlpt"))),
                fluidRow(div(class="padding2", column(3, checkboxInput("checkcv", "Validate", TRUE))),
                    div(class="padding3", column(2, actionButton("buttontrain", "Learning"))),
                    div(class="padding4", column(3, actionButton("buttonfe", "Feature Exploration")))))
        ),
        mainPanel(width = 6,
            tabsetPanel(
                tabPanel("Single Learning", verbatimTextOutput("trainreso")),
                tabPanel("Feature Exploration", plotOutput("feplotso"))
                #tabPanel("Wiki", verbatimTextOutput("Wiki")),
                #tabPanel("Analysis", tableOutput("plots"))
            )
        )
    ) #end sidebarLayout
)) #end main
The current version of shiny is 0.12.2, and in this version there is a function called eventReactive. To quickly update, you can run:
install.packages("shiny")
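The underlying fix pattern is "check the installed version before relying on an API added in a newer release". A small Python sketch of a numeric version comparison, nothing shiny-specific (the version strings are taken from the answer above):

```python
# Sketch: gate use of a newer API on the installed version, the way
# eventReactive requires a sufficiently recent shiny release.

def version_at_least(installed, required):
    """Compare dotted version strings numerically, e.g. '0.12.2' >= '0.11'."""
    to_tuple = lambda v: tuple(int(x) for x in v.split("."))
    a, b = to_tuple(installed), to_tuple(required)
    # Pad the shorter tuple with zeros so '0.11' compares as (0, 11, 0).
    n = max(len(a), len(b))
    return a + (0,) * (n - len(a)) >= b + (0,) * (n - len(b))

print(version_at_least("0.12.2", "0.11"))  # -> True
print(version_at_least("0.10.1", "0.11"))  # -> False
```

Comparing numerically matters: a naive string comparison would rank "0.9" above "0.12.2".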

sqlSave fails when tablename is longer than 18 characters

I am currently writing a script that downloads a bunch of .csv's from a FTP server, and then puts each .csv in a MySQL database as its own table.
I download the .csv's from the FTP using RCurl and place all of them in my working directory. To create a table from each .csv, I use the sqlSave function from the RODBC package, with the table name being the same as the .csv's name. This works fine whenever a .csv name is shorter than 18 characters, but when it is longer, it fails. And by "fails", I mean R crashes. To track down the bug, I called debug on sqlSave.
I found that there are at least two functions that sqlSave calls that cause R to crash. The first is RODBC:::odbcTableExists, which is a non-visible function. Here is the code for the function:
RODBC:::odbcTableExists
function (channel, tablename, abort = TRUE, forQuery = TRUE,
    allowDot = attr(channel, "interpretDot"))
{
    if (!odbcValidChannel(channel))
        stop("first argument is not an open RODBC channel")
    if (length(tablename) != 1)
        stop(sQuote(tablename), " should be a name")
    tablename <- as.character(tablename)
    switch(attr(channel, "case"), nochange = {
    }, toupper = tablename <- toupper(tablename), tolower = tablename <- tolower(tablename))
    isExcel <- odbcGetInfo(channel)[1L] == "EXCEL"
    hasDot <- grepl(".", tablename, fixed = TRUE)
    if (allowDot && hasDot) {
        parts <- strsplit(tablename, ".", fixed = TRUE)[[1]]
        if (length(parts) > 2)
            ans <- FALSE
        else {
            res <- if (attr(channel, "isMySQL"))
                sqlTables(channel, catalog = parts[1], tableName = parts[2])
            else sqlTables(channel, schema = parts[1], tableName = parts[2])
            ans <- is.data.frame(res) && nrow(res) > 0
        }
    }
    else if (!isExcel) {
        res <- sqlTables(channel, tableName = tablename)
        ans <- is.data.frame(res) && nrow(res) > 0
    }
    else {
        res <- sqlTables(channel)
        tables <- stables <- if (is.data.frame(res))
            res[, 3]
        else ""
        if (isExcel) {
            tables <- sub("^'(.*)'$", "\\1", tables)
            tables <- unique(c(tables, sub("\\$$", "", tables)))
        }
        ans <- tablename %in% tables
    }
    if (abort && !ans)
        stop(sQuote(tablename), ": table not found on channel")
    enc <- attr(channel, "encoding")
    if (nchar(enc))
        tablename <- iconv(tablename, to = enc)
    if (ans && isExcel) {
        dbname <- if (tablename %in% stables)
            tablename
        else paste(tablename, "$", sep = "")
        if (forQuery)
            paste("[", dbname, "]", sep = "")
        else dbname
    }
    else if (ans) {
        if (forQuery && !hasDot)
            quoteTabNames(channel, tablename)
        else tablename
    }
    else character(0L)
}
This fails here when the table name is over 18 characters long:
res <- sqlTables(channel, tableName = tablename)
I have fixed it by changing this to:
res <- sqlTables(channel, tablename)
I then reassign the function with the same name (odbcTableExists) in the namespace, with this code change, using assignInNamespace.
RODBC:::odbcTableExists no longer causes an issue. However, R still crashes when sqlwrite is called from within sqlSave(). I called debug on sqlwrite and found that RODBC:::odbcColumns (another non-visible function) causes the crash when table names are too long. Unfortunately, I am not sure how to change RODBC:::odbcColumns to avoid the bug like I did before.
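What assignInNamespace does here is runtime monkey-patching: swapping a package-internal function for a fixed copy without rebuilding the package. The same pattern sketched in Python (the module and helper names are made up for illustration; RODBC has no Python counterpart like this):

```python
import types

# Stand-in for a third-party module with a buggy internal helper.
rodbc = types.ModuleType("rodbc")
rodbc._table_exists = lambda name: len(name) <= 18      # the "bug": 18-char limit
rodbc.save = lambda name: rodbc._table_exists(name)     # public API calls the helper

# Monkey-patch the internal helper, analogous to assignInNamespace in R.
# Because 'save' looks the helper up at call time, it picks up the fix.
rodbc._table_exists = lambda name: True  # patched: no length limit

print(rodbc.save("a_table_name_longer_than_18_chars"))  # -> True
```

The same caveat applies in both languages: because the patched copy lives outside the package, internal helpers it calls (like odbcValidChannel and quoteTabNames here) must be referenced with their fully qualified names.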
I am using R 2.15.1, and the platform is x86_64-pc-mingw32/x64 (64-bit). I should also note that I am trying to run this on a work computer; if I run the exact same code on my personal computer, R does not crash (no bug). The work computer runs Windows 7 Professional, and my home computer runs Windows 7 Home Premium with R 2.14.1.
I love this hack (I too have Windows 7 Professional and R 2.15.1 at work), and it does not crash anymore, but it causes another problem after I replaced that line and used assignInNamespace. Also, for some reason I had to replace odbcValidChannel with RODBC:::odbcValidChannel and quoteTabNames with RODBC:::quoteTabNames.
But when I used sqlSave, I got the following error:
Error in odbcUpdate(channel, query, mydata, coldata[m, ], test = test, :
no parameters, so nothing to update
I don't even use odbcUpdate anywhere in my code, and RODBC:::sqlSave does not contain an odbcUpdate call.
Any thoughts?
thank you,
-Alex