Scotty API with mysql-simple in Haskell

I am following this tutorial, which uses scotty with persistent to create a simple API.
However, I am trying to create a simple API with scotty and the mysql-simple library.
Now I am stuck at one point: in the code below I am not able to convert the getUser function to the type "ActionT Error ConfigM", which is why my code fails to compile.
Can anyone help me understand how to convert the getUser function so that it has the needed type signature?
Code
type Error = Text

type Action = ActionT Error ConfigM ()

config :: Config
config = Config
  { environment = Development
  , db1Conn = connect connectionInfo
  }

main :: IO ()
main = do
  runApplication config

runApplication :: Config -> IO ()
runApplication c = do
  o <- getOptions (environment c)
  let r m = runReaderT (runConfigM m) c
  scottyOptsT o r application

application :: ScottyT Error ConfigM ()
application = do
  e <- lift (asks environment)
  get "/user" getTasksA

getTasksA :: Action
getTasksA = do
  u <- getUser
  json u

getUser :: IO User
getUser = do
  e <- asks environment
  conn <- db1Conn config
  [user] <- query_ conn "select login as userId, email as userEmail from member limit 1"
  return user
Error
• Couldn't match type ‘IO’ with ‘ActionT Error ConfigM’
  Expected type: ActionT Error ConfigM User
    Actual type: IO User
• In a stmt of a 'do' block: u <- getUser
  In the expression:
    do { u <- getUser;
         json u }
  In an equation for ‘getTasksA’:
      getTasksA
        = do { u <- getUser;
               json u }

You left out plenty of code (imports and pragmas and the definition of User); please include that next time, see MCVE.
But now to your question:
I would change the Action type to the following:
type Action a = ActionT Error ConfigM a
then getTasksA has the following type signature:
getTasksA :: Action ()
getTasksA = do
  u <- getUser
  json u
(alternatively you can write this as getTasksA = getUser >>= json)
and getUser:
getUser :: Action User
getUser = do
  e <- asks environment
  conn <- liftIO (db1Conn config) -- db1Conn holds an IO Connection (see the Config above), so it needs lifting too
  [user] <- liftIO $ query_ conn "select login as userId, ..."
  return user
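(query_ from mysql-simple runs in IO, which is why the liftIO is needed here to run it inside the ActionT stack.)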
A few remarks:
[user] <- liftIO $ query_ .. is a bad idea: if no user is found, this crashes your application. Try to write total functions and pattern matches; better return a Maybe User (a sketch of a handler that consumes it follows after these remarks).
getUser :: Action (Maybe User)
getUser = do
  e <- asks environment
  conn <- liftIO (db1Conn config)
  fmap listToMaybe . liftIO $ query_ conn "select login as userId, ..."
If you can wrap your head around it, rather use persistent than writing your SQL queries by hand; this is quite error-prone, especially when refactoring. Just imagine renaming userId to userID.
You ask several times for the environment but then don't use it. Compile with -Wall to get warnings, or even with -Werror to elevate warnings to compile errors (which is a good idea for production settings).
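To make that concrete, here is a minimal sketch of how getTasksA could consume the Maybe User and answer with a 404 instead of crashing (assuming status from Web.Scotty.Trans and notFound404 from Network.HTTP.Types are in scope):
getTasksA :: Action ()
getTasksA = do
  mUser <- getUser
  case mUser of
    Just u  -> json u
    Nothing -> status notFound404 -- no matching row: report it instead of crashing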

Related

MySQL Connection Error Handling with tryCatch

I have an R Shiny application which uses MySQL as a data source. When the application loads and the user logs into the app with their username and password, a database connection interface opens up where the user inputs their MySQL credentials. To prevent the app from crashing when the user enters the wrong MySQL connection credentials, I am trying to use the following error handling.
# run when Load Data button is clicked
datapms <- eventReactive(input$pull_data, {
  req(input$db_user, input$db_user, input$db_password, input$db_name, input$db_host, input$db_port)
  progress <- Progress$new(session, min = 1, max = 15)
  on.exit(progress$close())
  progress$set(message = 'Pulling data from database',
               detail = 'This message will disappear once completed.')
  # establish a database connection
  tryCatch({
    con <- RMySQL::dbConnect(
      RMySQL::MySQL(),
      user = input$db_user,
      password = input$db_password,
      dbname = input$db_name,
      host = input$db_host
    )
  }, error = function(e) {
    debug_msg(e$message)
  })
  # construct the SQL statement
  sql <- "SELECT * FROM pmsanalytics;"
  # Fetch data
  pmsanalytics <- tryCatch({
    pmsanalytics <- dbGetQuery(conn = con, sql)
  }, error = function(e) {
    debug_msg(e$message)
  })

### display debugging message in R (if local)
### and in the console log (if running in shiny)
debug_msg <- function(...) {
  is_local <- Sys.getenv('SHINY_PORT') == ""
  in_shiny <- !is.null(shiny::getDefaultReactiveDomain())
  txt <- toString(list(...))
  if (is_local) message(txt)
  if (in_shiny) shinyjs::runjs(sprintf("console.debug(\"%s\")", text))
}
Initially this code was working and the app was not crashing. However, now, when for example one enters the wrong connection credentials, I am getting the following error message:
Warning: Error in as.character: cannot coerce type 'closure' to vector of type 'character'
138: sprintf
136: debug_msg [C:\PMSAnalytics/app.R#107]
135: value[[3L]] [C:\PMSAnalytics/app.R#211]
134: tryCatchOne
133: tryCatchList
132: tryCatch
131: eventReactiveValueFunc [C:\PMSAnalytics/app.R#202]
Basically, the app crashes because there is no data it was expecting to get; in other words, the reactive datapms() data source it is expecting is empty.
Kindly assist in reviewing my code to prevent the app from crashing.
Regards,
Chris

tryCatch to Prevent R Shiny App Crashing on MySQL Connection Error

My Shiny app was crashing when wrong connection credentials were passed in the connection string. I then put my connection string within a tryCatch as follows:
ConnectToDb <- function() {
  con <- tryCatch({
    dbConnect(MySQL(),
              user = input$db_user,
              password = input$db_password,
              dbname = input$db_name,
              host = input$db_host,
              port = input$db_port)
    print("Connection made")
    ####
    sql <- "SELECT * FROM PMSAnalytics;"
    data <- dbGetQuery(con, sql)
    # # Disconnect from the DB
    dbDisconnect(con)
    # # Convert to data.frame
    data <- data.frame(data)
    data$timestamp <- as_datetime(now())
    data
    ####
  }, error = function(e) {
    message('Please confirm your login details')
    print(e)
  },
  warning = function(w) {
    message('A warning has occured')
    print(w)
    return(NA)
  })
}
Now the application does not crash, but no error messages or warnings are shown when wrong credentials are used, and neither do I get a connection success. I have checked this site for similar questions, but I cannot seem to find any. Kindly assist with polishing the code.
Regards,
Chris
I work with showNotification, which directly shows a notification in the shiny UI; you could also use it for the connection success.
Options for this are:
duration = 60 (in this case, for 60 seconds)
closeButton = FALSE
For example:
error = function(e) {
  showNotification(paste0(e), type = 'error')
}
warning = function(w) {
  showNotification(paste0(w), type = 'warning')
  return(NA)
}
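Putting this together with the connection code from the question, a rough sketch could look like the following (input$db_user and friends, the PMSAnalytics table, and the 60-second duration are taken from the question and the notes above; adjust them to your app):
ConnectToDb <- function() {
  tryCatch({
    con <- dbConnect(MySQL(),
                     user = input$db_user,
                     password = input$db_password,
                     dbname = input$db_name,
                     host = input$db_host,
                     port = input$db_port)
    # report the success in the UI as well
    showNotification("Connection made", type = 'message', duration = 60)
    sql <- "SELECT * FROM PMSAnalytics;"
    data <- dbGetQuery(con, sql)
    dbDisconnect(con)
    data.frame(data)
  }, error = function(e) {
    showNotification(paste0(e), type = 'error', closeButton = FALSE)
    NULL
  }, warning = function(w) {
    showNotification(paste0(w), type = 'warning')
    NA
  })
}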

F# Connect to Online MySQL DB execute query

I am making an F# project and need to do some database queries against an online MySQL DB. Can anyone please help me? I need something like this:
// Connect to DB
let servername = "localhost"
let username = "username"
let password = "password"
// Code that connects to db
// Print error message if cannot connect
// Query
let query = "SELECT * FROM table ..."
// Code that executes query
// Error message if query not executed
You should install the .NET driver for MySQL, then install the SQLProvider. There are samples for MySQL in the docs. You would connect to the db and query it like this:
type sql = SqlDataProvider<
            dbVendor,
            connString,
            ResolutionPath = resPath,
            IndividualsAmount = indivAmount,
            UseOptionTypes = useOptTypes,
            Owner = "HR"
          >
let ctx = sql.GetDataContext()
let employees =
    ctx.Hr.Employees
    |> Seq.map (fun e -> e.ColumnValues |> Seq.toList)
    |> Seq.toList
connString will be something like this:
[<Literal>]
let connString = "Server=localhost;Database=HR;User=root;Password=password"
You should also read https://msdn.microsoft.com/visualfsharpdocs/conceptual/walkthrough-accessing-a-sql-database-by-using-type-providers-%5bfsharp%5d
I would also say SQLProvider is the way to go, as then you have validation of your logic against your database and you notice if your database changes.
But you can connect manually if you want to:
// Reference Nuget package MySql.Data
//#r @"./../packages/MySql.Data/lib/net40/MySql.Data.dll"
open System
open MySql.Data.MySqlClient

let cstr = "server = localhost; database = myDatabase; uid = username;pwd = password"

let ExecuteSqlAsync (query : string) parameters =
    use rawSqlConnection = new MySqlConnection(cstr)
    async {
        do! rawSqlConnection.OpenAsync() |> Async.AwaitIAsyncResult |> Async.Ignore
        use command = new MySqlCommand(query, rawSqlConnection)
        parameters |> List.iter (fun (par : string * string) -> command.Parameters.AddWithValue(par) |> ignore)
        let! affectedRows = command.ExecuteNonQueryAsync() |> Async.AwaitTask
        match affectedRows with
        | 0 ->
            "ExecuteSql 0 rows affected: " + query |> Console.WriteLine
            ()
        | x -> ()
    }

let ExecuteSql (query : string) parameters =
    use rawSqlConnection = new MySqlConnection(cstr)
    rawSqlConnection.Open()
    use command = new MySqlCommand(query, rawSqlConnection)
    parameters |> List.iter (fun (par : string * string) -> command.Parameters.AddWithValue(par) |> ignore)
    let affectedRows = command.ExecuteNonQuery()
    match affectedRows with
    | 0 ->
        "ExecuteSql 0 rows affected: " + query |> Console.WriteLine
        ()
    | x -> ()
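For completeness, a small usage sketch of the ExecuteSql helper above (the table, column, and value names here are made up for illustration):
// hypothetical call: a parameterised insert using the helper above
ExecuteSql
    "INSERT INTO someTable (someColumn) VALUES (@someValue)"
    [ ("@someValue", "example") ]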

Using the LDAvis package in R to create a gist file of the result

I'm using LDAvis for topic modeling and trying to use the as.gist option to create a gist. When serVis executes there is a timeout in curl::curl_fetch_memory after about 10 seconds. If I immediately execute serVis again I get a different error 'Problems parsing JSON' and from then on whenever serVis is run that same error recurs.
If I start all over with a fresh workspace the same behavior occurs. The first time serVis is run, curl::curl_fetch_memory times out after about 10 seconds. Subsequent executions return 'Problems parsing JSON'.
If I don't use the as.gist option it works fine, but of course doesn't create a gist.
Very rarely, it works and a gist is created. If I change parameters to reduce the size of the JSON object it usually works, which makes me think it may be related to size.
I have explored the various RCurlOptions timeout settings. Currently, they are set as
options(RCurlOptions = list(cainfo = system.file("CurlSSL", "cacert.pem",
package = "RCurl"),
connecttimeout = 300, timeout = 3000,
followlocation = TRUE, dns.cache.timeout = 300))
Below is a console listing with debug set on curl::curl_fetch_memory.
> json <- createJSON(phi = cases$phi,
+ theta = cases$theta,
+ doc.len .... [TRUNCATED]
> serVis(json, open.browser = TRUE, as.gist = TRUE, description = 'APM Community')
debugging in: curl::curl_fetch_memory(url, handle = handle)
debug: {
output <- .Call(R_curl_fetch_memory, url, handle)
res <- handle_response_data(handle)
res$content <- output
res
}
Browse[2]> output <- .Call(R_curl_fetch_memory, url, handle)
Error: Timeout was reached
Browse[2]> output <- .Call(R_curl_fetch_memory, url, handle)
Browse[2]> rawToChar(output)
[1] "{\"message\":\"Problems parsing JSON\",\"documentation_url\":\"https://developer.github.com/v3\"}"
Browse[2]>
.
.
exiting from: curl::curl_fetch_memory(url, handle = handle)
Error: Problems parsing JSON
Any hints on how to debug this problem?

Persist an entity with a user reference in Yesod?

I'm changing my existing Yesod application to run on an SQL backend instead of mongo. The generated table structure is stricter than with the mongo backend, and foreign key references should be created correctly on insert.
postFeedingsR :: Handler RepJson
postFeedingsR = do
  muser <- maybeAuth
  parsedFeeding <- parseJsonBody_ -- get content as JSON
  let userId = getUserId muser
  let feedingWithUser = Feeding (feedingDate parsedFeeding) (feedingSide parsedFeeding) (feedingTime parsedFeeding) (feedingExcrements parsedFeeding) (feedingRemarks parsedFeeding) userId -- should be linked to user..
  fid <- runDB $ insert feedingWithUser -- store in database
  --runDB $ update fid [ FeedingUserId =. userId ] -- old mongo style of linking the feeding to the user
  sendResponseCreated $ FeedingR fid -- return the id
I try to update the entity I get from parseJsonBody_ with the user id from maybeAuth. However, this gives me the following error:
No instance for (aeson-0.6.0.2:Data.Aeson.Types.Class.FromJSON
                   (FeedingGeneric backend0))
  arising from a use of `parseJsonBody_'
Possible fix:
  add an instance declaration for
  (aeson-0.6.0.2:Data.Aeson.Types.Class.FromJSON
     (FeedingGeneric backend0))
In a stmt of a 'do' block: parsedFeeding <- parseJsonBody_
In the expression:
  do { muser <- maybeAuth;
       parsedFeeding <- parseJsonBody_;
       let userId = getUserId muser;
       let feedingWithUser
             = Feeding
                 (feedingDate parsedFeeding)
                 (feedingSide parsedFeeding)
                 (feedingTime parsedFeeding)
                 (feedingExcrements parsedFeeding)
                 (feedingRemarks parsedFeeding)
                 userId;
       .... }
In an equation for `postFeedingsR':
    postFeedingsR
      = do { muser <- maybeAuth;
             parsedFeeding <- parseJsonBody_;
             let userId = ...;
             .... }
I'm not sure why this happens. Could anyone point me in the right direction to solve this?
Solved by changing the auth line to:
Entity uid u <- requireAuth
and by adding the function:
addUserToFeeding :: UserId -> Feeding -> Feeding
addUserToFeeding uid Feeding {feedingDate=date, feedingSide=side, feedingTime=time, feedingExcrements=ex, feedingRemarks=remarks} = Feeding date side time ex remarks uid
to create a new Feeding with associated user. This Feeding can then be stored in the normal way in Yesod:
let feedingWithUser = addUserToFeeding uid parsedFeeding
fid <- runDB $ insert feedingWithUser --store in database
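Putting the pieces together, a minimal sketch of the adjusted handler could then look like this (same Feeding field order as in the question; if the user field is called feedingUserId, as the old update line suggests, addUserToFeeding could also be written as a record update, f { feedingUserId = uid }):
postFeedingsR :: Handler RepJson
postFeedingsR = do
  Entity uid _ <- requireAuth           -- id of the logged-in user
  parsedFeeding <- parseJsonBody_       -- Feeding parsed from the request body
  let feedingWithUser = addUserToFeeding uid parsedFeeding
  fid <- runDB $ insert feedingWithUser -- store in database
  sendResponseCreated $ FeedingR fid    -- return the id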