Currently, there is no API for creating a MIX channel.
I'm writing a custom module for this purpose.
So far, I have written the following code, but I'm not sure how to proceed further.
I would really appreciate someone's guidance here. Thanks in advance.
-module(mod_custom).
-behaviour(gen_mod).

-include("logger.hrl").

-export([start/2, stop/1, reload/3, mod_options/1,
         get_commands_spec/0, depends/2]).
-export([
    %% Create channel
    add_channel/3
]).

-include("ejabberd_commands.hrl").
-include("ejabberd_sm.hrl").
-include("xmpp.hrl").

start(_Host, _Opts) ->
    ejabberd_commands:register_commands(get_commands_spec()).

stop(Host) ->
    case gen_mod:is_loaded_elsewhere(Host, ?MODULE) of
        false ->
            ejabberd_commands:unregister_commands(get_commands_spec());
        true ->
            ok
    end.

reload(_Host, _NewOpts, _OldOpts) ->
    ok.

depends(_Host, _Opts) ->
    [].

get_commands_spec() ->
    [#ejabberd_commands{name = add_channel, tags = [group],
                        desc = "Create a WhatsApp-like group",
                        module = ?MODULE, function = add_channel,
                        args = [{jid, binary}, {channel, binary}, {id, binary}],
                        args_example = [<<"admin@localhost">>, <<"testgroup123@localhost">>, <<"abc123456">>],
                        args_desc = ["Admin JID", "Channel JID", "Unique ID"],
                        result = {res, rescode}}].

add_channel(_JID, _Channel, _ID) ->
    %% Create channel code goes here...
    ok.

mod_options(_) -> [].
Try something like this:
-module(mod_custom).
-behaviour(gen_mod).

-export([start/2, stop/1, reload/3, mod_options/1,
         get_commands_spec/0, depends/2]).
-export([create_channel/3]).

-include("logger.hrl").
-include("ejabberd_commands.hrl").
-include("ejabberd_sm.hrl").
-include_lib("xmpp/include/xmpp.hrl").

start(_Host, _Opts) ->
    ejabberd_commands:register_commands(get_commands_spec()).

stop(Host) ->
    case gen_mod:is_loaded_elsewhere(Host, ?MODULE) of
        false ->
            ejabberd_commands:unregister_commands(get_commands_spec());
        true ->
            ok
    end.

reload(_Host, _NewOpts, _OldOpts) ->
    ok.

depends(_Host, _Opts) ->
    [].

get_commands_spec() ->
    [#ejabberd_commands{name = create_channel, tags = [group],
                        desc = "Create a WhatsApp-like group",
                        module = ?MODULE, function = create_channel,
                        args = [{from, binary},
                                {channel, binary},
                                {service, binary}],
                        args_example = [<<"admin@localhost">>,
                                        <<"testgroup123">>,
                                        <<"mix.localhost">>],
                        args_desc = ["From JID", "Channel Name", "MIX Service"],
                        result = {res, rescode}}].

create_channel(From, ChannelName, Service) ->
    try xmpp:decode(
          #xmlel{name = <<"iq">>,
                 attrs = [{<<"to">>, Service},
                          {<<"from">>, From},
                          {<<"type">>, <<"set">>},
                          {<<"id">>, p1_rand:get_string()}],
                 children =
                     [#xmlel{name = <<"create">>,
                             attrs = [{<<"channel">>, ChannelName},
                                      {<<"xmlns">>, ?NS_MIX_CORE_0}]}]},
          ?NS_CLIENT, []) of
        #iq{type = set} = Iq ->
            case mod_mix:process_mix_core(Iq) of
                #iq{type = result} ->
                    ok;
                _ ->
                    {error, unexpected_response}
            end
    catch _:{xmpp_codec, Why} ->
            {error, xmpp:format_error(Why)}
    end.

mod_options(_) -> [].
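Once the module is compiled and added to the modules section of ejabberd.yml (and assuming mod_mix is enabled on the host), the command should be callable through ejabberd's usual command frontends, for example:

ejabberdctl create_channel admin@localhost testgroup123 mix.localhost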
I am following this tutorial, which uses Scotty with Persistent to create a simple API.
However, I am trying to create a simple API with Scotty and the mysql-simple library.
Now I am stuck at one point in the code.
In the code below, I am not able to give the getUser function the type ActionT Error ConfigM, which is why my code fails to compile.
Can anyone help me understand how to convert the getUser function to the needed type signature?
Code
type Error = Text

type Action = ActionT Error ConfigM ()

config :: Config
config = Config
    { environment = Development
    , db1Conn = connect connectionInfo
    }

main :: IO ()
main = do
    runApplication config

runApplication :: Config -> IO ()
runApplication c = do
    o <- getOptions (environment c)
    let r m = runReaderT (runConfigM m) c
    scottyOptsT o r application

application :: ScottyT Error ConfigM ()
application = do
    e <- lift (asks environment)
    get "/user" getTasksA

getTasksA :: Action
getTasksA = do
    u <- getUser
    json u

getUser :: IO User
getUser = do
    e <- asks environment
    conn <- db1Conn config
    [user] <- query_ conn "select login as userId, email as userEmail from member limit 1"
    return user
Error
• Couldn't match type ‘IO’ with ‘ActionT Error ConfigM’
Expected type: ActionT Error ConfigM User
Actual type: IO User
• In a stmt of a 'do' block: u <- getUser
In the expression:
do { u <- getUser;
json u }
In an equation for ‘getTasksA’:
getTasksA
= do { u <- getUser;
json u }
You left out plenty of code (imports, pragmas, and the definition of User); please include that next time (see MCVE).
But now to your question:
I would change the Action type to the following:
type Action a = ActionT Error ConfigM a
Then getTasksA has the following type signature:
getTasksA :: Action ()
getTasksA = do
    u <- getUser
    json u
(Alternatively, you can write this as getTasksA = getUser >>= json.)
and getUser becomes:
getUser :: Action User
getUser = do
    e <- asks environment
    conn <- liftIO $ db1Conn config -- db1Conn config is an IO Connection, so it needs liftIO as well
    [user] <- liftIO $ query_ conn "select login as userId, ..."
    return user
A few remarks:
[user] <- liftIO $ query_ .. is a bad idea: if no user is found, this crashes your application. Try to write total functions and pattern matches. Better to return a Maybe User:
getUser :: Action (Maybe User)
getUser = do
    e <- asks environment
    conn <- liftIO $ db1Conn config
    fmap listToMaybe . liftIO $ query_ conn "select login as userId, ..."
If you can wrap your head around it, rather use persistent than writing your SQL queries by hand; hand-written queries are quite error prone, especially when refactoring (just imagine renaming userId to userID).
You ask for the environment several times but then never use it. Compile with -Wall to get a warning about this, or even with -Werror to elevate warnings to compile errors (which is a good idea for production settings).
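Finally, since the User definition was left out of the question, here is a minimal sketch of one that would fit the query above (the field names are guessed from the column aliases userId and userEmail; mysql-simple converts rows via its QueryResults class):

{-# LANGUAGE DeriveGeneric #-}

import Data.Aeson (ToJSON)
import Data.Text (Text)
import Database.MySQL.Simple.QueryResults (QueryResults (..), convertError)
import Database.MySQL.Simple.Result (convert)
import GHC.Generics (Generic)

data User = User
    { userId    :: Text
    , userEmail :: Text
    } deriving (Show, Generic)

-- needed by json u in getTasksA
instance ToJSON User

-- row conversion for mysql-simple: exactly two columns map onto User
instance QueryResults User where
    convertResults [fa, fb] [va, vb] = User (convert fa va) (convert fb vb)
    convertResults fs vs = convertError fs vs 2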
I've succeeded in triggering a simple HTTP request in Elm and decoding the JSON response into an Elm value: https://stackoverflow.com/questions/43139316/decode-nested-variable-length-json-in-elm
The problem I'm facing now:
How do I chain (concurrency preferred) two HTTP requests and merge the JSON into my new (updated) model? Note: please see the updated Commands.elm.
Package used to access remote data: krisajenkins/remotedata (http://package.elm-lang.org/packages/krisajenkins/remotedata/4.3.0/RemoteData)
GitHub repo of my code: https://github.com/areai51/my-india-elm
Previous working code:
Models.elm
type alias Model =
    { leaders : WebData (List Leader)
    }

initialModel : Model
initialModel =
    { leaders = RemoteData.Loading
    }
Main.elm
init : ( Model, Cmd Msg )
init =
    ( initialModel, fetchLeaders )
Commands.elm
fetchLeaders : Cmd Msg
fetchLeaders =
    Http.get fetchLeadersUrl leadersDecoder
        |> RemoteData.sendRequest
        |> Cmd.map Msgs.OnFetchLeaders

fetchLeadersUrl : String
fetchLeadersUrl =
    "https://data.gov.in/node/85987/datastore/export/json"
Msgs.elm
type Msg
    = OnFetchLeaders (WebData (List Leader))
Update.elm
update msg model =
    case msg of
        Msgs.OnFetchLeaders response ->
            ( { model | leaders = response }, Cmd.none )
Updated code (I need help with Commands.elm):
Models.elm
type alias Model =
    { lsLeaders : WebData (List Leader)
    , rsLeaders : WebData (List Leader) -- <-- updated model
    }

initialModel : Model
initialModel =
    { lsLeaders = RemoteData.Loading
    , rsLeaders = RemoteData.Loading
    }
Main.elm
init : ( Model, Cmd Msg )
init =
    ( initialModel, fetchLeaders )
Commands.elm
fetchLeaders : Cmd Msg
fetchLeaders =
    -- <-- How do I call both requests here, and fire separate msgs?
    Http.get fetchLSLeadersUrl lsLeadersDecoder
        -- <-- there will be a different decoder named rsLeadersDecoder
        |> RemoteData.sendRequest
        |> Cmd.map Msgs.OnFetchLSLeaders

fetchLSLeadersUrl : String
fetchLSLeadersUrl =
    "https://data.gov.in/node/85987/datastore/export/json"

-- <-- new data source
fetchRSLeadersUrl : String
fetchRSLeadersUrl =
    "https://data.gov.in/node/982241/datastore/export/json"
Msgs.elm
type Msg
    = OnFetchLSLeaders (WebData (List Leader))
    | OnFetchRSLeaders (WebData (List Leader)) -- <-- new message
Update.elm
update msg model =
    case msg of
        Msgs.OnFetchLSLeaders response ->
            ( { model | lsLeaders = response }, Cmd.none )

        Msgs.OnFetchRSLeaders response ->
            -- <-- new handler
            ( { model | rsLeaders = response }, Cmd.none )
The way to fire off two concurrent requests is to use Cmd.batch:
init : ( Model, Cmd Msg )
init =
    ( initialModel, Cmd.batch [ fetchLSLeaders, fetchRSLeaders ] )
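And you can split your existing fetchLeaders into one command per data source, each firing its own message (a sketch; rsLeadersDecoder is the second decoder you said you would add):

fetchLSLeaders : Cmd Msg
fetchLSLeaders =
    Http.get fetchLSLeadersUrl lsLeadersDecoder
        |> RemoteData.sendRequest
        |> Cmd.map Msgs.OnFetchLSLeaders

fetchRSLeaders : Cmd Msg
fetchRSLeaders =
    Http.get fetchRSLeadersUrl rsLeadersDecoder
        |> RemoteData.sendRequest
        |> Cmd.map Msgs.OnFetchRSLeaders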
There is no guarantee on which request will return first and there is no guarantee that they will both be successful. One could fail while the other succeeds, for example.
You mention that you want to merge the results, but you didn't say how the merge would work, so I'll just assume you want to append the two lists of leaders into one list, and that it will be useful to your application to deal with a single RemoteData value rather than several.
You can merge multiple RemoteData values together with a custom function using map and andMap.
mergeLeaders : WebData (List Leader) -> WebData (List Leader) -> WebData (List Leader)
mergeLeaders a b =
    RemoteData.map List.append a
        |> RemoteData.andMap b
Notice that I'm using List.append there. That can really be any function that takes two lists and merges them however you please.
If you prefer an applicative style of programming, the above could be translated to the following infix version:
import RemoteData.Infix exposing (..)

mergeLeaders2 : WebData (List Leader) -> WebData (List Leader) -> WebData (List Leader)
mergeLeaders2 a b =
    List.append <$> a <*> b
According to the documentation on andMap (which uses a result tuple rather than an appended list in its example):
The final tuple succeeds only if all its children succeeded. It is still Loading if any of its children are still Loading. And if any child fails, the error is the leftmost Failure value.
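With the merged value, downstream code only has to handle one loading state. For example, a view function could look like this (a sketch; viewLeader is a hypothetical renderer for a single Leader, and the usual Html imports are assumed):

viewLeaders : Model -> Html Msg
viewLeaders model =
    case mergeLeaders model.lsLeaders model.rsLeaders of
        RemoteData.NotAsked ->
            text ""

        RemoteData.Loading ->
            text "Loading..."

        RemoteData.Failure _ ->
            text "Failed to load leaders"

        RemoteData.Success leaders ->
            -- viewLeader : Leader -> Html Msg (hypothetical)
            ul [] (List.map viewLeader leaders)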
I have a MySQL server and mysql-proxy, and I am trying to manipulate the results I send to the client as a response to a SELECT query. I have written this code in Lua:
function string.starts(String, Start)
    return string.sub(String, 1, string.len(Start)) == Start
end

function read_query_result(inj)
    local fn = 1
    local fields = inj.resultset.fields
    while fields[fn] do
        fn = fn + 1
    end
    fn = fn - 1
    print("FIELD NUMBER: " .. fn)
    for row in inj.resultset.rows do
        print("--------------")
        for i = 1, fn do
            if string.starts(fields[i].name, "TEST") then
                row[i] = "TESTED"
            end
            print("DATA: " .. fields[i].name .. " -> " .. row[i])
        end
    end
    return proxy.PROXY_SEND_RESULT
end
I can correctly read the field names and values, and I can detect the condition under which I want the result modified, but I cannot get the modified data sent to the client.
I see two problems:
I am setting the value in the local row variable, but I have not found a way to set the real resultset (inj.resultset.rows[i] or something similar).
There is something wrong with return proxy.PROXY_SEND_RESULT: whenever I comment out that statement I see the results, and if I uncomment it I get an error.
I have not found any example code to use as a reference.
OK, solved. Two things were missing:
The data has to be inserted into a new resultset table.
PROXY_SEND_RESULT requires proxy.response.type to be set.
This is the correct module:
function read_query_result(inj)
    local fn = 1
    local fields = inj.resultset.fields
    -- Build a fresh response resultset; with PROXY_SEND_RESULT the proxy sends
    -- proxy.response to the client instead of the server's original result.
    proxy.response.resultset = { fields = {}, rows = {} }
    while fields[fn] do
        table.insert(proxy.response.resultset.fields,
                     { type = proxy.MYSQL_TYPE_STRING, name = fields[fn].name })
        fn = fn + 1
    end
    fn = fn - 1
    for row in inj.resultset.rows do
        for i = 1, fn do
            if string.starts(fields[i].name, "TEST") then
                row[i] = "TESTED"
            end
        end
        table.insert(proxy.response.resultset.rows, row)
    end
    -- PROXY_SEND_RESULT requires the response type to be set.
    proxy.response.type = proxy.MYSQLD_PACKET_OK
    return proxy.PROXY_SEND_RESULT
end
I'm changing my existing Yesod application to run on an SQL backend instead of Mongo. The generated table structure is stricter than with the Mongo backend: foreign key references must be created correctly on insert.
postFeedingsR :: Handler RepJson
postFeedingsR = do
    muser <- maybeAuth
    parsedFeeding <- parseJsonBody_ -- get content as JSON
    let userId = getUserId muser
    let feedingWithUser = Feeding (feedingDate parsedFeeding) (feedingSide parsedFeeding) (feedingTime parsedFeeding) (feedingExcrements parsedFeeding) (feedingRemarks parsedFeeding) userId -- should be linked to user
    fid <- runDB $ insert feedingWithUser -- store in database
    -- runDB $ update fid [FeedingUserId =. userId] -- old Mongo style of linking the feeding to the user
    sendResponseCreated $ FeedingR fid -- return the id
I try to update the entity I get from parseJsonBody_ with the user id from maybeAuth. However, this gives me the following error:
No instance for (aeson-0.6.0.2:Data.Aeson.Types.Class.FromJSON
(FeedingGeneric backend0))
arising from a use of `parseJsonBody_'
Possible fix:
add an instance declaration for
(aeson-0.6.0.2:Data.Aeson.Types.Class.FromJSON
(FeedingGeneric backend0))
In a stmt of a 'do' block: parsedFeeding <- parseJsonBody_
In the expression:
do { muser <- maybeAuth;
parsedFeeding <- parseJsonBody_;
let userId = getUserId muser;
let feedingWithUser
= Feeding
(feedingDate parsedFeeding)
(feedingSide parsedFeeding)
(feedingTime parsedFeeding)
(feedingExcrements parsedFeeding)
(feedingRemarks parsedFeeding)
userId;
.... }
In an equation for `postFeedingsR':
postFeedingsR
= do { muser <- maybeAuth;
parsedFeeding <- parseJsonBody_;
let userId = ...;
.... }
I'm not sure why this happens. Could anyone point me in the right direction to solve this?
Solved by changing the auth line to:
Entity uid u <- requireAuth
and by adding the function:
addUserToFeeding :: UserId -> Feeding -> Feeding
addUserToFeeding uid Feeding { feedingDate = date, feedingSide = side, feedingTime = time, feedingExcrements = ex, feedingRemarks = remarks } =
    Feeding date side time ex remarks uid
to create a new Feeding with associated user. This Feeding can then be stored in the normal way in Yesod:
let feedingWithUser = addUserToFeeding uid parsedFeeding
fid <- runDB $ insert feedingWithUser --store in database
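As an aside, since Feeding is a record, the helper can be written more compactly with record update syntax instead of pattern matching on every field (a sketch, assuming the persistent-generated field is named feedingUserId, as the commented-out update line suggests):

addUserToFeeding :: UserId -> Feeding -> Feeding
addUserToFeeding uid feeding = feeding { feedingUserId = uid } -- assumes the field is feedingUserId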