Email alert to be sent when a Qlik Sense server task fails using Zabbix monitoring software - zabbix

I'm trying to monitor my Qlik Sense server tasks using Zabbix, and in case of any task failure I want to trigger an email citing the failure. I'm currently facing the following issues:
Enabling the SNMP protocol on the Qlik Sense server
After the integration, how the QMC parameters are brought into the Zabbix application
This is my first time working on an integration, so kindly let me know if you have any ideas or suggestions on how to proceed. Thanks in advance.
Have a nice day!

I do not know Zabbix, but you can use the QRS REST API to get the list of tasks with their statuses.
The endpoint is:
/qrs/executionresult/full
or
/qrs/executionresult/{taskid}
The status comes back as an integer whose values are as follows:
0, 'NeverStarted'
1, 'Triggered'
2, 'Running'
3, 'Queued'
4, 'AbortInitiated'
5, 'Aborting'
6, 'Aborted'
7, 'FinishedSuccess'
8, 'FinishedFailed'
9, 'Skipped'
10, 'Retry'
11, 'Error'
12, 'Reset'
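As a minimal sketch of how that status field can be interpreted once you have fetched the JSON (authentication against the QRS API, such as certificates and the Xrfkey header, is omitted, and the sample payload's field names like `taskName` are invented; the real response is richer):

```python
import json

# QRS executionresult status codes mapped to their names
STATUS = {
    0: "NeverStarted", 1: "Triggered", 2: "Running", 3: "Queued",
    4: "AbortInitiated", 5: "Aborting", 6: "Aborted",
    7: "FinishedSuccess", 8: "FinishedFailed", 9: "Skipped",
    10: "Retry", 11: "Error", 12: "Reset",
}

# Statuses worth alerting on (a judgment call; adjust to taste)
FAILURE_CODES = {6, 8, 11}

def failed_tasks(execution_results):
    """Return (task name, status name) for every failed execution result."""
    return [
        (r["taskName"], STATUS[r["status"]])
        for r in execution_results
        if r["status"] in FAILURE_CODES
    ]

# Invented sample of what /qrs/executionresult/full might return
sample = json.loads('[{"taskName": "Reload Sales", "status": 8},'
                    ' {"taskName": "Reload HR", "status": 7}]')
print(failed_tasks(sample))  # → [('Reload Sales', 'FinishedFailed')]
```

A Zabbix item could run a script like this on a schedule and emit the count of failed tasks, with a trigger firing the email when it is non-zero.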
For info, I built such a mechanism directly in a Qlik Sense application using the Qlik REST connector.
The Qlik application calls another REST API when it detects such a problem, and this fires an incident in our system.


Keeping sync between two devices accessing the same account (different/same session)

Kind of curious, as I'm aiming for a stateless setup, how some people go about coding/setting up their session handling when many devices access a single account.
I work with Node.js currently, but pseudocode is appreciated.
This is how my sessions look currently; ID is a unique value. (JSON stored in Redis by KEY)
{
  "cookie": {
    "originalMaxAge": null,
    "expires": null,
    "secure": true,
    "httpOnly": true,
    "domain": "",
    "path": "/",
    "sameSite": "strict"
  },
  "SameSite": "7e5b3108-2939-4b4b-afdc-39ed5dbd00d0",
  "loggedin": 1,
  "validated": 1,
  "username": "Tester12345",
  "displayself": 1,
  "avatar": "{ \"folder\": \"ad566c0b-aeac-4db8-9f54-36529c99ef15/\", \"filetype\": \".png\" }",
  "admin": 0,
  "backgroundcolor": "#ffffff",
  "namebackgroundcolor": "#000000",
  "messagetextcolor": "#5d1414"
}
I have no issues with this setup until a user is logged in on two different devices and one decides to adjust their colors or avatar; one session is up to date and the other is completely stale.
I do my best, where possible, to call out to the database to ensure the information is up to date when it matters most, but I'm curious what I should be doing about this small slip-up. I'd hate to hit the database on every request to get this information, but I suspect most do this anyhow?
I could think up a hundred different ways to go about this, but I was hoping someone who has dealt with it has some good ideas. I'd like to be efficient and not make my database work harder than it needs to, but I know session handling makes a call on each request, so I'm trying to settle on a final approach.
Open to all ideas; my example above is JSON inserted into Redis, but I'm open to changing to MySQL or another store.
One way to notify devices and keep them up to date about changes made elsewhere is with a WebSocket or socket.io connection from each device to the server. When a device logs in as a particular user and then makes a corresponding WebSocket or socket.io connection to the server, the server keeps track of which user that connection belongs to. The connection stays connected for the duration of the user's presence.
Then, if a client changes something (let's use a background color as an example), and tells the server to update its state to that new color, the server can look in its list of connections to see if there are any other connections for this same account. If so, the server sends that other connection a message indicating what change has been made. That client will then receive that notification and can update their view immediately. This whole thing can happen in milliseconds without any polling by the client.
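Transport aside, the bookkeeping the server needs is small. Here is a minimal, transport-agnostic sketch of it (the connection objects and their `send()` method are placeholders for whatever your WebSocket library provides):

```javascript
// Map of accountName -> Set of live connections for that account
const connectionsByAccount = new Map();

// Call when a device authenticates and its socket is established
function register(accountName, connection) {
  if (!connectionsByAccount.has(accountName)) {
    connectionsByAccount.set(accountName, new Set());
  }
  connectionsByAccount.get(accountName).add(connection);
}

// Call when a socket closes
function unregister(accountName, connection) {
  const conns = connectionsByAccount.get(accountName);
  if (conns) {
    conns.delete(connection);
    if (conns.size === 0) connectionsByAccount.delete(accountName);
  }
}

// Notify every OTHER device on the same account about a change
function broadcastChange(accountName, sender, change) {
  const conns = connectionsByAccount.get(accountName) || new Set();
  for (const conn of conns) {
    if (conn !== sender) conn.send(JSON.stringify(change));
  }
}
```

socket.io's rooms do exactly this bookkeeping for you, which is what the snippets below rely on.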
If you aren't familiar with socket.io, it is a layer on top of webSocket that offers some additional features.
In socket.io, you can add each device that connects on behalf of a specific account to a socket.io room that has a unique name derived from the account (often an email address or username). Upon login:
// join this newly connected socket to a room with the name
// of the account it belongs to
socket.join(accountName);
Then, you can easily broadcast to all devices connected to that room with one simple socket.io API call:
// send a message to every device currently connected on this account
// (the 'update' event name here is arbitrary)
io.to(accountName).emit('update', msg);
When socket.io connections are disconnected, they are automatically removed from any rooms that they have been placed in.
A room is a lightweight collection of currently connected sockets so it works quite well for a use like this.

OnSubscriptionError : ServiceResponseException: Unable to retrieve events for this subscription

Using C# .NET 4.6.1 with the Microsoft.Exchange.WebServices NuGet package 2.2.0, connecting to Office 365.
The expected disconnection is set to every 3 minutes.
Error is handled as below:
The OnSubscriptionError event fires with a ServiceResponseException.
"Unable to retrieve events for this subscription.  The subscription must be recreated., The events couldn't be read."
I recreate/renew the subscription as suggested by the error.
I then receive one notification but no more.
I have to restart the C# program/service in order to get StreamingSubscriptions working again.
What is the best practice for recreating/renewing the subscription?
Do you know of any other possible causes? Microsoft themselves have mainly been focused on tweaking the throttling.
Thanks.

mosquitto_publish returns MOSQ_ERR_SUCCESS even though MQTT broker is not running

I connected to an MQTT broker using the Mosquitto C client library.
I used the code below for the connection:
ret = mosquitto_connect(mosq, MQTT_HOSTNAME, MQTT_PORT, 0);
After connecting to the broker, I stopped the broker service.
Then I tried to publish a message using the code below:
ret = mosquitto_publish(mosq, NULL, topic, strlen(text), text, 1, 1);
Even though the broker is not running, the mosquitto_publish API returns success.
When calling the mosquitto_publish API a second time, it returns 14.
Why does mosquitto_publish return success even though the broker is not running? How can I fix this issue?
Thanks in advance.
When used together with mosquitto_loop_start(), the mosquitto_publish() function is entirely asynchronous. All it does is add a new message to the queue and wake up the network thread. If everything was fine the last time the client attempted to communicate with the broker, then it has no way of knowing that the connection is down, so when you call mosquitto_publish() it can only return success, barring any other errors. When the network thread then attempts to send that publish, it discovers that the connection is down, and so any subsequent publishes will return the appropriate error.

Is Windows.Media.Ocr API available on Windows IoT Core with RaspberryPi 2?

I checked many places for the answer but couldn't find one. According to this page, the API should be available. But when I run the code on a Raspberry Pi 2, I get an Unhandled Exception error. To make sure my code is correct, I ran it on my local machine, where it runs perfectly fine.
Windows.Media.Ocr is a universal API, so it is available.
However, on a Raspberry Pi there are no language resources installed on the device.
On such a device, if you try code like the following, ocrEngine will be null:
var ocrEngine = OcrEngine.TryCreateFromLanguage(new Language("en"));
You can check for available language recognizers with OcrEngine.AvailableRecognizerLanguages property.

SQL Server 2008 table change (insert/update/delete) notification push on broker

I have a rather complex and large database with about 3000+ objects (tables/triggers/SPs combined). I inherited this DB, and restructuring it is probably 3-4 years away.
Meanwhile, I need to implement a pub/sub feature for any insert/update/delete on these tables. Given the number of tables and existing queries, Query Notifications (and SqlDependency) will probably not work. What I am looking for is a way to push the changes (what changed in a table - e.g. the record's PK and the table name) onto the Service Broker, so I can use the External Activator to retrieve the change and then run my custom pub/sub from that point onwards.
I have pretty much all the ducks lined up, except for the way to push the change notification onto the Service Broker.
Any help/pointers are appreciated.
Thanks.
N M
PS. I did look around for similar postings and came across a few; however, the MSDN articles they referred to all seem to have been removed - not sure what's going on on the MSDN site.
For the External Activator, look at the Microsoft SQL Server 2008 Feature Pack - "Microsoft SQL Server 2008 R2 Service Broker External Activator".
For the console application (the one that processes the messages), a good idea is to have a look on CodePlex. There are good examples there.
To set up the event notification (the notification that will be used by the External Activator service), the code looks something like this:
Create Queue ExternalActivatorQueue;
Create Service ExternalActivatorService On Queue ExternalActivatorQueue
([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);
Create Event Notification NotifyExternalActivator
On Queue dbo.ProcessQueue
For QUEUE_ACTIVATION
To Service 'ExternalActivatorService', 'current database';
To send a message to the queue:
Declare @h UniqueIdentifier;
Declare @x xml = '<tag/>';
Begin Dialog Conversation @h
From Service MyTableService
To Service 'ProcessService'
With Encryption = Off;
Send On Conversation @h (@x);
All the steps I took to make it work are written up here, but only in Latvian :). It actually covers almost exactly what you need (a trigger that sends messages when data is inserted into a table).
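As a rough sketch of such a trigger (the table name, PK column, and the service names here are placeholders; adapt them to your schema and Service Broker setup):

```sql
-- Hypothetical trigger: queue a change notification whenever rows
-- are inserted into dbo.MyTable (all names are placeholders)
Create Trigger trgMyTableInsert On dbo.MyTable
After Insert
As
Begin
    Declare @h UniqueIdentifier;
    Declare @x xml;

    -- Capture the table name and the PKs of the inserted rows
    Set @x = (Select 'MyTable' As TableName, Id
              From inserted
              For XML Path('Row'), Root('Changes'));

    Begin Dialog Conversation @h
        From Service MyTableService
        To Service 'ProcessService'
        With Encryption = Off;

    Send On Conversation @h (@x);
End
```

Analogous triggers for update and delete would read from the `deleted` pseudo-table as well; the External Activator then picks the messages off the queue on the other side.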