I have been trying to create a stream with Spring Cloud Dataflow but have not had much luck (mostly due to the lack of documentation).
Issue 1: Accessing web GUI of dockerized Spring Cloud Dataflow
I have a dockerized Spring Cloud server running with Kafka on a basic Ubuntu container. For some reason I can't access the web GUI from Windows (at < docker-machine ip >:9393/dashboard). However, I have a separate Docker Ubuntu container running an Nginx reverse proxy, which shows up when I go to < docker-machine ip >/index.html, etc. I don't think it is an issue with ports; I started the Spring Cloud container with -p 9393:9393, and the port is otherwise unused.
Issue 2: Routing by JSON Header
My ultimate goal is to have a file loaded in from Nginx, routed based on its JSON header (there are two different JSON headers), and then ingested into Cassandra with the appropriate ingest query.
I can do all of this except the routing by JSON header. Which app would you recommend I use?
Issue 1: Accessing web GUI of dockerized Spring Cloud Dataflow
We might need a little more detail around this. Assuming this is the local server, perhaps you could share the Docker scripts/image so we could try it out.
Issue 2: Routing by JSON Header
The router-sink application would come in handy for this type of use case. This application routes the payload to named destinations based on certain conditions, so you'd have the opportunity to route the payload with the respective ingest query to Cassandra.
stream 1:
stream create test --definition "file | router --expression=header.contains('a')?':foo':':bar'"
stream 2:
stream create baz --definition ":foo > cassandra --ingest-query=\"query1\""
stream 3:
stream create wiz --definition ":bar > cassandra --ingest-query=\"query2\""
(where :foo and :bar are named destinations)
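If the routing decision depends on a specific field in the message headers rather than a substring check, the router's --expression accepts arbitrary SpEL against the headers. A sketch, where the header name type and its value 'a' are assumptions:
stream create test --definition "file | router --expression=headers['type']=='a'?':foo':':bar'"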
Related
I am new to AWS and databases.
Since I am a complete beginner at this, any suggestions will be appreciated.
Currently the plan for the project is that data from the AWS database will be pushed to an external MySQL database using SNS HTTP fanout.
NOTE:
1. The data will be pushed by the client using AWS SNS.
2. We have no access to the AWS account, nor are we planning to have an AWS account.
3. The external MySQL database is a private database running on a Linux server.
I have gone through the official documentation of AWS SNS and also some websites. This is all I found:
1. Use external applications like Zapier to map the data.
2. Develop some application to map the data.
Is it like using a Servlet application on the receiver side to update the table, or are there other methods?
AWS DB -----> SNS -----> _________ -----> External MySql DB
Thanks
If you cannot have an AWS Account, you can have your own web server consume the SNS Messages. SNS can deliver messages to an HTTP/HTTPS endpoint in a predefined structure. Read more details here. You can enable such an endpoint on your own server and share your server URL with the AWS Account owner. They can create a subscription from their SNS topic to your endpoint.
For setting up this endpoint, there are many options. ExpressJS is one such popular framework to quickly implement HTTP APIs.
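A minimal sketch of such an endpoint with ExpressJS (assuming Express 4.17+ for express.text; the /sns path and handleMessage are placeholders; SNS posts its JSON with a text/plain content type, hence the raw-text parsing):

const express = require('express');
const app = express();

// SNS delivers JSON with Content-Type text/plain, so accept any type as text
app.use(express.text({ type: '*/*' }));

// Placeholder: replace with your MySQL insert logic
function handleMessage(message) {
  console.log('received:', message);
}

app.post('/sns', (req, res) => {
  const body = JSON.parse(req.body);
  const type = req.headers['x-amz-sns-message-type'];
  if (type === 'SubscriptionConfirmation') {
    // Confirm the subscription once by issuing a GET to body.SubscribeURL
  } else if (type === 'Notification') {
    // body.Message holds the published payload; write it to MySQL here
    handleMessage(body.Message);
  }
  res.sendStatus(200);
});

app.listen(8080);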
Probably option two would be more suitable, or at least the first to be considered. For that option you would have to develop a Lambda function that receives data from SNS, re-formats it if needed, and uploads it to MySQL. So your architecture would look like:
Data--->SNS--->Lambda function---> MySQL
Depending on the amount of data coming into SNS, you may also add SQS queues to the mix, to buffer the records and enable a fan-out architecture. For example:
             /---> SQS queue 1 ---> Lambda function 1 ---> MySQL
Data ---> SNS
             \---> SQS queue 2 ---> Lambda function 2 / EC2 instance / container ---> other destination
Other solutions are possible, but I would consider the above before looking into other ways.
I am developing a Fiware application and I am using many Fiware GEs (Wirecloud, IoT Agent, Orion, Cygnus, MongoDB, MySQL) that are integrated locally on my Linux PC using Docker.
I managed to get Orion to receive measurements from a temperature sensor and store them in a MySQL database through Cygnus.
Now I would like to create a history graph in Wirecloud using those measurements. I tried to use a History Module to Linear Graph operator that intermediates between an NGSI source operator and a Linear Graph widget, but I don't know what URL to use for the HistoryMod Server URL.
I've tried to open the user manual for the History Module operator, but the link is broken so I cannot read it.
I am posting some images with the wiring, HistoryModule settings, NGSI source settings and the Linear Graph error that I am receiving, for better understanding.
My questions are the following:
What URL should I use for the HistoryMod Server URL?
Am I wiring the components correctly or am I missing something?
I don't know the History Module operator, but I think your wiring is correct: a data source, an operator that requests it, and a widget to show the results.
In your HistoryModule settings, try changing your URL to: http://130.206.82.141:8666/STH/v1/contextEntities/
Did you try swapping your Linear Graph widget for a table? (To test whether the problem is in the operator rather than in your widget.)
How did you manage to connect to the database (MySQL)? I have almost the same configuration, using Postgres instead, on a remote server managed via docker-compose.
I can configure everything from the docker-compose file, but I don't know how to get Postgres running so as to create the database to use.
I'm writing an application which delivers data from remote devices over an HTTP API. These devices are on a mobile data connection and have limited resources.
I wish to receive custom monitoring data over the HTTP API, relying on the security model designed in the application, and push that data to Zabbix directly (or indirectly) from node.js. I do not wish to use Zabbix Agent on the remote devices.
I see that I can use zabbix_sender to send data to a Zabbix server containing a pre-configured host. This works great. I intend to deliver monitoring data over my custom API and, when it is received, hand it to zabbix_sender inside the server network.
The problem is there are many devices in the field and more are being added all the time.
TL;DR:
When zabbix_sender provides a custom hostname which doesn't exist in Zabbix already, it fails.
I would like to auto-add discovered hosts, based upon new hostnames from zabbix_sender. How would I do this?
Also, extra respect if anyone can give examples of how to avoid zabbix_sender and send data directly from node.js to the Zabbix server. I mean: suggest an NPM package that you have experience using. (Update: Found working node.js package here: https://www.npmjs.com/package/node-zabbix-sender)
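For reference, a minimal sketch of how that package is used (assuming the ZabbixSender/addItem/send API described in its README; the server address, host name, and item key are placeholders):

const ZabbixSender = require('node-zabbix-sender');
const sender = new ZabbixSender({ host: 'zabbix.example.com' });

// Queue a value for an item on a pre-configured host, then push everything
sender.addItem('device-001', 'custom.metric', 42);
sender.send((err, res) => {
  if (err) throw err;
  console.log(res);
});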
Zabbix configuration: I'm learning with Zabbix 2.4 installed in Docker, with no custom configuration, from this Docker Hub image: https://hub.docker.com/r/zabbix/zabbix-2.4/
Probably the best would be to use the Zabbix API to create hosts directly.
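The call in question is host.create on the JSON-RPC endpoint; a sketch (the group ID, interface details, and host name are placeholder values, and the auth token comes from a prior user.login call):

POST http://<zabbix-server>/api_jsonrpc.php
{
  "jsonrpc": "2.0",
  "method": "host.create",
  "params": {
    "host": "device-001",
    "interfaces": [{"type": 1, "main": 1, "useip": 1, "ip": "127.0.0.1", "dns": "", "port": "10050"}],
    "groups": [{"groupid": "2"}]
  },
  "auth": "<token from user.login>",
  "id": 1
}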
Alternatively, you could set up an action and emulate an active agent connection, which would make Zabbix create the host via active agent auto-registration.
You could also use low level discovery (LLD) to send in JSON, which would result in hosts/items being created, based on prototypes.
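The discovery payload is plain JSON listing the discovered entities, sent to a discovery rule's trapper key; a sketch via zabbix_sender (the key my.discovery and the {#DEVID} macro are hypothetical names from such a rule):

zabbix_sender -z <zabbix-server> -s "master-host" -k my.discovery -o '{"data":[{"{#DEVID}":"device-001"}]}'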
In all of these cases you have to wait for one minute (by default) for the hosts to appear in the Zabbix cache, then you can send the data.
Also note that Zabbix 2.4 is not supported anymore and will receive no fixes; it is not a "long-term support" release.
I'm working on an API app on Azure, deploying an API written in Node.js which stores data in MongoDB. These are the steps:
Make a new API for web & mobile apps from the Azure portal.
Choose MongoDB by adding the MongoLab module from Azure.
Create a table (collection) and populate it with a few entries.
Prepare our Git repository from the Azure portal and link it to our local Git on our computers.
Decide which Node.js modules to use to set the dependencies.
Edit the configuration file for Node.js.
Make the main API file with the following functionalities:
Connection to the database.
Running the service.
Making CRUD operation services (CREATE, READ, DELETE, ...).
Testing our API on a browser.
Making an application using our API.
My question: how do I adapt these steps to store the data in a MySQL database (Azure)?
I cannot fully understand your requirement. Do you want to use a MySQL database to store the data of your Node.js API application? If not, please clarify your purpose.
To connect to MySQL from Node.js, you can use a third-party MySQL client module.
For example:
node-mysql - a pure Node.js JavaScript client implementing the MySQL protocol
or
Sequelize - A promise-based ORM for Node.js and io.js
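For instance, a minimal connect-and-query sketch with node-mysql (the host, credentials, and entries table are placeholders; for Azure you would use the connection values of your MySQL instance):

const mysql = require('mysql');

const connection = mysql.createConnection({
  host: 'your-server.mysql.example.com',
  user: 'user',
  password: 'password',
  database: 'mydb'
});

connection.connect();

// Simple read; INSERT/UPDATE/DELETE work the same way via query()
connection.query('SELECT * FROM entries', (err, rows) => {
  if (err) throw err;
  console.log(rows);
});

connection.end();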
If you have any further concerns, please feel free to let me know.
Can an Arduino Yún connect to MySQL on an external server and store sensor data on it? If yes, how?
Technically you can.
You could write or port a MySQL driver to the Arduino, but the small amount of memory and processing power will make that not very easy.
With the Yún you could also install some libraries and write an app running on the Linux side that forwards incoming data from the serial port to your MySQL database.
Another option:
Use an intermediary app to save incoming data to your database. The interface could be a typical HTTP API or a publish/subscribe broker; see the sketch below.
For simplicity I would recommend going for option three.
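A minimal sketch of such an intermediary in Node.js (Express plus the mysql module; the /readings route, table, and field names are made up, and the Yún would POST e.g. {"device":"yun-1","temp":21.5} from its Linux side):

const express = require('express');
const mysql = require('mysql');

const app = express();
app.use(express.json());

// Placeholder credentials; the connection is opened lazily on first query
const db = mysql.createConnection({
  host: 'localhost', user: 'user', password: 'password', database: 'sensors'
});

// Receive a sensor reading and persist it
app.post('/readings', (req, res) => {
  db.query('INSERT INTO readings (device, temp) VALUES (?, ?)',
    [req.body.device, req.body.temp],
    (err) => err ? res.sendStatus(500) : res.sendStatus(201));
});

app.listen(3000);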