Connecting a database with ThingsBoard - MySQL

Will you please help me with one more important thing? I need to store my dashboard's data in a database. According to my study, ThingsBoard supports three databases at the moment: NoSQL, MySQL, and hybrid (PostgreSQL + Cassandra). I have researched a lot but could not find any way to send my telemetry data to any database. I know it is possible because the ThingsBoard docs themselves say so, but how do I do that?

I need to store my project's data in a database, just like in AWS we store IoT Core's data in DynamoDB or in IoT Analytics. ThingsBoard does not seem to provide any database-related node in its rule engine, so how do I make a rule chain to transfer my project's data to a database server? I have installed pgAdmin 4 to view the database graphically, but found nothing useful.

The documentation and Stack Overflow folks say to configure the thingsboard.yml file, located in a monolithic installation on Linux at /etc/thingsboard/conf/, which has Cassandra, MySQL, and PostgreSQL configuration, but how do I configure it properly? I tried to access the default PostgreSQL database named thingsboard that I created at installation time, but when I list the contents of the database it only shows the default ThingsBoard relations. If I create a device on ThingsBoard, it does not show up in the database. Why? I could really use your help. Please show me a way to connect my ThingsBoard with a database.
See my attached images: everything there is default, nothing that I created on ThingsBoard.

That's wrong; ThingsBoard currently supports three database setups: PostgreSQL only, hybrid PostgreSQL + Cassandra (Cassandra for telemetry only), and PostgreSQL + TimescaleDB. So there is no MySQL database used anywhere.
https://thingsboard.io/docs/user-guide/install/ubuntu/#step-3-configure-thingsboard-database
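For reference, in a monolithic install the database is configured through environment variables in /etc/thingsboard/conf/thingsboard.conf. A minimal PostgreSQL-only example, assuming the default database name and a local server (check the linked guide for your ThingsBoard version, since variable names can change between releases):

    # /etc/thingsboard/conf/thingsboard.conf -- PostgreSQL-only setup
    export DATABASE_TS_TYPE=sql
    export SPRING_DATASOURCE_URL=jdbc:postgresql://localhost:5432/thingsboard
    export SPRING_DATASOURCE_USERNAME=postgres
    export SPRING_DATASOURCE_PASSWORD=postgres  # replace with your real password

For the hybrid setup you would set DATABASE_TS_TYPE=cassandra instead and add the Cassandra connection settings from the guide.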
Find guides for connecting your devices to ThingsBoard here, e.g. via MQTT:
https://thingsboard.io/docs/reference/mqtt-api/
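As a minimal illustration of the MQTT API described there, a device publishes JSON telemetry to the v1/devices/me/telemetry topic and authenticates with its device access token as the MQTT username. A sketch in Python, where the host and token are placeholders:

    # Publish one telemetry sample to ThingsBoard over MQTT.
    # pip install "paho-mqtt>=2.0"; host and access token are placeholders.
    import json
    import paho.mqtt.client as mqtt

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
    client.username_pw_set("YOUR_DEVICE_ACCESS_TOKEN")  # token from the device details
    client.connect("thingsboard.example.com", 1883)     # your ThingsBoard host
    client.loop_start()
    # Each JSON key becomes a telemetry key on the device.
    info = client.publish("v1/devices/me/telemetry",
                          json.dumps({"temperature": 21.5}), qos=1)
    info.wait_for_publish()
    client.loop_stop()
    client.disconnect()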
If you would like to forward the telemetry stored in ThingsBoard to other databases, this is not possible directly with rule chains (there is only one node for saving data, and it writes to a Cassandra table).
One way to achieve this would be fetching the data with an external microservice/program via the HTTP API and persisting it in the database of your choice. You could use a Python script, for example:
https://thingsboard.io/docs/reference/rest-api/
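A minimal sketch of that approach, assuming a local ThingsBoard instance, tenant credentials, and a device ID copied from the device details in the UI; the endpoints are from the REST API docs above, and SQLite just stands in for the database of your choice:

    # Fetch telemetry via the ThingsBoard REST API and persist it elsewhere.
    # pip install requests; URL, credentials, device ID and key are placeholders.
    import sqlite3
    import requests

    TB = "http://localhost:8080"
    DEVICE_ID = "your-device-uuid"   # copy from the device details in the UI
    KEYS = "temperature"

    # 1. Log in to obtain a JWT token.
    token = requests.post(f"{TB}/api/auth/login",
                          json={"username": "tenant@thingsboard.org",
                                "password": "tenant"}).json()["token"]

    # 2. Fetch a time range of telemetry for the device
    #    (without "limit", ThingsBoard returns only a small default number of points).
    resp = requests.get(
        f"{TB}/api/plugins/telemetry/DEVICE/{DEVICE_ID}/values/timeseries",
        params={"keys": KEYS, "startTs": 0, "endTs": 2_000_000_000_000,
                "limit": 1000},
        headers={"X-Authorization": f"Bearer {token}"})
    resp.raise_for_status()

    # 3. Persist into the database of your choice (SQLite as a stand-in).
    db = sqlite3.connect("telemetry.db")
    db.execute("CREATE TABLE IF NOT EXISTS telemetry (key TEXT, ts INTEGER, value TEXT)")
    for key, points in resp.json().items():
        db.executemany("INSERT INTO telemetry VALUES (?, ?, ?)",
                       [(key, p["ts"], p["value"]) for p in points])
    db.commit()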
Alternatively, you can send the data from a rule chain to a message queue like Kafka instead of fetching it via the HTTP API. This still requires an additional tool that consumes the queue and stores the data in the external database outside ThingsBoard.
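That additional tool can be as small as a consumer that reads the topic your rule chain's Kafka node produces to and writes each record to your database. A sketch, where the broker address and topic name are assumptions:

    # Consume telemetry forwarded to Kafka by a rule chain and store it.
    # pip install kafka-python; broker and topic names are placeholders.
    import sqlite3
    from kafka import KafkaConsumer

    consumer = KafkaConsumer("tb-telemetry",           # topic set in the Kafka rule node
                             bootstrap_servers="localhost:9092",
                             auto_offset_reset="earliest",  # read from the beginning
                             value_deserializer=lambda v: v.decode("utf-8"))

    db = sqlite3.connect("telemetry.db")
    db.execute("CREATE TABLE IF NOT EXISTS messages (payload TEXT)")
    for record in consumer:                            # blocks, handling messages as they arrive
        db.execute("INSERT INTO messages VALUES (?)", (record.value,))
        db.commit()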

Related

What is the most efficient way to export data from Azure MySQL?

I have searched high and low, but it seems mysqldump and SELECT ... INTO OUTFILE are both intentionally blocked by not allowing file permissions to the db admin. Wouldn't it save a lot more server resources to allow file permissions than to disallow them? Every other import/export method I can find executes much more slowly, especially with tables that have millions of rows. Does anyone know a better way? I find it hard to believe Azure left no good way to do this common task.
You did not list the other options you found to be slow, but have you thought about using Azure Data Factory:
Use Data Factory, a cloud data integration service, to compose data storage, movement, and processing services into automated data pipelines.
It supports exporting data from Azure MySQL and MySQL:
You can copy data from MySQL database to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see Supported data stores and formats
Azure Data Factory allows you to define mappings (optional!) and/or transform the data as needed. It has a pay-per-use pricing model.
You can start an export manually or on a schedule using the .NET or Python SDK, the REST API, or PowerShell.
It seems you are looking to export the data to a file, so Azure Blob Storage or Azure Files are likely to be good destinations. FTP or the local file system are also possible.
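For example, starting an existing pipeline from the Python SDK looks roughly like this (the resource group, factory, and pipeline names are made up; the pipeline itself, with the MySQL source and the Blob/file sink, is defined separately in Data Factory):

    # Trigger an existing Data Factory pipeline run from Python.
    # pip install azure-identity azure-mgmt-datafactory; all names are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
    run = client.pipelines.create_run("my-resource-group", "my-data-factory",
                                      "export-mysql-to-blob")
    print("started run:", run.run_id)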
"SELECT INTO ... OUTFILE" we can achieve this using mysqlworkbench
1.Select the table
2.Table Data export wizard
3.export the data in the form of csv or Json
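The same client-side export the wizard performs can also be scripted, which may work better for tables with millions of rows. A sketch using PyMySQL that streams rows to CSV in batches (host, credentials, table name, and the CA file path are placeholders; Azure Database for MySQL usually enforces SSL):

    # Stream a large table from Azure MySQL to a local CSV file.
    # pip install pymysql; connection details below are placeholders.
    import csv
    import pymysql

    conn = pymysql.connect(host="myserver.mysql.database.azure.com",
                           user="myadmin", password="...", database="mydb",
                           ssl={"ca": "azure-mysql-ca.pem"})  # CA cert from Azure
    with conn.cursor(pymysql.cursors.SSCursor) as cur, \
         open("export.csv", "w", newline="") as f:
        cur.execute("SELECT * FROM big_table")        # SSCursor streams, not buffers
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # header row
        while True:
            rows = cur.fetchmany(10_000)              # batch to keep memory flat
            if not rows:
                break
            writer.writerows(rows)
    conn.close()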

Azure: How best to connect SQL Database with MySQL/web service

I am new to MS Azure, coming from a SQL Server background, and now we are facing some design/architecture questions and I am somewhat lost.
On the one hand, there is a data warehouse and a small SQL Database in Azure, and they store all the structured and unstructured incoming data. Works fine!
Now we are thinking of moving the MySQL database for the first version of the website to Azure (we need to stay on MySQL for the web service). In version two of the website, we would like to integrate some of the data from the SQL Database and the data warehouse, so it sounds good to have everything in one place.
As much as possible of the structured data we would like to store in the SQL Database and not in MySQL; MySQL should stay lightweight. But what would be the best way to create some interaction between the web service, MySQL, and the SQL Database?
Our web designer asked for some APIs, and as the users should be able to change some settings in their accounts, we would need a lot of get and set APIs. Those APIs would just handle traffic within Azure; no external access is needed. I just discovered the option of external tables in Azure MySQL but can't find use cases or best practices for it.
I am looking for a solution in which I can deliver the necessary data to the web designer / frontend so that they do not need to work with any databases (as they do not like to do that).
The traffic between MySQL and the SQL Database will be low (our stream goes into the data warehouse, some analysis happens there, and we save the results as structured data in the SQL Database), and up to now we do not need a connection between the data warehouse and MySQL/web service.
Any suggestions? How would you design such a connection?
Using Azure Data Factory you can maintain both databases and transfer data (called a Copy activity in Data Factory) from the SQL DB to the Azure MySQL DB.

Fill Core Data with a large SQL database

I have a large 180k-row SQL (MySQL) database that I want to use in Core Data. Can I create the SQLite database using Xcode, then use an SQLite client app to connect to that database and fill it using my MySQL data?
Or is there a better way to efficiently import a large data set to a CoreData store?
It will only be filled once and the data should reside on-device.
The reason I want to do this is because I am building an iOS app that needs to read from a persistent store containing most words in the English language. Along with the word, each row will contain a few other things. The app will never need to write to the database, just read from it, but it will need to read from it very quickly.
From Apple's docs it appears this is not recommended (or maybe impossible): "do not manipulate an existing Core Data-created SQLite store using the native SQLite API"
Update:
Another option that I am currently working on is to export the MySQL database to json using phpmyadmin (or another tool). Then load that json file in to the project. When the app loads (hopefully just the first time it is used), push the data from the json file in to Core Data.
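Something like this is what I have in mind for the export step, in case phpMyAdmin turns out to be awkward at this size (connection details and column names are just examples):

    # Dump a MySQL table to a JSON file for one-time import into Core Data.
    # pip install pymysql; connection details and column names are placeholders.
    import json
    import pymysql

    conn = pymysql.connect(host="localhost", user="root", password="...",
                           database="words_db",
                           cursorclass=pymysql.cursors.DictCursor)
    with conn.cursor() as cur:
        cur.execute("SELECT word, definition, frequency FROM words")
        rows = cur.fetchall()          # 180k rows fit comfortably in memory
    with open("words.json", "w") as f:
        json.dump(rows, f)             # list of {"word": ..., ...} objects
    conn.close()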
You could reverse-engineer Core Data and produce a Core Data SQLite file directly if you really wanted to, but as you quoted from Apple's docs, this is not a good idea.
It would be easier to simply write a little macOS command-line tool which includes the same Core Data data model as your iOS app. This tool would read your MySQL database and write it to a Core Data SQLite file, which you would then ship with your iOS app.

Firebase Automatic Sync with Local PC

I'm working on a project that takes data from a Weintek HMI, puts it on a web server, and then sends it to an application that I created in Android Studio.
I've found Firebase, which can help me with this task.
In EasyBuilder, which works with my HMI, I can create a MySQL database that can store the data.
The problem is how to automatically update the Firebase database from the MySQL database at a set interval, so the data can be accessed in the Android app.
If there is no solution with MySQL, can someone suggest another method to extract the data and use some web server to sync it with the Android app?
I don't know your specific needs, in terms of data volume or application, but as a workaround, maybe this can help you:
I usually use MQTT, which many Weintek HMIs support, to send telemetry data, and then use Node-RED to process and redirect the data to a database, email, SMS, Telegram, CSV, TXT... depending on the need, which in your case could be Firebase (I have never used it).
It works great for me, as I don't have to worry about HMI limitations.
The catch is the reliability of the data, in terms of confirming that when the HMI sends, the server listens and writes, but there are certainly ways to deal with this; you also need to have a server with Node-RED running.
If you have never done so: in Weintek HMIs you can easily send the MQTT payload cyclically using macros.
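If you ever want to prototype or replace the Node-RED step, the listen-and-write side can be a very small script. A sketch with paho-mqtt, where the broker address and topic are whatever your HMI macro publishes to, and SQLite stands in for your real database:

    # Subscribe to the HMI's MQTT topic and persist every payload.
    # pip install "paho-mqtt>=2.0"; broker and topic are placeholders.
    import sqlite3
    import paho.mqtt.client as mqtt

    db = sqlite3.connect("telemetry.db")
    db.execute("CREATE TABLE IF NOT EXISTS telemetry (topic TEXT, payload TEXT)")

    def on_message(client, userdata, msg):
        # Store raw payloads; add parsing/validation for your format.
        db.execute("INSERT INTO telemetry VALUES (?, ?)",
                   (msg.topic, msg.payload.decode()))
        db.commit()

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
    client.on_message = on_message
    client.connect("broker.example.com", 1883)  # your MQTT broker
    client.subscribe("hmi/telemetry/#")         # topic used by the HMI macro
    client.loop_forever()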

Connecting a MySQL database to an iOS application

Basically, I need to connect a MySQL database to an iOS application and save a local copy to the device, but I'm confused about which path I should take to do this.
Here is a basic description of the application:
The application is used to replace multiple paper-based forms, allowing the user to complete a desired form on an iPad. Once the user has completed the form, the form's data is uploaded to a server.
Some forms have fields where the user is required to select an option (drop-down list). These options need to be pulled from a database because they will be changed regularly.
The application still needs to work if there is no internet connection!
This means that whenever there is a connection, the application needs to save a copy of the current database so that any information required to fill out forms is still available even when there is no connection.
In short, my question is: what is my best option for saving a local copy of a database (or just a few tables) in an iOS application?
You should look into Core Data. If you're trying to keep an updated copy of a couple of tables, I would create a Core Data database that contains the information you need for your app and, every time the user opens your app, check whether there's an internet connection. If there is, use NSURLSession to download the necessary data from the web server, then compare the downloaded data to what is in your Core Data database. If there are any discrepancies between the two, update your Core Data database as needed. This way you will always have a relatively up-to-date copy of your MySQL database.
This is a good tutorial for getting a feel for NSURLSession in case you haven't used it much.
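If it helps to see the pattern outside of Swift, here is a rough, platform-neutral sketch of the same download-compare-update loop (the endpoint and JSON shape are made up); on iOS the same steps map to NSURLSession plus Core Data fetches and inserts:

    # Platform-neutral sketch of the sync pattern: download, compare, upsert.
    # pip install requests; the endpoint and JSON shape are made up.
    import sqlite3
    import requests

    db = sqlite3.connect("local_copy.db")
    db.execute("CREATE TABLE IF NOT EXISTS options (id INTEGER PRIMARY KEY, label TEXT)")

    rows = requests.get("https://example.com/api/form-options").json()
    for row in rows:                       # e.g. [{"id": 1, "label": "Option A"}, ...]
        # INSERT OR REPLACE updates changed rows and adds new ones.
        db.execute("INSERT OR REPLACE INTO options VALUES (?, ?)",
                   (row["id"], row["label"]))
    db.commit()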
Hope it helps!