Sync SQLite / Core Data with MySQL Database

I want to sync data between a MySQL web server and a mobile Core Data database on the iPhone. On my last project I wrote PHP files that created XML files from the content of the MySQL data, and the iOS project parsed the XML files to sync the data. To transmit data from the iPhone to the MySQL server I wrote a second PHP file, which connected to the MySQL database and executed the statement.
Do you think this is a good way to sync data between the iOS application and the MySQL web server? Do you have any ideas to make it better?

This seems to be a good plan. If you follow the plist specification or use JSON, you can convert the data even more easily (without NSXMLParser) into your custom classes and insert them into the Core Data store.
Make sure to devise a scheme where you only have to send/receive incremental changes.
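One way to sketch that incremental scheme: give each row a last-modified timestamp, have the client send its last sync time, and have the server return only newer rows as JSON. The `updated_at` column and the record shapes below are hypothetical, just to illustrate the delta idea; the real endpoint would be your PHP script.

```python
import json
from datetime import datetime

def changes_since(rows, last_sync):
    """Return only the rows modified after the client's last sync.

    `rows` is a list of dicts, each carrying an ISO-8601 'updated_at'
    field (a hypothetical column your MySQL table would need for this).
    """
    cutoff = datetime.fromisoformat(last_sync)
    return [r for r in rows if datetime.fromisoformat(r["updated_at"]) > cutoff]

# The server serializes only the delta, not the full table.
rows = [
    {"id": 1, "name": "old", "updated_at": "2023-01-01T00:00:00"},
    {"id": 2, "name": "new", "updated_at": "2023-06-01T12:00:00"},
]
delta = changes_since(rows, "2023-03-01T00:00:00")
payload = json.dumps(delta)  # what the endpoint would emit instead of the full XML
```

The same timestamp comparison works in reverse for pushing local changes up to the server.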

Related

Connecting a database with Thingsboard

Will you please help me with one more important thing? I need to store my dashboard's data in a database. According to my reading, ThingsBoard supports three database setups at the moment: NoSQL, MySQL, and hybrid (PostgreSQL + Cassandra). I have researched a lot but could not find any way to send my telemetry data to any database. I know it is possible because the ThingsBoard docs themselves say so, but how do I do it?

I checked the PostgreSQL database that ThingsBoard created during installation, but only the default relations are present. I need to store my project's data in a database, just like in AWS we store IoT Core's data in DynamoDB or in IoT Analytics. ThingsBoard does not seem to provide any database-related node in its rule engine, so how do I build a rule chain to transfer my project's data to a database server? I installed pgAdmin 4 to inspect the database graphically, but found nothing useful.

The documentation and Stack Overflow answers say to configure the thingsboard.yml file (located at /etc/thingsboard/conf/thingsboard.conf on a monolithic Linux installation), which has Cassandra, MySQL, and PostgreSQL sections, but how do I configure it properly? I tried to look at the default PostgreSQL database named thingsboard that was created at install time, but listing its contents only shows ThingsBoard's default relations. If I create a device in ThingsBoard, why does it not show up in the database? I could really use your help. Please show me a way to connect my ThingsBoard to a database.
As you can see in my attached images, everything there is default; nothing I created in ThingsBoard appears.
That's not quite right: ThingsBoard currently supports three database setups: PostgreSQL only, hybrid PostgreSQL + Cassandra (telemetry only), and PostgreSQL + TimescaleDB. MySQL is not used anywhere.
https://thingsboard.io/docs/user-guide/install/ubuntu/#step-3-configure-thingsboard-database
You can find guides for connecting your devices to ThingsBoard here, e.g. via MQTT:
https://thingsboard.io/docs/reference/mqtt-api/
If you would like to forward the telemetry stored by ThingsBoard to different databases, this is not possible directly with rule chains (there is only one node for storing data, which writes to a Cassandra table).
One way to achieve this would be to fetch the data with an external microservice or program via the HTTP API and persist it in the database of your choice. You could use a Python script, for example.
https://thingsboard.io/docs/reference/rest-api/
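As a rough sketch of that approach, the timeseries endpoint documented in the REST API reference returns JSON keyed by telemetry name, with a list of timestamped points per key. The response shape assumed below should be verified against your own instance; the flattened `(key, ts, value)` rows are then easy to insert into any external database.

```python
import json

def flatten_telemetry(payload):
    """Flatten a ThingsBoard timeseries response into (key, ts, value) rows.

    Assumes the documented shape {"key": [{"ts": ..., "value": ...}, ...]};
    check this against your ThingsBoard version.
    """
    rows = []
    for key, points in payload.items():
        for p in points:
            rows.append((key, p["ts"], p["value"]))
    return rows

# Sample payload as it might come back from
# GET /api/plugins/telemetry/DEVICE/<deviceId>/values/timeseries
sample = json.loads('{"temperature": [{"ts": 1700000000000, "value": "21.5"}],'
                    ' "humidity": [{"ts": 1700000000000, "value": "40"}]}')
rows = flatten_telemetry(sample)
# `rows` can now be bulk-inserted into the target database, e.g. with executemany().
```

In a real script you would fetch the payload with an authenticated HTTP request (the REST API uses a JWT token) and run this on a schedule.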
Alternatively, you can push the data to a message queue like Kafka instead of fetching it via the HTTP API, but you would still need additional tooling to store the data in external databases outside ThingsBoard.

Fill CoreData with a large SQL database

I have a large 180k-row MySQL database that I want to use in Core Data. Can I create the SQLite database using Xcode, then use an SQLite client app to connect to that database and fill it with my MySQL data?
Or is there a better way to efficiently import a large data set to a CoreData store?
It will only be filled once and the data should reside on-device.
The reason I want to do this is because I am building an iOS app that needs to read from a persistent store containing most words in the English language. Along with the word, each row will contain a few other things. The app will never need to write to the database, just read from it, but it will need to read from it very quickly.
From Apple's docs it appears this is not recommended (or maybe impossible): "do not manipulate an existing Core Data-created SQLite store using the native SQLite API"
Update:
Another option that I am currently working on is to export the MySQL database to JSON using phpMyAdmin (or another tool), then load that JSON file into the project. When the app launches (hopefully just the first time it is used), push the data from the JSON file into Core Data.
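The export half of that update can be sketched in a few lines. Here sqlite3 stands in for MySQL so the example is self-contained; in practice you would connect with a MySQL driver or just use phpMyAdmin's JSON export. The `words` table and its columns are hypothetical.

```python
import json
import sqlite3

def table_to_json(conn, table):
    """Dump every row of `table` as a list of column-name -> value dicts."""
    conn.row_factory = sqlite3.Row
    cur = conn.execute(f"SELECT * FROM {table}")
    return [dict(row) for row in cur.fetchall()]

# Stand-in for the MySQL word list; swap sqlite3 for a MySQL driver in practice.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE words (word TEXT, frequency INTEGER)")
conn.executemany("INSERT INTO words VALUES (?, ?)",
                 [("apple", 120), ("zebra", 3)])
data = table_to_json(conn, "words")
json_text = json.dumps(data)  # ship this file with the app, import on first run
```

For 180k rows, importing on first launch should still be fast if you batch the Core Data saves rather than saving per object.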
You could reverse-engineer Core Data and produce a Core Data SQLite file directly if you really wanted to, but as you quoted from Apple's docs, this is not a good idea.
It would be easier to simply write a small macOS command-line tool that includes the same Core Data data model as your iOS app. The tool would read your MySQL database and write it to a Core Data SQLite file, which you would then ship with your iOS app.

Sending .csv files to a database: MariaDB

I will preface this by saying I am very new to databases. I am working on a project for my undergraduate research that requires various sensor data to be sent from a Raspberry Pi via the internet to a database. I am using MariaDB at the moment, but am open to other options.
The background: Currently all sensor data is being saved in CSV files on the RPi. There will be automation to send data to the database at given intervals.
The question: Can I load the file itself into a database? For our application, a CSV file is the most logical data storage format, and we simply want the database to be a way for us to retrieve data remotely, since the system will be installed miles away from where we work.
I have read about LOAD DATA INFILE on this site, but am unsure how it applies to this database. Would JSON be applicable here at all? I am willing to learn if it makes the process more streamlined.
Thank you!
If "sending data to the database" means that, by one means or another, additional or replacement CSV files are saved on disk in a location accessible to a MariaDB client program, then you can load them into the database using the "mysql" command-line client and an appropriate script of SQL commands. That script will very likely use the LOAD DATA LOCAL INFILE command.
The "mysql" program may be launched in a variety of ways: 1) spawned by the process that receives the uploaded file; 2) launched by a cron job (Task Scheduler on Windows) that runs periodically to check for new or changed CSV files; or 3) launched by a daemon that continually monitors the disk for new or changed CSV files.
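To make the load step concrete, here is a minimal sketch. The LOAD DATA LOCAL INFILE statement in the comment is what you would run against MariaDB; the code below does the portable equivalent with the csv module and parameterized inserts, with sqlite3 standing in for a MariaDB connector so the sketch runs anywhere. File, table, and column names are illustrative.

```python
import csv
import os
import sqlite3
import tempfile

# Against MariaDB the whole load can be a single statement, e.g.:
#   LOAD DATA LOCAL INFILE '/data/sensors.csv'
#   INTO TABLE readings
#   FIELDS TERMINATED BY ',' IGNORE 1 LINES;
# The function below parses the CSV in Python instead and bulk-inserts it.

def load_csv(conn, path, table):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # skip the header row, like IGNORE 1 LINES
        placeholders = ",".join("?" * len(header))
        conn.executemany(f"INSERT INTO {table} VALUES ({placeholders})", reader)
    conn.commit()

# Demo with a temporary CSV shaped like an RPi sensor log.
fd, path = tempfile.mkstemp(suffix=".csv")
with os.fdopen(fd, "w", newline="") as f:
    f.write("ts,temp_c\n2024-05-01T00:00,18.2\n2024-05-01T00:05,18.4\n")
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts TEXT, temp_c REAL)")
load_csv(conn, path, "readings")
os.remove(path)
```

Whichever launch mechanism you pick from the list above, the loader itself stays the same; only the trigger changes.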
A CSV is typically human-readable. I would work with that first before worrying about JSON. Unless the CSVs are huge, you can probably open them in a simple text editor to get an idea of what the data looks like.
I'm not sure of your environment (feel free to elaborate), but you could just use whatever web services you have to read in the CSV directly and inject the data into your database.
You say that data is being sent using automation. How does it communicate with your web service?
What is your web service? (Is it php?)
Where is the database being hosted? (Is it in the same webservice?)

Grails with CSV (No DB)

I have been building a Grails application for quite a while with dummy data using a MySQL server; it was eventually supposed to be connected to Greenplum DB (a PostgreSQL cluster).
But this is not feasible anymore due to firewall issues.
We were contemplating connecting Grails to a CSV file on a shared drive (which is constantly updated by the Greenplum DB; data is appended hourly only).
These CSV files are fairly large (3 MB, 30 MB, and 60 MB); the last file has 550,000+ rows.
Quick questions:
Is this even feasible? Can a CSV be treated as a database, and can Grails directly access this CSV file and run queries on it, similar to a DB?
Assuming this is feasible, how much rework will be required in the Grails code in the DataSource, controller, and index? (Currently we are connected to MySQL and filter data in the controller and index using SQL queries and AJAX calls via remoteFunction.)
Will the constant reading (CSV -> Grails) and writing (Greenplum -> CSV) corrupt the CSV file or cause any other problems?
I know this is not a very robust method, but I really need to understand the feasibility of this idea. Can Grails function without any DB, with merely a CSV file on a shared drive accessible to multiple users?
The short answer is, No. This won't be a good solution.
No.
It would be nearly impossible, if possible at all, to rework this.
Concurrent access to a file like that in any environment is a recipe for disaster.
Grails is not suitable for a solution like this.
update:
Have you considered using the built-in H2 database, which can be packaged with the Grails application itself? This way you can distribute the database engine along with your Grails application within the WAR. You could even have it populate its database from the CSV you mention the first time it runs, or periodically, depending on your requirements.

What is the best practice for retrieving and downloading mass data from SQL Server using Flex/Actionscript

I have a Flex web-based application that retrieves data from SQL Server and displays it in a data grid. I'm using FileReference to export the data from the data grid into a CSV file, which is fine for small amounts of data. I need another method to retrieve and download massive data directly, without displaying it in the data grid. I was thinking there must be a way to export the data on the server using SQL Server (for example with OPENROWSET) and then download the exported file, but I couldn't implement it so far. I'm also not sure it is the best approach for such a task. I'd appreciate any help finding a solution.
I'm using IIS 7 as the web server and the Adobe Flex Connector for MS SQL Server as the web service to connect to SQL Server.
You can use FileReference.download(URLRequest) to download the file directly from the remote server. The file itself should be created on the server side. For example, if you use ASP.NET, you can create a service (WCF, Fluorine, etc.) that fetches data from the DB, writes it into a generated file, and returns the file's URL to the Flex client, which can then use FileReference.download.
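The server-side half of that answer is just "query, stream to a file, hand back a path the web server can serve". A language-neutral sketch (Python here, with sqlite3 standing in for SQL Server; the `orders` table and output path are illustrative):

```python
import csv
import os
import sqlite3
import tempfile

def export_to_csv(conn, query, out_path):
    """Run `query` and stream the result set into a CSV file on disk.

    The returned path would sit under a folder served by IIS, so the Flex
    client can call FileReference.download() on its URL.
    """
    cur = conn.execute(query)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([d[0] for d in cur.description])  # header row
        writer.writerows(cur)  # stream rows without buffering them all
    return out_path

# sqlite3 stands in for SQL Server to keep the sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])
out = os.path.join(tempfile.gettempdir(), "orders_export.csv")
export_to_csv(conn, "SELECT * FROM orders", out)
```

Because the rows are written as the cursor is consumed, the export never holds the full result set in memory, which is the point of skipping the data grid for massive downloads.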