Fill Core Data with a large SQL database - MySQL

I have a large 180k-row MySQL database that I want to use in Core Data. Can I create the SQLite database using Xcode, then use an SQLite client app to connect to that database and fill it with my MySQL data?
Or is there a better way to efficiently import a large data set to a CoreData store?
It will only be filled once and the data should reside on-device.
The reason I want to do this is that I am building an iOS app that needs to read from a persistent store containing most words in the English language. Along with the word, each row will contain a few other things. The app will never need to write to the database, just read from it, but it will need to read from it very quickly.
From Apple's docs it appears this is not recommended (or maybe impossible): "do not manipulate an existing Core Data-created SQLite store using the native SQLite API"
Update:
Another option that I am currently working on is to export the MySQL database to JSON using phpMyAdmin (or another tool), then load that JSON file into the project. When the app launches (hopefully just the first time it is used), push the data from the JSON file into Core Data.
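If you go the JSON route, a batch insert keeps the one-time import fast and memory-friendly. Below is a minimal sketch, assuming a bundled words.json of the form [{"word": ..., "definition": ...}], a Core Data entity named "Word" whose attribute names match the JSON keys, and a "didImportWords" flag; all of these names are hypothetical:

```swift
import CoreData
import Foundation

func importWordsIfNeeded(into container: NSPersistentContainer) throws {
    let defaults = UserDefaults.standard
    guard !defaults.bool(forKey: "didImportWords") else { return }  // run once

    guard let url = Bundle.main.url(forResource: "words", withExtension: "json") else { return }
    let data = try Data(contentsOf: url)
    // Decode to plain dictionaries so the rows can feed a batch insert directly.
    guard let rows = try JSONSerialization.jsonObject(with: data) as? [[String: Any]] else { return }

    let context = container.newBackgroundContext()
    try context.performAndWait {
        // NSBatchInsertRequest writes straight to the store, avoiding the
        // memory overhead of instantiating 180k managed objects.
        let request = NSBatchInsertRequest(entityName: "Word", objects: rows)
        try context.execute(request)
    }
    defaults.set(true, forKey: "didImportWords")
}
```

NSBatchInsertRequest bypasses full object materialization, which matters at 180k rows; the trade-off is that it skips validation rules and in-memory callbacks, which is fine for a trusted, bundled data set.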

You could reverse-engineer Core Data and produce a Core Data SQLite file directly if you really wanted to, but as the Apple docs you quoted point out, this is not a good idea.
It would be easier to simply write a little macOS command-line tool which includes the same Core Data data model as your iOS app. This tool would read your MySQL database and write it to a Core Data SQLite file, which you would then ship with your iOS app.
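A minimal sketch of such a tool, assuming the MySQL table has first been dumped to a tab-separated file (e.g. via SELECT ... INTO OUTFILE, since a MySQL driver would be an extra dependency) and that the tool is built against the same compiled "Words" data model; the model, file, entity, and attribute names are all hypothetical:

```swift
import CoreData
import Foundation

// Load the same compiled data model (.momd) the iOS app uses.
let modelURL = URL(fileURLWithPath: "Words.momd")
guard let model = NSManagedObjectModel(contentsOf: modelURL) else {
    fatalError("Could not load data model")
}

let container = NSPersistentContainer(name: "Words", managedObjectModel: model)
let storeURL = URL(fileURLWithPath: "Words.sqlite")  // the file you ship with the app
container.persistentStoreDescriptions = [NSPersistentStoreDescription(url: storeURL)]
container.loadPersistentStores { _, error in
    if let error = error { fatalError("Store failed to load: \(error)") }
}

let context = container.newBackgroundContext()
let rows = try String(contentsOfFile: "words.tsv", encoding: .utf8)
    .split(separator: "\n")

for (index, row) in rows.enumerated() {
    let fields = row.split(separator: "\t", omittingEmptySubsequences: false)
    guard fields.count >= 2 else { continue }  // skip malformed lines
    let word = NSEntityDescription.insertNewObject(forEntityName: "Word", into: context)
    word.setValue(String(fields[0]), forKey: "term")
    word.setValue(String(fields[1]), forKey: "definition")
    // Save and reset periodically so 180k rows don't pile up in memory.
    if index % 5_000 == 4_999 {
        try context.save()
        context.reset()
    }
}
try context.save()
```

Note that Core Data uses WAL journaling by default, so either ship the -wal and -shm files alongside the .sqlite file, or set the journal_mode pragma to DELETE (via NSSQLitePragmasOption) before the final save so everything lands in a single file.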

Related

Backup Core Data, one entity only

My application requires some form of data backup and data exchange between users, so what I want to achieve is the ability to export a single entity rather than the entire database.
I have found some help, but only for the full database, like this post:
Backup core data locally, and restore from backup - Swift
That applies to the entire database.
I tried to export a JSON file; this might work, except that the entity I'm trying to export contains images as binary data.
So I'm stuck.
Any help with exporting just one entity rather than the full database, or with writing JSON that includes binary data, would be appreciated.
Take a look at protobuf. Apple has an official Swift library for it:
https://github.com/apple/swift-protobuf
Protobuf is an alternative encoding to JSON that has direct support for serializing binary data. There are client libraries for any language you might need to read the data in, and command-line tools if you want to examine the files manually.
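As a rough sketch of how that looks in Swift: assuming a hypothetical .proto with a generated Item/Export message pair (SwiftProtobuf generates these structs for you via the protoc plugin), an entity's rows can be packed, written, and read back like this:

```swift
import Foundation
import SwiftProtobuf  // https://github.com/apple/swift-protobuf

// Generated by protoc from a .proto such as:
//   syntax = "proto3";
//   message Item   { string title = 1; bytes image = 2; }
//   message Export { repeated Item items = 1; }
// `bytes` carries binary data natively, so no Base64 detour is needed.

func writeExport(titles: [String], images: [Data], to url: URL) throws {
    var export = Export()
    export.items = zip(titles, images).map { title, image in
        var item = Item()
        item.title = title
        item.image = image  // raw image bytes straight from the Core Data attribute
        return item
    }
    // serializedData() produces the compact binary wire format.
    try export.serializedData().write(to: url)
}

func readExport(from url: URL) throws -> Export {
    try Export(serializedData: Data(contentsOf: url))
}
```

You can then hand the resulting file to another user and merge it into their store, decoding with the same generated types.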

Grails with CSV (No DB)

I have been building a Grails application for quite a while with dummy data using a MySQL server; it was eventually supposed to be connected to a Greenplum DB (PostgreSQL cluster).
But this is not feasible anymore due to firewall issues.
We are contemplating connecting Grails to a CSV file on a shared drive (which is constantly updated by the Greenplum DB; data is appended hourly only).
These CSV files are fairly large (3 MB, 30 MB, and 60 MB); the last file has 550,000+ rows.
Quick questions:
1. Is this even feasible? Can a CSV be treated as a database, and can Grails directly access this CSV file and run queries on it, similar to a DB?
2. Assuming this is feasible, how much rework will be required in the Grails code in the datasource, controller, and index? (Currently we are connected to MySQL and filter data in the controller and index using SQL queries and AJAX calls via remoteFunction.)
3. Will the constant reading (CSV -> Grails) and writing (Greenplum -> CSV) corrupt the CSV file or cause other problems?
I know this is not a very robust method, but I really need to understand the feasibility of this idea. Can Grails function without any DB, with merely a CSV file on a shared drive accessible to multiple users?
The short answer is: no, this won't be a good solution.
1. No. Grails cannot treat a CSV file on a shared drive as a queryable database.
2. It would be nearly impossible, if possible at all, to rework this.
3. Concurrent access to a file like that, in any environment, is a recipe for disaster.
Grails is not suitable for a solution like this.
Update:
Have you considered using the built-in H2 database, which can be packaged with the Grails application itself? This way you can distribute the database engine along with your Grails application within the WAR. You could even have it populate its database from the CSV you mention the first time it runs, or periodically, depending on your requirements.

Sync SQLite / Core Data with MySQL Database

I want to sync data between a MySQL web server and a mobile Core Data database on the iPhone. On my last project I wrote PHP files that created XML files from the contents of the MySQL data, and the iOS project parsed the XML files to sync the data. To transmit data from the iPhone to the MySQL server, I wrote a second PHP file that connected to the MySQL database and executed the statement.
Do you think this is a good way to sync data between the iOS application and the MySQL web server? Do you have any ideas for making it better?
This seems to be a good plan. If you use the plist format or JSON, you can convert the data even more easily (without NSXMLParser) into your custom classes and insert them into the Core Data store.
Make sure to devise a scheme where you only have to send/receive incremental changes.
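As a rough sketch of what incremental sync can look like on the iOS side, assuming a hypothetical changes.php endpoint that returns JSON rows modified since a given timestamp, and a Core Data entity "Word" with "term" and "updatedAt" attributes (all names invented for illustration):

```swift
import CoreData
import Foundation

// Shape of one changed row as the hypothetical endpoint returns it.
struct RemoteWord: Decodable {
    let term: String
    let updatedAt: Date
}

func pullChanges(into context: NSManagedObjectContext) async throws {
    let lastSync = UserDefaults.standard.object(forKey: "lastSync") as? Date ?? .distantPast

    // Ask the server only for rows modified since the last successful sync.
    var components = URLComponents(string: "https://example.com/changes.php")!  // hypothetical endpoint
    components.queryItems = [.init(name: "since",
                                   value: ISO8601DateFormatter().string(from: lastSync))]
    let (data, _) = try await URLSession.shared.data(from: components.url!)

    let decoder = JSONDecoder()
    decoder.dateDecodingStrategy = .iso8601
    let rows = try decoder.decode([RemoteWord].self, from: data)

    try context.performAndWait {
        for row in rows {
            // Upsert: update the existing record if present, otherwise insert.
            let request = NSFetchRequest<NSManagedObject>(entityName: "Word")
            request.predicate = NSPredicate(format: "term == %@", row.term)
            request.fetchLimit = 1
            let object = try context.fetch(request).first
                ?? NSEntityDescription.insertNewObject(forEntityName: "Word", into: context)
            object.setValue(row.term, forKey: "term")
            object.setValue(row.updatedAt, forKey: "updatedAt")
        }
        try context.save()
    }
    UserDefaults.standard.set(Date(), forKey: "lastSync")
}
```

The same idea works in reverse: keep a local dirty flag or change timestamp on each record and send only those rows back to the PHP endpoint.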

Sencha local MySQL server [duplicate]

I'm about to port an Android travel-log app to other platforms using Sencha Touch.
The problem is that Sencha only has a store system for persisting data, and doesn't appear to offer a way to actually run MySQL queries.
And since most of the MySQL code from my previous app already exists, it would be quite a pain to redo everything with Sencha's system.
Is there a way to use MySQL (or any other SQL) queries with Sencha to store data on the phone?
Sencha stores and proxies abstract away the need to write raw query code. A store can use one of a number of different proxies for interfacing with different back-end data stores, one of which is the SQL proxy, which, as you can see in the source code, provides an API for basic querying of WebSQL databases.
If you want to gain the full benefit of the framework and do things the "Sencha way", you'll probably want to start from scratch and architect your app around the store API.

NetSuite Migrations

Has anyone had much experience with data migration into and out of NetSuite? I have to export DB2 tables into MySQL, manipulate the data, and then export it as a CSV file. Then I take a CSV file of accounts and manipulate the data again so accounts from our old system match up with the new one. Has anyone tried to do this in MySQL?
A couple of options:
Invest in a data transformation tool that connects to NetSuite and DB2 or MySQL. Look at Dell Boomi, IBM Cast Iron, etc. These tools let you connect to both systems, define the data to be extracted, perform data transformation functions and mappings, and do all the inserts/updates you need.
For MySQL to NetSuite, PHP scripts can be written to access MySQL and NetSuite. On the NetSuite side, you can either use SOAP web services or write custom REST APIs within NetSuite. SOAP is probably a bit slower than REST, but with REST you have to write the API yourself (server-side JavaScript; it's not hard, but there's a learning curve).
Hope this helps.
I'm an IBM i programmer; try CPYTOIMPF to create a fairly generic CSV file. It will go to a stream file; if you have NetServer running you can map a network drive to the IFS directory, or you can use FTP to get the CSV file from the IFS to another machine on your network.
Try Adeptia's NetSuite integration tool to perform ETL. You can also try Pentaho ETL for this (as far as I know, Celigo's NetSuite connector is built on Pentaho). Jitterbit also has an extension for NetSuite.
We primarily have two options for pumping data into NetSuite:
i) SuiteTalk: SOAP-based web services. There are two versions of SuiteTalk, synchronous and asynchronous. Typical tools like Boomi/Mule/Jitterbit use synchronous SuiteTalk to pump data into NetSuite; they also have decent editors to help you do the mapping.
ii) RESTlets: REST-based endpoints exposed by NetSuite can also be used, but you may have to write external brokers to communicate with them.
Depending on your needs you can use either, but in most cases you will be using SuiteTalk to bring data into NetSuite.
Hope this helps.
We just got done doing this. We used an iPaaS platform called Jitterbit (similar to Dell Boomi). It can connect to MySQL and to NetSuite, and you can do transformations in the tool. I have been really impressed with the platform overall so far.
There are different approaches; I like the following for processing a batch job:
To import data into NetSuite:
1. Export a CSV from the old system and place it in a NetSuite File Cabinet folder (use a RESTlet or web services for this).
2. Run a scheduled script to load the files in the folder and update the records.
3. Don't forget to handle errors. Ways to handle errors: send an email, create a custom record, log to a file, or write to the record.
4. Once a file has been processed, move it to another folder or delete it.
To export data out of NetSuite:
1. Gather the data and export it to a CSV (you can use a saved search or similar).
2. Place the CSV in a File Cabinet folder.
3. From an external server, call web services or a RESTlet to grab new CSV files in the folder.
4. Process each file.
5. Handle errors.
6. Call web services or a RESTlet to move or delete the CSV file.
You can also use Pentaho Data Integration; it's free and the learning curve is not that difficult. I took this course and was able to play around with the tool within a couple of hours.