CasperJS and MySQL

I want to save data to a MySQL DB that was retrieved while using casperJS.
I have not been able to find any way to do this directly.
Is there a way to DIRECTLY connect the two?
Will node-mysql work from within Casper?

No, there is no way to do it directly.
You will need to do it indirectly. Remember that CasperJS is built on top of PhantomJS, which has a different execution environment than Node.js; very few Node.js modules actually work inside PhantomJS/CasperJS without changes. You will have to write an external script (e.g. a Node.js script) that can read a file and write to MySQL:
1. The CasperJS script scrapes the data and stores it in a (temporary) file (see PhantomJS' fs module),
2. call the external script with the scraped data file (see PhantomJS' child_process module), and
3. if necessary, delete the temporary data file, either in CasperJS (see PhantomJS' fs module) or in the external script.

Related

Nuxt Content and JSON file automatically retrieved from API

I'm integrating nuxtjs/content (https://content.nuxtjs.org/writing) for my content, but I would like the JSON files to be generated from responses from my API.
How can I create a command to retrieve the content, maybe run through cron, and save it in the content/ folder?
You could indeed, depending on your hosting solution, have something running every midnight and rebuilding your app, where you run a Node.js script that creates files in the given directories before they are handled by nuxt/content.
An example of code can be found here: https://stackoverflow.com/a/67689890/8816585

import csv file into Google CloudSQL using nodejs Cloud functions

Besides streaming a csv file yourself and painstakingly executing inserts for each line of data, is it possible to use the google cloud sdk to import an entire csv file in bulk, from inside a cloud function. I know in gcp console you can go to the import tab, select a file from storage and just import. But how can I emulate this programmatically?
In general, one has to parse the .csv and generate SQL from it; one line in the .csv would be represented by one INSERT statement. In the first place, you would have to upload the file, or pull it into Cloud Functions' temporary storage, e.g. with gcs.bucket.file(filePath).download.
Then the easiest might be to utilize a library, e.g. csv2sql-lite, with the big downside that one does not have full control over the import, while e.g. csv-parse would provide a little more control (e.g. checking for possible duplicates, skipping some columns, importing to different tables, and so on).
... and in order to connect to Cloud SQL, see Connecting from Cloud Functions.
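The line-to-INSERT idea can be sketched without a library, assuming a header row and no quoted fields (a real import should use csv-parse or similar, and parameterized queries rather than string building); the table name is whatever your schema uses:

```javascript
// Sketch: turn a simple CSV (header row, no quoted fields) into one INSERT
// statement per data line.
function csvToInserts(csvText, table) {
  const lines = csvText.trim().split('\n');
  const columns = lines[0].split(',');
  return lines.slice(1).map(function (line) {
    const values = line.split(',').map(function (v) {
      return "'" + v.replace(/'/g, "''") + "'"; // naive escaping -- prefer parameterized queries
    });
    return 'INSERT INTO ' + table + ' (' + columns.join(', ') +
           ') VALUES (' + values.join(', ') + ');';
  });
}
```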

synchronizing json files on device with a database

I want to synchronize data from a nosql database, have it emit json, and when there is a change and the app is online, have the app pull the changes and update them.
My react-native app uses a few language-specific JSON files, loading them dynamically into JavaScript objects and then using them as my "database": one file for rendering content, one for the app texts, and one for the styles.
When the user changes the language, a different directory is used with the same set of JSON filenames. This is done via the index.js in that directory and loaded into redux.
Must I change the way my app works and use a NoSQL/real database directly in the app, so that I can use a solution like this one: synchronizing pouchDB with json data?
(That solution, if I understand correctly, works in the exact opposite direction: the app works with a database and synchronizes with JSON received from the web.)
Or can I update the data in an external (preferably) NoSQL (pouchDB or couchDB) or relational database, and somehow "synchronize" it with the JSON files when the app is connected to the web, and have it update?
Is there a known method or framework to accomplish this with react-native?

Ansible to give MYSQL table output

I am trying to execute an Ansible one-liner which calls a bash script on a remote server and then processes the result on the local machine. The bash script fetches data from a database.
Is it possible for Ansible to give a table-formatted output?
I am just pasting the column headers alone.
author_name scheduled_start_time scheduled_end_time comment_data name
If you want to parse Ansible output, there are only two ways, both of which are hard and somewhat hacky. One is to use callback plugins, the other is to parse with sed/awk/perl/python/whatever you like. See Ansible output formatting options for reference.
I think there is a cleaner solution: you can execute your script on the remote machine, save its output in a file on the remote machine, and then retrieve it locally using the fetch module. After that you can process the resulting file locally using a local action.
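That flow can be sketched as a small playbook rather than a one-liner; the host name, script path, and file names here are assumptions:

```yaml
# Sketch: run the script remotely, bring its output home, then format locally.
- hosts: dbserver
  tasks:
    - name: Run the reporting script on the remote host
      shell: /opt/scripts/report.sh > /tmp/report.out

    - name: Copy the output back to the control machine
      fetch:
        src: /tmp/report.out
        dest: reports/
        flat: yes

    - name: Format the result as a table locally
      local_action: command column -t reports/report.out
```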

Is it possible to save data to a json file on local disk using $resource.save without using any server side implementation?

I am trying to build an Employee Management App in AngularJS where a basic operation is adding an employee.
I am displaying details using a service and getting JSON data from a mock JSON file.
Similarly, can I add the form data to a text file on the hard disk?
I have seen it done in a tutorial using $resource.save.
If it is possible without any server-side code, please share an example; it would be helpful.
You can make use of HTML5 local browser storage, as this does not require folder access. More details here: http://diveintohtml5.info/storage.html
AngularJS has modules for local storage which you can use to access such storage, e.g. https://github.com/grevory/angular-local-storage
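A minimal sketch of that approach, with the storage backend passed in explicitly (in the browser this would be `window.localStorage`); the key name and employee shape are assumptions:

```javascript
// Persist employees in browser localStorage instead of a file on disk
// (browsers cannot write arbitrary local files). `store` is any object with
// getItem/setItem -- in the browser, pass window.localStorage.
var STORAGE_KEY = 'employees';

function loadEmployees(store) {
  var raw = store.getItem(STORAGE_KEY);
  return raw ? JSON.parse(raw) : [];
}

function addEmployee(store, employee) {
  var employees = loadEmployees(store);
  employees.push(employee);
  store.setItem(STORAGE_KEY, JSON.stringify(employees));
  return employees;
}
```

In an Angular controller, the form handler would then call something like `addEmployee(window.localStorage, $scope.employee)` instead of `$resource.save`.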