Updating JSON files in an Ionic app

I am creating an Ionic app, and there are a few JSON files in the www/dataFiles folder. I need to package these files with the app; however, once it is installed on a device, these JSON files need to be updated periodically with new data from a server. I spent days searching for a way to do this and found out that files in the www folder cannot be rewritten. I have no clue how to move forward. Do I have to save the files in a different location? I would really appreciate it if you could point me in the right direction.
Thank you in advance.

You cannot modify the packaged files of an already installed app.
Why not pull this data from a web service instead? Rather than storing the data in separate .json files, request the same JSON data from the server, and you will always have fresh data.
What are you using as a server/database? With that information I can point you in the right direction.
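If it helps, here is a minimal TypeScript sketch of that pattern. The endpoint URL, cache key, and bundled file path are placeholders for whatever your server and app actually use: fetch fresh data when online, and fall back to a cached or packaged copy otherwise.

// Minimal sketch: fetch fresh JSON from a (placeholder) API endpoint and
// cache it, falling back to the copy bundled in www/ when offline.
const DATA_URL = 'https://example.com/api/data.json'; // placeholder endpoint
const CACHE_KEY = 'latestData';

async function loadData(): Promise<unknown> {
  try {
    const res = await fetch(DATA_URL);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = await res.json();
    // Cache the fresh copy so the app still works offline next time.
    localStorage.setItem(CACHE_KEY, JSON.stringify(data));
    return data;
  } catch {
    // Offline or server error: use the cached copy, or the packaged file.
    const cached = localStorage.getItem(CACHE_KEY);
    if (cached) return JSON.parse(cached);
    const bundled = await fetch('dataFiles/initial.json'); // read-only www/ copy
    return bundled.json();
  }
}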

Related

Nuxt Content and JSON file automatically retrieved from API

I'm integrating nuxtjs/content (https://content.nuxtjs.org/writing) for my content, but I would like the JSON files to be generated from responses from my API.
How can I create a command, maybe run through cron, to retrieve the content and save it in the content/ folder?
You could indeed, depending on your hosting solution, have something running every midnight that rebuilds your app, where you run a Node.js script to create files in the given directories before they are handled by nuxt/content.
An example of code can be found here: https://stackoverflow.com/a/67689890/8816585
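As a rough sketch of such a script (assuming Node 18+ for the built-in fetch, and a placeholder API endpoint), run from cron or a scheduled CI job right before nuxt generate:

// Pre-build script: pull JSON from a placeholder API and write one file per
// article into content/ so nuxt/content can pick them up at build time.
import { mkdir, writeFile } from 'node:fs/promises';

const API_URL = 'https://api.example.com/articles'; // placeholder endpoint

async function syncContent(): Promise<void> {
  const res = await fetch(API_URL);
  if (!res.ok) throw new Error(`API request failed: ${res.status}`);
  const articles: Array<{ slug: string }> = await res.json();

  await mkdir('content/articles', { recursive: true });
  await Promise.all(
    articles.map((article) =>
      writeFile(
        `content/articles/${article.slug}.json`,
        JSON.stringify(article, null, 2),
      ),
    ),
  );
}

syncContent().catch((err) => {
  console.error(err);
  process.exit(1);
});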

"Automate JSON Files upload to Blob Storage"

Description:
We have an SSIS job that generates JSON files with data at a server path. We are manually copying the JSON files and dropping them into Blob Storage in order to trigger our Logic App.
Could anyone provide information on how we can automate the process of copying the JSON files to Blob Storage? (For example, is there an approach or code to pick up the JSON files at a specific time and copy them into Blob Storage?)
The solution is to listen for file system changes at your server path, then use the Azure Storage SDK to upload the files, triggered by the file-changed event.
For reference, here are some APIs and SO threads about file-change listeners in different languages, since I don't know which language you want to use.
C# FileSystemWatcher Class
Python How do I watch a file for changes?
Node.js Observe file changes with node.js
For other languages, I think you can easily find a solution by searching. And to upload files to Azure Storage, you just need to refer to the official Azure getting-started tutorials in different languages to write your code.
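Since I can't know your stack, here is one hedged sketch in Node.js/TypeScript, combining fs.watch with the @azure/storage-blob package. The watched folder, container name, and connection-string variable are placeholders, and note that fs.watch can fire more than once per file, so production code would want some debouncing:

// Watch the SSIS output folder and upload any new .json file to Blob Storage,
// which in turn triggers the Logic App.
import { watch } from 'node:fs';
import { join } from 'node:path';
import { BlobServiceClient } from '@azure/storage-blob';

const WATCH_DIR = 'C:\\ssis\\output'; // placeholder: folder the SSIS job writes to
const CONTAINER = 'json-drop';        // placeholder: container the Logic App watches

const service = BlobServiceClient.fromConnectionString(
  process.env.AZURE_STORAGE_CONNECTION_STRING!,
);
const container = service.getContainerClient(CONTAINER);

watch(WATCH_DIR, async (event, filename) => {
  // 'rename' fires when a file appears in the directory.
  if (event !== 'rename' || typeof filename !== 'string') return;
  if (!filename.endsWith('.json')) return;
  try {
    const blob = container.getBlockBlobClient(filename);
    await blob.uploadFile(join(WATCH_DIR, filename));
    console.log(`Uploaded ${filename}`);
  } catch (err) {
    console.error(`Upload of ${filename} failed`, err);
  }
});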

Load and save a JSON file in React, client-side only

I am building a simple editor-type application in react-redux, and I want to mimic the operation of downloading and uploading JSON files for saving and loading data, entirely client-side. The server does not need the data. Local storage may be too small, and it would be nice to give the user the data in a portable file they could upload on a new machine. Is this even possible, and if so, how?
Use a Blob.
You can set the content of a new temporary, local file, then trigger a click event to download the file.
There are duplicate answers here and here.
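A minimal sketch of both directions, using only standard browser APIs; how you wire these into your components and redux store is up to you:

// "Save": wrap the editor state in a Blob and click a temporary link.
function downloadJson(state: unknown, filename = 'project.json'): void {
  const blob = new Blob([JSON.stringify(state, null, 2)], {
    type: 'application/json',
  });
  const url = URL.createObjectURL(blob);
  const a = document.createElement('a');
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url); // free the object URL once the click has fired
}

// "Load": read the file the user picked in an <input type="file"> element.
async function loadJson(input: HTMLInputElement): Promise<unknown> {
  const file = input.files?.[0];
  if (!file) throw new Error('No file selected');
  return JSON.parse(await file.text());
}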

Is it possible to retrieve a list of files in JSON format from a URL that lists the contents of a folder

I have an NFS location that is not managed by me, and its contents can be accessed by browsing to it, i.e. the server is serving up the folder as an HTML page,
something like https://ftp.mozilla.org/pub/firefox/releases/52.0/
Is it possible to get the list of files as a JSON-format response directly from the request? Without changing anything on the NFS server and without having to write code to parse the HTML?
For example, maybe I can send the request to the URL with different headers.
To clarify:
When you access the address with a browser, curl, or wget, you get an HTML page.
My motivation is that I don't want to mount the NFS location. I want to access the files by downloading them from the URL.
I don't know the type of server that is holding the shared folder.
Thanks.
In short, the answer is NO.
Not without tweaking the settings on the web server that is serving the folder contents.
Here are some examples of how to tweak Apache to serve a JSON-formatted file listing for the folder.
Apache directory listing as JSON using PHP
Apache directory listing as json
Apache External Module mod_jsonindex - May not be the recommended way
http://1h.com/opensource/mod_jsonindex.html
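If the server truly cannot be touched, the only fallback is, unfortunately, the parsing you wanted to avoid. A rough sketch assuming an Apache-style index page where each entry is an <a href> link (the regex is deliberately simple and may need adjusting for your server's HTML):

// Fetch the HTML directory index and scrape the entry links into an array,
// which can then be serialized as JSON.
async function listFiles(indexUrl: string): Promise<string[]> {
  const html = await (await fetch(indexUrl)).text();
  const files: string[] = [];
  // Skip hrefs starting with "?" (sort links) or "/" (parent directory).
  for (const match of html.matchAll(/<a href="([^"?\/][^"]*)"/g)) {
    files.push(match[1]);
  }
  return files;
}

// Usage:
// listFiles('https://ftp.mozilla.org/pub/firefox/releases/52.0/')
//   .then((names) => console.log(JSON.stringify(names, null, 2)));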

Is there any way to fill in SharePoint entries by parsing a text file?

My workplace has a whole bunch of unannotated .zip files that need to be uploaded to the new file server (Windows). I've used Perl to parse through the Excel files within the .zip files to create an annotation.txt file for each .zip file that contains information about it. I have thousands of zip files and do not want to manually enter information for each entry if there is a way to automate it. I am proficient in Perl and MySQL, and I am wondering if there is any way to use my skill set to port this information into the Microsoft SharePoint website.
Thank you in advance for any advice or suggestions.
There are many, many ways to meet your requirement.
You could write an event receiver to parse the files once uploaded and set metadata.
A better approach for your use case might be to write a .NET-based console application that references Microsoft.SharePoint.Client, then upload your files using the client-side object model (CSOM) and set the metadata during that process, as outlined here: Upload a document to a SharePoint list from Client Side Object Model
There are also REST and ASMX web services that you could call from a non-.NET runtime process.
Plenty of options, pick the one that fits your needs and skills best.
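For the REST route from a non-.NET process, here is a hedged sketch (the site URL, library name, metadata column, and list item type name are all placeholders, and a real call needs a valid access token; assumes Node 18+ for the built-in fetch):

// Upload a zip into a document library, then MERGE-update a metadata field
// on the resulting list item with the text from the annotation.txt file.
const SITE = 'https://contoso.sharepoint.com/sites/archive'; // placeholder
const HEADERS = {
  Authorization: `Bearer ${process.env.SP_TOKEN}`, // token acquisition not shown
  Accept: 'application/json;odata=verbose',
};

async function uploadWithMetadata(
  name: string,
  bytes: Buffer,
  annotation: string,
): Promise<void> {
  // 1. Upload the file into the "Shared Documents" library.
  await fetch(
    `${SITE}/_api/web/GetFolderByServerRelativeUrl('Shared Documents')` +
      `/Files/add(url='${name}',overwrite=true)`,
    { method: 'POST', headers: HEADERS, body: bytes },
  );

  // 2. Partially update (MERGE) the file's list item to set the annotation.
  await fetch(
    `${SITE}/_api/web/GetFileByServerRelativeUrl(` +
      `'/sites/archive/Shared Documents/${name}')/ListItemAllFields`,
    {
      method: 'POST',
      headers: {
        ...HEADERS,
        'Content-Type': 'application/json;odata=verbose',
        'X-HTTP-Method': 'MERGE',
        'IF-MATCH': '*',
      },
      body: JSON.stringify({
        __metadata: { type: 'SP.Data.Shared_x0020_DocumentsItem' }, // list-specific
        Description: annotation, // placeholder column name
      }),
    },
  );
}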