Nuxt Content and JSON files automatically retrieved from an API

I'm integrating nuxtjs/content (https://content.nuxtjs.org/writing) for my content, but I would like the JSON files to be generated from responses from my API.
How can I create a command, maybe run through cron, to retrieve the content and save it in the content/ folder?

You could indeed, depending on your hosting solution, have something run every midnight and rebuild your app, where you run a Node.js script that creates the files in the given directories before they are handled by nuxt/content.
An example of code can be found here: https://stackoverflow.com/a/67689890/8816585
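
A minimal sketch of such a script, assuming Node.js 18+ (for global fetch); the API endpoint, response shape, and the articles subfolder are placeholders, not something from the question:

// fetch-content.mjs: run via cron (or a scheduled CI job) before the Nuxt build.
import { mkdir, writeFile } from 'node:fs/promises';

const API_URL = 'https://api.example.com/articles'; // hypothetical endpoint

const res = await fetch(API_URL);
if (!res.ok) throw new Error(`API responded with ${res.status}`);
const articles = await res.json(); // assumed shape: [{ slug, title, body, ... }]

await mkdir('content/articles', { recursive: true });

// One JSON file per article, so nuxt/content can treat each one as a document.
for (const article of articles) {
  await writeFile(
    `content/articles/${article.slug}.json`,
    JSON.stringify(article, null, 2)
  );
}
console.log(`Wrote ${articles.length} files to content/articles/`);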


Opening certain files in certain folders in an HTML page

I am trying to learn a bit about web technologies, so I am trying to create a catalogue for my files.
The situation is the following:
I have a folder with N subfolders;
in each of these subfolders there is an image that always has the same name (i.e. image.jpg);
in each of these subfolders there is also a certain swg file that always has the same name (i.e. test.swg).
I would like to create an HTML file which reads all the subfolders and creates a preview using image.jpg; when one clicks on the preview, test.swg should be launched (not in the browser, if possible).
The HTML file should contain all these previews, like a catalogue.
How can I do this? Should I have a local web server running on my machine? Is it possible to do this with non-web technologies?
Thank you!
As far as I know, JavaScript and HTML don't have access to the filesystem, since they run in your browser, and iterating through the files shouldn't be possible because it would be a security breach.
If you're asking whether it's possible without a server: it should be, but it is going to use other technology. For example:
Using a command-line interface on a Linux or Windows based OS, you could write a shell script that iterates through the file and folder paths and builds the data from them. Note that a plain .json file cannot be loaded through a script tag, so have the script write a data.js file that assigns the data to a global variable (e.g. const data = [...]); the JavaScript loaded after it can then read that variable:
<script type="text/javascript" src="data.js"></script>
<script type="text/javascript" src="javascript.js"></script>
But do note that you will need to re-run the shell script periodically with something like a scheduler, or refresh it manually.
If you want to do it the normal way, you could use one of many different server-side languages, for example Node.js or PHP; I think both of them require only a little configuration.
You could post a follow-up question once you've decided which language you want to use.
Below are some references you can use to start working on reading the directories (a small Node.js sketch follows the links):
NodeJS
Node.js fs.readdir recursive directory search
Get all files recursively in directories NodejS
PHP
List all the files and folders in a Directory with PHP recursive function
How to recursively iterate through files in PHP?
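
A minimal sketch of the directory-reading step in Node.js; the catalogue root folder is an assumption, while image.jpg and test.swg are the names from the question:

// build-catalogue.mjs: list every subfolder and emit data.js for the page above.
import { readdir, writeFile } from 'node:fs/promises';
import path from 'node:path';

const ROOT = './catalogue'; // hypothetical folder holding the N subfolders

const entries = await readdir(ROOT, { withFileTypes: true });
const items = entries
  .filter((entry) => entry.isDirectory())
  .map((entry) => ({
    name: entry.name,
    image: path.join(ROOT, entry.name, 'image.jpg'),
    file: path.join(ROOT, entry.name, 'test.swg'),
  }));

// Write a JS file assigning a global, so a plain <script> tag can load it.
await writeFile('data.js', `const data = ${JSON.stringify(items, null, 2)};`);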
After reading the directories and files, you just need to pass the data to the "rendering" part and use some JavaScript to invoke the .swg file when the image is clicked.
But I'm not really sure whether the .swg file can be opened by the desktop app directly or not; you could do some research on it:
Open online file with desktop applications?

"Automate JSON Files upload to Blob Storage"

We have an SSIS job which generates JSON files with data at a server path. We are manually copying the JSON files and dropping them into Blob storage in order to trigger our logic app.
Could anyone provide information on how we can automate the process of copying the JSON files to Blob storage? (Is there an approach, or code, to pick up the JSON files at a specific time and copy them to Blob storage?)
One solution is to listen for file-system changes at your server path, then use the Azure Storage SDK to upload these files, triggered by the file-changed event.
For reference, here are some resources (API docs and SO threads) on file-change listeners in different languages, because I don't know what language you want to use:
C# FileSystemWatcher Class
Python How do I watch a file for changes?
Node.js Observe file changes with node.js
For other languages, I think you can easily find a solution by searching. And to upload files to Azure Storage, you just need to refer to Azure's official getting-started tutorials in the different languages to write your code.
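
To make the two halves concrete, here is a minimal Node.js sketch, assuming the @azure/storage-blob package, a connection string in an environment variable, and placeholder directory and container names:

// watch-and-upload.mjs: upload each new .json file as soon as it appears.
import { watch } from 'node:fs';
import path from 'node:path';
import { BlobServiceClient } from '@azure/storage-blob';

const WATCH_DIR = '/data/ssis-output'; // hypothetical SSIS output path

const service = BlobServiceClient.fromConnectionString(
  process.env.AZURE_STORAGE_CONNECTION_STRING
);
const container = service.getContainerClient('json-drop'); // hypothetical name

watch(WATCH_DIR, async (eventType, filename) => {
  if (!filename || !filename.endsWith('.json')) return;
  try {
    const blob = container.getBlockBlobClient(filename);
    await blob.uploadFile(path.join(WATCH_DIR, filename)); // triggers the logic app
    console.log(`Uploaded ${filename}`);
  } catch (err) {
    console.error(`Upload of ${filename} failed:`, err);
  }
});

Note that fs.watch can fire more than one event per file write, so in practice you may want to debounce the events or use a watcher library such as chokidar.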

Updating JSON files in an Ionic app

I am creating an Ionic app, and there are a few JSON files in the www/dataFiles folder. I need to package these files with the app; however, once it is installed on a device, these JSON files need to be updated periodically with new data from a server. I spent days searching and trying to figure out how to do this, and I found out that files in the www folder cannot be rewritten. I have no clue how to move forward. Do I have to save the files in a different location? I would really appreciate it if you could point me in the right direction.
Thank you in advance.
You cannot modify files in an already installed app.
Why not pull this data from a web service? Instead of storing the data in separate .json files, request that same JSON data from the server, and you will always have fresh data.
What are you using as a server/database? With this information I can put you on the right path.
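
A minimal sketch of that approach; the endpoint URL and storage key are placeholders, and localStorage is used as a simple offline cache:

// Fetch the JSON from a server instead of shipping it in www/.
const DATA_URL = 'https://example.com/api/dataFiles/items.json'; // hypothetical

async function loadData() {
  try {
    const res = await fetch(DATA_URL);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = await res.json();
    // Cache the latest copy so the app still works offline.
    localStorage.setItem('items', JSON.stringify(data));
    return data;
  } catch (err) {
    // Fall back to the last cached copy when the network is unavailable.
    const cached = localStorage.getItem('items');
    if (cached) return JSON.parse(cached);
    throw err;
  }
}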

Is there any way to fill in SharePoint entries by parsing a text file?

My workplace has a whole bunch of unannotated .zip files that need to be uploaded to the new file server (Windows). I've used Perl to parse the Excel files within the .zip files and create an annotation.txt file for each .zip file that contains information about it. I have thousands of zip files and do not want to manually enter the information for each entry if there's a way to automate it. I am proficient in Perl and MySQL, and I am wondering if there is any way to utilize my skill set to port this information into the Microsoft SharePoint website.
Thank you in advance for any advice or suggestions.
There are many, many ways to meet your requirement.
You could write an event receiver to parse the files once they are uploaded and set the metadata.
A better approach for your use case might be to write a .NET based console application that references Microsoft.SharePoint.Client, uploads your files using the client-side object model (CSOM), and sets the metadata during that process, as outlined here: Upload a document to a SharePoint list from Client Side Object Model
There are also REST and ASMX web services that you could call from a non-.NET runtime process.
Plenty of options, pick the one that fits your needs and skills best.
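
Since the asker is not on .NET, here is a hedged sketch of the REST route from Node.js (18+, for global fetch). The site URL, library path, and token are placeholders; authenticating to SharePoint (an OAuth app registration, or cookie-based auth) is a separate topic and not shown here.

// upload-zip.mjs: push one .zip into a document library via the REST API.
import { readFile } from 'node:fs/promises';

const SITE = 'https://contoso.sharepoint.com/sites/archive'; // hypothetical
const FOLDER = '/sites/archive/Shared Documents';            // hypothetical
const token = process.env.SP_ACCESS_TOKEN;                   // obtained elsewhere

async function uploadZip(zipPath, fileName) {
  const body = await readFile(zipPath);
  const res = await fetch(
    `${SITE}/_api/web/GetFolderByServerRelativeUrl('${FOLDER}')` +
      `/Files/add(url='${fileName}',overwrite=true)`,
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${token}`,
        Accept: 'application/json;odata=verbose',
      },
      body,
    }
  );
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  // Setting the parsed annotation fields is a second request: fetch the file's
  // ListItemAllFields endpoint and MERGE your metadata values into the item.
}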

How can I add file locations to a database after they are uploaded using a Perl CGI script?

I have a CGI program I have written using Perl. One of its functions is to upload pics to the server.
All of it is working well, including adding all kinds of info to a MySQL db. My question is: how can I get the uploaded pic files' locations and names added to the db?
I would rather not change the script to actually upload the pics into the db; I have heard horror stories about storing binary files in databases.
Since I am new to all of this, I am at a loss. I have tried doing research and web searches for 3 weeks now with no luck. Any suggestions or answers would be greatly appreciated. I would really hate to have to manually add all the locations/names to the db.
I am using a Perl CGI script, a MySQL db, and a Linux server, and the files are being uploaded to the server. I am NOT looking to add the actual files to the db, just their location(s).
It sounds like your method is already complete: you take the upload, turn it into a string, and toss it into MySQL, similar to reading a file in as a string. However, since CGI hands you a filehandle rather than a filename, you are wondering where that file actually is.
If you're using CGI.pm, then upload(), uploadInfo(), the upload's param, and its private temporary files will help you deal with the uploaded file sources. Where they are stashed after the remote client and the CGI are done is usually not permanent, and at a minimum it is volatile.
You've got a bunch of uploaded files that need to be added to the db? It should be trivial to dash off a one-off script to loop through all the files and insert the details into the DB (a sketch follows below). If they're all in one spot, then a simple opendir()/readdir() type loop would catch them all; otherwise you can make a list of file paths and loop over that.
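
A minimal sketch of that one-off loop, shown here in Node.js with the mysql2 package (the same logic is a few lines of Perl with opendir/readdir and DBI); the directory, table, and column names are all hypothetical:

// backfill.mjs: record the path of every existing upload in MySQL.
import { readdir } from 'node:fs/promises';
import path from 'node:path';
import mysql from 'mysql2/promise';

const UPLOAD_DIR = '/var/www/uploads'; // hypothetical upload directory

const db = await mysql.createConnection({
  host: 'localhost',
  user: 'app',
  password: process.env.DB_PASSWORD,
  database: 'site',
});

for (const name of await readdir(UPLOAD_DIR)) {
  await db.execute(
    'INSERT INTO pics (filename, location) VALUES (?, ?)',
    [name, path.join(UPLOAD_DIR, name)]
  );
}
await db.end();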
If you're talking about recording new uploads on the server, then it would be something along these lines (see the sketch after this list):
user uploads file to server
script extracts any wanted/needed info from the file (name, size, mime-type, checksums, etc...)
start database transaction
insert file info into database
retrieve ID of new record
move uploaded file to final resting place, using the ID as its filename
if everything goes fine, commit the transaction
Using the ID as the filename avoids filename collisions and new uploads overwriting previous ones. And if you store the uploads somewhere outside of the site's webroot, then the only access to the files will be via your scripts, giving you complete control over downloads.
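
A sketch of that flow, again in Node.js with mysql2 (in Perl, the same steps map to DBI's begin_work/commit and File::Copy's move); the table, columns, and storage path are hypothetical:

// record-upload.mjs: insert metadata, get the ID, move the file, then commit.
import { rename } from 'node:fs/promises';
import path from 'node:path';

const STORE = '/srv/uploads'; // outside the webroot, per the advice above

// `db` is an open mysql2/promise connection.
async function recordUpload(db, tmpPath, originalName, size) {
  await db.beginTransaction();
  try {
    // Insert the metadata first so the database hands back an ID.
    const [result] = await db.execute(
      'INSERT INTO pics (original_name, size) VALUES (?, ?)',
      [originalName, size]
    );
    const id = result.insertId;
    // Move the temp upload to its final resting place, named by the ID.
    await rename(tmpPath, path.join(STORE, String(id)));
    await db.commit();
    return id;
  } catch (err) {
    await db.rollback(); // undo the insert if the move (or anything) fails
    throw err;
  }
}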