"Automate JSON Files upload to Blob Storage" - json

Description:
We have an SSIS job that generates JSON files with data at a server path. We are manually copying the JSON files and dropping them in Blob Storage in order to trigger our logic app.
Now, could anyone provide information on how we can automate copying the JSON files to Blob Storage? (For example, is there an approach or code to pick up the JSON files at a specific time and copy them to Blob Storage?)

The solution is to listen for file system changes at your server path, then use the Azure Storage SDK to upload the files, triggered by the file-changed event.
For reference, since I don't know which language you want to use, here are some resources (API docs and SO threads) for watching file changes in different languages:
C#: FileSystemWatcher Class
Python: How do I watch a file for changes?
Node.js: Observe file changes with node.js
For other languages, I think you can easily find a solution by searching. To upload files to Azure Storage, you just need to refer to Azure's official get-started tutorials in the different languages to write your code; a sketch of the watch-and-upload flow follows below.
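As a concrete illustration, here is a minimal Node.js/TypeScript sketch that combines the two halves. It assumes the chokidar file-watcher package and the official @azure/storage-blob SDK; the watched folder, connection string, and container name are placeholders to adapt, not values from the original question.

```typescript
// Minimal sketch, assuming the "chokidar" watcher package and the official
// "@azure/storage-blob" SDK. Folder, connection string, and container name
// below are placeholders.
import * as path from "path";
import chokidar from "chokidar";
import { BlobServiceClient } from "@azure/storage-blob";

const WATCH_DIR = "C:\\ssis\\output"; // hypothetical SSIS output path
const CONTAINER = "json-drop";        // hypothetical container name

const containerClient = BlobServiceClient
  .fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING!)
  .getContainerClient(CONTAINER);

chokidar
  // awaitWriteFinish delays the event until SSIS has finished writing the file
  .watch(WATCH_DIR, { ignoreInitial: true, awaitWriteFinish: true })
  .on("add", async (filePath) => {
    if (path.extname(filePath).toLowerCase() !== ".json") return;
    const blobClient = containerClient.getBlockBlobClient(path.basename(filePath));
    await blobClient.uploadFile(filePath); // the upload triggers the logic app
    console.log(`Uploaded ${filePath} to container "${CONTAINER}"`);
  });
```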

Related

Databricks Autoloader Files Process issue

I have zip files in my container; I get one or more files every day, and as they come in I want to process them. I have some questions.
Can I use the Databricks Auto Loader feature to process zip files? Are zip files supported by Auto Loader?
What settings need to be enabled to use Auto Loader? I have my container and SAS token.
Once a zip file is processed (unzipped, each file inside it read), I should not read that zip again. How can I do this when I use Auto Loader? Is there any specific setting?
Are there any samples available? I'm new to this area and trying to get more information.
Unfortunately, processing zip files with Azure Databricks is not possible.
Auto Loader supports two modes for detecting new files: directory listing and file notification.
Auto Loader provides a Structured Streaming source called cloudFiles.
Given an input directory path on the cloud file storage, the cloudFiles source automatically processes new files as they arrive, with the option of also processing existing files in that directory. Auto Loader can scale to loading data from storage accounts that contain billions of files that need to be backfilled to pipelines where millions of files are loaded in an hour.
For more information, you can refer to this Microsoft document.

Storing and loading data from react-native calendar to a JSON file

I'm currently thinking through a concept for a react-native app where people add events over a period of time, like a diary/log. These events need to be exported, and for security and privacy reasons I don't want to use a database. I've read that you can use JSON files as a storage method, too.
How can I store data from the app in a JSON file and load the data from the JSON file back into the app? I don't need any code; helpful articles or webpages are appreciated.
Assuming that you already have all the app data in a JSON object, it's rather simple (a sketch follows after these steps):
Decide where to store the appdata.json; let's call it APP_DATA_PATH.
Find a library to read/write files (I've only used expo-file-system).
On app boot, check if APP_DATA_PATH exists; if it does, read the file and load it into the app, if not, assume it's a new user.
Whenever the app data changes, write the changes to APP_DATA_PATH.
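Here is a minimal TypeScript sketch of those steps, using the expo-file-system package mentioned above; the appdata.json file name and the AppData shape are illustrative assumptions.

```typescript
// Minimal sketch using expo-file-system; the file name and AppData shape are
// illustrative assumptions, not from the original answer.
import * as FileSystem from "expo-file-system";

type AppData = { events: Array<{ date: string; text: string }> }; // hypothetical shape

const APP_DATA_PATH = FileSystem.documentDirectory + "appdata.json";

// On app boot: load existing data, or fall back to a fresh state for a new user.
export async function loadAppData(): Promise<AppData> {
  const info = await FileSystem.getInfoAsync(APP_DATA_PATH);
  if (!info.exists) return { events: [] }; // new user
  const raw = await FileSystem.readAsStringAsync(APP_DATA_PATH);
  return JSON.parse(raw) as AppData;
}

// Whenever app data changes: persist the whole object.
export async function saveAppData(data: AppData): Promise<void> {
  await FileSystem.writeAsStringAsync(APP_DATA_PATH, JSON.stringify(data));
}
```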

Load and Save JSON file react client-side only

I am building a simple editor-type application in react-redux, and I want to mimic the operation of downloading and uploading JSON files for saving and loading data, entirely client-side. The server side does not need the data. Local storage may be too small, and it would be nice to give the user the data in a portable file they could upload on a new machine. Is this even possible, and if so, how?
Use a Blob file.
You can set the contents of a new file, which is temporary and local, then trigger a click event to download the file (a short sketch follows below).
Duplicate answers here and here.
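For illustration, a minimal browser-side TypeScript sketch of that approach; the function names and default file name are assumptions.

```typescript
// Browser-only sketch of the Blob approach; function and file names are
// illustrative. Works entirely client-side, no server involved.
function downloadJson(state: unknown, filename = "document.json"): void {
  const blob = new Blob([JSON.stringify(state, null, 2)], {
    type: "application/json",
  });
  const url = URL.createObjectURL(blob); // temporary, local object URL
  const a = document.createElement("a");
  a.href = url;
  a.download = filename;
  a.click(); // triggers the browser download
  URL.revokeObjectURL(url);
}

// Loading is the reverse: read a user-chosen file from an <input type="file">.
async function loadJson(file: File): Promise<unknown> {
  return JSON.parse(await file.text());
}
```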

How to bulk import documents with custom metadata from csv to Alfresco repo?

I have an Excel file (or CSV) that holds a list of documents with their properties and absolute paths on the local hard drive.
Now that we are going to use Alfresco (v5.0.d) as our DMS, I have already created a custom aspect that reflects the CSV fields, and I'm looking for a good approach to import all the documents from the CSV file into the Alfresco repository.
You could simply write a Java application that parses your CSV and uploads the files, file by file, using the RESTful API (sketched below). Do not forget to replicate the folder tree in your Alfresco repo, as it is not recommended to have more than 1,000 folders/documents on the same level of the hierarchy, since that would require some tweaking in a few non-trivial use cases.
To create the folder, refer to this answer.
To actually upload the files, refer to my answer here.
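To keep the examples in this digest in one language, here is the same CSV-then-upload flow sketched in Node.js/TypeScript rather than Java. It assumes Alfresco's classic multipart upload web script at /alfresco/service/api/upload with its filedata/destination form fields; verify the endpoint, field names, and credentials against your 5.0.d installation, and note that setting the custom-aspect metadata would be a separate call per node.

```typescript
// Hedged sketch: CSV-driven bulk upload against Alfresco's classic upload web
// script. Endpoint, form fields, CSV layout, and credentials are assumptions
// to verify against your installation. Requires Node 18+ (global fetch/FormData).
import { readFileSync } from "fs";
import { basename } from "path";

const ALFRESCO = "http://localhost:8080/alfresco"; // placeholder URL
const AUTH = "Basic " + Buffer.from("admin:admin").toString("base64");

async function uploadOne(localPath: string, destinationNodeRef: string) {
  const form = new FormData();
  form.append("filedata", new Blob([readFileSync(localPath)]), basename(localPath));
  form.append("destination", destinationNodeRef); // target folder nodeRef
  const res = await fetch(`${ALFRESCO}/service/api/upload`, {
    method: "POST",
    headers: { Authorization: AUTH },
    body: form,
  });
  if (!res.ok) throw new Error(`Upload failed for ${localPath}: ${res.status}`);
}

// One CSV row per document: "path,prop1,prop2,..."; the extra columns would
// feed your custom aspect in a follow-up metadata call.
async function bulkImport(csvPath: string, destinationNodeRef: string) {
  const rows = readFileSync(csvPath, "utf8").trim().split("\n").slice(1);
  for (const row of rows) {
    const [path] = row.split(",");
    await uploadOne(path, destinationNodeRef);
  }
}
```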

Is there any way to fill in Sharepoint entries via parsing text file?

My workplace has a whole bunch of unannotated .zip files that need to be uploaded to the new file server (Windows). I've used Perl to parse the Excel files within the .zip files and create an annotation.txt file for each .zip file that contains information about it. I have thousands of zip files and do not want to manually enter the information for each entry if there's a way to automate it. I am proficient in Perl and MySQL, and I am wondering if there is any way to utilize my skill set to port this information into the Microsoft SharePoint website.
Thank you in advance for any advice or suggestions.
There are many, many ways to meet your requirement.
You could write an event receiver to parse the files once uploaded and set the metadata.
A better approach for your use case might be to write a .NET console application that references Microsoft.SharePoint.Client, uploads your files using the client-side object model (CSOM), and sets the metadata during that process, as outlined here: Upload a document to a SharePoint list from Client Side Object Model.
There are also REST and ASMX web services that you could call from a non-.NET runtime process (a sketch follows below).
Plenty of options; pick the one that best fits your needs and skills.
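Since the asker is working outside .NET, here is a hedged TypeScript sketch of the REST route. The endpoints are the standard SharePoint 2013+ REST API, but the site URL, library name, "Annotation" column, list item type name, and bearer-token header are all placeholders; the actual authentication mechanism depends on your farm, and the same HTTP calls can be made from Perl.

```typescript
// Hedged sketch of the SharePoint REST route: upload a .zip, then merge parsed
// metadata into its list item. Site URL, library, column, __metadata type, and
// auth are placeholder assumptions.
import { readFileSync } from "fs";
import { basename } from "path";

const SITE = "https://intranet.example.com/sites/archive"; // placeholder site
const AUTH = { Authorization: "Bearer <token>" };           // placeholder auth

async function uploadWithMetadata(zipPath: string, annotation: string) {
  const name = basename(zipPath);

  // 1. Upload the file into the document library.
  await fetch(
    `${SITE}/_api/web/GetFolderByServerRelativeUrl('Shared Documents')` +
      `/Files/add(url='${name}',overwrite=true)`,
    { method: "POST", headers: { ...AUTH }, body: readFileSync(zipPath) }
  );

  // 2. Merge the parsed annotation into the file's list item. The __metadata
  //    type and the "Annotation" column are list-specific placeholders.
  await fetch(
    `${SITE}/_api/web/GetFileByServerRelativeUrl(` +
      `'/sites/archive/Shared Documents/${name}')/ListItemAllFields`,
    {
      method: "POST",
      headers: {
        ...AUTH,
        "Content-Type": "application/json;odata=verbose",
        "X-HTTP-Method": "MERGE",
        "IF-MATCH": "*",
      },
      body: JSON.stringify({
        __metadata: { type: "SP.Data.Shared_x0020_DocumentsItem" },
        Annotation: annotation,
      }),
    }
  );
}
```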