Pushing data to a local .json file via the $http service in Angular?

Is it possible to push data to a local json file stored in my app folder using the $http service?
I've tried fiddling around with $http.post/.get and can't find a way to save/push new data into my local .json file.

JavaScript in the browser is client-side code and therefore has no ability to modify files on your server (if you're just playing with directories on your local machine then, for all intents and purposes, the local machine is your server). Its only concern is client-side logic. You need some type of server backend to do this.
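As a rough sketch of what that backend could look like (assuming Node.js with Express; the data.json file name and the /api/items route are made up for illustration):

```javascript
// server.js - a sketch of a small backend that persists posted JSON to disk.
const express = require('express');
const fs = require('fs');

const app = express();
app.use(express.json()); // parse JSON request bodies

app.post('/api/items', (req, res) => {
  // Read the current array (or start fresh), append the new item, write it back.
  const items = fs.existsSync('data.json')
    ? JSON.parse(fs.readFileSync('data.json', 'utf8'))
    : [];
  items.push(req.body);
  fs.writeFileSync('data.json', JSON.stringify(items, null, 2));
  res.sendStatus(204);
});

app.listen(3000);
```

On the Angular side you would then call something like $http.post('/api/items', newItem) instead of trying to write the file directly.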

Related

How can I create a createReadStream of a video located on a different server from the one where we host the application

I have a NestJS project in which I need to store videos and save their paths in a MySQL database. My question is: how can I create a createReadStream for one of these videos, knowing that the videos and the database will be on one server and the application on another?
fs.createReadStream() works just fine if you give it a full OS path. If you're having trouble with that, then we need to see what your "absolute path" is.
Usually, you would not store a full OS path in a database because that "hard wires" your implementation to a specific disk layout. Instead, you would usually store a relative path only in the database and then combine that with a base path that's part of the configuration of your app. That gives you more flexibility if you ever need to rearrange how things are stored on disk without having to rewrite every path in the database. Instead, you can just change the base path in your configuration to point to the new path.
For example, imagine you outgrow your current system disk and add a new, faster and larger disk. When storing only relative paths in the database, you could shut down your app, copy all the files over to the new disk, adjust the base path in your app's configuration, restart your app, and it would be up and running with the new location. If you had stored absolute paths in your database, you'd have to write a DB script to rewrite every single path in the database to the new location.
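A minimal sketch of that layout in a Node.js app, where only a relative path lives in the database and the base path comes from configuration (VIDEO_BASE_PATH and the example paths are assumptions):

```javascript
// Combine a configured base path with the relative path stored in MySQL.
const path = require('path');
const fs = require('fs');

const basePath = process.env.VIDEO_BASE_PATH || '/mnt/videos'; // set per deployment

function streamVideo(relativePathFromDb, res) {
  // e.g. relativePathFromDb = 'uploads/2023/movie.mp4' as stored in the database
  const fullPath = path.join(basePath, relativePathFromDb);
  fs.createReadStream(fullPath).pipe(res);
}
```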
Can I host the database on one server and the application on another? If yes, how can I do that?
The database can be wherever you want. But, the files themselves that you want to stream as a response to an http request will have to be accessible from the web server. If you want to use fs.createReadStream() as the source to stream them, then the files have to be accessible via an OS file path because fs.createReadStream() only works with a file path. If the files are stored elsewhere that doesn't have OS file path access to your web server (like say in a cloud service), you'd have to find some other way to read/stream them from your web server.
How can I create a createReadStream for one of these videos, knowing that the videos and the database will be on one server and the application on another?
You can use fs.createReadStream() only if you have OS level file access to the other server (likely via some shared file system mechanism). For example, the files could be stored on some shared file server.
If you don't have OS level file access, then you will have to assess what type of access you do have and pick an appropriate mechanism for streaming the files from there. For example, if you have web access to the files (they are accessible via some URL), then you could either redirect the client to get the files directly from the other web server or you could stream them from that other server yourself using an http library that supports streaming such as got() or axios(). You could then pipe that http stream into your response - similar to what you would do with the stream from fs.createReadStream().
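For illustration, a sketch of the proxy-streaming approach using axios (the host name, route and port below are assumptions, not something from your setup):

```javascript
// Proxy-stream a video from another web server to the client.
const express = require('express');
const axios = require('axios');

const app = express();

app.get('/videos/:name', async (req, res) => {
  const remoteUrl = `http://files.example.com/videos/${req.params.name}`; // hypothetical
  const upstream = await axios.get(remoteUrl, { responseType: 'stream' });
  res.set('Content-Type', upstream.headers['content-type']);
  // Pipe the upstream HTTP stream into our response, much like you would
  // pipe the stream from fs.createReadStream() for a local file.
  upstream.data.pipe(res);
});

app.listen(3000);
```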

Best practices for a database connection file with PHP and Azure

I have a PHP application I am wanting to deploy to Azure via GitHub. One of the files is a connection to a MySQL DB which, for obvious reasons, I don't want to have tracked on GitHub. The issue I am running into is getting connected to the DB and displaying my webpage properly, because the connect.php file isn't in GitHub. What is the best way to get that file to Azure without going through GitHub?
In your connect.php file, get your values from environment variables instead of setting them explicitly. Then, in your Azure portal, go to the web app's Application Settings blade and set your environment variables under the App settings section.
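A minimal sketch of such a connect.php, assuming MySQLi; the variable names (DB_HOST and so on) are placeholders you would match to your App settings:

```php
<?php
// connect.php - read connection details from environment variables set under
// App settings in the Azure portal, instead of hard-coding credentials.
$conn = new mysqli(
    getenv('DB_HOST'),
    getenv('DB_USER'),
    getenv('DB_PASS'),
    getenv('DB_NAME')
);
if ($conn->connect_error) {
    die('Connection failed: ' . $conn->connect_error);
}
```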

How to upload a CSV file to a microservice deployed in Cloud Foundry

I am new to Cloud Foundry. I am currently working on a requirement where I have to upload a CSV file (via a JSP UI) to a service deployed in Cloud Foundry and persist its data in the service.
The issue is that from the UI I only get the local path of the CSV file, and when I try to parse the CSV via this path the file is not recognized. I guess the reason is that the service is already deployed in CF, so it does not recognize this local machine path.
Can you please let me know how I can parse this CSV file from the local machine, and where this parsing should happen.
Thanks in advance!
There is nothing specific to Cloud Foundry about how you would receive an uploaded file in a web application. Since you mentioned using Java, I would suggest checking out this post.
How to upload files to server using JSP/Servlet?
The only thing you need to keep in mind that's specific to Cloud Foundry is that the filesystem in Cloud Foundry is ephemeral. It behaves like a normal filesystem and you can write files to it, however, the lifetime of the filesystem is equal to the lifetime of your application instance. That means restarts, restages or anything else that would cause the application container to be recreated will destroy the file system.
In short, you can use the file system for caching or temporary files but anything that you want to retain should be stored elsewhere.
https://docs.cloudfoundry.org/devguide/deploy-apps/prepare-to-deploy.html#filesystem

Can I update an AWS website with a local JSON file?

I have a static website currently hosted on AWS, and I suppose it's static in the sense that I can't update it without manually changing the HTML and then re-uploading it to AWS. I want to make it easier for myself to update certain sections (particularly the 'dates' section), so I was thinking of using a JSON object. Ideally the AWS website would be able to update from a JSON file on my local/personal computer, but I'm not sure if that's possible. Do I need the JSON file to be uploaded to a web server/AWS every time I change it? I would like to just update my JSON file locally and not have to change/update anything in AWS. Is this possible, or do I need some type of API?
From what I gather from your question, I can think of the two use cases below:
1) In case your static website is hosted on S3, you can use the AWS CLI (Command Line Interface): https://aws.amazon.com/cli/. This will allow you to upload the files directly from your local machine to the S3 bucket (see the example command after this list).
2) In case it is hosted on an EC2 instance, you can set up a git repository for your website and push the changes made to the git server. The same git server can then be used to pull the latest changes on your EC2 instance.
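For the S3 case, a single CLI command is enough to push the updated file after you edit it locally (the bucket name and file name here are hypothetical):

```
aws s3 cp dates.json s3://your-website-bucket/dates.json
```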

Posting an (image) URL of a file from a file server, plus associated data, as JSON via HTTP

I have developed a Node.js & Express application that sends data as JSON.
Part of the application needs to send the location of image files stored on different file servers (Windows Server 2012).
Currently the Node.js application delivers the file location in a format like
[{"MulPfadS":"M:\\Originalbilder\\fos\\EID","MulDateiS":"EID00124","MulExtentS":"jpg","ObjId":178983}]
Now the customer wants the location of the file in a format like
http://10.9.0.11/path_to_image
There is no web server running on the file servers, and the customer doesn't want to copy the image files from the file servers to the server where the Node.js application is running.
Is there a way to resolve this issue? And if so, how?
You could convert the image to a base64-encoded string and transfer it across.
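A rough sketch of that idea, assuming the file-server share (the M:\ path from the example record) is mounted and readable from the machine running Node.js:

```javascript
// Build the JSON response with the image embedded as a base64 data URL.
// Field names follow the example record; the mounted-share assumption is ours.
const fs = require('fs');
const path = require('path');

function buildRecordWithImage(row) {
  // row = { MulPfadS: 'M:\\Originalbilder\\fos\\EID', MulDateiS: 'EID00124', MulExtentS: 'jpg', ObjId: 178983 }
  const filePath = path.join(row.MulPfadS, `${row.MulDateiS}.${row.MulExtentS}`);
  const imageBase64 = fs.readFileSync(filePath).toString('base64');
  return {
    ObjId: row.ObjId,
    // A data URL the client can place directly into an <img src="..."> attribute
    image: `data:image/${row.MulExtentS};base64,${imageBase64}`
  };
}
```

Note that base64 encoding inflates the payload by roughly a third, so this fits small images better than large ones.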