Heroku - retrieve remote changes of JSON file

My site, hosted on Heroku, takes user inputs and updates a JSON file on the remote server. I probably should have stored the data in a database instead; that would be a better solution. But is there any way I can download the most up-to-date JSON files from the Heroku remote server?

Heroku doesn't offer any mechanism to commit files directly on the server, or to copy files from the server. One of the main reasons is its ephemeral filesystem:
Each dyno gets its own ephemeral filesystem, with a fresh copy of the most recently deployed code. During the dyno’s lifetime its running processes can use the filesystem as a temporary scratchpad, but no files that are written are visible to processes in any other dyno and any files written will be discarded the moment the dyno is stopped or restarted. For example, this occurs any time a dyno is replaced due to application deployment and approximately once a day as part of normal dyno management.
If the file is accessible over the web you might be able to download it from your browser, but whatever file you created may not be there anymore. You're right that a database is a better choice.
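If your app does expose the file over HTTP, you can simply fetch that URL. As a minimal sketch (assuming a Flask app and a hypothetical data/inputs.json path; your framework and path may differ), you could add a route that returns the current file so you can grab it from the browser while the dyno is still running:

import flask

app = flask.Flask(__name__)

# Hypothetical path where the app writes user input; adjust to your setup.
JSON_PATH = "data/inputs.json"

@app.route("/export/inputs.json")
def export_inputs():
    # Serves whatever is currently on the dyno's ephemeral filesystem;
    # if the dyno has restarted since the last write, the file is gone.
    return flask.send_file(JSON_PATH, mimetype="application/json")

This only retrieves whatever the current dyno has, and only until its next restart, so it is a stopgap rather than a fix.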

Related

How do I download CSV changes from my Heroku app

I made a Heroku app with Streamlit and used two CSV files to save changes. The app is a schedule for group plans; the changes people introduce in the schedule are visible online, but when I check my git repository it is not updated.
How can I download the modified CSV files?
Thanks
Your git repository stores the application source code, which gets deployed to Heroku.
At runtime your application uses Heroku's local storage when saving files (not the git repository), so you need to download/fetch the CSV files from the running application.
Given that the Heroku filesystem is ephemeral (local files are removed when the application restarts), it is not a good idea to persist data on the local filesystem; use external storage instead.
You can check out some options in the HerokuFiles GitHub repository. If you want the CSV files to be stored with the application source code, you can use PyGithub to perform a commit, as sketched below.
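A minimal sketch of committing an updated CSV back to the repository with PyGithub, assuming a personal access token and a hypothetical data/schedule.csv path (adjust both to your setup):

from github import Github

# Hypothetical token, repository name, and file path for illustration.
gh = Github("YOUR_GITHUB_TOKEN")
repo = gh.get_repo("your-user/your-schedule-app")

# Read the CSV the app just wrote to the dyno's local filesystem.
with open("data/schedule.csv") as f:
    new_content = f.read()

# Fetch the tracked version to get its SHA, then commit the update.
existing = repo.get_contents("data/schedule.csv")
repo.update_file(
    path=existing.path,
    message="Update schedule CSV from the running app",
    content=new_content,
    sha=existing.sha,
)

Note that every user change then becomes a commit in your repository, which is workable for a small schedule app but is another reason an external database or object store scales better.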

PhpStorm speed up project creation when working with existing files on a remote server

When creating a new project from existing files on a remote server (via SFTP), it takes a long time to download all the files (three hours for a Magento 2 installation). If I were to compress the installation and transfer it myself, it would take less than ten minutes.
Is there a way I can manually transfer the files (outside of PhpStorm) onto my dev machine, then have PhpStorm index the local copy of the files for its code intelligence, but tell it about the link to the remote server, for deployment and debugging?

How could I automatically upload files from my directory to a server? [duplicate]

An ASP.NET application (running on Windows Server/IIS 7) has to transfer large files uploaded by the current user to an external SFTP server. Due to the file size, the idea is to do this asynchronously.
The idea is that the ASP.NET application stores the uploaded file in a local directory on the Windows server, and the current user can continue his work. A Windows service or a Quartz job (other tools(*)/ideas?) is then responsible for transferring the file to the external SFTP server.
(*) Are there existing tools that listen for changes in a Windows directory and then move the files to an SFTP server (including handling communication errors/retries)?
If there is no existing solution, have you had similar requirements? What do we have to consider? Because the connection to the SFTP server is not very stable, we need robust error handling with automatic retry functionality.
To watch for changes in a local directory in .NET, use the FileSystemWatcher class.
If you are looking for an out-of-the-box solution, use the keepuptodate command in WinSCP scripting.
A simple example of a WinSCP script (e.g. watch.txt):
open sftp://username:password@host/
keepuptodate c:\local_folder_to_watch /remote_folder
exit
Run the script like:
winscp.com /script=watch.txt
Though this works only if the uploaded files are preserved in the remote folder.
(I'm the author of WinSCP)

Open folder vs. create new project from existing files located on a shared network drive in PhpStorm

It's not clear to me why I should use the option in PhpStorm to create a new project from existing files instead of just opening a folder and declaring it the project directory.
I have a web server installed and I can access its root via a shared network drive. I can just open that folder in PhpStorm and declare it the root; PhpStorm will generate a project in the given directory.
But there is also an option to create a new project from existing files (located on the shared network drive). My best guess is that this option is the way to go. Is this true, and if so, why? Or if it doesn't matter, why doesn't it?
There will be several people using the same shared drive to work on different projects in the webroot.
You can, of course, create a project on a mounted network drive via File | Open, but note that this is not officially supported. All IDE functionality is based on the index of the project files, which PhpStorm builds when the project is loaded and updates on the fly as you edit your code. To provide efficient coding assistance, PhpStorm needs to re-index code fast, which requires fast access to project files and caches storage. The latter can be ensured only for local files, that is, files that are stored on your hard disk and are accessible through the file system. Sure, mounts are typically on a fast network, but one day some hiccup happens, a user sends a stack trace, and all we see in it is a blocking I/O call.
So the suggested approach is to download the files to your local drive and use a deployment configuration to synchronize local files with the remote server. See https://confluence.jetbrains.com/display/PhpStorm/Sync+changes+and+automatic+upload+to+a+deployment+server+in+PhpStorm

Maximum filesize per file the FIP service can handle

We got this alert, and it appears that the controller server was unable to transfer the file to our production server(s) because the file was too large:
exception: tooltwist.fip.FipException: File is too large to be downloaded: tomcat/bin/synnexESDClient.2013-04-30.log
Upon checking the controller's image, the file's size is 84 MB:
-rw-rw-r--. 1 controller controller 84M Apr 30 23:59 synnexESDClient.2013-04-30.log
What is the maximum filesize per file that the FIP service can transfer from the controller to the production server(s)? Or is there a config file for the FIP service that we can check?
I'm not certain of the maximum size of FIP file transfers, but it's probably a power of 2 (32 MB, 64 MB, etc.).
In any case, the purpose of FIP (File Installation Protocol) is to incrementally deploy an application to production servers. Including large log files in the software distribution process is likely to jam up your website updating process, as it is installed to a dozen or more servers (especially when some are on the other side of the country).
First, you might want to consider whether you really want to deploy a log file from the Controller to production servers (what is the Controller doing that creates that log file, and why do you want it on production servers?).
If you really need to copy that file to production servers, I suggest you do it independently of the software and web file installation process. To do this, include the log file in the exclusion list for FIP and then copy it by hand, for example as sketched below.
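A minimal sketch of copying the excluded log file to the production servers outside of FIP, using Python and paramiko; the host names, username, and destination path are hypothetical placeholders:

import paramiko

# Hypothetical hosts and paths; key-based SSH authentication is assumed.
PRODUCTION_HOSTS = ["prod1.example.com", "prod2.example.com"]
LOG_FILE = "tomcat/bin/synnexESDClient.2013-04-30.log"

for host in PRODUCTION_HOSTS:
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username="deploy")
    sftp = client.open_sftp()
    # Copy the log file independently of the FIP installation process.
    sftp.put(LOG_FILE, "/var/log/synnex/" + LOG_FILE.split("/")[-1])
    sftp.close()
    client.close()

A plain scp or rsync in a cron job would do the same; the point is simply to keep the large log out of the FIP distribution path.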