I have an email with a link to a video file stored on a cloud hosting service (backblaze)
I'm trying to make it so that when someone clicks the link, the file starts to download. Right now I have this:
<a href="...">Download Here</a>
That takes you to the video instead of downloading the file. I'd like to avoid that if possible and have the link prompt you to start downloading the file when you click on it.
Is this possible to do in an email when the file is stored on a server somewhere?
Thanks
I don't think you can do this in plain HTML.
Since you can't use JavaScript in an email, the best option is to have a small PHP script on the server that does the job.
This example was taken from Server Fault:
PHP
<?php
// We'll be outputting an MP4
header('Content-Type: video/mp4');
// It will be called downloaded.mp4
header('Content-Disposition: attachment; filename="downloaded.mp4"');
// Add your file source here
readfile('original.mp4');
?>
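Then point the link in the email at that script rather than at the raw video URL, e.g. <a href="https://yourserver.example/download.php">Download Here</a> (the script name and URL here are placeholders). The browser follows the link, sees the Content-Disposition: attachment header, and prompts the user to save the file. Note that readfile() can also read a remote URL (such as the Backblaze link) when allow_url_fopen is enabled, so the script can proxy the hosted file instead of a local copy.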
I need to upload a file repeatedly from the browser (automatically), but every refresh asks me to confirm resubmitting the form.
How can I POST a form with a specified file?
Sorry for my English.
It would help if you shared what you have already tried, so we know where you are getting stuck.
Basically if you need to upload a file to a remote server you will need a dynamic language like PHP, Python, etc.
You can't send files to a remote server using plain HTML.
For security reasons HTML itself won't let you send files to a remote webserver via <form> automatically.
For that feature you would have to have your webserver handle the form via a special file, for example <form action="exampleFormHandler.php" method="post">.
Placed on your webserver, this form-handling file would then take care of receiving and saving the uploaded file.
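As a rough illustration (not part of the original answer), here is a minimal PHP sketch of what such a handler could look like, assuming the form uses enctype="multipart/form-data" and a file input named "userfile"; the script name and the uploads/ directory are hypothetical:

<?php
// exampleFormHandler.php -- minimal sketch; "userfile" and the uploads/ directory are assumptions
if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    // Never trust the client-supplied filename; strip any path components first
    $name = basename($_FILES['userfile']['name']);
    move_uploaded_file($_FILES['userfile']['tmp_name'], __DIR__ . '/uploads/' . $name);
    echo 'Upload saved as ' . htmlspecialchars($name);
} else {
    echo 'Upload failed.';
}
?>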
I am wondering if I can have a webpage where I can tell it to grab my file and put it in a directory, such as: "http://example.ex/folder". Meaning the file I provided is put into the "folder" folder.
Overall process:
Button says: "Import file"
I select a file, and my file is "text.txt"
It takes my file "text.txt" and adds it to the local system/directory of the website.
You can do this using jQuery File Upload and then adding a backend service that captures the file and saves it.
For example, here is a repository that has a basic Python (Flask) server integrated with jQuery File Upload that will take an uploaded file and place it on the server:
https://github.com/ngoduykhanh/flask-file-uploader
I'd put the rest of the code here, but it is a lot - and requires HTML, JavaScript and a back-end language (like Python).
Here is the documentation for jQuery File Upload: https://github.com/blueimp/jQuery-File-Upload
As a word of caution, DO NOT TRUST ANYTHING UPLOADED TO YOUR SERVER. Meaning, do not put it out on the open internet without some sort of authentication or checks in place to make sure only files you intend are uploaded. Otherwise, people will find it and upload scripts turning your device into a Bitcoin miner, spam relay, or bot host.
Instead of doing it this way, why not use SFTP to upload it to your server to host? At least that way you can lock down access.
I have a webpage with a bunch of download links and a name before each download link, the format of the webpage looks like this:
File One's name
direct download link here
File Two's name
direct download link here
and so on.
Is it possible to write a program that mass-downloads all the links on the page, naming each file after the name above its respective link? What can I use to write something that can do this?
I would recommend writing a scraper in Python using the Scrapy framework.
Using this framework you will be able to extract all of the links from the page and pass them to a media pipeline in Scrapy; this media pipeline will allow you to download and save the files with custom names (for example, the name above each link).
It's a lot to take in if you haven't programmed, used Python, or used Scrapy before, but there are loads of tutorials out there to help you.
Good luck!!
You can start by initialising a variable for the name of your download file.
$filename = "XYZ"; // file name
Then, in the download function, use that variable so it becomes the name of the downloaded file.
// header info for the browser
header("Content-Type: application/vnd.ms-excel"); // standard MIME type for .xls
header("Content-Disposition: attachment; filename=$filename.xls");
header("Pragma: no-cache");
header("Expires: 0");
readfile($file_path); // then send the file contents ($file_path is a placeholder for the file you are serving)
With this, the downloaded file will be named XYZ with the extension .xls.
(I am sorry if my question is not in the right place; I've been thinking for a while and came to the conclusion that this is the best place for it.)
Is it possible to create an HTML web page that lets a user download a certain file from it, but does not disclose the location of that file (i.e. the user would not know the URL of the file he is downloading)?
If yes, could you please give me some direction as to which HTML code I should use to create such a page?
The HTML page would provide a link to a server-side script, passing a filename or other unique moniker:
<a href="...">Download Now</a>
The script would read the identifier, derive a full path from it, load the file, and write it back with the appropriate headers/MIME type, causing the browser to prompt the user with the normal download dialog.
The only location data available to the user would be the link to the script, which would (unless you add some security) serve back the file just as if it were a standard URL pointing to a file.
(PHP Example)
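The linked example isn't included above, so the following is only a rough sketch of the idea in PHP; the script name, the "file" query parameter, and the storage path are all assumptions:

<?php
// download.php -- minimal sketch; the "file" parameter and the storage directory are assumptions.
// The files live outside the web root, so their real location is never exposed to the user.
$name = basename($_GET['file'] ?? '');   // strip any path components from the requested name
$path = '/var/private_files/' . $name;   // hypothetical private storage location
if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit('File not found');
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
?>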
With pure HTML, no. But with a server-side script (PHP, C#, VB, Perl, or another language), yes. You would stream the file to the user; that way only the server-side script has access to the original files.
I have a web page that will be used to create KML files with a Perl script. I want the user to add some data to a form that will be used in my Perl script. When the form is submitted it will run the script, create a KML file, then prompt the user to save the file. The only part I am not sure about is how to have the user save the file after the script has created the KML. Should the Perl script prompt the download, or should something on the HTML page prompt the download? I am not sure of the best way to do this.
If you have a link or a form for telling the server to build the KML then just generate the KML normally and send it back to the browser with some extra HTTP headers. The headers you want are:
Content-disposition set to attachment;filename=whatever.kml where "whatever.kml" is what you want the file to be called.
Content-type set to application/vnd.google-earth.kml+xml.
The Content-disposition should tell the browser to download the KML instead of trying to handle it.
So the Perl script will be prompting the browser to prompt the download.
Assuming the contents of the kml file are in $kml then you'd want to do something like:
use CGI;
my $cgi = new CGI;
print $cgi->header('-Content-disposition' => 'attachment;filename=kml.xml',
'-Content-type' => 'application/vnd.google-earth.kml+xml');
print $kml;
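On the HTML side, the form that collects the user's data would simply post to this script (for example <form action="/cgi-bin/make_kml.pl" method="post">, where the path is a placeholder); when the browser receives the response with that Content-disposition header, it shows the save dialog instead of rendering the KML.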