Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 9 years ago.
I'm trying to figure out how to "store" simple text into a .txt file with HTML.
An HTML form page that you'd input data/text into, and it would post the text into the .txt file.
Then later you could "get" the data you posted.
I'm trying to do this without a server, all local (no PHP).
Thanks
You can't. You can store data in a cookie or in the browser's localStorage, but arbitrary filesystem writing isn't possible with just HTML.
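For example, here is a minimal sketch of the localStorage approach (the element IDs and the "note" key are just placeholders): the text is saved under a key when the form is submitted, and read back later.

    <form id="noteForm">
      <input id="noteInput" type="text" placeholder="Type something">
      <button type="submit">Save</button>
    </form>
    <p id="savedNote"></p>
    <script>
      var form = document.getElementById('noteForm');
      var input = document.getElementById('noteInput');
      var output = document.getElementById('savedNote');

      // "Post" the text: save it under a key instead of writing a .txt file.
      form.addEventListener('submit', function (e) {
        e.preventDefault();
        localStorage.setItem('note', input.value);
        output.textContent = 'Saved: ' + input.value;
      });

      // "Get" the data later: read it back when the page loads.
      var existing = localStorage.getItem('note');
      if (existing !== null) {
        output.textContent = 'Saved: ' + existing;
      }
    </script>

Note that localStorage is scoped to the page's origin and survives reloads, but it is not an actual .txt file on disk.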
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I have a string containing HTML data and I am simply trying to convert it into a PDF using ASP.NET. I've looked around in the support pages and Googled it a lot! There seems to be no simple snippet of code for this common task with a nice output. To be more specific: is it possible to convert a whole HTML page (with CSS) into a PDF file, just like any browser does for us?
Any help would be greatly appreciated! Please help me to resolve this issue.
Thanks
Are you familiar with pdf.js?
Example code: https://mozilla.github.io/pdf.js/
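pdf.js is a library for rendering PDF documents in the browser. As a rough sketch along the lines of the "Hello World" example on that page, assuming the library is loaded as the global pdfjsLib and the page has a <canvas id="the-canvas"> (both names are placeholders, and details vary between pdf.js versions):

    // 'document.pdf' is a placeholder URL for the PDF to display.
    var loadingTask = pdfjsLib.getDocument('document.pdf');
    loadingTask.promise.then(function (pdf) {
      return pdf.getPage(1);           // fetch the first page
    }).then(function (page) {
      var viewport = page.getViewport({ scale: 1.0 });
      var canvas = document.getElementById('the-canvas');
      var context = canvas.getContext('2d');
      canvas.width = viewport.width;
      canvas.height = viewport.height;
      // Draw the page into the canvas.
      page.render({ canvasContext: context, viewport: viewport });
    });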
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 8 years ago.
Like here on Stack Overflow: when one asks a question, that question is tagged. Is a new page created on the server with the name of the tag so that search engines can find it? How does this work?
If no pages are created, where is the data for all the tags kept?
I see that for this very page, Stack Overflow has this Facebook tag:
<meta name="og:description" content="Like here on stack over flow, when one asks a question, that question is ">
The data is stored in a database.
The pages are generated by having the HTTP server invoke a piece of software which examines the URL, fetches the appropriate data from the database, and outputs some HTML.
There are numerous ways to do this, including PSGI, FastCGI, CGI, and WSGI.
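As a rough illustration of that flow (not PSGI/WSGI/CGI, just a plain Node.js sketch, with an in-memory object standing in for the database; all names, routes, and the port are made up):

    var http = require('http');

    // Stand-in for a real database lookup.
    var tagDescriptions = {
      facebook: 'Questions about the Facebook platform.',
      html: 'Questions about HTML markup.'
    };

    http.createServer(function (req, res) {
      // Examine the URL, e.g. /tags/facebook -> "facebook".
      var tag = req.url.replace(/^\/tags\//, '');
      var description = tagDescriptions[tag];

      if (description) {
        // Output some HTML built from the stored data.
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end('<h1>' + tag + '</h1><p>' + description + '</p>');
      } else {
        res.writeHead(404, { 'Content-Type': 'text/plain' });
        res.end('Not found');
      }
    }).listen(8080);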
That information is used when you post a link to this page to Facebook; just try it and you will see the result.
Details: http://ogp.me/
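For example, the basic Open Graph properties described at http://ogp.me/ look like this (the values here are placeholders):

    <meta property="og:title" content="Example page title">
    <meta property="og:type" content="website">
    <meta property="og:url" content="http://example.com/page">
    <meta property="og:image" content="http://example.com/preview.png">
    <meta property="og:description" content="A short summary shown in the link preview.">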
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 8 years ago.
Is it possible to use HTML tags in an object's description, like links? I am trying to post a link on the wall with my app, but it skips the HTML stuff.
With regard to your question - no. You can't use HTML tags at all when posting data to Facebook.
Closed. This question is off-topic. It is not currently accepting answers.
Closed 11 years ago.
So, I have a website. The links have the following structure: http://example.com/1, http://example.com/2, http://example.com/3, etc. Each of these pages has a simple table. So how can I automatically download every single page to my computer? Thanks.
P.S. I know that some of you may tell me to Google it, but I don't know what I'm actually looking for (I mean what to type in the search field).
Use wget (http://www.gnu.org/software/wget/) to scrape the site.
Check out the wget command line tool. It will let you download and save web pages.
Beyond that, your question is too broad for the Stack Overflow community to be of much help.
You could write a simple app and loop through all the URLs and pull down the HTML. For a Java example, take a look at: http://docs.oracle.com/javase/tutorial/networking/urls/readingWriting.html
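The linked example is Java, but the same loop-and-save idea looks roughly like this as a Node.js sketch (the page count of 10 and the output file names are assumptions):

    var http = require('http');
    var fs = require('fs');

    function download(i) {
      http.get('http://example.com/' + i, function (res) {
        // Save the response body as a local .html file.
        res.pipe(fs.createWriteStream('page-' + i + '.html'));
      });
    }

    for (var i = 1; i <= 10; i++) {
      download(i);
    }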
Closed. This question is off-topic. It is not currently accepting answers.
Closed 11 years ago.
There is a lot of documentation on the internet, and user guides, that have no PDF version.
They have a table of contents and links.
Is there any way I can convert them to PDF for printing or offline reading?
EDIT: If I could get some script code where I give the HTTP address of the wiki documentation and the script generates a single page from the HTML links, that would be great.
Thanks
Try this popular addon for Firefox: https://addons.mozilla.org/en-US/firefox/addon/pdf-download/