How can I post data (a form) to an HTML page and hijack the data in the middle?

The site address: http://www.ynet.co.il/YediothPortal/Ext/TalkBack/CdaTalkBack/1,2497,L-3650194-0-68-544-0--,00.html
Fill the form with rubbish.
Hit 'Send'.
The form posts the data to another HTML page without any parsing of the data I've just added.
How do they do it?

A likely option is that they are using a content management system where "html" on the URL doesn't actually mean it's a static html file.

This may be out of left field, but I've certainly used the occasional JS function to grab everything in the header and either parse it or pass it to another script using AJAX.
I'll sometimes use this method in a 404.html page to grab the headers of the previous page, parse them out to see where someone was trying to go and redirect them.
That is, as annakata said, one of the numerous options available.
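As a loose illustration of that sort of approach (entirely hypothetical; the paths and the redirect rule are made up), a custom 404.html might inspect the requested path and the referring page and decide where to send the visitor:

// Hypothetical snippet for a custom 404.html page: look at the path the
// visitor asked for and the page that sent them here, then redirect.
var requested = window.location.pathname;
var cameFrom = document.referrer;

if (/^\/old-section\//.test(requested)) {
    // e.g. forward requests for a retired section to its new home
    window.location.replace(requested.replace(/^\/old-section\//, "/new-section/"));
} else {
    console.log("404 for " + requested + ", referred by " + cameFrom);
}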

Edit based on clarified question:
Numerous frameworks can be configured to intercept an HTML request - for instance, ASP.NET can be set to handle any given extension, and an HttpModule could do anything with such a request. It's really up to the web server configuration what it does with any request.
Also: you don't really want to be saying "hijack".
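As a rough sketch of that idea - using Node/Express here instead of ASP.NET purely for illustration, with a made-up route, port and response that have nothing to do with the ynet site - a URL ending in .html can be handled by server-side code rather than served as a static file:

// Hypothetical Node/Express sketch: a URL that "looks like" a static
// .html page is actually handled by server-side code.
var express = require("express");
var app = express();

app.use(express.urlencoded({ extended: true }));

// Any POST to a path ending in .html is routed here, which can validate,
// store or forward the submitted form data before responding.
app.post(/\.html$/, function (req, res) {
    console.log("Form data received:", req.body);
    res.send("<html><body>Thanks for your comment.</body></html>");
});

app.listen(3000);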

Related

Does html structure make difference in get or post request?

I have an href; when I click on it, it goes with a "POST" request.
(PS: I am using the Magento 1 framework.)
<li>
    <a href="..." class="login-popup-in-footer my-account-text">Logout</a>
</li>
When I remove the "login-popup-in-footer my-account-text" classes, it goes with a GET request, which is ideal.
I am not sure whether the HTML or the .htaccess file makes the difference between a GET and a POST request.
Feel free to share thoughts.
Thank you.
Neither, at least not directly.
The HTML you have will trigger a GET request. It can't do anything else.
It will be some client-side JS that is searching the document for elements which are members of one or more of those classes and adding an event listener that prevents the default behaviour of the link and makes a POST request.
Your server configuration can't influence it either. While it could issue a redirect response, there is no way for one of those to respond to a GET request in a way that causes the browser to make a POST request (although the reverse is not true).
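For illustration, a minimal sketch of the kind of script a theme might attach to such links (the class name is taken from the question; everything else here is an assumption, not Magento's actual code):

// Find links carrying the class from the question, cancel the default GET
// navigation, and submit a POST to the link's target instead.
document.querySelectorAll("a.login-popup-in-footer").forEach(function (link) {
    link.addEventListener("click", function (event) {
        event.preventDefault(); // stop the normal GET navigation

        var form = document.createElement("form");
        form.method = "POST";
        form.action = link.href; // post to the link's own URL
        document.body.appendChild(form);
        form.submit();
    });
});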

HTML Form: Can submitted GET/POST parameters be suppressed using only HTML or CSS?

I am volunteering on a website-based project that is trying to make all pages fully operable JavaScript-free before adding any JavaScript for enhancements, and I was asked to investigate whether or not a particular scenario could be handled purely through HTML/CSS.
What we have is a form that is populated to help us filter a list of tickets displayed on the screen after a page update through a GET action, which itself works fine. The concern with the current implementation is that the URL cannot be made into a permanent link. The request, in order to keep the permanent link as minimal as possible, is to only send GET parameters for fields that are populated with something (so, suppressing GET parameters for fields that are blank) instead of having a different GET parameter for every form field on the page.
I have thought of several ways that could be done, most involving JavaScript (example: create fields with ids but no names, plus a hidden field with a name that uses JS to grab the data from the fields), but also one that would be a POST action with a redirect back to the GET with a human-readable string that could be used permanently. The lead dev, however, would prefer not to go through the POST/redirect method if at all possible.
That being said, I'm trying to make sure I cover all my bases and ask experts their thoughts on this before I strongly push for the POST/redirect solution: Is there a way using only HTML & CSS to directly suppress GET parameters of a form for fields that are blank without using a POST/redirect?
No, suppressing fields from being submitted in an HTML form with a method of "GET" is not possible without using JavaScript, or instead submitting the form with a POST method and using a server-side function to minimize the form.
Which fields are submitted is defined by the HTML specification, and HTML and CSS alone cannot modify this behavior and still have the browser be compliant with the standards.
No, you cannot programmatically suppress any default browser behavior without using some kind of client scripting language, like JavaScript.
As a side note, you say "JavaScript for enhancements", but JavaScript is no longer just an enhancement these days, and no one in the real world would expect a decent front end without the use of JavaScript. I would suggest you simply use JavaScript.
I do not think you can avoid JavaScript here; you need it to pre-process the form before submission and eliminate unchanged/empty fields.
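For completeness, a minimal sketch of the JavaScript pre-processing these answers allude to - disabling empty fields just before submit so the browser omits them from the query string. The form id is an assumption:

// Disabled controls are not submitted, so blank fields drop out of the URL.
document.getElementById("ticket-filters").addEventListener("submit", function () {
    Array.prototype.forEach.call(this.elements, function (field) {
        if (field.name && field.value === "") {
            field.disabled = true;
        }
    });
});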

Why is an HTML form post to a Restlet resource not working?

Restlet's (2.0M6 on Google App Engine) annotations are actually sensitive to the order of a resource's methods.
When posting HTML form data, make sure that the @Post("html") method stays above the @Post("xml") method in the receiving resource.
At least Firefox puts both content types into the request's Accept header, so the first matching method will be processed.
The question is, if there is any other way to achieve control over method precedence?
For example I would like the client to accept text/html only.
As per your comment that you're asking whether there is some kind of client-side html form attribute or JavaScript to modify the accept header, the answer would be, AFAIK: no. Not for links clicked or forms submitted by the user. As you mentioned in your comment, you might be able to use JS to intercept link clicks and form posts, and use XHR instead, but that'd probably be tricky, if possible.
BTW, XmlHttpRequest doesn't really have anything to do with XML. It can handle any sort of content, for both requests and responses. It's very common to return a snippet of HTML to an XHR request and use DOM injection to dynamically update the page.
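If you did go down that intercept route, a rough sketch might look like the following (the form and result element ids are hypothetical; the point is simply that an XHR lets you set the Accept header explicitly):

// Intercept the submit, send the form via XHR, and request text/html only.
document.getElementById("commentForm").addEventListener("submit", function (event) {
    event.preventDefault();
    var xhr = new XMLHttpRequest();
    xhr.open("POST", this.action);
    xhr.setRequestHeader("Accept", "text/html");
    xhr.onload = function () {
        document.getElementById("result").innerHTML = xhr.responseText;
    };
    xhr.send(new FormData(this));
});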

HTML form method with nice URL

I just want to know whether there is a way to answer this question with "Yes" without using JavaScript.
What I want to do is have a search form that automatically generates URLs like http://example.com/search/my+search+term or something similar when I enter my search term into a search text field.
EDIT: Due to some mis-understanding (and not being clear on my part), a clarification: I want the browser to generate that URL based on the value of the text field when the form is submitted.
No, it's not possible without using JavaScript.
The best you can do is use a GET action and have a URL like http://example.com/search/?q=my+search+term, where q is the name of the search input box.
Using HTML only, no.
You could have something server-side that might work. You could have the server respond with a 302 response code. If you are using Apache, you could probably use mod_rewrite to take the GET request and generate a new URL.
For example, the browser might ask for http://example.com/search/?q=blah+foo+bar, and the server could then take that and send the browser a 302 redirect to http://example.com/search/blah+foo+bar.
See more information in the Apache URL rewriting guide, or by using your favorite search engine.
You could still use JavaScript to generate the correct URL; if someone has JavaScript disabled, this server-side redirect would work as a fallback.
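A small sketch of that JavaScript enhancement, under the assumption that the form has id "search" and the text field is named q; without JavaScript the form simply falls back to the normal ?q=... GET request:

// Rewrite the submission into the "nice" URL on the client.
document.getElementById("search").addEventListener("submit", function (event) {
    var term = this.elements["q"].value.trim();
    if (term) {
        event.preventDefault();
        window.location.href = "/search/" + encodeURIComponent(term).replace(/%20/g, "+");
    }
});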
The answer is no.
No if you want it to be client-side; if you can do it server-side (by submitting the form), you can use something like PHP.
Yes you could perform something like this server-side pretty easily as long as you don't mind submitting a form.
EDIT: Upon further clarification from the author in comments below: It is not possible in a pure client-side manner without JavaScript or some other client-side tool like Flash/Silverlight (which is admittedly overkill).

REST/Ajax deep linking compatibility - Anchor tags vs query string

So I'm working on a web app, and I want to filter search results.
A nice restful implementation might look like this:
1. mysite.com/clothes/men/hats+scarfs
But let's say we want to Ajax up the filtering, like the cool kids, and we want to retain deep linking; we might use the anchor tag and parse that with JavaScript to show the correct listings:
2. mysite.com/clothes#/men/hats+scarfs
However, if someone clicks the first link with JS enabled, and then changes filters, we might get:
3. mysite.com/clothes/men/hats+scarfs#/women/shoes
Urk.
Similarly, if someone does not have JS enabled, and clicks link 2 - JS will not parse the options and the correct listings will not be shown.
Are Ajax deep links and non-Ajax links incompatible? It would seem so, as servers cannot parse the # part of a url, since it is not sent to the server.
There's a monkeywrench being thrown into this issue by Google: A proposal for making Ajax crawlable. Google is including recommendations for url structure there that may give you ideas for your own application.
Here's the wrapup:
In summary, starting with a stateful URL such as http://example.com/dictionary.html#AJAX, it could be available to both crawlers and users as http://example.com/dictionary.html#!AJAX, which could be crawled as http://example.com/dictionary.html?_escaped_fragment_=AJAX, which in turn would be shown to users and accessed as http://example.com/dictionary.html#!AJAX
View Google's Presentation here (note: google docs presentation)
In general I think it's useful to simply turn off JavaScript and CSS entirely and browse your website and web application and see what ends up getting exposed. Once you get a sense of what's visible, you will understand what most search engines see and that in turn will show you what is and is not getting spidered.
If you go to mysite.com/clothes/men/hats+scarfs with JavaScript enabled, then your JavaScript should automatically rewrite that to mysite.com/clothes#men/hats+scarfs. When you click on a filter, the filters should be controlled by JavaScript, meaning you'll only change the hash fragment rather than the entire URL (as you're going to have to return false anyway).
The problem you have is for non-JS users going to your JS-enabled deep links, as the server can't determine that stuff. Unfortunately, the only thing you can do is take them to mysite.com/clothes and make them start their journey again (as far as I'm aware). You'll need to try and ensure that when people link to the site, they use the hardcoded deep link rather than the hashed deep link.
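A small sketch of that rewrite step, assuming the /clothes path structure from the question (the paths are illustrative only):

// If a hardcoded deep link like /clothes/men/hats+scarfs loads with JS
// enabled, jump to the hash-based equivalent so further filtering stays
// client-side.
var match = window.location.pathname.match(/^\/clothes\/(.+)$/);
if (match) {
    window.location.replace("/clothes#" + match[1]);
}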
I don't recommend ever using the query string, as you are sending data back to the server without direct relevance to the prior specified destination. That is a corruptible security hole, as malicious code can be manually added to the query string to cause an XSS or buffer-overflow attack at your web server.
I believe REST was intended to work with absolute URIs without a query string, because then you're specifying only the location of a resource, and it is that location that is descriptive and semantically relevant, in addition to the possibility of the resource being so equally relevant. Even if there is no resource at the specified path, you have still instantiated a potentially unique and descriptive location that can be processed accordingly.
Users entering the site via deep links
Nonsensical links (like /clothes/men/hats#women/shoes) can be avoided if you construct your Ajax initialisation code in such a way that users who enter the site on filtered pages (e.g. /clothes/women/shoes) are taken to the /clothes page before any Ajax filtering happens. For example, you might do something like this (using jQuery):
$("a.filter")
.each(function() {
var href = $(this).attr("href").replace("/clothes/", "/clothes#");
$(this).attr("href", href);
})
.click(function() {
update_filter($(this).attr("href").split("#")[1]);
});
Users without JavaScript
As you said in the question, there's no way for the server to know about the URL fragment so filtering would not be applied for users without JavaScript enabled if they were given a link to /clothes#filter.
However, even without filtering, these links could be made more meaningful for non-JS users by using the filter strings as IDs in your /clothes page. To prevent this from messing with the Ajax experience, the IDs would need to be changed (or the elements removed) with JavaScript before the Ajax links were initialised, as sketched below.
How practical this is depends on how many categories you have and what your /clothes page contains.
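A possible sketch of that clean-up step, assuming the filter anchors are marked with a data attribute (purely hypothetical markup):

// With JS running, strip the anchor ids before the Ajax filter links are
// initialised, so the URL fragment is free to drive filtering instead.
document.querySelectorAll("[data-filter-anchor]").forEach(function (el) {
    el.removeAttribute("id");
});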