Expire the page after submission - html

I'm implementing an iAuth form for a credit application in a J2EE container (JSTL+JSP+Stripes). The vendor states in the implementation guide:
Expire the “Questions” page after answers submission
When performing iAuth transactions you will need to “expire” the page on which the consumer's questions will be displayed after they have submitted their answers. This is crucial in order to prevent a consumer from using the "back" button to modify their answers after they have already submitted them once and found that their authentication attempt was unsuccessful. Once the answers to a question set have been transmitted to vendor, that question session is closed. Any additional attempts at modifying the answers to the same question set will result in an "invalid transaction-continue" response.
I am unsure what this means.
Are "they" suggesting just setting "Cache-Control" and/or "Pragma" headers on the form page?

Well, you can use HTTP-related techniques to expire pages, but those methods are rather what I consider "soft" techniques.
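For reference, the header approach in a servlet/JSP setup is just a matter of sending no-cache headers before rendering the form. A minimal sketch (the class name and JSP path are only illustrative, since the question uses Stripes):

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical servlet that renders the questions form and forbids caching,
// so the browser cannot simply re-show the page from its cache on "back".
public class QuestionsPageServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        resp.setHeader("Cache-Control", "no-cache, no-store, must-revalidate"); // HTTP/1.1
        resp.setHeader("Pragma", "no-cache");                                   // HTTP/1.0
        resp.setDateHeader("Expires", 0);                                       // proxies
        req.getRequestDispatcher("/WEB-INF/jsp/questions.jsp").forward(req, resp);
    }
}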
To better secure your system, you may want to follow this kind of server-side implementation, where Page A refers to the page that leads to the Form Page, and Page B is the controller that receives the information posted by the Form Page:
User visits Page A
Page A determines that the Form Page should be viewable to User
Page A creates a session variable A and sets it to true
Page A shows a link, or redirects the User, to the Form Page
Form Page determines whether User can view the page by checking session variable A
Form Page displays the form.
User enters the information and submits the form
Form Page posts the data to Page B
Page B receives the information, validates it, and deletes session variable A
Of course it can be even more complex with time checking (whether the User took too long from Page A to the Form Page, or took merely a second to submit the Form Page to Page B). A rough servlet sketch of this flow is shown below.
When it comes to security in networking: Server side > Client Side
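Here is a rough plain-servlet sketch of the flow above; the attribute name, URLs, and JSP path are made up, and in Stripes you would do the same thing inside your ActionBeans:

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// "Page A": decides the user may see the form and arms the one-shot flag.
public class PageAServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        req.getSession(true).setAttribute("formAllowed", Boolean.TRUE); // session variable A
        resp.sendRedirect(req.getContextPath() + "/form");              // on to the Form Page
    }
}

// "Form Page": only renders while the flag is still armed.
class FormPageServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        Object allowed = req.getSession(true).getAttribute("formAllowed");
        if (!Boolean.TRUE.equals(allowed)) {
            resp.sendError(HttpServletResponse.SC_GONE, "This form has expired."); // back button, reload, etc.
            return;
        }
        req.getRequestDispatcher("/WEB-INF/jsp/questions.jsp").forward(req, resp);
    }
}

// "Page B": consumes the flag, so a second submission (or a re-rendered form) is rejected.
class PageBServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        Object allowed = req.getSession(true).getAttribute("formAllowed");
        if (!Boolean.TRUE.equals(allowed)) {
            resp.sendError(HttpServletResponse.SC_GONE, "Answers were already submitted.");
            return;
        }
        req.getSession().removeAttribute("formAllowed"); // delete session variable A
        // ... validate and forward the answers to the vendor here ...
        resp.sendRedirect(req.getContextPath() + "/result");
    }
}

Storing a timestamp alongside the flag would also cover the time checks mentioned above.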

Related

Passing login information through a URL to a form with changing attribute names

I am working on a project which involves setting up a bunch of dashboards around the office. The plan is to use Screenly on Raspberry Pi 3s, as it seems to fit our needs for the most part at a very low cost. The problem is, some of the webpages that need to be displayed are locked behind a login. Screenly doesn't have a way to get past this, other than passing the login information and the page redirect through the URL itself. I am aware of the potential security issues this could bring, which is why the account we created for this use can only view (and not edit) very specific pages.
I want to pass login information through a URL in order to login to a website and directly access a specific page on that website. I have had success passing login information in the form of:
https://website.com/dologin.action?username=CapnCrunch&password=Fr00tl00ps&login=Log+in&os_destination=%2Fpages%2Fviewpage.action%3FpageId%58008
This works nicely when the username and password attribute names are always the same, but not when they change on every refresh. Instead of the HTML attributes for the username box remaining the same every time the login page is accessed, they change slightly every time.
For example, these are the HTML attributes for the username upon loading the page for the first time:
<input name="ct100$phUser$txtUser8193" type"text" id="txtUser8193"
class="login_user border-box" placeholder="My Username">
But when I refresh the page, this same bit of HTML code changes to:
<input name="ct100$phUser$txtUser5516" type"text" id="txtUser5516"
class="login_user border-box" placeholder="My Username">
I would love to pass the URL arguments in the form of:
dologin.action?ct100$phUser$txtUserXxXx=CapnCrunch
Where XxXx is just whatever number the page decided to use at that time.
All the solutions I have found online involve using external scripts of some kind. The problem is, Screenly only accepts URLs. Using a script would mean either editing Screenly's source code or using a proxy webpage.
Is there any way to get around the changing attribute name without using external scripts?
Thanks in advance

Auto-populate form via URL, then submit?

I have the auto-population of this form working: http://getpocket.com/save
I'm using it rather than the API so that it works when users are logged into Pocket on the same browser as my website.
However, it's not a good user experience to then have to click 'save', so how can I "automate" that?
I won't show my code, because it essentially is just to generate a link of the form:
http://getpocket.com/save/?title=thetitle&url=encodedurl
It populates the form fine, but how can I submit it? I tried appending &save and &submit, and then each of those with =True, in vain. Is the issue that the save button doesn't have a name= field, which is what's used to hook into the title and URL fields?
EDIT: Just to be clear, I didn't have any malicious intentions, only to save articles to read later on click of a button.
If I find the time I'll have a look at the API.
Luckily this is impossible (on Pocket and most sites) due to cross-site request forgery protection, which exists to prevent exactly what you are trying to do.
A token is set in the form, and together with session information for the user on Pocket (or any other site that uses CSRF token protection) it forms some sort of secret hash. When the 'save' form is submitted, the combination of these strings is checked, and normally new strings are set. Because there is (practically) no chance that you will be able to predict the token from the form itself, and you have no real way of manipulating the session hash, you are out of luck. And we are all very happy for that :).
Otherwise you could make links on other sites that would delete your whole database when you happen to click on them, etc.
In short: You can't.
On any form without CSRF protection you'd have to target not the URL of the page with the form, but the 'action' of the form. You can see this action by inspecting the form with your browser's DOM inspector. But, as I said, CSRF protection will prevent this from working most of the time.
http://en.wikipedia.org/wiki/Cross-site_request_forgery
https://www.owasp.org/index.php/Cross-Site_Request_Forgery_(CSRF)
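To make the mechanism concrete, here is a rough sketch of what per-session token protection can look like on the server side; the class, attribute, and parameter names are invented, and Pocket's actual implementation is of course not public:

import java.io.IOException;
import java.security.SecureRandom;
import java.util.Base64;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

// Hypothetical sketch of per-session CSRF token handling.
public class SaveFormServlet extends HttpServlet {
    private static final SecureRandom RANDOM = new SecureRandom();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        HttpSession session = req.getSession(true);
        byte[] raw = new byte[32];
        RANDOM.nextBytes(raw);
        String token = Base64.getUrlEncoder().withoutPadding().encodeToString(raw);
        session.setAttribute("csrfToken", token);   // the server remembers the token
        req.setAttribute("csrfToken", token);       // the form renders it as a hidden field
        req.getRequestDispatcher("/WEB-INF/jsp/save.jsp").forward(req, resp);
    }

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        HttpSession session = req.getSession(false);
        String expected = (session == null) ? null : (String) session.getAttribute("csrfToken");
        String supplied = req.getParameter("csrfToken");
        if (expected == null || !expected.equals(supplied)) {
            // A URL crafted by a third party cannot know the token, so it dies here.
            resp.sendError(HttpServletResponse.SC_FORBIDDEN, "Bad or missing CSRF token");
            return;
        }
        // ... actually save the article ...
    }
}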

How to fill web form inputs using Delphi XE3?

I need to know how to fill a web form using Delphi XE3. I have a web form with a user name and password, so how can I fill them in programmatically?
The page is http://batelco.com/portal; it has only two inputs, user name and password, so how can I fill them in and submit them?
Using Internet Direct (Indy) HTTP client class, you can submit form values to the server using HTTP POST.
The Indy HTTP client will also receive and store cookies which the server sends with its response, if an instance of the TIdCookieManager class has been assigned to the IdHTTP client component.
HTTP cookies are required by many secure web applications when the client makes further HTTP requests to other secured URLs on the server. The Indy HTTP client will then send the cookies with the request (if a TIdCookieManager has been assigned to the IdHTTP client component).
So you could send a login POST request on the login URL, providing needed authentication information, and then send a GET request to the download statistics URL to retrieve its HTML.
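For illustration only, here is the same login-then-fetch flow sketched with Java's built-in HTTP client (the URLs and field names are placeholders); with Indy you would do the equivalent with TIdHTTP Post/Get calls and an attached TIdCookieManager:

import java.net.CookieManager;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LoginThenFetch {
    public static void main(String[] args) throws Exception {
        // The cookie manager plays the role of TIdCookieManager: it stores the
        // session cookies set by the login response and replays them afterwards.
        HttpClient client = HttpClient.newBuilder()
                .cookieHandler(new CookieManager())
                .build();

        // 1) POST the login form fields (names are placeholders).
        String form = "txtUsername=alice&txtPassword=secret";
        HttpRequest login = HttpRequest.newBuilder(URI.create("https://example.com/login.aspx"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(form))
                .build();
        client.send(login, HttpResponse.BodyHandlers.ofString());

        // 2) GET a secured page; the stored cookies authenticate the request.
        HttpRequest page = HttpRequest.newBuilder(URI.create("https://example.com/statistics.aspx"))
                .GET()
                .build();
        HttpResponse<String> response = client.send(page, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}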
Regarding your specific login form, which uses ASP, here is a question about programmatically sending POST requests: HowTo deal with cryptic hidden values for ASP Net (__VIEWSTATE)
This article shows how to get and set properties of named elements.
You should get and set the value properties. Which IDs the form elements have depends upon your page.
Check if Element with ID has a value
This article, while titled "how to read", also describes both how to get values and how to set them. After all, if you can do A := B (read a value), then you probably can also do B := A (set a value).
read content in webbrowser input field
Now that the page URL is given in the question, we can right-click the login-form elements in the top right corner in a web browser and choose "Inspect element" to see their source. Or, if the browser is not modern and has no Inspect command on its menu, we can use another command, like View page source, and find the form in the source of the whole page. For example, one of those elements is:
<input name="txtUsername" type="text" maxlength="15"
id="txtUsername" tabindex="1" class="inpu-field" onfocus="txtfocus();"
onblur="txtblur();" style="color: gray; background-image: none;">
Thus we know the ID of the first element of the form, whose "value" attribute we need to get (read) or set (write).
The links above show how to do it, given the known ID.
BTW, you gave the wrong page; the real page is https://www.e-services.com.bh/Eservices/login_batelco.aspx
As for your original page, it just does not work with MSIE6, which is what TWebBrowser is in its default mode - for compatibility with all the existing applications using the Microsoft ActiveX component. See http://imgur.com/ad4wbOI
You can use Google Chrome instead of TWebBrowser.
Or you can reach the ActiveX interface through one of TWebBrowser's properties, acquire a newer interface, and turn off MSIE6 compatibility: http://msdn.microsoft.com/en-us/library/aa752510.aspx
However, "how to make this page render in twebbrowser" is another, new question, not the question you asked here.
Actually, the only reason I do not vote to close this question as a duplicate is that none of the articles above have "set", "write", or "fill" in their titles, so finding them was a bit harder than trivial.
But if the page does not mutate on load and does not have some one-time protection like a CAPTCHA or unique form hash codes, then you can post all the values with a single HTTP request without even loading the form.

How do I block outbound form submitting?

I have a profile edit page on my website with preset age and country lists so people can choose their age and country.
My problem is a guy made an HTML form that can submit a custom age and country. Does somebody know how to block form submitting from websites that are not on my domain?
I changed my form a few times, but he can find the input names just as easily as I change them.
The only fail-safe way to prevent submission of a form with undesirable values is to perform validation on the server side.
I think the referrer (Request.ServerVariables("HTTP_REFERER")) should tell you the page the request came from. As Oleg said, you should additionally validate the returned form data in any case.
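A minimal sketch of that server-side check, assuming a Java servlet backend (the field names, bounds, and country codes are just examples):

import java.io.IOException;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Re-check the submitted values against the same preset lists the page offers.
// The client can post anything it likes; only this check is authoritative.
public class ProfileEditServlet extends HttpServlet {
    private static final Set<String> ALLOWED_COUNTRIES =
            new HashSet<>(Arrays.asList("US", "GB", "DE", "NL"));

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String country = req.getParameter("country");
        String ageParam = req.getParameter("age");

        boolean ageOk;
        try {
            int age = Integer.parseInt(ageParam);
            ageOk = age >= 13 && age <= 120;   // same bounds as the preset age list
        } catch (NumberFormatException e) {
            ageOk = false;
        }

        if (!ageOk || country == null || !ALLOWED_COUNTRIES.contains(country)) {
            resp.sendError(HttpServletResponse.SC_BAD_REQUEST, "Invalid age or country");
            return;
        }
        // ... save the profile ...
    }
}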

Handling a form and its action: one page or two?

When doing web programming there seem to be two different ways of handling a form and its action. Is there a reason to prefer one method over the other?
Method 1: One page that displays a form or handles a submitted form. This seems simple, in that there's one form for everything. But it also means the if/else cases can become rather large.
if [form has not been submitted] {
    // display form for user input
} else {
    // handle submitted form
}
Method 2: Handle user input on one page with a form, with the form submitting to a second page.
page 1 (input.html):
<form action="./submit.html">
// display form for user input
</form>
page 2 (submit.html): Handles the input from the form.
I've seen both of these methods used. The upside of the first method is that there's only one page and one set of variables to worry about, but the page can get large. In the second method, each file is simpler and shorter, but now you have twice as many files to worry about and maintain.
I typically submit back to the same page, as you will often need to redisplay the initial page if there are validation errors.
Once the record has been successfully saved, I typically use the Post/Redirect/Get method to avoid the common multiple submit issue.
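A bare-bones Post/Redirect/Get sketch in servlet terms, with illustrative paths and a hypothetical email field:

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// One servlet handles both displaying the form and processing the submission.
public class ContactServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // GET: just (re)display the form, possibly with errors from a failed attempt.
        req.getRequestDispatcher("/WEB-INF/jsp/contact.jsp").forward(req, resp);
    }

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String email = req.getParameter("email");
        if (email == null || email.trim().isEmpty()) {
            // Validation failed: redisplay the same page with the error message.
            req.setAttribute("error", "Email is required");
            req.getRequestDispatcher("/WEB-INF/jsp/contact.jsp").forward(req, resp);
            return;
        }
        // ... save the record ...
        // Redirect after POST: hitting refresh now re-issues a harmless GET,
        // not a second POST, so the record is not saved twice.
        resp.sendRedirect(req.getContextPath() + "/contact?saved=1");
    }
}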
I think it depends on what kind of form it is. If it has too much validation I would go with two pages.
I always use two pages though, because I like my code to be clean, but I use AJAX whenever possible, e.g. for contact forms, since they are short, and I just post the response back to a div in the form.
Caching.
If the server has a series of static HTML pages enlivened maybe only by AJAX (and even those requests cached, server side, per user), it reduces its load and traffic significantly. Then confining the dynamic content, the targets of forms, to a relatively small area is a boon, because a page that is the target of a POST can't be retrieved from the cache; it has to be regenerated from scratch no matter how heavily loaded the server is.
Of course this doesn't solve the question of n static pages + 1 CGI vs n static pages + m CGI.
But once you don't need to spit out sophisticated HTML, just a simple redirect, keeping things in one place may be profitable: error checking and handling, authentication, session management, and so on.
OTOH, if every one of your pages is a CGI script that creates a new page on each visit, there is no reason why it can't accept form data and handle it at the same time.