How did this website do its splash page/age verification? [closed]

I am looking at this website - http://www.shopmss.com/ - and I was wondering how they did the splash page, age verification, and store all on the same URL, 'shopmss.com'. You click through three screens before you get back to the store.
My secondary question is: can you do this without setting a cookie? e.g. with JavaScript that appends something to the URL in the address bar, or with mod_rewrite?
EDIT: I thought this was a relevant question to ask because I was exploring best practice for accomplishing the task; I figured it would involve something technical. My bad.

The site is setting a cookie called BX. That could be tracking a session, in which case they can display different content based on the state of the session.
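As a rough illustration of that pattern (not necessarily what shopmss.com actually runs), here is a minimal sketch in Node.js with Express and cookie-parser; the ageVerified cookie name is made up for the example:

    // Sketch: serve different content from the same URL based on a cookie.
    // Assumes Node.js with the express and cookie-parser packages installed.
    const express = require('express');
    const cookieParser = require('cookie-parser');

    const app = express();
    app.use(cookieParser());

    app.get('/', (req, res) => {
      // Hypothetical cookie name; the real site's cookie is called BX.
      if (req.cookies.ageVerified === 'yes') {
        res.send('<h1>Welcome to the store</h1>');
      } else {
        // Same URL, different content: show the age gate instead.
        res.send('<form method="POST" action="/verify"><button>I am over 18</button></form>');
      }
    });

    app.post('/verify', (req, res) => {
      res.cookie('ageVerified', 'yes');
      res.redirect('/'); // back to the same URL, which now shows the store
    });

    app.listen(3000);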

They are using a frameset. Check the source.
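In other words, the address bar keeps showing the outer page's URL while the framed document navigates through the splash, age check, and store. A stripped-down version of that structure could look like this (the file names are made up):

    <!-- Outer page: the browser's address bar always shows this page's URL. -->
    <frameset rows="100%">
      <!-- splash.html can navigate itself to age-check.html and then to the
           store without the address bar ever changing. -->
      <frame src="splash.html" name="main">
    </frameset>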

Related

Can a website tell a user's browser to store the entire page locally? [closed]

Can a (single-page) website tell a user's browser to store an entire page locally?
For context: I'm hosting a website on a server that charges according to bandwidth. The contents of the site don't change much, so I'm wondering if the user's browser can store the webpage rather than sending repeat requests for the web page!
I've looked into browser-native caching, but that appears to apply only to further requests triggered after the page's scripts load!
This is usually achieved with PWAs and service workers: https://developer.mozilla.org/en-US/docs/Web/Progressive_web_apps/Offline_Service_workers
Actually, it's the only way of doing this that I know of. It can be a bit tricky, but it's quite interesting once you understand everything you can do with it.
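As a rough sketch of the cache-first pattern those docs describe (the cache name and file list are placeholders to adapt):

    // sw.js - a minimal cache-first service worker.
    const CACHE = 'site-v1';

    self.addEventListener('install', (event) => {
      // Pre-cache the page and its assets on the first visit.
      event.waitUntil(
        caches.open(CACHE).then((cache) =>
          cache.addAll(['/', '/index.html', '/style.css', '/app.js'])
        )
      );
    });

    self.addEventListener('fetch', (event) => {
      // Serve from the local cache; only hit the network on a cache miss.
      event.respondWith(
        caches.match(event.request).then((hit) => hit || fetch(event.request))
      );
    });

The page registers it once with navigator.serviceWorker.register('/sw.js'); after that, repeat visits are answered from the local cache instead of your server.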

Private (invisible) html page [closed]

How would I upload an HTML file to my website and not have it be visible to the world? I don't want it showing up on Google or Bing, or any weird web-spider bot being able to see it. I don't want it password protected. I just want it invisible, and to be the only person who knows the URL.
It would be something like:
My-Website.com/INVISIBLE.html
Your webpage My-Website.com/INVISIBLE.html stays unknown to the world unless you tell someone about it. To restrict it from search engines, you could use a robots.txt file, details of which are documented at http://www.robotstxt.org/robotstxt.html; however, not all search engines respect the robots.txt file.
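For example, a robots.txt at the site root that asks crawlers to skip that one file would look like this:

    User-agent: *
    Disallow: /INVISIBLE.html

Bear in mind that robots.txt is itself publicly readable, so listing the path there also reveals it to anyone who looks.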
Adding a robots.txt file to your site should do it:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449

How to detect advertisement links? [closed]

I want to detect the links on a page that are used for advertising. Or is there any statistical data by which I can guess that a link is an advertisement?
I know this isn't a concrete answer, but if I were doing the same, I'd take a look at AdBlock and other add-ons for browsers such as Firefox, since they do much the same thing. There are quite a few open-source add-ons out there where you can view the code that does this. Even most email programs detect junk mail (and ads) using Bayesian filters, which I'm sure would work well here with a bit of tweaking.
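As a rough illustration of the filter-list approach those add-ons use (the domain list here is a tiny made-up sample; real blockers ship lists with thousands of patterns):

    // Sketch: flag links whose hostname matches a blocklist of known ad domains.
    // The list below is an illustrative sample, not real filter data.
    const AD_DOMAINS = ['doubleclick.net', 'googlesyndication.com', 'ads.example.com'];

    function isAdLink(href) {
      try {
        const host = new URL(href).hostname;
        // Match the domain itself or any subdomain of it.
        return AD_DOMAINS.some((d) => host === d || host.endsWith('.' + d));
      } catch {
        return false; // not a parseable absolute URL
      }
    }

    // In a browser, scan every anchor on the page:
    for (const a of document.querySelectorAll('a[href]')) {
      if (isAdLink(a.href)) a.classList.add('suspected-ad');
    }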

How do free web hosts enforce ads? [closed]

How do those webhosting companies enforce ads on your page?
I'd love to enforce a specific piece of HTML code on a webserver.
So, how do they?
They might use auto-append and auto-prepend, depending on the exact solution you are referring to.
You basically use them to pull in another file (HTML, PHP, etc.) which is appended or prepended to the page (at the top or bottom).
I did it once, years ago, and it worked.
Maybe stick the AdSense code in the appended/prepended file.
See: http://www.maheshchari.com/php-auto-append-prepend-file-using-htaccess/
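Concretely, on Apache with mod_php that looks something like this in an .htaccess file (the paths are placeholders):

    # .htaccess - inject a PHP file before and after every PHP page served.
    php_value auto_prepend_file "/var/www/shared/ad-header.php"
    php_value auto_append_file "/var/www/shared/ad-footer.php"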

Parsing a website [closed]

So, I have a website. The links have the following structure: http://example.com/1, http://example.com/2, http://example.com/3, etc. Each of these pages has a simple table. How can I automatically download every single page to my computer? Thanks.
P.S. I know that some of you may tell me to google it, but I don't know what I'm actually looking for (I mean, what to type into the search field).
Use wget (http://www.gnu.org/software/wget/) to scrape the site.
Check out the wget command line tool. It will let you download and save web pages.
Beyond that, your question is too broad for the Stack Overflow community to be of much help.
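For the URL pattern you describe, a single wget invocation with shell brace expansion is enough (adjust the range to however many pages you have):

    wget http://example.com/{1..100}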
You could write a simple app that loops through all the URLs and pulls down the HTML. For a Java example, take a look at: http://docs.oracle.com/javase/tutorial/networking/urls/readingWriting.html
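If you'd rather not use Java, the same loop looks like this in Node.js (a sketch; it assumes Node 18+ for the built-in fetch):

    // Sketch: download http://example.com/1 .. /100 and save each page to disk.
    const fs = require('node:fs/promises');

    async function main() {
      for (let i = 1; i <= 100; i++) {
        const res = await fetch(`http://example.com/${i}`);
        if (!res.ok) continue; // skip missing pages
        await fs.writeFile(`page-${i}.html`, await res.text());
        console.log(`saved page-${i}.html`);
      }
    }

    main();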