Serve content based on visitor LAN vs WAN?

I'm guessing this isn't possible, because my Google-fu isn't that bad, yet I can't find anyone doing it, anyone wanting to do it, or any hints on how I might do it. Probably because it is a very bad idea, dumb idea, or so super simple that I should already know how without wasting your time.
I'm just trying to build a very simple landing page on my own webserver so I have links to the various services' web admin pages that I want to go to without having to remember the different port numbers. 99% of the time, I'm doing this on my own lan, but sometimes I access over the Internet.
Right now, my landing page might have a link to 192.168.1.2:3939 for example. I can access the landing page from outside by going to mydomain.com, but then my link still goes to 192.168 blah blah blah.
I'd like it to be the local lan link when I'm accessing from within the lan, but automagically replace the 192.168.1.2 with mydomain.com when I'm accessing from outside.
Possible? Stupid? Is there a better way I'm not thinking of?

Configure your LAN DNS to point your domain to 192.168.1.2. Then you can simply use the hostname everywhere.
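If your LAN DNS is something like dnsmasq (common on home routers and Pi-hole boxes; the domain and address below are just the ones from your example), the override is a single line:

    # dnsmasq: answer mydomain.com (and subdomains) with the LAN address for internal clients
    address=/mydomain.com/192.168.1.2

A link like http://mydomain.com:3939 then resolves to 192.168.1.2 inside the LAN and to your public IP from outside (assuming the port is forwarded on your router), so the landing page never needs two sets of links.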

Related

Trying to re-design a website with unknown origins

So this is going to be a kinda complicated, frustrating question.
So I recently started working for a company doing a little marketing/social media. One of the things they wanted me to update was their website (refresh the look, layout, and design). I've never built a website (I mean, I have a Wix portfolio, but...) and have had little HTML experience, but I've taken a bunch of C++, Python, and other coding classes and really like figuring out new things. Their IT guy now works for another side of the company and is very weird about relinquishing the passwords for the various social media sites and whatnot. When I finally got the admin passwords for the website, I realized that they only let you change the words on the website or add new info. There are no layout/coding capabilities at all. He also gave me the FTP access and the username and password for that. He was very weird about me changing things (even though the CEO asked me to) and won't give a straight, comprehensible answer about the capabilities we have with this website! The original person who created the website is no longer here, and they can't seem to find his contact info.
So my main question is: how do I use the FTP info? Do I simply download an FTP client and log in there? Will it even be possible for me to access this website's infrastructure? I just need a starting point on what I should be researching/trying to do.
Sorry this was so long and feel free to ask questions because I bet I was a little confusing.
PS. I don't even know what it was built on, like WordPress or ya know.
Do you know where the website is hosted? Ask him for the hosting provider login. Once you have that, you will be able to see what sort of installation you have there and obtain the credentials you'll need to FTP in via FileZilla or a similar tool. If he doesn't give you the info, take it to your boss and have them apply pressure from above.
He might have added you as an author or a user that's not an admin. Make sure you're an admin.
The FTP can be accessed through the hosting service, but you mentioned that he didn't give you those details. So download FileZilla to access the FTP directly.
Hope this helps.

Are there any tips for minimising access to a public page without a login?

I have a page that is just a non interactive display for a shop window.
Obviously, I don't link to it, and I'd also like to avoid people stumbling across it (by Google etc).
It will always be powered by Chrome.
I have thought of...
Checking User Agent for Chrome
Ensuring resolution is 1920 x 1080 (not that useful as it is a client side check)
Banning under robots.txt to keep Google out of it
Do you have any more suggestions?
Should I not really worry about it?
Not that I would EVER recommend what I'm about to suggest, but how about filtering by IP address? Since your provider IP is rarely going to change, you can use JavaScript to kick out or deny requests from IP addresses other than yours, maybe with a clean redirect to http://www.google.com or something silly like that. Although I would still suggest locking it down with a login and password and just having it write a never-expiring cookie. That's still not a great idea, but a shy bit better than the road you're trucking down right now.
You could always limit the connections by IP address (If you know it ahead of time/it's reliable):
Apache's access control
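For the Apache route, a minimal sketch with 2.4 syntax (the directory path and the shop's static IP below are placeholders):

    # Apache 2.4: only the shop's own connection may load the display page
    <Directory "/var/www/html/shopwindow">
        Require ip 203.0.113.45
    </Directory>

Everyone else gets a 403, which also keeps crawlers out without relying on robots.txt.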
If it is just for a shop window, do you even need access to a web page?
You can host the file locally.
Personally, I wouldn't worry about it, if no-one is linking to it externally it is unlikely to ever be found by search engines.

SSL Encryption and an external image server

I have an ASP.NET web site technology that I use for scores of clients. Each client gets their own web site (a copy of the core site that can then be customized). The web site includes a fair amount of content - articles on health and wellness - that is loaded from a central content server: I copy the HTML for each article from the content server and insert the text into the page as it is produced.
Easy so far.
However, these articles have image references that point back to the central server. The problem is that these sites are always accessed (every page) via an SSL link. When a page with an external image reference is loaded, the visitor receives a message that the page "contains both secure and insecure elements" (or something similar) because the images come from the (unsecured) server. There is really no way around this.
So, in your judgment, is it better to:
A) just put a cert on the content server so I can get the images over SSL? Are there problems there due to the page content having two certs? Any other thoughts?
B) change the links to the article presentation page so they don't use SSL? They don't need SSL, but the left side of the page contains lots of links to pages that do need it, all of which are now relative links. Making them all absolute links is grody because each client's site has its own URL, so all the links would need to be generated in code (blech).
C) Something else that I haven't thought of? This is where I am hoping that someone with experience in the area will offer something brilliant!
NOTE: I know that I cannot get rid of the warning about insecure elements - it is there for a reason. I am just wondering if anyone else has experience in this area and has a reasonable compromise or some new insight.
Not sure how feasible this is, but it may be possible to use a rewrite or proxy module to mirror the central server's image directory structure on each clone. With such a rule in place you could use relative img URLs and internally rewrite all requests for these images over to the central server, silently.
e.g.:
https://cloneA/banner.jpg -> http://central/static/banner.jpg
https://cloneB/topic7/img/header.jpg -> http://central/static/topic7/header.jpg
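On Apache, for instance, mod_rewrite's proxy flag can do that silent hand-off; a rough sketch assuming the clone's pages use relative /img/ paths and the central server keeps everything under /static/ (both names are illustrative):

    # vhost config; needs mod_rewrite, mod_proxy and mod_proxy_http enabled
    RewriteEngine On
    # Fetch any /img/ request server-side from the central image store
    RewriteRule ^/img/(.*)$ http://central.example.com/static/$1 [P,L]

Because the clone fetches the image itself and serves it over its own HTTPS connection, the browser never sees the plain-HTTP URL, so the mixed-content warning goes away.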
I'd go with B.
Sadly, I think you'll find this is a fact of life with SSL. Even if you were to put a cert on the other server, I think it may still get confused because of the different sites (can't confirm nor deny, though), and regardless, you don't want to waste your media server's time encrypting images.
I figured out a completely different way to import the images late last night after asking this question. In IIS, at least, you can set up "Virtual Directories" that can point essentially anywhere (I'm now evaluating whether to use a dedicated directory on each web server or a URL). If I use a dedicated directory on each server I will have three directories to keep up to date, but at least I don't have 70+.
Since each site will pull the images using resource locations found on the local site, then I don't have to worry about changing the SSL status of any page.
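For anyone scripting that setup, IIS 7+ can create the same virtual directory from the command line with appcmd; a sketch with placeholder names (the site name, virtual path, and UNC share would all be whatever your installation uses):

    rem Map /articleimages on this site to the shared image folder
    %windir%\system32\inetsrv\appcmd add vdir /app.name:"Default Web Site/" /path:/articleimages /physicalPath:"\\contentserver\articles\images"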

Best method of showing clients their website during development

We are trying to streamline the process of showing clients their websites whilst in development without the need to change absolute paths etc.
We mostly develop locally and change our hosts files to reflect the domain name. When we are ready to show the client, we copy the files to www.client.com/dev, but I'm looking for a better method; any suggestions that could make this process smoother and faster would be great.
If you always host the site on a separate domain and not in a subdirectory, you will never have to change absolute paths. So instead of hosting a site in development at www.client.com/dev try dev.client.com. Another option would be to use client.yourcompany.com.
Also try to protect the site in development with HTTP basic authentication. This is easy to set up in most web servers, without changing your web application. Also, if the content is even remotely sensitive in any way, use HTTPS as well.
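On Apache, for example, that protection is only a few lines (the realm text and file paths here are placeholders, and the password file is created with htpasswd):

    # Create the password file once: htpasswd -c /etc/apache2/.htpasswd-dev client
    <Location "/">
        AuthType Basic
        AuthName "Development preview"
        AuthUserFile /etc/apache2/.htpasswd-dev
        Require valid-user
    </Location>

Drop this in the dev vhost only, so the production site stays untouched.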
Alternatively, let them simply come over to your office and present it to them (or go to them and present it). The upside is that you have full control over what they will and won't see, and it never has to go online.
Well, we have a client.t.uw.ru site which is universally visible.
When it matures, it moves to www.client.com and is pushed to the search engines.
To make this easy, we have a wildcard (*) DNS entry on the t.uw.ru domain.
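In zone-file terms that wildcard is a single record (the address below is a placeholder for the dev server):

    ; any otherwise-undefined name under t.uw.ru resolves to the dev box
    *.t.uw.ru.    IN    A    203.0.113.10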

How can I update a web page remotely? Using a web service or email, no direct server access

How can I update a web page remotely? Is there a web service, or can I do it via email? I have no direct access to the server.
We simply need to add an alert facility for emergencies, for example a simple text message across the top of the home page saying "We are shut today due to bad weather".
Thanks
I can't tell exactly what you mean, but I will answer in a general manner.
1 - If you are building the whole site from scratch: you can create your site with any CMS, like DotNetNuke or Joomla, which will allow you to log in and edit what you want.
2 - If you are building just this page from scratch: you can build your page with online editing in mind. In this case I recommend building two pages, one for viewing the content and the other for online editing; you can use any HTML editor control, like FCKeditor.
3 - If you are dealing with an already-built page: it will be easier to build an administration page to which you can upload a new version of the content page, and the administration page will take care of replacing the content page.
Hope this helps; if not, please feel free to clarify your needs so we can help more.
Contact the company that provides your DNS/hosting/name resolution and ask them to redirect the DNS entries to another server of your choice that serves the notice you wish people to see when they try to access your page.
In general, yes, it's possible; that's what most blog engines and CMSes are for. It's also fairly easy to develop an ad hoc program if all you need is to be able to put up an offline page.
If, however, what you mean is that you need to do this today without any access to the server, then contacting the person hosting your site or managing your DNS is indeed your best chance.
I'd suggest getting someone to put a Twitter widget on the page; then you can SMS/email or use a web browser to send your updates, and they will automatically appear on the site.
Is it at all possible for you to get someone to do that for you? Twittercard can be used to generate the code to drop in.
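For reference, at the time these answers were written Twitter's publish tool generated embed markup along these lines (the account name is a placeholder); whoever maintains the page would paste it where the alert should appear:

    <!-- hypothetical example: embedded timeline for a placeholder account -->
    <a class="twitter-timeline" data-height="300" href="https://twitter.com/YourShopAccount">
        Updates from YourShopAccount
    </a>
    <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>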
It looks like this thread is a bit dated, but for anyone still looking for a way to update your site using email, you might want to check out https://www.sitemailcms.com/. It's a service I've developed to do just that.