Why do some sites have an 'img.' subdomain? [duplicate]

This question already has answers here:
Pros and Cons of a separate image server (e.g. images.mydomain.com)?
(5 answers)
Closed 6 years ago.
An example of what I am asking: the website
www.discogs.com
has all of its image paths (the ones I've looked at) leading here:
www.img.discogs.com
I have seen other sites do something similar, usually ones that store lots of images (which I am intending to do).
Do they simply purchase a new domain with the 'img.' in it, or is it a specialised image-hosting site, or something else?
If it is simply a matter of setting up an additional site to store the images, is there any information on how to go about this? For example, is a login system required on the storage site, are there security considerations, and can image uploading be done through the 'main' site or would it need to be done through the 'img.' site?
I have tried to Google this, but I'm pretty poor at naming things correctly and so haven't found any answers yet.
If anybody could shed some light on this I would be very grateful! Thanks in advance.
I wasn't sure which tag to use for this one; if it's incorrect, please let me know.

Two reasons:
1) Caching policies are usually set per subdomain, which allows more aggressive caching on img. while the main website keeps normal caching, since it may be dynamic.
2) It may be a different server, a high-storage one. The main website runs the main apps, which hog the performance, while the image subdomain allows for separate, high-speed delivery.
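The first reason can be sketched in a few lines. This is a minimal illustration in Python (hostnames and policy strings are illustrative, not taken from any real site's configuration): the server picks a Cache-Control policy based on the Host header, so assets on an img. subdomain get far longer cache lifetimes than the dynamic pages on the main site.

```python
def cache_policy(host: str) -> str:
    """Choose a Cache-Control header value based on the requested host."""
    if host.startswith("img."):
        # Static images: safe to cache for a year and mark immutable.
        return "public, max-age=31536000, immutable"
    # Dynamic pages on the main site: revalidate on every request.
    return "no-cache"

print(cache_policy("img.example.com"))  # public, max-age=31536000, immutable
print(cache_policy("www.example.com"))  # no-cache
```

In practice the same split is usually expressed in the web server or CDN configuration rather than application code, but the logic is the same: the subdomain is the unit the caching rule hangs off.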

Related

Find Wordpress fastest loading site

I happened to visit this website and found it really, really fast; it was like layers over layers, but it never seems to need to load the site at all, even once. Call me old, but I'm really impressed by this technology:
https://www.bookofthemonth.com
If anyone knows what technique they're using on this website, please share. Furthermore, does WordPress have any theme like this? For a blog this would be hard, because it eventually needs to load and accumulate too many resources; true or false? It would be awesome if you could make a blog with articles on a site like this (or at least that's what I'm looking for). Thanks everyone in advance.
You can always do View Source... on a site you like.
If you do that on the site you mentioned, you'll find that it's a React.js site, not a site delivered by a standard CMS such as WordPress.
Still, this site is a typical "single page" design, with:
a slider at the top
a row of information showing five interesting posts
a row of hyperlinks (the "How It Works" row)
a row of information showing five more interesting posts ("Meet our members")
You can certainly find several WordPress themes handling this.
It's also likely that the site for a recognized brand (like the one you mentioned) is delivered by a highly scalable web server infrastructure with caching and a content delivery network. So, can you make yours as fast as this? Not without spending some serious money.
How to proceed? Look for an appropriate "One Page" WordPress theme and follow its directions.
Once you have it looking right, you can adopt Cloudflare or another content delivery network.

Maintain consistent design & chrome across multiple websites

In the same vein as the microservice architecture approach, we're currently looking at splitting our legacy marketplace application into multiple, smaller sites. We've already carved off the checkout portion, and soon to follow will be the seller portal, user portal and registration pages. Each site will be completely separate and have its own domain, data access, etc.
The problem is: how do we maintain consistent site chrome (i.e. header & footer) across multiple websites? For the checkout site that we've already split off we were prepared to drop the site chrome but that's going to be a much less acceptable solution for future projects.
The ideas I've had so far are (assuming we don't want to simply duplicate the header and footer in each site):
Put the necessary HTML in a NuGet package and install it in each site that needs it. This should be fairly easy to do, but has the disadvantage that any change to the chrome means every site needs to be updated and redeployed. It also limits us to .NET for all future sites (maybe not a real concern?).
Serve all our sites through some kind of proxy site that injects the site chrome into the HTML before serving it to the client. This way the site chrome is actually its own application and can be deployed independently of anything else. Disadvantage: I haven't really got any idea of how to implement this, and I wasn't able to find anyone else trying anything similar via Google. It might also be fragile even once it's up and running, due to the interplay between the chrome app and the content app.
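The core of the proxy idea is a small splicing step, independent of how the proxy itself is hosted. A minimal sketch in Python (all names, markers and markup here are hypothetical, invented for illustration): each content app emits placeholder comments, and the proxy replaces them with the shared chrome before returning the response.

```python
# Shared chrome, owned and deployed by the "chrome" application.
HEADER = "<header>shared nav</header>"
FOOTER = "<footer>shared footer</footer>"

def inject_chrome(content_html: str) -> str:
    """Splice the shared header and footer into a content app's HTML.

    Content apps emit placeholder comments where the chrome belongs,
    so they stay language-agnostic and never bundle the chrome markup.
    """
    return (content_html
            .replace("<!--chrome:header-->", HEADER)
            .replace("<!--chrome:footer-->", FOOTER))

page = "<body><!--chrome:header--><main>checkout</main><!--chrome:footer--></body>"
print(inject_chrome(page))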
Has anyone else solved this problem before? If so, what approach did you use?

Images in a mySQL database [duplicate]

This question already has answers here:
Storing Images in DB - Yea or Nay?
(56 answers)
Closed 8 years ago.
I'm making a website for my brother's webcomic, which was previously hosted on Tumblr. What is the most efficient/logical option for storing the pictures?
Downloading them and storing the path in the DB
Storing them in the database, base64-encoded
Linking directly to the pictures on Tumblr
Which should I do?
If the Tumblr site is going to remain active, I would lean towards using the Tumblr API to get at the photos. You could then write some JavaScript/jQuery functions to display the images however you want.
I've done something similar in the past with Google Picasa Albums and it worked out pretty well.
http://www.tumblr.com/docs/en/api/v2#photo-posts
Just a little additional info: in the past I've found that using jQuery plugins sometimes makes it a bit simpler to get at the data I'm looking for.
I've never used this one in particular, but a quick search turned it up as an example of one that might be helpful.
https://github.com/Iaaan/jQuery-plugin-for-Tumblr-API
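Whatever client you use, the main thing to understand is the shape of the API response. A hedged sketch in Python (the response structure is assumed from the v2 photo-posts documentation linked above; the sample data and URL are made up): given a parsed response, pull out the original-size image URLs so they can be rendered on the new site.

```python
def photo_urls(api_response: dict) -> list[str]:
    """Extract original-size photo URLs from a parsed Tumblr v2 photo-posts response."""
    urls = []
    for post in api_response.get("response", {}).get("posts", []):
        for photo in post.get("photos", []):
            urls.append(photo["original_size"]["url"])
    return urls

# Illustrative sample shaped like a v2 photo-posts response.
sample = {"response": {"posts": [
    {"photos": [{"original_size": {"url": "https://example.com/p1.jpg"}}]},
]}}
print(photo_urls(sample))  # ['https://example.com/p1.jpg']
```

Storing just these URLs (or the downloaded files' paths) in the database, rather than base64 blobs, keeps the DB small and lets the web server or browser cache do the heavy lifting.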

How to prevent users from downloading presentations (PPT) and videos from my HTML page [duplicate]

This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
How to prevent downloading images and video files from my website?
I have an HTML page which resides locally on my machine. I have two items on every page: 1) a PPT presentation, and 2) a video tutorial. I want to prevent users from downloading the content for their personal use. How can this be achieved?
Thanks in advance!
Maddy
Unfortunately, the short answer is that it cannot be done in a good way if the content is to remain available on your website at the same time.
There are solutions where you obfuscate the path to the file when it is sent to the browser, and then use JavaScript to "decrypt" the path on the client. But those solutions are in no way bulletproof, as the decryption technique has to be sent to the client as well.
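To make the weakness concrete, here is a minimal sketch in Python of that obfuscation idea (the path is illustrative, and base64 stands in for whatever encoding scheme is used): the server encodes the path, client-side code decodes it before requesting the file. Since the decoding step ships to the client, anyone can run it themselves, so this only deters casual users.

```python
import base64

def obfuscate(path: str) -> str:
    """Encode a file path before embedding it in the page."""
    return base64.urlsafe_b64encode(path.encode()).decode()

def deobfuscate(token: str) -> str:
    """The client-side decoding step; trivially reproducible by anyone."""
    return base64.urlsafe_b64decode(token.encode()).decode()

token = obfuscate("/media/tutorial.mp4")
print(token)                # an opaque-looking string
print(deobfuscate(token))   # /media/tutorial.mp4
```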
This one is not easy to do, especially if you already show the files on your page. Once you do, they get downloaded by the browser, and there's no way that I'm aware of to prevent this easily.
All you can do is make it harder, but it will always be possible. Even if you could stop users from downloading the file, you can't stop them from hooking a VCR up to their video card and re-recording it. Even if you use some protected-path technology to stop that, you can't stop them from pointing a camcorder at the screen.
The same applies to the PPT presentations: since users can view them, they can take screenshots or do whatever else to create their own copy.

How to rebuild an entirely static website, without changing URLs?

I have a website that I was asked to "redesign".
The site itself was built with, and is still maintained in, FrontPage, so there are hundreds (hopefully not more) of HTML pages.
My main limitation is that I can't change any of the URLs because they have been there for over 10 years and have a lot of SEO value.
I want to rebuild the site in a smart way (CSS classes, dynamic pages, etc.) but also give the owner the ability to change content as he needs.
I was thinking of using WordPress; however, I don't have experience with it and I'm not sure what its limitations are.
My other issue is that I need server-side languages to enable this kind of site, but I don't know how to do that without changing the URLs.
And after I deal with all that, is there any way around manually handling every single page?
Any suggestions, or a push in a certain direction, are welcome.
Feel free to provide new, meaningful URLs, but make 100% sure that you configure correct 301 redirects from all old URLs to the new ones.
You can change the permalinks from your WordPress admin section once you've written the .htaccess file. The old links to the static pages can then be redirected to the new pages using 301 redirects, which won't negatively affect the existing PageRank and SEO; see How to redirect a webpage.
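As a concrete illustration, a couple of hedged .htaccess rules (the paths are made up; the real rules depend on your old FrontPage URL scheme, and each old URL needs its own mapping to the new permalink):

```apache
# Permanently redirect an old FrontPage-era page to its new WordPress permalink.
RewriteEngine On
RewriteRule ^about\.htm$ /about/ [R=301,L]

# mod_alias equivalent for simple one-to-one mappings:
Redirect 301 /contact.htm /contact/
```

If the old site follows a regular pattern (e.g. every page is name.htm), one RewriteRule with a capture group can cover many pages at once, which avoids handling every single page by hand.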