Possible to host a website with Google Cloud without a domain? - html

I want to host some HTML files on Google Cloud and wondered if this is possible without adding a custom domain.
With Cloudflare or AWS, for example, it is possible.

GCS objects can be loaded just fine from a web browser, with or without a domain. They follow either of these naming schemes:
https://storage.googleapis.com/YOUR_BUCKET_NAME/YOUR_OBJECT_NAME
https://YOUR_BUCKET_NAME.storage.googleapis.com/YOUR_OBJECT_NAME
If you simply need to serve resources via a web browser, this is quite sufficient.
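For instance, loading one of these objects is just an ordinary HTTP request. A quick sketch (Node.js 18+; the bucket and object names are placeholders, the object must be publicly readable, and fetching it from a browser on another origin would also need CORS configured on the bucket):

    // Fetch a public object straight from its storage.googleapis.com URL.
    fetch('https://storage.googleapis.com/YOUR_BUCKET_NAME/styles.css')
      .then(res => res.text())
      .then(body => console.log(`loaded ${body.length} bytes`))
      .catch(console.error);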
If you need a bucket to represent an entire website, it'd be a good idea to use a custom domain. This enables a handful of nice, website-like features, such as serving a default page when none is specified and providing a custom 404 page.
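If you do go the custom-domain route, those two features are just bucket metadata (the "website" configuration). A minimal sketch, assuming the @google-cloud/storage Node.js client; the bucket name is a placeholder:

    const { Storage } = require('@google-cloud/storage');

    async function configureWebsite() {
      const storage = new Storage();
      // Set the default page and the custom 404 page as bucket metadata.
      await storage.bucket('YOUR_BUCKET_NAME').setMetadata({
        website: {
          mainPageSuffix: 'index.html',
          notFoundPage: '404.html',
        },
      });
    }

    configureWebsite().catch(console.error);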

You have three options (well, only two of them are really viable, but the last one can be useful in certain situations).
In order of ease of use and viability:
1) Google App Engine:
The default Google App Engine app is served from a *.appspot.com address, so if you create a project called "cutekittens", your site address will be cutekittens.appspot.com.
Furthermore, you can do something simple like a static webpage, or you can host an entire web app on Google App Engine. It's easy to use and very powerful. Google App Engine supports its own storage (Datastore), big data (BigQuery), and MySQL (Cloud SQL) solutions, and all of that can be served out of the default appspot.com site, which acts as the front end.
2) Static website on Google Cloud Storage. Google Cloud Storage is less powerful but should suffice if you just need a static website served. It uses "storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]", where your object is probably an index.html (see the upload sketch after this list).
3) Use a Google Compute Engine VM with a static IP. This option is probably the MOST powerful, as you can do anything you want on your own VM. However, it is also the least friendly, since you will need the actual IP address to access the site and its resources.
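For option 2, publishing a page amounts to uploading the object and making it public. A minimal sketch, again assuming the @google-cloud/storage Node.js client; the bucket name and file names are placeholders:

    const { Storage } = require('@google-cloud/storage');

    async function publishPage() {
      const storage = new Storage();
      const bucket = storage.bucket('YOUR_BUCKET_NAME');
      // Upload index.html and make it publicly readable so the
      // storage.googleapis.com URL works in a browser.
      await bucket.upload('index.html', { destination: 'index.html' });
      await bucket.file('index.html').makePublic();
    }

    publishPage().catch(console.error);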

Related

GitHub Pages .io as API, rate limit

I am making an Android app which will have around 1-2k users per day. All I need is a JSON file which is hosted at https://name.github.io/repo/filename.json.
Are there any limitations on doing so?
Is there any better way to host this JSON file to fetch the data?
I will also be updating this JSON from time to time.
GitHub doesn't consider using GitHub Pages as a CDN for hosting static assets to be within its guidelines. The intended purpose is to host a personal blog or website, or a website for your open source project.
The documentation linked above outlines acceptable uses and limits.
Instead, you could store this JSON file in some sort of cloud bucket (e.g., S3), possibly with a CDN (e.g., Cloudflare) in front of it. That would probably keep costs minimal for your app.
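Whichever host you pick, the client side only needs the new URL. A rough JavaScript sketch of the fetch, just to illustrate (the URL is a placeholder):

    // Fetch and parse the hosted JSON file.
    const DATA_URL = 'https://cdn.example.com/filename.json';

    async function loadData() {
      const res = await fetch(DATA_URL);
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return res.json();
    }

    loadData().then(console.log).catch(console.error);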

Can cloudrail be used serverless?

CloudRail seems perfect for simplifying a web page's interaction with, e.g., Google Drive or another cloud service.
This may be due to my lack of understanding of web development and Node.js, but can CloudRail be used serverless?
In other words, can CloudRail be used on a stand-alone HTML page on a local computer (i.e. served from the filesystem instead of a server) and be made to access Google Drive, or even work in an offline mode while on a stand-alone HTML page (which may later sync with Google Drive)?
The reason being I would like to design a simple mobile app that accesses the cloud (fine so far) that is mirrored as a simple portable web page that can sit anywhere (desktop/laptop/mobile/USB stick) and is not served by any server but simply loaded from the local filesystem.
If not cloudrail, what other technology might I need to achieve this?
I don't think you can use it as a standalone website the way you describe it. You might be able to realize your use case, though, by e.g. using CloudRail's Node.js SDK with Electron. This allows you to create an application very similar to a website with JavaScript while still giving your users just one file that runs pretty much everywhere.
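As a rough sketch of that Electron approach (the CloudRail calls themselves are omitted; the window size and file name are placeholders):

    // A minimal Electron main process: it loads a local HTML page as a
    // desktop app. Node.js is available in this process, so a Node.js SDK
    // such as CloudRail's could be called from here.
    const { app, BrowserWindow } = require('electron');

    app.whenReady().then(() => {
      const win = new BrowserWindow({ width: 1024, height: 768 });
      win.loadFile('index.html'); // loaded from the local filesystem, no web server
    });

    app.on('window-all-closed', () => app.quit());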

Reason for installation through Chrome Web Store

Is there a technical reason, why a Google Drive application must be installed through the Chrome Web Store (which severely limits the number of potential users)?
The reason that installation is required is to give users the ability to access applications from within the Google Drive user interface. Without installation, users would have no starting point for most applications, as they would not be able to start at a specific file, and then choose an application.
That said, I realize it can be difficult to work with in early development. We (the Google Drive team) are evaluating if we should remove this requirement or not. I suspect we'll have a final answer/solution in the next few weeks.
Update: We have removed the installation requirement. Chrome Web Store installation is no longer required for an app to work with a user's Drive transparently, but it is still required to take advantage of Google Drive UI integrations.
To provide the create->xxx behaviour that makes a new application document from the Drive interface, and to be able to open existing documents from links, there must be some kind of manifest registered with Google's systems and some kind of agreement from the user that an application can access their documents and work with specific file types. There's little way around this when you think about the effects of not doing it.
That said, there are two high level issues that make for compatibility problems.
As the poster says, the requirement to install in the Chrome Web Store "severely limits the number of potential users".
But why? Why do the majority of Chrome Web Store applications say that they only work on Chrome? Most of these are wrappers around web applications that work on a range of browsers, yet you click through a selection and most display "works on Chrome", i.e. they only install on Chrome.
Before we launched our application on Chrome, we found that someone had created "xxxxxxx launcher" in the store, which simply forwards to our web app page. We're still wondering why it only "works on Chrome". I suspect that some default template for the web store has:
"container" : "CHROME",
in it, which is the configuration option that says Chrome only. That said, I can't find one, so I'm very confused as to why this is. It would be healthier if people picked Chrome because it's the better browser (which it is in a number of regards), not because their choice is limited otherwise. People can always write to the application vendor and ask if this limitation is really necessary.
The second thought is that a standardised manifest format across cloud storage providers would mean a much higher take-up among web app vendors. Although it isn't hugely complex to integrate with, for example, Google Drive, the back end and ironing out the details took over a week in total. Multiply that by lots of storage providers and you lose an engineer for two months, plus the maintenance afterwards. The more that is common across vendor integrations, the more likely they are to happen.
And while I'm on it, a JavaScript widget for opening and saving (I know Google has one for opening) from each cloud storage provider would improve integration by web app vendors. We should be using one storage provider across multiple applications, not one web application across multiple storage providers; the file UI should be common to the storage provider.
In order to sync with the local file system, one would need to install a browser plug-in in order to bridge the Web with the local computer. By default, Web applications don't have file I/O permissions on the user's hard drive for security reasons. Browser extensions, on the other hand, do not suffer from this limitation as it's assumed that when you, the user, give an application permission to be installed on your computer, you give it permissions to access more resources on the local computer.
Considering that the add-on architectures for different browsers are different, Google decided to build this application for its own platform first. You can also find Google Drive in the Android/Play marketplace, one of Google's other app marketplaces.
In the future, if Google Drive is successful, there may very well be add-ons created for Firefox and Internet Explorer, but this has yet to be done and depends on whether Google either releases the APIs to the public or internally decides to develop add-ons for other browsers as well.

How would HTML5 web databases be cleaned-up?

I've started looking into HTML web database storage for some Chrome extension I'm working on, and it made me wonder - Who should be cleaning abandoned web databases? As opposed to desktop apps, there's no uninstaller for a web site. And as opposed to regular cookies, web databases can be much larger than just 4KB.
I can imagine some browsers or addons might give advanced users a way to clean up locally stored data, but I can't imagine my parents doing that. What will prevent web sites from clogging their hard drive once this feature is commonly used? Is there any way honest and responsible web sites can have their local data removed once they are not used anymore?
On the two websites and 4 apps I use HTML5 local storage in, I offer an option somewhere (off the About page, in account settings, or via a link at the bottom of the page) which gives you the ability to remove the local database and key-value pairs, as well as the option to opt out of the site using it.
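A small sketch of what such an opt-out handler might look like (the database and table names are hypothetical):

    function clearLocalData() {
      // Remove all key-value pairs stored for this origin.
      localStorage.clear();
      // Web SQL has no "delete database" API, so drop the tables instead.
      if (window.openDatabase) {
        const db = openDatabase('my-app-db', '1.0', 'app data', 2 * 1024 * 1024);
        db.transaction(function (tx) {
          tx.executeSql('DROP TABLE IF EXISTS items');
        });
      }
    }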
It'll be persistent, just like cookies. The difference from cookies is that you can store much more data and no expiry date can be set.
Firefox has an option to clear that information automatically (under Offline Storage).

Which is the easiest cloud for static web content

I've got a few HTML pages with the requisite images, css and other bits and pieces, all static content no CGI required. I currently host it on an Amazon EC2 image that I need to have up and running for a different application. Ideally I'd like to move the hosting of the static content off the EC2 image so that it's independent of any single EC2 instance. I'd like to host it on one of the free or at least pay as you go cloud options.
The options I've come across are:
Windows Azure: I haven't been able to get .html pages working, and even if it is possible, would it mean I'd have to update the whole Windows Azure app every time I needed to update an image? Or is there an easy way static web content could be served up from Azure blobs?
Amazon's S3: I think I'd have to put fully qualified URLs into each HTML page for each image, CSS file, etc., but that wouldn't be too bad. This seems like a reasonable option.
Google App Engine: I only spent 10 minutes looking at it, but it seems like it would work as well.
WordPress: I could just incorporate the HTML into a WordPress blog site, but I find the themes a little too restrictive; pages can only be so wide, etc.
Is there an easier way?
Update:
After some further investigation the two best ways I found are the S3 approach as described by Sug and Windows Azure Blob storage (rather than a Windows Azure service).
The difference between S3 and Azure Blobs is how the CNAME can be managed:
For S3 you'll end up with a CNAME like mybucket.mydomain.com
For Azure you'll end up with a CNAME like *.mydomain.com where * represents whatever you like. To access blobs the path is then *.mydomain.com/container/.
So S3 dictates the CNAME host but gives full flexibility on the resource path. Azure gives full flexibility on the CNAME host but dictates the first part of the resource path.
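As a concrete illustration, hypothetical DNS records following the two conventions above (the bucket, storage account, and domain names are placeholders):

    ; S3: the bucket itself must be named after the host you want to use
    mybucket.mydomain.com.  CNAME  mybucket.mydomain.com.s3.amazonaws.com.
    ; Azure: any host you like, pointing at your storage account's blob endpoint
    static.mydomain.com.    CNAME  mystorageaccount.blob.core.windows.net.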
For serving only static files, using services like App Engine or Azure would be overkill.
The simplest solution would be to use AWS S3:
1) No coding required
2) Pay-as-you-go pricing
3) You can easily map a bucket to your own domain or subdomain.
4) Free client tools to manage your buckets as if they were a dead-simple filesystem.
I personally use S3Fox but there are many others (BucketExplorer is another example)
“S3 dictates the CNAME host”
Amazon has a CDN service called CloudFront that uses an S3 bucket for storage. You only pay for S3 data transfer (I think).
Your bucket contents are copied to Amazon's CDN, meaning super-fast access from around the world. However, because it's a CDN, files are automatically cached for a long time (so there's a delay when renaming or deleting files).
Just using an S3 bucket, and setting up another domain to point to the bucket via a CNAME, might be the best idea.
For simple sites like this, I've had good experiences with NearlyFreeSpeech.NET.
GitHub Pages. You just need to know Git basics: check out the gh-pages branch and put the static content there. It will be available at http://your-name.github.io/your-project/
For example, this is my project's file.