I am starting a new career as a developer, and I am offering my services to shops in my area for free to build my portfolio. My question is: is it okay to deploy/hand off a website to a client using Google Domains and GitHub Pages, since that is the cheapest way to deploy a website?
What are the downsides of doing so, or should I just suggest a hosting provider such as HostGator?
I have created a website that is ready to be handed off.
Here are my thoughts:
For a simple static website with no server-side scripting (such as PHP), GitHub Pages is a fine route for deploying your website.
Another factor to take into account is the GitHub Pages limits:
Published sites may be no larger than 1 GB
Soft bandwidth limit of 100 GB per month
Soft limit of 10 builds per hour (excluding custom Actions workflows)
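For scale, here is a quick back-of-envelope check against the 100 GB/month soft bandwidth limit. The visitor count and page weight are made-up example numbers; plug in your own.

```python
# Rough monthly bandwidth estimate for a static site on GitHub Pages.
# 500 visits/day and a 2 MB page weight are illustrative assumptions.
def monthly_bandwidth_gb(visits_per_day, page_mb, days=30):
    """Total transfer in GB if every visit downloads the full page weight."""
    return visits_per_day * page_mb * days / 1024

print(round(monthly_bandwidth_gb(500, 2.0), 1))  # → 29.3 GB, well under 100
```

For a small local shop's site, traffic would have to grow a great deal before the limit becomes a concern.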
Other options
AWS
Google Cloud
Heroku
"I am trying to offer my services to shops in my area for free to use as my portfolio."
Based on this alone, I will tell you that GitHub Pages is probably the ideal way to go for a portfolio. If your website is more complex, however, I would go for one of the options listed above. Since GitHub Pages is free and very easy to use, though, I do recommend it. If your website grows in complexity and you begin to feel you need more wiggle room/flexibility to control how it's hosted, you can always switch to something else.
I would personally, at the very least, use GitHub Pages as a starting point for a static site.
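If you do go this route, publishing can be as simple as a one-file GitHub Actions workflow. A sketch is below; the action versions are assumptions, so check the current ones in the GitHub docs.

```yaml
# .github/workflows/pages.yml — deploy the repo root as a static site
name: Deploy static site to GitHub Pages
on:
  push:
    branches: [main]
permissions:
  contents: read
  pages: write
  id-token: write
jobs:
  deploy:
    environment:
      name: github-pages
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/upload-pages-artifact@v3
        with:
          path: .            # site root; adjust if you build into e.g. dist/
      - uses: actions/deploy-pages@v4
```

Alternatively, the classic option of serving straight from a branch (no workflow at all) works fine for a hand-off to a non-technical client.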
Related
Noob poster: I'm trying to set up a Joomla 4 site with four distinct functions 1) Blog 2) Social 3) Mobile and 4) Site development. I plan to add the Gantry 5 framework and Gantry 5 Helium template. Each site function has to share data bidirectionally with the others (e.g. signon and workflow). I have a hosting provider and have been conversing with the hosting techs. There seems to be no definitive answer on whether to set up subdomains or use subdirectories for each of the functions. To lessen the impact of higher volume potential and a contained workflow I was favouring the subdomain route. On this approach the last word was I needed to install Joomla on each subdomain, followed on each by the Gantry 5 framework and then the Gantry 5 template. There appear to be pros and cons on each approach and we haven't been able to determine if subdomains or subdirectories would ultimately give me what I need. The tech suggested I post here. NOTE: There is no additional cost from my provider for using subdomains.
Activity so far: Reading what I can find on subdomains and subdirectories. I'm a designer so I've been reading then conversing then more reading then more conversing with the hosting techs trying to figure out the best way of going. It's taken a few days and now I'm chasing my tail.
Thanks.
I am making an Android app which will have around 1-2k users per day; all I need is a JSON file which is hosted at https://name.github.io/repo/filename.json.
Are there any limitations on doing so?
Is there a better way to host this JSON file for the app to fetch?
I will also be updating this JSON from time to time.
GitHub doesn't consider using GitHub Pages as a CDN for hosting static assets to be within its guidelines. The intended purpose is to host a personal blog or website, or a website for your open source project.
The documentation linked above outlines acceptable uses and limits.
Instead, you could store this JSON file in some sort of cloud bucket (e.g., S3), possibly with a CDN (e.g., Cloudflare) in front of it. That would probably keep costs minimal for your app.
At my company we are redesigning our e-commerce website. The HTML and CSS are being rewritten from the ground up to make the website responsive/mobile friendly.
Since it concerns one of our biggest websites, responsible for generating over 80% of our revenue, it is very important that nothing goes "wrong".
Our application is running on a LAMP stack.
What are the best practices for testing a major redesign?
Some issues I am thinking of:
When A/B testing a whole design (if possible), I guess you definitely don't want Google to come by and index your new design (since it's still in the test phase). How do you handle this?
Should you redirect a percentage of the users to a new URL (or perhaps a subdomain)? Or is it better to serve the new content from the existing indexed URLs based on session?
How to compare statistics from a Google Analytics point of view?
How do you hint to Google about a new design? Should I e.g. create a new UA code?
A solution might be to set a cookie only for customers who enter the website via the homepage. Doing so, you exclude AdWords traffic and returning visitors, who might be expecting the old design; serve them the original website and leave their experience untouched.
Start the test with home traffic only, set the cookie, and redirect a percentage to a subdomain. Measure conversion rate with a dimension in Google Analytics, within the same Analytics account. Since robots.txt rules apply per host, serve a robots.txt on the subdomain itself that disallows everything, to exclude it from crawling by search engines.
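One gotcha: robots.txt is evaluated per host, so the rule that blocks crawlers has to be served from the test subdomain itself, not from the main site's robots.txt. The subdomain name below is hypothetical:

```
# Served at https://beta.example.com/robots.txt — blocks all crawling
# of the test subdomain while the redesign is under test.
User-agent: *
Disallow: /
```

The main site's robots.txt stays untouched, so the indexed production URLs are unaffected.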
Marc, you’re mixing a few different concerns here:
Instrumentation. If your changes can be expressed via HTML/CSS/JavaScript only, i.e. they are optimizational in nature, you may be able to instrument using tools like VWO or Optimizely. If there are server-side changes too, then a tool like SiteSpect (any server stack) or Variant (Java only) might be in order. The advantage of using a commercial product is that it provides a number of important features out of the box, e.g. collecting experiment data and experience stability (a returning user sees the same experience). You may be able to instrument on your own, but unless you’re looking at a handful of pages, that is typically hard, particularly if you want to do it outside of the app, via DevOps mechanisms.
SEO. If you get your instrumentation right, this shouldn’t be an issue. Public URIs should not differ for the control and variant of the same resource.
Traffic routing. Another reason to consider a commercial tool. They factor that out of your app and let you set percentages. Some tools, like Variant, will allow you to write custom targeters, e.g. “value” users always see control.
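If you do roll your own routing, the percentage split and experience stability both fall out of a stable hash of a visitor id. A minimal sketch; the names are illustrative, not taken from any particular tool:

```python
# Deterministic traffic split: the same visitor id always lands in the
# same bucket, so a returning user keeps seeing the same experience.
import hashlib

def variant_for(visitor_id: str, percent_new: int = 10) -> str:
    """Assign a visitor to 'redesign' or 'control' by stable hash."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100      # stable bucket in 0..99
    return "redesign" if bucket < percent_new else "control"
```

The visitor id would typically come from a first-party cookie set on the first visit; avoid `hash()` for this, since Python randomizes it per process.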
The question is pretty clear I think, but I will elaborate on why I'm asking it.
I created a little blog engine based on OneNote. Basically, the blog configuration asks for an access to OneNote. Then the user chooses a section under which the blog posts are stored.
There is a cron script that will use all this information to automatically fetch new pages and their media, cache everything, and finally display the posts.
I chose OneNote because I own three Windows 8 computers and a Windows Phone, so OneNote was an easy choice, as I didn't want yet another application to manage my blog.
There is still a lot to do (as always with software...), but I want to make this more or less an open source project, so that other people can install it on their websites and link it directly to OneNote.
The only "big" obstacle for this now is that authentication against the OneNote API requires registering the application on Live Connect and specifying a redirect domain. So every user wishing to use this blog engine on their server will have to register their own application... That will look complicated just for a blog, especially if you're not tech-savvy.
Is there a way to "skip" or work around this requirement, even if it requires the user to make the section public (as it is for a blog, this doesn't seem too much to ask)?
Thank you in advance,
Cheers
Sounds like an awesome project! When you get it released be sure to let us know at #OneNoteDev.
Unfortunately, at this time there's no way to circumvent the requirement for Live Connect OAuth configuration. You could offer a hosted variant so only you need to worry about the LiveID configuration.
I need advice.
I inherited a website that's been around a long time. The website gets a lot of organic traffic from Google. The business/website owner is upgrading the site to make the content more manageable. At the moment, a WordPress CMS powers half the site; static HTML pages make up the remainder. Here's a summary:
1) A guide section consisting of a PHP WordPress-driven blog found at http://mysite.com/guide. Individual pages in the guide section have URLs such as http://mysite.com/guide/4930-hello-world or http://mysite.com/guide/489-welcome-to-my-site. The business owner spent 2 months populating these pages and is reluctant to scrap them for another system.
2) An e-commerce section consisting of a thousand static/physical product pages. The product pages are NOT dynamically driven and no URL rewrite rules are involved. The pages have URLs such as http://mysite.com/products/239123-sofa.html and http://mysite.com/products/23-office-desks.html
The owner wants to use a non-PHP ERP or CRM solution to power the website's e-commerce section and streamline some of the business's accounting, inventory, marketing and workflow operations.
I have never worked with ERPs or CRMs before. Some questions I have are:
1) Is it a good idea to have one website under one domain driven by two different technologies? WordPress manages pages such as http://mysite.com/guide/4930-hello-world while a Microsoft application manages pages such as http://mysite.com/products/239123-sofa.html. As mentioned earlier, the business owner is reluctant to scrap WordPress because he put considerable effort into populating it.
2) What challenges will I experience implementing URL rewrite rules (because it's two technologies under one domain, but in different sub-directories)? I need to make sure the website retains its PageRank and SEO goodies.
3) What server configuration challenges will I experience?
I've never replaced a legacy system of this magnitude on my own before. I appreciate any advice or feedback you guys can offer. Also let me know if there's anything else I should research.
Thanks
Consider a configuration with separate logical/physical back-end servers for each system. Then you can have a front-end proxy (for instance Apache with mod_proxy) serving all requests and routing them to the different back-ends.
This will also work as an application level "firewall" protecting you from unwanted requests, since you will only forward URLs that you recognize.
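A sketch of such a front-end in Apache; the back-end hostnames and ports are hypothetical and the relevant modules (mod_proxy, mod_proxy_http) must be enabled:

```apacheconf
# Front-end proxy: one public domain, two back-end applications.
<VirtualHost *:80>
    ServerName mysite.com
    ProxyPreserveHost On

    # WordPress-driven guide section
    ProxyPass        /guide    http://wordpress-backend:8080/guide
    ProxyPassReverse /guide    http://wordpress-backend:8080/guide

    # E-commerce section on the new stack
    ProxyPass        /products http://erp-backend:8081/products
    ProxyPassReverse /products http://erp-backend:8081/products
</VirtualHost>
```

Anything that matches neither prefix never reaches a back-end, which is the "firewall" effect described above.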
With regards to #1:
Big picture: while it's tough to say with the level of detail you've given, I'd say you'll probably want to make the system homogeneous: use one technology and permanently redirect the legacy pages. It'll be much more cost-effective to maintain. Port the legacy WordPress content over to a new, single system.
With regards to #2:
If you're using ASP.NET, you can write an implementation of IHttpHandler to do the URL redirection, issuing an HTTP 301 (Moved Permanently) so that Google knows where the content has moved to. I'd imagine other technologies have similar capabilities.
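If the new stack runs on IIS, the URL Rewrite module can issue the 301s declaratively instead of via a custom IHttpHandler. The fragment below goes under `<system.webServer>` in web.config; the pattern and target URL are hypothetical and assume the module is installed:

```xml
<!-- Permanently redirect legacy static product pages, e.g.
     /products/239123-sofa.html  →  /catalog/item/239123 -->
<rewrite>
  <rules>
    <rule name="LegacyProductRedirect" stopProcessing="true">
      <match url="^products/([0-9]+)-.*\.html$" />
      <action type="Redirect" url="catalog/item/{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

One pattern rule can cover all thousand product pages, so you don't need a redirect map entry per URL unless the new IDs differ from the old ones.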
With regards to #3:
If you're using a single technology, this issue should be alleviated.