Chrome Extension for Amazon Cloud

Is there any extension for chrome which is similar to Hybridfox or Elasticfox?

You can use the AWS Management Console from Amazon to do the same things from Google Chrome.
As of now there is nothing similar to Elasticfox for Google Chrome.

There is this Chrome Extension:
elastic-chrome
But it does not seem to be actively developed, sorry.

So there is Eucalyptus.
It is an abstraction layer between EC2 and your virtualized hardware, and also between physical hardware, KVM, VMware, etc. and your VM operating systems. Feasible. Unfortunately, if you left VMware for KVM, Xen, etc. for the performance gains, then you are throwing those benefits away so that your infrastructure team can assign quotas instead of provisioning things as the company needs them, or actively managing environments and their associated costs. If you are on physical hardware, then welcome to the wild world of virtualization. You should evaluate Virtual Iron (Oracle VM) now since it's free and you need to catch up.
Unless your infrastructure team/guy is outnumbered by your application and development personnel by a ratio of at least 10 to 1, this is completely manageable and is supposed to be their job. If they are outnumbered, then assigning quotas would be a great idea, although it would be a decent performance drain (tell me I am wrong, with references, please).
It would be worth your time to look at Chef (Puppet sucks). Knife has EC2, KVM, VMware, et al. provisioning capabilities to spin up an entire node and everything you want on it from a simple CLI command.
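For instance, a minimal knife ec2 invocation might look like this (a sketch only; the AMI ID, flavor, key pair, SSH user and role are placeholders, and it assumes the knife-ec2 plugin is installed and configured with your AWS credentials):
# spin up an EC2 node and apply a run list in one command
knife ec2 server create -I ami-12345678 -f m1.small -S my-keypair -x ubuntu -r 'role[webserver]'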
As for browser extensions, there are quite a few Amazon cloud management tools available for Firefox. Unfortunately, since Google and Amazon are now competing against one another, there are very few quality tools for Chrome. I use Chrome as my primary browser, Firefox for tools that aren't controversial, and IE/Safari when there is no other option.

Related

Importing a Windows Image to Google Cloud

Per the following support article: https://cloud.google.com/compute/docs/images/import-existing-image ... it is possible to import RAW Linux images for use in Google Cloud VMs. However, it makes no mention of Windows instances. Is this also an option using the same RAW compressed import format?
To the extent that the RAW Windows disk image is GCE-compatible (e.g., has all of the required device drivers), it should work from a technical perspective. The same would be true for any x86 OS, such as FreeBSD.
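If you want to try it, the mechanics should be the same as the documented Linux flow; a hedged sketch (the bucket and image names are placeholders, and the tarball must contain the disk as a file named disk.raw):
# package the raw disk, upload it to Cloud Storage, then register it as an image
tar --format=oldgnu -Sczf windows-image.tar.gz disk.raw
gsutil cp windows-image.tar.gz gs://my-import-bucket/
gcloud compute images create my-windows-image --source-uri gs://my-import-bucket/windows-image.tar.gz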
This ignores any potential licensing issues that you may need to stay in compliance with, however. If you use a Windows instance provided by GCE, the licensing cost is included in the hourly rate charged for the VM. If you create your own image, it will be up to you to stay in compliance. Your company might have a bulk volume licensing agreement with Microsoft which covers VMs, but there are often very specific terms you have to follow to stay within compliance of that agreement.

Openshift use for Commercial web sites

Can we use OpenShift Express, which is free right now, for commercial web applications?
And if not, which PaaS services are free and have no vendor lock-in?
You can use OpenShift Express for commercial web apps but be sure it will meet your requirements. Potential issues include:
currently no outgoing email support
currently applications do not scale to accommodate load
1GB disk space limit
shared hosting
limit of 3 cartridges (DB, metrics, etc.) per app
no official support from Red Hat. Documentation is good and community forum support is very active.
OpenShift would meet many commercial site requirements. I think it's a great option. For more info read the FAQ.
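For what it's worth, cartridges are managed per-app with the rhc client tools; a quick sketch (the app and cartridge names are placeholders, and the exact syntax varies between rhc client versions):
# add a database cartridge to an existing app, then see what cartridges exist
rhc cartridge add mysql-5.1 -a myapp
rhc cartridge list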
OpenShift has opened the SMTP port now.
Check: https://www.redhat.com/openshift/community/blogs/outbound-mail-ports-are-now-open-for-business-on-openshift
You can use Cloudify. It is built for orchestrating any application on any cloud without changing the application code or architecture. Cloudify is open source and free.
Cloudify offers many features such as pluggable monitoring, scale rules by any KPI, APIs for sharing runtime information between agents, and even Chef integration.
Due diligence: I'm the product manager for Cloudify at GigaSpaces.
I've been using it for some small services and clients.
There isn't any clause in their terms of use that states you can't use it for commercial web apps. But pay attention to the following line:
"You may not post or transmit through this website advertising or commercial solicitations; promotional materials relating to website or online services which are competitive with Red Hat and/or this website."
Yes, OpenShift has a tier that is completely free to use, even for commercial applications. There are no plans to change this in the future. There are, however, some minor limitations to the FreeShift tier:
Scaling limited to 3 gears
Serves about 15 pages/second
3GB total storage space (1GB per gear)
No SSL certificate for your custom domain name
No support from Red Hat
An alternative is Heroku, which you should definitely check out if you haven't already. Having used both, I can tell you that it's a much more polished platform: the servers are about 4× faster, you can run as many apps as you want, and the Heroku Toolbelt is much more powerful than OpenShift's Client Tools. Heroku is also completely free until you reach 10k rows in your database.
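To give a feel for the Toolbelt workflow, here's a minimal sketch (the app name is a placeholder, and it assumes a git repo with a Procfile):
# create the app, deploy, scale, and watch the logs
heroku create my-app
git push heroku master
heroku ps:scale web=1
heroku logs --tail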
Red Hat will provide support (and scaling) when they release their MegaShift tier.
(https://openshift.redhat.com/community/developers/pricing)
I don't think there is a date for this yet.
It won't be free, of course.

How does CrossBrowserTesting.com work?

I have been curious about better ways to cross-browser test than those screenshot services or maintaining my own array of VMs to VNC into. Then today I found crossbrowsertesting.com, which seems to let you connect from your browser via VNC to one of their machines running virtually any browser. This is really similar to a solution I had been considering, but veered away from for a few reasons. I have two questions about this service:
if you have used the service, what are its pros/cons?
how do they get around people doing all kinds of nasty things on their VMs, since they give you a full desktop to play around in?
Bonus: how do they get around the legal issues regarding people VNC'ing into Windows and using IE when the connecting clients clearly do not own the software?
Have not used it, sorry, but the standard con for a remote service is that your test site has to be accessible to the web.
You secure the desktops with the tools you get with Windows Server, the ability to lock down a user has been around for a while, although it still needs work.
Bonus: You can licence Terminal Services for multiple users, we frequently use it on our "management" servers that allow all the technical staff to log onto one server in the environment and then connect from there to the production servers. We licence it for everyone to log into at once.
Apart from not being able to access local web sites (i.e. on your company's intranet or your hard drive), crossbrowsertesting.com may also have response-time issues. VNC is not a very efficient protocol, and working over VNC can be a pain.
I prefer tools that allow me to install all relevant browsers on my PC, such as BrowserSeal.

Tool to test websites

I'm developing on a super-fast fibre optic connection.
I want a tool that allows me to test web sites at certain preset speeds; for example, I want to feel the experience of my site loading at modem speeds, then perhaps 1 Mbps, 2 Mbps, etc.
Basically I want to be able to set the speed of the connection so that I get the real feel of the site loading remotely from other countries and connections.
Does anyone know of such a tool?
WANEM is a nice open source solution that can simulate network delay, packet loss, packet corruption, disconnections, packet re-ordering, jitter, etc.
It also supports a mode of operation that only uses one network-interface, which makes it super quick to set up a test environment.
EDIT
Although WANEM is a Linux application, you only need to burn the bootable CD and start a machine with it; there is no need to sacrifice a machine to run WANEM. If even that's not an option, you can also download it as a virtual appliance that runs in VMware Workstation ($$), VMware Player (free) or VMware Server (free).
However, in my opinion (based on real usage of such products) it's really easier to have the "network simulator" on a separate machine instead of loading it on either the server or the client under test. And as explained above, thanks to the bootable CD option, that can be any machine you have lying around; we typically use decommissioned desktops and notebooks for this purpose.
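If you have a Linux box handy, you can also get the basic effect directly with tc/netem, the same traffic-shaping machinery this class of appliance builds on. A minimal sketch (the interface name, rate and delay are assumptions to adapt):
# add 200ms delay and 1% loss, then cap throughput with a tbf child qdisc
tc qdisc add dev eth0 root handle 1:0 netem delay 200ms loss 1%
tc qdisc add dev eth0 parent 1:1 handle 10: tbf rate 1mbit buffer 3200 limit 6000
# remove the shaping when finished
tc qdisc del dev eth0 root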
There are a lot of tools out there, like:
http://www.netlimiter.com/
http://www.antamediabandwidth.com/
...
Basically, most of them work like proxies.

Internet facing Windows Server 2008 -- is it secure?

I really know nothing about securing or configuring a "live" internet-facing web server, and that's exactly what I have been assigned to do by management. Aside from installing the operating system (and running Windows Update), I haven't done a thing. I have read some guides from Microsoft and on the web, but none of them seem to be very comprehensive/up to date. Google has failed me.
We will be deploying a MVC ASP.NET site.
What is your personal checklist when you are getting ready to deploy an application on a new Windows server?
This is all we do:
Make sure Windows Firewall is enabled. It has an "off by default" policy, so the out of box rule setup is fairly safe. But it never hurts to turn additional rules off, if you know you're never going to need them. We disable almost everything except for HTTP on the public internet interface, but we like Ping (who doesn't love Ping?) so we enable it manually, like so:
netsh firewall set icmpsetting 8
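(On Server 2008 the older netsh firewall context is deprecated in favor of advfirewall; a sketch of the equivalent rule, with an arbitrary rule name:)
netsh advfirewall firewall add rule name="Allow Ping" protocol=icmpv4:8,any dir=in action=allow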
Disable the Administrator account. Once you're set up and going, give your own named account admin rights. Disabling the default Administrator account helps reduce the chance (however slight) of someone hacking it. (The other common default account, Guest, is already disabled by default.)
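A one-liner for disabling it, from an elevated prompt:
net user Administrator /active:no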
Avoid running services under accounts with administrator rights. Most reputable software is pretty good about this nowadays, but it never hurts to check. For example, in our original server setup the Cruise Control service had admin rights. When we rebuilt on the new servers, we used a regular account. It's a bit more work (you have to grant just the rights necessary to do the work, instead of everything at once) but much more secure.
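Reassigning a service to a low-privilege account can be scripted with sc; a sketch (the service and account names here are hypothetical, and you still have to grant the account the "Log on as a service" right):
rem point the service at a dedicated, non-admin account
sc config CCService obj= ".\svc_ccnet" password= "S0mePassw0rd"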
I had to lock down one a few years ago...
As a sysadmin, get involved with the devs early in the project: testing, deployment, operation and maintenance of web apps are all part of the SDLC.
These guidelines apply in general to any DMZ host, whatever the OS, Linux or Windows.
There are a few books dedicated to IIS7 admin and hardening, but it boils down to this:
Decide on your firewall architecture and configuration and review it for appropriateness. Remember to defend your server against internal scanning from infected hosts.
Depending on the level of risk, consider a transparent application-layer gateway to clean the traffic and make the webserver easier to monitor.
First, treat the system as a bastion host: lock down the OS and reduce the attack surface (services, ports, installed apps; i.e. NO interactive users or mixed workloads; configure firewalls and RPC to respond only to specified management, DMZ or internal hosts).
Consider SSH, OOB and/or management LAN access, and host IDS/file-integrity verifiers like AIDE, Tripwire or Osiris.
If the webserver is sensitive, consider using Argus to monitor and record traffic patterns in addition to the IIS/firewall logs.
Baseline the system configuration and then regularly audit against the baseline, minimizing or controlling changes to keep it accurate. Automate it; PowerShell is your friend here.
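A minimal sketch of one such automated audit, assuming PowerShell 2.0+ (the baseline path is an assumption): capture a baseline of running services once, then diff the live state against it on each audit.
# capture the baseline once
Get-Service | Where-Object {$_.Status -eq 'Running'} | Select-Object -ExpandProperty Name | Set-Content C:\baseline\services.txt
# on each audit, report anything that drifted from the baseline
Compare-Object (Get-Content C:\baseline\services.txt) (Get-Service | Where-Object {$_.Status -eq 'Running'} | Select-Object -ExpandProperty Name)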
The US NIST maintains a national checklist program repository. NIST, NSA and CIS have OS and webserver checklists worth investigating, even though they are for earlier versions. Look at the Apache checklists as well for configuration suggestions, and review the Addison-Wesley and O'Reilly Apache security books to get a grasp of the issues.
http://checklists.nist.gov/ncp.cfm
http://www.nsa.gov/ia/guidance/security_configuration_guides/web_server_and_browser_guides.shtml
www.cisecurity.org offers checklists and benchmarking tools for subscribers. Aim for a score of 7 or 8 at a minimum.
Learn from others' mistakes (and share your own if you make them):
Inventory your public-facing application products and monitor them in NIST's NVD (vulnerability database; they aggregate CERT and OVAL as well).
Subscribe to and read microsoft.public.inetserver.iis.security and Microsoft security alerts. (NIST NVD already watches CERT.)
Michael Howard is MS's code security guru; read his blog (and make sure your devs read it too). It's at: http://blogs.msdn.com/michael_howard/default.aspx
http://blogs.iis.net/ is the IIS team's blog. As a side note, if you're a Windows guy, always read the team blog for the MS product groups you work with.
David Litchfield has written several books on DB and web app hardening. He is a man to listen to; read his blog.
If your devs (and sysadmins too!) need a gentle introduction to, or reminder about, web security, I recommend "Innocent Code" by Sverre Huseby. I haven't enjoyed a security book like that since The Cuckoo's Egg. It lays down useful rules and principles and explains things from the ground up. It's a great, strong, accessible read.
Have you baselined and audited again yet? (You make a change, you make a new baseline.)
Remember, IIS is a meta-service (FTP, SMTP and other services run under it). Make your life easier and run one service at a time on one box. Back up your IIS metabase.
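On IIS7 (Server 2008) the equivalent of a metabase backup is a configuration backup via appcmd; a sketch (the backup name is arbitrary):
rem snapshot the IIS configuration before making changes
%windir%\system32\inetsrv\appcmd add backup "pre-hardening"
rem roll back later if needed
%windir%\system32\inetsrv\appcmd restore backup "pre-hardening"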
If you install app servers like Tomcat or JBoss on the same box, ensure that they are secured and locked down too.
Secure the web management consoles to these applications, IIS included.
If you have to have the DB on the box too, this post can be leveraged in a similar way.
Logging: an unwatched public-facing server (be it HTTP, IMAP or SMTP) is a professional failure. Check your logs, pump them into an RDBMS, and look for the quick, the slow and the pesky. Almost invariably your threats will be automated and boneheaded. Stop them at the firewall level where you can.
With permission, scan and fingerprint your box using p0f and Nikto. Test the app with Selenium.
Ensure webserver errors are handled discreetly and in a controlled manner by IIS AND any applications. Set up error documents for 3xx, 4xx and 5xx response codes.
Now you've done all that, you've covered your butt and you can look at application/website vulnerabilities.
Be gentle with the developers; most only worry about this after a breach, when the reputation/trust damage is done and the horse has bolted. Address it now; it's cheaper. Talk to your devs about threat trees.
Consider your response to DoS and DDoS attacks.
On the plus side, consider GOOD traffic/slashdotting and capacity issues.
Liaise with the devs and Marketing to handle capacity issues and server/bandwidth provisioning in response to campaigns, sales and new services. Ask them what sort of campaign response they're expecting.
Plan ahead with sufficient lead time to allow provisioning. Make friends with your network guys so you can discuss bandwidth provisioning at short notice.
Unavailability due to misconfiguration, poor performance or under-provisioning is also an issue. Monitor the system for performance: disk, RAM, HTTP and DB requests. Know the metrics of normal and expected performance (please God, is there an apachetop for IIS? ;) ) and plan for appropriate capacity.
During all this you may ask yourself: "am I too paranoid?" Wrong question; it's "am I paranoid enough?" Remember and accept that you will always be behind the security curve, and that while this list might seem exhaustive, it is but a beginning. All of the above is prudent and diligent and should in no way be considered excessive.
Webservers getting hacked are a bit like wildfires (or bushfires here): you can prepare, and preparation will take care of almost everything except the blue-moon event. Plan for how you'll monitor and respond to defacement etc.
Avoid being a security curmudgeon or a security Dalek/Chicken Little. Work quietly with your stakeholders and project colleagues. Security is a process, not an event, and keeping them in the loop and gently educating people is the best way to get incremental payoffs in terms of security improvements and acceptance of what you need to do. Avoid being condescending, but remember: if you DO have to draw a line in the sand, pick your battles; you only get to do it a few times.
profit!
Your biggest problem will likely be application security. Don't believe the developer when he tells you the app pool identity needs to be a member of the local Administrators group. This is a subtle twist on the 'don't run services as admin' tip above.
Two other notable items:
1) Make sure you have a way to back up this system (and periodically, test said backups).
2) Make sure you have a way to patch this system and, ideally, test those patches before rolling them into production. Try not to depend upon your own good memory. I'd rather have you set the box to use Windows Update than have it disabled, though.
Good luck. The firewall tip is invaluable; leave it enabled and only allow tcp/80 and tcp/3389 inbound.
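As a sketch, those two rules in Server 2008's advfirewall syntax (the rule names are arbitrary):
netsh advfirewall firewall add rule name="Allow HTTP" dir=in action=allow protocol=TCP localport=80
netsh advfirewall firewall add rule name="Allow RDP" dir=in action=allow protocol=TCP localport=3389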
Use the roles accordingly; the fewer privileges you use for your service accounts, the better.
Try not to run everything as an administrator.
If you are trying to secure a web application, you should keep current with information on OWASP. Here's a blurb:
"The Open Web Application Security Project (OWASP) is a 501c3 not-for-profit worldwide charitable organization focused on improving the security of application software. Our mission is to make application security visible, so that people and organizations can make informed decisions about true application security risks. Everyone is free to participate in OWASP and all of our materials are available under a free and open software license. You'll find everything about OWASP here on our wiki and current information on our OWASP Blog. Please feel free to make changes and improve our site. There are hundreds of people around the globe who review the changes to the site to help ensure quality. If you're new, you may want to check out our getting started page. Questions or comments should be sent to one of our many mailing lists. If you like what you see here and want to support our efforts, please consider becoming a member."
For your deployment (server configuration, roles, etc.), there have been a lot of good suggestions, especially from Bob and Jeff. For some time attackers have been using backdoors and trojans that are entirely memory-based. We've recently developed a new type of security product which validates server memory (using techniques similar to how Tripwire (see Bob's answer) validates files).
It's called BlockWatch, primarily designed for use in cloud/hypervisor/VM type deployments, but it can also validate physical memory if you can extract it.
For instance, you can use BlockWatch to verify that your kernel and process address space code sections are what you expect (the legitimate files you installed to your disk).
Block incoming ports 135, 137, 138, 139 and 445 with a firewall. The built-in one will do. Windows Server 2008 is the first version for which using RDP directly is as secure as SSH.
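A sketch of those block rules in advfirewall syntax (the rule names are arbitrary; note that 137 and 138 are UDP services):
netsh advfirewall firewall add rule name="Block RPC and SMB TCP" dir=in action=block protocol=TCP localport=135,139,445
netsh advfirewall firewall add rule name="Block NetBIOS UDP" dir=in action=block protocol=UDP localport=137,138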