How will you debug a webpage that is not loading?

A web page is not loading/hanging. How will you debug it?
I have been asked this question a couple of times during telephonic interviews, but I don't know the perfect answer.
I have given answers such as checking whether the web app is deployed properly, whether the internet itself might be slow, whether the JSP might have errors, checking the logs for any such details, etc. But the interviewer kept asking, "These are all good checks, but what if all of those are fine? What else might be wrong?"
Also, it is not a JavaScript-specific question. I can debug JS/jQuery code using a debugger or by following console.log() output. But how would you debug a plain JSP page?
Can any web-application gurus at SO help?

Once you know that you can't simply get to your site in the expected way (what I call the Hail Mary Test), then you need to start from the inside out.
Because of the multiple failure points a website can have, I always create a command line environment that allows me to test the framework & DB operation independently of the web server, firewall settings, etc. This can take some fiddling depending on what you are using, but I've done this successfully with Django, WordPress, Drupal, etc.
Once I know the app itself is working, I connect with a command line client (e.g. links) to see if a client coming from localhost works as expected. This confirms that the server itself is working (at least partially). Then I test from another host on the same LAN. More than once I've seen localhost work and LAN access not work, and the problem is almost always server configuration or firewall configuration.
If all of that works, but you still can't get to your site from the internet, then it is a networking / firewall setting somewhere further up the food chain. Try to find a host that is one step farther up from where you last succeeded and test that. Lather, rinse, repeat.
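If you want to script this ladder of checks, a minimal sketch in Node.js might look like the following; the URLs are hypothetical stand-ins for the app port, the server's LAN address, and the public name, and each check should be run from the vantage point the steps above describe.

// Probe each layer in turn; the first URL that fails points at the broken layer.
const http = require('http');

const hops = [
  'http://localhost:8080',     // the app itself, bypassing DNS and firewalls
  'http://192.168.1.50:8080',  // the server's LAN address
  'http://www.example.com'     // the public name, from outside the LAN
];

hops.forEach(function (url) {
  http.get(url, function (res) {
    console.log(url + ' -> HTTP ' + res.statusCode);
    res.resume(); // discard the body; we only care about reachability
  }).on('error', function (err) {
    console.log(url + ' -> FAILED: ' + err.message);
  });
});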

Related

Certain webpages only half loading

Good day,
Since moving house and having my new internet (and new ISP) installed, some web pages only half-load and I constantly get Cloudflare warnings requesting that I confirm that I am not a robot. I did not have this problem with the previous ISP. I have attached some examples of the issues I'm having. This does not appear to be a "Chrome-only" thing, as I get the same results with IE.
1. The first image shows the web page loading, but not giving all the information.
2. The second image shows the warnings that constantly pop up.
3. The third image is just an example of buttons that won't load on clicking, and some of those buttons don't even let me click them at all.
Issue:
Can't load certain pages, can't click certain buttons, and I keep getting this Cloudflare thing.
Things I have tried:
- Restarting computer
- Reinstalling chrome
- DNS flushing using the command prompt
- Changing DNS to Google's DNS
Any and all help is greatly appreciated. Thank you.
I'm imagining your computer is the same, but your router is new/different (for the new ISP)? Maybe your computer has existing proxy settings that were specific to your old ISP and won't work for the new one.
Search 'proxy' in Windows search and disable any custom proxies.
I had the same issue when I moved into my new house. When I researched some more and talked with my tech-savvy cousin, he said it could be because the internet connection isn't strong (some of the data gets lost on the way), or because the ISP isn't allowing you to do the things you want to do. Sometimes this occurs when you are asking for a lot of data; it is a security feature. Call them and ask why this is happening. Maybe your case is different.
Also, check if you have a VPN on. It could also disrupt the websites.
Hope this was useful!!
Try connecting a different device (like an Xbox, a PlayStation, or your phone) and see what happens. If those devices connect correctly, then the problem is with your PC, laptop, or Apple device.
Apparently the issue had to do with the IP address. They gave me my own one and all was well in the world.

Can Chrome be used, from the command line, to retrieve a URL's content to a file?

I've been driving myself mad trying to get curl, wget, the Python requests module, and others to simply log me in to a website and pull page text from there. I can certainly request HTML from the site, but only as an anonymous user. I've spent a few hours with tricks like Chrome's "Copy as cURL" feature, but the website in question is smart enough to defend against login playbacks.
All I want is a way, from the command-line, to do something like:
chrome.exe --output_to_file page.html https://www.endpoint.com/auth_access_only.html
Essentially, I'm looking for Chrome to do for me what cURL does, but I want the command-line invocation to be executed as me. I can see how this might open a potential security issue, but I don't mind at all if I have to do something magical to authorize my script. I'm not looking to do anything evil; I just want to be able to write scripts that are as "me" as I am.
I guess that, if it's truly unavoidable, I could suck it up and dust off Internet Explorer. I'd really rather not do that. I'd feel so dirty.
This is possible, but it's not as simple as you're thinking.
You can use the Chrome Debugging Protocol to remote-control Chrome.
You will need to write some code to make this work - I have done similar tasks using the chrome-remote-interface library for Node.js.
Make sure you understand what a browser profile is and where your profile folder lives.
If Chrome is already running using your browser profile: make sure it was launched with --remote-debugging-port=9002 or similar.
If Chrome is not already running using your browser profile: launch it with --user-data-dir="C:\path\to\your\profile" --remote-debugging-port=9002 or similar.
The "running or not" part is a bit tricky - you cannot launch more than one Chrome instance with the same browser profile, but you need to use this user profile because your login data is stored there. It may actually be easiest to create a separate browser profile that is just used for this automated task, and log in to the site there too.
Then, at a high level, your Node.js code will need to connect to Chrome, load the page, wait for the response, and save it to a file. Have a look at the example code for the chrome-remote-interface library - you can definitely piece together what you need from there.
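For instance, here is a minimal sketch using chrome-remote-interface, assuming Chrome is already running with --remote-debugging-port=9002 as described above; the URL is the one from the question.

const fs = require('fs');
const CDP = require('chrome-remote-interface');

(async () => {
  // Attach to the Chrome instance listening on the debugging port.
  const client = await CDP({ port: 9002 });
  const { Page, Runtime } = client;
  try {
    await Page.enable();
    await Page.navigate({ url: 'https://www.endpoint.com/auth_access_only.html' });
    await Page.loadEventFired(); // wait for the page to finish loading
    // Serialize the rendered DOM and save it, like curl -o would.
    const { result } = await Runtime.evaluate({
      expression: 'document.documentElement.outerHTML'
    });
    fs.writeFileSync('page.html', result.value);
  } finally {
    await client.close();
  }
})();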
Another option which uses the same underlying technology is to use Puppeteer, another tool for automating Chrome. It is designed to start from a fresh profile every time, so if you go this route you'll need to script more interactions:
Visit the site's login page
Type the login credentials into the form and click the login button
Visit the site's authenticated page and save it to a file.
The benefit of this approach is that the result should be more reliable, preventing issues like expired login sessions.
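A sketch of that flow with Puppeteer follows; the login URL, the form selectors, and the environment variables holding the credentials are all placeholders, so inspect the real login form for the actual values.

const fs = require('fs');
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // 1. Visit the site's login page (placeholder URL).
  await page.goto('https://www.endpoint.com/login');

  // 2. Type the credentials and click the login button (placeholder selectors).
  await page.type('#username', process.env.SITE_USER);
  await page.type('#password', process.env.SITE_PASS);
  await Promise.all([
    page.waitForNavigation(),
    page.click('#login-button')
  ]);

  // 3. Visit the authenticated page and save its HTML to a file.
  await page.goto('https://www.endpoint.com/auth_access_only.html');
  fs.writeFileSync('page.html', await page.content());

  await browser.close();
})();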

Detect Internet Connectivity

I am trying to figure out a way to do an internet connectivity check for an AIR for iOS app. Previously, I was using (against my better judgement) a URLMonitor that checked Google once every 30 seconds. I did not like putting that load onto Google and neither did they; this morning, our network got flagged as a possible DDoS attacker simply from testing the app. So I had to disable this type of check and move on.
I have thought about using the NetworkInfo ANE from Adobe, but that presents its own issues in determining internet connectivity. The only way I can think of doing it is to check for the interfaces "en0" and "pdpxx" (which correspond to the WiFi and cellular interfaces, respectively) and check their IPs to ensure they are not in the 192.168.x.x, 10.x.x.x, or 127.x.x.x ranges. However, I am not entirely sure those are the only private router/localhost IPs out there, and there is always the possibility that the network interface names will change in the future, which would render this monitor useless. There is also the issue of IPv6 possibly throwing a wrench into this method.
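For illustration, the range check I have in mind would look something like this (plain JavaScript for clarity; the real version would be ActionScript running over the ANE's interface list, and I've included the 172.16.x.x-172.31.x.x block since RFC 1918 also reserves it as private):

// Return true when an address is private or loopback and therefore says
// nothing about actual internet connectivity.
function isPrivateAddress(ip) {
  var o = ip.split('.').map(Number);
  return o[0] === 10 ||                                 // 10.0.0.0/8
         o[0] === 127 ||                                // 127.0.0.0/8 (loopback)
         (o[0] === 192 && o[1] === 168) ||              // 192.168.0.0/16
         (o[0] === 172 && o[1] >= 16 && o[1] <= 31);    // 172.16.0.0/12
}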
Is there another way to check if the user is connected to the internet? I've searched multiple times and it seems that these are the only two ways to check. If that is the case, what is the best way to check?
I'm surprised that you got flagged as a DDoS attacker; are you sure that's what happened?
In any case, if you're not happy with putting the load onto someone else's server, then make your own server: just a basic setup that you use with the URLMonitor. You don't have to use Google's URL with the URLMonitor; you can pass it another URLRequest, which could point to your own server.
monitor = new URLMonitor(new URLRequest("http://www.your-own-server.com"));
This might also be useful if you decide that you want to pass more data between the app and the server. It's your server, so you can do what you want with it.
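The endpoint being polled can be tiny; here is a sketch of such a server in Node.js (the port and response body are arbitrary choices, not anything the URLMonitor requires):

const http = require('http');

// A near-zero-cost endpoint for the app's URLMonitor to poll.
http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('ok'); // a tiny body keeps the polling traffic negligible
}).listen(8080);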
I don't think there's any other way to check if the user is connected to the internet, and to be honest, I don't see why there would be. Checking the user's interfaces (wan0, etc.) would probably be possible, but you'd need another program, maybe a simple Python or C++ program, that AIR could call to check these things, and that sounds like the long way round.

WebRTC no video when users are on different networks

So I have a website set up, with a clienta page and a clientb page.
This is basically a split version of this site:
https://webrtc-demos.appspot.com/html/pc2.html
I am using SignalR (WebSockets) to exchange information between the clients.
When a user opens up both clienta and clientb on the same computer, it works fine.
When a user opens clienta on one computer and clientb on another computer, BUT both computers are on the same network, it works fine.
When a user opens clienta on one computer and clientb on another computer on a different network, there is no video or audio.
When the ICE messages are exchanged, I pass back a number so I know the order they were sent in. On the opposite end they don't always arrive in the same order, but audio #1 always gets there before audio #2, and the same with video.
In all cases I'm using Chrome Dev 24.
I realize sequence and timing are everything with WebRTC. I'm just not understanding how it can work on separate PCs on the same network but not on different networks. I should point out that when I say "same network", I've tested both at work with two PCs and at home with two PCs, so I don't think it's a firewall thing.
Any ideas?
I did check out https://apprtc.appspot.com/ as it's a slightly more relevant link. This led me to adding a couple of setTimeouts, though they didn't seem to help.
One last thing: I did mention the ICE messages. I should also note that both sides send and receive all the messages; an offer is created and an answer is created. Hence it works on same-network machines.
Update:
I'm using JSEP and all the latest syntax according to webrtc.org.
Update 11/15/2012:
So, is there an open-source package for creating a media relay?
Specifically .NET, but it could be PHP. The current site is public-facing; this is how I was able to test on multiple networks. So it seems like I just need another endpoint for the media relay.
Updated 11/16/2012:
In hopes that I'll get it working, or get valuable input from other developers, I'm putting my code out on GitHub.
https://github.com/thorst/RTC
Updated 11/21/2012
The code now works for everything except different-network connections (as described in this post).
Updated 5/28/2013
This year's Google I/O was much better at explaining TURN, STUN, and ICE:
http://www.youtube.com/watch?feature=player_embedded&v=p2HzZkd2A40
For reference, here are the older videos that helped me get started:
http://www.youtube.com/watch?v=E8C8ouiXHHk
http://www.youtube.com/watch?v=dAhhniqwkp8
Chrome doesn't implement TURN yet. STUN will only help with some types of NATs, but not with symmetric ones, which are prevalent in home routers. You need a media relay with a public IP to connect two devices that are behind symmetric NATs, and the standard for that is TURN: https://www.rfc-editor.org/rfc/rfc5766.
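Once a TURN-capable browser is in play, wiring the relay in is just a matter of listing it alongside the STUN server when the peer connection is created. A sketch using the current standard API, with placeholder server addresses and credentials:

// Placeholder STUN/TURN servers; substitute your own relay's details.
var pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.example.org:3478' },   // enough for most NATs
    {
      urls: 'turn:turn.example.org:3478',     // relay for symmetric NATs
      username: 'user',
      credential: 'secret'
    }
  ]
});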

Increasing Google Chrome's max-connections-per-server limit to more than 6

As far as I know, at the current moment (late 2011) the max-connections-per-server limit remains 6. Please correct me if I am wrong. It is bad that we cannot change this easily, as we can in Firefox. As far as I know, this value is hardcoded.
One solution is to download the Chromium sources and rebuild them. Is there an easier solution?
Is there any tricky way to hack this without creating a dozen mirror domains?
Why I'm asking the question: my task is to create an HTML/JavaScript slideshow that will run inside a fullscreened browser, with a huge monitor hanging on the wall. The JavaScript is really complicated: it preloads photos and makes a lot of AJAX calls to my web services. If the WiFi connection is slow and six photos are loading, the AJAX calls fail and the application runs badly. I want a fast solution based on an HTTP, browser, or Ubuntu tweak, or something else, because rebuilding the JavaScript app would take days.
Off-topic: do you know of any other things that can be tweaked in my particular situation?
IE is even worse, with a limit of 2 connections per domain. But I wouldn't rely on fixing client browsers. Even if you have control over them, browsers like Chrome will auto-update, and a future release might behave differently than you expect. I'd focus on solving the problem within your system design.
Your choices are to:
1. Load the images in sequence so that only 1 or 2 XHR calls are active at a time (use the success event from the previous image to check whether there are more images to download, and start the next request); see the sketch after this list.
2. Use sub-domains like serverA.myphotoserver.com and serverB.myphotoserver.com. Each sub-domain will have its own connection pool under the limit, which means you could have 2 requests going to each of 5 different sub-domains if you wanted to. The downside is that the photos will be cached according to these sub-domains. BTW, these don't need to be "mirror" domains; you can just make additional DNS pointers to the exact same website/server. That means you don't have the headache of administrating many servers, just one server with many DNS records.
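Here is a sketch of option 1, assuming a hypothetical photoUrls array of image URLs:

// Load photos strictly one at a time so only a single connection is in use,
// leaving the remaining connection slots free for the AJAX calls.
function loadSequentially(photoUrls, onDone) {
  var i = 0;
  (function next() {
    if (i >= photoUrls.length) return onDone();
    var img = new Image();
    img.onload = img.onerror = next; // success or failure, move on to the next
    img.src = photoUrls[i++];
  })();
}

loadSequentially(['a.jpg', 'b.jpg', 'c.jpg'], function () {
  console.log('all photos loaded');
});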
I don't know that you can do it in Chrome outside of Windows; some Googling shows that Chrome (and therefore possibly Chromium) might respond well to a certain registry hack.
However, if you're just looking for a simple solution without modifying your code base, have you considered Firefox? In about:config you can search for "network.http.max" and there are a few values in there that are definitely worth looking at.
Also, for a device that will not be moving (i.e. it is mounted in a fixed location), you should consider not using WiFi (even a HomePlug would be a step up as far as latency, stability, and dropped connections go).
BTW, the HTTP/1.1 specification (RFC 2616) suggests no more than 2 connections per server:
Clients that use persistent connections SHOULD limit the number of simultaneous connections that they maintain to a given server. A single-user client SHOULD NOT maintain more than 2 connections with any server or proxy. A proxy SHOULD use up to 2*N connections to another server or proxy, where N is the number of simultaneously active users. These guidelines are intended to improve HTTP response times and avoid congestion.
There doesn't appear to be an external way to hack the behaviour of the executables.
You could modify the Chrome(ium) executables as this information is obviously compiled in. That approach brings a lot of problems with support and automatic upgrades so you probably want to avoid doing that. You also need to understand how to make the changes to the binaries which is not something most people can pick up in a few days.
If you compile your own browser you are creating a support issue for yourself as you are stuck with a specific revision. If you want to get new features and bug fixes you will have to recompile. All of this involves tracking Chrome development for bugs and build breakages - not something that a web developer should have to do.
I'd follow #BenSwayne's advice for now, but it might be worth thinking about doing some of the work outside of the client (the web browser) and putting it in a background process running on the same or different machines. This process can handle many more connections and you are just responsible for getting the data back from it. Since it is local(ish) you'll get results back quickly even with minimal connections.