SQL injection attack from localhost on live server - mysql

This is quite complex to explain, but I keep getting injection-attack warnings from another website just by clicking on a link. Oddly, it seems Google Chrome is the one that generates them.
To elaborate, I have this site: http://byassociationonly.com and I have this site: http://dev.byassociationonly.com/example (can't name the site as it's a client site).
Whenever I click on any of the links on http://byassociationonly.com, in Google Chrome, on my machine, none of them work and I get an injection-attack warning (I am using a plugin, WordPress Firewall, that sends me email notifications when something like this happens).
The notification I receive is this: http://cl.ly/image/2U111T0m2X35
I just don't understand this error at all; I've never had a problem before.
I've even removed the code from the page it's referencing, which is from single.php, yet the problem still exists. I thought there were conflicts with my MAMP servers running locally, but even with them switched off the problem persists, and localhost:8888 isn't referenced anywhere in wp-config.php.
However if I do this within Firefox, I don't get any notifications at all and the links work fine.
Has anybody got any ideas on how to identify where the problem lies, and solutions to fix it?
As requested, here's the code on the single.php page that the error is referring to: http://pastebin.com/QKqtLXQi

Did you recently install any Chrome extension? I have run into a similar problem before, and after hours of troubleshooting it turned out to be an extension blocking some stuff. Given that it works fine in Firefox, it feels like an issue isolated to Chrome.

Having trouble clearing cache to refresh webpage

This may sound like a very basic question, but I feel like I've tried everything.
This is a follow-up to a post I made earlier, where I resolved the issue, only for it to come back again.
To summarize, I was making some changes to the contact.css file on the contact page of my website when I noticed the changes were working offline but didn't appear online. In the earlier post I narrowed this down to a caching issue (others could see the changes but I couldn't).
In the above example I couldn't get my website to show up as background-color:blue - eventually it worked and I thought I'd fixed it... So I go to change the color back to normal and boom, it stops refreshing the changes again.
So I think it's some sort of caching issue but for the life of me I can't get my cache to clear properly so that I can refresh and see the changes.
Here are the things I have tried already:
Clearing cache (many times) on Chrome, Firefox, and Opera
Hard refresh on Chrome, Firefox, and Opera
Disabling cache through dev tools on Chrome and Firefox (this worked initially then stopped working when I re-updated the website)
Checked multiple times that the CSS file uploaded correctly and the file path was correct. This was confirmed because the correct changes were seen by other people.
Flushed my DNS
Changed from my ISP's DNS to Google's 8.8.8.8 and 8.8.4.4
I'm using HostGator to host my website; at this point I'm wondering whether it's something to do with them. I really just have no idea at this point.
(Screenshots from the original post: one showing the stale styling still served online, one showing the updated styling on the offline copy of the site.)
I noticed you said "I'd really like to get to the bottom of the underlying issue", so I figured I'd write an answer to provide a few options (and if anyone wants me to add others, please feel free to add a comment). Determining your root cause is likely much harder than just solving the overall problem, but let's start with the possible causes I can think of off the top of my head:
Multiple CDN servers taking a while to update so some are returning the old data (your current session) and some are returning new (incognito)
Server session caching so when you reload the page within one http context session you get back the same content (I've seen this in product search queries for example)
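Either way, it's worth first confirming what the server itself is returning, independent of any browser cache. A minimal check, assuming Node 18+ with its built-in fetch (the URL is a placeholder for your real stylesheet):

```ts
// check.mjs — run with Node 18+ (built-in fetch).
// Fetches the stylesheet directly, bypassing any browser cache.
const res = await fetch("https://example.com/css/contact.css", {
  headers: { "Cache-Control": "no-cache" }, // ask caches along the way to revalidate
});
console.log(res.status, res.headers.get("cache-control"), res.headers.get("last-modified"));
console.log((await res.text()).slice(0, 200)); // first 200 chars of what was served
```

If this prints the new CSS while your browser still shows the old version, the staleness is in your browser or a CDN edge, not in what HostGator is serving.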
The solution to this is relatively simple though: it's called cache busting. Basically, every time you update your source code, add a unique key to the query string, the file name, or similar, so the URL becomes unique. For example, for your CSS you can link https://path/to.css?v2.0.1 and just keep increasing the version number as you go. If you use webpack for your build outputs, it has a content-hash variable that you can use as a token in the file names (sketched below).
As for the CDNs possibly caching things out of date: the content-hash solution solves that too, since it's an entirely different file name, so the CDN will fetch it from the origin if it isn't in its cache. I'm unsure whether the URL version query parameter will do the same; maybe someone else can shed some light on that.
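For reference, a minimal webpack configuration using that content hash might look like this (the entry point and output path are illustrative, not from the original answer):

```ts
// webpack.config.ts — a minimal content-hash cache-busting sketch.
import path from "path";

export default {
  entry: "./src/index.ts",
  output: {
    path: path.resolve(__dirname, "dist"),
    // [contenthash] changes only when the file's bytes change, so each
    // deploy yields a new URL and stale browser/CDN copies are bypassed.
    filename: "[name].[contenthash].js",
  },
};
```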
Have you tried using Incognito in Chrome?

Dev Tools Crashing on Chrome Version 50.0.2661.102 m

So I have the following problem: any time I click on a request to view the headers/payload/response, I get a "not responding" window. If I wait ~2 minutes, it works. So what I'm seeing here is an unresponsive developer tools window when working on my local machine.
I tried reinstalling Chrome; nothing changed.
The current version is 50.0.2661.102 m, listed as up to date.
Is there any way to get some logs, or has anyone faced the same problem?
It might be relevant to show what extensions I have installed, but I tried enabling/disabling them and nothing changed.
And I get the same behavior in incognito mode too.
Later edit: I've identified the problem. Chrome is trying to display a cookie (in the request headers) that was 18k characters long, and it looks like this slows developer tools down a lot and sometimes makes it crash.
I just noticed that in Firefox the displayed cookie value is limited to a certain number of characters, after which it shows "...".
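If you want to check whether you're hitting the same thing, a quick console snippet run on the affected page will show per-cookie sizes:

```ts
// Paste into the DevTools console on the affected page to spot oversized
// cookies (document.cookie only exposes non-HttpOnly cookies).
for (const c of document.cookie.split("; ")) {
  console.log(c.split("=")[0], c.length, "chars");
}
console.log("total:", document.cookie.length, "chars");
```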
I can't find any known issues regarding this crash, so here are a couple of steps you could take:
Use the Chrome Cleanup Tool and see if that helps.
Fully clean and re-install Chrome (including deleting user folders). If it works after that, you can slowly add extensions and plugins back and see if any reintroduce the problem.

Debugging why the HTML Cache isn't working

I was wondering if anybody could direct me to any tools for debugging the cache.manifest file for offline HTML5 access. I recently downloaded a program called Manifesto, which lets me look up the cache manifest on loading a page. Everything seems to be working fine; however, it keeps saying that the status is "uncached". So although it is checking that the cache files are there, it isn't actually caching them on load. What's going on, and more importantly, how do I figure out how to solve it?
I got it to work. To be honest, it might have been working before, but because I had some different PHP scripts running in the background I had to be a little careful about pulling it up.
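For anyone still digging, you can watch the AppCache state transitions directly from the console (a minimal sketch; applicationCache is the now-deprecated HTML5 offline API, so newer TypeScript DOM typings may no longer declare it):

```ts
// Paste into the console to watch AppCache state transitions.
// Status codes per the spec: 0=UNCACHED, 1=IDLE, 2=CHECKING,
// 3=DOWNLOADING, 4=UPDATEREADY, 5=OBSOLETE.
const cache = window.applicationCache;
const states = ["UNCACHED", "IDLE", "CHECKING", "DOWNLOADING", "UPDATEREADY", "OBSOLETE"];
console.log("appcache status:", states[cache.status]);
["checking", "noupdate", "downloading", "progress", "cached", "updateready", "obsolete", "error"]
  .forEach((ev) => cache.addEventListener(ev, () => console.log("appcache event:", ev)));
```

An "error" event here usually means the manifest itself failed to download or parse, which would explain a status stuck at "uncached".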

Chrome basic auth and digest auth issue

I'm having some issues with Chrome canceling some HTTP requests, and I suspect cached authentication data to be the cause. Let me first write down some important facts about the application I'm writing.
I was using Basic Authentication scheme for some time to guard several services and resources in my web app.
In the meantime I was using/testing the app heavily using Chrome with my main Google Account fully synced. Most frequently I was using my name - "lukasz" - as the username in Basic Auth.
Recently I have switched my application to use Digest Authentication.
Now, some of the HTTP requests I'm making are failing with status=failed for no apparent reason. It only happens when I'm using the user "lukasz"; if I enter some other unique username, there is no problem.
I looked everywhere in the backend and frontend and couldn't find the issue anywhere in our code. I can easily reproduce this with user "lukasz" each time. So I reverted my code to Basic Auth (while not touching the rest of the app) and the problem was gone.
That led me to think that there is something wrong with cached passwords. So I cleared the cache in Chrome, but that didn't help. After several hours of analyzing the issue I decided to make sure I was running a fresh instance of Chrome, so I reinstalled it (deleting the disk data along the way). TADAAA! The problem was gone and I couldn't reproduce it anymore.
Then I synchronized my Google Account with this newly installed Chrome and after a short while the requests to my app started failing again!! So I took a deeper look at this (cleaning profile data from disk and redoing all the steps) and indeed it looks like the problem starts as soon as my account is synced with cloud!
Yes, I know it sounds dodgy. It sounds ridiculous. It sounds stupid. But I am almost sure that those two problems are somehow related (failing requests and account sync).
My idea is this: Chrome somehow remembered that I was using "lukasz/my-pass" with Basic Auth for certain services. After I switched to Digest Auth the same combination of credentials (lukasz/my-pass) is now acting funny. Perhaps under the hood Chrome still thinks that this is Basic Auth and cancels requests when it learns otherwise?
UPDATE:
I did some low-level debugging with chrome://net-internals/ and it appears that the problem occurs while reading a cache entry. This seems to confirm my initial assumption.
I did some investigation and found this article. Apparently, always adding a "Last-Modified" header to my HTTP responses has solved the issue in Chrome (I'm still having some problems in FF, but that's off topic).
However, it still doesn't solve my issue entirely: why were the requests failing in the first place?
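For reference, that workaround looks roughly like this, assuming an Express-style backend (the original post doesn't say what the server is written in; the route and payload are illustrative):

```ts
import express from "express";

const app = express();

app.get("/api/data", (_req, res) => {
  // Always send a validator so Chrome has something consistent to key
  // its HTTP cache entry on; per the linked article, this stopped the
  // failed cache reads seen in chrome://net-internals/.
  res.set("Last-Modified", new Date().toUTCString());
  res.json({ ok: true });
});

app.listen(3000);
```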
You could try using incognito mode and see what happens. It may give you some hints without having to clear the cache or re-installing Chrome.
Also take a look at How to clear basic authentication details in chrome

Selenium IDE does not record events from a server, but does from disk

I'm encountering an issue where Selenium IDE seems not to record a specific event on a real webserver.
However, if I save the page (including all resources) via firefox entirely to disk, open the saved file in the browser and try to record the same issue, Selenium IDE now works correctly and records the event as expected.
I'm not sure what is causing this behavior. Maybe some race condition exists inside Selenium IDE (latencies from a real webserver are higher than from a local file URL), or maybe it has something to do with URLs, but these are only quick guesses.
Does anybody have suggestions or best practices for how to track down this kind of Selenium IDE issue?
UPDATE:
I figured out my root issue, only through trial and error, but with success. I filed a bug with the Selenium project.
The reason it worked locally was a file-not-found error after the form submit, which didn't happen on the server side. Strangely, that file-not-found error prevented the bug from occurring.
However, the main part of this question isn't really answered yet; next time I still won't know how to quickly track down such issues. So for now, I'll keep it open.
I have a similar issue. Selenium IDE does not record anything from the website "http://suppliers.inwk.com". You may not have credentials to get login access, but if you can get the login page itself recorded in Selenium IDE, then I think we can get to the root cause, or at least get a clue.
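One way to rule IDE timing in or out is to replay the same step in code with explicit waits (a sketch using selenium-webdriver for Node; the URL and selector are placeholders, not from either post):

```ts
import { Builder, By, until } from "selenium-webdriver";

async function main(): Promise<void> {
  const driver = await new Builder().forBrowser("firefox").build();
  try {
    await driver.get("https://example.com/login");
    // Waiting explicitly removes the latency gap between a live server
    // and a local file:// copy that the question suspects.
    const submit = await driver.wait(
      until.elementLocated(By.css("button[type='submit']")),
      10000
    );
    await submit.click();
  } finally {
    await driver.quit();
  }
}

main().catch(console.error);
```

If the scripted version reproduces the event reliably on the live server, the problem is specific to how the IDE records, not to the page itself.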