In an HTML page, references to JavaScript files are rewritten with one additional subdirectory inserted at the end of the path, just before the file name:
/+sfgRmluamFuX1R5cGU9amF2YV9zY3JpcHQmRmluamFuX0xhbmc9dGV4dC9qYXZhc2NyaXB0+.
Why is that, and can it be a source of potential problems?
In our source code we have JavaScript includes like this one:
<script src="/js/tiny_mce/tiny_mce.js" type="text/javascript"><!--
//--></script>
On development machines and on the test server everything works fine. However, when the application is installed on a production server, the code is somehow changed and looks like this:
<script src="/js/tiny_mce/+sfgRmluamFuX1R5cGU9amF2YV9zY3JpcHQmRmluamFuX0xhbmc9dGV4dC9qYXZhc2NyaXB0+/tiny_mce.js" type="text/javascript"><!--
//--></script>
This happens for every script. Since we are experiencing problems with some of the JavaScript files, I wonder whether this could be the cause.
I googled for quite a while and did not find a good explanation for this addition; the only hint I found was that it can be generated by a proxy server.
Edit: Proxy issue. See Ivan's solution to his own problem.
Is this really HTML? You wouldn't happen to be using ASP.NET, would you? That looks a lot like a cookieless session string.
You can very easily test whether your scripts are loaded by checking Firebug's Net tab or, failing that, by just putting an alert('LOADED!') in them.
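For example (a minimal sketch; tinyMCE is the global that tiny_mce.js defines, so substitute whatever global your own script creates):
// At the very top of the script you suspect is not loading:
alert('LOADED!');

// Or check from the page itself once everything has loaded:
window.onload = function () {
    if (typeof window.tinyMCE === 'undefined') {
        alert('tiny_mce.js did NOT load!');
    }
};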
We use Java and JSF with JBoss.
Originally the code looks like this:
<script src="/js/tiny_mce/tiny_mce.js" type="text/javascript"><!--
//--></script>
And it gets rewritten to this:
<script src="/js/tiny_mce/+sfgRmluamFuX1R5cGU9amF2YV9zY3JpcHQmRmluamFuX0xhbmc9dGV4dC9qYXZhc2NyaXB0+/tiny_mce.js" type="text/javascript"><!--
//--></script>
It is definitely a proxy problem. We accessed the site from somewhere other than our own network, it worked, and these funny additions did not appear.
Here's the solution for anyone who runs into the same problem: it was due to our security policy. The inserted segment is just Base64; it decodes to Finjan_Type=java_script&Finjan_Lang=text/javascript, i.e. it is added by a Finjan web-security proxy. We added the website to our trusted zone and it was fine.
I'm not much of a server guy, so I was wondering if anyone here knows of a basic Apache/MySQL configuration detail that would explain why a cloned WordPress site in one environment encodes all ampersands on save, when the same site in several other environments doesn't. There's not a whole lot special about the code; it's pretty much stock WordPress plus ACF for custom fields.
I'm sure it's not just ampersands, but that's the character we're using to test with. What is happening is that a site we built on our local machines was hosted on Media Temple, then AWS, then WP Engine, and now Liquid Web. None of the previous hosts had this issue, but on Liquid Web we have an encoding problem, and they insist it is unrelated to their server configuration and refuse to help diagnose it.
I can run phpinfo() on both my local and the Liquid Web servers, but I am not sure what I would be looking for. Something that auto-encodes POST values? Not sure. I'm also not sure whether this could be a MySQL configuration issue.
Any thoughts?
Naturally, we found the culprit minutes after posting this. Here it is in case anyone else has this issue:
In the wp-config.php file we noticed this:
define('CUSTOM_TAGS', true);
Evidently, this causes WordPress to run the wp_kses filter on post input. We didn't add this to the config file ourselves, but since plugins can modify wp-config.php, it is plausible that a plugin we tried out and later uninstalled didn't clean up after itself.
Who knows how it ended up in wp-config.php, but it did. So if you experience the & becoming &amp; bug, have a look there.
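To see the connection (an illustrative sketch, not code from the affected site): wp_kses() normalizes entities, which is exactly what rewrites a bare ampersand on save.
<?php
// Run inside a WordPress install (wp_kses() is a WordPress function).
// kses entity normalization rewrites a bare "&" as "&amp;":
echo wp_kses( 'Tom & Jerry', array() ); // prints: Tom &amp; Jerry

// The fix on the affected site was simply deleting this line from
// wp-config.php:
// define('CUSTOM_TAGS', true);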
I've got a small experimental project on the go that uses an embedded system to show web pages.
The major drawback is that the embedded system doesn't have any form of server on it (no lovely server-side languages allowed).
My current setup for testing any potential winning solution is to open the page locally (just as a document, C:/users/me/test/index.html) and then also to test it in WAMP.
I've looked into using JS or jQuery, but every resource I've found only ever works when I test it within WAMP, which isn't a viable solution for me. (The typical pattern from those resources is sketched below.)
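A minimal sketch of that pattern, assuming a data.xml file next to index.html ('item' is a hypothetical element name): it works when served over http:// (WAMP), but most browsers block XMLHttpRequest from file:// pages for security reasons, which is why it fails when the page is opened as a plain local document.
// Load and parse an XML file sitting next to index.html.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'data.xml', true);
xhr.onreadystatechange = function () {
    // status 0 covers the few browsers that allow file:// reads
    if (xhr.readyState === 4 && (xhr.status === 200 || xhr.status === 0)) {
        var doc = xhr.responseXML; // parsed XML document (null if blocked or not XML)
        if (doc) {
            alert('Found ' + doc.getElementsByTagName('item').length + ' item elements');
        } else {
            alert('Request completed, but no XML came back');
        }
    }
};
xhr.send(null);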
I have a couple of questions:
Is it even possible to read an XML document without any form of server technology?
If so, could someone post some resources please? I've found a lot of similar topics to mine, but none really cover my predicament.
If this isn't possible, are there any other technologies I could use to give the same output?
Thanks
Mostly for learning and testing purposes, I need an environment or piece of software where I can apply XSL transformations to websites (HTML).
It needs to support sessions and cookies, because a login is required to actually reach the pages I want to transform via XSL.
The manual method, i.e. calling the page in the browser, downloading it, and copying it into Eclipse, for example, is too slow. I need an automated system, if possible one that can fetch multiple pages via a script.
I know that this could be realized with a lot of coding in Java, but I hoped for a simpler solution...
Any ideas?
Thanks in advance!
No clue why people have downvoted this question -_-', but I've found a sufficient solution:
Using "wget" for downloading the files and Saxon HE (NET) for actually applying the transformations. Those programs can be easily called from windows CMD :)
Changes to a JSP in Eclipse, like removing CSS layouts or HTML tags, are not reflected when the Google web app is deployed on the local test server.
What are the possible problems causing this?
I've found that doing a Shift-refresh in the browser does wonders. What you describe is typical of browser caching and has nothing to do with the local server.
Other than that nothing else comes to mind that would explain your problems.
I have a web server running Windows Server R2 Standard, and am experiencing the issue described in this blog post: http://www.hanselman.com/blog/BugAndFixASPNETFailsToDetectIE10CausingDoPostBackIsUndefinedJavaScriptErrorOrMaintainFF5ScrollbarPosition.aspx
In short, .NET fails to recognise IE10 and treats it as a downlevel browser without JavaScript support, if I understand the issue correctly.
I tried the following popular solutions:
Installed both hotfixes for .Net 2 and 4.
Manually updated the browser definition files in the .Net framework config folder and ran aspnet_regbrowsers.exe.
Put the new browser definition files in the App_Browsers directory.
Finally, I upgraded to .NET 4.5, which solved the issue for .NET 4 sites running on the server; however, .NET 2 sites are still experiencing it.
Because this is a live web server, it has a lot of Windows updates that have not been installed, and I thought one of them might address the issue. I've looked through them, but none of the descriptions seem relevant, so I cannot justify installing them and potentially causing more problems.
Does anyone have any other solutions or possible reasons why this issue just won't go away?
I have the same problem, and I haven't been able to figure out why none of the fixes work. However, I did find a workaround that might work for you: setting the Page.ClientTarget property to "uplevel" overrides .NET's browser-capability detection. Have a look at http://msdn.microsoft.com/en-us/library/system.web.ui.page.clienttarget.aspx for more information.
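For example (a sketch, assuming your pages share a common base class; BasePage is a hypothetical name, and the same setting can also be applied site-wide via the clientTarget attribute of the <pages> element in web.config):
using System;
using System.Web.UI;

// Every page inheriting from this base class skips .NET's
// browser-capability detection and always receives uplevel
// (full-JavaScript) markup:
public class BasePage : Page
{
    protected override void OnInit(EventArgs e)
    {
        base.OnInit(e);
        ClientTarget = "uplevel";
    }
}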
I have been through many Microsoft hotfixes; they worked in the local environment, but on the live server there was no result.
Setting Page.ClientTarget = "uplevel" (preferably in a shared header or footer page) has really solved the issue. I think this is the best solution, as your .NET application may fail to detect some other browser in the future as well. But we may have to wait and see whether this fix has any side effects.