I have an MS Access database that contains many records. The classic ASP pages on the website that loaded records into the database were written years ago in HTML 4.01 Transitional, using the charset iso-8859-1.
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
There are some special characters (e.g. é) in some of the database fields. The pages that were coded at the same time as the database input pages display these characters correctly.
However, I have now added some mobile-friendly pages to the site, which are coded in HTML 5 and use the charset UTF-8.
<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<meta charset="utf-8">
Those pages, using the same data from the same database, do not show the special characters correctly; they show a � instead.
I have tried re-coding the charset on the new pages to iso-8859-1 but that does not fix the problem. I have searched this forum and read pages like http://kunststube.net/frontback/ but cannot see where I am going wrong.
Could it be that the MS Access database holds the information in charset iso-8859-1 and I need to change it when I run the "select * from" command in ASP? If so how do I do that? Or am I way off track with that idea?
I know I could change all of the new pages and code them in HTML 4.01 transitional and that will work, but I was hoping to update the old ones in the fullness of time to HTML 5 rather than go backward.
OK, I seem to have solved it by using
<%@ language=vbscript codepage=65001 %>
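For reference, a minimal sketch of the top of a classic ASP page set up this way could look like the following; the Response.CodePage and Response.CharSet lines are assumptions added here as a common companion step, not part of the fix quoted above:

<%@ Language=VBScript CodePage=65001 %>
<%
' 65001 is the UTF-8 code page, so strings read from the Access database are converted to UTF-8.
Response.CodePage = 65001    ' build the response body as UTF-8
Response.CharSet = "UTF-8"   ' advertise UTF-8 in the HTTP Content-Type header
%>
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">

With that in place, the é characters coming out of the Access database should reach the HTML 5 pages as proper UTF-8 rather than as mis-mapped bytes.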
Related
I'm setting up a terms page with Firebase Hosting. I deployed the page, but the text on the page shows up as question marks. The text is in Arabic.
I tried many answers, but none of them worked.
Maybe the problem is in Firebase Hosting?
I tried:
- <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
- <META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=windows-1256">
- <META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-6">
My Code:
<!DOCTYPE html>
<html>
<meta charset="utf-8"/>
<body>
<h1 dir='rtl' lang='ar' >الشروط</h1>
<p dir="rtl" lang="ar">الشروط</p>
The code looks OK - just make sure your editor saves the file in UTF-8.
To verify this, download the file from the server as "yourfilename.html" and open it locally in your browser to see whether the Arabic text displays.
If it doesn't, open it in Notepad - you will probably see question marks, which means you didn't save the file in UTF-8 format; you probably saved it as ASCII.
If, on the other hand, you do see the Arabic text, look at the response you receive from the server - maybe something there (the Content-Type header, for example) is not right.
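For completeness, a fully fleshed-out version of the page from the question might look like this (same Arabic content; the <head>, <title> and closing tags are only added to make a well-formed example, and lang="ar" on the root element is optional):

<!DOCTYPE html>
<html lang="ar">
<head>
<meta charset="utf-8"/>
<title>الشروط</title>
</head>
<body>
<h1 dir="rtl" lang="ar">الشروط</h1>
<p dir="rtl" lang="ar">الشروط</p>
</body>
</html>

Saved as UTF-8 and served with a matching Content-Type header, this should render the Arabic text correctly.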
I am experiencing a very strange bug after updating my Firefox to version 26 (on a MacBook Pro with Mountain Lion). Although the headers have not changed, it now fails to render the UTF-8 characters correctly. I tried several different header styles but still end up with the same problem.
My original header is somewhat old-school, but it works fine in every other browser.
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
Even stranger is the fact that if I run the page tab app directly via its server URL (outside of Facebook and its page tab iframe), the encoding works fine!
I tried the Strict doctype and the HTML5 doctype, but the problem remains.
If anyone has any ideas about what's going on, I would appreciate hearing them.
At least the latest version of Firefox (26.0) complains in the console if you have too much content before the charset meta tag:
The character encoding declaration of document was found too late for
it to take effect. The encoding declaration needs to be moved to be
within the first 1024 bytes of the file.
So it could help if you move the charset declaration to right after the opening <head> tag, as in the sketch below.
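For example, a trimmed-down head for the page in question might look like this (the title and comments are placeholders):

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<!-- encoding declared first so it falls well inside the first 1024 bytes -->
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<title>Page tab</title>
<!-- long script and style blocks, and any other meta tags, go after the charset declaration -->
</head>

Anything bulky that sits before the charset declaration (tracking scripts, inline CSS, dozens of other meta tags) is what pushes it past the 1024-byte limit.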
I have used the following code in my head tag.
<!DOCTYPE HTML>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<title>Admin Panel</title>
</head>
My web page contains characters from another language, which the UTF-8 encoding supports. But when I save my HTML file, it shows me the error: "The document's current encoding can not correctly save all of the characters within the document. You may want to change to UTF-8 or an encoding that supports the special characters in this document."
I am already using UTF-8. How do I fix this?
You are not using UTF-8. You have just included some markup which tells the browser you are using UTF-8.
That error message sounds like it is coming from your editor. You need to configure your editor to save in UTF-8.
I have an ASP.NET page with the encoding defined in the header. It loads with the proper encoding only once, then shows the data with the wrong encoding (ASCII?).
The HTML header is written like this:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" >
<meta http-equiv="content-type" content="text/html" />
<meta http-equiv="charset" content="utf-8" />
The data that should appear as UTF-8 is fetched from an nvarchar column in SQL Server and put in a cookie in ASP.NET. When the page loads the first time, it appears OK:
Salé
Then for all other redirects / refreshes, it appears as:
SalÃ©
The charset meta tag is present when I check the bad page's source in a browser.
Is my header wrong? If not, I'll dig deeper into my code.
[EDIT]
I tried to change the header programmatically with a runat="server" in the <head>, but it wasn't working.
Dim meta As System.Web.UI.HtmlControls.HtmlMeta = New System.Web.UI.HtmlControls.HtmlMeta()
meta.HttpEquiv = "content-type"
meta.Content = Response.ContentType + "; charset=" + Response.ContentEncoding.HeaderName
Page.Header.Controls.Add(meta)
[EDIT]
Found a similar issue: UTF-8 problems with characters from MySQL database (e.g. é as Ã©)
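For illustration, this is the kind of double handling that linked question describes - a string encoded as UTF-8 once and then decoded again as ISO-8859-1, which is exactly what turns é into Ã© (the variable names here are made up for the example):

' "Salé" encoded as UTF-8 gives the bytes 53 61 6C C3 A9 ...
Dim utf8Bytes As Byte() = System.Text.Encoding.UTF8.GetBytes("Salé")
' ... and reading those bytes back with the wrong (Latin-1) charset yields "SalÃ©"
Dim misread As String = System.Text.Encoding.GetEncoding("ISO-8859-1").GetString(utf8Bytes)

If the value coming back out of the cookie shows the same pattern, then something between the cookie write and the cookie read is treating the UTF-8 bytes as ISO-8859-1.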
Let's say I have the following file, called index.html, in the directory D:\Experimental:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" >
<head>
<title>Minimal XHTML 1.1 Document</title>
</head>
<body>
<p>This is a minimal XHTML 1.1 document.</p>
</body>
</html>
If I open the link
file:///D:/experimental/index.html
I get to see the HTML, but it seems that the character encoding defaults to Western (ISO-8859-1); I can see this when I click View -> Character Encoding in Firefox.
I want to display this in UTF-8 because Western (ISO-8859-1) doesn't display some characters correctly. Does anyone know how to fix this?
You should include:
<meta http-equiv="Content-Type" content="text/html; charset=utf-8"/>
in your HEAD element.
Edit
I've just tried your example in Firefox on the Mac, and even without the meta tag, it correctly interprets the document as UTF-8. The Standard seems to indicate that it should use the XML processing instruction, but that you should also use the correct HTTP headers. Since you're not sending headers (because you're not using HTTP) you can specify them with the meta tag.
Maybe try adding
<meta http-equiv="content-type" content="text/html;charset=utf-8" />
in the <head> section?
When loading files from disk, your browser does not have an HTTP Content-Type header to read the encoding from, so it guesses. To guess the document encoding it uses your operating system's current encoding, the actual bytes that are in the file, and information inside the file itself.
As Jonathan wrote, you can add a
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
element that will help the browser use the correct content type. Note, however, that browsers will often ignore that element if your document is sent from a misconfigured HTTP server that explicitly specifies another encoding in the Content-Type header.