I'm writing a Chrome extension that works with a website that uses ISO-8859-1. Just to give some context, what my extension does is make posting in the site's forums quicker by adding a more convenient post form. The value of the textarea where the message is written is then sent through an Ajax call (using jQuery).
If the message contains characters like á, these characters appear as Ã¡ in the posted message. Forcing the browser to display the page as UTF-8 instead of ISO-8859-1 makes the á appear correctly.
It is my understanding that JavaScript strings are Unicode and end up being sent as UTF-8, so my theory is that if I encode the string as ISO-8859-1 before sending it, that should solve my problem. However, there seems to be no direct way to do this transcoding in JavaScript, and I can't touch the server-side code. Any advice?
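I could roll my own percent-encoder, something like this minimal sketch (encodeLatin1 is just a name I made up, and it simply skips anything outside the Latin-1 range), but I'm not sure this is the right approach:

function encodeLatin1(str) {
    var out = "";
    for (var i = 0; i < str.length; i++) {
        var code = str.charCodeAt(i);
        if (code > 0xFF) {
            // No ISO-8859-1 equivalent; a real version would need to handle this somehow
            continue;
        }
        // Percent-encode every character as a two-digit hex byte, e.g. "á" -> "%E1"
        out += "%" + ("0" + code.toString(16).toUpperCase()).slice(-2);
    }
    return out;
}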
I've tried setting the created form to use ISO-8859-1, like this:
var form = document.createElement("form");
form.enctype = "application/x-www-form-urlencoded; charset=ISO-8859-1";
And also:
var form = document.createElement("form");
form.encoding = "ISO-8859-1";
But that doesn't seem to work.
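For completeness, here is a fuller sketch of that attempt, with the pieces not shown above (the hidden field and the submit) filled in by way of illustration; the acceptCharset property is another knob that seems relevant, though I don't know whether Chrome honors it for dynamically created forms:

var form = document.createElement("form");
form.method = "POST";
form.action = cfaqs_build_post_url(); // same URL the Ajax call posts to
form.acceptCharset = "ISO-8859-1";    // the accept-charset route

var field = document.createElement("input");
field.type = "hidden";
field.name = "message";
field.value = msg;
form.appendChild(field);

document.body.appendChild(form);
form.submit();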
EDIT:
The problem actually lay in how jQuery was URL-encoding the message (or in something else along the way). I fixed it by telling jQuery not to process the data and doing the encoding myself, as shown in the following snippet:
function cfaqs_post_message(msg) {
    var url = cfaqs_build_post_url();
    // escape() leaves ASCII letters and digits alone and turns other code points
    // below U+0100 into single %XX bytes, which is what the ISO-8859-1 server
    // expects. It does not touch "+", so encode that explicitly to keep the
    // server from decoding it as a space.
    msg = escape(msg).replace(/\+/g, "%2B");
    $.ajax({
        type: "POST",
        url: url,
        // The data string is already encoded; keep jQuery from processing it.
        processData: false,
        data: "message=" + msg + "&post=Preview%20Message",
        success: function(html) {
            // ...
        },
        dataType: "html",
        contentType: "application/x-www-form-urlencoded"
    });
}
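As far as I can tell, this works because escape() turns code points below U+0100 into single %XX bytes, which is exactly the ISO-8859-1 encoding the site expects, whereas the default jQuery path (encodeURIComponent) emits the UTF-8 byte sequence:

escape("á")              // "%E1"    -> the single ISO-8859-1 byte for á
encodeURIComponent("á")  // "%C3%A1" -> the UTF-8 bytes that showed up as Ã¡

The obvious caveat is that escape() produces %uXXXX sequences for characters outside the Latin-1 range, which the server wouldn't understand, but those characters can't be represented in ISO-8859-1 anyway.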