In my case I was using AngularJS to receive an encoded CSV file from the server in response to an HTTP POST. The problem was that the response of an XMLHttpRequest is, by default, represented as a Unicode string (I want to say UTF-8, but according to this it is UTF-16), not as pre-encoded binary data. It looks like this is true in your example too, since it reads the CSV from a DOM element. In that case the data ends up represented as Unicode in memory, so it doesn't matter what value you set the encoding metadata to; the data is still Unicode.
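A minimal sketch of why this decoding is destructive (hypothetical bytes, runnable in Node 18+ where Blob and TextDecoder are globals): once the response bytes have been run through a text decoder, building a Blob from the resulting string re-encodes them rather than restoring the originals.

```javascript
// Suppose the server sends a CSV encoded as Latin-1 (not UTF-8).
const latin1Bytes = Uint8Array.from([0xe9]); // 'é' in Latin-1: a single byte

// What responseType: 'blob' preserves: the raw byte, untouched.
const asBlob = new Blob([latin1Bytes]);      // size 1

// What a string response becomes: 0xE9 is not valid standalone UTF-8, so
// the decoder substitutes U+FFFD, and a Blob built from that string
// re-encodes it as three UTF-8 bytes. The original byte is gone.
const asString = new TextDecoder('utf-8').decode(latin1Bytes); // '\ufffd'
const reEncoded = new Blob([asString]);      // size 3
```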
My Angular code was doing something like this:
$http.post('/url', postData, {}).then(handleResponse);
Inside `handleResponse`, the data was already represented in Unicode. According to Angular's $http documentation, not providing the `responseType` property on the config object causes it to default to a string, which according to Mozilla ends up represented as a DOMString in UTF-16, whereas we actually want a Blob. Setting `responseType` to `'blob'` on the config object successfully prevented the content from being decoded; without it, the response data was being inadvertently decoded before being placed into the Blob.
$http.post('/url', postData, {responseType: 'blob'}).then(handleResponse);
I then used saveAs() to get the browser to provide the file contents to the user.
function handleResponse(response) {
  let headers = response.headers();
  let blob = new Blob([response.data], {type: headers['content-type']});
  saveAs(blob, headers['x-filename']);
}
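Note that saveAs() is not a browser built-in; it typically comes from a helper library such as FileSaver.js. If you'd rather avoid the dependency, a rough equivalent (a sketch, not the original answer's code; saveBlob is a hypothetical name) triggers the download through a temporary object URL:

```javascript
// Sketch of a dependency-free alternative to saveAs(): create a temporary
// object URL for the Blob, click a synthetic <a download> link, clean up.
function saveBlob(blob, filename) {
  var url = URL.createObjectURL(blob);
  var a = document.createElement('a');
  a.href = url;
  a.download = filename;
  document.body.appendChild(a);   // some browsers require the link in the DOM
  a.click();
  document.body.removeChild(a);
  URL.revokeObjectURL(url);       // release the object URL once done
}
```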
I got the idea to set `responseType` from https://mcmap.net/q/374816/-how-to-read-binary-data-in-angularjs-in-an-arraybuffer
"\ufeff"
at the beginning really made it work! – Shuma
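The comment above refers to prepending a UTF-8 byte-order mark, which helps programs such as Excel recognize the CSV as UTF-8. A sketch, with hypothetical CSV content (runnable in Node 18+, where Blob is global):

```javascript
// Blob encodes string parts as UTF-8, so the "\ufeff" prefix becomes the
// three BOM bytes EF BB BF at the start of the file.
const csvText = 'id,name\n1,Ünïcödé\n';  // hypothetical CSV content
const blobWithBom = new Blob(['\ufeff', csvText], {type: 'text/csv;charset=utf-8'});
```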