Chrome says my content script isn't UTF-8

I'm receiving the error: Could not load file 'worker.js' for content script. It isn't UTF-8 encoded.

> file -I chrome/worker.js
chrome/worker.js: text/plain; charset=utf-8

With to-utf8-unix:

> to-utf8-unix chrome/worker.js                                      
chrome/worker.js
----------------
Detected charset:
UTF-8
Confidence of charset detection:
100
Result:
Conversion not needed.
----------------

I also tried converting the file back and forth in Sublime Text, without any luck.

The relevant part of manifest.json:

  "content_scripts": [{
      "matches": ["http://foo.com/*"],
      "js": ["worker.js"]
  }],

The file in question: https://www.dropbox.com/s/kcv23ooh06wlxg3/worker.js?dl=1

It is a compiled JavaScript file emitted by ClojureScript with cljsbuild:

               {:id "chrome-worker"
                :source-paths ["src/chrome/worker"],
                :compiler {:output-to "chrome/worker.js",
                           :optimizations :simple,
                           :pretty-print false}}
               ]}

Other files (options page, background) are compiled the same way and don't generate this error. I tried removing unusual characters such as emoji, but that didn't fix the problem.

Catabolite asked 23/4, 2018 at 10:56 Comment(4)
Your js file is very large and contains a lot of unformatted js code. Try to clean it up. – Kermanshah
Might be a bug in this version of Chrome. Try Chrome Canary or an older portable Chrome. – Koziel
@elegant-user It doesn't matter whether the code is minified; pretty-printed output has the same problem. – Catabolite
@wOxxOm Good thinking! Sadly, same problem in Canary as well. – Catabolite

It turns out this is a problem in the Google Closure Compiler, which ClojureScript uses to generate JavaScript: https://github.com/google/closure-compiler/issues/1704

A workaround is to set the compiler's output charset to "US-ASCII":

:closure-output-charset "US-ASCII"
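
For reference, here is a minimal sketch of the cljsbuild entry from the question with that option added to the :compiler map (build id and paths copied from above; adjust to your project):

{:id "chrome-worker"
 :source-paths ["src/chrome/worker"]
 :compiler {:output-to "chrome/worker.js"
            :optimizations :simple
            :pretty-print false
            ;; escape non-ASCII characters so the emitted file is plain ASCII
            :closure-output-charset "US-ASCII"}}

With "US-ASCII" the Closure Compiler escapes non-ASCII characters (as \uXXXX), so the output contains only ASCII bytes and Chrome no longer rejects it.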

Thanks a lot to pesterhazy from the Clojurians Slack for helping with this!

Catabolite answered 23/4, 2018 at 13:30 Comment(0)

If you are using webpack, you can solve this by replacing the default minifier UglifyJS with Terser, which won't produce these encoding issues.

In your webpack.config.js, add:

const TerserPlugin = require('terser-webpack-plugin');

// add this to your config object
optimization: {
  minimize: true,
  minimizer: [
    new TerserPlugin({
      parallel: true,
      terserOptions: {
        ecma: 6,
        output: {
          ascii_only: true,
        },
      },
    }),
  ],
},
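
For context, a minimal complete webpack.config.js sketch showing where this block sits; it assumes webpack 4+ with terser-webpack-plugin installed, and the mode, entry, and output paths are placeholders for illustration:

const path = require('path');
const TerserPlugin = require('terser-webpack-plugin');

module.exports = {
  mode: 'production',
  // placeholder entry/output paths, adjust to your project layout
  entry: './src/worker.js',
  output: {
    path: path.resolve(__dirname, 'chrome'),
    filename: 'worker.js',
  },
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        parallel: true,
        terserOptions: {
          ecma: 6,
          // ascii_only makes Terser escape non-ASCII characters as \uXXXX,
          // so the bundle Chrome loads contains only ASCII bytes
          output: { ascii_only: true },
        },
      }),
    ],
  },
};

The decisive option is output.ascii_only; the rest is the usual Terser setup.
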
Airplane answered 23/10, 2019 at 18:19 Comment(2)
Using this throws an error while building: Failed to compile. Cannot read properties of undefined (reading 'javascript'). – Wham
This worked for me, but I don't know why. Why does Uglify produce encoding issues? – Shotton

In case anyone has this issue with Parcel, just add a .terserrc file with this content:

{
  "ecma": 6,
  "output": {
    "ascii_only": true
  }
}

This is an adaptation of @marian-klühspies' response: https://mcmap.net/q/1139821/-chrome-says-my-content-script-isn-39-t-utf-8

Alysiaalyson answered 30/7, 2021 at 22:51 Comment(0)

I had this error thrown after editing working source code in WordPad; when I saved the file there, the encoding was lost. To fix it, open the same file in Notepad, choose Save As, and select "UTF-8" in the Encoding drop-down next to the Save button.

Jeaz answered 2/3, 2021 at 14:19 Comment(0)
