Encoding problem when your package contains functions with non-English characters

I am building my own package, and I keep running into encoding issues because the functions in my package contain non-English (non-ASCII) characters.

Korean characters are an inherent part of many of the functions in my package. A sample function:

library(rvest)
sampleprob <- function(url) {
  # sample url: "http://dart.fss.or.kr/dsaf001/main.do?rcpNo=20200330003851"
  result <- grepl("연결재무제표 주석", html_text(read_html(url)))
  return(result)
}

However, when installing the package I run into encoding problems.

I created a sample package (https://github.com/hyk0127/KorEncod/) with just one function (the one shown above) and uploaded it to my GitHub page as a reproducible example. I run the following code to install it:

library(devtools)
install_github("hyk0127/KorEncod")

Below is the error message that I see

Error : (converted from warning) unable to re-encode 'hello.R' line 7
ERROR: unable to collate and parse R files for package 'KorEncod'
* removing 'C:/Users/myname/Documents/R/win-library/3.6/KorEncod'
* restoring previous 'C:/Users/myname/Documents/R/win-library/3.6/KorEncod'
Error: Failed to install 'KorEncod' from GitHub:
  (converted from warning) installation of package ‘C:/Users/myname/AppData/Local/Temp/RtmpmS5ZOe/file48c02d205c44/KorEncod_0.1.0.tar.gz’ had non-zero exit status

The error message about line 7 refers to the Korean characters in the function.

It is possible to install the package locally from the tar.gz file, but then the function does not run as intended, because the Korean characters are read in a broken encoding.

This cannot be the first time that someone has tried building a package containing non-English (non-ASCII) characters, and yet I couldn't find a solution to this. Any help will be deeply appreciated.


A few pieces of information that I think are relevant:

Currently the DESCRIPTION file specifies "Encoding: UTF-8".

I have used Sys.setlocale() to set the locale to Korean and back, to no avail. I have also specified the roxygen tag @encoding UTF-8 for the function, to no avail.
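
For reference, these attempts looked roughly like the following (a sketch only; the exact locale string "Korean" is an assumption, and the roxygen tag goes above the function definition):

# switch the session locale to Korean, then back to the system default
Sys.setlocale("LC_ALL", "Korean")
Sys.setlocale("LC_ALL", "")

# roxygen tag placed above the function in question
#' @encoding UTF-8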

I am currently using Windows with the system (administrative) language set to English. I have tried a different Windows laptop with the administrative language set to Korean, and the same problem appears.

Siphonophore answered 25/2, 2021 at 2:8 Comment(1)
I'd say that collate and ctype are the important entries in devtools::session_info() (both derived from Sys.getlocale(), IMHO). Unfortunately, Sys.setlocale(category = "LC_CTYPE", locale = ".65001") cannot be honored on Windows, unlike Sys.setlocale(category = "LC_COLLATE", locale = ".65001"), which works as expected… An age-old incompatibility between Windows and UTF-8 in R.Armbrecht

The key trick is replacing the non-ASCII characters with their Unicode escape codes (the \uxxxx encoding).

These can be generated with the stringi::stri_escape_unicode() function.
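
For example (assuming the {stringi} package is installed), running it on the string from the question produces the escape codes. Note that the auto-printed value shows doubled backslashes, while cat() prints the literal text to paste back into the R source:

stringi::stri_escape_unicode("연결재무제표 주석")
[1] "\\uc5f0\\uacb0\\uc7ac\\ubb34\\uc81c\\ud45c \\uc8fc\\uc11d"
cat(stringi::stri_escape_unicode("연결재무제표 주석"))
\uc5f0\uacb0\uc7ac\ubb34\uc81c\ud45c \uc8fc\uc11d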

Note that, since it is necessary to get rid of the Korean characters in your code entirely in order to pass R CMD check, you will have to manually copy each string, re-encode it via {stringi} on the command line, and paste the escaped version back into all of the R scripts included in the package.

I am not aware of an available automated solution for this problem.

For the specific example provided, the Unicode-escaped version would read like this:

sampleprob <- function(url) {
  # stringi::stri_escape_unicode("연결재무제표 주석") to get the \uxxxx codes
  result <- grepl("\uc5f0\uacb0\uc7ac\ubb34\uc81c\ud45c \uc8fc\uc11d", 
                  rvest::html_text(xml2::read_html(url)))
  return(result)
}
sampleprob("http://dart.fss.or.kr/dsaf001/main.do?rcpNo=20200330003851")
[1] TRUE

This will be a hassle, but it seems to be the only way to make your code platform neutral (which is a key CRAN requirement, and thus subject to R CMD check).

Nipper answered 22/3, 2021 at 7:49 Comment(5)
Thanks! One problem with this approach is that it doesn't run in its current form. I've fiddled around with the function and found that it works (shows TRUE with the given sample url) when I manually change all the double backslashes into single backslashes, possibly because only then are the Unicode escapes read as actual characters rather than as literal codes?Siphonophore
Code like the following, which makes R read the Unicode escapes as input, works: grepl(parse(text = paste0("'", stringi::stri_escape_unicode("연결재무제표 주석"), "'")), html_text(read_html("http://dart.fss.or.kr/dsaf001/main.do?rcpNo=20200330003851"))). Of course, when building the package I would inevitably have to manually copy-paste the expression produced by parse(text = paste0("'", stringi::stri_escape_unicode("연결재무제표 주석"), "'")) into greplSiphonophore
Please feel free to update your answer (to make reading this easier for people who might look into this in the future)! I would love a less manual way to build the package than hand copy-pasting, but it seems, surprisingly, like a long shot.Siphonophore
@Siphonophore Makes sense; I have expanded the answer to explicitly mention that the code needs to be replaced manually. It will be a hassle, but there is no automated solution (that I know of).Nipper
And by the way: to double check the answer I have ventured from the comfort of my Linux installation (which runs in sweet Unicode) to a Windows machine (which runs in en.wikipedia.org/wiki/Windows-1250 to support my native Czech language) and it was having serious issues with the Korean characters even in a commented piece of code. Encoding is evil! :)Nipper

Adding this for future reference (for those facing similar problems): you can also solve this problem by saving the non-ASCII characters in a data file, then loading and using that value.

So save the string as a data file (using the standard package folder layout and the usethis / roxygen2 packages):

# In your package, save as a separate file within .\data-raw 
kor_chrs <- list(sampleprob = "연결재무제표 주석")
usethis::use_data(kor_chrs)
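
By default usethis::use_data() stores the object in data/kor_chrs.rda. As a side note (an optional variation, not part of the workflow above): if the strings are only needed inside your own functions, the data can also be stored internally so the object does not need to be exported.

# alternative sketch: write R/sysdata.rda instead of data/kor_chrs.rda,
# making kor_chrs available to the package's own functions only
usethis::use_data(kor_chrs, internal = TRUE)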

Then, in your functions, load the data and use it.

# This is your R file for the function within ./R folder
#' @importFrom rvest html_text
#' @importFrom xml2  read_html
#' @export
sampleprob <- function(url) {
  # sample url: "http://dart.fss.or.kr/dsaf001/main.do?rcpNo=20200330003851"
  result <- grepl(kor_chrs$sampleprob[1], html_text(read_html(url)))
  return(result)
}
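
As a quick check (assuming the package has been installed with this data object in place), calling the function on the sample URL should reproduce the result from the escaped-string answer above:

sampleprob("http://dart.fss.or.kr/dsaf001/main.do?rcpNo=20200330003851")
[1] TRUE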

This, yes, is still a workaround, but it runs on Windows machines without any trouble.

Siphonophore answered 24/3, 2021 at 17:42 Comment(0)
