How to filter on partial match using sparklyr

I'm new to sparklyr (but familiar with Spark and PySpark), and I have a really basic question. I'm trying to filter a column based on a partial match. In dplyr, I'd write the operation like so:

businesses %>%
  filter(grepl('test', biz_name)) %>%
  head

Running that code on a Spark DataFrame, however, gives me:

Error: org.apache.spark.sql.AnalysisException: Undefined function: 'GREPL'. This function is neither a registered temporary function nor a permanent function registered in the database 'project_eftpos_failure'.; line 5 pos 7

Eryneryngo answered 18/9/2017 at 23:19

As in standard Spark, you can use either rlike (Java regular expressions):

df <- copy_to(sc, iris) 

df %>% filter(rlike(Species, "osa"))

# or anchored
df %>% filter(rlike(Species, "^.*osa.*$"))

or like (simple SQL regular expressions):

df %>% filter(like(Species, "%osa%"))

Both methods can also be used with infix notation:

df %>% filter(Species %rlike%  "^.*osa.*$")

and

df %>% filter(Species %like% "%osa%")

respectively.

For details see vignette("sql-translation").
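
If you want to check how dplyr translates these filters before they are sent to Spark, show_query() (from dplyr/dbplyr, which works on any remote tbl, including sparklyr ones) prints the generated SQL. A minimal sketch, assuming a local connection sc as in the snippet above:

library(sparklyr)
library(dplyr)

sc <- spark_connect(master = "local")  # assumed local connection
df <- copy_to(sc, iris)

# Print the Spark SQL generated for the rlike filter
df %>% filter(rlike(Species, "osa")) %>% show_query()

# Print the Spark SQL generated for the like filter
df %>% filter(like(Species, "%osa%")) %>% show_query()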

Swami answered 18/9/2017 at 23:27. Comments (3):
Perfect, so simple :O I'm surprised I didn't find that listed anywhere else, or here: spark.rstudio.com/articles/guides-dplyr.html#sql-translation - Eryneryngo
Thanks for the answer @rookieerror, but the link is dead now. - Urinal
You can use the Hive operators and user-defined functions (UDFs): cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF - Geoff
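
Building on that last comment: Hive/Spark SQL built-ins that dplyr does not translate are passed through to Spark verbatim, so they can be used directly inside filter() or mutate(). A minimal sketch reusing the df from the answer; instr() and regexp_extract() are Hive/Spark SQL built-ins, not dplyr functions:

# instr(str, substr) returns the 1-based position of substr in str, or 0 if absent,
# so a positive result means a partial match
df %>% filter(instr(Species, "osa") > 0)

# regexp_extract() is another built-in that passes straight through to Spark SQL
df %>% mutate(osa_match = regexp_extract(Species, "osa", 0)) %>% head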
