Scala Questions

7

Solved

I've got a local Kafka running using the following docker-compose.yml file: version: '2' services: zookeeper: image: "confluentinc/cp-zookeeper:5.0.1" environment: ZOOKEEPER_CLIENT_PO...
Vociferous asked 3/12, 2018 at 0:52

7

Solved

How to get current_date - 1 day in Spark SQL, the same as curdate() - 1 in MySQL.
Haase asked 13/12, 2016 at 6:28
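A short sketch of the usual answer (assumes a SparkSession named `spark` and a DataFrame `df` are in scope):

```scala
// Spark SQL: subtract one day from the current date.
spark.sql("SELECT date_sub(current_date(), 1) AS yesterday")

// DataFrame API equivalent:
import org.apache.spark.sql.functions.{current_date, date_sub}
df.withColumn("yesterday", date_sub(current_date(), 1))
```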

1

A legacy app has the below working Gatling test private val getUserByTuid = scenario("GetActiveUserInfoWrapper") .feed(users.toArray.circular) .exec( http("GET /v1/foo/{id}") .get("/v1/foo/${i...
Hymanhymen asked 14/6, 2018 at 23:53

2

Solved

Taken from "Scala with cats" (page 18): Implicit Conversions When you create a type class instance constructor using an implicit def, be sure to mark the parameters to the method as impl...
Byler asked 5/12, 2020 at 15:33
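A minimal sketch of the rule quoted from the book: when a type class instance constructor is an implicit def, its parameters must also be marked implicit, or the compiler cannot resolve instances recursively. (`Show`, `showInt`, and `showOption` are illustrative names, not the book's code.)

```scala
trait Show[A] { def show(a: A): String }

implicit val showInt: Show[Int] = new Show[Int] {
  def show(a: Int): String = a.toString
}

// Instance constructor: the parameter itself is implicit, so the
// compiler can chain showInt into showOption automatically.
implicit def showOption[A](implicit s: Show[A]): Show[Option[A]] =
  new Show[Option[A]] {
    def show(oa: Option[A]): String = oa match {
      case Some(a) => "Some(" + s.show(a) + ")"
      case None    => "None"
    }
  }

val rendered = implicitly[Show[Option[Int]]].show(Some(42))
```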

11

I'm trying to compile mixed Java/Scala code using IntelliJ 14.1.4, but it keeps giving me this error: Error Compiling SBT component 'compiler-interface-2.9.2-52.0'. I have tried to downgrade my sc...
Yeah asked 6/8, 2015 at 9:19

1

I have a Spark Streaming application that reads data from multiple Kafka topics. Each topic has a different type of data, and thus requires a different processing pipeline. My initial solution was...
Laclos asked 2/4, 2017 at 11:8

9

Solved

I keep getting a "Forward reference extends over definition of value a" error while trying to compile my application (inside SBT). a is just val a = ""; the error is triggered by accessing a partic...
Pomade asked 11/11, 2012 at 4:9
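A minimal sketch of what triggers this compiler error: inside a block, a statement reads a val before the line that defines it.

```scala
def example(): String = {
  // val b = a + "!"   // does not compile: "forward reference extends
  //                   //  over definition of value a"
  val a = ""
  val b = a + "!"      // fine: `a` is defined above this use
  b
}
```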

3

Checkpoint version: val savePath = "/some/path" spark.sparkContext.setCheckpointDir(savePath) df.checkpoint() Write to disk version: df.write.parquet(savePath) val df = spark.read.parque...
Dav asked 9/8, 2018 at 17:25
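A sketch contrasting the two versions from the question (assumes `spark` and `df` in scope). One detail worth noting: checkpoint() returns a new DataFrame, so the original snippet discards its result; the checkpoint files are in Spark's internal format and are not meant to be read back directly, whereas the Parquet round-trip is readable by any later job.

```scala
// Checkpoint: truncates the logical plan/lineage.
spark.sparkContext.setCheckpointDir("/some/path")
val checkpointed = df.checkpoint() // keep the returned DataFrame!

// Write-to-disk: an explicit, re-readable Parquet round-trip.
df.write.mode("overwrite").parquet("/some/path/df.parquet")
val reloaded = spark.read.parquet("/some/path/df.parquet")
```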

6

I am trying to build a Scala project in IntelliJ using Gradle. I am able to build, but compilation keeps failing with a stack overflow error. I looked through other posts with similar issues, which sugg...
Kary asked 3/4, 2017 at 21:2

3

I'm trying to use Apache Iceberg for writing data to a specified location(S3/local). Following is the configuration used below. SBT: libraryDependencies += "org.apache.spark" %% "spa...
Mirianmirielle asked 1/9, 2022 at 10:47

6

I'm using sbt-native-packager 1.0.0-M5 to create my docker image. I need to add a file that's not a source file or in the resource folder. My docker commands are as follows: dockerCommands := Seq(...
Eosin asked 23/2, 2015 at 14:27
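One common way to ship a non-source, non-resource file with an sbt-native-packager image is to add it to the universal mappings, which the Docker plugin stages into the image (a build.sbt sketch; the file paths are illustrative):

```scala
mappings in Universal += file("scripts/entrypoint.sh") -> "bin/entrypoint.sh"
```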

5

Solved

I have my timestamp in UTC and ISO8601, but using Structured Streaming, it gets automatically converted into the local time. Is there a way to stop this conversion? I would like to have it in UTC. ...
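A configuration sketch for keeping timestamps in UTC instead of the session's local zone (the `spark.sql.session.timeZone` setting is available from Spark 2.2 onward; assumes a SparkSession named `spark`):

```scala
spark.conf.set("spark.sql.session.timeZone", "UTC")
```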

4

Solved

Given Table 1 with one column "x" of type String, I want to create Table 2 with a column "y" that is an integer representation of the date strings given in "x". It is essential to keep null values in...
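One possible sketch (assumes the strings parse with a known pattern; "yyyy-MM-dd" here is an assumption, as is the DataFrame name `table1`). Both to_date and date_format return null for null input, so nulls in "x" carry through to "y":

```scala
import org.apache.spark.sql.functions.{col, date_format, to_date}

val table2 = table1.withColumn(
  "y",
  date_format(to_date(col("x"), "yyyy-MM-dd"), "yyyyMMdd").cast("int")
)
```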

5

Solved

I'm using Scala's Map#get function, and for every successful lookup it returns a Some[String]. Is there an easy way to remove the Some? Example: def searchDefs{ print("What Word would you like define...
Vanburen asked 22/2, 2012 at 6:7
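The usual way to unwrap the Option returned by Map#get is getOrElse, which supplies a fallback instead of exposing Some/None to the caller (the map contents here are illustrative):

```scala
val defs = Map("scala" -> "a JVM language")

val found   = defs.get("scala").getOrElse("no definition found")
val missing = defs.getOrElse("java", "no definition found") // same, shorter
```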

6

Solved

I have a method that should convert a list to an Option of an object, or None if the list is empty. def listToOption(myList: List[Foo]): Option[Bar] = { if(myList.nonEmpty) Some(Bar(myList)) els...
Marylou asked 30/1, 2015 at 20:42
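The if/else from the question works; a common more idiomatic alternative filters out the empty case and maps. (`Foo` and `Bar` below are stand-ins mirroring the signature in the excerpt.)

```scala
case class Foo(n: Int)
case class Bar(foos: List[Foo])

def listToOption(myList: List[Foo]): Option[Bar] =
  if (myList.nonEmpty) Some(Bar(myList)) else None

// Alternative: filter discards the empty list, map wraps the rest.
def listToOption2(myList: List[Foo]): Option[Bar] =
  Some(myList).filter(_.nonEmpty).map(Bar)
```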

4

Solved

Let's say that I have a trait, Parent, with one child, Child. scala> sealed trait Parent defined trait Parent scala> case object Boy extends Parent defined module Boy I write a function t...
Becki asked 4/2, 2015 at 17:13
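A minimal sketch mirroring the REPL session above: because Parent is sealed, the compiler knows every subtype and can check a match for exhaustiveness, warning if a case is missing.

```scala
sealed trait Parent
case object Boy extends Parent

// Omitting a case here would produce a
// "match may not be exhaustive" warning at compile time.
def describe(p: Parent): String = p match {
  case Boy => "boy"
}
```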

3

Solved

I use EMR Notebook connected to EMR cluster. Kernel is Spark and language is Scala. I need some jars that are located in S3 bucket. How can I add jars? In case of 'spark-shell' it's easy: spar...
Northward asked 13/8, 2019 at 8:28
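For EMR Notebooks (which use Sparkmagic/Livy kernels), one common approach is the %%configure cell magic in the first cell of the notebook; the bucket and jar names below are illustrative:

```
%%configure -f
{ "conf": { "spark.jars": "s3://your-bucket/your-lib.jar" } }
```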

9

Solved

I read data from a CSV file, but it has no index. I want to add a column numbered from 1 to the number of rows. What should I do? Thanks. (Scala)
Smash asked 14/4, 2017 at 7:9
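A common approach is zipWithIndex on the underlying RDD, which, unlike monotonically_increasing_id, yields consecutive numbers (a sketch; assumes `spark` and `df` in scope, and "index" is an illustrative column name):

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{LongType, StructField, StructType}

val withIdx = spark.createDataFrame(
  df.rdd.zipWithIndex.map { case (row, i) =>
    Row.fromSeq(row.toSeq :+ (i + 1)) // 1-based index
  },
  StructType(df.schema.fields :+ StructField("index", LongType, nullable = false))
)
```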

3

Solved

I am using Spark 2 and Scala 2.11 in a Zeppelin 0.7 notebook. I have a dataframe that I can print like this: dfLemma.select("text", "lemma").show(20,false) and the output looks like: +---------...
Cryptozoite asked 6/7, 2017 at 10:38
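In Zeppelin, the ZeppelinContext (bound as `z` in Spark paragraphs) can render a DataFrame as an interactive table/chart instead of the plain-text show() output:

```scala
z.show(dfLemma.select("text", "lemma"))
```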

3

Solved

Problem I am trying to run a remote Spark Job through IntelliJ with a Spark HDInsight cluster (HDI 4.0). In my Spark application I am trying to read an input stream from a folder of parquet files f...

2

Solved

What are the possible ways to make a HashSet thread-safe? I saw some samples like the one given below. var test = new mutable.HashSet[Long] with mutable.SynchronizedSet[Long] SynchronizedSet is deprecated ...
Nigro asked 6/12, 2016 at 10:58
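Two non-deprecated replacements for SynchronizedSet, sketched below: a concurrent set view backed by ConcurrentHashMap (JDK 8+), and the standard library's lock-free TrieMap used as a set:

```scala
// 1. Concurrent set backed by ConcurrentHashMap:
import java.util.concurrent.ConcurrentHashMap
val jSet = ConcurrentHashMap.newKeySet[Long]()
jSet.add(1L)

// 2. Lock-free TrieMap from the standard library, used as a set:
import scala.collection.concurrent.TrieMap
val sSet = TrieMap.empty[Long, Unit]
sSet.put(2L, ())
```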

4

I was looking at the DataFrame API, and I can see two different methods providing the same functionality for removing duplicates from a data set. I can understand that dropDuplicates(colNames) will remove dupl...
Defamatory asked 27/2, 2016 at 7:22
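A short sketch of the distinction (assumes a DataFrame `df`; the column name "id" is illustrative): distinct() compares every column, while dropDuplicates(cols) keeps one row per key over only the listed columns.

```scala
df.distinct()           // same as df.dropDuplicates(): all columns compared
df.dropDuplicates("id") // dedupe on "id" alone; other columns are kept
```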

8

Solved

I have three strings, for example "A", "B", "C". I have to produce the string which results from concatenating them, only the second string must be padded with whitespace to a given length. This w...
Lawgiver asked 7/6, 2013 at 14:12
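Two idiomatic ways to do this: padTo right-pads a string with a given character up to the target length, and the f interpolator's %-10s left-justifies within a width of 10 (the length 10 is an illustrative choice):

```scala
val a = "A"; val b = "B"; val c = "C"

val viaPadTo  = a + b.padTo(10, ' ') + c
val viaFormat = f"$a$b%-10s$c"
```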

1

Using sbt to run ScalaTest tests, you can run all tests of a suite in parallel using the trait ParallelTestExecution, and this works out of the box. How to obtain the same result running ...
Coppage asked 1/2, 2019 at 16:23
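Outside sbt, ScalaTest's standalone Runner has a -P flag that enables concurrent execution, which also distributes the tests of suites mixing in ParallelTestExecution (a sketch; the classpath, runpath, and suite name below are illustrative):

```shell
# -P4 caps the thread pool at 4 threads; plain -P uses a default size.
scala -cp scalatest.jar org.scalatest.tools.Runner \
  -P4 -R target/test-classes -s com.example.MySpec
```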

2

Solved

Look at this code snippet: userService.insert(user) match { case Success(f) => Logger.debug("User created successfully") case Failure(e) => { // how do I determine the type of `e`? } } ...
Subtropics asked 5/1, 2014 at 23:25
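The type of the wrapped exception can be determined with a typed pattern inside Failure, as sketched below (the exception types chosen are illustrative):

```scala
import scala.util.{Failure, Success, Try}

def classify(t: Try[Int]): String = t match {
  case Success(_)                           => "ok"
  case Failure(e: IllegalArgumentException) => "bad argument: " + e.getMessage
  case Failure(e)                           => "other failure: " + e.getClass.getSimpleName
}
```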

© 2022 - 2024 — McMap. All rights reserved.