apache-spark-standalone Questions

2

Solved

I'm deploying an Apache Spark application using the standalone cluster manager. My architecture uses 2 Windows machines: one set up as the master, and another set up as a slave (worker). Master: on which I run:...
Emendation asked 26/9, 2017 at 15:09
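Once the master and the worker are up, an application attaches to the standalone master through its spark:// URL; a minimal Scala sketch, assuming a hypothetical master address of 192.168.1.10 and the default port 7077:

    import org.apache.spark.{SparkConf, SparkContext}

    object StandaloneSmokeTest {
      def main(args: Array[String]): Unit = {
        // Hypothetical master address: the Windows machine where the master was started
        val conf = new SparkConf()
          .setAppName("standalone-smoke-test")
          .setMaster("spark://192.168.1.10:7077")
        val sc = new SparkContext(conf)
        // Trivial job to confirm the worker registered and accepts tasks
        println(sc.parallelize(1 to 100).sum())
        sc.stop()
      }
    }

If the job sits waiting, the worker has usually not registered with the master; the master web UI on port 8080 shows which workers are connected.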

4

Solved

I am new to Apache Spark, and I just learned that Spark supports three types of cluster managers: Standalone, meaning Spark will manage its own cluster; YARN, using Hadoop's YARN resource manager; Me...
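The cluster manager is chosen through the master URL the application (or spark-submit) is given; a rough spark-shell style sketch with placeholder host names:

    import org.apache.spark.SparkConf

    // The master URL selects the cluster manager:
    //   local[*]                 -> no external manager, everything in one JVM (testing)
    //   spark://master-host:7077 -> Spark's own standalone manager
    //   mesos://master-host:5050 -> Apache Mesos
    //   yarn                     -> Hadoop YARN (needs HADOOP_CONF_DIR / YARN_CONF_DIR)
    val conf = new SparkConf()
      .setAppName("which-cluster-manager")
      .setMaster("spark://master-host:7077") // placeholder host name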

1

I've been trying to create a new Spark session with a Livy 0.7 server that runs on Ubuntu 18.04. On that same machine I have a running Spark cluster with 2 workers, and I'm able to create a normal spa...
Heterodoxy asked 3/6, 2020 at 14:37

3

I'm running the Spark master with the following command: ./sbin/start-master.sh. After that I went to http://localhost:8080 and saw the following page. I was expecting to see the tab with Job...
Cupid asked 31/5, 2019 at 15:18

2

Solved

I am trying to install Spark 1.6.1 on Windows 10, and so far I have done the following... Downloaded Spark 1.6.1, unpacked it to some directory, and then set SPARK_HOME. Downloaded Scala 2.11.8, unpack...
Stalinist asked 18/5, 2016 at 16:14

0

What features of YARN make it better than Spark Standalone mode for a multi-tenant cluster running only Spark applications? Besides authentication, maybe. There are a lot of answers on Google, pretty...

3

Solved

UPDATE: The problem is resolved. The Docker image is here: docker-spark-submit. I run spark-submit with a fat jar inside a Docker container. My standalone Spark cluster runs on 3 virtual machines -...
Audiovisual asked 3/8, 2017 at 15:57
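A common snag in this kind of setup is that executors cannot connect back to a driver living inside a container. A hedged sketch of the driver-side settings usually involved; every address and port below is a placeholder, not taken from the question:

    import org.apache.spark.SparkConf

    // Sketch only: placeholder values for a driver that runs inside a Docker
    // container and must stay reachable from the executors on the VMs.
    val conf = new SparkConf()
      .setAppName("dockerized-driver")
      .setMaster("spark://spark-master:7077")   // standalone master on one of the VMs
      .set("spark.driver.host", "10.0.0.5")     // address the executors can actually reach
      .set("spark.driver.port", "35000")        // fixed so the port can be published by Docker
      .set("spark.blockManager.port", "35010")  // same reasoning for the block manager

Running the container with host networking is another way around the same reachability problem.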

2

Solved

Until now, I have only used Spark on a Hadoop cluster with YARN as the resource manager. In that type of cluster, I know exactly how many executors to run and how the resource management works. How...
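Standalone mode sizes applications differently from YARN: rather than asking for a number of executors, you bound cores and memory and the executor count follows. A sketch with placeholder values:

    import org.apache.spark.SparkConf

    // Sketch with placeholder numbers: the application is capped at 8 cores,
    // each executor gets 2 cores and 4g, so roughly 8 / 2 = 4 executors start
    // across the workers.
    val conf = new SparkConf()
      .setAppName("standalone-resources")
      .set("spark.executor.cores", "2")
      .set("spark.executor.memory", "4g")
      .set("spark.cores.max", "8")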

3

Solved

TL;DR: In a Spark Standalone cluster, what are the differences between client and cluster deploy modes? How do I set which mode my application is going to run in? We have a Spark Standalone clus...
Prophylactic asked 4/5, 2016 at 12:23
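The deploy mode is fixed at submission time, not inside the job; from the application's side it surfaces as the spark.submit.deployMode property. A minimal sketch, assuming the application is launched with spark-submit:

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch only: meant to be launched with spark-submit, which supplies the
    // master URL and the deploy mode; the application just reads the result.
    val sc = new SparkContext(new SparkConf().setAppName("deploy-mode-check"))
    // "client"  -> driver runs in the JVM that called spark-submit
    // "cluster" -> driver runs on one of the workers
    println(sc.getConf.get("spark.submit.deployMode", "client"))
    sc.stop()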

4

Solved

In Spark Standalone mode, there are master and worker nodes. Here are a few questions: Do 2 worker instances mean one worker node with 2 worker processes? Does every worker instance hold an executo...
Methane asked 11/7, 2014 at 11:34

0

I am running a Spark Streaming application on a cluster composed of three nodes, each one with a worker and three executors (so a total of 9 executors). I am using Spark standalone mode (versio...

0

The running spark streaming job, which is supposed to run continuously, exited abruptly with the following error (found in the executor logs): 2017-07-28 00:19:38,807 [SIGTERM handler] ERROR org.a...

4

Solved

Using a Spark (1.6.1) standalone master, I need to run multiple applications on the same Spark master. All applications submitted after the first one stay in the 'WAIT' state. I also observed the...
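By default the standalone scheduler hands the first application every available core (spark.deploy.defaultCores is unlimited), which is the usual reason later submissions sit waiting. One workaround is a per-application core cap; a hedged sketch with placeholder numbers:

    import org.apache.spark.SparkConf

    // Sketch: limit this application so other applications can be scheduled too.
    val conf = new SparkConf()
      .setAppName("well-behaved-app")
      .set("spark.cores.max", "4")         // per-application core cap
      .set("spark.executor.memory", "2g")  // leave memory for the other applications as well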

1

Solved

I have been developing in PySpark with Spark standalone in non-cluster mode. These days, I would like to explore more of Spark's cluster mode. I searched on the internet and found I may need a c...
Finery asked 8/6, 2017 at 13:49

2

Hello, I want to add the option "--deploy-mode cluster" to my Scala code: val sparkConf = new SparkConf().setMaster("spark://192.168.60.80:7077") Without using the shell (the comm...
Elvinelvina asked 11/5, 2017 at 12:28
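Setting a deploy mode on a SparkConf inside an already-running driver has no effect, since the driver's location was decided before that code ran. If the goal is to trigger a cluster-mode submission from Scala without the shell, SparkLauncher is one option; a sketch with a placeholder jar path and class name:

    import org.apache.spark.launcher.SparkLauncher

    // Sketch only: the jar path and main class are placeholders.
    val handle = new SparkLauncher()
      .setAppResource("/path/to/app-assembly.jar")
      .setMainClass("com.example.Main")
      .setMaster("spark://192.168.60.80:7077")
      .setDeployMode("cluster")
      .startApplication()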

1

Basically, the master node also acts as one of the slaves. Once the slave on the master completed, it called SparkContext to stop, and this command propagated to all the slaves, which stopped the execut...
Effy asked 6/12, 2016 at 10:29

2

Solved

Having read this question, I would like to ask additional questions: The Cluster Manager is a long-running service; on which node is it running? Is it possible that the Master and the Driver node...

1

Solved

I have a Spark Standalone (not YARN/Mesos) cluster and a driver app running (in client mode) which talks to that cluster to execute its tasks. However, if I shut down and restart the Spark master a...
Judaea asked 13/10, 2016 at 15:38

1

Solved

So I have a Spark standalone server with 16 cores and 64 GB of RAM. I have both the master and the worker running on the server. I don't have dynamic allocation enabled. I am on Spark 2.0. What I don't u...
Alston asked 8/9, 2016 at 19:56
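With one machine hosting both the master and the worker, it can help to print the configuration the application actually received; a hedged Spark 2.x sketch with placeholder executor sizes:

    import org.apache.spark.sql.SparkSession

    // Sketch with placeholder sizes for a single box running master and worker.
    val spark = SparkSession.builder()
      .appName("resource-check")
      .master("spark://localhost:7077")
      .config("spark.executor.cores", "4")
      .config("spark.executor.memory", "8g")
      .getOrCreate()

    // Print what the application actually ended up with
    spark.sparkContext.getConf.getAll.sorted.foreach(println)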

3

Does the driver need constant access to the master node? Or is it only required to get the initial resource allocation? What happens if the master is not available after the Spark context has been created? Do...
Roderic asked 5/3, 2016 at 14:38