How to run sbt-assembly tasks from within IntelliJ IDEA?

Is it possible to run sbt-assembly from within IntelliJ IDEA?

Also, I read in the docs that one can add tasks within the SBT tool window, but as far as I can see it only lets you view your project, not tasks — I cannot add any tasks there. How does the tool window work, exactly?

I have the latest version of IntelliJ IDEA.

Subadar answered 3/8, 2014 at 23:17 Comment(0)

The older answers here are now out of date: IntelliJ now lets you create a run configuration for an SBT task. To create the run configuration:

  • choose "Edit Configurations" from the "Run" menu (or the toolbar popup)
  • click the "+" button to add a configuration and select "SBT Task" as the configuration type
  • fill out the details, such as the name of the task and, if necessary, the working directory

You can now run the task the same way as any other run configuration: select it in the run-configuration popup in the toolbar and click the run button (or, if you prefer the keyboard, press Shift-Ctrl-R and pick the task from the popup that appears).

Official documentation: https://www.jetbrains.com/help/idea/2016.2/run-debug-configuration-sbt-task.html
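As a sketch of what goes where (the plugin version below is an illustrative assumption; check the sbt-assembly README for the current one), the value you enter in the configuration's "Tasks" field is simply `assembly`, provided the plugin is enabled in your build:

```scala
// project/plugins.sbt — enables sbt-assembly (version is illustrative)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")
```

You can also chain tasks in the "Tasks" field, e.g. `;clean;assembly` to force a clean build first.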

Began answered 9/9, 2016 at 9:39 Comment(0)

You may find the SBT plugin useful for your needs. With it, you can execute any task or command available in your build, so the sbt-assembly ones should work too.

The plugin gives you an SBT Console in which you start an sbt shell as if you were running it on the command line, but with a more IDEA-like environment around the interactive console.
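In that console you run tasks exactly as you would at a terminal sbt prompt; for example, `assembly` builds the fat jar, and `show assembly / assemblyOutputPath` prints where it will be written (slash syntax is for sbt 1.x; older builds use `assemblyOutputPath in assembly`):

```
> assembly
> show assembly / assemblyOutputPath
```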


Manuelmanuela answered 4/8, 2014 at 18:25 Comment(2)
Console is cool, but what about UI support? For example, with the Maven plugin I can just select a task for the build and press Ctrl+Shift+F10 to make a build. Is there something similar for sbt? – Pedo
I would also like to see UI support for SBT tasks. There is an "SBT Tasks" pane that takes up a lot of real estate when open, and the only thing that discernibly does anything is the "Refresh all SBT projects" button. Love the plugin, but would like to be able to double-click a command in the list and have it execute. – Darla

TL;DR

  • Create a new IntelliJ run configuration > add a "JAR Application"
  • Fill in the "Configuration" section as needed; in particular, "Path to JAR" should be the file path where the fat jar will be generated
  • At the bottom, under "Before launch", create a "Run External Tool" entry that calls the sbt assembly command-line tool (see step 13 below for details)

Spark + fat jar setup

I found the "SBT Task" configuration useless for getting sbt-assembly to work within IntelliJ, contrary to what others suggest above. Instead, I had to create a custom configuration that calls sbt assembly on the command line.

My use case was specifically Scala 2.12.18 + Apache Spark v3.2.1 running in standalone cluster mode (not YARN or Mesos) on my local network, with uber/fat jars, in IntelliJ 2023.3.3 Ultimate for macOS.

The setup is a bit convoluted, but the result is that you can simply hit the play/run button and it will compile your code into a fat jar and deploy it to the Spark cluster using spark-submit, all with one click.

As an alternative, as others have suggested elsewhere, you can just keep running sbt assembly and spark-submit on the command line manually after each code change, but IMO this gets tedious after a while.
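Before the step-by-step, a sketch of the sbt side of this setup (the versions and merge strategy below are illustrative assumptions; adapt them to your build). Spark dependencies are typically marked "provided" so the cluster's own copy is used at runtime rather than bloating the fat jar:

```scala
// build.sbt — minimal sketch for a Spark fat jar
// (assumes sbt-assembly is enabled in project/plugins.sbt)
scalaVersion := "2.12.18"

libraryDependencies ++= Seq(
  // "provided": the cluster supplies Spark at runtime, keep it out of the jar
  "org.apache.spark" %% "spark-core" % "3.2.1" % "provided",
  "org.apache.spark" %% "spark-sql"  % "3.2.1" % "provided"
)

// Discard duplicate META-INF entries that commonly break `assembly`
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", _*) => MergeStrategy.discard
  case _                        => MergeStrategy.first
}
```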

  1. Open IntelliJ > Preferences > (left pane) Plugins > find the Scala and Spark plugins and install them, then restart IntelliJ
  2. File > New > Project... > in the left "Generators" pane pick Spark, then set name=mysparkapp, Type=Batch, Language=Scala, build system=SBT, JDK=1.8, Scala=2.12.18, sbt=1.9.6
  3. Let the generator run. Once it has finished, you'll need to create all the .sbt etc. config files to make sbt assembly work within your project. There is a good write-up here. After you're done with this, open the stubbed-out "SparkPi" Scala file.
  4. In this file, you'll notice a little Spark icon (star) in the gutter (left side where you put break points), click it and choose "create spark submit configuration"
  5. At the top, under the "Remote target" dropdown, click and choose "Add custom spark cluster". Run through this wizard, setting up an SSH connection to transfer files. We're not really using this piece because we're deploying fat jars to the cluster, but it still seems to require you to set it up.
  6. In the application text box, paste the path to where the uber jar will be generated, for example /Users/me/sparkstuff/mysparkapp/target/scala-2.12/mysparkapp-assembly-1.0.jar
  7. In the class text box, put the class you want to run (the one with the def main()), like com.example.mysparkapp.SparkPi
  8. Click "Additional customization" > check the "Spark Configuration", "Dependencies", "Driver" and "Executor" sections. This just reveals sections for filling in config settings that are passed to spark-submit
  9. Under Spark Configuration > cluster manager=standalone, deploy mode=client (so you can see output when your app runs), Spark home=path to your Spark install folder
  10. Under the "Spark Debug" section, I unchecked "start spark driver with debug agent"
  11. Fill in Dependencies, Driver and Executor sections as needed (or leave them blank)
  12. At the bottom in the "Result Submit Command" you can click the expand arrows and see the resulting command line call to invoke spark-submit.
  13. Now the sbt assembly bit: at the very bottom in the "Before Launch" list, delete everything.
  14. click the + and choose "run external tool" and + again
  15. Name it "sbt assembly" and, under Tool Settings, set Program=, Arguments=assembly, Working directory=$ProjectFileDir$, and check all the boxes at the bottom ("make console active..." etc.) > click OK > OK
  16. Then click the Apply button at the bottom of the "Run/Debug Configurations" window and the OK button to close it. Note it may re-add the "upload files through SFTP" bit, but you can ignore it. Again, referring to step 5 above, this SSH part must be filled out for some reason even when it's not needed.
  17. Now you should be able to click the top-right play button (green triangle) and it should compile your code into a fat jar (via sbt assembly) and launch it on the Spark cluster. All the activity should show in the Run tool window.
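For reference, the command line that step 12's "Result Submit Command" expands to looks roughly like the following (the master URL is an illustrative assumption; the jar path matches the example from step 6):

```
$SPARK_HOME/bin/spark-submit \
  --master spark://192.168.1.42:7077 \
  --deploy-mode client \
  --class com.example.mysparkapp.SparkPi \
  /Users/me/sparkstuff/mysparkapp/target/scala-2.12/mysparkapp-assembly-1.0.jar
```

Because the "Before launch" entry runs sbt assembly first, the jar at that path is freshly rebuilt on every run.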
Debonair answered 9/2 at 21:44 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.