Spark Setup with Scala and Run in IntelliJ


Scala is one of the main languages used to write Spark jobs and control data with them. As we know, Spark runs on top of Apache Hadoop and works with RDDs, which we will learn about in upcoming articles.

Writing Spark functions and jobs in the Scala language is commonly called Scala Spark, just as writing Spark code in Python is known as PySpark.


Now let us see how to set up Scala Spark using IntelliJ IDEA Community edition by following the steps below.

Steps for Spark Setup with Scala and Run in IntelliJ

Table of Contents

Step 1: Download and install Java JDK 8 and IntelliJ IDEA Community edition, and add the Hadoop path so HDFS can run on your operating system; Spark itself ships as a downloadable archive. For full installation details, please go through our Spark installation guide.
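On Windows, pointing Spark at a Hadoop installation usually means setting HADOOP_HOME and JAVA_HOME and adding their bin folders to PATH. A minimal sketch from a Command Prompt; the paths below are placeholders, adjust them to where you actually extracted Hadoop and installed the JDK:

```shell
:: Assumed install locations -- replace with your own paths
setx HADOOP_HOME "C:\hadoop"
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_281"
setx PATH "%PATH%;%HADOOP_HOME%\bin;%JAVA_HOME%\bin"
```

Open a new terminal after running these so the updated variables take effect.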

Step 2: After installation, open IntelliJ and complete the basic first-run setup.

Step 3: Now go to the Plugins menu on the welcome page as shown below.

Screenshot: the Plugins option on the IntelliJ welcome screen.

Step 4: Inside the Plugins menu there are two tabs, Marketplace and Installed. In the Marketplace tab, search for the Scala plugin and install it.

Screenshot: installing the Scala plugin from the Marketplace.
Note: You can also install the Maven plugin if you want Maven framework support.

Step 5: Now let us create a project for Scala Spark; go to Projects in the welcome menu.


Step 6: Select New Project in the top right corner, then select the Scala sbt project type as shown below, and click Next.

Screenshot: selecting the Scala sbt project type.

Step 7: Select a Scala version that is compatible with your Spark version (the Spark 3.1.2 dependencies used below are built against Scala 2.12), and click Finish.

Wait until all libraries get successfully indexed.


Step 8: Next, check whether the project SDK is properly set to Java JDK 8. Go to File -> Project Structure (on Windows, press Ctrl + Alt + Shift + S), navigate to the Project tab, select Java 8 (1.8.0_281) in the SDK options as shown below, click Apply, and wait for indexing.

Step 9: If an error occurs during dependency extraction, press the tiny hammer icon in the top menu bar to rebuild the project.


Step 10: We are done with the setup; let us write a sample program that creates a Scala Spark session.

Example Code for Spark Setup with Scala and Run in IntelliJ

First, add these dependencies to the build.sbt file as shown below:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.1.2" % "provided",
  "org.apache.spark" %% "spark-sql" % "3.1.2",
  "org.apache.spark" %% "spark-mllib" % "3.1.2",
  "org.apache.spark" %% "spark-streaming" % "3.1.2"
)

After this, press the tiny sbt reload icon to download the libraries added in the build.sbt file.


Inside the project, right-click on src -> main -> scala and create a Scala object named sparkStreaming.


Example Code:

import org.apache.spark.sql.types.{DataTypes, StructType}
import org.apache.spark.sql.{Row, SparkSession}

object sparkStreaming {

  def main(args: Array[String]): Unit = {

    // Create (or reuse) a local Spark session
    val spark = SparkSession
      .builder()
      .master("local[*]")
      .appName("spark Streaming")
      .getOrCreate()

    // Sample data as Rows
    val myStructureData = Seq(
      Row("Solutiongigs", "Example code")
    )

    // Schema describing the two string columns
    val myStructureSchema = new StructType()
      .add("Name", DataTypes.StringType)
      .add("Description", DataTypes.StringType)

    // Build a DataFrame from the data and the schema
    val df = spark.createDataFrame(
      spark.sparkContext.parallelize(myStructureData),
      myStructureSchema
    )

    df.show(false)

    spark.stop()
  }
}
Output of the sample code: a DataFrame with the Name and Description columns and a single row.
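You can run the program with the green Run arrow in IntelliJ, or from a terminal in the project root. A sketch, assuming sbt is on your PATH and the main object is named sparkStreaming as above:

```shell
# Compile the project and run the main object through sbt
sbt "runMain sparkStreaming"
```

The first run will take a while as sbt resolves the Spark dependencies.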


In this article, we discussed Spark Setup with Scala and Run in IntelliJ: how to set up Spark in IntelliJ with a sample Spark session example. Stay tuned for more articles like this, and if you have any queries please comment down below. Thank you, and happy learning!


