Add dependency to remote Spark on IntelliJ


I am running Spark remotely from IntelliJ and am facing difficulties while adding a dependency to the Spark conf.

val conf = new SparkConf()
  .setMaster("spark://ip:7077")
  .set("packages", "com.databricks:spark-avro_2.10:2.0.1:jar")
  .setAppName("localtrial")

Error:

16/02/23 12:27:10 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 172.16.248.156): java.lang.ClassNotFoundException: com.databricks.spark.avro.AvroRelation$$anonfun$buildScan$1$$anonfun$3
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)

I have tried the setJars property of the conf class. Any help is appreciated.
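For context, a minimal sketch of how setJars is typically used with a standalone master; the jar path below is a hypothetical placeholder, not taken from the question:

import org.apache.spark.SparkConf

// setJars ships the listed jars to the executors so their classes
// can be loaded at runtime; the path here is hypothetical.
val conf = new SparkConf()
  .setMaster("spark://ip:7077")
  .setAppName("localtrial")
  .setJars(Seq("/path/to/localtrial-assembly-0.1.jar"))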

You need to add the dependency to your build.sbt file so that IntelliJ can compile against it. If you add it as an argument to spark-submit, you can set the dependency to provided; otherwise you need to package it inside your jar file using the sbt-assembly, or similar, plugin.
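For the packaging route, a rough sketch of enabling sbt-assembly (the plugin version shown is illustrative, not from the original answer):

// project/plugins.sbt -- enables the sbt-assembly plugin (version is illustrative)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

Running sbt assembly then produces a fat jar under target/ that bundles spark-avro. Note that sbt-assembly excludes "provided" dependencies by default, so for this route the dependency would be left at compile scope rather than marked provided.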

Add the following line to your build.sbt file and, if auto-import is enabled, IntelliJ will download the dependency. If auto-import isn't enabled, close the project and import it again, or use the refresh button inside the SBT tool window.

libraryDependencies += "com.databricks" %% "spark-avro" % "2.0.1" % "provided" 
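Once the dependency resolves, a minimal usage sketch with spark-avro, assuming an SQLContext named sqlContext is already in scope and using a hypothetical input path:

import com.databricks.spark.avro._

// Read an Avro file into a DataFrame via the implicit .avro reader
// added by the spark-avro import; the path is hypothetical.
val df = sqlContext.read.avro("/data/episodes.avro")
df.printSchema()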
