Apache Spark - Error when creating a StreamingContext
I open the Spark shell:
spark-shell --packages org.apache.spark:spark-streaming-kafka_2.10:1.6.0
Then I want to create a streaming context:

import org.apache.spark._
import org.apache.spark.streaming._

val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount").set("spark.driver.allowMultipleContexts", "true")
val ssc = new StreamingContext(conf, Seconds(1))
When I run it, I get this exception:
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
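The message points at the cause: spark-shell has already started a SparkContext, and only one may run per JVM. If you really want to build the StreamingContext from your own SparkConf, one option is to stop the shell's context first (a sketch, assuming you do not need the shell's sc afterwards); the simpler fix is described below.

// Sketch: stop the SparkContext the shell created before building your own.
// Only do this if you do not plan to reuse the shell's sc afterwards.
import org.apache.spark._
import org.apache.spark.streaming._

sc.stop()

val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
val ssc = new StreamingContext(conf, Seconds(1))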
When you open spark-shell, a SparkContext is already created. It is called sc, which means you do not need to create a configuration object yourself. Just use the existing sc object:
val ssc = new StreamingContext(sc, Seconds(1))
Also, use val instead of var.
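For completeness, here is a minimal sketch of the whole flow in spark-shell, reusing the shell's sc. The socket source localhost:9999 is just an assumed example, in the spirit of the NetworkWordCount app name above.

import org.apache.spark.streaming._

// Wrap the shell's existing SparkContext instead of building a new one.
val ssc = new StreamingContext(sc, Seconds(1))

// Assumed example source: text lines arriving on a local socket.
val lines = ssc.socketTextStream("localhost", 9999)
val counts = lines.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
counts.print()

ssc.start()             // start the streaming computation
ssc.awaitTermination()  // block until the stream is stopped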