I have a mixed Java/Scala project. There are Quartz jobs implemented in Java that use some Scala classes. These classes should all share the same SparkContext instance, so I implemented what is supposed to be a singleton; it looks like this:
import org.apache.spark.{SparkConf, SparkContext}

object SparkContextLoader {
    // Flag and holder for the lazily created context.
    var hasSC = false
    var sc: Any = 0

    def getSC(workers: Int): SparkContext = {
        if (!hasSC) {
            // Create the context only on the first call, then reuse it.
            val sparkConf = new SparkConf().setMaster("local[" + workers + "]").setAppName("SparkApp")
            sc = new SparkContext(sparkConf)
            hasSC = true
        }
        sc.asInstanceOf[SparkContext]
    }
}
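Both jobs effectively do the equivalent of the following (a hypothetical sketch; the worker count of 2 is illustrative):

val sc1 = SparkContextLoader.getSC(2)
val sc2 = SparkContextLoader.getSC(2)
// Since the object should be a singleton, I expect both calls
// to return the same instance:
println(sc1 eq sc2) // expected: true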
Calling SparkContextLoader from two different jobs always creates a new SparkContext instance, which Spark does not allow: only one SparkContext may be active per JVM.
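My understanding is that a Scala object is a per-classloader singleton, so even a simpler form using a lazy val should behave the same way (a minimal sketch, not my production code; the fixed worker count is illustrative):

import org.apache.spark.{SparkConf, SparkContext}

object LazySparkContextLoader {
    // Illustrative worker count; in my real code it is passed in.
    private val workers = 2

    // A lazy val is initialized at most once per classloader,
    // and its initialization is thread-safe on the JVM.
    lazy val sc: SparkContext = {
        val sparkConf = new SparkConf().setMaster("local[" + workers + "]").setAppName("SparkApp")
        new SparkContext(sparkConf)
    }
}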
Why doesn't the Scala object behave like a singleton?