Is it possible to check the version of Databricks Runtime in Azure?
    3 Answers
        In Scala:
dbutils.notebook.getContext.tags("sparkVersion")
In Python:
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
Either of these returns the Databricks Runtime and Scala version, e.g. 5.0.x-scala2.11. (In a Databricks notebook the spark session already exists, so getOrCreate() simply returns it.)
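If only the numeric runtime portion is wanted (the comment below asks for 5.0 rather than 5.0.x-scala2.11), a minimal Python sketch, assuming the tag keeps the usual major.minor.x-scala shape; the helper name is my own, not part of any API:

```python
def runtime_version(tag: str) -> str:
    # "5.0.x-scala2.11" -> drop the Scala suffix, keep major.minor only.
    return ".".join(tag.split("-")[0].split(".")[:2])

print(runtime_version("5.0.x-scala2.11"))  # → 5.0
```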
 
    
    
– Hauke Mallow
- Is it possible to get just the Databricks Runtime version, i.e. just "5.0" instead of 5.0.x-scala2.11? – harsha87 Feb 19 '21 at 22:24
            Databricks Runtime is the set of core components that run on the clusters managed by Azure Databricks. It includes Apache Spark but also adds a number of components and updates that substantially improve the usability, performance, and security of big data analytics.
You can choose from among many supported runtime versions when you create a cluster.
If you want to know the version of Databricks runtime in Azure after creation:
Go to the Azure Databricks portal => Clusters => Interactive Clusters => the runtime version is shown there.
For more details, refer to "Azure Databricks Runtime versions".
Hope this helps.
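Outside the portal, the same information can be read programmatically. A hedged sketch using the Databricks Clusters API (GET /api/2.0/clusters/get), whose JSON response carries a spark_version field such as 5.0.x-scala2.11; the host, token, and cluster id are placeholder assumptions supplied via environment variables here:

```python
import json
import os
import urllib.request

def get_runtime_version(host: str, token: str, cluster_id: str) -> str:
    # Query the Clusters API for one cluster and return its runtime tag.
    req = urllib.request.Request(
        f"{host}/api/2.0/clusters/get?cluster_id={cluster_id}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["spark_version"]

if __name__ == "__main__":
    # Assumed environment variables, e.g.
    # DATABRICKS_HOST=https://adb-<workspace-id>.azuredatabricks.net
    print(get_runtime_version(
        os.environ["DATABRICKS_HOST"],
        os.environ["DATABRICKS_TOKEN"],
        os.environ["DATABRICKS_CLUSTER_ID"],
    ))
```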
 
    
    
– CHEEKATLAPRADEEP
            
            
print(spark.version)
worked for me
 
    
    
– Eugene Lycenok
- Sorry, this is not the runtime version, but it helped me at the time. I didn't know reputation decreases after you remove an answer :) – Eugene Lycenok Mar 26 '21 at 07:40
 
    
