Suppose I run a PySpark job using a Dataproc workflow template and an ephemeral cluster... How can I get the name of the created cluster from inside my PySpark job?
1 Answer
One way would be to shell out from the job and run this command:
/usr/share/google/get_metadata_value attributes/dataproc-cluster-name
The only output will be the cluster name, with no trailing newline or anything else to clean up. See Running shell command and capturing the output.
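A minimal sketch of how that could look inside the PySpark job, using Python's standard subprocess module (the helper function and variable names are illustrative, and this assumes the job is running on a Dataproc node where the metadata script exists):

import subprocess

def get_cluster_name():
    # Run the Dataproc metadata helper; it prints the cluster name
    # with no trailing newline, so the output can be used as-is.
    return subprocess.check_output(
        ["/usr/share/google/get_metadata_value",
         "attributes/dataproc-cluster-name"],
        text=True,
    )

cluster_name = get_cluster_name()
print(cluster_name)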
answered by tix