Context
The Spark reader has a `format` function that is used to specify the data source type, for example JSON, CSV, or a third-party source such as `com.databricks.spark.redshift`.
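For illustration, this is how the `format` call is typically used (a minimal sketch; the Redshift connection options are placeholders, not values from a real setup):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("demo").getOrCreate()

// Built-in source, resolved by its short name:
val csvDf = spark.read.format("csv").load("/tmp/data.csv")

// Third-party source, resolved by its fully qualified provider name.
// This only works if the matching connector JAR is on the classpath.
val redshiftDf = spark.read
  .format("com.databricks.spark.redshift")
  .option("url", "jdbc:redshift://host:5439/db") // placeholder
  .option("dbtable", "my_table")                 // placeholder
  .option("tempdir", "s3a://bucket/tmp")         // placeholder
  .load()
```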
Help
How can I check whether a third-party format exists or not? Let me give a case:
- In local Spark, to connect to Redshift, there are two open-source libs available:
  1. `com.databricks.spark.redshift`
  2. `io.github.spark_redshift_community.spark.redshift`

How can I determine which of these libs the user has put on the classpath? (See the sketch below.)
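To make the ambiguity concrete, these are the two provider names involved; which one resolves at read time depends entirely on which JAR the user added (`candidates` is just a name I chose for this sketch):

```scala
// Candidate provider names; at most one of them will be on the classpath.
val candidates = Seq(
  "com.databricks.spark.redshift",                    // original Databricks connector
  "io.github.spark_redshift_community.spark.redshift" // community fork
)

// Passing the wrong one to spark.read.format(...).load() fails at
// runtime with a "Failed to find data source" error.
```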
What I tried
- `Class.forName("com.databricks.spark.redshift")` did not work; I assume this is because the format string is a package name rather than a loadable class.
- I tried to check the Spark source code to see how it throws the error for a missing data source (here is the line), but unfortunately `Utils` is not available publicly.
- Instead of targeting the format option, I tried to target the JAR file itself, by inspecting `System.getProperty("java.class.path")`.
- I also wrapped `spark.read.format("..").load()` in a try/catch (see the sketch below).
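Roughly, those last two attempts look like this (a sketch; the JAR-name substring and the format string are placeholders I picked, and both checks are clearly fragile):

```scala
import org.apache.spark.sql.SparkSession
import scala.util.Try

val spark = SparkSession.builder().master("local[*]").appName("probe").getOrCreate()

// Attempt A: scan the JVM classpath for a plausible JAR name.
// Fragile: the JAR may be named differently, or may be added via
// --jars / --packages and never show up in java.class.path.
val jarOnClasspath = System.getProperty("java.class.path")
  .split(java.io.File.pathSeparator)
  .exists(_.contains("spark-redshift")) // placeholder substring

// Attempt B: trigger a real read and catch the failure.
// Fragile: load() can fail for many unrelated reasons (missing options,
// bad credentials), not only because the data source is absent.
val formatLoads = Try {
  spark.read.format("com.databricks.spark.redshift").load()
}.isSuccess
```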
I am looking for a proper and reliable solution.
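For reference, this is my paraphrase of what Spark appears to do internally in `DataSource.lookupDataSource` (which, like `Utils`, is not public API): it consults the `ServiceLoader` for registered `DataSourceRegister` short names, then falls back to loading the provider class directly and with a `.DefaultSource` suffix. A sketch using only public classes (`formatSeemsAvailable` is my own name):

```scala
import java.util.ServiceLoader
import org.apache.spark.sql.sources.DataSourceRegister
import scala.jdk.CollectionConverters._ // Scala 2.13; use scala.collection.JavaConverters on 2.12
import scala.util.Try

// Mimics Spark's lookup order: registered short names first,
// then the provider class itself, then "<provider>.DefaultSource".
def formatSeemsAvailable(provider: String): Boolean = {
  val loader = Thread.currentThread().getContextClassLoader
  val registered = ServiceLoader.load(classOf[DataSourceRegister], loader)
    .asScala
    .exists(_.shortName().equalsIgnoreCase(provider))
  registered || Seq(provider, provider + ".DefaultSource").exists { name =>
    Try(loader.loadClass(name)).isSuccess
  }
}
```

But this leans on internal conventions (the `.DefaultSource` suffix in particular), which is exactly why I am unsure it is reliable.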