The following spark-shell session shows that a static configuration property can be read, but never changed, at runtime:

scala> import org.apache.spark.sql.internal.StaticSQLConf
import org.apache.spark.sql.internal.StaticSQLConf

scala> val metastoreName = spark.conf.get(StaticSQLConf.CATALOG_IMPLEMENTATION.key)
metastoreName: String = hive

scala> spark.conf.set(StaticSQLConf.CATALOG_IMPLEMENTATION.key, "hive")
org.apache.spark.sql.AnalysisException: Cannot modify the value of a static config: spark.sql.catalogImplementation;
  at org.apache.spark.sql.RuntimeConfig.requireNonStaticConf(RuntimeConfig.scala:144)
  at org.apache.spark.sql.RuntimeConfig.set(RuntimeConfig.scala:41)
  ... 50 elided
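Static properties can only be supplied before the first SparkSession is created, for example through SparkSession.Builder.config (or --conf on spark-submit). A minimal sketch, assuming a local master; the app name and the chosen database name are illustrative:

```scala
import org.apache.spark.sql.SparkSession

// Static SQL configs must be supplied before the first SparkSession exists;
// afterwards spark.conf.set(...) fails with AnalysisException (as shown above).
val spark = SparkSession.builder()
  .master("local[*]")                                       // illustrative
  .appName("static-conf-demo")                              // illustrative
  .config("spark.sql.globalTempDatabase", "my_global_temp") // a static config
  .getOrCreate()

// Reading a static config at runtime is always allowed:
println(spark.conf.get("spark.sql.globalTempDatabase"))
```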
StaticSQLConf — Cross-Session, Immutable and Static SQL Configuration

StaticSQLConf holds the cross-session, immutable and static SQL configuration properties.
Configuration Properties:

spark.sql.catalogImplementation

(internal) Selects the catalog implementation: in-memory or hive. Builder.enableHiveSupport is used to enable Hive support for a SparkSession (it sets this property to hive).

Default: in-memory

spark.sql.debug

(internal) Only used for internal debugging. Not all functions are supported when enabled.

Default: false

spark.sql.extensions

Name of the SQL extension configuration class that is used to configure SparkSession extensions.

Default: (empty)

spark.sql.globalTempDatabase

(internal) Name of the Spark-owned internal database of global temporary views. Used exclusively to create a GlobalTempViewManager.

Default: global_temp

spark.sql.queryExecutionListeners

List of class names that implement QueryExecutionListener that will be automatically registered to new SparkSessions. The classes should have either a no-arg constructor, or a constructor that expects a SparkConf.

Default: (empty)
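As an illustration of the constructor requirement for spark.sql.queryExecutionListeners, a minimal listener could look like the following sketch (the class name and the printed messages are made up; only the QueryExecutionListener contract itself comes from Spark):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.execution.QueryExecution
import org.apache.spark.sql.util.QueryExecutionListener

// Hypothetical listener; the one-arg SparkConf constructor is one of the two
// supported shapes (the other being a no-arg constructor).
class AuditListener(conf: SparkConf) extends QueryExecutionListener {
  override def onSuccess(funcName: String, qe: QueryExecution, durationNs: Long): Unit =
    println(s"$funcName succeeded in ${durationNs / 1000000} ms")
  override def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit =
    println(s"$funcName failed: ${exception.getMessage}")
}
```

It would then be registered by its fully-qualified class name via spark.sql.queryExecutionListeners before the first SparkSession is created.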
The properties in StaticSQLConf can only be queried and can never be changed once the first SparkSession is created.
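Rather than catching the AnalysisException, code can check up front whether a property is static: RuntimeConfig.isModifiable (available since Spark 2.4) returns false for static configs. A short sketch, assuming a local session:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// isModifiable reports false for static configs and true for runtime ones:
println(spark.conf.isModifiable("spark.sql.catalogImplementation")) // false
println(spark.conf.isModifiable("spark.sql.shuffle.partitions"))   // true
```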