StaticSQLConf — Cross-Session, Immutable and Static SQL Configuration

Table 1. StaticSQLConf’s Configuration Properties

spark.sql.catalogImplementation

(internal) Selects in-memory (default) or hive as the implementation of BaseSessionStateBuilder and ExternalCatalog

spark.sql.debug

(internal) Only used for internal debugging when HiveExternalCatalog is requested to restoreTableMetadata.

Default: false

Not all functions are supported when enabled.


spark.sql.extensions

Name of the SQL extension configuration class used to configure SparkSession extensions (when Builder is requested to get or create a SparkSession). The class should implement Function1[SparkSessionExtensions, Unit] and must have a no-args constructor.

Default: (empty)
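As a sketch, a class usable with spark.sql.extensions could look as follows. The MyExtensions and NoopRule names are made up for illustration; the injected resolution rule simply returns the plan unchanged.

```scala
import org.apache.spark.sql.{SparkSession, SparkSessionExtensions}
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.Rule

// Hypothetical no-op analyzer rule, injected for illustration only
case class NoopRule(spark: SparkSession) extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = plan
}

// Implements Function1[SparkSessionExtensions, Unit] and has a no-args
// constructor, as spark.sql.extensions requires
class MyExtensions extends (SparkSessionExtensions => Unit) {
  override def apply(extensions: SparkSessionExtensions): Unit =
    extensions.injectResolutionRule(NoopRule)
}

// Being a static property, it has to be set before the first SparkSession
// is created
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.extensions", classOf[MyExtensions].getName)
  .getOrCreate()
```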


spark.sql.filesourceTableRelationCacheSize

(internal) The maximum size of the cache that maps qualified table names to table relation plans. Must not be negative.

Default: 1000


spark.sql.globalTempDatabase

(internal) Name of the Spark-owned internal database of global temporary views

Default: global_temp

Used exclusively to create a GlobalTempViewManager when SharedState is first requested for one.

The name of the internal database cannot conflict with the names of any database that is already available in ExternalCatalog.
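For illustration, a global temporary view is registered in (and resolved against) that internal database; the view name ids below is made up:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// The database name comes from spark.sql.globalTempDatabase
// (global_temp by default); static properties can always be read
val globalTempDb = spark.conf.get("spark.sql.globalTempDatabase")

// Global temporary views are resolved against the internal database
spark.range(3).createGlobalTempView("ids")
val ids = spark.table(s"$globalTempDb.ids")
ids.show()
```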


spark.sql.hive.thriftServer.singleSession

When enabled (true), the Hive Thrift server runs in single-session mode: all JDBC/ODBC connections share the temporary views, function registries, SQL configuration and the current database.

Default: false


spark.sql.queryExecutionListeners

List of class names that implement QueryExecutionListener and will be automatically registered with new SparkSessions.

Default: (empty)

The classes should have either a no-arg constructor, or a constructor that expects a SparkConf argument.
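A minimal sketch of such a listener (the LoggingListener name is hypothetical). For the demo it is registered programmatically via listenerManager; in practice the class name would go into spark.sql.queryExecutionListeners so that every new SparkSession picks it up.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.QueryExecution
import org.apache.spark.sql.util.QueryExecutionListener

// Hypothetical listener; the no-arg constructor satisfies the
// spark.sql.queryExecutionListeners contract
class LoggingListener extends QueryExecutionListener {
  override def onSuccess(funcName: String, qe: QueryExecution, durationNs: Long): Unit =
    println(s"$funcName succeeded in ${durationNs / 1e6} ms")
  override def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit =
    println(s"$funcName failed: ${exception.getMessage}")
}

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// Programmatic registration for the demo; the static property would do this
// automatically for every new SparkSession
spark.listenerManager.register(new LoggingListener)
spark.range(5).count()
```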


spark.sql.sources.schemaStringLengthThreshold

(internal) The maximum length allowed in a single cell when storing additional schema information in Hive’s metastore

Default: 4000


spark.sql.ui.retainedExecutions

Number of executions to retain in the Spark UI.

Default: 1000


spark.sql.warehouse.dir

Directory of a Spark warehouse

Default: spark-warehouse
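Being a static property, spark.sql.warehouse.dir has to be set before the first SparkSession is created, e.g. on the builder (the path below is just an example):

```scala
import org.apache.spark.sql.SparkSession

// Static property: set it on the builder before the first SparkSession
// is created; afterwards it can only be read
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.warehouse.dir", "/tmp/example-warehouse")
  .getOrCreate()

// Spark resolves the directory to a fully-qualified path
println(spark.conf.get("spark.sql.warehouse.dir"))
```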

The properties in StaticSQLConf can only be queried and can never be changed once the first SparkSession is created.

import org.apache.spark.sql.internal.StaticSQLConf
scala> val metastoreName = spark.conf.get(StaticSQLConf.CATALOG_IMPLEMENTATION.key)
metastoreName: String = hive

scala> spark.conf.set(StaticSQLConf.CATALOG_IMPLEMENTATION.key, "hive")
org.apache.spark.sql.AnalysisException: Cannot modify the value of a static config: spark.sql.catalogImplementation;
  at org.apache.spark.sql.RuntimeConfig.requireNonStaticConf(RuntimeConfig.scala:144)
  at org.apache.spark.sql.RuntimeConfig.set(RuntimeConfig.scala:41)
  ... 50 elided
