StaticSQLConf — Cross-Session, Immutable and Static SQL Configuration

Table 1. StaticSQLConf’s Configuration Properties

Each entry lists the property name, its default value, the corresponding Scala value in StaticSQLConf, and a description.

spark.sql.catalogImplementation

Default: in-memory
Scala value: CATALOG_IMPLEMENTATION

Selects the active catalog implementation from the available ExternalCatalogs: in-memory (the default InMemoryCatalog) or hive (HiveExternalCatalog).

Note
Use Builder.enableHiveSupport to enable Hive support (which sets the spark.sql.catalogImplementation configuration property to hive when the Hive classes are available).

Used when:

  1. SparkSession is requested for the SessionState

  2. SharedState is requested for the ExternalCatalog

  3. SetCommand command is executed (with hive keys)

  4. SparkSession is created with Hive support
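As a sketch of the last case (the app name and local master are illustrative assumptions), enabling Hive support through the builder is what flips this property to hive:

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: enableHiveSupport sets spark.sql.catalogImplementation
// to "hive" (assuming the Hive classes are on the classpath);
// without it the default "in-memory" catalog is used.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("catalog-demo")
  .enableHiveSupport()
  .getOrCreate()

// The static property can still be queried (but not changed) at runtime
println(spark.conf.get("spark.sql.catalogImplementation"))
```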

spark.sql.debug

Default: false
Scala value: DEBUG_MODE

(internal) Only used for internal debugging when HiveExternalCatalog is requested to restoreTableMetadata.

Not all functions are supported when enabled.

spark.sql.extensions

Default: (empty)
Scala value: SPARK_SESSION_EXTENSIONS

Name of the SQL extension configuration class used to configure SparkSession extensions (when Builder is requested to get or create a SparkSession). The class must implement Function1[SparkSessionExtensions, Unit] and have a no-argument constructor.
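A minimal sketch of such a configuration class (the class name MyExtensions and the injected no-op optimizer rule are illustrative assumptions, not part of the original text):

```scala
import org.apache.spark.sql.{SparkSession, SparkSessionExtensions}
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.Rule

// Hypothetical configurator: implements Function1[SparkSessionExtensions, Unit]
// and has a no-argument constructor, as spark.sql.extensions requires.
class MyExtensions extends (SparkSessionExtensions => Unit) {
  override def apply(extensions: SparkSessionExtensions): Unit = {
    // Inject a do-nothing optimizer rule (for illustration only)
    extensions.injectOptimizerRule { session =>
      new Rule[LogicalPlan] {
        override def apply(plan: LogicalPlan): LogicalPlan = plan
      }
    }
  }
}

// Registered by class name before the first SparkSession is created
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.extensions", "MyExtensions")
  .getOrCreate()
```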

spark.sql.filesourceTableRelationCacheSize

Default: 1000
Scala value: FILESOURCE_TABLE_RELATION_CACHE_SIZE

(internal) The maximum size of the cache that maps qualified table names to table relation plans. Must not be negative.

spark.sql.globalTempDatabase

Default: global_temp
Scala value: GLOBAL_TEMP_DATABASE

(internal) Name of the Spark-owned internal database of global temporary views.

Used exclusively to create a GlobalTempViewManager when SharedState is first requested for the GlobalTempViewManager.

Note
The name of the internal database cannot conflict with the names of any database that is already available in ExternalCatalog.
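Assuming a running spark session (e.g. in spark-shell), global temporary views land in this internal database and must be qualified with its name:

```scala
// Global temporary views live in the spark.sql.globalTempDatabase database
// ("global_temp" by default) and are shared across sessions.
spark.range(3).createOrReplaceGlobalTempView("ids")

// The view has to be qualified with the global temp database name
spark.sql("SELECT * FROM global_temp.ids").show()

// It is also visible from another session of the same SparkSession
spark.newSession().sql("SELECT count(*) FROM global_temp.ids").show()
```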

spark.sql.hive.thriftServer.singleSession

Default: false
Scala value: HIVE_THRIFT_SERVER_SINGLESESSION

When enabled (true), the Hive Thrift server runs in single-session mode: all JDBC/ODBC connections share the temporary views, function registries, SQL configuration and the current database.

spark.sql.queryExecutionListeners

Default: (empty)
Scala value: QUERY_EXECUTION_LISTENERS

Comma-separated list of QueryExecutionListener implementation class names that are automatically registered with new SparkSessions.

The classes should have either a no-argument constructor or a constructor that expects a SparkConf argument.
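A minimal sketch of a listener that qualifies for this property (the LoggingListener class name and its println bodies are hypothetical):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.QueryExecution
import org.apache.spark.sql.util.QueryExecutionListener

// Hypothetical listener with a no-argument constructor, suitable for
// registration through spark.sql.queryExecutionListeners
class LoggingListener extends QueryExecutionListener {
  override def onSuccess(funcName: String, qe: QueryExecution, durationNs: Long): Unit =
    println(s"$funcName succeeded in ${durationNs / 1e6} ms")
  override def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit =
    println(s"$funcName failed: ${exception.getMessage}")
}

// Registered by class name before the first SparkSession is created
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.queryExecutionListeners", "LoggingListener")
  .getOrCreate()
```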

spark.sql.sources.schemaStringLengthThreshold

Default: 4000
Scala value: SCHEMA_STRING_LENGTH_THRESHOLD

(internal) The maximum length allowed in a single cell when storing additional schema information in Hive’s metastore.

spark.sql.ui.retainedExecutions

Default: 1000
Scala value: UI_RETAINED_EXECUTIONS

Number of executions to retain in the Spark UI.

spark.sql.warehouse.dir

Default: spark-warehouse
Scala value: WAREHOUSE_PATH

The directory of the Hive warehouse (backed by an embedded Derby metastore by default) that holds the managed databases and tables (aka the Spark warehouse).

Tip
Read the official Hive Metastore Administration document to learn more.
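Being a static property, spark.sql.warehouse.dir has to be set before the first SparkSession is created, e.g. through the builder (the path below is illustrative only):

```scala
import org.apache.spark.sql.SparkSession

// Static configs must be set at builder time; the path is an assumption
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.warehouse.dir", "/tmp/my-spark-warehouse")
  .getOrCreate()

// Can be read back, but any later spark.conf.set of it would fail
println(spark.conf.get("spark.sql.warehouse.dir"))
```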

The properties in StaticSQLConf can only be queried; they cannot be changed once the first SparkSession is created.

import org.apache.spark.sql.internal.StaticSQLConf
scala> val metastoreName = spark.conf.get(StaticSQLConf.CATALOG_IMPLEMENTATION.key)
metastoreName: String = hive

scala> spark.conf.set(StaticSQLConf.CATALOG_IMPLEMENTATION.key, "hive")
org.apache.spark.sql.AnalysisException: Cannot modify the value of a static config: spark.sql.catalogImplementation;
  at org.apache.spark.sql.RuntimeConfig.requireNonStaticConf(RuntimeConfig.scala:144)
  at org.apache.spark.sql.RuntimeConfig.set(RuntimeConfig.scala:41)
  ... 50 elided
