import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types.StructType
val jsonCol = from_json($"json", new StructType())

import org.apache.spark.sql.catalyst.expressions.JsonToStructs
val jsonExpr = jsonCol.expr.asInstanceOf[JsonToStructs]
scala> println(jsonExpr.numberedTreeString)
00 jsontostructs('json, None)
01 +- 'json
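
For context, from_json (and hence JsonToStructs) parses a JSON-encoded string column into a struct as described by the given schema. A minimal spark-shell sketch (the column name and sample document are made up for illustration):

import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types.{LongType, StringType, StructType}

// schema of the JSON documents to parse
val schema = new StructType().add("id", LongType).add("name", StringType)

val df = Seq("""{"id": 1, "name": "scala"}""").toDF("json")

// "parsed" is a struct<id:bigint,name:string> column built by JsonToStructs
df.select(from_json($"json", schema) as "parsed").show(false)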
JsonToStructs Unary Expression

JsonToStructs is a unary expression with timezone support and CodegenFallback.

JsonToStructs is an ExpectsInputTypes expression.
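
As an ExpectsInputTypes expression, JsonToStructs requires its child to evaluate to a string. A quick way to see the check (a sketch, assuming a spark-shell session; the exact error message can vary across Spark versions):

import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types.StructType

val df = Seq(1).toDF("id")

// "id" is an integer column, so analysis fails with a data type mismatch
// (AnalysisException: ...argument 1 requires string type...)
df.select(from_json($"id", new StructType()))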
JsonToStructs uses the following internal properties:

| Property | Description |
|---|---|
| converter | Function that converts the parsed internal rows into the final output value for the given schema |
| forceNullableSchema | Enabled (i.e. true) when the spark.sql.fromJsonForceNullableSchema configuration property is enabled (default: true), which forces the fields of the schema to be nullable |
| parser | JacksonParser with the rowSchema and the JSON options, used when JsonToStructs is evaluated |
| rowSchema | StructType that…FIXME |
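
With forceNullableSchema left at its default (enabled), the fields of the requested schema come back nullable in the output. A small sketch (schema and sample data are made up):

import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types.{LongType, StructType}

// ask for a non-nullable field on purpose
val schema = new StructType().add("id", LongType, nullable = false)

val df = Seq("""{"id": 1}""").toDF("json")

// parsed.id is reported as nullable because the schema is forced nullable
df.select(from_json($"json", schema) as "parsed").printSchema()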
Creating JsonToStructs Instance

JsonToStructs takes the following when created:

- Schema (as a DataType)
- JSON options (as Map[String, String])
- Child expression
- Optional time zone ID (None by default)

JsonToStructs initializes the internal registries and counters.
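
A minimal sketch of creating the expression directly (assuming the Spark 2.x constructor that takes the schema, the JSON options, the child expression and an optional time zone ID):

import org.apache.spark.sql.catalyst.expressions.JsonToStructs
import org.apache.spark.sql.functions.lit
import org.apache.spark.sql.types.{LongType, StructType}

val schema = new StructType().add("id", LongType)

val jsonToStructs = JsonToStructs(
  schema,                     // schema to convert the JSON string to
  Map.empty[String, String],  // JSON options
  lit("""{"id": 1}""").expr,  // child expression (a string literal here)
  None)                       // optional time zone ID

println(jsonToStructs.numberedTreeString)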
Parsing Table Schema for String Literals — validateSchemaLiteral Method
validateSchemaLiteral(exp: Expression): StructType
validateSchemaLiteral requests CatalystSqlParser to parseTableSchema for a Literal of StringType.

For any non-StringType expression, validateSchemaLiteral reports an AnalysisException:

Expected a string literal instead of [expression]
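
For reference, the underlying schema parsing can be tried directly with CatalystSqlParser (a sketch; the DDL-formatted schema string is made up):

import org.apache.spark.sql.catalyst.parser.CatalystSqlParser

// parseTableSchema turns a DDL-formatted string into a StructType,
// e.g. StructType(StructField(id,IntegerType,true), StructField(name,StringType,true))
val schema = CatalystSqlParser.parseTableSchema("id INT, name STRING")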