public class CreateTableUsingAsSelect
extends org.apache.spark.sql.catalyst.plans.logical.UnaryNode
implements scala.Product, scala.Serializable
A node used to support CREATE TABLE ... AS SELECT (CTAS) statements for the data source API. This node is a UnaryNode instead of a Command because we want the analyzer to be able to analyze the logical plan that will be used to populate the table, so that PreWriteCheck can detect cases that are not allowed.

| Constructor and Description |
|---|
| `CreateTableUsingAsSelect(String tableName, String provider, boolean temporary, SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan child)` |
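The design note above (a UnaryNode rather than a Command) can be illustrated with a toy sketch. This is not Spark's actual class hierarchy; the names `CreateTableAsSelect`, `TableScan`, and `isAllowed` are simplified stand-ins showing why keeping the child query visible in the plan tree lets an analyzer-time check (in Spark's case, PreWriteCheck) walk it and reject invalid plans before execution:

```java
// Toy sketch, NOT Spark's real classes: a CTAS node modeled as a unary
// plan node whose child query stays visible to analysis-time checks.
import java.util.List;

public class CtasSketch {
    // Minimal stand-in for Catalyst's LogicalPlan tree.
    interface LogicalPlan { List<LogicalPlan> children(); }

    // A leaf node reading a table.
    record TableScan(String tableName) implements LogicalPlan {
        public List<LogicalPlan> children() { return List.of(); }
    }

    // Analogue of CreateTableUsingAsSelect: keeps its child query as a
    // plan-tree child instead of hiding it inside an opaque command.
    record CreateTableAsSelect(String tableName, LogicalPlan child) implements LogicalPlan {
        public List<LogicalPlan> children() { return List.of(child); }
    }

    // Analogue of a pre-write check: because the child plan is part of the
    // tree, the check can walk it and reject, for example, a CTAS whose
    // query reads the very table it is about to create.
    static boolean isAllowed(CreateTableAsSelect ctas) {
        return !readsTable(ctas.child(), ctas.tableName());
    }

    static boolean readsTable(LogicalPlan plan, String table) {
        if (plan instanceof TableScan scan && scan.tableName().equals(table)) return true;
        for (LogicalPlan c : plan.children()) {
            if (readsTable(c, table)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        CreateTableAsSelect ok  = new CreateTableAsSelect("t1", new TableScan("src"));
        CreateTableAsSelect bad = new CreateTableAsSelect("t1", new TableScan("t1"));
        System.out.println(isAllowed(ok));   // true
        System.out.println(isAllowed(bad));  // false
    }
}
```

Had the node been a Command wrapping an unanalyzed query, a check like this could not inspect the child plan at analysis time.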
| Modifier and Type | Method and Description |
|---|---|
| `org.apache.spark.sql.catalyst.plans.logical.LogicalPlan` | `child()` |
| `SaveMode` | `mode()` |
| `scala.collection.immutable.Map<String,String>` | `options()` |
| `scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>` | `output()` |
| `String` | `provider()` |
| `String` | `tableName()` |
| `boolean` | `temporary()` |
Methods inherited from class org.apache.spark.sql.catalyst.plans.logical.LogicalPlan:
childrenResolved, cleanArgs, isTraceEnabled, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning, org$apache$spark$Logging$$log__$eq, org$apache$spark$Logging$$log_, org$apache$spark$sql$catalyst$plans$logical$LogicalPlan$$resolveAsColumn, org$apache$spark$sql$catalyst$plans$logical$LogicalPlan$$resolveAsTableColumn, resolve, resolve, resolve$default$3, resolveChildren, resolveChildren$default$3, resolved, resolveGetField, sameResult, statePrefix, statistics

Methods inherited from class org.apache.spark.sql.catalyst.plans.QueryPlan:
expressions, inputSet, missingInput, org$apache$spark$sql$catalyst$plans$QueryPlan$$transformExpressionDown$1, org$apache$spark$sql$catalyst$plans$QueryPlan$$transformExpressionUp$1, outputSet, printSchema, references, schema, schemaString, simpleString, transformAllExpressions, transformExpressions, transformExpressionsDown, transformExpressionsUp

Methods inherited from class org.apache.spark.sql.catalyst.trees.TreeNode:
apply, argString, asCode, collect, fastEquals, flatMap, foreach, foreachUp, generateTreeString, getNodeNumbered, makeCopy, map, mapChildren, nodeName, numberedTreeString, origin, otherCopyArgs, stringArgs, toString, transform, transformChildrenDown, transformChildrenUp, transformDown, transformUp, treeString, withNewChildren

Methods inherited from interface scala.Product:
productArity, productElement, productIterator, productPrefix

Methods inherited from interface org.apache.spark.Logging:
initializeIfNecessary, initializeLogging, log_

public CreateTableUsingAsSelect(String tableName,
                                String provider,
                                boolean temporary,
                                SaveMode mode,
                                scala.collection.immutable.Map<String,String> options,
                                org.apache.spark.sql.catalyst.plans.logical.LogicalPlan child)
public String tableName()
public String provider()
public boolean temporary()
public SaveMode mode()
public scala.collection.immutable.Map<String,String> options()
public org.apache.spark.sql.catalyst.plans.logical.LogicalPlan child()
Specified by: child in interface org.apache.spark.sql.catalyst.trees.UnaryNode<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>

public scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output()
Specified by: output in class org.apache.spark.sql.catalyst.plans.QueryPlan<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>