Uses of Class org.apache.hadoop.hive.ql.plan.exprNodeDesc
Packages that use exprNodeDesc | |
---|---|
org.apache.hadoop.hive.ql.exec | |
org.apache.hadoop.hive.ql.optimizer.ppr | |
org.apache.hadoop.hive.ql.parse | |
org.apache.hadoop.hive.ql.plan | |
org.apache.hadoop.hive.ql.ppd |
Uses of exprNodeDesc in org.apache.hadoop.hive.ql.exec |
---|
Fields in org.apache.hadoop.hive.ql.exec with type parameters of type exprNodeDesc | |
---|---|
protected Map<String,exprNodeDesc> |
Operator.colExprMap
A map from output column name to input expression. |
Methods in org.apache.hadoop.hive.ql.exec that return types with arguments of type exprNodeDesc | |
---|---|
Map<String,exprNodeDesc> |
Operator.getColumnExprMap()
Returns a map from output column name to input expression. Note that currently it returns only key columns for ReduceSink and GroupBy operators. |
Methods in org.apache.hadoop.hive.ql.exec with parameters of type exprNodeDesc | |
---|---|
static ExprNodeEvaluator |
ExprNodeEvaluatorFactory.get(exprNodeDesc desc)
|
static boolean |
FunctionRegistry.isOpAnd(exprNodeDesc desc)
Returns whether the exprNodeDesc is an "and" node. |
static boolean |
FunctionRegistry.isOpAndOrNot(exprNodeDesc desc)
Returns whether the exprNodeDesc is an "and", "or", or "not" node. |
static boolean |
FunctionRegistry.isOpPositive(exprNodeDesc desc)
Returns whether the exprNodeDesc is a "positive" node. |
Method parameters in org.apache.hadoop.hive.ql.exec with type arguments of type exprNodeDesc | |
---|---|
protected int |
CommonJoinOperator.populateJoinKeyValue(Map<Byte,List<ExprNodeEvaluator>> outMap,
Map<Byte,List<exprNodeDesc>> inputMap)
|
void |
Operator.setColumnExprMap(Map<String,exprNodeDesc> colExprMap)
|
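
A minimal, hypothetical sketch of how the org.apache.hadoop.hive.ql.exec entries above fit together: the helper below checks the root of an already-built exprNodeDesc tree with FunctionRegistry and then obtains an evaluator from ExprNodeEvaluatorFactory. The ExprEvaluatorSketch class and evaluatorFor method are illustrative names only, and throws Exception is used defensively since the checked exceptions, if any, are not shown on this page.

```java
import org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator;
import org.apache.hadoop.hive.ql.exec.ExprNodeEvaluatorFactory;
import org.apache.hadoop.hive.ql.exec.FunctionRegistry;
import org.apache.hadoop.hive.ql.plan.exprNodeDesc;

public class ExprEvaluatorSketch {
  // Given an already-built expression tree, report whether its root is a boolean
  // connective and obtain an evaluator for it via the factory listed above.
  static ExprNodeEvaluator evaluatorFor(exprNodeDesc desc) throws Exception {
    if (FunctionRegistry.isOpAndOrNot(desc)) {
      System.out.println("root of the expression is an and/or/not node");
    }
    return ExprNodeEvaluatorFactory.get(desc);
  }
}
```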
Uses of exprNodeDesc in org.apache.hadoop.hive.ql.optimizer.ppr |
---|
Methods in org.apache.hadoop.hive.ql.optimizer.ppr that return exprNodeDesc | |
---|---|
static exprNodeDesc |
ExprProcFactory.genPruner(String tabAlias,
exprNodeDesc pred,
boolean hasNonPartCols)
Generates the partition pruner for the expression tree. |
Methods in org.apache.hadoop.hive.ql.optimizer.ppr that return types with arguments of type exprNodeDesc | |
---|---|
HashMap<TableScanOperator,exprNodeDesc> |
OpWalkerCtx.getOpToPartPruner()
|
Methods in org.apache.hadoop.hive.ql.optimizer.ppr with parameters of type exprNodeDesc | |
---|---|
static exprNodeDesc |
ExprProcFactory.genPruner(String tabAlias,
exprNodeDesc pred,
boolean hasNonPartCols)
Generates the partition pruner for the expression tree. |
static boolean |
PartitionPruner.hasColumnExpr(exprNodeDesc desc)
Whether the expression contains a column node or not. |
static boolean |
PartitionPruner.onlyContainsPartnCols(Table tab,
exprNodeDesc expr)
Find out whether the condition only contains partitioned columns. |
static PrunedPartitionList |
PartitionPruner.prune(Table tab,
exprNodeDesc prunerExpr,
HiveConf conf,
String alias,
Map<String,PrunedPartitionList> prunedPartitionsMap)
Get the partition list for the table that satisfies the partition pruner condition. |
Constructor parameters in org.apache.hadoop.hive.ql.optimizer.ppr with type arguments of type exprNodeDesc | |
---|---|
OpWalkerCtx(HashMap<TableScanOperator,exprNodeDesc> opToPartPruner)
Constructor |
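
A hedged sketch of the org.apache.hadoop.hive.ql.optimizer.ppr methods above: the helper only calls PartitionPruner.prune when the predicate references partition columns exclusively. The PartitionPruningSketch class, the null fallback, and the import paths for HiveConf, Table, and PrunedPartitionList are assumptions for illustration.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.metadata.Table;
import org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner;
import org.apache.hadoop.hive.ql.parse.PrunedPartitionList;
import org.apache.hadoop.hive.ql.plan.exprNodeDesc;

public class PartitionPruningSketch {
  // Prune partitions only when the predicate refers exclusively to partition columns;
  // otherwise return null and let the caller scan all partitions (an assumed fallback).
  static PrunedPartitionList pruneIfPossible(Table tab, exprNodeDesc pred,
      HiveConf conf, String alias) throws Exception {
    if (!PartitionPruner.onlyContainsPartnCols(tab, pred)) {
      return null;
    }
    Map<String, PrunedPartitionList> cache = new HashMap<String, PrunedPartitionList>();
    return PartitionPruner.prune(tab, pred, conf, alias, cache);
  }
}
```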
Uses of exprNodeDesc in org.apache.hadoop.hive.ql.parse |
---|
Methods in org.apache.hadoop.hive.ql.parse that return exprNodeDesc | |
---|---|
exprNodeDesc |
SemanticAnalyzer.genExprNodeDesc(ASTNode expr,
RowResolver input)
Generates an expression node descriptor for the expression passed in the arguments. |
static exprNodeDesc |
TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String name,
exprNodeDesc... children)
Gets the exprNodeDesc for a function, given its name and child expressions. |
static exprNodeDesc |
TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String udfName,
List<exprNodeDesc> children)
Creates an exprNodeDesc for a UDF, given the children (arguments). |
static exprNodeDesc |
TypeCheckProcFactory.processGByExpr(Node nd,
Object procCtx)
Performs group-by subexpression elimination. |
Methods in org.apache.hadoop.hive.ql.parse that return types with arguments of type exprNodeDesc | |
---|---|
HashMap<TableScanOperator,exprNodeDesc> |
ParseContext.getOpToPartPruner()
|
Methods in org.apache.hadoop.hive.ql.parse with parameters of type exprNodeDesc | |
---|---|
static exprNodeDesc |
TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String name,
exprNodeDesc... children)
Gets the exprNodeDesc for a function, given its name and child expressions. |
Method parameters in org.apache.hadoop.hive.ql.parse with type arguments of type exprNodeDesc | |
---|---|
static exprNodeDesc |
TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String udfName,
List<exprNodeDesc> children)
Creates an exprNodeDesc for a UDF, given the children (arguments). |
static boolean |
TypeCheckProcFactory.DefaultExprProcessor.isRedundantConversionFunction(ASTNode expr,
boolean isFunction,
ArrayList<exprNodeDesc> children)
|
void |
ParseContext.setOpToPartPruner(HashMap<TableScanOperator,exprNodeDesc> opToPartPruner)
|
Constructor parameters in org.apache.hadoop.hive.ql.parse with type arguments of type exprNodeDesc | |
---|---|
ParseContext(HiveConf conf,
QB qb,
ASTNode ast,
HashMap<TableScanOperator,exprNodeDesc> opToPartPruner,
HashMap<String,Operator<? extends Serializable>> topOps,
HashMap<String,Operator<? extends Serializable>> topSelOps,
LinkedHashMap<Operator<? extends Serializable>,OpParseContext> opParseCtx,
Map<JoinOperator,QBJoinTree> joinContext,
HashMap<TableScanOperator,Table> topToTable,
List<loadTableDesc> loadTableWork,
List<loadFileDesc> loadFileWork,
Context ctx,
HashMap<String,String> idToTableNameMap,
int destTableId,
UnionProcContext uCtx,
List<MapJoinOperator> listMapJoinOpsNoReducer,
Map<GroupByOperator,Set<String>> groupOpToInputTables,
Map<String,PrunedPartitionList> prunedPartitions,
HashMap<TableScanOperator,filterDesc.sampleDesc> opToSamplePruner)
|
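
To illustrate the varargs getFuncExprNodeDesc factory above, here is a small hypothetical helper that wraps two existing descriptors in an equality comparison. The function name "=" and the FuncExprSketch class are assumptions, and throws Exception is used defensively.

```java
import org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory;
import org.apache.hadoop.hive.ql.plan.exprNodeDesc;

public class FuncExprSketch {
  // Wrap two existing expression descriptors in an equality comparison; "=" is
  // assumed to be the registered name of the equality UDF.
  static exprNodeDesc equalsExpr(exprNodeDesc left, exprNodeDesc right) throws Exception {
    return TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc("=", left, right);
  }
}
```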
Uses of exprNodeDesc in org.apache.hadoop.hive.ql.plan |
---|
Subclasses of exprNodeDesc in org.apache.hadoop.hive.ql.plan | |
---|---|
class |
exprNodeColumnDesc
|
class |
exprNodeConstantDesc
A constant expression. |
class |
exprNodeFieldDesc
|
class |
exprNodeGenericFuncDesc
Describes a GenericFunc node. |
class |
exprNodeNullDesc
|
Methods in org.apache.hadoop.hive.ql.plan that return exprNodeDesc | |
---|---|
exprNodeDesc |
exprNodeNullDesc.clone()
|
exprNodeDesc |
exprNodeGenericFuncDesc.clone()
|
exprNodeDesc |
exprNodeFieldDesc.clone()
|
exprNodeDesc |
exprNodeConstantDesc.clone()
|
exprNodeDesc |
exprNodeColumnDesc.clone()
|
abstract exprNodeDesc |
exprNodeDesc.clone()
|
exprNodeDesc |
extractDesc.getCol()
|
exprNodeDesc |
exprNodeFieldDesc.getDesc()
|
exprNodeDesc |
filterDesc.getPredicate()
|
Methods in org.apache.hadoop.hive.ql.plan that return types with arguments of type exprNodeDesc | |
---|---|
List<exprNodeDesc> |
exprNodeGenericFuncDesc.getChildExprs()
|
List<exprNodeDesc> |
exprNodeGenericFuncDesc.getChildren()
|
List<exprNodeDesc> |
exprNodeFieldDesc.getChildren()
|
List<exprNodeDesc> |
exprNodeDesc.getChildren()
|
ArrayList<exprNodeDesc> |
selectDesc.getColList()
|
Map<Byte,List<exprNodeDesc>> |
joinDesc.getExprs()
|
ArrayList<exprNodeDesc> |
reduceSinkDesc.getKeyCols()
|
Map<Byte,List<exprNodeDesc>> |
mapJoinDesc.getKeys()
|
ArrayList<exprNodeDesc> |
groupByDesc.getKeys()
|
ArrayList<exprNodeDesc> |
aggregationDesc.getParameters()
|
ArrayList<exprNodeDesc> |
reduceSinkDesc.getPartitionCols()
|
ArrayList<exprNodeDesc> |
reduceSinkDesc.getValueCols()
|
Methods in org.apache.hadoop.hive.ql.plan with parameters of type exprNodeDesc | |
---|---|
void |
extractDesc.setCol(exprNodeDesc col)
|
void |
exprNodeFieldDesc.setDesc(exprNodeDesc desc)
|
void |
filterDesc.setPredicate(exprNodeDesc predicate)
|
Method parameters in org.apache.hadoop.hive.ql.plan with type arguments of type exprNodeDesc | |
---|---|
static List<FieldSchema> |
PlanUtils.getFieldSchemasFromColumnList(List<exprNodeDesc> cols,
List<String> outputColumnNames,
int start,
String fieldPrefix)
Convert the column list to a FieldSchema list. |
static List<FieldSchema> |
PlanUtils.getFieldSchemasFromColumnList(List<exprNodeDesc> cols,
String fieldPrefix)
Convert the column list to a FieldSchema list. |
static reduceSinkDesc |
PlanUtils.getReduceSinkDesc(ArrayList<exprNodeDesc> keyCols,
ArrayList<exprNodeDesc> valueCols,
List<String> outputColumnNames,
boolean includeKeyCols,
int tag,
ArrayList<exprNodeDesc> partitionCols,
String order,
int numReducers)
Create the reduce sink descriptor. |
static reduceSinkDesc |
PlanUtils.getReduceSinkDesc(ArrayList<exprNodeDesc> keyCols,
ArrayList<exprNodeDesc> valueCols,
List<String> outputColumnNames,
boolean includeKeyCols,
int tag,
ArrayList<exprNodeDesc> partitionCols,
String order,
int numReducers)
Create the reduce sink descriptor. |
static reduceSinkDesc |
PlanUtils.getReduceSinkDesc(ArrayList<exprNodeDesc> keyCols,
ArrayList<exprNodeDesc> valueCols,
List<String> outputColumnNames,
boolean includeKeyCols,
int tag,
ArrayList<exprNodeDesc> partitionCols,
String order,
int numReducers)
Create the reduce sink descriptor. |
static reduceSinkDesc |
PlanUtils.getReduceSinkDesc(ArrayList<exprNodeDesc> keyCols,
ArrayList<exprNodeDesc> valueCols,
List<String> outputColumnNames,
boolean includeKey,
int tag,
int numPartitionFields,
int numReducers)
Create the reduce sink descriptor. |
static reduceSinkDesc |
PlanUtils.getReduceSinkDesc(ArrayList<exprNodeDesc> keyCols,
ArrayList<exprNodeDesc> valueCols,
List<String> outputColumnNames,
boolean includeKey,
int tag,
int numPartitionFields,
int numReducers)
Create the reduce sink descriptor. |
static exprNodeGenericFuncDesc |
exprNodeGenericFuncDesc.newInstance(GenericUDF genericUDF,
List<exprNodeDesc> children)
Creates an exprNodeGenericFuncDesc based on the given GenericUDF and the children parameters. |
void |
exprNodeGenericFuncDesc.setChildExprs(List<exprNodeDesc> children)
|
void |
selectDesc.setColList(ArrayList<exprNodeDesc> colList)
|
void |
joinDesc.setExprs(Map<Byte,List<exprNodeDesc>> exprs)
|
void |
reduceSinkDesc.setKeyCols(ArrayList<exprNodeDesc> keyCols)
|
void |
groupByDesc.setKeys(ArrayList<exprNodeDesc> keys)
|
void |
mapJoinDesc.setKeys(Map<Byte,List<exprNodeDesc>> keys)
|
void |
aggregationDesc.setParameters(ArrayList<exprNodeDesc> parameters)
|
void |
reduceSinkDesc.setPartitionCols(ArrayList<exprNodeDesc> partitionCols)
|
void |
reduceSinkDesc.setValueCols(ArrayList<exprNodeDesc> valueCols)
|
Constructors in org.apache.hadoop.hive.ql.plan with parameters of type exprNodeDesc | |
---|---|
exprNodeFieldDesc(TypeInfo typeInfo,
exprNodeDesc desc,
String fieldName,
Boolean isList)
|
|
extractDesc(exprNodeDesc col)
|
|
filterDesc(exprNodeDesc predicate,
boolean isSamplingPred)
|
|
filterDesc(exprNodeDesc predicate,
boolean isSamplingPred,
filterDesc.sampleDesc sampleDescr)
|
Constructor parameters in org.apache.hadoop.hive.ql.plan with type arguments of type exprNodeDesc | |
---|---|
aggregationDesc(String genericUDAFName,
GenericUDAFEvaluator genericUDAFEvaluator,
ArrayList<exprNodeDesc> parameters,
boolean distinct,
GenericUDAFEvaluator.Mode mode)
|
|
exprNodeGenericFuncDesc(TypeInfo typeInfo,
GenericUDF genericUDF,
List<exprNodeDesc> children)
|
|
groupByDesc(groupByDesc.Mode mode,
ArrayList<String> outputColumnNames,
ArrayList<exprNodeDesc> keys,
ArrayList<aggregationDesc> aggregators,
boolean groupKeyNotReductionKey)
|
|
groupByDesc(groupByDesc.Mode mode,
ArrayList<String> outputColumnNames,
ArrayList<exprNodeDesc> keys,
ArrayList<aggregationDesc> aggregators,
boolean groupKeyNotReductionKey,
boolean bucketGroup)
|
|
joinDesc(Map<Byte,List<exprNodeDesc>> exprs,
ArrayList<String> outputColumnNames)
|
|
joinDesc(Map<Byte,List<exprNodeDesc>> exprs,
ArrayList<String> outputColumnNames,
boolean noOuterJoin,
joinCond[] conds)
|
|
joinDesc(Map<Byte,List<exprNodeDesc>> exprs,
ArrayList<String> outputColumnNames,
joinCond[] conds)
|
|
mapJoinDesc(Map<Byte,List<exprNodeDesc>> keys,
tableDesc keyTblDesc,
Map<Byte,List<exprNodeDesc>> values,
List<tableDesc> valueTblDescs,
ArrayList<String> outputColumnNames,
int posBigTable,
joinCond[] conds)
|
|
mapJoinDesc(Map<Byte,List<exprNodeDesc>> keys,
tableDesc keyTblDesc,
Map<Byte,List<exprNodeDesc>> values,
List<tableDesc> valueTblDescs,
ArrayList<String> outputColumnNames,
int posBigTable,
joinCond[] conds)
|
|
reduceSinkDesc(ArrayList<exprNodeDesc> keyCols,
ArrayList<exprNodeDesc> valueCols,
ArrayList<String> outputKeyColumnNames,
ArrayList<String> outputValueolumnNames,
int tag,
ArrayList<exprNodeDesc> partitionCols,
int numReducers,
tableDesc keySerializeInfo,
tableDesc valueSerializeInfo)
|
|
reduceSinkDesc(ArrayList<exprNodeDesc> keyCols,
ArrayList<exprNodeDesc> valueCols,
ArrayList<String> outputKeyColumnNames,
ArrayList<String> outputValueolumnNames,
int tag,
ArrayList<exprNodeDesc> partitionCols,
int numReducers,
tableDesc keySerializeInfo,
tableDesc valueSerializeInfo)
|
|
reduceSinkDesc(ArrayList<exprNodeDesc> keyCols,
ArrayList<exprNodeDesc> valueCols,
ArrayList<String> outputKeyColumnNames,
ArrayList<String> outputValueolumnNames,
int tag,
ArrayList<exprNodeDesc> partitionCols,
int numReducers,
tableDesc keySerializeInfo,
tableDesc valueSerializeInfo)
|
|
selectDesc(ArrayList<exprNodeDesc> colList,
ArrayList<String> outputColumnNames)
|
|
selectDesc(ArrayList<exprNodeDesc> colList,
ArrayList<String> outputColumnNames,
boolean selectStar)
|
|
selectDesc(ArrayList<exprNodeDesc> colList,
boolean selectStar,
boolean selStarNoCompute)
|
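
A brief sketch of the filterDesc constructor and accessor listed above, wrapping a predicate expression in a filter descriptor; the FilterDescSketch class is an illustrative name only.

```java
import org.apache.hadoop.hive.ql.plan.exprNodeDesc;
import org.apache.hadoop.hive.ql.plan.filterDesc;

public class FilterDescSketch {
  // Wrap a predicate expression in a filter descriptor; "false" marks it as a
  // regular (non-sampling) predicate.
  static filterDesc wrapPredicate(exprNodeDesc predicate) {
    filterDesc filter = new filterDesc(predicate, false);
    assert filter.getPredicate() == predicate;
    return filter;
  }
}
```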
Uses of exprNodeDesc in org.apache.hadoop.hive.ql.ppd |
---|
Methods in org.apache.hadoop.hive.ql.ppd that return exprNodeDesc | |
---|---|
exprNodeDesc |
ExprWalkerInfo.getConvertedNode(Node nd)
|
Methods in org.apache.hadoop.hive.ql.ppd that return types with arguments of type exprNodeDesc | |
---|---|
Map<String,List<exprNodeDesc>> |
ExprWalkerInfo.getFinalCandidates()
Returns the list of pushdown expressions for each alias that appears in the current operator's RowResolver. |
Methods in org.apache.hadoop.hive.ql.ppd with parameters of type exprNodeDesc | |
---|---|
void |
ExprWalkerInfo.addAlias(exprNodeDesc expr,
String alias)
Adds the specified alias to the specified expr. |
void |
ExprWalkerInfo.addConvertedNode(exprNodeDesc oldNode,
exprNodeDesc newNode)
Adds a replacement node for this expression. |
void |
ExprWalkerInfo.addFinalCandidate(exprNodeDesc expr)
Adds the specified expr as the top-most pushdown expr (i.e., all of its children can be pushed). |
static ExprWalkerInfo |
ExprWalkerProcFactory.extractPushdownPreds(OpWalkerInfo opContext,
Operator<? extends Serializable> op,
exprNodeDesc pred)
|
String |
ExprWalkerInfo.getAlias(exprNodeDesc expr)
Returns the alias of the specified expr. |
boolean |
ExprWalkerInfo.isCandidate(exprNodeDesc expr)
Returns true if the specified expression is a pushdown candidate, and false otherwise. |
void |
ExprWalkerInfo.setIsCandidate(exprNodeDesc expr,
boolean b)
Sets the pushdown-candidate flag of the specified expr to the specified value. |
Method parameters in org.apache.hadoop.hive.ql.ppd with type arguments of type exprNodeDesc | |
---|---|
static ExprWalkerInfo |
ExprWalkerProcFactory.extractPushdownPreds(OpWalkerInfo opContext,
Operator<? extends Serializable> op,
List<exprNodeDesc> preds)
Extracts pushdown predicates from the given list of predicate expressions. |
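
Finally, a hypothetical sketch of the ExprWalkerInfo calls above as they might be used during a predicate-pushdown walk; the PushdownMarkingSketch class and the surrounding control flow are assumptions, not part of Hive.

```java
import org.apache.hadoop.hive.ql.plan.exprNodeDesc;
import org.apache.hadoop.hive.ql.ppd.ExprWalkerInfo;

public class PushdownMarkingSketch {
  // Mark an expression as pushable (or not) and, if pushable, record its alias and
  // register it as a final pushdown candidate.
  static void markIfPushable(ExprWalkerInfo ctx, exprNodeDesc expr,
      String alias, boolean pushable) {
    ctx.setIsCandidate(expr, pushable);
    if (ctx.isCandidate(expr)) {
      ctx.addAlias(expr, alias);
      ctx.addFinalCandidate(expr);
    }
  }
}
```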