Uses of Class
org.apache.hadoop.hive.ql.plan.ExprNodeDesc

Packages that use ExprNodeDesc
org.apache.hadoop.hive.ql.exec Hive QL execution tasks, operators, functions and other handlers. 
org.apache.hadoop.hive.ql.index   
org.apache.hadoop.hive.ql.index.bitmap   
org.apache.hadoop.hive.ql.index.compact   
org.apache.hadoop.hive.ql.metadata   
org.apache.hadoop.hive.ql.optimizer   
org.apache.hadoop.hive.ql.optimizer.lineage   
org.apache.hadoop.hive.ql.optimizer.listbucketingpruner   
org.apache.hadoop.hive.ql.optimizer.pcr   
org.apache.hadoop.hive.ql.optimizer.ppr   
org.apache.hadoop.hive.ql.parse   
org.apache.hadoop.hive.ql.plan   
org.apache.hadoop.hive.ql.ppd   
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.exec
 

Fields in org.apache.hadoop.hive.ql.exec with type parameters of type ExprNodeDesc
protected  Map<String,ExprNodeDesc> Operator.colExprMap
          A map from output column names to the input expressions they are computed from.
 

Methods in org.apache.hadoop.hive.ql.exec that return ExprNodeDesc
static ExprNodeDesc Utilities.deserializeExpression(String s, org.apache.hadoop.conf.Configuration conf)
           
 

Methods in org.apache.hadoop.hive.ql.exec that return types with arguments of type ExprNodeDesc
 Map<String,ExprNodeDesc> Operator.getColumnExprMap()
          Returns a map from output column names to input expressions. Note that currently it returns only key columns for the ReduceSink and GroupBy operators.
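
As an informal illustration (not part of the generated listing), the column-expression map of an operator taken from an already compiled plan can be inspected roughly as follows. The class name ColExprMapDump is made up for the sketch.

    import java.util.Map;

    import org.apache.hadoop.hive.ql.exec.Operator;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
    import org.apache.hadoop.hive.ql.plan.OperatorDesc;

    public class ColExprMapDump {
      // Print each output column together with the input expression it is computed from.
      // 'op' is assumed to come from an existing query plan.
      public static void dump(Operator<? extends OperatorDesc> op) {
        Map<String, ExprNodeDesc> colExprMap = op.getColumnExprMap();
        if (colExprMap == null) {
          return; // not every operator populates the map
        }
        for (Map.Entry<String, ExprNodeDesc> e : colExprMap.entrySet()) {
          System.out.println(e.getKey() + " <- " + e.getValue().getExprString());
        }
      }
    }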
 

Methods in org.apache.hadoop.hive.ql.exec with parameters of type ExprNodeDesc
static String Utilities.checkJDOPushDown(Table tab, ExprNodeDesc expr)
          Check if the partition pruning expression can be pushed down to JDO filtering.
static ExprNodeEvaluator ExprNodeEvaluatorFactory.get(ExprNodeDesc desc)
           
static boolean FunctionRegistry.isOpAnd(ExprNodeDesc desc)
          Returns whether the exprNodeDesc is a node of "and".
static boolean FunctionRegistry.isOpAndOrNot(ExprNodeDesc desc)
          Returns whether the exprNodeDesc is a node of "and", "or", "not".
static boolean FunctionRegistry.isOpNot(ExprNodeDesc desc)
          Returns whether the exprNodeDesc is a node of "not".
static boolean FunctionRegistry.isOpOr(ExprNodeDesc desc)
          Returns whether the exprNodeDesc is a node of "or".
static boolean FunctionRegistry.isOpPositive(ExprNodeDesc desc)
          Returns whether the exprNodeDesc is a node of "positive".
static String Utilities.serializeExpression(ExprNodeDesc expr)
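
A rough usage sketch (not part of the generated listing) combining several of the methods above: a predicate is built by hand, checked with FunctionRegistry.isOpAnd, and round-tripped through Utilities.serializeExpression / deserializeExpression. The table alias "t" and the column names are made up; getFuncExprNodeDesc is the helper documented under org.apache.hadoop.hive.ql.parse below.

    import org.apache.hadoop.conf.Configuration;

    import org.apache.hadoop.hive.ql.exec.FunctionRegistry;
    import org.apache.hadoop.hive.ql.exec.Utilities;
    import org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory;
    import org.apache.hadoop.hive.ql.plan.ExprNodeColumnDesc;
    import org.apache.hadoop.hive.ql.plan.ExprNodeConstantDesc;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
    import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory;

    public class ExprRoundTrip {
      public static void main(String[] args) throws Exception {
        // Build (ds = '2011-01-01') and (country = 'US') by hand; column names are made up.
        ExprNodeDesc ds = new ExprNodeColumnDesc(TypeInfoFactory.stringTypeInfo, "ds", "t", true);
        ExprNodeDesc country = new ExprNodeColumnDesc(TypeInfoFactory.stringTypeInfo, "country", "t", false);
        ExprNodeDesc eq1 = TypeCheckProcFactory.DefaultExprProcessor
            .getFuncExprNodeDesc("=", ds, new ExprNodeConstantDesc("2011-01-01"));
        ExprNodeDesc eq2 = TypeCheckProcFactory.DefaultExprProcessor
            .getFuncExprNodeDesc("=", country, new ExprNodeConstantDesc("US"));
        ExprNodeDesc pred = TypeCheckProcFactory.DefaultExprProcessor
            .getFuncExprNodeDesc("and", eq1, eq2);

        System.out.println(FunctionRegistry.isOpAnd(pred));   // expected: true

        // Round-trip through the string form used to ship expressions in a job configuration.
        String serialized = Utilities.serializeExpression(pred);
        ExprNodeDesc back = Utilities.deserializeExpression(serialized, new Configuration());
        System.out.println(back.getExprString());
      }
    }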
           
 

Method parameters in org.apache.hadoop.hive.ql.exec with type arguments of type ExprNodeDesc
static int JoinUtil.populateJoinKeyValue(Map<Byte,List<ExprNodeEvaluator>> outMap, Map<Byte,List<ExprNodeDesc>> inputMap, Byte[] order, int posBigTableAlias)
           
 void Operator.setColumnExprMap(Map<String,ExprNodeDesc> colExprMap)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.index
 

Methods in org.apache.hadoop.hive.ql.index that return ExprNodeDesc
 ExprNodeDesc IndexPredicateAnalyzer.analyzePredicate(ExprNodeDesc predicate, List<IndexSearchCondition> searchConditions)
          Analyzes a predicate.
 ExprNodeDesc IndexSearchCondition.getComparisonExpr()
           
 ExprNodeDesc HiveIndexQueryContext.getResidualPredicate()
           
 ExprNodeDesc IndexPredicateAnalyzer.translateSearchConditions(List<IndexSearchCondition> searchConditions)
          Translates search conditions back to ExprNodeDesc form (as a left-deep conjunction).
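
A rough sketch (not part of the generated listing) of how IndexPredicateAnalyzer is typically driven: the analyzer accepts only the configured comparison operators and column names, returns everything else as a residual, and can fold the accepted conditions back into a single ExprNodeDesc. The accepted operator, the allowed column name "key", and the class name PredicateSplit are illustrative choices.

    import java.util.List;

    import org.apache.hadoop.hive.ql.index.IndexPredicateAnalyzer;
    import org.apache.hadoop.hive.ql.index.IndexSearchCondition;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;

    public class PredicateSplit {
      // Split 'predicate' into index-friendly search conditions (added to 'out')
      // plus a residual that still has to be evaluated by Hive.
      public static ExprNodeDesc split(ExprNodeDesc predicate, List<IndexSearchCondition> out) {
        IndexPredicateAnalyzer analyzer = new IndexPredicateAnalyzer();
        analyzer.addComparisonOp("org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPEqual");
        analyzer.allowColumnName("key"); // assumed column name

        ExprNodeDesc residual = analyzer.analyzePredicate(predicate, out);

        // The accepted conditions can be folded back into one expression (a left-deep AND).
        ExprNodeDesc pushed = analyzer.translateSearchConditions(out);
        System.out.println("pushed:   " + (pushed == null ? "none" : pushed.getExprString()));
        System.out.println("residual: " + (residual == null ? "none" : residual.getExprString()));
        return residual;
      }
    }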
 

Methods in org.apache.hadoop.hive.ql.index with parameters of type ExprNodeDesc
 ExprNodeDesc IndexPredicateAnalyzer.analyzePredicate(ExprNodeDesc predicate, List<IndexSearchCondition> searchConditions)
          Analyzes a predicate.
 void AbstractIndexHandler.generateIndexQuery(Index index, ExprNodeDesc predicate, ParseContext pctx, HiveIndexQueryContext queryContext)
           
 void HiveIndexHandler.generateIndexQuery(List<Index> indexes, ExprNodeDesc predicate, ParseContext pctx, HiveIndexQueryContext queryContext)
          Generate the list of tasks required to run an index optimized sub-query for the given predicate, using the given indexes.
 void IndexSearchCondition.setComparisonExpr(ExprNodeDesc comparisonExpr)
           
 void HiveIndexQueryContext.setResidualPredicate(ExprNodeDesc residualPredicate)
           
 

Constructors in org.apache.hadoop.hive.ql.index with parameters of type ExprNodeDesc
IndexSearchCondition(ExprNodeColumnDesc columnDesc, String comparisonOp, ExprNodeConstantDesc constantDesc, ExprNodeDesc comparisonExpr)
          Constructs a search condition, which takes the form column comparisonOp constant.
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.index.bitmap
 

Methods in org.apache.hadoop.hive.ql.index.bitmap with parameters of type ExprNodeDesc
 void BitmapIndexHandler.generateIndexQuery(List<Index> indexes, ExprNodeDesc predicate, ParseContext pctx, HiveIndexQueryContext queryContext)
           
 

Constructors in org.apache.hadoop.hive.ql.index.bitmap with parameters of type ExprNodeDesc
BitmapInnerQuery(String tableName, ExprNodeDesc predicate, String alias)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.index.compact
 

Methods in org.apache.hadoop.hive.ql.index.compact with parameters of type ExprNodeDesc
 void CompactIndexHandler.generateIndexQuery(List<Index> indexes, ExprNodeDesc predicate, ParseContext pctx, HiveIndexQueryContext queryContext)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.metadata
 

Fields in org.apache.hadoop.hive.ql.metadata declared as ExprNodeDesc
 ExprNodeDesc HiveStoragePredicateHandler.DecomposedPredicate.pushedPredicate
          Portion of predicate to be evaluated by storage handler.
 ExprNodeDesc HiveStoragePredicateHandler.DecomposedPredicate.residualPredicate
          Portion of predicate to be post-evaluated by Hive for any rows which are returned by storage handler.
 

Methods in org.apache.hadoop.hive.ql.metadata with parameters of type ExprNodeDesc
 HiveStoragePredicateHandler.DecomposedPredicate HiveStoragePredicateHandler.decomposePredicate(org.apache.hadoop.mapred.JobConf jobConf, Deserializer deserializer, ExprNodeDesc predicate)
          Gives the storage handler a chance to decompose a predicate.
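
As a hedged sketch (not from the Javadoc), a storage handler that can only evaluate equality on an assumed column named "key" might implement decomposePredicate along these lines, reusing IndexPredicateAnalyzer from org.apache.hadoop.hive.ql.index. The class name is made up.

    import java.util.ArrayList;
    import java.util.List;

    import org.apache.hadoop.hive.ql.index.IndexPredicateAnalyzer;
    import org.apache.hadoop.hive.ql.index.IndexSearchCondition;
    import org.apache.hadoop.hive.ql.metadata.HiveStoragePredicateHandler;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
    import org.apache.hadoop.hive.serde2.Deserializer;
    import org.apache.hadoop.mapred.JobConf;

    public class KeyEqualityPredicateHandler implements HiveStoragePredicateHandler {
      public DecomposedPredicate decomposePredicate(
          JobConf jobConf, Deserializer deserializer, ExprNodeDesc predicate) {
        IndexPredicateAnalyzer analyzer = new IndexPredicateAnalyzer();
        analyzer.addComparisonOp("org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPEqual");
        analyzer.allowColumnName("key"); // assumed column name

        List<IndexSearchCondition> conditions = new ArrayList<IndexSearchCondition>();
        ExprNodeDesc residual = analyzer.analyzePredicate(predicate, conditions);
        if (conditions.isEmpty()) {
          return null; // nothing can be pushed down; Hive evaluates the whole predicate
        }

        DecomposedPredicate decomposed = new DecomposedPredicate();
        decomposed.pushedPredicate = analyzer.translateSearchConditions(conditions);
        decomposed.residualPredicate = residual;
        return decomposed;
      }
    }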
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.optimizer
 

Methods in org.apache.hadoop.hive.ql.optimizer that return ExprNodeDesc
protected abstract  ExprNodeDesc PrunerExpressionOperatorFactory.ColumnExprProcessor.processColumnDesc(NodeProcessorCtx procCtx, ExprNodeColumnDesc cd)
          Process column desc.
 

Methods in org.apache.hadoop.hive.ql.optimizer with parameters of type ExprNodeDesc
protected  void PrunerOperatorFactory.FilterPruner.addPruningPred(Map<TableScanOperator,ExprNodeDesc> opToPrunner, TableScanOperator top, ExprNodeDesc new_pruner_pred)
          Add pruning predicate.
protected  void PrunerOperatorFactory.FilterPruner.addPruningPred(Map<TableScanOperator,Map<String,ExprNodeDesc>> opToPrunner, TableScanOperator top, ExprNodeDesc new_pruner_pred, Partition part)
          Add pruning predicate.
static Map<Node,Object> PrunerUtils.walkExprTree(ExprNodeDesc pred, NodeProcessorCtx ctx, NodeProcessor colProc, NodeProcessor fieldProc, NodeProcessor genFuncProc, NodeProcessor defProc)
          Walk expression tree for pruner generation.
 

Method parameters in org.apache.hadoop.hive.ql.optimizer with type arguments of type ExprNodeDesc
protected  void PrunerOperatorFactory.FilterPruner.addPruningPred(Map<TableScanOperator,ExprNodeDesc> opToPrunner, TableScanOperator top, ExprNodeDesc new_pruner_pred)
          Add pruning predicate.
protected  void PrunerOperatorFactory.FilterPruner.addPruningPred(Map<TableScanOperator,Map<String,ExprNodeDesc>> opToPrunner, TableScanOperator top, ExprNodeDesc new_pruner_pred, Partition part)
          Add pruning predicate.
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.optimizer.lineage
 

Methods in org.apache.hadoop.hive.ql.optimizer.lineage with parameters of type ExprNodeDesc
static LineageInfo.Dependency ExprProcFactory.getExprDependency(LineageCtx lctx, Operator<? extends OperatorDesc> inpOp, ExprNodeDesc expr)
          Gets the expression dependencies for the expression.
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.optimizer.listbucketingpruner
 

Methods in org.apache.hadoop.hive.ql.optimizer.listbucketingpruner that return ExprNodeDesc
static ExprNodeDesc LBExprProcFactory.genPruner(String tabAlias, ExprNodeDesc pred, Partition part)
          Generates the list bucketing pruner for the expression tree.
protected  ExprNodeDesc LBExprProcFactory.LBPRColumnExprProcessor.processColumnDesc(NodeProcessorCtx procCtx, ExprNodeColumnDesc cd)
           
 

Methods in org.apache.hadoop.hive.ql.optimizer.listbucketingpruner that return types with arguments of type ExprNodeDesc
 Map<TableScanOperator,Map<String,ExprNodeDesc>> LBOpWalkerCtx.getOpToPartToLBPruner()
           
 

Methods in org.apache.hadoop.hive.ql.optimizer.listbucketingpruner with parameters of type ExprNodeDesc
static ExprNodeDesc LBExprProcFactory.genPruner(String tabAlias, ExprNodeDesc pred, Partition part)
          Generates the list bucketing pruner for the expression tree.
static org.apache.hadoop.fs.Path[] ListBucketingPruner.prune(ParseContext ctx, Partition part, ExprNodeDesc pruner)
          Prunes to the directories which match the skewed keys in the WHERE clause.
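
A hedged sketch (not part of the generated listing) of chaining the two methods above: generate a list-bucketing pruner from a filter predicate and resolve it to the skewed-value directories of one partition. The class name is made up, and pctx, part, tabAlias and filterPred are assumed to come from the surrounding optimizer context.

    import org.apache.hadoop.fs.Path;

    import org.apache.hadoop.hive.ql.metadata.Partition;
    import org.apache.hadoop.hive.ql.optimizer.listbucketingpruner.LBExprProcFactory;
    import org.apache.hadoop.hive.ql.optimizer.listbucketingpruner.ListBucketingPruner;
    import org.apache.hadoop.hive.ql.parse.ParseContext;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;

    public class LBPruneSketch {
      // Return the directories of 'part' that still need to be scanned for 'filterPred'.
      public static Path[] prunedDirs(ParseContext pctx, Partition part,
          String tabAlias, ExprNodeDesc filterPred) throws Exception {
        ExprNodeDesc pruner = LBExprProcFactory.genPruner(tabAlias, filterPred, part);
        return ListBucketingPruner.prune(pctx, part, pruner);
      }
    }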
 

Constructor parameters in org.apache.hadoop.hive.ql.optimizer.listbucketingpruner with type arguments of type ExprNodeDesc
LBOpWalkerCtx(Map<TableScanOperator,Map<String,ExprNodeDesc>> opToPartToLBPruner, Partition part)
          Constructor.
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.optimizer.pcr
 

Fields in org.apache.hadoop.hive.ql.optimizer.pcr declared as ExprNodeDesc
 ExprNodeDesc PcrExprProcFactory.NodeInfoWrapper.outExpr
           
 

Methods in org.apache.hadoop.hive.ql.optimizer.pcr with parameters of type ExprNodeDesc
static PcrExprProcFactory.NodeInfoWrapper PcrExprProcFactory.walkExprTree(String tabAlias, ArrayList<Partition> parts, ExprNodeDesc pred)
          Removes partition conditions from the expression tree when necessary.
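
A rough sketch (not from the Javadoc) of calling the partition condition remover on a filter predicate; tabAlias, parts and filterPred are assumed inputs from the optimizer, and the class name PcrSketch is made up.

    import java.util.ArrayList;

    import org.apache.hadoop.hive.ql.metadata.Partition;
    import org.apache.hadoop.hive.ql.optimizer.pcr.PcrExprProcFactory;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;

    public class PcrSketch {
      // Ask what is left of 'filterPred' once conditions that are constant over
      // the listed partitions have been folded away.
      public static ExprNodeDesc stripPartitionConditions(String tabAlias,
          ArrayList<Partition> parts, ExprNodeDesc filterPred) throws Exception {
        PcrExprProcFactory.NodeInfoWrapper wrapper =
            PcrExprProcFactory.walkExprTree(tabAlias, parts, filterPred);
        // outExpr holds the rewritten predicate; it may become trivial when the
        // whole expression is decided by the partition columns alone.
        return wrapper.outExpr;
      }
    }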
 

Constructors in org.apache.hadoop.hive.ql.optimizer.pcr with parameters of type ExprNodeDesc
PcrExprProcFactory.NodeInfoWrapper(PcrExprProcFactory.WalkState state, Boolean[] resultVector, ExprNodeDesc outExpr)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.optimizer.ppr
 

Methods in org.apache.hadoop.hive.ql.optimizer.ppr that return ExprNodeDesc
static ExprNodeDesc ExprProcFactory.genPruner(String tabAlias, ExprNodeDesc pred, boolean hasNonPartCols)
          Generates the partition pruner for the expression tree.
protected  ExprNodeDesc ExprProcFactory.PPRColumnExprProcessor.processColumnDesc(NodeProcessorCtx procCtx, ExprNodeColumnDesc cd)
           
 

Methods in org.apache.hadoop.hive.ql.optimizer.ppr that return types with arguments of type ExprNodeDesc
 HashMap<TableScanOperator,ExprNodeDesc> OpWalkerCtx.getOpToPartPruner()
           
 

Methods in org.apache.hadoop.hive.ql.optimizer.ppr with parameters of type ExprNodeDesc
static Object PartExprEvalUtils.evalExprWithPart(ExprNodeDesc expr, LinkedHashMap<String,String> partSpec, StructObjectInspector rowObjectInspector)
          Evaluates the expression against a partition's column values.
static ExprNodeDesc ExprProcFactory.genPruner(String tabAlias, ExprNodeDesc pred, boolean hasNonPartCols)
          Generates the partition pruner for the expression tree.
static boolean PartitionPruner.hasColumnExpr(ExprNodeDesc desc)
          Whether the expression contains a column node or not.
static boolean PartitionPruner.onlyContainsPartnCols(Table tab, ExprNodeDesc expr)
          Find out whether the condition only contains partitioned columns.
static Map<PrimitiveObjectInspector,ExprNodeEvaluator> PartExprEvalUtils.prepareExpr(ExprNodeDesc expr, List<String> partNames, StructObjectInspector rowObjectInspector)
           
static PrunedPartitionList PartitionPruner.prune(Table tab, ExprNodeDesc prunerExpr, HiveConf conf, String alias, Map<String,PrunedPartitionList> prunedPartitionsMap)
          Get the partition list for the table that satisfies the partition pruner condition.
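
A rough sketch (not part of the generated listing) of running the partition pruner for one table scan; tab, prunerExpr, conf and alias are assumed to come from the surrounding compile context, and the class name PprSketch is made up.

    import java.util.HashMap;
    import java.util.Map;

    import org.apache.hadoop.hive.conf.HiveConf;
    import org.apache.hadoop.hive.ql.metadata.Table;
    import org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner;
    import org.apache.hadoop.hive.ql.parse.PrunedPartitionList;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;

    public class PprSketch {
      public static PrunedPartitionList prune(Table tab, ExprNodeDesc prunerExpr,
          HiveConf conf, String alias) throws Exception {
        // A pruner built purely from partition columns can be evaluated exactly;
        // otherwise partitions are kept conservatively.
        boolean exact = PartitionPruner.onlyContainsPartnCols(tab, prunerExpr);
        System.out.println("exact pruning possible: " + exact);

        Map<String, PrunedPartitionList> cache = new HashMap<String, PrunedPartitionList>();
        return PartitionPruner.prune(tab, prunerExpr, conf, alias, cache);
      }
    }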
 

Constructor parameters in org.apache.hadoop.hive.ql.optimizer.ppr with type arguments of type ExprNodeDesc
OpWalkerCtx(HashMap<TableScanOperator,ExprNodeDesc> opToPartPruner)
          Constructor.
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.parse
 

Methods in org.apache.hadoop.hive.ql.parse that return ExprNodeDesc
 ExprNodeDesc SemanticAnalyzer.genExprNodeDesc(ASTNode expr, RowResolver input)
          Generates an expression node descriptor for the expression passed in the arguments.
 ExprNodeDesc SemanticAnalyzer.genExprNodeDesc(ASTNode expr, RowResolver input, TypeCheckCtx tcCtx)
          Generates an expression node descriptor for the expression passed in the arguments.
static ExprNodeDesc TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String udfName, ExprNodeDesc... children)
          Creates an ExprNodeDesc for a UDF function given the children (arguments); see the sketch below.
static ExprNodeDesc TypeCheckProcFactory.processGByExpr(Node nd, Object procCtx)
          Function to do groupby subexpression elimination.
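
An informal sketch (not part of the generated listing) that builds upper(concat(name, '_suffix')) directly as an ExprNodeDesc tree, roughly the shape SemanticAnalyzer.genExprNodeDesc produces from an ASTNode. The alias "t", the column "name" and the class name are made up.

    import org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory;
    import org.apache.hadoop.hive.ql.plan.ExprNodeColumnDesc;
    import org.apache.hadoop.hive.ql.plan.ExprNodeConstantDesc;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
    import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory;

    public class BuildFuncExpr {
      public static void main(String[] args) throws Exception {
        ExprNodeDesc name =
            new ExprNodeColumnDesc(TypeInfoFactory.stringTypeInfo, "name", "t", false);
        // Nested calls: the result of one getFuncExprNodeDesc call becomes a child of the next.
        ExprNodeDesc concat = TypeCheckProcFactory.DefaultExprProcessor
            .getFuncExprNodeDesc("concat", name, new ExprNodeConstantDesc("_suffix"));
        ExprNodeDesc upper = TypeCheckProcFactory.DefaultExprProcessor
            .getFuncExprNodeDesc("upper", concat);
        System.out.println(upper.getExprString());
      }
    }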
 

Methods in org.apache.hadoop.hive.ql.parse that return types with arguments of type ExprNodeDesc
 HashMap<TableScanOperator,ExprNodeDesc> ParseContext.getOpToPartPruner()
           
 Map<TableScanOperator,Map<String,ExprNodeDesc>> ParseContext.getOpToPartToSkewedPruner()
           
 

Methods in org.apache.hadoop.hive.ql.parse with parameters of type ExprNodeDesc
static ExprNodeDesc TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String udfName, ExprNodeDesc... children)
          Creates an ExprNodeDesc for a UDF function given the children (arguments).
 

Method parameters in org.apache.hadoop.hive.ql.parse with type arguments of type ExprNodeDesc
static boolean TypeCheckProcFactory.DefaultExprProcessor.isRedundantConversionFunction(ASTNode expr, boolean isFunction, ArrayList<ExprNodeDesc> children)
           
 void ParseContext.setOpPartToSkewedPruner(HashMap<TableScanOperator,Map<String,ExprNodeDesc>> opToPartToSkewedPruner)
           
 void ParseContext.setOpToPartPruner(HashMap<TableScanOperator,ExprNodeDesc> opToPartPruner)
           
 

Constructor parameters in org.apache.hadoop.hive.ql.parse with type arguments of type ExprNodeDesc
ParseContext(HiveConf conf, QB qb, ASTNode ast, HashMap<TableScanOperator,ExprNodeDesc> opToPartPruner, HashMap<TableScanOperator,PrunedPartitionList> opToPartList, HashMap<String,Operator<? extends OperatorDesc>> topOps, HashMap<String,Operator<? extends OperatorDesc>> topSelOps, LinkedHashMap<Operator<? extends OperatorDesc>,OpParseContext> opParseCtx, Map<JoinOperator,QBJoinTree> joinContext, HashMap<TableScanOperator,Table> topToTable, List<LoadTableDesc> loadTableWork, List<LoadFileDesc> loadFileWork, Context ctx, HashMap<String,String> idToTableNameMap, int destTableId, UnionProcContext uCtx, List<AbstractMapJoinOperator<? extends MapJoinDesc>> listMapJoinOpsNoReducer, Map<GroupByOperator,Set<String>> groupOpToInputTables, Map<String,PrunedPartitionList> prunedPartitions, HashMap<TableScanOperator,FilterDesc.sampleDesc> opToSamplePruner, GlobalLimitCtx globalLimitCtx, HashMap<String,SplitSample> nameToSplitSample, HashSet<ReadEntity> semanticInputs, List<Task<? extends Serializable>> rootTasks, Map<TableScanOperator,Map<String,ExprNodeDesc>> opToPartToSkewedPruner)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.plan
 

Subclasses of ExprNodeDesc in org.apache.hadoop.hive.ql.plan
 class ExprNodeColumnDesc
          ExprNodeColumnDesc.
 class ExprNodeConstantDesc
          A constant expression.
 class ExprNodeFieldDesc
          ExprNodeFieldDesc.
 class ExprNodeGenericFuncDesc
          Describes a GenericFunc node.
 class ExprNodeNullDesc
          ExprNodeNullDesc.
 

Methods in org.apache.hadoop.hive.ql.plan that return ExprNodeDesc
 ExprNodeDesc ExprNodeConstantDesc.clone()
           
 ExprNodeDesc ExprNodeGenericFuncDesc.clone()
           
 ExprNodeDesc ExprNodeFieldDesc.clone()
           
abstract  ExprNodeDesc ExprNodeDesc.clone()
           
 ExprNodeDesc ExprNodeNullDesc.clone()
           
 ExprNodeDesc ExprNodeColumnDesc.clone()
           
 ExprNodeDesc ExtractDesc.getCol()
           
 ExprNodeDesc ExprNodeFieldDesc.getDesc()
           
 ExprNodeDesc ExprNodeDesc.ExprNodeDescEqualityWrapper.getExprNodeDesc()
           
 ExprNodeDesc TableScanDesc.getFilterExpr()
           
 ExprNodeDesc FilterDesc.getPredicate()
           
static ExprNodeDesc ExprNodeDescUtils.mergePredicates(ExprNodeDesc prev, ExprNodeDesc next)
          Binds two predicates together with an AND operator.
static ExprNodeDesc ExprNodeDescUtils.replace(ExprNodeDesc origin, List<ExprNodeDesc> sources, List<ExprNodeDesc> targets)
           
 

Methods in org.apache.hadoop.hive.ql.plan that return types with arguments of type ExprNodeDesc
 List<ExprNodeDesc> ExprNodeGenericFuncDesc.getChildExprs()
           
 List<ExprNodeDesc> ExprNodeGenericFuncDesc.getChildren()
           
 List<ExprNodeDesc> ExprNodeFieldDesc.getChildren()
           
 List<ExprNodeDesc> ExprNodeDesc.getChildren()
           
 List<ExprNodeDesc> SelectDesc.getColList()
           
 Map<Byte,List<ExprNodeDesc>> JoinDesc.getExprs()
           
 Map<Byte,List<ExprNodeDesc>> HashTableSinkDesc.getExprs()
           
 Map<Byte,List<ExprNodeDesc>> JoinDesc.getFilters()
           
 Map<Byte,List<ExprNodeDesc>> HashTableSinkDesc.getFilters()
           
 ArrayList<ExprNodeDesc> ReduceSinkDesc.getKeyCols()
           
 ArrayList<ExprNodeDesc> GroupByDesc.getKeys()
           
 Map<Byte,List<ExprNodeDesc>> HashTableSinkDesc.getKeys()
           
 Map<Byte,List<ExprNodeDesc>> MapJoinDesc.getKeys()
           
 ArrayList<ExprNodeDesc> AggregationDesc.getParameters()
           
 ArrayList<ExprNodeDesc> FileSinkDesc.getPartitionCols()
           
 ArrayList<ExprNodeDesc> ReduceSinkDesc.getPartitionCols()
           
 ArrayList<ExprNodeDesc> ReduceSinkDesc.getValueCols()
           
 

Methods in org.apache.hadoop.hive.ql.plan with parameters of type ExprNodeDesc
static boolean ExprNodeDescUtils.containsPredicate(ExprNodeDesc source, ExprNodeDesc predicate)
          Returns true if the predicate is already included in the source expression.
static int ExprNodeDescUtils.indexOf(ExprNodeDesc origin, List<ExprNodeDesc> sources)
           
static ExprNodeDesc ExprNodeDescUtils.mergePredicates(ExprNodeDesc prev, ExprNodeDesc next)
          Binds two predicates together with an AND operator (see the sketch below).
static ExprNodeDesc ExprNodeDescUtils.replace(ExprNodeDesc origin, List<ExprNodeDesc> sources, List<ExprNodeDesc> targets)
           
 void ExtractDesc.setCol(ExprNodeDesc col)
           
 void ExprNodeFieldDesc.setDesc(ExprNodeDesc desc)
           
 void ExprNodeDesc.ExprNodeDescEqualityWrapper.setExprNodeDesc(ExprNodeDesc exprNodeDesc)
           
 void TableScanDesc.setFilterExpr(ExprNodeDesc filterExpr)
           
 void FilterDesc.setPredicate(ExprNodeDesc predicate)
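
A rough sketch (not from the Javadoc) of ExprNodeDescUtils.mergePredicates and containsPredicate. The column names, alias and constants are made up, and getFuncExprNodeDesc from org.apache.hadoop.hive.ql.parse is used only to build the conjuncts out of the ExprNodeDesc subclasses listed above.

    import org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory;
    import org.apache.hadoop.hive.ql.plan.ExprNodeColumnDesc;
    import org.apache.hadoop.hive.ql.plan.ExprNodeConstantDesc;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDescUtils;
    import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory;

    public class MergePredicatesSketch {
      public static void main(String[] args) throws Exception {
        ExprNodeDesc a = new ExprNodeColumnDesc(TypeInfoFactory.intTypeInfo, "a", "t", false);
        ExprNodeDesc b = new ExprNodeColumnDesc(TypeInfoFactory.intTypeInfo, "b", "t", false);
        ExprNodeDesc p1 = TypeCheckProcFactory.DefaultExprProcessor
            .getFuncExprNodeDesc(">", a, new ExprNodeConstantDesc(10));
        ExprNodeDesc p2 = TypeCheckProcFactory.DefaultExprProcessor
            .getFuncExprNodeDesc("<", b, new ExprNodeConstantDesc(20));

        // Bind the two conjuncts with AND, then check that the first one is contained.
        ExprNodeDesc merged = ExprNodeDescUtils.mergePredicates(p1, p2);
        System.out.println(merged.getExprString());
        System.out.println(ExprNodeDescUtils.containsPredicate(merged, p1)); // expected: true
      }
    }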
           
 

Method parameters in org.apache.hadoop.hive.ql.plan with type arguments of type ExprNodeDesc
static List<FieldSchema> PlanUtils.getFieldSchemasFromColumnList(List<ExprNodeDesc> cols, List<String> outputColumnNames, int start, String fieldPrefix)
          Convert the ColumnList to FieldSchema list.
static List<FieldSchema> PlanUtils.getFieldSchemasFromColumnList(List<ExprNodeDesc> cols, String fieldPrefix)
          Convert the ColumnList to FieldSchema list.
static List<FieldSchema> PlanUtils.getFieldSchemasFromColumnListWithLength(List<ExprNodeDesc> cols, List<List<Integer>> distinctColIndices, List<String> outputColumnNames, int length, String fieldPrefix)
          Convert the ColumnList to FieldSchema list.
static ReduceSinkDesc PlanUtils.getReduceSinkDesc(ArrayList<ExprNodeDesc> keyCols, ArrayList<ExprNodeDesc> valueCols, List<String> outputColumnNames, boolean includeKeyCols, int tag, ArrayList<ExprNodeDesc> partitionCols, String order, int numReducers)
          Create the reduce sink descriptor.
static ReduceSinkDesc PlanUtils.getReduceSinkDesc(ArrayList<ExprNodeDesc> keyCols, ArrayList<ExprNodeDesc> valueCols, List<String> outputColumnNames, boolean includeKey, int tag, int numPartitionFields, int numReducers)
          Create the reduce sink descriptor.
static ReduceSinkDesc PlanUtils.getReduceSinkDesc(ArrayList<ExprNodeDesc> keyCols, int numKeys, ArrayList<ExprNodeDesc> valueCols, List<List<Integer>> distinctColIndices, List<String> outputKeyColumnNames, List<String> outputValueColumnNames, boolean includeKeyCols, int tag, ArrayList<ExprNodeDesc> partitionCols, String order, int numReducers)
          Create the reduce sink descriptor.
static ReduceSinkDesc PlanUtils.getReduceSinkDesc(ArrayList<ExprNodeDesc> keyCols, int numKeys, ArrayList<ExprNodeDesc> valueCols, List<List<Integer>> distinctColIndices, List<String> outputKeyColumnNames, List<String> outputValueColumnNames, boolean includeKey, int tag, int numPartitionFields, int numReducers)
          Create the reduce sink descriptor.
static int ExprNodeDescUtils.indexOf(ExprNodeDesc origin, List<ExprNodeDesc> sources)
           
static ExprNodeGenericFuncDesc ExprNodeGenericFuncDesc.newInstance(GenericUDF genericUDF, List<ExprNodeDesc> children)
          Creates an ExprNodeGenericFuncDesc from the given GenericUDF and the children parameters.
static ExprNodeDesc ExprNodeDescUtils.replace(ExprNodeDesc origin, List<ExprNodeDesc> sources, List<ExprNodeDesc> targets)
           
 void ExprNodeGenericFuncDesc.setChildExprs(List<ExprNodeDesc> children)
           
 void SelectDesc.setColList(List<ExprNodeDesc> colList)
           
 void JoinDesc.setExprs(Map<Byte,List<ExprNodeDesc>> exprs)
           
 void HashTableSinkDesc.setExprs(Map<Byte,List<ExprNodeDesc>> exprs)
           
 void JoinDesc.setFilters(Map<Byte,List<ExprNodeDesc>> filters)
           
 void HashTableSinkDesc.setFilters(Map<Byte,List<ExprNodeDesc>> filters)
           
 void ReduceSinkDesc.setKeyCols(ArrayList<ExprNodeDesc> keyCols)
           
 void GroupByDesc.setKeys(ArrayList<ExprNodeDesc> keys)
           
 void HashTableSinkDesc.setKeys(Map<Byte,List<ExprNodeDesc>> keys)
           
 void MapJoinDesc.setKeys(Map<Byte,List<ExprNodeDesc>> keys)
           
 void AggregationDesc.setParameters(ArrayList<ExprNodeDesc> parameters)
           
 void FileSinkDesc.setPartitionCols(ArrayList<ExprNodeDesc> partitionCols)
           
 void ReduceSinkDesc.setPartitionCols(ArrayList<ExprNodeDesc> partitionCols)
           
 void ReduceSinkDesc.setValueCols(ArrayList<ExprNodeDesc> valueCols)
           
 

Constructors in org.apache.hadoop.hive.ql.plan with parameters of type ExprNodeDesc
ExprNodeDesc.ExprNodeDescEqualityWrapper(ExprNodeDesc exprNodeDesc)
           
ExprNodeFieldDesc(TypeInfo typeInfo, ExprNodeDesc desc, String fieldName, Boolean isList)
           
ExtractDesc(ExprNodeDesc col)
           
FilterDesc(ExprNodeDesc predicate, boolean isSamplingPred)
           
FilterDesc(ExprNodeDesc predicate, boolean isSamplingPred, FilterDesc.sampleDesc sampleDescr)
           
 

Constructor parameters in org.apache.hadoop.hive.ql.plan with type arguments of type ExprNodeDesc
AggregationDesc(String genericUDAFName, GenericUDAFEvaluator genericUDAFEvaluator, ArrayList<ExprNodeDesc> parameters, boolean distinct, GenericUDAFEvaluator.Mode mode)
           
ExprNodeGenericFuncDesc(ObjectInspector oi, GenericUDF genericUDF, List<ExprNodeDesc> children)
           
ExprNodeGenericFuncDesc(TypeInfo typeInfo, GenericUDF genericUDF, List<ExprNodeDesc> children)
           
FileSinkDesc(String dirName, TableDesc tableInfo, boolean compressed, int destTableId, boolean multiFileSpray, int numFiles, int totalFiles, ArrayList<ExprNodeDesc> partitionCols, DynamicPartitionCtx dpCtx)
           
GroupByDesc(GroupByDesc.Mode mode, ArrayList<String> outputColumnNames, ArrayList<ExprNodeDesc> keys, ArrayList<AggregationDesc> aggregators, boolean groupKeyNotReductionKey, boolean bucketGroup, float groupByMemoryUsage, float memoryThreshold, List<Integer> listGroupingSets, boolean groupingSetsPresent, int groupingSetsPosition)
           
GroupByDesc(GroupByDesc.Mode mode, ArrayList<String> outputColumnNames, ArrayList<ExprNodeDesc> keys, ArrayList<AggregationDesc> aggregators, boolean groupKeyNotReductionKey, float groupByMemoryUsage, float memoryThreshold, List<Integer> listGroupingSets, boolean groupingSetsPresent, int groupingSetsPosition)
           
JoinDesc(Map<Byte,List<ExprNodeDesc>> exprs, List<String> outputColumnNames)
           
JoinDesc(Map<Byte,List<ExprNodeDesc>> exprs, List<String> outputColumnNames, boolean noOuterJoin, JoinCondDesc[] conds)
           
JoinDesc(Map<Byte,List<ExprNodeDesc>> exprs, List<String> outputColumnNames, boolean noOuterJoin, JoinCondDesc[] conds, Map<Byte,List<ExprNodeDesc>> filters)
           
JoinDesc(Map<Byte,List<ExprNodeDesc>> exprs, List<String> outputColumnNames, JoinCondDesc[] conds)
           
MapJoinDesc(Map<Byte,List<ExprNodeDesc>> keys, TableDesc keyTblDesc, Map<Byte,List<ExprNodeDesc>> values, List<TableDesc> valueTblDescs, List<TableDesc> valueFilteredTblDescs, List<String> outputColumnNames, int posBigTable, JoinCondDesc[] conds, Map<Byte,List<ExprNodeDesc>> filters, boolean noOuterJoin, String dumpFilePrefix)
           
ReduceSinkDesc(ArrayList<ExprNodeDesc> keyCols, int numDistributionKeys, ArrayList<ExprNodeDesc> valueCols, ArrayList<String> outputKeyColumnNames, List<List<Integer>> distinctColumnIndices, ArrayList<String> outputValueColumnNames, int tag, ArrayList<ExprNodeDesc> partitionCols, int numReducers, TableDesc keySerializeInfo, TableDesc valueSerializeInfo)
           
SelectDesc(List<ExprNodeDesc> colList, boolean selectStar, boolean selStarNoCompute)
           
SelectDesc(List<ExprNodeDesc> colList, List<String> outputColumnNames)
           
SelectDesc(List<ExprNodeDesc> colList, List<String> outputColumnNames, boolean selectStar)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.ppd
 

Methods in org.apache.hadoop.hive.ql.ppd that return ExprNodeDesc
 ExprNodeDesc ExprWalkerInfo.getConvertedNode(Node nd)
           
 

Methods in org.apache.hadoop.hive.ql.ppd that return types with arguments of type ExprNodeDesc
 Map<String,List<ExprNodeDesc>> ExprWalkerInfo.getFinalCandidates()
          Returns the list of pushdown expressions for each alias that appear in the current operator's RowResolver.
 Map<ExprNodeDesc,ExprNodeDesc> ExprWalkerInfo.getNewToOldExprMap()
           
 Map<String,List<ExprNodeDesc>> ExprWalkerInfo.getNonFinalCandidates()
          Returns the list of non-final candidate predicates for each alias.
 

Methods in org.apache.hadoop.hive.ql.ppd with parameters of type ExprNodeDesc
 void ExprWalkerInfo.addAlias(ExprNodeDesc expr, String alias)
          Adds the specified alias to the specified expr.
 void ExprWalkerInfo.addConvertedNode(ExprNodeDesc oldNode, ExprNodeDesc newNode)
          Adds a replacement node for this expression.
 void ExprWalkerInfo.addFinalCandidate(ExprNodeDesc expr)
          Adds the specified expr as the top-most pushdown expr (i.e., all its children can be pushed).
 void ExprWalkerInfo.addNonFinalCandidate(ExprNodeDesc expr)
          Adds the specified expr as a non-final candidate.
static ExprWalkerInfo ExprWalkerProcFactory.extractPushdownPreds(OpWalkerInfo opContext, Operator<? extends OperatorDesc> op, ExprNodeDesc pred)
           
 String ExprWalkerInfo.getAlias(ExprNodeDesc expr)
          Returns the alias of the specified expr.
 boolean ExprWalkerInfo.isCandidate(ExprNodeDesc expr)
          Returns true if the specified expression is a pushdown candidate, else false.
 void ExprWalkerInfo.setIsCandidate(ExprNodeDesc expr, boolean b)
          Marks whether the specified expr is a pushdown candidate, as given by the specified value.
 

Method parameters in org.apache.hadoop.hive.ql.ppd with type arguments of type ExprNodeDesc
 void ExprWalkerInfo.addPushDowns(String alias, List<ExprNodeDesc> pushDowns)
          Adds the passed list of pushDowns for the alias.
static ExprWalkerInfo ExprWalkerProcFactory.extractPushdownPreds(OpWalkerInfo opContext, Operator<? extends OperatorDesc> op, List<ExprNodeDesc> preds)
          Extracts pushdown predicates from the given list of predicate expressions.
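
A hedged sketch (not part of the generated listing) of driving the predicate-pushdown walker for one operator; opContext, filterOp and predicate are assumed to come from an ongoing pushdown walk, and the class name PushdownSketch is made up.

    import java.util.List;
    import java.util.Map;

    import org.apache.hadoop.hive.ql.exec.Operator;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
    import org.apache.hadoop.hive.ql.plan.OperatorDesc;
    import org.apache.hadoop.hive.ql.ppd.ExprWalkerInfo;
    import org.apache.hadoop.hive.ql.ppd.ExprWalkerProcFactory;
    import org.apache.hadoop.hive.ql.ppd.OpWalkerInfo;

    public class PushdownSketch {
      // Print which parts of 'predicate' can be pushed below 'filterOp',
      // grouped by the alias they refer to.
      public static void listPushdowns(OpWalkerInfo opContext,
          Operator<? extends OperatorDesc> filterOp, ExprNodeDesc predicate) throws Exception {
        ExprWalkerInfo info =
            ExprWalkerProcFactory.extractPushdownPreds(opContext, filterOp, predicate);
        for (Map.Entry<String, List<ExprNodeDesc>> e : info.getFinalCandidates().entrySet()) {
          for (ExprNodeDesc cand : e.getValue()) {
            System.out.println(e.getKey() + ": " + cand.getExprString());
          }
        }
      }
    }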
 



Copyright © 2011 The Apache Software Foundation