Packages that use Table | |
---|---|
org.apache.hadoop.hive.ql.exec | Hive QL execution tasks, operators, functions and other handlers. |
org.apache.hadoop.hive.ql.hooks | |
org.apache.hadoop.hive.ql.index | |
org.apache.hadoop.hive.ql.index.compact | |
org.apache.hadoop.hive.ql.lockmgr | Hive Lock Manager interfaces and some custom implementations. |
org.apache.hadoop.hive.ql.metadata | |
org.apache.hadoop.hive.ql.optimizer.ppr | |
org.apache.hadoop.hive.ql.parse | |
org.apache.hadoop.hive.ql.plan | |
org.apache.hadoop.hive.ql.security.authorization | |
Uses of Table in org.apache.hadoop.hive.ql.exec
---

Methods in org.apache.hadoop.hive.ql.exec with parameters of type Table | |
---|---|
static void | Utilities.addMapWork(MapredWork mr, Table tbl, String alias, Operator<?> work) |
static TableDesc | Utilities.getTableDesc(Table tbl) |
static void | Utilities.validatePartSpec(Table tbl, Map<String,String> partSpec) |
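
The Utilities helpers above are static, so they can be called directly once a metadata Table object is in hand. A minimal caller-side sketch, assuming the Table and partition spec are supplied by the surrounding task and collapsing checked exceptions into a single throws clause:

    import java.util.Map;

    import org.apache.hadoop.hive.ql.exec.Utilities;
    import org.apache.hadoop.hive.ql.metadata.Table;
    import org.apache.hadoop.hive.ql.plan.TableDesc;

    public class UtilitiesSketch {
      // Validate a partition spec against the table's partition columns, then
      // derive the serializable TableDesc used during plan generation.
      public static TableDesc describeAndValidate(Table tbl, Map<String, String> partSpec)
          throws Exception {
        Utilities.validatePartSpec(tbl, partSpec);
        return Utilities.getTableDesc(tbl);
      }
    }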
Uses of Table in org.apache.hadoop.hive.ql.hooks
---

Methods in org.apache.hadoop.hive.ql.hooks that return Table | |
---|---|
Table | WriteEntity.getT() |
Table | ReadEntity.getT() |
Table | WriteEntity.getTable(): Get the table associated with the entity. |
Table | ReadEntity.getTable(): Get table entity. |

Methods in org.apache.hadoop.hive.ql.hooks with parameters of type Table | |
---|---|
void | WriteEntity.setT(Table t) |
void | ReadEntity.setT(Table t) |

Constructors in org.apache.hadoop.hive.ql.hooks with parameters of type Table | |
---|---|
ReadEntity(Table t) | Constructor. |
WriteEntity(Table t) | Constructor for a table. |
WriteEntity(Table t, boolean complete) | |
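
ReadEntity and WriteEntity wrap a Table so pre/post execution hooks and the authorization layer can see a query's inputs and outputs. A small sketch of wrapping a table both ways, assuming the Table has already been resolved through the metadata layer:

    import org.apache.hadoop.hive.ql.hooks.ReadEntity;
    import org.apache.hadoop.hive.ql.hooks.WriteEntity;
    import org.apache.hadoop.hive.ql.metadata.Table;

    public class EntitySketch {
      // Register the same table as a query input and as a query output.
      public static void wrap(Table tbl) {
        ReadEntity input = new ReadEntity(tbl);
        WriteEntity output = new WriteEntity(tbl);  // see also WriteEntity(Table, boolean complete)
        Table readBack = input.getTable();          // hands back the wrapped table
        System.out.println("input: " + readBack + ", output: " + output);
      }
    }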
Uses of Table in org.apache.hadoop.hive.ql.index
---

Methods in org.apache.hadoop.hive.ql.index with parameters of type Table | |
---|---|
List<Task<?>> | HiveIndexHandler.generateIndexBuildTaskList(Table baseTbl, Index index, List<Partition> indexTblPartitions, List<Partition> baseTblPartitions, Table indexTbl, Set<ReadEntity> inputs, Set<WriteEntity> outputs): Requests that the handler generate a plan for building the index; the plan should read the base table and write out the index representation. |
Uses of Table in org.apache.hadoop.hive.ql.index.compact
---

Methods in org.apache.hadoop.hive.ql.index.compact with parameters of type Table | |
---|---|
List<Task<?>> | CompactIndexHandler.generateIndexBuildTaskList(Table baseTbl, Index index, List<Partition> indexTblPartitions, List<Partition> baseTblPartitions, Table indexTbl, Set<ReadEntity> inputs, Set<WriteEntity> outputs) |
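
A caller-side sketch of the index-build entry point shared by HiveIndexHandler and its compact implementation. The handler instance, the metastore Index object, and the partition lists are assumptions supplied by the surrounding DDL work, and importing Index from the metastore API is likewise an assumption:

    import java.util.List;
    import java.util.Set;

    import org.apache.hadoop.hive.metastore.api.Index;
    import org.apache.hadoop.hive.ql.exec.Task;
    import org.apache.hadoop.hive.ql.hooks.ReadEntity;
    import org.apache.hadoop.hive.ql.hooks.WriteEntity;
    import org.apache.hadoop.hive.ql.index.HiveIndexHandler;
    import org.apache.hadoop.hive.ql.metadata.Partition;
    import org.apache.hadoop.hive.ql.metadata.Table;

    public class IndexBuildSketch {
      // Ask the handler for the tasks that rebuild the index representation from
      // the base table; the handler is expected to populate inputs/outputs with
      // the entities the generated plan reads and writes.
      public static List<Task<?>> buildTasks(HiveIndexHandler handler,
          Table baseTbl, Index index,
          List<Partition> indexTblPartitions, List<Partition> baseTblPartitions,
          Table indexTbl, Set<ReadEntity> inputs, Set<WriteEntity> outputs) throws Exception {
        return handler.generateIndexBuildTaskList(baseTbl, index,
            indexTblPartitions, baseTblPartitions, indexTbl, inputs, outputs);
      }
    }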
Uses of Table in org.apache.hadoop.hive.ql.lockmgr
---

Constructors in org.apache.hadoop.hive.ql.lockmgr with parameters of type Table | |
---|---|
HiveLockObject(Table tbl, HiveLockObject.HiveLockObjectData lockData) | |
Uses of Table in org.apache.hadoop.hive.ql.metadata
---

Methods in org.apache.hadoop.hive.ql.metadata that return Table | |
---|---|
Table | Table.copy() |
Table | Partition.getTable() |
Table | Hive.getTable(String tableName): Returns metadata for the table named tableName. |
Table | Hive.getTable(String tableName, boolean throwException): Returns metadata for the table named tableName. |
Table | Hive.getTable(String dbName, String tableName): Returns metadata of the table. |
Table | Hive.getTable(String dbName, String tableName, boolean throwException): Returns metadata of the table. |
Table | Hive.newTable(String tableName) |

Methods in org.apache.hadoop.hive.ql.metadata with parameters of type Table | |
---|---|
void | Hive.alterTable(String tblName, Table newTbl): Updates the existing table metadata with the new metadata. |
Partition | Hive.createPartition(Table tbl, Map<String,String> partSpec): Creates a partition. |
Partition | Hive.createPartition(Table tbl, Map<String,String> partSpec, org.apache.hadoop.fs.Path location): Creates a partition. |
void | Hive.createTable(Table tbl): Creates the table with the given objects. |
void | Hive.createTable(Table tbl, boolean ifNotExists): Creates the table with the given objects. |
static String | MetaDataFormatUtils.getAllColumnsInformation(Table table) |
Partition | Hive.getPartition(Table tbl, Map<String,String> partSpec, boolean forceCreate) |
Partition | Hive.getPartition(Table tbl, Map<String,String> partSpec, boolean forceCreate, String partPath): Returns partition metadata. |
List<Partition> | Hive.getPartitions(Table tbl): Get all the partitions that the table has. |
List<Partition> | Hive.getPartitions(Table tbl, Map<String,String> partialPartSpec): Get all the partitions of the table that match the given partial specification. |
List<Partition> | Hive.getPartitionsByNames(Table tbl, Map<String,String> partialPartSpec): Get all the partitions of the table that match the given partial specification. |
static String | MetaDataFormatUtils.getTableInformation(Table table) |
void | Partition.setTable(Table table): Should only be used by serialization. |
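
The Hive facade above is the usual route from a table name to a Table object and its partitions during query compilation. A minimal sketch, assuming a session-level HiveConf and the static Hive.get(conf) factory (neither is part of this page):

    import java.util.List;

    import org.apache.hadoop.hive.conf.HiveConf;
    import org.apache.hadoop.hive.ql.metadata.Hive;
    import org.apache.hadoop.hive.ql.metadata.HiveException;
    import org.apache.hadoop.hive.ql.metadata.MetaDataFormatUtils;
    import org.apache.hadoop.hive.ql.metadata.Partition;
    import org.apache.hadoop.hive.ql.metadata.Table;

    public class MetadataSketch {
      // Resolve a table, list its partitions, and render a DESCRIBE-style summary.
      public static String describe(HiveConf conf, String dbName, String tableName)
          throws HiveException {
        Hive db = Hive.get(conf);                       // assumed per-session Hive handle
        Table tbl = db.getTable(dbName, tableName);     // metadata for dbName.tableName
        List<Partition> parts = db.getPartitions(tbl);  // partition metadata for that table
        System.out.println(tableName + " has " + parts.size() + " partition object(s)");
        return MetaDataFormatUtils.getTableInformation(tbl);
      }
    }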
Constructors in org.apache.hadoop.hive.ql.metadata with parameters of type Table | |
---|---|
DummyPartition(Table tbl, String name) | |
Partition(Table tbl) | Create an empty partition. |
Partition(Table tbl, Map<String,String> partSpec, org.apache.hadoop.fs.Path location) | Create partition object with the given info. |
Partition(Table tbl, Partition tp) | |
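
Constructing Partition objects directly is mostly the business of the metadata and serialization layers; a sketch of the partSpec/location variant, where the ds partition column and the location are made-up illustrations:

    import java.util.LinkedHashMap;
    import java.util.Map;

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hive.ql.metadata.HiveException;
    import org.apache.hadoop.hive.ql.metadata.Partition;
    import org.apache.hadoop.hive.ql.metadata.Table;

    public class PartitionSketch {
      // Build an in-memory Partition object for a single ds=<value> partition of tbl.
      public static Partition dsPartition(Table tbl, String ds, Path location)
          throws HiveException {
        Map<String, String> partSpec = new LinkedHashMap<String, String>();
        partSpec.put("ds", ds);  // hypothetical partition column
        return new Partition(tbl, partSpec, location);
      }
    }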
Uses of Table in org.apache.hadoop.hive.ql.optimizer.ppr
---

Methods in org.apache.hadoop.hive.ql.optimizer.ppr with parameters of type Table | |
---|---|
static boolean | PartitionPruner.onlyContainsPartnCols(Table tab, ExprNodeDesc expr): Find out whether the condition only contains partitioned columns. |
static PrunedPartitionList | PartitionPruner.prune(Table tab, ExprNodeDesc prunerExpr, HiveConf conf, String alias, Map<String,PrunedPartitionList> prunedPartitionsMap): Get the partition list for the table that satisfies the partition pruner condition. |
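
A sketch of how an optimizer rule might drive the pruner for one table scan. The filter expression, alias, and the shared pruned-partition cache are assumed to come from the surrounding ParseContext, and PrunedPartitionList is assumed to live in org.apache.hadoop.hive.ql.parse:

    import java.util.Map;

    import org.apache.hadoop.hive.conf.HiveConf;
    import org.apache.hadoop.hive.ql.metadata.Table;
    import org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner;
    import org.apache.hadoop.hive.ql.parse.PrunedPartitionList;
    import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;

    public class PrunerSketch {
      // Report whether the predicate touches only partition columns (a cheap
      // precheck), then compute the pruned partition list for this scan.
      public static PrunedPartitionList pruneScan(Table tab, ExprNodeDesc filter,
          HiveConf conf, String alias,
          Map<String, PrunedPartitionList> cache) throws Exception {
        boolean partColsOnly = PartitionPruner.onlyContainsPartnCols(tab, filter);
        System.out.println(alias + ": predicate uses only partition columns? " + partColsOnly);
        return PartitionPruner.prune(tab, filter, conf, alias, cache);
      }
    }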
Uses of Table in org.apache.hadoop.hive.ql.parse
---

Fields in org.apache.hadoop.hive.ql.parse declared as Table | |
---|---|
Table | BaseSemanticAnalyzer.tableSpec.tableHandle |

Methods in org.apache.hadoop.hive.ql.parse that return Table | |
---|---|
Table | QBMetaData.getDestTableForAlias(String alias) |
Table | QBMetaData.getSrcForAlias(String alias) |
Table | QBMetaData.getTableForAlias(String alias) |

Methods in org.apache.hadoop.hive.ql.parse that return types with arguments of type Table | |
---|---|
HashMap<String,Table> | QBMetaData.getAliasToTable() |
HashMap<TableScanOperator,Table> | ParseContext.getTopToTable() |

Methods in org.apache.hadoop.hive.ql.parse with parameters of type Table | |
---|---|
void | QBMetaData.setDestForAlias(String alias, Table tab) |
void | QBMetaData.setSrcForAlias(String alias, Table tab) |

Method parameters in org.apache.hadoop.hive.ql.parse with type arguments of type Table | |
---|---|
void | ParseContext.setTopToTable(HashMap<TableScanOperator,Table> topToTable) |

Constructor parameters in org.apache.hadoop.hive.ql.parse with type arguments of type Table:

ParseContext(HiveConf conf,
    QB qb,
    ASTNode ast,
    HashMap<TableScanOperator,ExprNodeDesc> opToPartPruner,
    HashMap<TableScanOperator,PrunedPartitionList> opToPartList,
    HashMap<String,Operator<? extends Serializable>> topOps,
    HashMap<String,Operator<? extends Serializable>> topSelOps,
    LinkedHashMap<Operator<? extends Serializable>,OpParseContext> opParseCtx,
    Map<JoinOperator,QBJoinTree> joinContext,
    HashMap<TableScanOperator,Table> topToTable,
    List<LoadTableDesc> loadTableWork,
    List<LoadFileDesc> loadFileWork,
    Context ctx,
    HashMap<String,String> idToTableNameMap,
    int destTableId,
    UnionProcContext uCtx,
    List<AbstractMapJoinOperator<? extends MapJoinDesc>> listMapJoinOpsNoReducer,
    Map<GroupByOperator,Set<String>> groupOpToInputTables,
    Map<String,PrunedPartitionList> prunedPartitions,
    HashMap<TableScanOperator,FilterDesc.sampleDesc> opToSamplePruner)
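
QBMetaData is where the semantic analyzer records the alias-to-Table bindings listed above. A small sketch of registering a source table and resolving it again; the QBMetaData instance is assumed to come from the query block being analyzed:

    import org.apache.hadoop.hive.ql.metadata.Table;
    import org.apache.hadoop.hive.ql.parse.QBMetaData;

    public class QBMetaDataSketch {
      // Bind tbl to the alias "t" as a source table, then look it up again.
      public static Table registerAndResolve(QBMetaData qbm, Table tbl) {
        qbm.setSrcForAlias("t", tbl);
        return qbm.getSrcForAlias("t");
      }
    }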
Uses of Table in org.apache.hadoop.hive.ql.plan
---

Constructors in org.apache.hadoop.hive.ql.plan with parameters of type Table | |
---|---|
DynamicPartitionCtx(Table tbl, Map<String,String> partSpec, String defaultPartName, int maxParts) | |
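
A sketch of building the dynamic-partition context for an insert. The default partition name and the partition cap would normally be read from HiveConf (hive.exec.default.partition.name and the dynamic-partition limits); the literal values below are assumptions for illustration:

    import java.util.Map;

    import org.apache.hadoop.hive.ql.metadata.Table;
    import org.apache.hadoop.hive.ql.plan.DynamicPartitionCtx;

    public class DynamicPartitionSketch {
      // Describe how the dynamic-partition values in partSpec map onto tbl's
      // partition columns for a single INSERT.
      public static DynamicPartitionCtx forInsert(Table tbl, Map<String, String> partSpec)
          throws Exception {
        String defaultPartName = "__HIVE_DEFAULT_PARTITION__";  // assumed default partition name
        int maxParts = 1000;                                    // assumed per-query partition cap
        return new DynamicPartitionCtx(tbl, partSpec, defaultPartName, maxParts);
      }
    }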
Uses of Table in org.apache.hadoop.hive.ql.security.authorization
---

Methods in org.apache.hadoop.hive.ql.security.authorization with parameters of type Table | |
---|---|
void | HiveAuthorizationProvider.authorize(Table table, Partition part, List<String> columns, Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv): Authorization privileges against a list of columns. |
void | DefaultHiveAuthorizationProvider.authorize(Table table, Partition part, List<String> columns, Privilege[] inputRequiredPriv, Privilege[] outputRequiredPriv) |
void | HiveAuthorizationProvider.authorize(Table table, Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv): Authorization privileges against a hive table object. |
void | DefaultHiveAuthorizationProvider.authorize(Table table, Privilege[] inputRequiredPriv, Privilege[] outputRequiredPriv) |
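
A caller-side sketch of the two authorize overloads. The provider instance and the required-privilege arrays are assumed to be supplied by the session (for example a DefaultHiveAuthorizationProvider), and passing null for the write side is assumed to mean no write privileges are required:

    import java.util.List;

    import org.apache.hadoop.hive.ql.metadata.Partition;
    import org.apache.hadoop.hive.ql.metadata.Table;
    import org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider;
    import org.apache.hadoop.hive.ql.security.authorization.Privilege;

    public class AuthorizeSketch {
      // Table-level check: the caller needs readPriv on the whole table.
      public static void checkTable(HiveAuthorizationProvider authorizer, Table table,
          Privilege[] readPriv) throws Exception {
        authorizer.authorize(table, readPriv, null);
      }

      // Column-level check against one partition of the table.
      public static void checkColumns(HiveAuthorizationProvider authorizer, Table table,
          Partition part, List<String> columns, Privilege[] readPriv) throws Exception {
        authorizer.authorize(table, part, columns, readPriv, null);
      }
    }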