java.lang.Object
  org.apache.hadoop.mapreduce.InputFormat<ImmutableBytesWritable,Result>
    org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
      org.apache.hadoop.hbase.mapreduce.TableInputFormat

@InterfaceAudience.Public
@InterfaceStability.Stable
public class TableInputFormat
extends TableInputFormatBase
implements org.apache.hadoop.conf.Configurable
Convert HBase tabular data into a format that is consumable by Map/Reduce.
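The snippet below is a minimal sketch of how a job can be wired to this input format purely through the configuration keys listed in the field summary; the table name "my_table", the column family "cf", and the job name are placeholders, and the mapper/output setup is elided. In practice the convenience methods in TableMapReduceUtil (referenced under the SCAN field below) are often used instead of setting these keys by hand.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.hadoop.mapreduce.Job;

public class TableInputFormatJobSketch {
  public static void main(String[] args) throws Exception {
    // All literal values here are placeholders for illustration.
    Configuration conf = HBaseConfiguration.create();
    conf.set(TableInputFormat.INPUT_TABLE, "my_table");   // table to read
    conf.set(TableInputFormat.SCAN_COLUMN_FAMILY, "cf");  // restrict the scan to one family

    Job job = Job.getInstance(conf, "read-from-hbase");
    // Keys arrive as ImmutableBytesWritable row keys and values as Result rows,
    // matching InputFormat<ImmutableBytesWritable,Result> in the hierarchy above.
    job.setInputFormatClass(TableInputFormat.class);
    // ... configure mapper, reducer, and output as usual, then submit.
    job.waitForCompletion(true);
  }
}
```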
Field Summary

| Modifier and Type | Field | Description |
|---|---|---|
| static String | INPUT_TABLE | Job parameter that specifies the input table. |
| static String | SCAN | Base-64 encoded scanner. |
| static String | SCAN_BATCHSIZE | Set the maximum number of values to return for each call to next(). |
| static String | SCAN_CACHEBLOCKS | Set to false to disable server-side caching of blocks for this scan. |
| static String | SCAN_CACHEDROWS | The number of rows for caching that will be passed to scanners. |
| static String | SCAN_COLUMN_FAMILY | Column family to scan. |
| static String | SCAN_COLUMNS | Space-delimited list of columns and column families to scan. |
| static String | SCAN_MAXVERSIONS | The maximum number of versions to return. |
| static String | SCAN_ROW_START | Scan start row. |
| static String | SCAN_ROW_STOP | Scan stop row. |
| static String | SCAN_TIMERANGE_END | The ending timestamp used to filter columns with a specific range of versions. |
| static String | SCAN_TIMERANGE_START | The starting timestamp used to filter columns with a specific range of versions. |
| static String | SCAN_TIMESTAMP | The timestamp used to filter columns with a specific timestamp. |
Fields inherited from class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase

INPUT_AUTOBALANCE_MAXSKEWRATIO, MAPREDUCE_INPUT_AUTOBALANCE, TABLE_ROW_TEXTKEY
Constructor Summary

TableInputFormat()
Method Summary

| Modifier and Type | Method | Description |
|---|---|---|
| static void | addColumns(Scan scan, byte[][] columns) | Adds an array of columns specified using the old format, family:qualifier. |
| static void | configureSplitTable(org.apache.hadoop.mapreduce.Job job, TableName tableName) | Sets the split table in the map-reduce job. |
| org.apache.hadoop.conf.Configuration | getConf() | Returns the current configuration. |
| protected Pair<byte[][],byte[][]> | getStartEndKeys() | |
| void | setConf(org.apache.hadoop.conf.Configuration configuration) | Sets the configuration. |
Methods inherited from class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase

calculateRebalancedSplits, createRecordReader, getHTable, getScan, getSplitKey, getSplits, includeRegionInSplit, reverseDNS, setHTable, setScan, setTableRecordReader
Methods inherited from class java.lang.Object

clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Field Detail
public static final String INPUT_TABLE
public static final String SCAN
Base-64 encoded scanner. See TableMapReduceUtil.convertScanToString(Scan) for more details.
public static final String SCAN_ROW_START
public static final String SCAN_ROW_STOP
public static final String SCAN_COLUMN_FAMILY
public static final String SCAN_COLUMNS
public static final String SCAN_TIMESTAMP
public static final String SCAN_TIMERANGE_START
public static final String SCAN_TIMERANGE_END
public static final String SCAN_MAXVERSIONS
public static final String SCAN_CACHEBLOCKS
public static final String SCAN_CACHEDROWS
public static final String SCAN_BATCHSIZE
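All of the SCAN_* keys above are ordinary string-valued Configuration properties, so a scan can be shaped without building a Scan object in code. A sketch under that reading, with every literal value (table name, row bounds, cache sizes) chosen purely for illustration:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;

public class ScanConfigSketch {
  // Builds a Configuration describing a bounded, single-version scan.
  static Configuration boundedScanConf() {
    Configuration conf = HBaseConfiguration.create();
    conf.set(TableInputFormat.INPUT_TABLE, "web_pages");     // placeholder table name
    conf.set(TableInputFormat.SCAN_ROW_START, "row-00000");  // placeholder start row
    conf.set(TableInputFormat.SCAN_ROW_STOP, "row-99999");   // placeholder stop row
    conf.set(TableInputFormat.SCAN_MAXVERSIONS, "1");        // newest version only
    conf.set(TableInputFormat.SCAN_CACHEDROWS, "500");       // rows passed to scanners for caching
    conf.set(TableInputFormat.SCAN_CACHEBLOCKS, "false");    // disable server-side block caching
    return conf;
  }
}
```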
Constructor Detail
public TableInputFormat()
Method Detail
public org.apache.hadoop.conf.Configuration getConf()
Returns the current configuration.
Specified by: getConf in interface org.apache.hadoop.conf.Configurable
See Also: Configurable.getConf()
public void setConf(org.apache.hadoop.conf.Configuration configuration)
Sets the configuration.
Specified by: setConf in interface org.apache.hadoop.conf.Configurable
Parameters:
configuration - The configuration to set.
See Also: Configurable.setConf(org.apache.hadoop.conf.Configuration)
public static void addColumns(Scan scan, byte[][] columns)
Adds an array of columns specified using the old format, family:qualifier. Overrides previous calls to Scan.addColumn(byte[], byte[]) for any families in the input.
Parameters:
scan - The Scan to update.
columns - array of columns, formatted as family:qualifier.
See Also: Scan.addColumn(byte[], byte[])
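A short sketch of the old-format column array this method expects; the family "cf" and the qualifiers "name" and "age" are placeholders:

```java
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.hadoop.hbase.util.Bytes;

public class AddColumnsSketch {
  static Scan twoColumnScan() {
    Scan scan = new Scan();
    // Each entry uses the old "family:qualifier" form described above.
    byte[][] columns = {
        Bytes.toBytes("cf:name"),
        Bytes.toBytes("cf:age")
    };
    TableInputFormat.addColumns(scan, columns);
    return scan;
  }
}
```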
protected Pair<byte[][],byte[][]> getStartEndKeys() throws IOException
Overrides: getStartEndKeys in class TableInputFormatBase
Throws: IOException
public static void configureSplitTable(org.apache.hadoop.mapreduce.Job job, TableName tableName)
Sets the split table in the map-reduce job.
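A hedged sketch of calling this helper; "index_table" and the job name are placeholders, and the only effect assumed is the one stated above, namely that the split table is recorded in the map-reduce job:

```java
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.hadoop.mapreduce.Job;

public class SplitTableSketch {
  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(HBaseConfiguration.create(), "split-table-sketch");
    // Records "index_table" (a placeholder name) as the job's split table.
    TableInputFormat.configureSplitTable(job, TableName.valueOf("index_table"));
  }
}
```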