Packages that use Scan | |
---|---|
org.apache.hadoop.hbase.avro | Provides an HBase Avro service. |
org.apache.hadoop.hbase.catalog | |
org.apache.hadoop.hbase.client | Provides the HBase client. |
org.apache.hadoop.hbase.ipc | Tools to help define network clients and servers. |
org.apache.hadoop.hbase.mapreduce | Provides HBase MapReduce Input/OutputFormats, a table indexing MapReduce job, and utility methods. |
org.apache.hadoop.hbase.regionserver | |
org.apache.hadoop.hbase.rest.client | |
org.apache.hadoop.hbase.rest.model | |
Uses of Scan in org.apache.hadoop.hbase.avro |
---|
Methods in org.apache.hadoop.hbase.avro that return Scan | |
---|---|
static Scan | AvroUtil.ascanToScan(AScan ascan) |
Uses of Scan in org.apache.hadoop.hbase.catalog |
---|
Methods in org.apache.hadoop.hbase.catalog that return Scan | |
---|---|
static Scan | MetaReader.getScanForTableName(byte[] tableName): Creates a Scan that returns only the catalog rows belonging to the specified table. |
Uses of Scan in org.apache.hadoop.hbase.client |
---|
Methods in org.apache.hadoop.hbase.client that return Scan | |
---|---|
Scan | Scan.addColumn(byte[] familyAndQualifier): Deprecated. Use addColumn(byte[], byte[]) instead. |
Scan | Scan.addColumn(byte[] family, byte[] qualifier): Get the column from the specified family with the specified qualifier. |
Scan | Scan.addColumns(byte[][] columns): Deprecated. Issue multiple addColumn(byte[], byte[]) calls instead. |
Scan | Scan.addColumns(String columns): Deprecated. Use addColumn(byte[], byte[]) instead. |
Scan | Scan.addFamily(byte[] family): Get all columns from the specified family. |
protected Scan | HTable.ClientScanner.getScan() |
protected Scan | ScannerCallable.getScan() |
Scan | Scan.setFamilyMap(Map<byte[],NavigableSet<byte[]>> familyMap): Set the familyMap directly. |
Scan | Scan.setFilter(Filter filter): Apply the specified server-side filter when performing the Scan. |
Scan | Scan.setMaxVersions(): Get all available versions. |
Scan | Scan.setMaxVersions(int maxVersions): Get up to the specified number of versions of each column. |
Scan | Scan.setStartRow(byte[] startRow): Set the start row of the scan. |
Scan | Scan.setStopRow(byte[] stopRow): Set the stop row. |
Scan | Scan.setTimeRange(long minStamp, long maxStamp): Get versions of columns only within the specified timestamp range, [minStamp, maxStamp). |
Scan | Scan.setTimeStamp(long timestamp): Get versions of columns with the specified timestamp. |
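The setters above each return the Scan itself, so a scan can be configured in a chained, builder-like style. A minimal sketch (the table layout, family `cf`, qualifier `q`, and row keys are hypothetical placeholders; building the Scan needs only the HBase client jar, no running cluster):

```java
import java.io.IOException;

import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

public class ScanSetup {
    public static Scan buildScan() throws IOException {
        Scan scan = new Scan();
        // Each setter returns this, so calls can be chained.
        scan.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q")) // preferred over the deprecated addColumn/addColumns variants
            .setStartRow(Bytes.toBytes("row-0000"))             // inclusive start row
            .setStopRow(Bytes.toBytes("row-9999"))              // exclusive stop row
            .setTimeRange(0L, Long.MAX_VALUE)                   // half-open interval [minStamp, maxStamp)
            .setMaxVersions(3);                                 // up to 3 versions of each column
        return scan;
    }
}
```

Note that `setTimeRange` declares `IOException` (thrown when `minStamp > maxStamp`), so callers must handle or propagate it.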
Methods in org.apache.hadoop.hbase.client with parameters of type Scan | |
---|---|
ResultScanner | HTable.getScanner(Scan scan) |
ResultScanner | HTableInterface.getScanner(Scan scan): Returns a scanner on the current table as specified by the Scan object. |
Constructors in org.apache.hadoop.hbase.client with parameters of type Scan | |
---|---|
HTable.ClientScanner(Scan scan) | |
Scan(Scan scan) | Creates a new instance of this class while copying all values. |
ScannerCallable(HConnection connection, byte[] tableName, Scan scan) | |
Uses of Scan in org.apache.hadoop.hbase.ipc |
---|
Methods in org.apache.hadoop.hbase.ipc with parameters of type Scan | |
---|---|
long | HRegionInterface.openScanner(byte[] regionName, Scan scan): Opens a remote scanner with a RowFilter. |
Uses of Scan in org.apache.hadoop.hbase.mapreduce |
---|
Methods in org.apache.hadoop.hbase.mapreduce that return Scan | |
---|---|
Scan | TableInputFormatBase.getScan(): Gets the scan defining the actual details, like columns, etc. |
Methods in org.apache.hadoop.hbase.mapreduce with parameters of type Scan | |
---|---|
static void | IdentityTableMapper.initJob(String table, Scan scan, Class<? extends TableMapper> mapper, org.apache.hadoop.mapreduce.Job job): Use this before submitting a TableMap job. |
static void | GroupingTableMapper.initJob(String table, Scan scan, String groupColumns, Class<? extends TableMapper> mapper, org.apache.hadoop.mapreduce.Job job): Use this before submitting a TableMap job. |
static void | TableMapReduceUtil.initTableMapperJob(String table, Scan scan, Class<? extends TableMapper> mapper, Class<? extends org.apache.hadoop.io.WritableComparable> outputKeyClass, Class<? extends org.apache.hadoop.io.Writable> outputValueClass, org.apache.hadoop.mapreduce.Job job): Use this before submitting a TableMap job. |
static void | TableMapReduceUtil.initTableMapperJob(String table, Scan scan, Class<? extends TableMapper> mapper, Class<? extends org.apache.hadoop.io.WritableComparable> outputKeyClass, Class<? extends org.apache.hadoop.io.Writable> outputValueClass, org.apache.hadoop.mapreduce.Job job, boolean addDependencyJars): Use this before submitting a TableMap job. |
void | TableRecordReader.setScan(Scan scan): Sets the scan defining the actual details, like columns, etc. |
void | TableRecordReaderImpl.setScan(Scan scan): Sets the scan defining the actual details, like columns, etc. |
void | TableInputFormatBase.setScan(Scan scan): Sets the scan defining the actual details, like columns, etc. |
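A hedged sketch of wiring a Scan into a TableMapper job via TableMapReduceUtil.initTableMapperJob. The table name, column family, and the row-counting mapper are illustrative placeholders; running it requires a configured Hadoop/HBase deployment:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.mapreduce.Job;

public class RowCountJob {
    // Minimal illustrative mapper: emits one count per row the scan delivers.
    static class CountMapper extends TableMapper<ImmutableBytesWritable, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        @Override
        protected void map(ImmutableBytesWritable row, Result value, Context context)
                throws IOException, InterruptedException {
            context.write(row, ONE);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = new Job(conf, "rowcount");
        job.setJarByClass(RowCountJob.class);
        Scan scan = new Scan();
        scan.addFamily(Bytes.toBytes("cf"));   // hypothetical family; narrowing the scan limits mapper input
        // Configures the input format, mapper class, and output key/value types in one call.
        TableMapReduceUtil.initTableMapperJob(
                "mytable", scan, CountMapper.class,
                ImmutableBytesWritable.class, IntWritable.class, job);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```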
Uses of Scan in org.apache.hadoop.hbase.regionserver |
---|
Methods in org.apache.hadoop.hbase.regionserver with parameters of type Scan | |
---|---|
InternalScanner | HRegion.getScanner(Scan scan): Return an iterator that scans over the HRegion, returning the indicated columns and rows specified by the Scan. |
protected InternalScanner | HRegion.getScanner(Scan scan, List<KeyValueScanner> additionalScanners) |
KeyValueScanner | Store.getScanner(Scan scan, NavigableSet<byte[]> targetCols): Return a scanner for both the memstore and the HStore files. |
protected InternalScanner | HRegion.instantiateInternalScanner(Scan scan, List<KeyValueScanner> additionalScanners) |
long | HRegionServer.openScanner(byte[] regionName, Scan scan) |
boolean | MemStore.shouldSeek(Scan scan): Check if this memstore may contain the required keys. |
boolean | StoreFile.Reader.shouldSeek(Scan scan, SortedSet<byte[]> columns) |
Constructors in org.apache.hadoop.hbase.regionserver with parameters of type Scan | |
---|---|
ScanQueryMatcher(Scan scan, byte[] family, NavigableSet<byte[]> columns, long ttl, KeyValue.KeyComparator rowComparator, int maxVersions) | |
ScanQueryMatcher(Scan scan, byte[] family, NavigableSet<byte[]> columns, long ttl, KeyValue.KeyComparator rowComparator, int maxVersions, boolean retainDeletesInOutput) | Constructs a ScanQueryMatcher for a Scan. |
Uses of Scan in org.apache.hadoop.hbase.rest.client |
---|
Methods in org.apache.hadoop.hbase.rest.client with parameters of type Scan | |
---|---|
ResultScanner | RemoteHTable.getScanner(Scan scan) |
Uses of Scan in org.apache.hadoop.hbase.rest.model |
---|
Methods in org.apache.hadoop.hbase.rest.model with parameters of type Scan | |
---|---|
static ScannerModel | ScannerModel.fromScan(Scan scan) |