A Base64.Base64InputStream will read data from another InputStream, given in the constructor, and encode/decode to/from Base64 notation on the fly.
Constructs a Base64.Base64InputStream in DECODE mode.
Constructs a Base64.Base64InputStream in either ENCODE or DECODE mode.
A Base64.Base64OutputStream will write data to another OutputStream, given in the constructor, and encode/decode to/from Base64 notation on the fly.
Constructs a Base64.Base64OutputStream in ENCODE mode.
Constructs a Base64.Base64OutputStream in either ENCODE or DECODE mode.
For use with MapFile.getClosest when searching for the key that comes BEFORE this one but NOT this one.
An FSInputStream that reads the stream in blocks of a fixed, configurable size.
Changes hbase.master from 'local' to 'localhost:PORT' in the passed Configuration instance.
Get versions of keys matching the origin key's row/column/timestamp and those of an older vintage.
- getScanner(Text[], Text, Text, long) -
Method in class org.apache.hadoop.hbase.client.HTable
- Deprecated. Use String or byte [] overload instead
- getScanner(String[], String, String, long) -
Method in class org.apache.hadoop.hbase.client.HTable
- Get a scanner on the current table starting at the specified row and
ending just before
stopRow.
- getScanner(byte[][], byte[], byte[], long) -
Method in class org.apache.hadoop.hbase.client.HTable
- Get a scanner on the current table starting at the specified row and
ending just before
stopRow.
- getScanner(Text[], Text, long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.client.HTable
- Deprecated. Use String or byte [] overload instead
- getScanner(String[], String, long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.client.HTable
- Get a scanner on the current table starting at the specified row.
- getScanner(byte[][], byte[], long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.client.HTable
- Get a scanner on the current table starting at the specified row.
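The start/stop-row overloads above scan from the start row up to, but not including, stopRow. A minimal client-side sketch of the String[] overload follows; the table name, columns, row keys, and the HBaseConfiguration/Scanner/RowResult plumbing are illustrative assumptions, not details taken from this index.

    import java.io.IOException;
    import java.util.Map;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Scanner;
    import org.apache.hadoop.hbase.io.Cell;
    import org.apache.hadoop.hbase.io.RowResult;

    public class ScanExample {
      public static void main(String[] args) throws IOException {
        HBaseConfiguration conf = new HBaseConfiguration();
        HTable table = new HTable(conf, "mytable");     // hypothetical table name
        // Scan rows in ["row-000", "row-100"), returning the newest cell versions.
        Scanner scanner = table.getScanner(
            new String[] { "info:" },                   // columns/families to return
            "row-000",                                  // start row (inclusive)
            "row-100",                                  // stop row (exclusive)
            Long.MAX_VALUE);                            // timestamp: newest versions
        try {
          RowResult row;
          // next() is assumed to return null once the scan is exhausted.
          while ((row = scanner.next()) != null) {
            for (Map.Entry<byte[], Cell> e : row.entrySet()) {
              byte[] column = e.getKey();
              byte[] value = e.getValue().getValue();   // Cell.getValue()
            }
          }
        } finally {
          scanner.close();
        }
      }
    }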
- getScanner(byte[][], byte[], long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.regionserver.HRegion
- Return an iterator that scans over the HRegion, returning the indicated
columns for only the rows that match the data filter.
- getScanner(long, byte[][], byte[], RowFilterInterface) -
Method in class org.apache.hadoop.hbase.regionserver.HStore
- Return a scanner for both the memcache and the HStore files
- getScanner(int) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
- Returns the scanner associated with the specified ID.
- getScanner(byte[][], byte[], long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.util.migration.v5.HRegion
- Return an iterator that scans over the HRegion, returning the indicated
columns for only the rows that match the data filter.
- getScanner(long, byte[][], byte[], RowFilterInterface) -
Method in class org.apache.hadoop.hbase.util.migration.v5.HStore
- Return a scanner for both the memcache and the HStore files
- getScannerCallable(byte[]) -
Method in class org.apache.hadoop.hbase.client.HTable.ClientScanner
-
- getServer(Object, String, int, Configuration) -
Static method in class org.apache.hadoop.hbase.ipc.HbaseRPC
- Construct a server for a protocol implementation instance listening on a
port and address.
- getServer(Object, String, int, int, boolean, Configuration) -
Static method in class org.apache.hadoop.hbase.ipc.HbaseRPC
- Construct a server for a protocol implementation instance listening on a
port and address.
- getServer() -
Method in class org.apache.hadoop.hbase.master.MetaRegion
-
- getServerAddress() -
Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getServerAddress() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getServerInfo() -
Method in class org.apache.hadoop.hbase.regionserver.HRegionServer
-
- getServerName() -
Method in class org.apache.hadoop.hbase.client.ServerCallable
-
- getServersToLoad() -
Method in class org.apache.hadoop.hbase.master.HMaster
-
- getServersToServerInfo() -
Method in class org.apache.hadoop.hbase.master.HMaster
-
- getServerVersion() -
Method in exception org.apache.hadoop.hbase.ipc.HbaseRPC.VersionMismatch
- Get the server's agreed to version.
- getSimilarityName() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getSize() -
Method in class org.apache.hadoop.hbase.HStoreKey
-
- getSize() -
Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- getSize() -
Method in class org.apache.hadoop.hbase.regionserver.BeforeThisStoreKey
-
- getSize() -
Method in class org.apache.hadoop.hbase.regionserver.HStore
-
- getSize() -
Method in class org.apache.hadoop.hbase.util.migration.v5.HStore
-
- getSocketFactory() -
Method in class org.apache.hadoop.ipc.HBaseClient
-
- getSortedOnlineRegionInfos() -
Method in class org.apache.hadoop.hbase.regionserver.HRegionServer
-
- getSplits(JobConf, int) -
Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
- Calculates the splits that will serve as input for the map tasks.
- getStartCode() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.master.MetaRegion
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.regionserver.HRegion
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.util.migration.v5.HRegion
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.util.migration.v5.HRegionInfo
-
- getStartKeys() -
Method in class org.apache.hadoop.hbase.client.HTable
- Gets the starting row key for every region in the currently open table
- getStartRow() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getStopRowKey() -
Method in class org.apache.hadoop.hbase.filter.StopRowFilter
- An accessor for the stopRowKey
- getStore(byte[]) -
Method in class org.apache.hadoop.hbase.regionserver.HRegion
- Return HStore instance.
- getStore(byte[]) -
Method in class org.apache.hadoop.hbase.util.migration.v5.HRegion
-
- getTable(String) -
Method in class org.apache.hadoop.hbase.rest.GenericHandler
-
- getTable(byte[]) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
- Creates and returns an HTable instance from a given table name.
- getTableDesc() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getTableDesc() -
Method in class org.apache.hadoop.hbase.regionserver.HRegion
-
- getTableDesc() -
Method in class org.apache.hadoop.hbase.util.migration.v5.HRegion
-
- getTableDesc() -
Method in class org.apache.hadoop.hbase.util.migration.v5.HRegionInfo
-
- getTableDescriptor() -
Method in class org.apache.hadoop.hbase.client.HTable
-
- getTableDir(Path, byte[]) -
Static method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getTableDir(Path, byte[]) -
Static method in class org.apache.hadoop.hbase.util.migration.v5.HTableDescriptor
-
- getTableName() -
Method in class org.apache.hadoop.hbase.client.HTable
-
- getTableName() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getTableName(String[]) -
Method in class org.apache.hadoop.hbase.rest.GenericHandler
-
- getTableNameFromRegionName(byte[]) -
Static method in class org.apache.hadoop.hbase.util.migration.v5.HRegionInfo
- Extracts table name prefix from a metaregion row name.
- getTableNames() -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getTableNames() -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- List all the userspace tables.
- getTableNames() -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getTableRegions(byte[]) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getTableRegions(byte[]) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- List the regions associated with a table.
- getTableRegions(byte[]) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.client.HTable.ClientScanner
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.client.ScannerCallable
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.HStoreKey
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.io.BatchUpdate
- Return the timestamp this BatchUpdate will be committed with.
- getTimestamp() -
Method in class org.apache.hadoop.hbase.io.Cell
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.RegionHistorian.RegionHistoryInformation
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.regionserver.BeforeThisStoreKey
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.regionserver.HLogEdit
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.util.migration.v5.HLogEdit
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.util.migration.v5.RegionHistorian.RegionHistoryInformation
-
- getTimestampAsString() -
Method in class org.apache.hadoop.hbase.RegionHistorian.RegionHistoryInformation
-
- getTimestampAsString() -
Method in class org.apache.hadoop.hbase.util.migration.v5.RegionHistorian.RegionHistoryInformation
-
- getTimeToLive() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getTimeToLive() -
Method in class org.apache.hadoop.hbase.util.migration.v5.HColumnDescriptor
-
- getType() -
Method in class org.apache.hadoop.hbase.HMsg
-
- getUrl() -
Static method in class org.apache.hadoop.hbase.util.VersionInfo
- Get the subversion URL for the root hbase directory.
- getUser() -
Static method in class org.apache.hadoop.hbase.util.VersionInfo
- The user that compiled hbase.
- getVal() -
Method in class org.apache.hadoop.hbase.regionserver.HLogEdit
-
- getVal() -
Method in class org.apache.hadoop.hbase.util.migration.v5.HLogEdit
-
- getValue(byte[]) -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getValue(String) -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getValue(byte[]) -
Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getValue(String) -
Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getValue() -
Method in class org.apache.hadoop.hbase.io.BatchOperation
-
- getValue() -
Method in class org.apache.hadoop.hbase.io.Cell
-
- getValue() -
Method in class org.apache.hadoop.hbase.io.RowResult.Entry
-
- getValues() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getValues() -
Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getVectorSize() -
Method in class org.apache.hadoop.hbase.BloomFilterDescriptor
- Deprecated.
- getVer(byte[], byte[], byte[], int) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getVer(byte[], byte[], byte[], int) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- Get the specified number of versions for the specified table,
row, and column.
- getVer(byte[], byte[], byte[], int) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getVersion() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getVersion(FileSystem, Path) -
Static method in class org.apache.hadoop.hbase.util.FSUtils
- Verifies current version of file system
- getVersion() -
Static method in class org.apache.hadoop.hbase.util.VersionInfo
- Get the hbase version.
- getVerTs(byte[], byte[], byte[], long, int) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getVerTs(byte[], byte[], byte[], long, int) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- Get the specified number of versions for the specified table,
row, and column.
- getVerTs(byte[], byte[], byte[], long, int) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getWebAppDir(String) -
Static method in class org.apache.hadoop.hbase.util.InfoServer
-
- getWritable(byte[], Writable) -
Static method in class org.apache.hadoop.hbase.util.Writables
- Set bytes into the passed Writable by calling its Writable.readFields(java.io.DataInput).
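getWritable deserializes a byte[] into the Writable you pass in. A small hypothetical round trip is sketched below; Writables.getBytes(Writable) and the HStoreKey(byte[], byte[], long) constructor are assumed companions from the same codebase, not entries in this section.

    import java.io.IOException;
    import org.apache.hadoop.hbase.HStoreKey;
    import org.apache.hadoop.hbase.util.Writables;

    public class WritablesExample {
      public static void main(String[] args) throws IOException {
        // Serialize a Writable to bytes (getBytes is assumed), then rehydrate a
        // fresh instance via getWritable, which invokes readFields() on it.
        HStoreKey key = new HStoreKey("row-1".getBytes(), "info:col".getBytes(), 42L);
        byte[] bytes = Writables.getBytes(key);
        HStoreKey copy = (HStoreKey) Writables.getWritable(bytes, new HStoreKey());
      }
    }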
- getWriter(FileSystem, SequenceFile.CompressionType, boolean, int) -
Method in class org.apache.hadoop.hbase.regionserver.HStoreFile
- Get a store file writer.
- getWriter(FileSystem, SequenceFile.CompressionType, boolean, int) -
Method in class org.apache.hadoop.hbase.util.migration.v5.HStoreFile
- Get a store file writer.
- getXMLOutputter(PrintWriter) -
Method in class org.apache.hadoop.hbase.rest.GenericHandler
-
- GROUP_COLUMNS -
Static variable in class org.apache.hadoop.hbase.mapred.GroupingTableMap
- JobConf parameter to specify the columns used to produce the key passed to
collect from the map phase
- GroupingTableMap - Class in org.apache.hadoop.hbase.mapred
- Extract grouping columns from input record
- GroupingTableMap() -
Constructor for class org.apache.hadoop.hbase.mapred.GroupingTableMap
-
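GROUP_COLUMNS above is the JobConf key that GroupingTableMap reads to decide which column values are concatenated into the map output key. A hedged sketch of setting it follows; the column names and the space-separated format are assumptions.

    import org.apache.hadoop.hbase.mapred.GroupingTableMap;
    import org.apache.hadoop.mapred.JobConf;

    public class GroupingSetup {
      public static void main(String[] args) {
        JobConf job = new JobConf();
        // Hypothetical column list; the separator between columns is assumed.
        job.set(GroupingTableMap.GROUP_COLUMNS, "info:category info:region");
      }
    }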
- GZIP -
Static variable in class org.apache.hadoop.hbase.util.Base64
- Specify that data should be gzip-compressed.
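The GZIP option asks the encoder to gzip the payload before Base64-encoding it. A one-line sketch follows; the encodeBytes(byte[], int) overload is assumed from the usual shape of this Base64 utility and is not itself listed in this section.

    import org.apache.hadoop.hbase.util.Base64;

    public class Base64GzipExample {
      public static void main(String[] args) {
        byte[] raw = "some repetitive payload, some repetitive payload".getBytes();
        // encodeBytes(byte[], int) with the GZIP option is assumed to exist.
        String encoded = Base64.encodeBytes(raw, Base64.GZIP);
        System.out.println(encoded);
      }
    }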
Reverse the Keying.createKey(String) transform.
Dump out a text version on stdout or split the specified log files.
Calls Base64.Base64InputStream.read() repeatedly until the end of stream is reached or len bytes are read.
A Writable, String, primitive type, or an array of the preceding.
Constructor that takes a row key regular expression to filter on.
Scan the passed metaregion m, invoking the passed listener per row found.
Specify a value that must be matched for the given column.
Set column filters for a number of columns.
The RowFilterInterface to be used.
- sleeper -
Variable in class org.apache.hadoop.hbase.master.HMaster
-
- Sleeper - Class in org.apache.hadoop.hbase.util
- Sleeper for current thread.
- Sleeper(int, AtomicBoolean) -
Constructor for class org.apache.hadoop.hbase.util.Sleeper
-
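Sleeper wraps periodic sleeping so a worker thread can wake promptly when a shared stop flag flips. The loop below is a hypothetical usage; the no-argument sleep() method is an assumption, as is the interpretation of the constructor's int as a period in milliseconds.

    import java.util.concurrent.atomic.AtomicBoolean;
    import org.apache.hadoop.hbase.util.Sleeper;

    public class SleeperExample {
      public static void main(String[] args) {
        AtomicBoolean stopRequested = new AtomicBoolean(false);
        Sleeper sleeper = new Sleeper(1000, stopRequested); // 1s period (assumed ms)
        while (!stopRequested.get()) {
          // ... one round of periodic work ...
          sleeper.sleep(); // assumed to sleep out the period unless stop is requested
        }
      }
    }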
- SoftSortedMap<K,V> - Class in org.apache.hadoop.hbase.util
- A SortedMap implementation that uses SoftReferences internally to make it
play well with the GC when in a low-memory situation.
- SoftSortedMap() -
Constructor for class org.apache.hadoop.hbase.util.SoftSortedMap
- Constructor
- SoftSortedMap(Comparator<K>) -
Constructor for class org.apache.hadoop.hbase.util.SoftSortedMap
- Constructor
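Since SoftSortedMap is described as a SortedMap backed by SoftReferences, it can act as a cache the garbage collector is free to shrink under memory pressure. A hypothetical use as a small row cache:

    import org.apache.hadoop.hbase.util.SoftSortedMap;

    public class SoftMapExample {
      public static void main(String[] args) {
        // Standard SortedMap operations; entries may silently disappear when the
        // GC clears their soft references under memory pressure.
        SoftSortedMap<String, byte[]> cache = new SoftSortedMap<String, byte[]>();
        cache.put("row-1", "value-1".getBytes());
        byte[] hit = cache.get("row-1"); // may be null if the reference was cleared
      }
    }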
- splitLog(Path, Path, FileSystem, Configuration) -
Static method in class org.apache.hadoop.hbase.regionserver.HLog
- Split up a bunch of log files, which are no longer being written to, into new files, one per region.
- splitLog(Path, Path, FileSystem, Configuration) -
Static method in class org.apache.hadoop.hbase.util.migration.v5.HLog
- Split up a bunch of log files, which are no longer being written to, into new files, one per region.
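splitLog carves region server logs that are no longer being written into per-region files. A hypothetical invocation is sketched below; which Path is the HBase root directory and which is the source log directory is an assumption about the parameter order, and the directory names are made up.

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.regionserver.HLog;

    public class SplitLogExample {
      public static void main(String[] args) throws Exception {
        HBaseConfiguration conf = new HBaseConfiguration();
        FileSystem fs = FileSystem.get(conf);
        Path rootDir = new Path("/hbase");                        // hypothetical root dir
        Path oldLogDir = new Path(rootDir, "log_10.0.0.1_60020"); // hypothetical log dir
        // Assumed parameter order: (rootDir, srcDir, fs, conf).
        HLog.splitLog(rootDir, oldLogDir, fs, conf);
      }
    }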
- start() -
Method in class org.apache.hadoop.hbase.util.InfoServer
- Start the server.
- START_ROW -
Static variable in class org.apache.hadoop.hbase.rest.GenericHandler
-
- startKey -
Variable in class org.apache.hadoop.hbase.thrift.generated.TRegionInfo.Isset
-
- startKey -
Variable in class org.apache.hadoop.hbase.thrift.generated.TRegionInfo
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_args
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_args
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args
-
- startup() -
Method in class org.apache.hadoop.hbase.LocalHBaseCluster
- Start the cluster.
- stop -
Variable in class org.apache.hadoop.hbase.Chore
-
- stop() -
Method in class org.apache.hadoop.hbase.regionserver.HRegionServer
- Sets a flag that will cause all the HRegionServer threads to shut down
in an orderly fashion.
- stop() -
Method in class org.apache.hadoop.hbase.util.InfoServer
- Stop the server.
- stopProxy(VersionedProtocol) -
Static method in class org.apache.hadoop.hbase.ipc.HbaseRPC
- Stop this proxy and release its invoker's resource
- stopRequested -
Variable in class org.apache.hadoop.hbase.regionserver.HRegionServer
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args.Isset
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args.Isset
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args
-
- StopRowFilter - Class in org.apache.hadoop.hbase.filter
- Implementation of RowFilterInterface that filters out rows greater than or
equal to a specified rowKey.
- StopRowFilter() -
Constructor for class org.apache.hadoop.hbase.filter.StopRowFilter
- Default constructor, filters nothing.
- StopRowFilter(byte[]) -
Constructor for class org.apache.hadoop.hbase.filter.StopRowFilter
- Constructor that takes a stopRowKey on which to filter
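A StopRowFilter built from a row key can be handed to the HTable.getScanner(byte[][], byte[], long, RowFilterInterface) overload listed earlier to end a scan just before that key. A brief hypothetical pairing, with the table opened as in the earlier scanner sketch:

    import java.io.IOException;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Scanner;
    import org.apache.hadoop.hbase.filter.RowFilterInterface;
    import org.apache.hadoop.hbase.filter.StopRowFilter;

    public class StopRowExample {
      static Scanner openScanner(HTable table) throws IOException {
        // Filter out rows >= "row-100", so the scan ends just before that key.
        RowFilterInterface stopAt = new StopRowFilter("row-100".getBytes());
        return table.getScanner(
            new byte[][] { "info:".getBytes() },   // columns to return
            "row-000".getBytes(),                  // start row
            Long.MAX_VALUE,                        // timestamp: newest versions
            stopAt);
      }
    }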
- subMap(byte[], byte[]) -
Method in class org.apache.hadoop.hbase.io.HbaseMapWritable
-
- subMap(byte[], byte[]) -
Method in class org.apache.hadoop.hbase.io.RowResult
-
- subMap(K, K) -
Method in class org.apache.hadoop.hbase.util.SoftSortedMap
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.get_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.get_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getColumnDescriptors_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getColumnDescriptors_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRow_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRow_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRowTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRowTs_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableNames_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableNames_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableRegions_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableRegions_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVer_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVer_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVerTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVerTs_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.isTableEnabled_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.isTableEnabled_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerGet_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerGet_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_result
-
- suspendEncoding() -
Method in class org.apache.hadoop.hbase.util.Base64.Base64OutputStream
- Suspends encoding of the stream.
Calls Base64.Base64OutputStream.write(int) repeatedly until len bytes are written.
A Writable, String, primitive type, or an array of the preceding.