- getScanner(String[], String, String, long) -
Method in class org.apache.hadoop.hbase.client.HTable
- Get a scanner on the current table starting at the specified row and
ending just before stopRow.
- getScanner(byte[][], byte[], byte[], long) -
Method in class org.apache.hadoop.hbase.client.HTable
- Get a scanner on the current table starting at the specified row and
ending just before stopRow.
- getScanner(String[], String, long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.client.HTable
- Get a scanner on the current table starting at the specified row.
- getScanner(byte[][], byte[], long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.client.HTable
- Get a scanner on the current table starting at the specified row.
- getScanner(TransactionState, byte[][]) -
Method in class org.apache.hadoop.hbase.client.transactional.TransactionalTable
- Get a scanner on the current table starting at first row.
- getScanner(TransactionState, byte[][], byte[]) -
Method in class org.apache.hadoop.hbase.client.transactional.TransactionalTable
- Get a scanner on the current table starting at the specified row.
- getScanner(TransactionState, byte[][], byte[], long) -
Method in class org.apache.hadoop.hbase.client.transactional.TransactionalTable
- Get a scanner on the current table starting at the specified row.
- getScanner(TransactionState, byte[][], byte[], RowFilterInterface) -
Method in class org.apache.hadoop.hbase.client.transactional.TransactionalTable
- Get a scanner on the current table starting at the specified row.
- getScanner(TransactionState, byte[][], byte[], long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.client.transactional.TransactionalTable
- Get a scanner on the current table starting at the specified row.
- getScanner(byte[][], byte[], long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.regionserver.HRegion
- Return an iterator that scans over the HRegion, returning the indicated
columns for only the rows that match the data filter.
- getScanner(long, byte[][], byte[], RowFilterInterface) -
Method in class org.apache.hadoop.hbase.regionserver.HStore
- Return a scanner for both the memcache and the HStore files
- getScanner(long, byte[][], byte[], long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.regionserver.transactional.TransactionalRegion
- Return an iterator that scans over the HRegion, returning the indicated
columns for only the rows that match the data filter.
- getScanner(int) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
- Returns the scanner associated with the specified ID.
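Several of the getScanner variants above share one contract: the scan starts at the specified row (inclusive) and ends just before stopRow (exclusive). A minimal sketch of that start-inclusive, stop-exclusive semantics over a plain sorted map — not the HBase client API; the class and method names here are illustrative only:

```java
import java.util.SortedMap;
import java.util.TreeMap;

public class ScanSketch {
    // Model a table as row -> value; scan the range [startRow, stopRow).
    public static SortedMap<String, String> scan(
            SortedMap<String, String> table, String startRow, String stopRow) {
        // subMap is inclusive of startRow and exclusive of stopRow, matching
        // "starting at the specified row and ending just before stopRow".
        return table.subMap(startRow, stopRow);
    }

    public static void main(String[] args) {
        TreeMap<String, String> t = new TreeMap<>();
        t.put("row1", "a");
        t.put("row2", "b");
        t.put("row3", "c");
        System.out.println(scan(t, "row1", "row3").keySet()); // [row1, row2]
    }
}
```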
- getScannerCaching() -
Method in class org.apache.hadoop.hbase.client.HTable
- Get the number of rows for caching that will be passed to scanners
- getScannerCallable(byte[], int) -
Method in class org.apache.hadoop.hbase.client.HTable.ClientScanner
-
- getScannerCallable(byte[], int) -
Method in class org.apache.hadoop.hbase.client.transactional.TransactionalTable.TransactionalClientScanner
-
- getSecond() -
Method in class org.apache.hadoop.hbase.util.Pair
- Return the second element stored in the pair.
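Pair.getSecond simply returns the second of the two stored elements. A self-contained sketch of such a pair class (field and constructor shapes are assumptions, only the accessor behavior comes from the entry above):

```java
public class Pair<T1, T2> {
    private final T1 first;
    private final T2 second;

    public Pair(T1 first, T2 second) {
        this.first = first;
        this.second = second;
    }

    /** Return the first element stored in the pair. */
    public T1 getFirst() { return first; }

    /** Return the second element stored in the pair. */
    public T2 getSecond() { return second; }

    public static void main(String[] args) {
        Pair<String, Integer> p = new Pair<>("rows", 42);
        System.out.println(p.getSecond()); // 42
    }
}
```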
- getSequenceNumber() -
Method in class org.apache.hadoop.hbase.regionserver.HLog
-
- getServer(Object, String, int, Configuration) -
Static method in class org.apache.hadoop.hbase.ipc.HBaseRPC
- Construct a server for a protocol implementation instance listening on a
port and address.
- getServer(Object, String, int, int, boolean, Configuration) -
Static method in class org.apache.hadoop.hbase.ipc.HBaseRPC
- Construct a server for a protocol implementation instance listening on a
port and address.
- getServer() -
Method in class org.apache.hadoop.hbase.master.MetaRegion
-
- getServerAddress() -
Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getServerAddress() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getServerInfo() -
Method in class org.apache.hadoop.hbase.regionserver.HRegionServer
-
- getServerName() -
Method in class org.apache.hadoop.hbase.client.ServerCallable
-
- getServersToLoad() -
Method in class org.apache.hadoop.hbase.master.HMaster
-
- getServersToServerInfo() -
Method in class org.apache.hadoop.hbase.master.HMaster
-
- getServerVersion() -
Method in exception org.apache.hadoop.hbase.ipc.HBaseRPC.VersionMismatch
-
- getSimilarityName() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getSize() -
Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- getSize() -
Method in interface org.apache.hadoop.hbase.io.SequenceFile.ValueBytes
- Size of stored data.
- getSize() -
Method in class org.apache.hadoop.hbase.regionserver.HStore
-
- getSortedOnlineRegionInfos() -
Method in class org.apache.hadoop.hbase.regionserver.HRegionServer
-
- getSplits(JobConf, int) -
Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
- Calculates the splits that will serve as input for the map tasks.
- getStartCode() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getStartEndKeys() -
Method in class org.apache.hadoop.hbase.client.HTable
- Gets the starting and ending row keys for every region in the currently open table
- getStartKey() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.master.MetaRegion
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.regionserver.HRegion
-
- getStartKeys() -
Method in class org.apache.hadoop.hbase.client.HTable
- Gets the starting row key for every region in the currently open table
- getStartRow() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getStatusForTransaction(long) -
Method in class org.apache.hadoop.hbase.client.transactional.LocalTransactionLogger
-
- getStatusForTransaction(long) -
Method in interface org.apache.hadoop.hbase.client.transactional.TransactionLogger
-
- getStopRowKey() -
Method in class org.apache.hadoop.hbase.filter.StopRowFilter
- An accessor for the stopRowKey
- getStore(byte[]) -
Method in class org.apache.hadoop.hbase.regionserver.HRegion
- Return HStore instance.
- getStorefileIndexSizeInMB() -
Method in class org.apache.hadoop.hbase.HServerLoad
-
- getStorefileIndexSizeMB() -
Method in class org.apache.hadoop.hbase.HServerLoad.RegionLoad
-
- getStorefiles() -
Method in class org.apache.hadoop.hbase.HServerLoad
-
- getStorefiles() -
Method in class org.apache.hadoop.hbase.HServerLoad.RegionLoad
-
- getStores() -
Method in class org.apache.hadoop.hbase.HServerLoad.RegionLoad
-
- getTable(String) -
Method in class org.apache.hadoop.hbase.rest.GenericHandler
-
- getTable(byte[]) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
- Creates and returns an HTable instance from a given table name.
- getTableDesc() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getTableDesc() -
Method in class org.apache.hadoop.hbase.regionserver.HRegion
-
- getTableDescriptor(String) -
Method in class org.apache.hadoop.hbase.client.HBaseAdmin
-
- getTableDescriptor(byte[]) -
Method in class org.apache.hadoop.hbase.client.HBaseAdmin
-
- getTableDescriptor() -
Method in class org.apache.hadoop.hbase.client.HTable
-
- getTableDir(Path, byte[]) -
Static method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getTableName() -
Method in class org.apache.hadoop.hbase.client.HTable
-
- getTableName() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getTablename() -
Method in class org.apache.hadoop.hbase.regionserver.HLogKey
-
- getTableName(String[]) -
Method in class org.apache.hadoop.hbase.rest.GenericHandler
-
- getTableNames() -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getTableNames() -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- List all the userspace tables.
- getTableNames() -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getTableRegions(byte[]) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getTableRegions(byte[]) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- List the regions associated with a table.
- getTableRegions(byte[]) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getThreadLocalTransactionState() -
Method in class org.apache.hadoop.hbase.client.transactional.JtaXAResource
- Get the threadLocalTransaction state.
- getTimestamp() -
Method in class org.apache.hadoop.hbase.client.HTable.ClientScanner
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.client.ScannerCallable
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.HStoreKey.BeforeThisStoreKey
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.HStoreKey
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.io.BatchUpdate
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.io.Cell
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.RegionHistorian.RegionHistoryInformation
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.regionserver.HLogEdit
-
- getTimestampAsString() -
Method in class org.apache.hadoop.hbase.RegionHistorian.RegionHistoryInformation
-
- getTimeToLive() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getTransactionalLeases() -
Method in class org.apache.hadoop.hbase.regionserver.transactional.TransactionalRegionServer
-
- getTransactionalRegion(byte[]) -
Method in class org.apache.hadoop.hbase.regionserver.transactional.TransactionalRegionServer
-
- getTransactionId() -
Method in class org.apache.hadoop.hbase.client.transactional.TransactionState
- Get the transactionId.
- getTransactionId() -
Method in class org.apache.hadoop.hbase.regionserver.HLogEdit
- Get the transactionId, or null if this is not a transactional edit.
- getTransactionTimeout() -
Method in class org.apache.hadoop.hbase.client.transactional.JtaXAResource
-
- getType() -
Method in class org.apache.hadoop.hbase.HMsg
-
- getUrl() -
Static method in class org.apache.hadoop.hbase.util.VersionInfo
- Get the subversion URL for the root hbase directory.
- getUser() -
Static method in class org.apache.hadoop.hbase.util.VersionInfo
- The user that compiled hbase.
- getVal() -
Method in class org.apache.hadoop.hbase.regionserver.HLogEdit
-
- getValue(byte[]) -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getValue(String) -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getValue(byte[]) -
Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getValue(String) -
Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getValue() -
Method in class org.apache.hadoop.hbase.io.BatchOperation
-
- getValue() -
Method in class org.apache.hadoop.hbase.io.Cell
-
- getValue() -
Method in class org.apache.hadoop.hbase.io.RowResult.Entry
-
- getValue() -
Method in interface org.apache.hadoop.hbase.io.SequenceFile.Sorter.RawKeyValueIterator
- Gets the current raw value
- getValueClass() -
Method in class org.apache.hadoop.hbase.io.MapFile.Reader
- Returns the class of values in this file.
- getValueClass() -
Method in class org.apache.hadoop.hbase.io.SequenceFile.Reader
- Returns the class of values in this file.
- getValueClass() -
Method in class org.apache.hadoop.hbase.io.SequenceFile.Writer
- Returns the class of values in this file.
- getValueClassName() -
Method in class org.apache.hadoop.hbase.io.SequenceFile.Reader
- Returns the name of the value class.
- getValues() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getValues() -
Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getVer(byte[], byte[], byte[], int) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getVer(byte[], byte[], byte[], int) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- Get the specified number of versions for the specified table,
row, and column.
- getVer(byte[], byte[], byte[], int) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getVersion() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getVersion(FileSystem, Path) -
Static method in class org.apache.hadoop.hbase.util.FSUtils
- Verifies current version of file system
- getVersion() -
Static method in class org.apache.hadoop.hbase.util.VersionInfo
- Get the hbase version.
- getVerTs(byte[], byte[], byte[], long, int) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getVerTs(byte[], byte[], byte[], long, int) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- Get the specified number of versions for the specified table,
row, and column.
- getVerTs(byte[], byte[], byte[], long, int) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getWebAppDir(String) -
Static method in class org.apache.hadoop.hbase.util.InfoServer
- Get the path for this web app
- getWebAppsPath() -
Method in class org.apache.hadoop.hbase.util.InfoServer
- Get the pathname to the webapps files.
- getWritable(byte[], Writable) -
Static method in class org.apache.hadoop.hbase.util.Writables
- Set bytes into the passed Writable by calling its
Writable.readFields(java.io.DataInput).
- getWriteBuffer() -
Method in class org.apache.hadoop.hbase.client.HTable
- Get the write buffer
- getWriteBufferSize() -
Method in class org.apache.hadoop.hbase.client.HTable
- Get the maximum size in bytes of the write buffer for this HTable
- getWriter(FileSystem, SequenceFile.CompressionType, boolean, int) -
Method in class org.apache.hadoop.hbase.regionserver.HStoreFile
- Get a store file writer.
- getXAResource() -
Method in class org.apache.hadoop.hbase.client.transactional.TransactionManager
-
- getXMLOutputter(PrintWriter) -
Method in class org.apache.hadoop.hbase.rest.GenericHandler
-
- GROUP_COLUMNS -
Static variable in class org.apache.hadoop.hbase.mapred.GroupingTableMap
- JobConf parameter to specify the columns used to produce the key passed to
collect from the map phase
- GroupingTableMap - Class in org.apache.hadoop.hbase.mapred
- Extract grouping columns from input record
- GroupingTableMap() -
Constructor for class org.apache.hadoop.hbase.mapred.GroupingTableMap
-
- GZIP -
Static variable in class org.apache.hadoop.hbase.util.Base64
- Specify that data should be gzip-compressed.
- Sleeper - Class in org.apache.hadoop.hbase.util
- Sleeper for current thread.
- Sleeper(int, AtomicBoolean) -
Constructor for class org.apache.hadoop.hbase.util.Sleeper
-
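The Sleeper entry above describes a helper that puts the current thread to sleep for whatever remains of a configured period (the real constructor also takes an AtomicBoolean stop flag). A sketch of just the period arithmetic, with an assumed `remaining` helper name:

```java
public class SleeperSketch {
    private final int period; // milliseconds between intended wake-ups

    public SleeperSketch(int period) { this.period = period; }

    // How long to sleep so that work plus sleep spans one full period.
    // Never negative: if the work overran the period, don't sleep at all.
    public long remaining(long elapsedMillis) {
        return Math.max(0, period - elapsedMillis);
    }

    public static void main(String[] args) {
        SleeperSketch s = new SleeperSketch(1000);
        System.out.println(s.remaining(300));  // 700
        System.out.println(s.remaining(1500)); // 0
    }
}
```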
- SoftValueMap<K,V> - Class in org.apache.hadoop.hbase.util
- A Map that uses Soft Reference values internally.
- SoftValueMap() -
Constructor for class org.apache.hadoop.hbase.util.SoftValueMap
-
- SoftValueMap(ReferenceQueue<?>) -
Constructor for class org.apache.hadoop.hbase.util.SoftValueMap
-
- SoftValueSortedMap<K,V> - Class in org.apache.hadoop.hbase.util
- A SortedMap implementation that uses Soft Reference values
internally to make it play well with the GC when in a low-memory
situation.
- SoftValueSortedMap() -
Constructor for class org.apache.hadoop.hbase.util.SoftValueSortedMap
- Constructor
- SoftValueSortedMap(Comparator<K>) -
Constructor for class org.apache.hadoop.hbase.util.SoftValueSortedMap
- Constructor
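SoftValueMap and SoftValueSortedMap hold their values through SoftReferences so the garbage collector can reclaim them under memory pressure. A minimal sketch of the idea (simplified: the real classes also drain a ReferenceQueue to prune entries whose values were collected):

```java
import java.lang.ref.SoftReference;
import java.util.TreeMap;

public class SoftValueMapSketch<K, V> {
    // Values are reachable only through SoftReferences, so the GC may
    // reclaim them in a low-memory situation; callers must tolerate
    // get() returning null for a key that was previously put.
    private final TreeMap<K, SoftReference<V>> map = new TreeMap<>();

    public void put(K key, V value) {
        map.put(key, new SoftReference<>(value));
    }

    public V get(K key) {
        SoftReference<V> ref = map.get(key);
        return ref == null ? null : ref.get(); // null if absent or collected
    }

    public static void main(String[] args) {
        SoftValueMapSketch<String, String> m = new SoftValueMapSketch<>();
        m.put("region1", "server-a");
        System.out.println(m.get("region1")); // server-a (not yet collected)
    }
}
```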
- sort(Path[], Path, boolean) -
Method in class org.apache.hadoop.hbase.io.SequenceFile.Sorter
- Perform a file sort from a set of input files into an output file.
- sort(Path, Path) -
Method in class org.apache.hadoop.hbase.io.SequenceFile.Sorter
- The backwards compatible interface to sort.
- sortAndIterate(Path[], Path, boolean) -
Method in class org.apache.hadoop.hbase.io.SequenceFile.Sorter
- Perform a file sort from a set of input files and return an iterator.
- split(String) -
Method in class org.apache.hadoop.hbase.client.HBaseAdmin
- Split a table or an individual region
- split(byte[]) -
Method in class org.apache.hadoop.hbase.client.HBaseAdmin
- Split a table or an individual region
- SPLIT_PREFIX -
Static variable in class org.apache.hadoop.hbase.RegionHistorian
-
- splitLog(Path, Path, FileSystem, Configuration) -
Static method in class org.apache.hadoop.hbase.regionserver.HLog
- Split up a bunch of regionserver commit log files that are no longer
being written to, into new files, one per region for region to replay on
startup.
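The splitLog entry above describes turning a batch of regionserver commit logs into one replay file per region. The core grouping step can be sketched with plain collections (entry representation and names here are assumptions, not the HLog format):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class LogSplitSketch {
    // Group a flat sequence of (region, edit) entries into one list per
    // region, mirroring "one per region for region to replay on startup".
    public static Map<String, List<String>> split(List<String[]> entries) {
        Map<String, List<String>> perRegion = new HashMap<>();
        for (String[] e : entries) { // e[0] = region name, e[1] = edit
            perRegion.computeIfAbsent(e[0], k -> new ArrayList<>()).add(e[1]);
        }
        return perRegion;
    }

    public static void main(String[] args) {
        List<String[]> log = List.of(
            new String[]{"regionA", "put row1"},
            new String[]{"regionB", "put row9"},
            new String[]{"regionA", "delete row2"});
        System.out.println(split(log).get("regionA")); // [put row1, delete row2]
    }
}
```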
- start(Xid, int) -
Method in class org.apache.hadoop.hbase.client.transactional.JtaXAResource
-
- start() -
Method in class org.apache.hadoop.hbase.ipc.HBaseServer
- Starts the service.
- START_ROW -
Static variable in class org.apache.hadoop.hbase.rest.GenericHandler
-
- startKey -
Variable in class org.apache.hadoop.hbase.thrift.generated.TRegionInfo.Isset
-
- startKey -
Variable in class org.apache.hadoop.hbase.thrift.generated.TRegionInfo
-
- startMonitoring() -
Method in class org.apache.hadoop.hbase.metrics.file.TimeStampingFileContext
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_args
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_args
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args
-
- startup() -
Method in class org.apache.hadoop.hbase.LocalHBaseCluster
- Start the cluster.
- stop -
Variable in class org.apache.hadoop.hbase.Chore
-
- stop() -
Method in class org.apache.hadoop.hbase.ipc.HBaseClient
- Stop all threads related to this client.
- stop() -
Method in class org.apache.hadoop.hbase.ipc.HBaseServer
- Stops the service.
- stop() -
Method in class org.apache.hadoop.hbase.regionserver.HRegionServer
- Sets a flag that will cause all the HRegionServer threads to shut down
in an orderly fashion.
- stopMonitoring() -
Method in class org.apache.hadoop.hbase.metrics.file.TimeStampingFileContext
-
- stopProxy(VersionedProtocol) -
Static method in class org.apache.hadoop.hbase.ipc.HBaseRPC
- Stop this proxy and release its invoker's resource
- stopRequested -
Variable in class org.apache.hadoop.hbase.regionserver.HRegionServer
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args.Isset
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args.Isset
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args
-
- StopRowFilter - Class in org.apache.hadoop.hbase.filter
- Implementation of RowFilterInterface that filters out rows greater than or
equal to a specified rowKey.
- StopRowFilter() -
Constructor for class org.apache.hadoop.hbase.filter.StopRowFilter
- Default constructor, filters nothing.
- StopRowFilter(byte[]) -
Constructor for class org.apache.hadoop.hbase.filter.StopRowFilter
- Constructor that takes a stopRowKey on which to filter
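StopRowFilter filters out rows greater than or equal to the configured stopRowKey, and its default constructor filters nothing. A sketch of that comparison logic (the filterRowKey name follows the RowFilterInterface family but is used here illustratively):

```java
import java.util.Arrays;

public class StopRowFilterSketch {
    private final byte[] stopRowKey;

    public StopRowFilterSketch() { this(null); }          // filters nothing

    public StopRowFilterSketch(byte[] stopRowKey) {
        this.stopRowKey = stopRowKey;
    }

    // True means "filter this row out": rows >= stopRowKey are excluded,
    // matching "filters out rows greater than or equal to a specified rowKey".
    public boolean filterRowKey(byte[] rowKey) {
        if (stopRowKey == null) return false;
        return Arrays.compare(rowKey, stopRowKey) >= 0;
    }

    public static void main(String[] args) {
        StopRowFilterSketch f = new StopRowFilterSketch("row5".getBytes());
        System.out.println(f.filterRowKey("row3".getBytes())); // false: kept
        System.out.println(f.filterRowKey("row5".getBytes())); // true: filtered
    }
}
```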
- storefileIndexSizeMB -
Variable in class org.apache.hadoop.hbase.regionserver.metrics.RegionServerMetrics
- Sum of all the storefile index sizes in this regionserver in MB
- storefiles -
Variable in class org.apache.hadoop.hbase.regionserver.metrics.RegionServerMetrics
- Count of storefiles open on the regionserver.
- stores -
Variable in class org.apache.hadoop.hbase.regionserver.HRegion
-
- stores -
Variable in class org.apache.hadoop.hbase.regionserver.metrics.RegionServerMetrics
- Count of stores open on the regionserver.
- Strings - Class in org.apache.hadoop.hbase.util
- Utility for Strings.
- Strings() -
Constructor for class org.apache.hadoop.hbase.util.Strings
-
- subMap(byte[], byte[]) -
Method in class org.apache.hadoop.hbase.io.HbaseMapWritable
-
- subMap(byte[], byte[]) -
Method in class org.apache.hadoop.hbase.io.RowResult
-
- subMap(K, K) -
Method in class org.apache.hadoop.hbase.util.SoftValueSortedMap
-
- SubstringComparator - Class in org.apache.hadoop.hbase.filter
- This comparator is for use with ColumnValueFilter, for filtering based on
the value of a given column.
- SubstringComparator() -
Constructor for class org.apache.hadoop.hbase.filter.SubstringComparator
- Nullary constructor for Writable
- SubstringComparator(String) -
Constructor for class org.apache.hadoop.hbase.filter.SubstringComparator
- Constructor
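SubstringComparator, per the entries above, pairs with ColumnValueFilter to match cell values containing a given substring. A sketch of the comparison, assuming the common comparator convention that 0 signals a match (the case-insensitivity here is an assumption):

```java
public class SubstringComparatorSketch {
    private final String substr;

    public SubstringComparatorSketch(String substr) {
        this.substr = substr.toLowerCase();
    }

    // 0 signals a match (the cell value contains the substring),
    // non-zero signals a miss.
    public int compareTo(byte[] value) {
        return new String(value).toLowerCase().contains(substr) ? 0 : 1;
    }

    public static void main(String[] args) {
        SubstringComparatorSketch c = new SubstringComparatorSketch("abc");
        System.out.println(c.compareTo("xxxABCxxx".getBytes())); // 0 (match)
        System.out.println(c.compareTo("xyz".getBytes()));       // 1 (no match)
    }
}
```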
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.atomicIncrement_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.atomicIncrement_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.get_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.get_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getColumnDescriptors_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getColumnDescriptors_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRow_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRow_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRowTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRowTs_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRowWithColumns_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRowWithColumns_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRowWithColumnsTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRowWithColumnsTs_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableNames_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableNames_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableRegions_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableRegions_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVer_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVer_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVerTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVerTs_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.isTableEnabled_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.isTableEnabled_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerGet_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerGet_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_result
-
- suspendEncoding() -
Method in class org.apache.hadoop.hbase.util.Base64.Base64OutputStream
- Suspends encoding of the stream.
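Base64.Base64OutputStream wraps another OutputStream and encodes on the fly; suspendEncoding lets raw bytes pass through temporarily. The standard library offers a comparable on-the-fly encoding wrapper via java.util.Base64 (which has no suspend feature), shown here for comparison:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.Base64;

public class Base64StreamDemo {
    // Encode bytes through a wrapping stream, analogous to
    // Base64.Base64OutputStream in ENCODE mode: bytes are encoded
    // as they are written to the wrapper.
    public static String encodeViaStream(byte[] data) {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        try (OutputStream b64 = Base64.getEncoder().wrap(sink)) {
            b64.write(data);
        } catch (IOException e) { // ByteArrayOutputStream won't actually throw
            throw new RuntimeException(e);
        } // closing flushes any buffered partial quantum and adds padding
        return sink.toString();
    }

    public static void main(String[] args) {
        System.out.println(encodeViaStream("hello".getBytes())); // aGVsbG8=
    }
}
```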
- sync(long) -
Method in class org.apache.hadoop.hbase.io.SequenceFile.Reader
- Seek to the next sync mark past a given position.
- sync() -
Method in class org.apache.hadoop.hbase.io.SequenceFile.Writer
- Create a sync point.
- sync() -
Method in class org.apache.hadoop.hbase.regionserver.HLog
-
- SYNC_INTERVAL -
Static variable in class org.apache.hadoop.hbase.io.SequenceFile
- The number of bytes between sync points.
- syncSeen() -
Method in class org.apache.hadoop.hbase.io.SequenceFile.Reader
- Returns true iff the previous call to next passed a sync mark.