- HTable.abort(long)
- AccessControlException with the specified detail message.
- Path to the list of inputs for the map-reduce job.
- Base64.Base64InputStream will read data from another InputStream, given in the constructor, and encode/decode to/from Base64 notation on the fly.
- Base64.Base64InputStream in DECODE mode.
- Base64.Base64InputStream in either ENCODE or DECODE mode.
- Base64.Base64OutputStream will write data to another OutputStream, given in the constructor, and encode/decode to/from Base64 notation on the fly.
- Base64.Base64OutputStream in ENCODE mode.
- Base64.Base64OutputStream in either ENCODE or DECODE mode.
- Writer doing its own printlns, etc.
- FileSystemStore.
- BufferedFSInputStream with the specified buffer size, and saves its argument, the input stream in, for later use.
- param, to the IPC server running at address, returning the value.
- position.
- Writable class.
- RecordWriter to future operations.
- JobClient.
- InputSplit to future operations.
- RecordWriter to future operations.
- regionName is closed and no longer available.
- IOException
- IOException
- regionName is closing.
- MultiFilterRecordReader.emit(org.apache.hadoop.mapred.join.TupleWritable) every Tuple from the collector (the outer join of child RRs).
- HTable.commit(long)
- HTable.commit(long, long)
- CompressionOutputStream to compress data.
- Configuration.
- JobConf.
- JobConf.
- Configuration.
- Group of counters, comprising counters from a particular counter Enum class.
- Compressor for use by this CompressionCodec.
- Decompressor for use by this CompressionCodec.
- FsPermission object.
- PermissionStatus object.
- UnixUserGroupInformation object.
- CompressionInputStream that will read from the given InputStream with the given Decompressor.
- CompressionOutputStream that will write to the given OutputStream.
- CompressionOutputStream that will write to the given OutputStream with the given Compressor.
- recordName.
- recordName.
- Thread.setDaemon(boolean) with true.
- DataInput implementation that reads from an in-memory buffer.
- DataOutput implementation that writes to an in-memory buffer.
- CompressionInputStream to decompress data.
- WritableComparable implementation.
- Record implementation.
- Configuration.iterator() instead.
- o is a FloatWritable with the same value.
- o is an IntWritable with the same value.
- o is a LongWritable with the same value.
- o is an MD5Hash whose digest contains the same values.
- o is a Text with the same contents.
- o is a UTF8 with the same contents.
- o is a VIntWritable with the same value.
- o is a VLongWritable with the same value.
- InputFormat.
- INodes and Blocks.
- FilterFileSystem contains some other file system, which it uses as its basic file system, possibly transforming the data along the way or providing additional functionality.
- what in the backing buffer, starting at position start.
- FSInputStream in a DataInputStream and buffers input through a BufferedInputStream.
- OutputStream in a DataOutputStream, buffers output through a BufferedOutputStream and creates a checksum file.
- FsAction.
- GenericOptionsParser is a utility to parse command line arguments generic to the Hadoop framework.
- GenericOptionsParser to parse only the generic Hadoop arguments.
- GenericOptionsParser(Configuration, Options, String[]) -
Constructor for class org.apache.hadoop.util.GenericOptionsParser
- Create a
GenericOptionsParser
to parse given options as well
as generic Hadoop options.
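The constructors above differ only in whether application-specific Options are parsed alongside the generic Hadoop ones. A minimal sketch of the simpler form, assuming the Hadoop jars of this release are on the classpath (the printed arguments are whatever the user passed):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.util.GenericOptionsParser;

public class ParseArgs {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Consumes generic options such as -fs, -jt and -conf, applying
        // them to conf; whatever is left over belongs to the application.
        GenericOptionsParser parser = new GenericOptionsParser(conf, args);
        for (String arg : parser.getCommandLine().getArgs()) {
            System.out.println("application arg: " + arg);
        }
    }
}
```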
- GenericWritable - Class in org.apache.hadoop.io
- A wrapper for Writable instances.
- GenericWritable() -
Constructor for class org.apache.hadoop.io.GenericWritable
-
- get(String, Object) -
Method in class org.apache.hadoop.conf.Configuration
- Deprecated. A side map of Configuration to Object should be used instead.
- get(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the name property, or null if no such property exists.
- get(String, String) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the name property.
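The two Configuration.get overloads above differ only in how a missing property is reported: the one-argument form returns null, while the two-argument form returns the supplied default. A small sketch, assuming the Hadoop jar on the classpath; the property names are hypothetical:

```java
import org.apache.hadoop.conf.Configuration;

public class ConfDemo {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        conf.set("my.example.key", "hello");      // hypothetical property

        System.out.println(conf.get("my.example.key"));          // set value
        System.out.println(conf.get("missing.key"));             // null
        System.out.println(conf.get("missing.key", "fallback")); // the default
    }
}
```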
- get(int) -
Method in class org.apache.hadoop.dfs.LocatedBlocks
- Get located block.
- get(Configuration) -
Static method in class org.apache.hadoop.fs.FileSystem
- Returns the configured filesystem implementation.
- get(URI, Configuration) -
Static method in class org.apache.hadoop.fs.FileSystem
- Returns the FileSystem for this URI's scheme and authority.
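The two FileSystem.get factory methods above select a filesystem either from the configured default or from an explicit URI scheme and authority. A sketch under the assumption that a Hadoop installation is on the classpath (the path checked is hypothetical):

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FsDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // get(Configuration) returns the implementation configured as the
        // default filesystem; get(URI, Configuration) selects by the URI's
        // scheme and authority, here the local filesystem.
        FileSystem defaultFs = FileSystem.get(conf);
        FileSystem local = FileSystem.get(URI.create("file:///"), conf);
        System.out.println(local.exists(new Path("/tmp"))); // hypothetical path
    }
}
```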
- get() -
Method in class org.apache.hadoop.hbase.hql.TableFormatterFactory
-
- get(Text, Text) -
Method in class org.apache.hadoop.hbase.HRegion
- Fetch a single data item.
- get(Text, Text, int) -
Method in class org.apache.hadoop.hbase.HRegion
- Fetch multiple versions of a single data item
- get(Text, Text, long, int) -
Method in class org.apache.hadoop.hbase.HRegion
- Fetch multiple versions of a single data item, with timestamp.
- get(Text, Text, Text) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Retrieve a single value from the specified region for the specified row
and column keys
- get(Text, Text, Text, int) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Get the specified number of versions of the specified row and column
- get(Text, Text, Text, long, int) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Get the specified number of versions of the specified row and column with
the specified timestamp.
- get(Text, Text, Text) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Retrieve a single value from the specified region for the specified row
and column keys
- get(Text, Text, Text, int) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Get the specified number of versions of the specified row and column
- get(Text, Text, Text, long, int) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Get the specified number of versions of the specified row and column with
the specified timestamp.
- get(Text, Text) -
Method in class org.apache.hadoop.hbase.HTable
- Get a single value for the specified row and column
- get(Text, Text, int) -
Method in class org.apache.hadoop.hbase.HTable
- Get the specified number of versions of the specified row and column
- get(Text, Text, long, int) -
Method in class org.apache.hadoop.hbase.HTable
- Get the specified number of versions of the specified row and column with
the specified timestamp.
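The HTable.get overloads above can be sketched as follows, using the Text-based client API this index documents. The table, row, and column names are hypothetical, and a running HBase instance is assumed, so no output is shown:

```java
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HTable;
import org.apache.hadoop.io.Text;

public class HTableGetDemo {
    public static void main(String[] args) throws Exception {
        HBaseConfiguration conf = new HBaseConfiguration();
        HTable table = new HTable(conf, new Text("mytable")); // hypothetical

        // Latest cell value for the row/column pair:
        byte[] latest = table.get(new Text("row1"), new Text("info:name"));
        // Up to the three most recent versions of the same cell:
        byte[][] versions = table.get(new Text("row1"), new Text("info:name"), 3);

        System.out.println(latest == null ? "no value" : new String(latest));
        System.out.println(versions == null ? 0 : versions.length);
    }
}
```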
- get(Object) -
Method in class org.apache.hadoop.hbase.io.HbaseMapWritable
-
- get() -
Method in class org.apache.hadoop.hbase.io.HbaseObjectWritable
- Return the instance, or null if none.
- get() -
Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
- Get the data from the BytesWritable.
- get(byte[], byte[], byte[]) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- get(byte[], byte[], byte[]) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- Get a single value for the specified table, row, and column at the
latest timestamp.
- get(byte[], byte[], byte[]) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- get(long, Writable) -
Method in class org.apache.hadoop.io.ArrayFile.Reader
- Return the nth value in the file.
- get() -
Method in class org.apache.hadoop.io.ArrayWritable
-
- get() -
Method in class org.apache.hadoop.io.BooleanWritable
- Returns the value of the BooleanWritable
- get() -
Method in class org.apache.hadoop.io.BytesWritable
- Get the data from the BytesWritable.
- get() -
Method in class org.apache.hadoop.io.FloatWritable
- Return the value of this FloatWritable.
- get() -
Method in class org.apache.hadoop.io.GenericWritable
- Return the wrapped instance.
- get() -
Method in class org.apache.hadoop.io.IntWritable
- Return the value of this IntWritable.
- get() -
Method in class org.apache.hadoop.io.LongWritable
- Return the value of this LongWritable.
- get(WritableComparable, Writable) -
Method in class org.apache.hadoop.io.MapFile.Reader
- Return the value for the named key, or null if none exists.
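MapFile.Reader.get fills the caller-supplied value and returns it when the key is present, or returns null. A sketch, assuming a MapFile directory written earlier with IntWritable keys and Text values (the directory name is hypothetical):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.MapFile;
import org.apache.hadoop.io.Text;

public class MapFileLookup {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // "part-00000" is a hypothetical MapFile directory.
        MapFile.Reader reader = new MapFile.Reader(fs, "part-00000", conf);
        Text value = new Text();
        // get returns null when the key is absent; otherwise value is filled.
        if (reader.get(new IntWritable(42), value) != null) {
            System.out.println("found: " + value);
        }
        reader.close();
    }
}
```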
- get(Object) -
Method in class org.apache.hadoop.io.MapWritable
-
- get() -
Static method in class org.apache.hadoop.io.NullWritable
- Returns the single instance of this class.
- get() -
Method in class org.apache.hadoop.io.ObjectWritable
- Return the instance, or null if none.
- get(Text) -
Method in class org.apache.hadoop.io.SequenceFile.Metadata
-
- get(WritableComparable) -
Method in class org.apache.hadoop.io.SetFile.Reader
- Read the matching key from a set into key.
- get(Object) -
Method in class org.apache.hadoop.io.SortedMapWritable
-
- get() -
Method in class org.apache.hadoop.io.TwoDArrayWritable
-
- get() -
Method in class org.apache.hadoop.io.VIntWritable
- Return the value of this VIntWritable.
- get() -
Method in class org.apache.hadoop.io.VLongWritable
- Return the value of this VLongWritable.
- get(Class) -
Static method in class org.apache.hadoop.io.WritableComparator
- Get a comparator for a WritableComparable implementation.
- get() -
Static method in class org.apache.hadoop.ipc.Server
- Returns the server instance called under or null.
- get(int) -
Method in class org.apache.hadoop.mapred.join.CompositeInputSplit
- Get ith child InputSplit.
- get(int) -
Method in class org.apache.hadoop.mapred.join.TupleWritable
- Get ith Writable from Tuple.
- get() -
Method in class org.apache.hadoop.metrics.util.MetricsIntValue
- Get value
- get(DataInput) -
Static method in class org.apache.hadoop.record.BinaryRecordInput
- Get a thread-local record input for the supplied DataInput.
- get(DataOutput) -
Static method in class org.apache.hadoop.record.BinaryRecordOutput
- Get a thread-local record output for the supplied DataOutput.
- get() -
Method in class org.apache.hadoop.record.Buffer
- Get the data from the Buffer.
- get() -
Method in class org.apache.hadoop.util.Progress
- Returns the overall progress of the root.
- getAbsolutePath(String) -
Method in class org.apache.hadoop.streaming.PathFinder
- Returns the full path name of this file if it is listed in the path.
- getAddress(Configuration) -
Static method in class org.apache.hadoop.mapred.JobTracker
-
- getAllJobs() -
Method in class org.apache.hadoop.mapred.JobClient
- Get the jobs that are submitted.
- getAllJobs() -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Get all the jobs submitted.
- getAllJobs() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getAllTasks() -
Method in class org.apache.hadoop.mapred.JobHistory.JobInfo
- Returns all map and reduce tasks.
- getAlphabet(int) -
Static method in class org.apache.hadoop.hbase.util.Base64
- Returns one of the _SOMETHING_ALPHABET byte arrays depending on the options
specified.
- getAnalyzerName() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getApproxChkSumLength(long) -
Static method in class org.apache.hadoop.fs.ChecksumFileSystem
-
- getArchiveClassPaths(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get the archive entries in classpath as an array of Path
- getArchiveTimestamps(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get the timestamps of the archives
- getAssignedTracker(String) -
Method in class org.apache.hadoop.mapred.JobTracker
- Get tracker name for a given task id.
- getAttribute(String) -
Method in class org.apache.hadoop.hbase.util.InfoServer
- Get the value in the webapp context.
- getAttribute(String) -
Method in class org.apache.hadoop.mapred.StatusHttpServer
- Get the value in the webapp context.
- getAttribute(String) -
Method in class org.apache.hadoop.metrics.ContextFactory
- Returns the value of the named attribute, or null if there is no
attribute of that name.
- getAttribute(String) -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Convenience method for subclasses to access factory attributes.
- getAttributeNames() -
Method in class org.apache.hadoop.metrics.ContextFactory
- Returns the names of all the factory's attributes.
- getAttributeTable(String) -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Returns an attribute-value map derived from the factory attributes
by finding all factory attributes that begin with
contextName.tableName.
- getAvailable() -
Method in class org.apache.hadoop.fs.DF
-
- getBasePathInJarOut(String) -
Method in class org.apache.hadoop.streaming.JarBuilder
-
- getBasicCommand() -
Method in class org.apache.hadoop.hbase.hql.BasicCommand
-
- getBeginColumn() -
Method in class org.apache.hadoop.hbase.hql.generated.SimpleCharStream
-
- getBeginColumn() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getBeginLine() -
Method in class org.apache.hadoop.hbase.hql.generated.SimpleCharStream
-
- getBeginLine() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getBindAddress() -
Method in class org.apache.hadoop.hbase.HServerAddress
-
- getBlockInputStream(Block) -
Method in interface org.apache.hadoop.dfs.FSDatasetInterface
- Returns an input stream to read the contents of the specified block
- getBlockInputStream(Block, long) -
Method in interface org.apache.hadoop.dfs.FSDatasetInterface
- Returns an input stream at specified offset of the specified block
- getBlockLocations(String, long, long) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getBlockReport() -
Method in interface org.apache.hadoop.dfs.FSDatasetInterface
- Returns the block report - the full list of blocks stored
- getBlockReportAverageTime() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getBlockReportAverageTime() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- Average time for Block Report Processing in last interval
- getBlockReportMaxTime() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getBlockReportMaxTime() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- The Maximum Block Report Processing Time since reset was called
- getBlockReportMinTime() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getBlockReportMinTime() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- The Minimum Block Report Processing Time since reset was called
- getBlockReportNum() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getBlockReportNum() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- Number of block Reports processed in the last interval
- getBlockReportsAverageTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getBlockReportsAverageTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Average time for Block Reports Operation in last interval
- getBlockReportsMaxTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getBlockReportsMaxTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Maximum Block Reports Operation Time since reset was called
- getBlockReportsMinTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getBlockReportsMinTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Minimum Block Reports Operation Time since reset was called
- getBlockReportsNum() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getBlockReportsNum() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of Block Reports sent in last interval
- getBlocks(DatanodeInfo, long) -
Method in class org.apache.hadoop.dfs.NameNode
- Return a list of blocks and their locations on datanode whose total size is size.
- getBlocks() -
Method in class org.apache.hadoop.fs.s3.INode
-
- getBlockSize() -
Method in class org.apache.hadoop.fs.FileStatus
- Get the block size of the file.
- getBlockSize(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Deprecated. Use getFileStatus() instead
- getBlocksRead() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getBlocksRead() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of blocks read in the last interval
- getBlocksRemoved() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getBlocksRemoved() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of blocks removed in the last interval
- getBlocksReplicated() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getBlocksReplicated() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of blocks replicated in the last interval
- getBlocksTotal() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.FSNamesystemMBean
- Number of allocated blocks in the system
- getBlocksVerified() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getBlocksVerified() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of blocks verified in the last interval
- getBlocksWritten() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getBlocksWritten() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of blocks written in the last interval
- getBlockVerificationFailures() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getBlockVerificationFailures() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of block verification failures in the last interval
- getBloomFilter() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getBoolean(String, boolean) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the name property as a boolean.
- getBoost(String) -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getBoundAntProperty(String, String) -
Static method in class org.apache.hadoop.streaming.StreamUtil
-
- getBytes(HStoreKey) -
Static method in class org.apache.hadoop.hbase.HStoreKey
- Returns row and column bytes out of an HStoreKey.
- getBytes() -
Method in class org.apache.hadoop.hbase.io.TextSequence
-
- getBytes(Writable) -
Static method in class org.apache.hadoop.hbase.util.Writables
-
- getBytes() -
Method in class org.apache.hadoop.io.Text
- Returns the raw bytes.
- getBytes() -
Method in class org.apache.hadoop.io.UTF8
- Deprecated. The raw bytes.
- getBytes(String) -
Static method in class org.apache.hadoop.io.UTF8
- Deprecated. Convert a string to a UTF-8 encoded byte array.
- getBytes() -
Method in class org.onelab.filter.Key
-
- getBytesPerChecksum() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getBytesPerSum() -
Method in class org.apache.hadoop.fs.ChecksumFileSystem
- Return the bytes per checksum.
- getBytesRead() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getBytesRead() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of bytes read in the last interval
- getBytesRead() -
Method in interface org.apache.hadoop.io.compress.Compressor
- Return number of uncompressed bytes input so far.
- getBytesRead() -
Method in class org.apache.hadoop.io.compress.lzo.LzoCompressor
- Return number of bytes given to this compressor since last reset.
- getBytesRead() -
Method in class org.apache.hadoop.io.compress.zlib.ZlibCompressor
- Returns the total number of uncompressed bytes input so far.
- getBytesRead() -
Method in class org.apache.hadoop.io.compress.zlib.ZlibDecompressor
- Returns the total number of uncompressed bytes input so far.
- getBytesWritten() -
Method in interface org.apache.hadoop.io.compress.Compressor
- Return number of compressed bytes output so far.
- getBytesWritten() -
Method in class org.apache.hadoop.io.compress.lzo.LzoCompressor
- Return number of bytes consumed by callers of compress since last reset.
- getBytesWritten() -
Method in class org.apache.hadoop.io.compress.zlib.ZlibCompressor
- Returns the total number of compressed bytes output so far.
- getBytesWritten() -
Method in class org.apache.hadoop.io.compress.zlib.ZlibDecompressor
- Returns the total number of compressed bytes output so far.
- getCacheArchives(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get cache archives set in the Configuration
- getCacheFiles(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get cache files set in the Configuration
- getCacheFlushListener() -
Method in class org.apache.hadoop.hbase.HRegionServer
-
- getCallQueueLen() -
Method in interface org.apache.hadoop.ipc.metrics.RpcMgtMBean
- The number of rpc calls in the queue.
- getCallQueueLen() -
Method in class org.apache.hadoop.ipc.Server
- The number of rpc calls in the queue.
- getCapacity() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.FSDatasetMBean
- Returns total capacity (in bytes) of storage (used and unused)
- getCapacity() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- The raw capacity.
- getCapacity() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem.DiskStatus
-
- getCapacity() -
Method in class org.apache.hadoop.fs.DF
-
- getCapacity() -
Method in class org.apache.hadoop.io.BytesWritable
- Get the capacity, which is the maximum size that could be handled without resizing the backing storage.
- getCapacity() -
Method in class org.apache.hadoop.record.Buffer
- Get the capacity, which is the maximum count that could be handled without resizing the backing storage.
- getCapacityRemaining() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.FSNamesystemMBean
- Free (unused) storage capacity
- getCapacityTotal() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.FSNamesystemMBean
- Total storage capacity
- getCapacityUsed() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.FSNamesystemMBean
- Used storage capacity
- getCategory(List<List<Pentomino.ColumnName>>) -
Method in class org.apache.hadoop.examples.dancing.Pentomino
- Find whether the solution has the x in the upper left quadrant, the
x-midline, the y-midline or in the center.
- getChannelPosition(Block, FSDatasetInterface.BlockWriteStreams) -
Method in interface org.apache.hadoop.dfs.FSDatasetInterface
- Returns the current offset in the data stream.
- getChecksumFile(Path) -
Method in class org.apache.hadoop.fs.ChecksumFileSystem
- Return the name of the checksum file associated with a file.
- getChecksumFileLength(Path, long) -
Method in class org.apache.hadoop.fs.ChecksumFileSystem
- Return the length of the checksum file given the size of the
actual file.
- getChecksumHeaderSize() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getChecksumLength(long, int) -
Static method in class org.apache.hadoop.fs.ChecksumFileSystem
- Calculates the length of the checksum file in bytes.
- getChecksumSize() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getChecksumType() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getChunkPosition(long) -
Method in class org.apache.hadoop.fs.FSInputChecker
- Return position of beginning of chunk containing pos.
- getClass(String, Class<?>) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the name property as a Class.
- getClass(String, Class<? extends U>, Class<U>) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the name property as a Class implementing the interface specified by xface.
- getClass(byte) -
Method in class org.apache.hadoop.hbase.io.HbaseMapWritable
-
- getClass(byte) -
Method in class org.apache.hadoop.io.AbstractMapWritable
-
- getClass(String, Configuration) -
Static method in class org.apache.hadoop.io.WritableName
- Return the class for a name.
- getClassByName(String) -
Method in class org.apache.hadoop.conf.Configuration
- Load a class by name.
- getClassByName(String) -
Static method in class org.apache.hadoop.contrib.utils.join.DataJoinJob
-
- getClassLoader() -
Method in class org.apache.hadoop.conf.Configuration
- Get the ClassLoader for this job.
- getClassName() -
Method in exception org.apache.hadoop.ipc.RemoteException
-
- getClientVersion() -
Method in exception org.apache.hadoop.hbase.ipc.HbaseRPC.VersionMismatch
- Get the client's preferred version
- getClientVersion() -
Method in exception org.apache.hadoop.ipc.RPC.VersionMismatch
- Get the client's preferred version
- getClosest(WritableComparable, Writable) -
Method in class org.apache.hadoop.io.MapFile.Reader
- Finds the record that is the closest match to the specified key.
- getClosest(WritableComparable, Writable, boolean) -
Method in class org.apache.hadoop.io.MapFile.Reader
- Finds the record that is the closest match to the specified key.
- getClosestRowBefore(Text, long) -
Method in class org.apache.hadoop.hbase.HRegion
- Return all the data for the row that matches row exactly,
or the one that immediately precedes it, at or immediately before
ts.
- getClosestRowBefore(Text, Text) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Return all the data for the row that matches row exactly,
or the one that immediately precedes it.
- getClosestRowBefore(Text, Text, long) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Return all the data for the row that matches row exactly,
or the one that immediately precedes it, at or immediately before
ts.
- getClosestRowBefore(Text, Text) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Return all the data for the row that matches row exactly,
or the one that immediately precedes it.
- getClosestRowBefore(Text, Text, long) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Return all the data for the row that matches row exactly,
or the one that immediately precedes it, at or immediately before
ts.
- getClusterNick() -
Method in class org.apache.hadoop.streaming.StreamJob
-
- getClusterStatus() -
Method in class org.apache.hadoop.mapred.JobClient
- Get status information about the Map-Reduce cluster.
- getClusterStatus() -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Get the current status of the cluster
- getClusterStatus() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getCodec(Path) -
Method in class org.apache.hadoop.io.compress.CompressionCodecFactory
- Find the relevant compression codec for the given file based on its
filename suffix.
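The suffix-based codec lookup above can be sketched as follows, assuming the Hadoop jar on the classpath; the file path is hypothetical. The factory is built from the codec classes named by the io.compression.codecs configuration property:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;

public class CodecLookup {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        CompressionCodecFactory factory = new CompressionCodecFactory(conf);
        // Matches the ".gz" suffix against the registered codecs.
        CompressionCodec codec = factory.getCodec(new Path("logs/events.gz"));
        System.out.println(codec == null
            ? "no codec for this suffix"
            : codec.getClass().getName());
    }
}
```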
- getCodecClasses(Configuration) -
Static method in class org.apache.hadoop.io.compress.CompressionCodecFactory
- Get the list of codecs listed in the configuration
- getColumn() -
Method in class org.apache.hadoop.hbase.HLogEdit
-
- getColumn() -
Method in class org.apache.hadoop.hbase.hql.generated.HQLParser
-
- getColumn() -
Method in class org.apache.hadoop.hbase.hql.generated.SimpleCharStream
- Deprecated.
- getColumn(int) -
Method in class org.apache.hadoop.hbase.hql.InsertCommand
-
- getColumn() -
Method in class org.apache.hadoop.hbase.HStoreKey
-
- getColumn() -
Method in class org.apache.hadoop.hbase.io.BatchOperation
-
- getColumn() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
- Deprecated.
- getColumnDescriptor(String, Map<String, Object>, HColumnDescriptor) -
Method in class org.apache.hadoop.hbase.hql.AlterCommand
- Given a column name, column spec, and original descriptor, returns an
instance of HColumnDescriptor representing the column spec, with empty
values drawn from the original as defaults
- getColumnDescriptor(String, Map<String, Object>) -
Method in class org.apache.hadoop.hbase.hql.SchemaModificationCommand
- Given a column name and column spec, returns an instance of
HColumnDescriptor representing the column spec.
- getColumnDescriptors(byte[]) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getColumnDescriptors(byte[]) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- List all the column families associated with a table.
- getColumnDescriptors(byte[]) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getColumnList(HBaseAdmin, HTable) -
Method in class org.apache.hadoop.hbase.hql.DeleteCommand
-
- getColumnName(int) -
Method in class org.apache.hadoop.examples.dancing.DancingLinks
- Get the name of a given column as a string
- getColumns() -
Method in class org.apache.hadoop.hbase.hql.generated.HQLParser
-
- getColumns(HBaseAdmin, boolean) -
Method in class org.apache.hadoop.hbase.hql.SelectCommand
- Make sense of the supplied list of columns.
- getCombinerClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the user-defined combiner class used to combine map-outputs
before being sent to the reducers.
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.DoubleValueSum
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMax
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMin
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueSum
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMax
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMin
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.UniqValueCount
-
- getCombinerOutput() -
Method in interface org.apache.hadoop.mapred.lib.aggregate.ValueAggregator
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.ValueHistogram
-
- getCommand() -
Method in class org.apache.hadoop.hbase.hql.BasicCommand
- Basic commands are their own factories.
- getCommand() -
Method in interface org.apache.hadoop.hbase.hql.CommandFactory
-
- getCommandLine() -
Method in class org.apache.hadoop.util.GenericOptionsParser
- Returns the commons-cli CommandLine object to process the parsed arguments.
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.AlterCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.BasicCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.ClearCommand
-
- getCommandType() -
Method in interface org.apache.hadoop.hbase.hql.Command
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.CreateCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.DeleteCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.DisableCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.DropCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.EnableCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.ExitCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.FsCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.InsertCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.JarCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.hql.TruncateCommand
-
- getComparator() -
Method in class org.apache.hadoop.mapred.join.CompositeRecordReader
- Return comparator defining the ordering for RecordReaders in this
composite.
- getCompression() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getCompressionCodec() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns the compression codec of data in this file.
- getCompressionCodec() -
Method in class org.apache.hadoop.io.SequenceFile.Writer
- Returns the compression codec of data in this file.
- getCompressionType() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getCompressionType(Configuration) -
Static method in class org.apache.hadoop.io.SequenceFile
- Get the compression type for the reduce outputs
- getCompressMapOutput() -
Method in class org.apache.hadoop.mapred.JobConf
- Are the outputs of the maps compressed?
- getCompressorType() -
Method in interface org.apache.hadoop.io.compress.CompressionCodec
- Get the type of
Compressor
needed by this CompressionCodec
.
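The `getCompressorType()`/`getDecompressorType()` pair above hints at the usual codec workflow: obtain a stream from the codec and let it compress or decompress on the fly. A minimal sketch, assuming the Hadoop jars are on the classpath and using `GzipCodec` purely as an illustrative choice (any `CompressionCodec` in this index follows the same pattern):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.compress.CompressionInputStream;
import org.apache.hadoop.io.compress.CompressionOutputStream;
import org.apache.hadoop.io.compress.GzipCodec;

public class CodecRoundTrip {
    /** Compress data with a codec, then decompress it again. */
    public static byte[] roundTrip(byte[] data) throws Exception {
        GzipCodec codec = new GzipCodec();
        codec.setConf(new Configuration());
        // Compress into an in-memory buffer.
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        CompressionOutputStream out = codec.createOutputStream(compressed);
        out.write(data);
        out.close();
        // Decompress the buffer back to the original bytes.
        CompressionInputStream in =
            codec.createInputStream(new ByteArrayInputStream(compressed.toByteArray()));
        ByteArrayOutputStream restored = new ByteArrayOutputStream();
        int b;
        while ((b = in.read()) != -1) {
            restored.write(b);
        }
        return restored.toByteArray();
    }
}
```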
- getCompressorType() -
Method in class org.apache.hadoop.io.compress.DefaultCodec
-
- getCompressorType() -
Method in class org.apache.hadoop.io.compress.GzipCodec
-
- getCompressorType() -
Method in class org.apache.hadoop.io.compress.LzoCodec
-
- getCompressOutput(JobConf) -
Static method in class org.apache.hadoop.mapred.OutputFormatBase
- Is the job output compressed?
- getConf() -
Method in interface org.apache.hadoop.conf.Configurable
- Return the configuration used by this object.
- getConf() -
Method in class org.apache.hadoop.conf.Configured
-
- getConf() -
Method in class org.apache.hadoop.dfs.Balancer
- return this balancer's configuration
- getConf() -
Method in class org.apache.hadoop.fs.FilterFileSystem
-
- getConf() -
Method in class org.apache.hadoop.hbase.io.HbaseMapWritable
-
- getConf() -
Method in class org.apache.hadoop.hbase.io.HbaseObjectWritable
-
- getConf() -
Method in class org.apache.hadoop.io.AbstractMapWritable
-
- getConf() -
Method in class org.apache.hadoop.io.compress.DefaultCodec
-
- getConf() -
Method in class org.apache.hadoop.io.compress.LzoCodec
-
- getConf() -
Method in class org.apache.hadoop.io.GenericWritable
-
- getConf() -
Method in class org.apache.hadoop.io.ObjectWritable
-
- getConf() -
Method in class org.apache.hadoop.mapred.join.CompositeRecordReader
- Return the configuration used by this object.
- getConf() -
Method in class org.apache.hadoop.mapred.SequenceFileInputFilter.FilterBase
-
- getConf() -
Method in class org.apache.hadoop.net.SocksSocketFactory
-
- getConf() -
Method in class org.apache.hadoop.tools.Logalyzer.LogComparator
-
- getConf() -
Method in class org.apache.hadoop.util.CopyFiles
-
- getConf() -
Method in class org.apache.hadoop.util.ToolBase
- Deprecated.
- getConfiguration() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getConfResourceAsInputStream(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get an input stream attached to the configuration resource with the
given
name
.
- getConfResourceAsReader(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get a
Reader
attached to the configuration resource with the
given name
.
- getConnection(HBaseConfiguration) -
Static method in class org.apache.hadoop.hbase.HConnectionManager
- Get the connection object for the instance specified by the configuration.
If no current connection exists, create a new connection for that instance.
- getConnection() -
Method in class org.apache.hadoop.hbase.HTable
-
- getContentLength(Path) -
Method in class org.apache.hadoop.dfs.ChecksumDistributedFileSystem
-
- getContentLength(Path) -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getContentLength(String) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getContentLength(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Return the number of bytes of the given path:
if f is a file, return the size of the file;
if f is a directory, return the size of the directory tree.
- getContentLength(Path) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getContentType(String) -
Static method in enum org.apache.hadoop.hbase.rest.GenericHandler.ContentType
- Utility method used when looking at Accept header content.
- getContext(String) -
Method in class org.apache.hadoop.metrics.ContextFactory
- Returns the named MetricsContext instance, constructing it if necessary
using the factory's current configuration attributes.
- getContext(String) -
Static method in class org.apache.hadoop.metrics.MetricsUtil
- Utility method to return the named context.
- getContext() -
Method in class org.apache.hadoop.streaming.PipeMapRed
-
- getContextFactory() -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Returns the factory by which this context was created.
- getContextName() -
Method in interface org.apache.hadoop.metrics.MetricsContext
- Returns the context name.
- getContextName() -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Returns the context name.
- getCopyBlockOpAverageTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getCopyBlockOpAverageTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Average time for CopyBlock Operation in last interval
- getCopyBlockOpMaxTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getCopyBlockOpMaxTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Maximum CopyBlock Operation Time since reset was called
- getCopyBlockOpMinTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getCopyBlockOpMinTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Minimum CopyBlock Operation Time since reset was called
- getCopyBlockOpNum() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getCopyBlockOpNum() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of CopyBlock Operation in last interval
- getCorruptFiles() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the number of corrupted files.
- getCount() -
Method in class org.apache.hadoop.record.Buffer
- Get the current count of the buffer.
- getCounter() -
Method in class org.apache.hadoop.mapred.Counters.Counter
- What is the current value of this counter?
- getCounter(Enum) -
Method in class org.apache.hadoop.mapred.Counters
- Returns current value of the specified counter, or 0 if the counter
does not exist.
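The "or 0 if the counter does not exist" behavior of `Counters.getCounter(Enum)` can be seen directly on a standalone `Counters` object; the enum name below is hypothetical, chosen only for this sketch:

```java
import org.apache.hadoop.mapred.Counters;

public class CounterDemo {
    // Hypothetical counter enum for illustration; real jobs define their own.
    enum MyCounter { RECORDS_SEEN, RECORDS_SKIPPED }

    public static long demo() {
        Counters counters = new Counters();
        counters.incrCounter(MyCounter.RECORDS_SEEN, 3);
        // A counter that was never incremented reads back as 0, not an error.
        long untouched = counters.getCounter(MyCounter.RECORDS_SKIPPED); // 0
        return counters.getCounter(MyCounter.RECORDS_SEEN) + untouched;
    }
}
```

Inside a running task the same counters are normally incremented through `Reporter.incrCounter` rather than built by hand.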
- getCounter(String) -
Method in class org.apache.hadoop.mapred.Counters.Group
- Deprecated.
- getCounter(int, String) -
Method in class org.apache.hadoop.mapred.Counters.Group
- Get the counter for the given id and create it if it doesn't exist.
- getCounterNames() -
Method in class org.apache.hadoop.mapred.Counters.Group
- Deprecated. Iterate through the group instead.
- getCounters() -
Method in interface org.apache.hadoop.mapred.RunningJob
- Gets the counters for this job.
- getCounters() -
Method in class org.apache.hadoop.mapred.TaskReport
- A table of counters.
- getCurrentSplit(JobConf) -
Static method in class org.apache.hadoop.streaming.StreamUtil
-
- getCurrentTrashDir() -
Method in class org.apache.hadoop.fs.FsShell
- Returns the Trash object associated with this shell.
- getCurrentValue(Writable) -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Get the 'value' corresponding to the last read 'key'.
- getCurrentValue(V) -
Method in class org.apache.hadoop.mapred.SequenceFileRecordReader
-
- getData() -
Method in class org.apache.hadoop.contrib.utils.join.TaggedMapOutput
-
- getData() -
Method in class org.apache.hadoop.io.DataOutputBuffer
- Returns the current contents of the buffer.
- getDataNode() -
Static method in class org.apache.hadoop.dfs.DataNode
- Return the DataNode object
- getDatanodeReport() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- A formatted string for reporting the status of the DataNode.
- getDatanodeReport(FSConstants.DatanodeReportType) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getDataNodeStats() -
Method in class org.apache.hadoop.dfs.ChecksumDistributedFileSystem
- Return statistics for each datanode.
- getDataNodeStats() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Return statistics for each datanode.
- getDate() -
Static method in class org.apache.hadoop.util.VersionInfo
- The date that Hadoop was compiled.
- getDeclaredClass() -
Method in class org.apache.hadoop.hbase.io.HbaseObjectWritable
- Return the class this is meant to be.
- getDeclaredClass() -
Method in class org.apache.hadoop.io.ObjectWritable
- Return the class this is meant to be.
- getDecodabet(int) -
Static method in class org.apache.hadoop.hbase.util.Base64
- Returns one of the _SOMETHING_DECODABET byte arrays depending on the
options specified.
- getDecompressorType() -
Method in interface org.apache.hadoop.io.compress.CompressionCodec
- Get the type of
Decompressor
needed by this CompressionCodec
.
- getDecompressorType() -
Method in class org.apache.hadoop.io.compress.DefaultCodec
-
- getDecompressorType() -
Method in class org.apache.hadoop.io.compress.GzipCodec
-
- getDecompressorType() -
Method in class org.apache.hadoop.io.compress.LzoCodec
-
- getDefault() -
Static method in class org.apache.hadoop.fs.permission.FsPermission
- Get the default permission.
- getDefaultBlockSize() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getDefaultBlockSize() -
Method in class org.apache.hadoop.fs.FileSystem
- Return the number of bytes that large input files should optimally
be split into to minimize i/o time.
- getDefaultBlockSize() -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Return the number of bytes that large input files should optimally
be split into to minimize i/o time.
- getDefaultBlockSize() -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getDefaultExtension() -
Method in interface org.apache.hadoop.io.compress.CompressionCodec
- Get the default filename extension for this kind of compression.
- getDefaultExtension() -
Method in class org.apache.hadoop.io.compress.DefaultCodec
-
- getDefaultExtension() -
Method in class org.apache.hadoop.io.compress.GzipCodec
-
- getDefaultExtension() -
Method in class org.apache.hadoop.io.compress.LzoCodec
- Get the default filename extension for this kind of compression.
- getDefaultHost(String, String) -
Static method in class org.apache.hadoop.net.DNS
- Returns the default (first) host name associated by the provided
nameserver with the address bound to the specified network interface
- getDefaultHost(String) -
Static method in class org.apache.hadoop.net.DNS
- Returns the default (first) host name associated by the default
nameserver with the address bound to the specified network interface
- getDefaultIP(String) -
Static method in class org.apache.hadoop.net.DNS
- Returns the first available IP address associated with the provided
network interface
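The `DNS` helpers above all take an interface name; a minimal sketch, assuming the Hadoop jars are available, uses the special interface name `"default"` to ask for the machine's default interface rather than a specific NIC such as `"eth0"`:

```java
import org.apache.hadoop.net.DNS;

public class DnsDemo {
    /** Resolve the local machine's default host name via DNS. */
    public static String localName() throws Exception {
        // "default" means: use the default network interface and the
        // default nameserver, instead of naming an interface explicitly.
        return DNS.getDefaultHost("default");
    }
}
```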
- getDefaultReplication() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getDefaultReplication() -
Method in class org.apache.hadoop.fs.FileSystem
- Get the default replication.
- getDefaultReplication() -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Get the default replication.
- getDefaultReplication() -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getDefaultSocketFactory(Configuration) -
Static method in class org.apache.hadoop.net.NetUtils
- Get the default socket factory as specified by the configuration
parameter hadoop.rpc.socket.factory.default
- getDelegate() -
Method in class org.apache.hadoop.mapred.join.CompositeRecordReader
- Obtain an iterator over the child RRs apropos of the value type
ultimately emitted from this join.
- getDelegate() -
Method in class org.apache.hadoop.mapred.join.JoinRecordReader
- Return an iterator wrapping the JoinCollector.
- getDelegate() -
Method in class org.apache.hadoop.mapred.join.MultiFilterRecordReader
- Return an iterator returning a single value from the tuple.
- getDependants() -
Method in class org.apache.hadoop.hbase.generated.master.hql_jsp
-
- getDependants() -
Method in class org.apache.hadoop.hbase.generated.master.master_jsp
-
- getDependants() -
Method in class org.apache.hadoop.hbase.generated.regionserver.regionserver_jsp
-
- getDependingJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getDescription() -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Description of the upgrade object for displaying.
- getDfsUsed() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.FSDatasetMBean
- Returns the total space (in bytes) used by dfs datanode
- getDfsUsed() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- The used space by the data node.
- getDfsUsed() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem.DiskStatus
-
- getDiagnostics() -
Method in class org.apache.hadoop.mapred.TaskReport
- A list of error messages.
- getDigest() -
Method in class org.apache.hadoop.io.MD5Hash
- Returns the digest bytes.
- getDirPath() -
Method in class org.apache.hadoop.fs.DF
-
- getDirPath() -
Method in class org.apache.hadoop.fs.DU
-
- getDiskStatus() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Return the disk usage of the filesystem, including total capacity,
used space, and remaining space
- getDisplayName() -
Method in class org.apache.hadoop.mapred.Counters.Counter
- Get the name of the counter.
- getDisplayName() -
Method in class org.apache.hadoop.mapred.Counters.Group
- Returns localized name of the group.
- getDisplayName(String) -
Method in class org.apache.hadoop.mapred.Counters.Group
- Deprecated. Get the counter directly.
- getDistance(Node, Node) -
Method in class org.apache.hadoop.net.NetworkTopology
- Return the distance between two nodes
It is assumed that the distance from one node to its parent is 1
The distance between two nodes is calculated by summing up their distances
to their closest common ancestor.
- getDoubleValue(Object) -
Method in class org.apache.hadoop.contrib.utils.join.JobBase
-
- getDU(File) -
Static method in class org.apache.hadoop.fs.FileUtil
- Takes an input dir and returns the du on that local directory.
- getEditLogSize() -
Method in class org.apache.hadoop.dfs.NameNode
- Returns the size of the current edit log.
- getElementTypeID() -
Method in class org.apache.hadoop.record.meta.VectorTypeID
-
- getEmptier() -
Method in class org.apache.hadoop.fs.Trash
- Return a
Runnable
that periodically empties the trash of all
users, intended to be run by the superuser.
- getEncodedName() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getEndColumn() -
Method in class org.apache.hadoop.hbase.hql.generated.SimpleCharStream
-
- getEndColumn() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getEndKey() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getEndKey() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getEndLine() -
Method in class org.apache.hadoop.hbase.hql.generated.SimpleCharStream
-
- getEndLine() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getEndRow() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getEntry(MapFile.Reader[], Partitioner<K, V>, K, V) -
Static method in class org.apache.hadoop.mapred.MapFileOutputFormat
- Get an entry from output generated by this class.
- getError() -
Static method in class org.apache.hadoop.metrics.jvm.EventCounter
-
- getEventId() -
Method in class org.apache.hadoop.mapred.TaskCompletionEvent
- Returns event Id.
- getExcessiveReplicas() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the number of over-replicated blocks.
- getExcludedHosts() -
Method in class org.apache.hadoop.util.HostsFileReader
-
- getExecString() -
Method in class org.apache.hadoop.fs.DF
-
- getExecString() -
Method in class org.apache.hadoop.fs.DU
-
- getExecString() -
Method in class org.apache.hadoop.util.Shell
- Return an array containing the command name and its parameters.
- getExecString() -
Method in class org.apache.hadoop.util.Shell.ShellCommandExecutor
-
- getExecutable(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Get the URI of the application's executable.
- getExitCode() -
Method in exception org.apache.hadoop.util.Shell.ExitCodeException
-
- getExitCode() -
Method in class org.apache.hadoop.util.Shell
- Get the exit code.
- getExitCode() -
Method in class org.apache.hadoop.util.ShellUtil
- Deprecated.
- getFactor() -
Method in class org.apache.hadoop.io.SequenceFile.Sorter
- Get the number of streams to merge at once.
- getFactory(Class) -
Static method in class org.apache.hadoop.io.WritableFactories
- Define a factory for a class.
- getFactory() -
Static method in class org.apache.hadoop.metrics.ContextFactory
- Returns the singleton ContextFactory instance, constructing it if
necessary.
- getFailedJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- getFamilies() -
Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getFamilyName() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getFatal() -
Static method in class org.apache.hadoop.metrics.jvm.EventCounter
-
- getFetchFailedMaps() -
Method in class org.apache.hadoop.mapred.ReduceTaskStatus
-
- getFieldID() -
Method in class org.apache.hadoop.record.meta.FieldTypeInfo
- get the field's id (name)
- getFieldTypeInfos() -
Method in class org.apache.hadoop.record.meta.RecordTypeInfo
- Return a collection of field type infos
- getFieldTypeInfos() -
Method in class org.apache.hadoop.record.meta.StructTypeID
-
- getFile(String, String) -
Method in class org.apache.hadoop.conf.Configuration
- Get a local file name under a directory named in dirsProp with
the given path.
- getFile() -
Method in class org.apache.hadoop.mapred.FileSplit
- Deprecated. Call
FileSplit.getPath()
instead.
- getFileCacheHints(Path, long, long) -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getFileCacheHints(Path, long, long) -
Method in class org.apache.hadoop.fs.FileSystem
- Return a 2D array of size 1x1 or greater, containing hostnames
where portions of the given file can be found.
- getFileCacheHints(Path, long, long) -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Return a 2D array of size 1x1 or greater, containing hostnames
where portions of the given file can be found.
- getFileCacheHints(Path, long, long) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
- Return null if the file doesn't exist; otherwise, get the
locations of the various chunks of the file from KFS.
- getFileCacheHints(Path, long, long) -
Method in class org.apache.hadoop.mapred.PhasedFileSystem
- Deprecated.
- getFileClassPaths(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get the file entries in classpath as an array of Path
- getFileInfo(String) -
Method in class org.apache.hadoop.dfs.NameNode
- Get the file info for a specific file.
- getFileLength() -
Method in class org.apache.hadoop.dfs.LocatedBlocks
-
- getFileName() -
Method in class org.apache.hadoop.metrics.file.FileContext
- Returns the configured file name, or null.
- getFiles(PathFilter) -
Method in class org.apache.hadoop.fs.InMemoryFileSystem
-
- getFileStatus(Path) -
Method in class org.apache.hadoop.dfs.ChecksumDistributedFileSystem
- Returns the stat information about the file.
- getFileStatus(Path) -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Returns the stat information about the file.
- getFileStatus(Path) -
Method in class org.apache.hadoop.dfs.HftpFileSystem
-
- getFileStatus(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Return a file status object that represents the path.
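The `FileStatus` returned by `FileSystem.getFileStatus(Path)` carries the path's metadata (length, directory flag, owner, and so on). A minimal sketch against the local filesystem, assuming the Hadoop jars are on the classpath:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class StatDemo {
    /** Return the length in bytes of a local file via the FileSystem API. */
    public static long lengthOf(String localPath) throws Exception {
        FileSystem fs = FileSystem.getLocal(new Configuration());
        FileStatus status = fs.getFileStatus(new Path(localPath));
        // FileStatus also exposes isDir(), getReplication(), getOwner(), etc.
        return status.getLen();
    }
}
```

The same call works unchanged against `DistributedFileSystem`; only the way the `FileSystem` instance is obtained differs.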
- getFileStatus(Path) -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Get file status.
- getFileStatus(Path) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getFileStatus(Path) -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
-
- getFileStatus(Path) -
Method in class org.apache.hadoop.fs.s3.S3FileSystem
- FileStatus for S3 file systems.
- getFilesTotal() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.FSNamesystemMBean
- Total number of files and directories
- getFilesystem() -
Method in class org.apache.hadoop.fs.DF
-
- getFileSystem(Configuration) -
Method in class org.apache.hadoop.fs.Path
- Return the FileSystem that owns this Path.
- getFileSystem() -
Method in class org.apache.hadoop.mapred.TaskTracker
- Return the DFS filesystem
- getFilesystemName() -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- A MapReduce system always operates on a single filesystem.
- getFilesystemName() -
Method in class org.apache.hadoop.mapred.JobTracker
- Grab the local fs name
- getFileTimestamps(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get the timestamps of the files
- getFileType() -
Method in class org.apache.hadoop.fs.s3.INode
-
- getFinishTime() -
Method in class org.apache.hadoop.mapred.TaskReport
- Get finish time of task.
- getFlippable() -
Method in class org.apache.hadoop.examples.dancing.Pentomino.Piece
-
- getFloat(String, float) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the
name
property as a float
.
- getFormattedTimeWithDiff(DateFormat, long, long) -
Static method in class org.apache.hadoop.util.StringUtils
- Formats time in ms and appends difference (finishTime - startTime)
as returned by formatTimeDiff().
- getFs() -
Method in class org.apache.hadoop.mapred.JobClient
- Get a filesystem handle.
- getFSDataset() -
Method in class org.apache.hadoop.dfs.DataNode
- This method is used for testing.
- getFsEditName() -
Method in class org.apache.hadoop.dfs.NameNode
- Returns the name of the edits file
- getFSImageLoadTime() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getFSImageLoadTime() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- Time spent loading the FS Image at startup
- getFsImageName() -
Method in class org.apache.hadoop.dfs.NameNode
- Returns the name of the fsImage file
- getFsImageNameCheckpoint() -
Method in class org.apache.hadoop.dfs.NameNode
- Returns the name of the fsImage file uploaded by periodic
checkpointing
- getFSSize() -
Method in class org.apache.hadoop.fs.InMemoryFileSystem
-
- getFSState() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.FSNamesystemMBean
- The state of the file system: Safemode or Operational
- getFull(Text) -
Method in class org.apache.hadoop.hbase.HRegion
- Fetch all the columns for the indicated row.
- getFull(Text, long) -
Method in class org.apache.hadoop.hbase.HRegion
- Fetch all the columns for the indicated row at a specified timestamp.
- getGET_PERMISSION_COMMAND() -
Static method in class org.apache.hadoop.util.Shell
- Return a Unix command to get permission information.
- getGroup() -
Method in class org.apache.hadoop.fs.FileStatus
- Get the group associated with the file.
- getGroup(String) -
Method in class org.apache.hadoop.mapred.Counters
- Returns the named counter group, or an empty group if there is none
with the specified name.
- getGroupAction() -
Method in class org.apache.hadoop.fs.permission.FsPermission
- Return group
FsAction
.
- getGroupName() -
Method in class org.apache.hadoop.fs.permission.PermissionStatus
- Return group name
- getGroupNames() -
Method in class org.apache.hadoop.mapred.Counters
- Returns the names of all counter classes.
- getGroupNames() -
Method in class org.apache.hadoop.security.UnixUserGroupInformation
- Return an array of group names
- getGroupNames() -
Method in interface org.apache.hadoop.security.UserGroupInformation
- Get the names of the groups that the user belongs to.
- getGROUPS_COMMAND() -
Static method in class org.apache.hadoop.util.Shell
- A Unix command to get the current user's groups list.
- getHadoopClientHome() -
Method in class org.apache.hadoop.streaming.StreamJob
-
- getHeader() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getHeartbeatsAverageTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getHeartbeatsAverageTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Average time for Heartbeat Operation in last interval
- getHeartbeatsMaxTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getHeartbeatsMaxTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Maximum Heartbeat Operation Time since reset was called
- getHeartbeatsMinTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getHeartbeatsMinTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Minimum Heartbeat Operation Time since reset was called
- getHeartbeatsNum() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getHeartbeatsNum() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of Heartbeat Operation in last interval
- getHomeDirectory() -
Method in class org.apache.hadoop.fs.FileSystem
- Return the current user's home directory in this filesystem.
- getHomeDirectory() -
Method in class org.apache.hadoop.fs.FilterFileSystem
-
- getHomeDirectory() -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
-
- getHost() -
Method in class org.apache.hadoop.dfs.DatanodeID
-
- getHost() -
Method in class org.apache.hadoop.streaming.Environment
-
- getHostName() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
-
- getHostname() -
Static method in class org.apache.hadoop.util.StringUtils
- Return hostname without throwing exception.
- getHosts(String, String) -
Static method in class org.apache.hadoop.net.DNS
- Returns all the host names associated by the provided nameserver with the
address bound to the specified network interface
- getHosts(String) -
Static method in class org.apache.hadoop.net.DNS
- Returns all the host names associated by the default nameserver with the
address bound to the specified network interface
- getHosts() -
Method in class org.apache.hadoop.util.HostsFileReader
-
- getHRegionConnection(HServerAddress) -
Method in interface org.apache.hadoop.hbase.HConnection
- Establishes a connection to the region server at the specified address.
- getHRegionInfo(Map<Text, byte[]>) -
Method in class org.apache.hadoop.hbase.HMaster
-
- getHRegionInfo(byte[]) -
Static method in class org.apache.hadoop.hbase.util.Writables
-
- getHRegionInfoOrNull(byte[]) -
Static method in class org.apache.hadoop.hbase.util.Writables
-
- getId() -
Method in class org.apache.hadoop.fs.s3.Block
-
- getId(Class<?>) -
Method in class org.apache.hadoop.hbase.io.HbaseMapWritable
-
- getId(Class) -
Method in class org.apache.hadoop.io.AbstractMapWritable
-
- GetImage() -
Method in class org.apache.hadoop.hbase.hql.generated.SimpleCharStream
-
- GetImage() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- GetImageServlet - Class in org.apache.hadoop.dfs
- This class is used in Namesystem's jetty to retrieve a file.
- GetImageServlet() -
Constructor for class org.apache.hadoop.dfs.GetImageServlet
-
- getIndexInterval() -
Method in class org.apache.hadoop.io.MapFile.Writer
- The number of entries that are added before an index entry is added.
- getInetSocketAddress() -
Method in class org.apache.hadoop.hbase.HServerAddress
-
- getInfo() -
Static method in class org.apache.hadoop.metrics.jvm.EventCounter
-
- getInfoPort() -
Method in class org.apache.hadoop.dfs.DatanodeID
-
- getInfoPort() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getInfoPort() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getInodeLimitText() -
Method in class org.apache.hadoop.dfs.JspHelper
-
- getInputFormat() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
InputFormat
implementation for the map-reduce job,
defaults to TextInputFormat
if not specified explicitly.
- getInputKeyClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Deprecated. Call
RecordReader.createKey()
.
- getInputPaths() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the list of input
Path
s for the map-reduce job.
- getInputSplit() -
Method in interface org.apache.hadoop.mapred.Reporter
- Get the
InputSplit
object for a map.
- getInputValueClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Deprecated. Call
RecordReader.createValue()
.
- getInt(String, int) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the
name
property as an int
.
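The typed `Configuration` getters (`getInt`, `getFloat`, and friends) parse the stored string value and fall back to the supplied default when the property is absent. A minimal sketch, with hypothetical property names chosen only for illustration:

```java
import org.apache.hadoop.conf.Configuration;

public class ConfDemo {
    public static int readFlag() {
        Configuration conf = new Configuration();
        conf.set("my.example.flag", "42");   // hypothetical property name
        // The second argument is the default, returned when the property
        // is missing (or cannot be parsed as the requested type).
        float missing = conf.getFloat("my.example.absent", 1.5f); // -> 1.5f
        return conf.getInt("my.example.flag", 0);                 // -> 42
    }
}
```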
- getInterfaceName() -
Method in exception org.apache.hadoop.hbase.ipc.HbaseRPC.VersionMismatch
- Get the interface name
- getInterfaceName() -
Method in exception org.apache.hadoop.ipc.RPC.VersionMismatch
- Get the interface name
- getInternalFilter() -
Method in class org.apache.hadoop.hbase.filter.WhileMatchRowFilter
- Returns the internal filter being wrapped
- getIPs(String) -
Static method in class org.apache.hadoop.net.DNS
- Returns all the IPs associated with the provided interface, if any, in
textual form.
- getIsJavaMapper(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Check whether the job is using a Java Mapper.
- getIsJavaRecordReader(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Check whether the job is using a Java RecordReader
- getIsJavaRecordWriter(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Will the reduce use a Java RecordWriter?
- getIsJavaReducer(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Check whether the job is using a Java Reducer.
- getIsMap() -
Method in class org.apache.hadoop.mapred.MapTaskStatus
-
- getIsMap() -
Method in class org.apache.hadoop.mapred.ReduceTaskStatus
-
- getJar() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the user jar for the map-reduce job.
- getJob(String) -
Method in class org.apache.hadoop.mapred.JobClient
- Get a
RunningJob
object to track an ongoing job.
- getJob(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getJobClient() -
Method in class org.apache.hadoop.mapred.TaskTracker
- The connection to the JobTracker, used by the TaskRunner
for locating remote files.
- getJobConf() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getJobCounters(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Grab the current job counters
- getJobCounters(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getJobEndNotificationURI() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the URI to be invoked in order to send a notification after the job
has completed (success/failure).
- getJobFile() -
Method in class org.apache.hadoop.mapred.JobProfile
- Get the configuration file for the job.
- getJobFile() -
Method in interface org.apache.hadoop.mapred.RunningJob
- Get the path of the submitted job configuration.
- getJobID() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getJobId() -
Method in class org.apache.hadoop.mapred.JobProfile
- Get the job id.
- getJobId() -
Method in class org.apache.hadoop.mapred.JobStatus
-
- getJobID() -
Method in interface org.apache.hadoop.mapred.RunningJob
- Get the job identifier.
- getJobName() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the user-specified job name.
- getJobName() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getJobName() -
Method in class org.apache.hadoop.mapred.JobProfile
- Get the user-specified job name.
- getJobName() -
Method in interface org.apache.hadoop.mapred.RunningJob
- Get the name of the job.
- getJobPriority() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
JobPriority
for this job.
- getJobProfile(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Grab a handle to a job that is already known to the JobTracker.
- getJobProfile(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getJobStatus(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Grab a handle to a job that is already known to the JobTracker.
- getJobStatus(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getJobTrackerHostPort() -
Method in class org.apache.hadoop.streaming.StreamJob
-
- getJobTrackerMachine() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getJobTrackerState() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Get the current state of the
JobTracker
,
as JobTracker.State
- getJournalSyncAverageTime() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getJournalSyncAverageTime() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- Average time for Journal Sync in last interval
- getJournalSyncMaxTime() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getJournalSyncMaxTime() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- The Maximum Journal Sync Time since reset was called
- getJournalSyncMinTime() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getJournalSyncMinTime() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- The Minimum Journal Sync Time since reset was called
- getJournalSyncNum() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getJournalSyncNum() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- Number of Journal Syncs in the last interval
- getJournalTransactionAverageTime() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getJournalTransactionAverageTime() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- Average time for Journal transactions in last interval
- getJournalTransactionMaxTime() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getJournalTransactionMaxTime() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- The Maximum Journal Transaction Time since reset was called
- getJournalTransactionMinTime() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getJournalTransactionMinTime() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- The Minimum Journal Transaction Time since reset was called
- getJournalTransactionNum() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getJournalTransactionNum() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- Number of Journal Transactions in the last interval
- getKeepCommandFile(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Does the user want to keep the command file for debugging? If this is
true, pipes will write a copy of the command data to a file in the
task directory named "downlink.data", which may be used to run the C++
program under the debugger.
- getKeepFailedTaskFiles() -
Method in class org.apache.hadoop.mapred.JobConf
- Should the temporary files for failed tasks be kept?
- getKeepTaskFilesPattern() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the regular expression that is matched against the task names
to see if we need to keep the files.
- getKey() -
Method in interface org.apache.hadoop.io.SequenceFile.Sorter.RawKeyValueIterator
- Gets the current raw key
- getKey() -
Method in class org.apache.hadoop.io.SequenceFile.Sorter.SegmentDescriptor
- Returns the stored rawKey
- getKeyClass() -
Method in class org.apache.hadoop.io.MapFile.Reader
- Returns the class of keys in this file.
- getKeyClass() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns the class of keys in this file.
- getKeyClass() -
Method in class org.apache.hadoop.io.SequenceFile.Writer
- Returns the class of keys in this file.
- getKeyClass() -
Method in class org.apache.hadoop.io.WritableComparator
- Returns the WritableComparable implementation class.
- getKeyClass() -
Method in class org.apache.hadoop.mapred.KeyValueLineRecordReader
-
- getKeyClass() -
Method in class org.apache.hadoop.mapred.SequenceFileRecordReader
- The class of key that must be passed to
SequenceFileRecordReader.next(WritableComparable,Writable)
.
- getKeyClassName() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns the name of the key class.
- getKeyTypeID() -
Method in class org.apache.hadoop.record.meta.MapTypeID
- get the TypeID of the map's key element
- getLastFlushTime() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getLastUpdate() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- The time when this information was accurate.
- getLen() -
Method in class org.apache.hadoop.fs.FileStatus
-
- getLength(Block) -
Method in interface org.apache.hadoop.dfs.FSDatasetInterface
- Returns the specified block's on-disk length (excluding metadata)
- getLength() -
Method in class org.apache.hadoop.dfs.FSDatasetInterface.MetaDataInputStream
-
- getLength(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Deprecated. Use getFileStatus() instead
- getLength(Path) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
- Deprecated.
- getLength() -
Method in class org.apache.hadoop.fs.s3.Block
-
- getLength() -
Method in class org.apache.hadoop.hbase.io.TextSequence
-
- getLength() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
- Get the total number of bytes in the data of the
InputSplit
.
- getLength() -
Method in class org.apache.hadoop.io.DataInputBuffer
- Returns the length of the input.
- getLength() -
Method in class org.apache.hadoop.io.DataOutputBuffer
- Returns the length of the valid data currently in the buffer.
- getLength() -
Method in class org.apache.hadoop.io.SequenceFile.Writer
- Returns the current length of the output file.
- getLength() -
Method in class org.apache.hadoop.io.Text
- Returns the number of bytes in the byte array
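The byte count returned by Text.getLength() is the UTF-8 encoded length, which differs from the Java char count for non-ASCII text. A standalone sketch of that semantics (plain Java, no Hadoop dependency; the helper name is illustrative):

```java
import java.nio.charset.StandardCharsets;

// Standalone illustration of the byte-length semantics described for
// Text.getLength(): the count is UTF-8 bytes, not Java char count.
public class Utf8LengthSketch {
    static int utf8Length(String s) {
        return s.getBytes(StandardCharsets.UTF_8).length;
    }

    public static void main(String[] args) {
        // "héllo" is 5 chars but 6 UTF-8 bytes ('é' encodes as 2 bytes).
        System.out.println(utf8Length("héllo"));
    }
}
```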
- getLength() -
Method in class org.apache.hadoop.io.UTF8
- Deprecated. The number of bytes in the encoded string.
- getLength() -
Method in class org.apache.hadoop.mapred.FileSplit
- The number of bytes in the file to process.
- getLength() -
Method in interface org.apache.hadoop.mapred.InputSplit
- Get the total number of bytes in the data of the
InputSplit
.
- getLength() -
Method in class org.apache.hadoop.mapred.join.CompositeInputSplit
- Return the aggregate length of all child InputSplits currently added.
- getLength(int) -
Method in class org.apache.hadoop.mapred.join.CompositeInputSplit
- Get the length of ith child InputSplit.
- getLength() -
Method in class org.apache.hadoop.mapred.MultiFileSplit
-
- getLength(int) -
Method in class org.apache.hadoop.mapred.MultiFileSplit
- Returns the length of the ith Path
- getLengths() -
Method in class org.apache.hadoop.mapred.MultiFileSplit
- Returns an array containing the lengths of the files in
the split
- getLevel() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- Return this node's level in the tree.
- getLevel() -
Method in interface org.apache.hadoop.net.Node
- Return this node's level in the tree.
- getLevel() -
Method in class org.apache.hadoop.net.NodeBase
- Return this node's level in the tree.
- getLine() -
Method in class org.apache.hadoop.hbase.hql.generated.SimpleCharStream
- Deprecated.
- getLine() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
- Deprecated.
- getListenerAddress() -
Method in class org.apache.hadoop.ipc.Server
- Return the socket (ip+port) on which the RPC server is listening.
- getListing(String) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getLiteralValues() -
Method in class org.apache.hadoop.hbase.hql.generated.HQLParser
-
- getLoad() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getLoad() -
Method in class org.apache.hadoop.hbase.HServerLoad
-
- getLoadNativeLibraries(JobConf) -
Method in class org.apache.hadoop.util.NativeCodeLoader
- Return whether native Hadoop libraries, if present, can be used for this job.
- getLocal(Configuration) -
Static method in class org.apache.hadoop.fs.FileSystem
- Get the local file system.
- getLocalCache(URI, Configuration, Path, FileStatus, boolean, long, Path) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get the locally cached file or archive; it could either be
previously cached (and valid) or copy it from the
FileSystem
now.
- getLocalCache(URI, Configuration, Path, boolean, long, Path) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get the locally cached file or archive; it could either be
previously cached (and valid) or copy it from the
FileSystem
now.
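The cache-or-copy behavior described for getLocalCache can be sketched in plain Java. This is an illustrative analogue only, not DistributedCache's actual implementation; the staleness check via modification time and the method shape are assumptions:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// Illustrative cache-or-copy logic: reuse the local copy when it exists
// and is at least as new as the source, otherwise copy the source in.
public class LocalCacheSketch {
    static Path getLocalCache(Path source, Path cacheDir) throws IOException {
        Path local = cacheDir.resolve(source.getFileName());
        boolean stale = !Files.exists(local)
            || Files.getLastModifiedTime(local)
                    .compareTo(Files.getLastModifiedTime(source)) < 0;
        if (stale) {
            Files.createDirectories(cacheDir);
            Files.copy(source, local, StandardCopyOption.REPLACE_EXISTING);
        }
        return local;
    }
}
```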
- getLocalCacheArchives(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Return the path array of the localized caches
- getLocalCacheFiles(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Return the path array of the localized files
- getLocalDirs() -
Method in class org.apache.hadoop.mapred.JobConf
-
- getLocalJobFilePath(String) -
Static method in class org.apache.hadoop.mapred.JobHistory.JobInfo
- Get the path of the locally stored job file
- getLocalJobFilePath(String) -
Static method in class org.apache.hadoop.mapred.JobTracker
- Get the localized job file path on the job trackers local file system
- getLocalPath(String, String) -
Method in class org.apache.hadoop.conf.Configuration
- Get a local file under a directory named by dirsProp with
the given path.
- getLocalPath(String) -
Method in class org.apache.hadoop.mapred.JobConf
- Constructs a local file name.
- getLocalPathForWrite(String, Configuration) -
Method in class org.apache.hadoop.fs.LocalDirAllocator
- Get a path from the local FS.
- getLocalPathForWrite(String, long, Configuration) -
Method in class org.apache.hadoop.fs.LocalDirAllocator
- Get a path from the local FS.
- getLocalPathToRead(String, Configuration) -
Method in class org.apache.hadoop.fs.LocalDirAllocator
- Get a path from the local FS for reading.
- getLocatedBlocks() -
Method in class org.apache.hadoop.dfs.LocatedBlocks
- Get located blocks.
- getLocation(int) -
Method in class org.apache.hadoop.mapred.join.CompositeInputSplit
- Get the locations of the ith child InputSplit.
- getLocations() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
- Get the list of hostnames where the input split is located.
- getLocations() -
Method in class org.apache.hadoop.mapred.FileSplit
-
- getLocations() -
Method in interface org.apache.hadoop.mapred.InputSplit
- Get the list of hostnames where the input split is located.
- getLocations() -
Method in class org.apache.hadoop.mapred.join.CompositeInputSplit
- Collect a set of hosts from all child InputSplits.
- getLocations() -
Method in class org.apache.hadoop.mapred.MultiFileSplit
-
- getLockid() -
Method in class org.apache.hadoop.hbase.io.BatchUpdate
-
- getLog() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getLong(String, long) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the
name
property as a long
.
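The name-to-long lookup pattern that getLong(String, long) provides can be sketched as below. This is a simplified analogue under stated assumptions: Hadoop's Configuration also handles variable expansion and resource loading, which are omitted here:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified analogue of Configuration.getLong(name, defaultValue):
// string-typed properties with a typed accessor and a default fallback.
public class ConfSketch {
    private final Map<String, String> props = new HashMap<>();

    void set(String name, String value) {
        props.put(name, value);
    }

    long getLong(String name, long defaultValue) {
        String v = props.get(name);
        if (v == null) return defaultValue;   // absent property -> default
        return Long.parseLong(v.trim());      // stored as text, parsed on read
    }
}
```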
- getLongValue(Object) -
Method in class org.apache.hadoop.contrib.utils.join.JobBase
-
- getMapCompletionEvents(String, int, int) -
Method in class org.apache.hadoop.mapred.TaskTracker
-
- getMapDebugScript() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the map task's debug script.
- getMapOutputCompressionType() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
SequenceFile.CompressionType
for the map outputs.
- getMapOutputCompressorClass(Class<? extends CompressionCodec>) -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
CompressionCodec
for compressing the map outputs.
- getMapOutputKeyClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the key class for the map output data.
- getMapOutputValueClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the value class for the map output data.
- getMapperClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
Mapper
class for the job.
- getMapredJobID() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getMapRunnerClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
MapRunnable
class for the job.
- getMapSpeculativeExecution() -
Method in class org.apache.hadoop.mapred.JobConf
- Should speculative execution be used for this job for map tasks?
Defaults to
true
.
- getMapTaskReports(String) -
Method in class org.apache.hadoop.mapred.JobClient
- Get the information of the current state of the map tasks of a job.
- getMapTaskReports(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Grab a bunch of info on the map tasks that make up the job
- getMapTaskReports(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getMapTasks() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Get the number of currently running map tasks in the cluster.
- getMaster() -
Method in class org.apache.hadoop.hbase.HBaseAdmin
-
- getMaster() -
Method in interface org.apache.hadoop.hbase.HConnection
-
- getMaster() -
Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- getMasterAddress() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getMaxBufferedDeleteTerms() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getMaxBufferedDocs() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getMaxFieldLength() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getMaxMapAttempts() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the configured number of maximum attempts that will be made to run a
map task, as specified by the
mapred.map.max.attempts
property.
- getMaxMapTaskFailuresPercent() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the maximum percentage of map tasks that can fail without
the job being aborted.
- getMaxMapTasks() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Get the maximum capacity for running map tasks in the cluster.
- getMaxMergeDocs() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getMaxReduceAttempts() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the configured number of maximum attempts that will be made to run a
reduce task, as specified by the
mapred.reduce.max.attempts
property.
- getMaxReduceTaskFailuresPercent() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the maximum percentage of reduce tasks that can fail without
the job being aborted.
- getMaxReduceTasks() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Get the maximum capacity for running reduce tasks in the cluster.
- getMaxTaskFailuresPerTracker() -
Method in class org.apache.hadoop.mapred.JobConf
- Expert: Get the maximum number of task failures allowed per tasktracker.
- getMaxTasks() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Deprecated. Use
ClusterStatus.getMaxMapTasks()
and/or
ClusterStatus.getMaxReduceTasks()
- getMaxTime() -
Method in class org.apache.hadoop.metrics.util.MetricsTimeVaryingRate
- The max time for a single operation since the last reset
MetricsTimeVaryingRate.resetMinMax()
- getMaxValueLength() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getMaxVersions() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getMemory() -
Method in class org.apache.hadoop.io.SequenceFile.Sorter
- Get the total amount of buffer memory, in bytes.
- getMergeFactor() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getMessage() -
Method in exception org.apache.hadoop.hbase.hql.generated.ParseException
- This method has the standard behavior when this object has been
created using the standard constructors.
- getMessage() -
Method in error org.apache.hadoop.hbase.hql.generated.TokenMgrError
- You can also modify the body of this method to customize your error messages.
- getMessage() -
Method in exception org.apache.hadoop.mapred.InvalidInputException
- Get a summary message of the problems found.
- getMessage() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getMessage() -
Method in exception org.apache.hadoop.record.compiler.generated.ParseException
- This method has the standard behavior when this object has been
created using the standard constructors.
- getMessage() -
Method in error org.apache.hadoop.record.compiler.generated.TokenMgrError
- You can also modify the body of this method to customize your error messages.
- getMetadata() -
Method in class org.apache.hadoop.hbase.HTable
-
- getMetadata() -
Method in class org.apache.hadoop.io.SequenceFile.Metadata
-
- getMetadata() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns the metadata object of the file
- getMetaDataInputStream(Block) -
Method in interface org.apache.hadoop.dfs.FSDatasetInterface
- Returns metaData of block b as an input stream (and its length)
- getMetaDataLength(Block) -
Method in interface org.apache.hadoop.dfs.FSDatasetInterface
- Returns the length of the metadata file of the specified block
- getMetric(String) -
Method in class org.apache.hadoop.metrics.spi.OutputRecord
- Returns the metric object which can be a Float, Integer, Short or Byte.
- getMetricNames() -
Method in class org.apache.hadoop.metrics.spi.OutputRecord
- Returns the set of metric names.
- getMinTime() -
Method in class org.apache.hadoop.metrics.util.MetricsTimeVaryingRate
- The min time for a single operation since the last reset
MetricsTimeVaryingRate.resetMinMax()
- getMissingIds() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return a list of missing block names (as list of Strings).
- getMissingReplicas() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the number of under-replicated blocks.
- getMissingSize() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return total size of missing data, in bytes.
- getModificationTime() -
Method in class org.apache.hadoop.fs.FileStatus
- Get the modification time of the file.
- getMount() -
Method in class org.apache.hadoop.fs.DF
-
- getMsg() -
Method in class org.apache.hadoop.hbase.HMsg
- Accessor
- getMsg() -
Method in class org.apache.hadoop.hbase.hql.ReturnMsg
-
- getName() -
Method in class org.apache.hadoop.dfs.DatanodeID
-
- getName() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Deprecated.
- getName() -
Method in class org.apache.hadoop.examples.dancing.Pentomino.Piece
-
- getName() -
Method in class org.apache.hadoop.fs.FileSystem
- Deprecated. call #getUri() instead.
- getName() -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Deprecated. call #getUri() instead.
- getName() -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
- Deprecated.
- getName() -
Method in class org.apache.hadoop.fs.Path
- Returns the final component of this path.
- getName() -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
- Deprecated.
- getName() -
Method in class org.apache.hadoop.fs.s3.S3FileSystem
-
- getName() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getName() -
Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getName(Class) -
Static method in class org.apache.hadoop.io.WritableName
- Return the name for a class.
- getName() -
Method in class org.apache.hadoop.mapred.Counters.Group
- Returns raw name of the group.
- getName() -
Method in class org.apache.hadoop.mapred.PhasedFileSystem
- Deprecated.
- getName() -
Method in interface org.apache.hadoop.net.Node
- Return this node's name
- getName() -
Method in class org.apache.hadoop.net.NodeBase
- Return this node's name
- getName() -
Method in class org.apache.hadoop.record.meta.RecordTypeInfo
- return the name of the record
- getNamed(String, Configuration) -
Static method in class org.apache.hadoop.fs.FileSystem
- Deprecated. call #get(URI,Configuration) instead.
- getNamenode() -
Method in class org.apache.hadoop.dfs.DataNode
- Return the namenode's identifier
- getNameNodeAddr() -
Method in class org.apache.hadoop.dfs.DataNode
-
- getNameNodeAddress() -
Method in class org.apache.hadoop.dfs.NameNode
- Returns the address on which the NameNode is listening.
- getNameNodeMetrics() -
Static method in class org.apache.hadoop.dfs.NameNode
-
- getNbHash() -
Method in class org.apache.hadoop.hbase.BloomFilterDescriptor
-
- getNestedStructTypeInfo(String) -
Method in class org.apache.hadoop.record.meta.RecordTypeInfo
- Return the type info of a nested record.
- getNetworkLocation() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- rack name
- getNetworkLocation() -
Method in interface org.apache.hadoop.net.Node
- Return the string representation of this node's network location
- getNetworkLocation() -
Method in class org.apache.hadoop.net.NodeBase
- Return this node's network location
- getNewJobId() -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Allocate a name for the job.
- getNewJobId() -
Method in class org.apache.hadoop.mapred.JobTracker
- Allocates a new JobId string.
- getNextToken() -
Method in class org.apache.hadoop.hbase.hql.generated.HQLParser
-
- getNextToken() -
Method in class org.apache.hadoop.hbase.hql.generated.HQLParserTokenManager
-
- getNextToken() -
Method in class org.apache.hadoop.record.compiler.generated.Rcc
-
- getNextToken() -
Method in class org.apache.hadoop.record.compiler.generated.RccTokenManager
-
- getNode() -
Method in class org.apache.hadoop.mapred.join.Parser.NodeToken
-
- getNode() -
Method in class org.apache.hadoop.mapred.join.Parser.Token
-
- getNullContext(String) -
Static method in class org.apache.hadoop.metrics.ContextFactory
- Returns a "null" context - one which does nothing.
- getNum() -
Method in class org.apache.hadoop.mapred.join.Parser.NumToken
-
- getNum() -
Method in class org.apache.hadoop.mapred.join.Parser.Token
-
- getNumber() -
Method in class org.apache.hadoop.metrics.spi.MetricValue
-
- getNumberColumns() -
Method in class org.apache.hadoop.examples.dancing.DancingLinks
- Get the number of columns.
- getNumberOfRegions() -
Method in class org.apache.hadoop.hbase.HServerLoad
-
- getNumberOfRequests() -
Method in class org.apache.hadoop.hbase.HServerLoad
-
- getNumBytesInSum() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getNumFiles(PathFilter) -
Method in class org.apache.hadoop.fs.InMemoryFileSystem
-
- getNumFilesCreated() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getNumFilesCreated() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- Number of files created in the last interval
- getNumFilesListed() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getNumFilesListed() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- Number of files listed in the last interval
- getNumFilesOpened() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getNumFilesOpened() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- Number of files opened for reading in the last interval
- getNumFilesRenamed() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getNumFilesRenamed() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- Number of files renamed in the last interval
- getNumMapTasks() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the configured number of map tasks for this job.
- getNumOfLeaves() -
Method in class org.apache.hadoop.net.NetworkTopology
- Return the total number of nodes
- getNumOfRacks() -
Method in class org.apache.hadoop.net.NetworkTopology
- Return the total number of racks
- getNumOpenConnections() -
Method in interface org.apache.hadoop.ipc.metrics.RpcMgtMBean
- The number of open RPC connections
- getNumOpenConnections() -
Method in class org.apache.hadoop.ipc.Server
- The number of open RPC connections
- getNumPaths() -
Method in class org.apache.hadoop.mapred.MultiFileSplit
- Returns the number of Paths in the split
- getNumReduceTasks() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the configured number of reduce tasks for this job.
- getObject(String) -
Method in class org.apache.hadoop.conf.Configuration
- Deprecated. A side map of Configuration to Object should be used instead.
- getOnlineMetaRegions() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getOnlineRegions() -
Method in class org.apache.hadoop.hbase.HRegionServer
-
- getOtherAction() -
Method in class org.apache.hadoop.fs.permission.FsPermission
- Return other
FsAction
.
- getOut() -
Method in class org.apache.hadoop.hbase.hql.BasicCommand
-
- getOut() -
Method in class org.apache.hadoop.hbase.hql.formatter.AsciiTableFormatter
-
- getOut() -
Method in class org.apache.hadoop.hbase.hql.formatter.HtmlTableFormatter
-
- getOut() -
Method in interface org.apache.hadoop.hbase.hql.TableFormatter
-
- getOutput() -
Method in class org.apache.hadoop.util.Shell.ShellCommandExecutor
- Get the output of the shell command.
- getOutputCompressionType(JobConf) -
Static method in class org.apache.hadoop.mapred.SequenceFileOutputFormat
- Get the
SequenceFile.CompressionType
for the output SequenceFile
.
- getOutputCompressorClass(JobConf, Class<? extends CompressionCodec>) -
Static method in class org.apache.hadoop.mapred.OutputFormatBase
- Get the
CompressionCodec
for compressing the job outputs.
- getOutputFormat() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
OutputFormat
implementation for the map-reduce job,
defaults to TextOutputFormat
if not specified explicitly.
- getOutputKeyClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the key class for the job output data.
- getOutputKeyComparator() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
WritableComparable
comparator used to compare keys.
- getOutputPath() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
Path
to the output directory for the map-reduce job.
- getOutputValueClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the value class for job outputs.
- getOutputValueGroupingComparator() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the user defined
WritableComparable
comparator for
grouping keys of inputs to the reduce.
- getOwner() -
Method in class org.apache.hadoop.fs.FileStatus
- Get the owner of the file.
- getParameter(ServletRequest, String) -
Static method in class org.apache.hadoop.util.ServletUtil
- Get a parameter from a ServletRequest.
- getParent() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- Return this node's parent
- getParent() -
Method in class org.apache.hadoop.fs.Path
- Returns the parent of a path or null if at root.
- getParent() -
Method in interface org.apache.hadoop.net.Node
- Return this node's parent
- getParent() -
Method in class org.apache.hadoop.net.NodeBase
- Return this node's parent
- getPartition(IntWritable, IntWritable, int) -
Method in class org.apache.hadoop.examples.SleepJob
-
- getPartition(K2, V2, int) -
Method in class org.apache.hadoop.mapred.lib.HashPartitioner
- Use
Object.hashCode()
to partition.
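The hashCode()-based partitioning described for HashPartitioner can be sketched standalone as follows; masking with Integer.MAX_VALUE clears the sign bit so negative hash codes still map into a valid partition:

```java
// Standalone sketch of hashCode()-based partitioning as described for
// HashPartitioner: mask off the sign bit, then take the value modulo
// the partition count so every key maps into [0, numPartitions).
public class HashPartitionSketch {
    static int getPartition(Object key, int numPartitions) {
        return (key.hashCode() & Integer.MAX_VALUE) % numPartitions;
    }

    public static void main(String[] args) {
        System.out.println(getPartition("key-1", 4));
    }
}
```

Because the partition is a pure function of the key's hashCode(), all records with equal keys land in the same partition, which is what routes them to a single reducer.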
- getPartition(K2, V2, int) -
Method in class org.apache.hadoop.mapred.lib.KeyFieldBasedPartitioner
- Use
Object.hashCode()
to partition.
- getPartition(K2, V2, int) -
Method in interface org.apache.hadoop.mapred.Partitioner
- Get the partition number for a given key (hence record) given the total
number of partitions i.e.
- getPartitionerClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
Partitioner
used to partition Mapper
-outputs
to be sent to the Reducer
s.
- getPath() -
Method in class org.apache.hadoop.fs.FileStatus
-
- getPath() -
Method in class org.apache.hadoop.mapred.FileSplit
- The file containing this split's data.
- getPath(int) -
Method in class org.apache.hadoop.mapred.MultiFileSplit
- Returns the ith Path
- getPath(Node) -
Static method in class org.apache.hadoop.net.NodeBase
- Return this node's path
- getPaths() -
Method in class org.apache.hadoop.mapred.MultiFileSplit
- Returns all the Paths in the split
- getPercentUsed() -
Method in class org.apache.hadoop.fs.DF
-
- getPercentUsed() -
Method in class org.apache.hadoop.fs.InMemoryFileSystem
-
- getPeriod() -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Returns the timer period.
- getPermission() -
Method in class org.apache.hadoop.fs.FileStatus
- Get FsPermission associated with the file.
- getPermission() -
Method in class org.apache.hadoop.fs.permission.PermissionStatus
- Return permission
- getPlatformName() -
Static method in class org.apache.hadoop.util.PlatformName
- Get the complete platform as per the java-vm.
- getPort() -
Method in class org.apache.hadoop.dfs.DatanodeID
-
- getPort() -
Method in class org.apache.hadoop.hbase.HServerAddress
-
- getPort() -
Method in class org.apache.hadoop.hbase.util.InfoServer
- Get the port that the server is on
- getPort() -
Method in class org.apache.hadoop.mapred.StatusHttpServer
- Get the port that the server is on
- getPos() -
Method in class org.apache.hadoop.fs.BufferedFSInputStream
-
- getPos() -
Method in exception org.apache.hadoop.fs.ChecksumException
-
- getPos() -
Method in class org.apache.hadoop.fs.FSDataInputStream
-
- getPos() -
Method in class org.apache.hadoop.fs.FSDataOutputStream
-
- getPos() -
Method in class org.apache.hadoop.fs.FSInputChecker
-
- getPos() -
Method in class org.apache.hadoop.fs.FSInputStream
- Return the current offset from the start of the file
- getPos() -
Method in interface org.apache.hadoop.fs.Seekable
- Return the current offset from the start of the file
- getPos() -
Method in class org.apache.hadoop.mapred.join.CompositeRecordReader
- Unsupported (returns zero in all cases).
- getPos() -
Method in class org.apache.hadoop.mapred.KeyValueLineRecordReader
-
- getPos() -
Method in class org.apache.hadoop.mapred.LineRecordReader
-
- getPos() -
Method in interface org.apache.hadoop.mapred.RecordReader
- Returns the current position in the input.
- getPos() -
Method in class org.apache.hadoop.mapred.SequenceFileAsTextRecordReader
-
- getPos() -
Method in class org.apache.hadoop.mapred.SequenceFileRecordReader
-
- getPos() -
Method in class org.apache.hadoop.streaming.StreamBaseRecordReader
- Returns the current position in the input.
- getPosition() -
Method in class org.apache.hadoop.io.DataInputBuffer
- Returns the current position in the input.
- getPosition() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Return the current byte position in the input file.
- getPreferredBlockSize(String) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getPreviousIntervalAverageTime() -
Method in class org.apache.hadoop.metrics.util.MetricsTimeVaryingRate
- The average rate of an operation in the previous interval
- getPreviousIntervalNumOps() -
Method in class org.apache.hadoop.metrics.util.MetricsTimeVaryingRate
- The number of operations in the previous interval
- getPreviousIntervalValue() -
Method in class org.apache.hadoop.metrics.util.MetricsTimeVaryingInt
- The Value at the Previous interval
- getProblems() -
Method in exception org.apache.hadoop.mapred.InvalidInputException
- Get the complete list of the problems reported.
- getProcess() -
Method in class org.apache.hadoop.util.Shell
- get the current sub-process executing the given command
- getProcess() -
Method in class org.apache.hadoop.util.ShellUtil
- Deprecated.
- getProfileEnabled() -
Method in class org.apache.hadoop.mapred.JobConf
- Get whether the task profiling is enabled.
- getProfileTaskRange(boolean) -
Method in class org.apache.hadoop.mapred.JobConf
- Get the range of maps or reduces to profile.
- getProgress() -
Method in interface org.apache.hadoop.io.SequenceFile.Sorter.RawKeyValueIterator
- Gets the Progress object; this has a float (0.0 - 1.0)
indicating the bytes processed by the iterator so far
- getProgress() -
Method in class org.apache.hadoop.mapred.join.CompositeRecordReader
- Report progress as the minimum of all child RR progress.
- getProgress() -
Method in class org.apache.hadoop.mapred.KeyValueLineRecordReader
-
- getProgress() -
Method in class org.apache.hadoop.mapred.LineRecordReader
- Get the progress within the split
- getProgress() -
Method in interface org.apache.hadoop.mapred.RecordReader
- How much of the input has the RecordReader consumed, i.e., how much progress has been made.
- getProgress() -
Method in class org.apache.hadoop.mapred.SequenceFileAsTextRecordReader
-
- getProgress() -
Method in class org.apache.hadoop.mapred.SequenceFileRecordReader
- Return the progress within the input split
- getProgress() -
Method in class org.apache.hadoop.mapred.TaskReport
- The amount completed, between zero and one.
- getProgress() -
Method in class org.apache.hadoop.streaming.StreamBaseRecordReader
-
- getProtocolVersion(String, long) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getProtocolVersion(String, long) -
Method in class org.apache.hadoop.hbase.HMaster
- Return protocol version corresponding to protocol interface.
- getProtocolVersion(String, long) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Return protocol version corresponding to protocol interface.
- getProtocolVersion(String, long) -
Method in interface org.apache.hadoop.ipc.VersionedProtocol
- Return protocol version corresponding to protocol interface.
- getProtocolVersion(String, long) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getProtocolVersion(String, long) -
Method in class org.apache.hadoop.mapred.TaskTracker
-
- getProxy(Class<?>, long, InetSocketAddress, Configuration, SocketFactory) -
Static method in class org.apache.hadoop.hbase.ipc.HbaseRPC
- Construct a client-side proxy object that implements the named protocol,
talking to a server at the named address.
- getProxy(Class<?>, long, InetSocketAddress, UserGroupInformation, Configuration, SocketFactory) -
Static method in class org.apache.hadoop.hbase.ipc.HbaseRPC
- Construct a client-side proxy object that implements the named protocol,
talking to a server at the named address.
- getProxy(Class<?>, long, InetSocketAddress, Configuration) -
Static method in class org.apache.hadoop.hbase.ipc.HbaseRPC
- Construct a client-side proxy object with the default SocketFactory
- getProxy(Class<?>, long, InetSocketAddress, Configuration, SocketFactory) -
Static method in class org.apache.hadoop.ipc.RPC
- Construct a client-side proxy object that implements the named protocol,
talking to a server at the named address.
- getProxy(Class<?>, long, InetSocketAddress, UserGroupInformation, Configuration, SocketFactory) -
Static method in class org.apache.hadoop.ipc.RPC
- Construct a client-side proxy object that implements the named protocol,
talking to a server at the named address.
- getProxy(Class<?>, long, InetSocketAddress, Configuration) -
Static method in class org.apache.hadoop.ipc.RPC
- Construct a client-side proxy object with the default SocketFactory
- getQueryStr() -
Method in class org.apache.hadoop.hbase.hql.generated.HQLParser
-
- getRange(String, String) -
Method in class org.apache.hadoop.conf.Configuration
- Parse the given attribute as a set of integer ranges
- getRaw(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the name property, without doing variable expansion.
- getRawCapacity() -
Method in class org.apache.hadoop.dfs.ChecksumDistributedFileSystem
- Return the total raw capacity of the filesystem, disregarding replication.
- getRawCapacity() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Return the total raw capacity of the filesystem, disregarding replication.
- getRawFileSystem() -
Method in class org.apache.hadoop.fs.ChecksumFileSystem
- Get the raw file system
- getRawUsed() -
Method in class org.apache.hadoop.dfs.ChecksumDistributedFileSystem
- Return the total raw used space in the filesystem, disregarding replication.
- getRawUsed() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Return the total raw used space in the filesystem, disregarding replication.
- getReadBlockOpAverageTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReadBlockOpAverageTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Average time for ReadBlock Operation in last interval
- getReadBlockOpMaxTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReadBlockOpMaxTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Maximum ReadBlock Operation Time since reset was called
- getReadBlockOpMinTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReadBlockOpMinTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Minimum ReadBlock Operation Time since reset was called
- getReadBlockOpNum() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReadBlockOpNum() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of ReadBlock Operation in last interval
- getReader(FileSystem, Filter) -
Method in class org.apache.hadoop.hbase.HStoreFile
- Get reader for the store file map file.
- getReaders(FileSystem, Path, Configuration) -
Static method in class org.apache.hadoop.mapred.MapFileOutputFormat
- Open the output generated by this format.
- getReaders(Configuration, Path) -
Static method in class org.apache.hadoop.mapred.SequenceFileOutputFormat
- Open the output generated by this format.
- getReadMetadataOpAverageTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReadMetadataOpAverageTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Average time for ReadMetadata Operation in last interval
- getReadMetadataOpMaxTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReadMetadataOpMaxTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Maximum ReadMetadata Operation Time since reset was called
- getReadMetadataOpMinTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReadMetadataOpMinTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Minimum ReadMetadata Operation Time since reset was called
- getReadMetadataOpNum() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReadMetadataOpNum() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of ReadMetadata Operation in last interval
- getReadsFromLocalClient() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReadsFromLocalClient() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of reads from local clients in the last interval
- getReadsFromRemoteClient() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReadsFromRemoteClient() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of reads from remote clients in the last interval
- getReadyJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- getRecordName() -
Method in interface org.apache.hadoop.metrics.MetricsRecord
- Returns the record name.
- getRecordName() -
Method in class org.apache.hadoop.metrics.spi.MetricsRecordImpl
- Returns the record name.
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.hbase.mapred.TableInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.FileInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in interface org.apache.hadoop.mapred.InputFormat
- Get the RecordReader for the given InputSplit.
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in interface org.apache.hadoop.mapred.join.ComposableInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.join.CompositeInputFormat
- Construct a CompositeRecordReader for the children of this InputFormat
as defined in the init expression.
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.KeyValueTextInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.MultiFileInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.SequenceFileAsTextInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.SequenceFileInputFilter
- Create a record reader for the given split
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.SequenceFileInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.TextInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.streaming.StreamInputFormat
-
- getRecordReaderQueue() -
Method in class org.apache.hadoop.mapred.join.CompositeRecordReader
- Return sorted list of RecordReaders for this composite.
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.hbase.mapred.IndexOutputFormat
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.hbase.mapred.TableOutputFormat
- Get the RecordWriter for the given job.
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.mapred.lib.NullOutputFormat
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.mapred.MapFileOutputFormat
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in interface org.apache.hadoop.mapred.OutputFormat
- Get the RecordWriter for the given job.
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.mapred.OutputFormatBase
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.mapred.SequenceFileOutputFormat
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.mapred.TextOutputFormat
-
- getReduceDebugScript() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the reduce task's debug Script
- getReducerClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the Reducer class for the job.
- getReduceSpeculativeExecution() -
Method in class org.apache.hadoop.mapred.JobConf
- Should speculative execution be used for this job for reduce tasks? Defaults to true.
- getReduceTaskReports(String) -
Method in class org.apache.hadoop.mapred.JobClient
- Get the information of the current state of the reduce tasks of a job.
- getReduceTaskReports(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Grab a bunch of info on the reduce tasks that make up the job
- getReduceTaskReports(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getReduceTasks() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Get the number of currently running reduce tasks in the cluster.
- getRegion(Text) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Protected utility method for safely obtaining an HRegion handle.
- getRegion(Text, boolean) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Protected utility method for safely obtaining an HRegion handle.
- getRegionDir(Path, HRegionInfo) -
Static method in class org.apache.hadoop.hbase.HRegion
- Computes the Path of the HRegion
- getRegionId() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getRegionInfo() -
Method in class org.apache.hadoop.hbase.HMsg
- Accessor
- getRegionInfo() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getRegionInfo(Text) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Get metainfo about an HRegion
- getRegionInfo() -
Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getRegionInfo(Text) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Get metainfo about an HRegion
- getRegionName() -
Method in class org.apache.hadoop.hbase.HMaster.MetaRegion
-
- getRegionName() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getRegionName() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getRegionServer() -
Method in class org.apache.hadoop.hbase.LocalHBaseCluster.RegionServerThread
-
- getRegionServers() -
Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- getRegionsToCheck() -
Method in class org.apache.hadoop.hbase.HRegionServer
-
- getRemaining() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.FSDatasetMBean
- Returns the amount of free storage space (in bytes)
- getRemaining() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- The raw free space.
- getRemaining() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem.DiskStatus
-
- getRemainingArgs() -
Method in class org.apache.hadoop.util.GenericOptionsParser
- Returns an array of Strings containing only application-specific arguments.
- getRemoteAddress() -
Static method in class org.apache.hadoop.ipc.Server
- Returns remote address as a string when invoked inside an RPC.
- getRemoteIp() -
Static method in class org.apache.hadoop.ipc.Server
- Returns the remote side IP address when invoked inside an RPC; returns null in case of an error.
- getReplaceBlockOpAverageTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReplaceBlockOpAverageTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Average time for ReplaceBlock Operation in last interval
- getReplaceBlockOpMaxTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReplaceBlockOpMaxTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Maximum ReplaceBlock Operation Time since reset was called
- getReplaceBlockOpMinTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReplaceBlockOpMinTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Minimum ReplaceBlock Operation Time since reset was called
- getReplaceBlockOpNum() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getReplaceBlockOpNum() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of ReplaceBlock Operation in last interval
- getReplication() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the intended replication factor, against which the over/under-replicated blocks are counted.
- getReplication() -
Method in class org.apache.hadoop.fs.FileStatus
- Get the replication factor of a file.
- getReplication(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Deprecated. Use getFileStatus() instead
- getReplication(Path) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
- Deprecated.
- getReplicationFactor() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the actual replication factor.
- getReport() -
Method in class org.apache.hadoop.contrib.utils.join.JobBase
- Log the counters
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.DoubleValueSum
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMax
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMin
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueSum
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMax
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMin
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.UniqValueCount
-
- getReport() -
Method in interface org.apache.hadoop.mapred.lib.aggregate.ValueAggregator
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.ValueHistogram
-
- getReportDetails() -
Method in class org.apache.hadoop.mapred.lib.aggregate.ValueHistogram
-
- getReportItems() -
Method in class org.apache.hadoop.mapred.lib.aggregate.ValueHistogram
-
- getRequestCount() -
Method in class org.apache.hadoop.hbase.HRegionServer
-
- getResource(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get the URL for the named resource.
- getRevision() -
Static method in class org.apache.hadoop.util.VersionInfo
- Get the subversion revision number for the root directory
- getRootDir() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getRootRegionLocation() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getRotations() -
Method in class org.apache.hadoop.examples.dancing.Pentomino.Piece
-
- getRow() -
Method in class org.apache.hadoop.hbase.hql.InsertCommand
-
- getRow(Text, Text) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Get all the data for the specified row
- getRow(Text, Text, long) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Get all the data for the specified row at a given timestamp
- getRow(Text, Text) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Get all the data for the specified row
- getRow(Text, Text, long) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Get all the data for the specified row at a given timestamp
- getRow() -
Method in class org.apache.hadoop.hbase.HStoreKey
-
- getRow(Text) -
Method in class org.apache.hadoop.hbase.HTable
- Get all the data for the specified row at the latest timestamp
- getRow(Text, long) -
Method in class org.apache.hadoop.hbase.HTable
- Get all the data for the specified row at a specified timestamp
- getRow() -
Method in class org.apache.hadoop.hbase.io.BatchUpdate
-
- getRow(byte[], byte[]) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getRow(byte[], byte[]) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- Get all the data for the specified table and row at the latest
timestamp.
- getRow(byte[], byte[]) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getRowKeyAtOrBefore(Text, long) -
Method in class org.apache.hadoop.hbase.HStore
- Find the key that matches row exactly, or the one that immediately precedes it.
- getRowkeyName() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getRowTs(byte[], byte[], long) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getRowTs(byte[], byte[], long) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- Get all the data for the specified table and row at the specified
timestamp.
- getRowTs(byte[], byte[], long) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getRpcOpsAvgProcessingTime() -
Method in interface org.apache.hadoop.ipc.metrics.RpcMgtMBean
- Average time for RPC Operations in last interval
- getRpcOpsAvgProcessingTimeMax() -
Method in interface org.apache.hadoop.ipc.metrics.RpcMgtMBean
- The Maximum RPC Operation Processing Time since reset was called
- getRpcOpsAvgProcessingTimeMin() -
Method in interface org.apache.hadoop.ipc.metrics.RpcMgtMBean
- The Minimum RPC Operation Processing Time since reset was called
- getRpcOpsAvgQueueTime() -
Method in interface org.apache.hadoop.ipc.metrics.RpcMgtMBean
- The Average RPC Operation Queued Time in the last interval
- getRpcOpsAvgQueueTimeMax() -
Method in interface org.apache.hadoop.ipc.metrics.RpcMgtMBean
- The Maximum RPC Operation Queued Time since reset was called
- getRpcOpsAvgQueueTimeMin() -
Method in interface org.apache.hadoop.ipc.metrics.RpcMgtMBean
- The Minimum RPC Operation Queued Time since reset was called
- getRpcOpsDiscardedOpsNum() -
Method in interface org.apache.hadoop.ipc.metrics.RpcMgtMBean
- Number of Discarded RPC operations due to timeout in the last interval
- getRpcOpsDiscardedOpsQtime() -
Method in interface org.apache.hadoop.ipc.metrics.RpcMgtMBean
- Average Queued time for Discarded RPC Operations in last interval
- getRpcOpsNumber() -
Method in interface org.apache.hadoop.ipc.metrics.RpcMgtMBean
- Number of RPC Operations in the last interval
- getRunnable() -
Method in class org.apache.hadoop.util.Daemon
-
- getRunningJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- getRunningJobs() -
Method in class org.apache.hadoop.mapred.JobTracker
- Version that is called from a timer thread, and therefore needs to be
careful to synchronize.
- getRunState() -
Method in class org.apache.hadoop.mapred.JobStatus
-
- getSafeModeText() -
Method in class org.apache.hadoop.dfs.JspHelper
-
- getSafemodeTime() -
Method in class org.apache.hadoop.dfs.namenode.metrics.NameNodeStatistics
-
- getSafemodeTime() -
Method in interface org.apache.hadoop.dfs.namenode.metrics.NameNodeStatisticsMBean
- The time spent in the Safemode at startup
- getScanner(Text[], Text, long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.HRegion
- Return an iterator that scans over the HRegion, returning the indicated
columns for only the rows that match the data filter.
- getScanner(int) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
- Returns the scanner associated with the specified ID.
- getSerializedLength() -
Method in class org.apache.hadoop.fs.s3.INode
-
- getServer() -
Method in class org.apache.hadoop.hbase.HMaster.MetaRegion
-
- getServer(Object, String, int, Configuration) -
Static method in class org.apache.hadoop.hbase.ipc.HbaseRPC
- Construct a server for a protocol implementation instance listening on a
port and address.
- getServer(Object, String, int, int, boolean, Configuration) -
Static method in class org.apache.hadoop.hbase.ipc.HbaseRPC
- Construct a server for a protocol implementation instance listening on a
port and address.
- getServer(Object, String, int, Configuration) -
Static method in class org.apache.hadoop.ipc.RPC
- Construct a server for a protocol implementation instance listening on a
port and address.
- getServer(Object, String, int, int, boolean, Configuration) -
Static method in class org.apache.hadoop.ipc.RPC
- Construct a server for a protocol implementation instance listening on a
port and address.
- getServerAddress() -
Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getServerAddress() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getServerAddress(Configuration, String, String, String) -
Static method in class org.apache.hadoop.net.NetUtils
- Deprecated.
- getServerInfo() -
Method in class org.apache.hadoop.hbase.HRegionServer
-
- getServersToLoad() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getServersToServerInfo() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getServerVersion() -
Method in exception org.apache.hadoop.hbase.ipc.HbaseRPC.VersionMismatch
- Get the server's agreed-to version.
- getServerVersion() -
Method in exception org.apache.hadoop.ipc.RPC.VersionMismatch
- Get the server's agreed-to version.
- getSessionId() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the user-specified session identifier.
- getShape(boolean, int) -
Method in class org.apache.hadoop.examples.dancing.Pentomino.Piece
-
- getShuffleFinishTime() -
Method in class org.apache.hadoop.mapred.MapTaskStatus
-
- getShuffleFinishTime() -
Method in class org.apache.hadoop.mapred.ReduceTaskStatus
-
- getSimilarityName() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getSize() -
Method in class org.apache.hadoop.hbase.HStoreKey
-
- getSize() -
Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- getSize() -
Method in class org.apache.hadoop.io.BytesWritable
- Get the current size of the buffer.
- getSocketFactory(Configuration, Class<?>) -
Static method in class org.apache.hadoop.net.NetUtils
- Get the socket factory for the given class according to its
configuration parameter
hadoop.rpc.socket.factory.class.<ClassName>.
- getSocketFactoryFromProperty(Configuration, String) -
Static method in class org.apache.hadoop.net.NetUtils
- Get the socket factory corresponding to the given proxy URI.
- getSortFinishTime() -
Method in class org.apache.hadoop.mapred.MapTaskStatus
-
- getSortFinishTime() -
Method in class org.apache.hadoop.mapred.ReduceTaskStatus
-
- getSpace(int) -
Static method in class org.apache.hadoop.streaming.StreamUtil
-
- getSpeculativeExecution() -
Method in class org.apache.hadoop.mapred.JobConf
- Deprecated. Use JobConf.getMapSpeculativeExecution() or JobConf.getReduceSpeculativeExecution() instead.
Should speculative execution be used for this job? Defaults to true.
- getSplits(int) -
Method in class org.apache.hadoop.examples.dancing.Pentomino
- Generate a list of prefixes to a given depth
- getSplits(JobConf, int) -
Method in class org.apache.hadoop.hbase.mapred.TableInputFormat
- A split will be created for each HRegion of the input table
- getSplits(JobConf, int) -
Method in class org.apache.hadoop.mapred.FileInputFormat
- Splits files returned by
FileInputFormat.listPaths(JobConf)
when
they're too big.
- getSplits(JobConf, int) -
Method in interface org.apache.hadoop.mapred.InputFormat
- Logically split the set of input files for the job.
- getSplits(JobConf, int) -
Method in class org.apache.hadoop.mapred.join.CompositeInputFormat
- Build a CompositeInputSplit from the child InputFormats by assigning the
ith split from each child to the ith composite split.
- getSplits(JobConf, int) -
Method in class org.apache.hadoop.mapred.MultiFileInputFormat
-
- getStart() -
Method in class org.apache.hadoop.mapred.FileSplit
- The position of the first byte in the file to process.
- getStartCode() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.HMaster.MetaRegion
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getStartKeys() -
Method in class org.apache.hadoop.hbase.HTable
- Gets the starting row key for every region in the currently open table
- getStartRow() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getStartTime() -
Method in class org.apache.hadoop.mapred.JobStatus
-
- getStartTime() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getStartTime() -
Method in class org.apache.hadoop.mapred.TaskReport
- Get start time of task.
- getState() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getState() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- getState() -
Method in class org.apache.hadoop.mapred.TaskReport
- The most recent state, reported by a Reporter.
- getStats() -
Method in class org.apache.hadoop.dfs.NameNode
-
- getStatus() -
Method in exception org.apache.hadoop.hbase.hql.ExitException
- The status code returned by System.exit()
- getStatusText(boolean) -
Method in class org.apache.hadoop.dfs.UpgradeStatusReport
- Get upgradeStatus data as a text for reporting.
- getStopRowKey() -
Method in class org.apache.hadoop.hbase.filter.StopRowFilter
- An accessor for the stopRowKey
- getStorageID() -
Method in class org.apache.hadoop.dfs.DatanodeID
-
- getStorageInfo() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.FSDatasetMBean
- Returns the storage id of the underlying storage
- getStr() -
Method in class org.apache.hadoop.mapred.join.Parser.StrToken
-
- getStr() -
Method in class org.apache.hadoop.mapred.join.Parser.Token
-
- getStringLiteral() -
Method in class org.apache.hadoop.hbase.hql.generated.HQLParser
-
- getStrings(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get the comma-delimited values of the name property as an array of Strings.
- getStrings(String) -
Static method in class org.apache.hadoop.util.StringUtils
- Returns an array of strings
- getSuccessfulJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- GetSuffix(int) -
Method in class org.apache.hadoop.hbase.hql.generated.SimpleCharStream
-
- GetSuffix(int) -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getSum() -
Method in class org.apache.hadoop.mapred.lib.aggregate.DoubleValueSum
-
- getSum() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueSum
-
- getSymlink(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- This method checks to see if symlinks are to be created for the localized cache files in the current working directory
- getSystemDir() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the system directory where job-specific files are to be placed.
- getTable(String) -
Method in class org.apache.hadoop.hbase.rest.GenericHandler
-
- getTable(byte[]) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
- Creates and returns an HTable instance from a given table name.
- getTableDesc() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getTableDir(Path, Text) -
Static method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getTableName() -
Method in class org.apache.hadoop.hbase.HTable
-
- getTableName() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getTableName(String[]) -
Method in class org.apache.hadoop.hbase.rest.GenericHandler
-
- getTableNameFromRegionName(Text) -
Static method in class org.apache.hadoop.hbase.HRegionInfo
- Extracts table name prefix from a region name.
- getTableNames() -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getTableNames() -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- List all the userspace tables.
- getTableNames() -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getTableRegions(byte[]) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getTableRegions(byte[]) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- List the regions associated with a table.
- getTableRegions(byte[]) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getTabSize(int) -
Method in class org.apache.hadoop.hbase.hql.generated.SimpleCharStream
-
- getTabSize(int) -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getTag() -
Method in class org.apache.hadoop.contrib.utils.join.TaggedMapOutput
-
- getTag(String) -
Method in class org.apache.hadoop.metrics.spi.OutputRecord
- Returns a tag object, which can be a String, Integer, Short or Byte.
- getTagNames() -
Method in class org.apache.hadoop.metrics.spi.OutputRecord
- Returns the set of tag names
- getTask(String) -
Method in class org.apache.hadoop.mapred.TaskTracker
- Called upon startup by the child process, to fetch Task data.
- getTaskAttempts() -
Method in class org.apache.hadoop.mapred.JobHistory.Task
- Returns all task attempts for this task.
- getTaskCompletionEvents(String, int, int) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Get task completion events for the jobid, starting from fromEventId.
- getTaskCompletionEvents(String, int, int) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getTaskCompletionEvents(int) -
Method in interface org.apache.hadoop.mapred.RunningJob
- Get events indicating completion (success/failure) of component tasks.
- getTaskDiagnostics(String, String, String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Get the diagnostics for a given task in a given job
- getTaskDiagnostics(String, String, String) -
Method in class org.apache.hadoop.mapred.JobTracker
- Get the diagnostics for a given task
- getTaskId() -
Method in class org.apache.hadoop.mapred.TaskCompletionEvent
- Returns task id.
- getTaskId() -
Method in class org.apache.hadoop.mapred.TaskLogAppender
- Getter/Setter methods for log4j.
- getTaskId() -
Method in class org.apache.hadoop.mapred.TaskReport
- The id of the task.
- getTaskInfo(JobConf) -
Static method in class org.apache.hadoop.streaming.StreamUtil
-
- getTaskLogFile(String, TaskLog.LogName) -
Static method in class org.apache.hadoop.mapred.TaskLog
-
- getTaskLogLength(JobConf) -
Static method in class org.apache.hadoop.mapred.TaskLog
- Get the desired maximum length of task's logs.
- getTaskOutputFilter(JobConf) -
Static method in class org.apache.hadoop.mapred.JobClient
- Get the task output filter out of the JobConf.
- getTaskOutputFilter() -
Method in class org.apache.hadoop.mapred.JobClient
- Deprecated.
- getTaskRunTime() -
Method in class org.apache.hadoop.mapred.TaskCompletionEvent
- Returns time (in millisec) the task took to complete.
- getTaskStatus() -
Method in class org.apache.hadoop.mapred.TaskCompletionEvent
- Returns enum Status.SUCCESS or Status.FAILURE.
- getTaskTracker(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getTaskTrackerHttp() -
Method in class org.apache.hadoop.mapred.TaskCompletionEvent
- http location of the tasktracker where this task ran.
- getTaskTrackerMetrics() -
Method in class org.apache.hadoop.mapred.TaskTracker
-
- getTaskTrackerReportAddress() -
Method in class org.apache.hadoop.mapred.TaskTracker
- Return the port to which the tasktracker is bound.
- getTaskTrackers() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Get the number of task trackers in the cluster.
- getTimestamp(Configuration, URI) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Returns mtime of a given cache file on hdfs.
- getTimestamp() -
Method in class org.apache.hadoop.hbase.HLogEdit
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.HStoreKey
-
- getToken(int) -
Method in class org.apache.hadoop.hbase.hql.generated.HQLParser
-
- getToken(int) -
Method in class org.apache.hadoop.record.compiler.generated.Rcc
-
- getTotalBlocks() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the total number of blocks in the scanned area.
- getTotalDirs() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return total number of directories encountered during this scan.
- getTotalFiles() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return total number of files encountered during this scan.
- getTotalLogFileSize() -
Method in class org.apache.hadoop.mapred.TaskLogAppender
-
- getTotalSize() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return total size of scanned data, in bytes.
- getTotalSubmissions() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getTrackerIdentifier() -
Method in class org.apache.hadoop.mapred.JobTracker
- Get the unique identifier of this JobTracker.
- getTrackerPort() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getTrackingURL() -
Method in interface org.apache.hadoop.mapred.RunningJob
- Get the URL where some job progress information will be displayed.
- getType() -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Get the type of the software component, which this object is upgrading.
- getType() -
Method in class org.apache.hadoop.hbase.BloomFilterDescriptor
-
- getType() -
Method in class org.apache.hadoop.hbase.hql.ReturnMsg
-
- getType() -
Method in class org.apache.hadoop.mapred.join.Parser.Token
-
- getTypeID() -
Method in class org.apache.hadoop.record.meta.FieldTypeInfo
- get the field's TypeID object
- getTypes() -
Method in class org.apache.hadoop.io.GenericWritable
- Return all classes that may be wrapped.
- getTypeVal() -
Method in class org.apache.hadoop.record.meta.TypeID
- Get the type value.
- getUMask(Configuration) -
Static method in class org.apache.hadoop.fs.permission.FsPermission
- Get the user file creation mask (umask)
- getUniqueItems() -
Method in class org.apache.hadoop.mapred.lib.aggregate.UniqValueCount
-
- getUpgradeStatus() -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Upgrade status: the percentage of the work required by the upgrade that
has been completed.
- getUpgradeStatus() -
Method in class org.apache.hadoop.dfs.UpgradeStatusReport
- Get the upgrade status as a percentage of the total upgrade completed.
- getUpgradeStatusReport(boolean) -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Get status report for the upgrade.
- getUpgradeStatusText() -
Method in class org.apache.hadoop.dfs.JspHelper
-
- getUri() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getUri() -
Method in class org.apache.hadoop.dfs.HftpFileSystem
-
- getUri() -
Method in class org.apache.hadoop.fs.FileSystem
- Returns a URI whose scheme and authority identify this FileSystem.
- getUri() -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Returns a URI whose scheme and authority identify this FileSystem.
- getUri() -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getUri() -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
-
- getUri() -
Method in class org.apache.hadoop.fs.s3.S3FileSystem
-
- getURIs(String, String) -
Method in class org.apache.hadoop.streaming.StreamJob
- Get the URIs of all the files/caches.
- getURL() -
Method in class org.apache.hadoop.mapred.JobProfile
- Get the link to the web-ui for details of the job.
- getUrl() -
Static method in class org.apache.hadoop.util.VersionInfo
- Get the subversion URL for the root Hadoop directory.
- getUsed() -
Method in class org.apache.hadoop.fs.DF
-
- getUsed() -
Method in class org.apache.hadoop.fs.DU
-
- getUsed() -
Method in class org.apache.hadoop.fs.FileSystem
- Return the total size of all files in the filesystem.
- getUser() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the reported username for this job.
- getUser() -
Method in class org.apache.hadoop.mapred.JobProfile
- Get the user id.
- getUser() -
Static method in class org.apache.hadoop.util.VersionInfo
- The user that compiled Hadoop.
- getUserAction() -
Method in class org.apache.hadoop.fs.permission.FsPermission
- Return user
FsAction
.
- getUserInfo() -
Static method in class org.apache.hadoop.ipc.Server
- Returns
UserGroupInformation
associated with current RPC.
- getUserName() -
Method in class org.apache.hadoop.fs.permission.PermissionStatus
- Return user name
- getUsername() -
Method in class org.apache.hadoop.mapred.JobStatus
-
- getUserName() -
Method in class org.apache.hadoop.security.UnixUserGroupInformation
- Return the user's name
- getUserName() -
Method in interface org.apache.hadoop.security.UserGroupInformation
- Get username
- getVal() -
Method in class org.apache.hadoop.hbase.HLogEdit
-
- getVal() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMax
-
- getVal() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMin
-
- getVal() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMax
-
- getVal() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMin
-
- getValue() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getValue(int) -
Method in class org.apache.hadoop.hbase.hql.InsertCommand
-
- getValue() -
Method in class org.apache.hadoop.hbase.io.BatchOperation
-
- getValue() -
Method in interface org.apache.hadoop.io.SequenceFile.Sorter.RawKeyValueIterator
- Gets the current raw value
- getValueClass() -
Method in class org.apache.hadoop.io.ArrayWritable
-
- getValueClass() -
Method in class org.apache.hadoop.io.MapFile.Reader
- Returns the class of values in this file.
- getValueClass() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns the class of values in this file.
- getValueClass() -
Method in class org.apache.hadoop.io.SequenceFile.Writer
- Returns the class of values in this file.
- getValueClass() -
Method in class org.apache.hadoop.mapred.SequenceFileRecordReader
- The class of value that must be passed to
SequenceFileRecordReader.next(WritableComparable,Writable)
.
- getValueClassName() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns the name of the value class.
- getValueTypeID() -
Method in class org.apache.hadoop.record.meta.MapTypeID
- get the TypeID of the map's value element
- getVectorSize() -
Method in class org.apache.hadoop.hbase.BloomFilterDescriptor
-
- getVer(byte[], byte[], byte[], int) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getVer(byte[], byte[], byte[], int) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- Get the specified number of versions for the specified table,
row, and column.
- getVer(byte[], byte[], byte[], int) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getVersion() -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Get the layout version of the upgrade object.
- getVersion() -
Method in class org.apache.hadoop.dfs.UpgradeStatusReport
- Get the layout version of the currently running upgrade.
- getVersion() -
Method in interface org.apache.hadoop.fs.s3.FileSystemStore
-
- getVersion() -
Method in class org.apache.hadoop.io.VersionedWritable
- Return the version number of the current implementation.
- getVersion() -
Static method in class org.apache.hadoop.util.VersionInfo
- Get the Hadoop version.
- getVerTs(byte[], byte[], byte[], long, int) -
Method in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- getVerTs(byte[], byte[], byte[], long, int) -
Method in interface org.apache.hadoop.hbase.thrift.generated.Hbase.Iface
- Get the specified number of versions for the specified table,
row, and column.
- getVerTs(byte[], byte[], byte[], long, int) -
Method in class org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler
-
- getVIntSize(long) -
Static method in class org.apache.hadoop.io.WritableUtils
- Get the encoded length of an integer stored in a variable-length format
- getVIntSize(long) -
Static method in class org.apache.hadoop.record.Utils
- Get the encoded length of an integer stored in a variable-length format
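The variable-length format referred to here stores small values in a single byte and larger values with a one-byte header followed by the value's significant bytes. A minimal self-contained sketch of the size computation, modeled on Hadoop's VInt scheme (the class name `VIntSize` is illustrative, not part of the API):

```java
// Sketch of a variable-length integer size computation, modeled on
// Hadoop's VInt encoding; not the actual WritableUtils source.
class VIntSize {
    static int getVIntSize(long i) {
        // Values in [-112, 127] are stored directly in a single byte.
        if (i >= -112 && i <= 127) {
            return 1;
        }
        // Negative values are stored via their one's complement.
        if (i < 0) {
            i ^= -1L;
        }
        // One header byte plus however many bytes the magnitude needs.
        int dataBits = Long.SIZE - Long.numberOfLeadingZeros(i);
        return (dataBits + 7) / 8 + 1;
    }
}
```

Under this scheme a value like 100 encodes in 1 byte, while 1000 needs 3 bytes (a header byte plus two data bytes).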
- getWaitingJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- getWarn() -
Static method in class org.apache.hadoop.metrics.jvm.EventCounter
-
- getWebAppDir(String) -
Static method in class org.apache.hadoop.hbase.util.InfoServer
-
- getWeight() -
Method in class org.onelab.filter.Key
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.dfs.HftpFileSystem
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.fs.FileSystem
- Get the current working directory for the given file system
- getWorkingDirectory() -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Get the current working directory for the given file system
- getWorkingDirectory() -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.fs.s3.S3FileSystem
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the current working directory for the default file system.
- getWrappedStream() -
Method in class org.apache.hadoop.fs.FSDataOutputStream
-
- getWritable(byte[], Writable) -
Static method in class org.apache.hadoop.hbase.util.Writables
- Set bytes into the passed Writable by calling its
Writable.readFields(java.io.DataInput)
.
- getWriteBlockOpAverageTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getWriteBlockOpAverageTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Average time for WriteBlock Operation in last interval
- getWriteBlockOpMaxTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getWriteBlockOpMaxTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Maximum WriteBlock Operation Time since reset was called
- getWriteBlockOpMinTime() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getWriteBlockOpMinTime() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- The Minimum WriteBlock Operation Time since reset was called
- getWriteBlockOpNum() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getWriteBlockOpNum() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of WriteBlock Operation in last interval
- getWriter(FileSystem, SequenceFile.CompressionType, Filter) -
Method in class org.apache.hadoop.hbase.HStoreFile
- Get a store file writer.
- getWritesFromLocalClient() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getWritesFromLocalClient() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of writes from local clients in the last interval
- getWritesFromRemoteClient() -
Method in class org.apache.hadoop.dfs.datanode.metrics.DataNodeStatistics
-
- getWritesFromRemoteClient() -
Method in interface org.apache.hadoop.dfs.datanode.metrics.DataNodeStatisticsMBean
- Number of writes from remote clients in the last interval
- getXceiverCount() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- Number of active connections.
- getXMLOutputter(PrintWriter) -
Method in class org.apache.hadoop.hbase.rest.GenericHandler
-
- getZlibCompressor(Configuration) -
Static method in class org.apache.hadoop.io.compress.zlib.ZlibFactory
- Return the appropriate implementation of the zlib compressor.
- getZlibCompressorType(Configuration) -
Static method in class org.apache.hadoop.io.compress.zlib.ZlibFactory
- Return the appropriate type of the zlib compressor.
- getZlibDecompressor(Configuration) -
Static method in class org.apache.hadoop.io.compress.zlib.ZlibFactory
- Return the appropriate implementation of the zlib decompressor.
- getZlibDecompressorType(Configuration) -
Static method in class org.apache.hadoop.io.compress.zlib.ZlibFactory
- Return the appropriate type of the zlib decompressor.
- globPaths(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Deprecated.
- globPaths(Path, PathFilter) -
Method in class org.apache.hadoop.fs.FileSystem
- Deprecated.
- globStatus(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Return all the files that match filePattern and are not checksum
files.
- globStatus(Path, PathFilter) -
Method in class org.apache.hadoop.fs.FileSystem
- Return an array of FileStatus objects whose path names match pathPattern
and is accepted by the user-supplied path filter.
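The pattern language behind globStatus is shell-style globbing (`*`, `?`, character ranges). As a rough illustration of those matching semantics using the JDK's PathMatcher — this is not the Hadoop FileSystem API, just the same pattern style in spirit:

```java
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.PathMatcher;

// Illustrates glob-style path matching with the JDK; the Hadoop
// globStatus call applies comparable patterns to filesystem paths.
class GlobDemo {
    static boolean matches(String pattern, String path) {
        PathMatcher m =
            FileSystems.getDefault().getPathMatcher("glob:" + pattern);
        return m.matches(Path.of(path));
    }
}
```

For example, the pattern `/logs/2008-*/part-*` matches `/logs/2008-01/part-00000` but not `/logs/2009-01/part-00000`, since `*` does not cross directory boundaries.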
- go() -
Method in class org.apache.hadoop.streaming.StreamJob
- This is the method that actually initializes the job conf and submits the
job to the jobtracker.
- goodClassOrNull(String, String) -
Static method in class org.apache.hadoop.streaming.StreamUtil
- It may seem strange to silently switch behaviour when a String
is not a classname; the reason is simplified usage.
- Grep - Class in org.apache.hadoop.examples
-
- GROUP_COLUMNS -
Static variable in class org.apache.hadoop.hbase.mapred.GroupingTableMap
- JobConf parameter to specify the columns used to produce the key passed to
collect from the map phase
- GroupingTableMap - Class in org.apache.hadoop.hbase.mapred
- Extract grouping columns from input record
- GroupingTableMap() -
Constructor for class org.apache.hadoop.hbase.mapred.GroupingTableMap
-
- GT_TKN -
Static variable in interface org.apache.hadoop.record.compiler.generated.RccConstants
-
- GZIP -
Static variable in class org.apache.hadoop.hbase.util.Base64
- Specify that data should be gzip-compressed.
- GzipCodec - Class in org.apache.hadoop.io.compress
- This class creates gzip compressors/decompressors.
- GzipCodec() -
Constructor for class org.apache.hadoop.io.compress.GzipCodec
-
- GzipCodec.GzipInputStream - Class in org.apache.hadoop.io.compress
-
- GzipCodec.GzipInputStream(InputStream) -
Constructor for class org.apache.hadoop.io.compress.GzipCodec.GzipInputStream
-
- GzipCodec.GzipInputStream(DecompressorStream) -
Constructor for class org.apache.hadoop.io.compress.GzipCodec.GzipInputStream
- Allow subclasses to directly set the inflater stream.
- GzipCodec.GzipOutputStream - Class in org.apache.hadoop.io.compress
- A bridge that wraps around a DeflaterOutputStream to make it
a CompressionOutputStream.
- GzipCodec.GzipOutputStream(OutputStream) -
Constructor for class org.apache.hadoop.io.compress.GzipCodec.GzipOutputStream
-
- GzipCodec.GzipOutputStream(CompressorStream) -
Constructor for class org.apache.hadoop.io.compress.GzipCodec.GzipOutputStream
- Allow subclasses to supply a different stream type here.
Object.hashCode()
.ObjectWritable
.TaskTracker
and
the JobTracker
.
dir
location.
ibw
.
Enum
type, by the specified amount.
JobTracker
.
InputFormat
describes the input-specification for a
Map-Reduce job.FileInputFormat
InputSplit
represents the data to be processed by an
individual Mapper
.Mapper
that swaps keys and values.Iterator
to go through the list of String
key-value pairs in the configuration.
JobClient
is the primary interface for the user-job to interact
with the JobTracker
.JobConf
, and connect to the
default JobTracker
.
JobProfile
.
JobProfile
the userid, jobid,
job config-file, job-details url and job name.
ArrayFile.Reader.seek(long)
, ArrayFile.Reader.next(Writable)
, or ArrayFile.Reader.get(long,Writable)
.
Keying.createKey(String)
transform.
InputFormat
for plain text files.Mapper
that extracts text matching a regular expression.Reducer
that sums long values.CompressionCodec
for a streaming
lzo compression/decompression pair.Compressor
based on the lzo algorithm.LzoCompressor.CompressionStrategy
.
Decompressor
based on the lzo algorithm.stdout
or split the specified log files.
OutputFormat
that writes MapFile
s.Mapper
and Reducer
implementations.Mapper
s.MapRunnable
implementation.MBeanUtil.registerMBean(String, String, Object)
SegmentDescriptor
FileSystem.mkdirs(Path, FsPermission)
with default permission.
InputFormat
that returns MultiFileSplit
's
in MultiFileInputFormat.getSplits(JobConf, int)
method.true
if a preset dictionary is needed for decompression.
WritableComparable
instance.
key
and
val
.
key
, skipping its
value.
key
and
val
.
SequenceFile.Reader.nextRaw(DataOutputBuffer,SequenceFile.ValueBytes)
.
key
.
stopRow.
- obtainScanner(Text[], Text, Text, long) -
Method in class org.apache.hadoop.hbase.HTable
- Get a scanner on the current table starting at the specified row and
ending just before
stopRow.
- obtainScanner(Text[], Text, long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.HTable
- Get a scanner on the current table starting at the specified row.
- offerService() -
Method in class org.apache.hadoop.dfs.DataNode
- Main loop for the DataNode.
- offerService() -
Method in class org.apache.hadoop.mapred.JobTracker
- Run forever
- okCols -
Variable in class org.apache.hadoop.hbase.HAbstractScanner
-
- ONE -
Static variable in interface org.apache.hadoop.mapred.lib.aggregate.ValueAggregatorDescriptor
-
- oneRotation -
Static variable in class org.apache.hadoop.examples.dancing.Pentomino
- Is the piece fixed under rotation?
- OneSidedPentomino - Class in org.apache.hadoop.examples.dancing
- Of the 12 "normal" pentominoes, 6 have distinct shapes when flipped.
- OneSidedPentomino() -
Constructor for class org.apache.hadoop.examples.dancing.OneSidedPentomino
-
- OneSidedPentomino(int, int) -
Constructor for class org.apache.hadoop.examples.dancing.OneSidedPentomino
-
- onlineRegions -
Variable in class org.apache.hadoop.hbase.HRegionServer
-
- OP_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_BLOCKRECEIVED -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_BLOCKREPORT -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_ABANDONBLOCK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_ABANDONBLOCK_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_ADDBLOCK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_ADDBLOCK_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_COMPLETEFILE -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_COMPLETEFILE_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_DATANODE_HINTS -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_DATANODE_HINTS_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_DATANODEREPORT -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_DATANODEREPORT_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_DELETE -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_DELETE_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_EXISTS -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_EXISTS_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_ISDIR -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_ISDIR_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_LISTING -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_LISTING_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_MKDIRS -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_MKDIRS_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_OBTAINLOCK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_OBTAINLOCK_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_OPEN -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_OPEN_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_RAWSTATS -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_RAWSTATS_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_RELEASELOCK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_RELEASELOCK_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_RENAMETO -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_RENAMETO_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_RENEW_LEASE -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_RENEW_LEASE_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_STARTFILE -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_STARTFILE_ACK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_CLIENT_TRYAGAIN -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_COPY_BLOCK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_ERROR -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_FAILURE -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_HEARTBEAT -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_INVALIDATE_BLOCKS -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_READ_BLOCK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_READ_METADATA -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_REPLACE_BLOCK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_STATUS_CHECKSUM_OK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_STATUS_ERROR -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_STATUS_ERROR_CHECKSUM -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_STATUS_ERROR_EXISTS -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_STATUS_ERROR_INVALID -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_STATUS_SUCCESS -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_TRANSFERBLOCKS -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_TRANSFERDATA -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- OP_WRITE_BLOCK -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- open(Path, int) -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- open(Path, int) -
Method in class org.apache.hadoop.dfs.HftpFileSystem
-
- open(String, long, long) -
Method in class org.apache.hadoop.dfs.NameNode
-
- open(Path, int) -
Method in class org.apache.hadoop.fs.ChecksumFileSystem
- Opens an FSDataInputStream at the indicated Path.
- open(Path, int) -
Method in class org.apache.hadoop.fs.FileSystem
- Opens an FSDataInputStream at the indicated Path.
- open(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Opens an FSDataInputStream at the indicated Path.
- open(Path, int) -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Opens an FSDataInputStream at the indicated Path.
- open(Path, int) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- open(Path, int) -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
-
- open(Path, int) -
Method in class org.apache.hadoop.fs.s3.S3FileSystem
-
- openScanner(Text, Text[], Text, long, RowFilterInterface) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Opens a remote scanner with a RowFilter.
- openScanner(Text, Text[], Text, long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Opens a remote scanner with a RowFilter.
- OPERATION_FAILED -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- oprot_ -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.Client
-
- or(FsAction) -
Method in enum org.apache.hadoop.fs.permission.FsAction
- OR operation.
- OR -
Static variable in interface org.apache.hadoop.hbase.hql.generated.HQLParserConstants
-
- or(Filter) -
Method in class org.onelab.filter.BloomFilter
- Performs a logical OR between this filter and a specified filter.
- or(Filter) -
Method in class org.onelab.filter.CountingBloomFilter
- Performs a logical OR between this filter and a specified filter.
- or(Filter) -
Method in class org.onelab.filter.DynamicBloomFilter
- Performs a logical OR between this filter and a specified filter.
- or(Filter) -
Method in class org.onelab.filter.Filter
- Performs a logical OR between this filter and a specified filter.
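For Bloom filters built with the same vector size and hash functions, the logical OR of two filters is the union of the two key sets: it reduces to a bitwise OR of the underlying bit vectors. A minimal sketch of that idea — `TinyBloom` is a hypothetical class, not the org.onelab.filter API:

```java
import java.util.BitSet;

// Toy Bloom filter showing why or(Filter) is a bitwise union; two
// cheap hash probes per key, purely for illustration.
class TinyBloom {
    final BitSet bits;
    final int size;

    TinyBloom(int size) {
        this.size = size;
        this.bits = new BitSet(size);
    }

    void add(String key) {
        bits.set(Math.floorMod(key.hashCode(), size));
        bits.set(Math.floorMod(key.hashCode() * 31 + 7, size));
    }

    boolean mightContain(String key) {
        return bits.get(Math.floorMod(key.hashCode(), size))
            && bits.get(Math.floorMod(key.hashCode() * 31 + 7, size));
    }

    // Logical OR with another filter of identical size/hashes:
    // afterwards this filter may contain every key either side held.
    void or(TinyBloom other) {
        bits.or(other.bits);
    }
}
```

After `a.or(b)`, any key added to either filter still tests positive in `a`; Bloom filters never produce false negatives, only false positives.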
- ORDERED -
Static variable in class org.apache.hadoop.hbase.util.Base64
- Encode using the special "ordered" dialect of Base64 described here:
http://www.faqs.org/qa/rfcc-1940.html.
- org.apache.hadoop - package org.apache.hadoop
-
- org.apache.hadoop.conf - package org.apache.hadoop.conf
- Configuration of system parameters.
- org.apache.hadoop.contrib.utils.join - package org.apache.hadoop.contrib.utils.join
-
- org.apache.hadoop.dfs - package org.apache.hadoop.dfs
- A distributed implementation of
FileSystem
. - org.apache.hadoop.dfs.datanode.metrics - package org.apache.hadoop.dfs.datanode.metrics
-
- org.apache.hadoop.dfs.namenode.metrics - package org.apache.hadoop.dfs.namenode.metrics
-
- org.apache.hadoop.examples - package org.apache.hadoop.examples
- Hadoop example code.
- org.apache.hadoop.examples.dancing - package org.apache.hadoop.examples.dancing
- This package is a distributed implementation of Knuth's dancing links
algorithm that can run under Hadoop.
- org.apache.hadoop.filecache - package org.apache.hadoop.filecache
-
- org.apache.hadoop.fs - package org.apache.hadoop.fs
- An abstract file system API.
- org.apache.hadoop.fs.kfs - package org.apache.hadoop.fs.kfs
- A client for the Kosmos filesystem (KFS)
- org.apache.hadoop.fs.permission - package org.apache.hadoop.fs.permission
-
- org.apache.hadoop.fs.s3 - package org.apache.hadoop.fs.s3
- A distributed implementation of
FileSystem
that uses Amazon S3. - org.apache.hadoop.hbase - package org.apache.hadoop.hbase
- Provides HBase, the Hadoop simple database.
- org.apache.hadoop.hbase.filter - package org.apache.hadoop.hbase.filter
-
- org.apache.hadoop.hbase.generated.master - package org.apache.hadoop.hbase.generated.master
-
- org.apache.hadoop.hbase.generated.regionserver - package org.apache.hadoop.hbase.generated.regionserver
-
- org.apache.hadoop.hbase.hql - package org.apache.hadoop.hbase.hql
-
- org.apache.hadoop.hbase.hql.formatter - package org.apache.hadoop.hbase.hql.formatter
-
- org.apache.hadoop.hbase.hql.generated - package org.apache.hadoop.hbase.hql.generated
-
- org.apache.hadoop.hbase.io - package org.apache.hadoop.hbase.io
-
- org.apache.hadoop.hbase.ipc - package org.apache.hadoop.hbase.ipc
-
- org.apache.hadoop.hbase.mapred - package org.apache.hadoop.hbase.mapred
-
- org.apache.hadoop.hbase.rest - package org.apache.hadoop.hbase.rest
-
- org.apache.hadoop.hbase.thrift - package org.apache.hadoop.hbase.thrift
- Provides an HBase Thrift
service.
- org.apache.hadoop.hbase.thrift.generated - package org.apache.hadoop.hbase.thrift.generated
-
- org.apache.hadoop.hbase.util - package org.apache.hadoop.hbase.util
-
- org.apache.hadoop.io - package org.apache.hadoop.io
- Generic i/o code for use when reading and writing data to the network,
to databases, and to files.
- org.apache.hadoop.io.compress - package org.apache.hadoop.io.compress
-
- org.apache.hadoop.io.compress.lzo - package org.apache.hadoop.io.compress.lzo
-
- org.apache.hadoop.io.compress.zlib - package org.apache.hadoop.io.compress.zlib
-
- org.apache.hadoop.io.retry - package org.apache.hadoop.io.retry
-
A mechanism for selectively retrying methods that throw exceptions under certain circumstances.
- org.apache.hadoop.ipc - package org.apache.hadoop.ipc
- Tools to help define network clients and servers.
- org.apache.hadoop.ipc.metrics - package org.apache.hadoop.ipc.metrics
-
- org.apache.hadoop.log - package org.apache.hadoop.log
-
- org.apache.hadoop.mapred - package org.apache.hadoop.mapred
- A software framework for easily writing applications which process vast
amounts of data (multi-terabyte data-sets) in parallel on large clusters
(thousands of nodes) built of commodity hardware in a reliable, fault-tolerant
manner.
- org.apache.hadoop.mapred.jobcontrol - package org.apache.hadoop.mapred.jobcontrol
- Utilities for managing dependent jobs.
- org.apache.hadoop.mapred.join - package org.apache.hadoop.mapred.join
- Given a set of sorted datasets keyed with the same class and yielding equal
partitions, it is possible to effect a join of those datasets prior to the map.
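The map-side join described here works because both inputs are sorted on the same key: the datasets can be advanced in lockstep and matching keys emitted together, with no shuffle needed. A self-contained sketch of that two-pointer merge-join idea (not the mapred.join implementation):

```java
import java.util.ArrayList;
import java.util.List;

// Merge-join over two sorted key arrays: the same precondition the
// mapred.join package places on its inputs (sorted, equal partitions).
class MergeJoin {
    static List<int[]> join(int[] a, int[] b) {
        List<int[]> out = new ArrayList<>();
        int i = 0, j = 0;
        while (i < a.length && j < b.length) {
            if (a[i] < b[j]) {
                i++;               // advance the side with the smaller key
            } else if (a[i] > b[j]) {
                j++;
            } else {
                out.add(new int[]{a[i], b[j]});  // keys match: emit pair
                i++;
                j++;
            }
        }
        return out;
    }
}
```

Joining `{1, 3, 5}` with `{3, 5, 7}` this way yields the pairs for keys 3 and 5 in a single linear pass.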
- org.apache.hadoop.mapred.lib - package org.apache.hadoop.mapred.lib
- Library of generally useful mappers, reducers, and partitioners.
- org.apache.hadoop.mapred.lib.aggregate - package org.apache.hadoop.mapred.lib.aggregate
- Classes for performing various counting and aggregations.
- org.apache.hadoop.mapred.pipes - package org.apache.hadoop.mapred.pipes
- Hadoop Pipes allows C++ code to use Hadoop DFS and map/reduce.
- org.apache.hadoop.metrics - package org.apache.hadoop.metrics
- This package defines an API for reporting performance metric information.
- org.apache.hadoop.metrics.file - package org.apache.hadoop.metrics.file
- Implementation of the metrics package that writes the metrics to a file.
- org.apache.hadoop.metrics.ganglia - package org.apache.hadoop.metrics.ganglia
- Implementation of the metrics package that sends metric data to
Ganglia.
- org.apache.hadoop.metrics.jvm - package org.apache.hadoop.metrics.jvm
-
- org.apache.hadoop.metrics.spi - package org.apache.hadoop.metrics.spi
- The Service Provider Interface for the Metrics API.
- org.apache.hadoop.metrics.util - package org.apache.hadoop.metrics.util
-
- org.apache.hadoop.net - package org.apache.hadoop.net
- Network-related classes.
- org.apache.hadoop.record - package org.apache.hadoop.record
- Hadoop record I/O contains classes and a record description language
translator for simplifying serialization and deserialization of records in a
language-neutral manner.
- org.apache.hadoop.record.compiler - package org.apache.hadoop.record.compiler
- This package contains classes needed for code generation
from the hadoop record compiler.
- org.apache.hadoop.record.compiler.ant - package org.apache.hadoop.record.compiler.ant
-
- org.apache.hadoop.record.compiler.generated - package org.apache.hadoop.record.compiler.generated
- This package contains code generated by JavaCC from the
Hadoop record syntax file rcc.jj.
- org.apache.hadoop.record.meta - package org.apache.hadoop.record.meta
-
- org.apache.hadoop.security - package org.apache.hadoop.security
-
- org.apache.hadoop.streaming - package org.apache.hadoop.streaming
- Hadoop Streaming is a utility which allows users to create and run
Map-Reduce jobs with any executables (e.g. Unix shell utilities) as the
mapper and/or the reducer.
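A streaming mapper is just a program that reads input lines on stdin and writes tab-separated key/value lines to stdout. A hypothetical word-count mapper sketching that contract (ordinary Python, not Hadoop code):

```python
def map_words(lines):
    """Word-count mapper body: emit one 'word<TAB>1' record per token,
    the line format Hadoop Streaming expects on stdout."""
    records = []
    for line in lines:
        for word in line.split():
            records.append(word + "\t1")
    return records

# As a streaming executable this function would be driven by:
#   for rec in map_words(sys.stdin): print(rec)
```

Such a script would typically be passed to the streaming jar via -mapper, with a similar script or a shell utility as the -reducer.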
- org.apache.hadoop.tools - package org.apache.hadoop.tools
-
- org.apache.hadoop.util - package org.apache.hadoop.util
- Common utilities.
- org.onelab.filter - package org.onelab.filter
-
- out -
Variable in class org.apache.hadoop.io.compress.CompressionOutputStream
- The output stream to be compressed.
- OuterJoinRecordReader<K extends WritableComparable> - Class in org.apache.hadoop.mapred.join
- Full outer join.
- outerrThreadsThrowable -
Variable in class org.apache.hadoop.streaming.PipeMapRed
-
- output_ -
Variable in class org.apache.hadoop.streaming.StreamJob
-
- OUTPUT_TABLE -
Static variable in class org.apache.hadoop.hbase.mapred.TableOutputFormat
- JobConf parameter that specifies the output table
- OutputCollector<K extends WritableComparable,V extends Writable> - Interface in org.apache.hadoop.mapred
- Collects the <key, value> pairs output by Mappers and Reducers.
- outputColumnsMime(MultiPartResponse, Map<Text, byte[]>) -
Method in class org.apache.hadoop.hbase.rest.GenericHandler
-
- outputColumnsXml(XMLOutputter, Map<Text, byte[]>) -
Method in class org.apache.hadoop.hbase.rest.GenericHandler
-
- OutputFormat<K extends WritableComparable,V extends Writable> - Interface in org.apache.hadoop.mapred
- OutputFormat describes the output-specification for a Map-Reduce job.
- OutputFormatBase<K extends WritableComparable,V extends Writable> - Class in org.apache.hadoop.mapred
- A base class for OutputFormat.
- OutputFormatBase() -
Constructor for class org.apache.hadoop.mapred.OutputFormatBase
-
- outputFormatSpec_ -
Variable in class org.apache.hadoop.streaming.StreamJob
-
- OutputRecord - Class in org.apache.hadoop.metrics.spi
- Represents a record of metric data to be sent to a metrics system.
- outputSingleNode_ -
Variable in class org.apache.hadoop.streaming.StreamJob
-
- OverrideRecordReader<K extends WritableComparable,V extends Writable> - Class in org.apache.hadoop.mapred.join
- Prefer the "rightmost" data source for this key.
- sleeper -
Variable in class org.apache.hadoop.hbase.HMaster
-
- Sleeper - Class in org.apache.hadoop.hbase.util
- Sleeper for current thread.
- Sleeper(int, AtomicBoolean) -
Constructor for class org.apache.hadoop.hbase.util.Sleeper
-
- SleepJob - Class in org.apache.hadoop.examples
- Dummy class for testing MR framework.
- SleepJob() -
Constructor for class org.apache.hadoop.examples.SleepJob
-
- SocksSocketFactory - Class in org.apache.hadoop.net
- Specialized SocketFactory to create sockets with a SOCKS proxy
- SocksSocketFactory() -
Constructor for class org.apache.hadoop.net.SocksSocketFactory
- Default empty constructor (for use with the reflection API).
- SocksSocketFactory(Proxy) -
Constructor for class org.apache.hadoop.net.SocksSocketFactory
- Constructor with a supplied Proxy
- solution(List<List<ColumnName>>) -
Method in interface org.apache.hadoop.examples.dancing.DancingLinks.SolutionAcceptor
- A callback to return a solution to the application.
- solve(int[], DancingLinks.SolutionAcceptor<ColumnName>) -
Method in class org.apache.hadoop.examples.dancing.DancingLinks
- Given a prefix, find solutions under it.
- solve(DancingLinks.SolutionAcceptor<ColumnName>) -
Method in class org.apache.hadoop.examples.dancing.DancingLinks
- Solve a complete problem
- solve(int[]) -
Method in class org.apache.hadoop.examples.dancing.Pentomino
- Find all of the solutions that start with the given prefix.
- solve() -
Method in class org.apache.hadoop.examples.dancing.Pentomino
- Find all of the solutions to the puzzle.
- solve() -
Method in class org.apache.hadoop.examples.dancing.Sudoku
-
- Sort - Class in org.apache.hadoop.examples
- This is the trivial map/reduce program that does absolutely nothing
other than use the framework to fragment and sort the input values.
- Sort() -
Constructor for class org.apache.hadoop.examples.Sort
-
- sort(Path[], Path, boolean) -
Method in class org.apache.hadoop.io.SequenceFile.Sorter
- Perform a file sort from a set of input files into an output file.
- sort(Path, Path) -
Method in class org.apache.hadoop.io.SequenceFile.Sorter
- The backwards compatible interface to sort.
- sortAndIterate(Path[], Path, boolean) -
Method in class org.apache.hadoop.io.SequenceFile.Sorter
- Perform a file sort from a set of input files and return an iterator.
- sortedLeases -
Variable in class org.apache.hadoop.hbase.Leases
-
- SortedMapWritable - Class in org.apache.hadoop.io
- A Writable SortedMap.
- SortedMapWritable() -
Constructor for class org.apache.hadoop.io.SortedMapWritable
- default constructor.
- SortedMapWritable(SortedMapWritable) -
Constructor for class org.apache.hadoop.io.SortedMapWritable
- Copy constructor.
- sortNodeList(ArrayList<DatanodeDescriptor>, String, String) -
Method in class org.apache.hadoop.dfs.JspHelper
-
- SOURCE_TAGS_FIELD -
Static variable in class org.apache.hadoop.contrib.utils.join.DataJoinReducerBase
-
- specialConstructor -
Variable in exception org.apache.hadoop.hbase.hql.generated.ParseException
- This variable determines which constructor was used to create
this object and thereby affects the semantics of the
"getMessage" method (see below).
- specialConstructor -
Variable in exception org.apache.hadoop.record.compiler.generated.ParseException
- This variable determines which constructor was used to create
this object and thereby affects the semantics of the
"getMessage" method (see below).
- specialToken -
Variable in class org.apache.hadoop.hbase.hql.generated.Token
- This field is used to access special tokens that occur prior to this
token, but after the immediately preceding regular (non-special) token.
- specialToken -
Variable in class org.apache.hadoop.record.compiler.generated.Token
- This field is used to access special tokens that occur prior to this
token, but after the immediately preceding regular (non-special) token.
- split(int) -
Method in class org.apache.hadoop.examples.dancing.DancingLinks
- Generate a list of row choices to cover the first moves.
- splitKeyVal(byte[], int, int, Text, Text, int) -
Static method in class org.apache.hadoop.streaming.UTF8ByteArrayUtils
- split a UTF-8 byte array into key and value
assuming that the delimiter is at splitpos.
- splitKeyVal(byte[], Text, Text, int) -
Static method in class org.apache.hadoop.streaming.UTF8ByteArrayUtils
- split a UTF-8 byte array into key and value
assuming that the delimiter is at splitpos.
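Conceptually, splitKeyVal divides a UTF-8 byte buffer at a known delimiter offset: bytes before the delimiter become the key, bytes after it the value. A rough Python equivalent of that behavior (an illustration, not the Hadoop utility; the no-delimiter fallback shown here is an assumption):

```python
def split_key_val(buf: bytes, splitpos: int):
    """Split buf at splitpos, where buf[splitpos] is assumed to be the
    delimiter byte (e.g. a tab); returns (key, value) as bytes."""
    if splitpos < 0:
        # assumed fallback: no delimiter found, whole buffer is the key
        return buf, b""
    return buf[:splitpos], buf[splitpos + 1:]
```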
- StandardSocketFactory - Class in org.apache.hadoop.net
- Specialized SocketFactory to create standard (direct) sockets, without a proxy
- StandardSocketFactory() -
Constructor for class org.apache.hadoop.net.StandardSocketFactory
- Default empty constructor (for use with the reflection API).
- start() -
Method in class org.apache.hadoop.hbase.Leases
- Starts the lease monitor
- start() -
Method in class org.apache.hadoop.hbase.util.InfoServer
- Start the server.
- start() -
Method in class org.apache.hadoop.ipc.Server
- Starts the service.
- start() -
Method in class org.apache.hadoop.mapred.StatusHttpServer
- Start the server.
- START_ROW -
Static variable in class org.apache.hadoop.hbase.rest.GenericHandler
-
- startBatchUpdate(Text) -
Method in class org.apache.hadoop.hbase.HTable
- Deprecated. Batch operations are now the default. startBatchUpdate is now
implemented by
HTable.startUpdate(Text)
- STARTING -
Static variable in interface org.apache.hadoop.hbase.hql.generated.HQLParserConstants
-
- startKey -
Variable in class org.apache.hadoop.hbase.thrift.generated.RegionDescriptor.Isset
-
- startKey -
Variable in class org.apache.hadoop.hbase.thrift.generated.RegionDescriptor
-
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.fs.ChecksumFileSystem
-
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Returns a local File that the user can write output to.
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Returns a local File that the user can write output to.
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
-
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.mapred.PhasedFileSystem
- Deprecated.
- startMap(String) -
Method in class org.apache.hadoop.record.BinaryRecordInput
-
- startMap(TreeMap, String) -
Method in class org.apache.hadoop.record.BinaryRecordOutput
-
- startMap(String) -
Method in class org.apache.hadoop.record.CsvRecordInput
-
- startMap(TreeMap, String) -
Method in class org.apache.hadoop.record.CsvRecordOutput
-
- startMap(String) -
Method in interface org.apache.hadoop.record.RecordInput
- Check the mark for start of the serialized map.
- startMap(TreeMap, String) -
Method in interface org.apache.hadoop.record.RecordOutput
- Mark the start of a map to be serialized.
- startMap(String) -
Method in class org.apache.hadoop.record.XmlRecordInput
-
- startMap(TreeMap, String) -
Method in class org.apache.hadoop.record.XmlRecordOutput
-
- startMonitoring() -
Method in class org.apache.hadoop.metrics.file.FileContext
- Starts or restarts monitoring by opening, in append mode, the
file specified by the fileName attribute, if given.
- startMonitoring() -
Method in interface org.apache.hadoop.metrics.MetricsContext
- Starts or restarts monitoring, the emitting of metrics records as they are
updated.
- startMonitoring() -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Starts or restarts monitoring, the emitting of metrics records.
- startMonitoring() -
Method in class org.apache.hadoop.metrics.spi.NullContext
- Do-nothing version of startMonitoring
- startNextPhase() -
Method in class org.apache.hadoop.util.Progress
- Called during execution to move to the next phase at this level in the
tree.
- startNotifier() -
Static method in class org.apache.hadoop.mapred.JobEndNotifier
-
- startRecord(String) -
Method in class org.apache.hadoop.record.BinaryRecordInput
-
- startRecord(Record, String) -
Method in class org.apache.hadoop.record.BinaryRecordOutput
-
- startRecord(String) -
Method in class org.apache.hadoop.record.CsvRecordInput
-
- startRecord(Record, String) -
Method in class org.apache.hadoop.record.CsvRecordOutput
-
- startRecord(String) -
Method in interface org.apache.hadoop.record.RecordInput
- Check the mark for start of the serialized record.
- startRecord(Record, String) -
Method in interface org.apache.hadoop.record.RecordOutput
- Mark the start of a record to be serialized.
- startRecord(String) -
Method in class org.apache.hadoop.record.XmlRecordInput
-
- startRecord(Record, String) -
Method in class org.apache.hadoop.record.XmlRecordOutput
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_args
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_args
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args.Isset
-
- startRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args
-
- startTracker(JobConf) -
Static method in class org.apache.hadoop.mapred.JobTracker
- Start the JobTracker with given configuration.
- startup() -
Method in class org.apache.hadoop.hbase.LocalHBaseCluster
- Start the cluster.
- startUpdate(Text) -
Method in class org.apache.hadoop.hbase.HTable
- Start an atomic row insertion/update.
- startUpdate(Text) -
Method in class org.apache.hadoop.hbase.io.BatchUpdate
- Start a batch row insertion/update.
- startUpgrade() -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Prepare for the upgrade.
- startupShutdownMessage(Class, String[], Log) -
Static method in class org.apache.hadoop.util.StringUtils
- Print a log message for starting up and shutting down
- startVector(String) -
Method in class org.apache.hadoop.record.BinaryRecordInput
-
- startVector(ArrayList, String) -
Method in class org.apache.hadoop.record.BinaryRecordOutput
-
- startVector(String) -
Method in class org.apache.hadoop.record.CsvRecordInput
-
- startVector(ArrayList, String) -
Method in class org.apache.hadoop.record.CsvRecordOutput
-
- startVector(String) -
Method in interface org.apache.hadoop.record.RecordInput
- Check the mark for start of the serialized vector.
- startVector(ArrayList, String) -
Method in interface org.apache.hadoop.record.RecordOutput
- Mark the start of a vector to be serialized.
- startVector(String) -
Method in class org.apache.hadoop.record.XmlRecordInput
-
- startVector(ArrayList, String) -
Method in class org.apache.hadoop.record.XmlRecordOutput
-
- stat2Paths(FileStatus[]) -
Static method in class org.apache.hadoop.fs.FileUtil
- convert an array of FileStatus to an array of Path
- stateChangeLog -
Static variable in class org.apache.hadoop.dfs.NameNode
-
- staticFlag -
Static variable in class org.apache.hadoop.hbase.hql.generated.SimpleCharStream
-
- staticFlag -
Static variable in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- StatusHttpServer - Class in org.apache.hadoop.mapred
- Create an embedded Jetty server to answer http requests.
- StatusHttpServer(String, String, int, boolean) -
Constructor for class org.apache.hadoop.mapred.StatusHttpServer
- Create a status server on the given port.
- StatusHttpServer.StackServlet - Class in org.apache.hadoop.mapred
- A very simple servlet to serve up a text representation of the current
stack traces.
- StatusHttpServer.StackServlet() -
Constructor for class org.apache.hadoop.mapred.StatusHttpServer.StackServlet
-
- StatusHttpServer.TaskGraphServlet - Class in org.apache.hadoop.mapred
- The servlet that outputs svg graphics for map / reduce task
statuses
- StatusHttpServer.TaskGraphServlet() -
Constructor for class org.apache.hadoop.mapred.StatusHttpServer.TaskGraphServlet
-
- statusUpdate(String, TaskStatus) -
Method in class org.apache.hadoop.mapred.TaskTracker
- Called periodically to report Task progress, from 0.0 to 1.0.
- STILL_WAITING -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- stop() -
Method in class org.apache.hadoop.dfs.NameNode
- Stop all NameNode threads and wait for all to finish.
- stop -
Variable in class org.apache.hadoop.hbase.Chore
-
- stop -
Variable in class org.apache.hadoop.hbase.Leases
-
- stop() -
Method in class org.apache.hadoop.hbase.util.InfoServer
- stop the server
- stop() -
Method in class org.apache.hadoop.ipc.Client
- Stop all threads related to this client.
- stop() -
Method in class org.apache.hadoop.ipc.Server
- Stops the service.
- stop() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
- set the thread state to STOPPING so that the
thread will stop when it wakes up.
- stop() -
Method in class org.apache.hadoop.mapred.StatusHttpServer
- stop the server
- stopClient() -
Static method in class org.apache.hadoop.hbase.ipc.HbaseRPC
- Stop all RPC client connections
- stopClient() -
Static method in class org.apache.hadoop.ipc.RPC
- Stop all RPC client connections
- stopMonitoring() -
Method in class org.apache.hadoop.metrics.file.FileContext
- Stops monitoring, closing the file.
- stopMonitoring() -
Method in interface org.apache.hadoop.metrics.MetricsContext
- Stops monitoring.
- stopMonitoring() -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Stops monitoring.
- stopNotifier() -
Static method in class org.apache.hadoop.mapred.JobEndNotifier
-
- stopRequested -
Variable in class org.apache.hadoop.hbase.HRegionServer
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args.Isset
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_args
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args.Isset
-
- stopRow -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_args
-
- StopRowFilter - Class in org.apache.hadoop.hbase.filter
- Implementation of RowFilterInterface that filters out rows greater than or
equal to a specified rowKey.
- StopRowFilter() -
Constructor for class org.apache.hadoop.hbase.filter.StopRowFilter
- Default constructor, filters nothing.
- StopRowFilter(Text) -
Constructor for class org.apache.hadoop.hbase.filter.StopRowFilter
- Constructor that takes a stopRowKey on which to filter
- stopRowKey -
Variable in class org.apache.hadoop.hbase.filter.StopRowFilter
-
- stopTracker() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- storageID -
Variable in class org.apache.hadoop.dfs.DatanodeID
-
- store(JobInProgress) -
Method in class org.apache.hadoop.mapred.CompletedJobStatusStore
- Persists a job in DFS.
- storeBlock(Block, File) -
Method in interface org.apache.hadoop.fs.s3.FileSystemStore
-
- storeINode(Path, INode) -
Method in interface org.apache.hadoop.fs.s3.FileSystemStore
-
- StreamBackedIterator<X extends Writable> - Class in org.apache.hadoop.mapred.join
- This class provides an implementation of ResetableIterator.
- StreamBackedIterator() -
Constructor for class org.apache.hadoop.mapred.join.StreamBackedIterator
-
- StreamBaseRecordReader - Class in org.apache.hadoop.streaming
- Shared functionality for hadoopStreaming formats.
- StreamBaseRecordReader(FSDataInputStream, FileSplit, Reporter, JobConf, FileSystem) -
Constructor for class org.apache.hadoop.streaming.StreamBaseRecordReader
-
- streamBlockInAscii(InetSocketAddress, long, long, long, long, JspWriter) -
Method in class org.apache.hadoop.dfs.JspHelper
-
- StreamFile - Class in org.apache.hadoop.dfs
-
- StreamFile() -
Constructor for class org.apache.hadoop.dfs.StreamFile
-
- StreamInputFormat - Class in org.apache.hadoop.streaming
- An input format that selects a RecordReader based on a JobConf property.
- StreamInputFormat() -
Constructor for class org.apache.hadoop.streaming.StreamInputFormat
-
- StreamJob - Class in org.apache.hadoop.streaming
- All the client-side work happens here.
- StreamJob(String[], boolean) -
Constructor for class org.apache.hadoop.streaming.StreamJob
-
- StreamLineRecordReader - Class in org.apache.hadoop.streaming
- Deprecated.
- StreamLineRecordReader(Configuration, FileSplit) -
Constructor for class org.apache.hadoop.streaming.StreamLineRecordReader
- Deprecated.
- StreamOutputFormat - Class in org.apache.hadoop.streaming
- Deprecated.
- StreamOutputFormat() -
Constructor for class org.apache.hadoop.streaming.StreamOutputFormat
- Deprecated.
- StreamSequenceRecordReader - Class in org.apache.hadoop.streaming
- Deprecated.
- StreamSequenceRecordReader(Configuration, FileSplit) -
Constructor for class org.apache.hadoop.streaming.StreamSequenceRecordReader
- Deprecated.
- StreamUtil - Class in org.apache.hadoop.streaming
- Utilities not available elsewhere in Hadoop.
- StreamUtil() -
Constructor for class org.apache.hadoop.streaming.StreamUtil
-
- StreamXmlRecordReader - Class in org.apache.hadoop.streaming
- A way to interpret XML fragments as Mapper input records.
- StreamXmlRecordReader(FSDataInputStream, FileSplit, Reporter, JobConf, FileSystem) -
Constructor for class org.apache.hadoop.streaming.StreamXmlRecordReader
-
- STRING -
Static variable in class org.apache.hadoop.record.meta.TypeID.RIOType
-
- STRING_LITERAL -
Static variable in interface org.apache.hadoop.hbase.hql.generated.HQLParserConstants
-
- STRING_VALUE_MAX -
Static variable in class org.apache.hadoop.mapred.lib.aggregate.ValueAggregatorBaseDescriptor
-
- STRING_VALUE_MIN -
Static variable in class org.apache.hadoop.mapred.lib.aggregate.ValueAggregatorBaseDescriptor
-
- stringifyException(Throwable) -
Static method in class org.apache.hadoop.util.StringUtils
- Make a string representation of the exception.
- stringifySolution(int, int, List<List<Pentomino.ColumnName>>) -
Static method in class org.apache.hadoop.examples.dancing.Pentomino
- Convert a solution to the puzzle returned by the model into a string
that represents the placement of the pieces onto the board.
- stringToBytes(String) -
Static method in class org.apache.hadoop.hbase.util.Writables
- Converts a string to a byte array in a consistent manner.
- stringToPath(String[]) -
Static method in class org.apache.hadoop.util.StringUtils
-
- stringToURI(String[]) -
Static method in class org.apache.hadoop.util.StringUtils
-
- StringTypeID -
Static variable in class org.apache.hadoop.record.meta.TypeID
-
- StringUtils - Class in org.apache.hadoop.util
- General string utils
- StringUtils() -
Constructor for class org.apache.hadoop.util.StringUtils
-
- StringValueMax - Class in org.apache.hadoop.mapred.lib.aggregate
- This class implements a value aggregator that maintains the biggest of
a sequence of strings.
- StringValueMax() -
Constructor for class org.apache.hadoop.mapred.lib.aggregate.StringValueMax
- the default constructor
- StringValueMin - Class in org.apache.hadoop.mapred.lib.aggregate
- This class implements a value aggregator that maintains the smallest of
a sequence of strings.
- StringValueMin() -
Constructor for class org.apache.hadoop.mapred.lib.aggregate.StringValueMin
- the default constructor
- STRUCT -
Static variable in class org.apache.hadoop.record.meta.TypeID.RIOType
-
- StructTypeID - Class in org.apache.hadoop.record.meta
- Represents typeID for a struct
- StructTypeID(RecordTypeInfo) -
Constructor for class org.apache.hadoop.record.meta.StructTypeID
- Create a StructTypeID based on the RecordTypeInfo of some record
- subMap(WritableComparable, WritableComparable) -
Method in class org.apache.hadoop.io.SortedMapWritable
-
- submit() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
- Submit this job to mapred.
- submitAndMonitorJob() -
Method in class org.apache.hadoop.streaming.StreamJob
-
- submitJob(String) -
Method in class org.apache.hadoop.mapred.JobClient
- Submit a job to the MR system.
- submitJob(JobConf) -
Method in class org.apache.hadoop.mapred.JobClient
- Submit a job to the MR system.
- submitJob(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Submit a Job for execution.
- submitJob(String) -
Method in class org.apache.hadoop.mapred.JobTracker
- JobTracker.submitJob() kicks off a new job.
- submitJob(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Submit a job to the map/reduce cluster.
- Submitter - Class in org.apache.hadoop.mapred.pipes
- The main entry point and job submitter.
- Submitter() -
Constructor for class org.apache.hadoop.mapred.pipes.Submitter
-
- SUCCEEDED -
Static variable in class org.apache.hadoop.mapred.JobStatus
-
- SUCCESS -
Static variable in class org.apache.hadoop.dfs.Balancer
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.get_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.get_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getColumnDescriptors_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getColumnDescriptors_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRow_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRow_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRowTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getRowTs_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableNames_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableNames_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableRegions_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getTableRegions_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVer_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVer_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVerTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.getVerTs_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerGet_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerGet_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpen_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenTs_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStop_result
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_result.Isset
-
- success -
Variable in class org.apache.hadoop.hbase.thrift.generated.Hbase.scannerOpenWithStopTs_result
-
- SUCCESS -
Static variable in class org.apache.hadoop.mapred.jobcontrol.Job
-
- Sudoku - Class in org.apache.hadoop.examples.dancing
- This class uses the dancing links algorithm from Knuth to solve sudoku
puzzles.
- Sudoku(InputStream) -
Constructor for class org.apache.hadoop.examples.dancing.Sudoku
- Set up a puzzle board to the given size.
- Sudoku.ColumnName - Interface in org.apache.hadoop.examples.dancing
- This interface is a marker class for the columns created for the
Sudoku solver.
- suffix(String) -
Method in class org.apache.hadoop.fs.Path
- Adds a suffix to the final name in the path.
- sum(Counters, Counters) -
Static method in class org.apache.hadoop.mapred.Counters
- Convenience method for computing the sum of two sets of counters.
- suspend() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
- suspend the running thread
- suspendEncoding() -
Method in class org.apache.hadoop.hbase.util.Base64.Base64OutputStream
- Suspends encoding of the stream.
- SwitchTo(int) -
Method in class org.apache.hadoop.hbase.hql.generated.HQLParserTokenManager
-
- SwitchTo(int) -
Method in class org.apache.hadoop.record.compiler.generated.RccTokenManager
-
- SYMBOL -
Variable in enum org.apache.hadoop.fs.permission.FsAction
- Symbolic representation
- symLink(String, String) -
Static method in class org.apache.hadoop.fs.FileUtil
- Create a soft link between a src and destination
only on a local disk.
- sync(long) -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Seek to the next sync mark past a given position.
- sync() -
Method in class org.apache.hadoop.io.SequenceFile.Writer
- create a sync point
- SYNC_INTERVAL -
Static variable in class org.apache.hadoop.io.SequenceFile
- The number of bytes between sync points.
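Sync points are what let a SequenceFile reader start at an arbitrary byte offset: it scans forward to the next sync marker and resumes reading whole records from there. A simplified illustration of that forward scan over an in-memory buffer (the real file format and marker bytes differ):

```python
def next_sync(data: bytes, pos: int, marker: bytes) -> int:
    """Return the offset just past the first sync marker found at or
    after pos, i.e. where record reading can resume; -1 if none left."""
    idx = data.find(marker, pos)
    return -1 if idx < 0 else idx + len(marker)
```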
- syncs -
Variable in class org.apache.hadoop.dfs.NameNodeMetrics
-
- syncSeen() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns true iff the previous call to next passed a sync mark.