- GenericOptionsParser - Class in org.apache.hadoop.util
- GenericOptionsParser is a utility to parse command line
arguments generic to the Hadoop framework.
- GenericOptionsParser(Configuration, String[]) -
Constructor for class org.apache.hadoop.util.GenericOptionsParser
- Create a GenericOptionsParser to parse only the generic Hadoop
arguments.
- GenericOptionsParser(Configuration, Options, String[]) -
Constructor for class org.apache.hadoop.util.GenericOptionsParser
- Create a
GenericOptionsParser
to parse given options as well
as generic Hadoop options.
- GenericWritable - Class in org.apache.hadoop.io
- A wrapper for Writable instances.
- GenericWritable() -
Constructor for class org.apache.hadoop.io.GenericWritable
-
- get(String, Object) -
Method in class org.apache.hadoop.conf.Configuration
- Deprecated. A side map of Configuration to Object should be used instead.
- get(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the name property, or null if
no such property exists.
- get(String, String) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the name property.
- get(int) -
Method in class org.apache.hadoop.dfs.LocatedBlocks
- Get located block.
- get(Configuration) -
Static method in class org.apache.hadoop.fs.FileSystem
- Returns the configured filesystem implementation.
- get(URI, Configuration) -
Static method in class org.apache.hadoop.fs.FileSystem
- Returns the FileSystem for this URI's scheme and authority.
- get(HStoreKey, int) -
Method in class org.apache.hadoop.hbase.HMemcache
- Look back through all the backlog TreeMaps to find the target.
- get(Text, Text, Text) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Retrieve a single value from the specified region for the specified row
and column keys
- get(Text, Text, Text, int) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Get the specified number of versions of the specified row and column
- get(Text, Text, Text, long, int) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Get the specified number of versions of the specified row and column with
the specified timestamp.
- get(Text, Text, Text) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Retrieve a single value from the specified region for the specified row
and column keys
- get(Text, Text, Text, int) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Get the specified number of versions of the specified row and column
- get(Text, Text, Text, long, int) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Get the specified number of versions of the specified row and column with
the specified timestamp.
- get(Text, Text) -
Method in class org.apache.hadoop.hbase.HTable
- Get a single value for the specified row and column
- get(Text, Text, int) -
Method in class org.apache.hadoop.hbase.HTable
- Get the specified number of versions of the specified row and column
- get(Text, Text, long, int) -
Method in class org.apache.hadoop.hbase.HTable
- Get the specified number of versions of the specified row and column with
the specified timestamp.
- get() -
Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
- Get the data from the BytesWritable.
- get() -
Method in class org.apache.hadoop.hbase.shell.TableFormatterFactory
-
- get(long, Writable) -
Method in class org.apache.hadoop.io.ArrayFile.Reader
- Return the nth value in the file.
- get() -
Method in class org.apache.hadoop.io.ArrayWritable
-
- get() -
Method in class org.apache.hadoop.io.BooleanWritable
- Returns the value of the BooleanWritable
- get() -
Method in class org.apache.hadoop.io.BytesWritable
- Get the data from the BytesWritable.
- get() -
Method in class org.apache.hadoop.io.FloatWritable
- Return the value of this FloatWritable.
- get() -
Method in class org.apache.hadoop.io.GenericWritable
- Return the wrapped instance.
- get() -
Method in class org.apache.hadoop.io.IntWritable
- Return the value of this IntWritable.
- get() -
Method in class org.apache.hadoop.io.LongWritable
- Return the value of this LongWritable.
- get(WritableComparable, Writable) -
Method in class org.apache.hadoop.io.MapFile.Reader
- Return the value for the named key, or null if none exists.
- get(Object) -
Method in class org.apache.hadoop.io.MapWritable
-
- get() -
Static method in class org.apache.hadoop.io.NullWritable
- Returns the single instance of this class.
- get() -
Method in class org.apache.hadoop.io.ObjectWritable
- Return the instance, or null if none.
- get(Text) -
Method in class org.apache.hadoop.io.SequenceFile.Metadata
-
- get(WritableComparable) -
Method in class org.apache.hadoop.io.SetFile.Reader
- Read the matching key from a set into key.
- get(Object) -
Method in class org.apache.hadoop.io.SortedMapWritable
-
- get() -
Method in class org.apache.hadoop.io.TwoDArrayWritable
-
- get() -
Method in class org.apache.hadoop.io.VIntWritable
- Return the value of this VIntWritable.
- get() -
Method in class org.apache.hadoop.io.VLongWritable
- Return the value of this VLongWritable.
- get(Class) -
Static method in class org.apache.hadoop.io.WritableComparator
- Get a comparator for a WritableComparable implementation.
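Many of the get() accessors above belong to Hadoop's Writable box types (IntWritable, LongWritable, BytesWritable, and so on), which pair a plain getter with write(DataOutput)/readFields(DataInput) serialization. A minimal sketch of that pattern, using only java.io and a hypothetical IntBox class rather than any actual Hadoop type:

```java
import java.io.*;

// Hypothetical IntBox, not Hadoop's IntWritable: a boxed int that can
// serialize itself to a DataOutput and repopulate itself from a DataInput.
class IntBox {
    private int value;

    IntBox() {}                       // no-arg constructor for deserialization
    IntBox(int value) { this.value = value; }

    int get() { return value; }       // accessor mirroring the get() entries above

    void write(DataOutput out) throws IOException { out.writeInt(value); }
    void readFields(DataInput in) throws IOException { value = in.readInt(); }

    // Serialize a value and read it back, returning the deserialized copy.
    static int roundTrip(int v) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            new IntBox(v).write(new DataOutputStream(bytes));
            IntBox copy = new IntBox();
            copy.readFields(new DataInputStream(
                new ByteArrayInputStream(bytes.toByteArray())));
            return copy.get();
        } catch (IOException e) {
            throw new UncheckedIOException(e);  // in-memory streams cannot fail
        }
    }
}
```

Reusing a single instance and calling readFields on it repeatedly is what makes this pattern allocation-friendly in tight map-reduce loops.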
- get() -
Static method in class org.apache.hadoop.ipc.Server
- Returns the server instance called under or null.
- get(DataInput) -
Static method in class org.apache.hadoop.record.BinaryRecordInput
- Get a thread-local record input for the supplied DataInput.
- get(DataOutput) -
Static method in class org.apache.hadoop.record.BinaryRecordOutput
- Get a thread-local record output for the supplied DataOutput.
- get() -
Method in class org.apache.hadoop.record.Buffer
- Get the data from the Buffer.
- get() -
Method in class org.apache.hadoop.util.Progress
- Returns the overall progress of the root.
- getAbsolutePath(String) -
Method in class org.apache.hadoop.streaming.PathFinder
- Returns the full path name of this file if it is listed in the path.
- getAddress(Configuration) -
Static method in class org.apache.hadoop.mapred.JobTracker
-
- getAllTasks() -
Method in class org.apache.hadoop.mapred.JobHistory.JobInfo
- Returns all map and reduce tasks.
- getAnalyzerName() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getApproxChkSumLength(long) -
Static method in class org.apache.hadoop.fs.ChecksumFileSystem
-
- getArchiveClassPaths(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get the archive entries in classpath as an array of Path
- getArchiveTimestamps(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get the timestamps of the archives
- getAssignedTracker(String) -
Method in class org.apache.hadoop.mapred.JobTracker
- Get tracker name for a given task id.
- getAttribute(String) -
Method in class org.apache.hadoop.hbase.util.InfoServer
- Get the value in the webapp context.
- getAttribute(String) -
Method in class org.apache.hadoop.mapred.StatusHttpServer
- Get the value in the webapp context.
- getAttribute(String) -
Method in class org.apache.hadoop.metrics.ContextFactory
- Returns the value of the named attribute, or null if there is no
attribute of that name.
- getAttribute(String) -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Convenience method for subclasses to access factory attributes.
- getAttributeNames() -
Method in class org.apache.hadoop.metrics.ContextFactory
- Returns the names of all the factory's attributes.
- getAttributeTable(String) -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Returns an attribute-value map derived from the factory attributes
by finding all factory attributes that begin with
contextName.tableName.
- getAvailable() -
Method in class org.apache.hadoop.fs.DF
-
- getBasePathInJarOut(String) -
Method in class org.apache.hadoop.streaming.JarBuilder
-
- getBasicCommand() -
Method in class org.apache.hadoop.hbase.shell.BasicCommand
-
- getBeginColumn() -
Method in class org.apache.hadoop.hbase.shell.generated.SimpleCharStream
-
- getBeginColumn() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getBeginLine() -
Method in class org.apache.hadoop.hbase.shell.generated.SimpleCharStream
-
- getBeginLine() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getBindAddress() -
Method in class org.apache.hadoop.hbase.HServerAddress
-
- getBlockLocations(String, long, long) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getBlocks(DatanodeInfo, long) -
Method in class org.apache.hadoop.dfs.NameNode
- Return a list of blocks and their locations on the given
datanode, whose total size is size.
- getBlocks() -
Method in class org.apache.hadoop.fs.s3.INode
-
- getBlockSize() -
Method in class org.apache.hadoop.fs.FileStatus
- Get the block size of the file.
- getBlockSize(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Deprecated. Use getFileStatus() instead
- getBloomFilter() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getBoolean(String, boolean) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the name property as a boolean.
- getBoost(String) -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getBoundAntProperty(String, String) -
Static method in class org.apache.hadoop.streaming.StreamUtil
-
- getBytes(HStoreKey) -
Static method in class org.apache.hadoop.hbase.HStoreKey
- Returns row and column bytes out of an HStoreKey.
- getBytes(Writable) -
Static method in class org.apache.hadoop.hbase.util.Writables
-
- getBytes() -
Method in class org.apache.hadoop.io.Text
- Returns the raw bytes.
- getBytes() -
Method in class org.apache.hadoop.io.UTF8
- Deprecated. The raw bytes.
- getBytes(String) -
Static method in class org.apache.hadoop.io.UTF8
- Deprecated. Convert a string to a UTF-8 encoded byte array.
- getBytes() -
Method in class org.onelab.filter.Key
-
- getBytesPerChecksum() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getBytesPerSum() -
Method in class org.apache.hadoop.fs.ChecksumFileSystem
- Return the bytes per checksum.
- getBytesRead() -
Method in class org.apache.hadoop.io.compress.zlib.ZlibCompressor
- Returns the total number of uncompressed bytes input so far.
- getBytesRead() -
Method in class org.apache.hadoop.io.compress.zlib.ZlibDecompressor
- Returns the total number of uncompressed bytes input so far.
- getBytesWritten() -
Method in class org.apache.hadoop.io.compress.zlib.ZlibCompressor
- Returns the total number of compressed bytes output so far.
- getBytesWritten() -
Method in class org.apache.hadoop.io.compress.zlib.ZlibDecompressor
- Returns the total number of compressed bytes output so far.
- getCacheArchives(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get cache archives set in the Configuration
- getCacheFiles(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get cache files set in the Configuration
- getCapacity() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- The raw capacity.
- getCapacity() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem.DiskStatus
-
- getCapacity() -
Method in class org.apache.hadoop.fs.DF
-
- getCapacity() -
Method in class org.apache.hadoop.io.BytesWritable
- Get the capacity, which is the maximum size that could be handled
without resizing the backing storage.
- getCapacity() -
Method in class org.apache.hadoop.record.Buffer
- Get the capacity, which is the maximum count that could be handled
without resizing the backing storage.
- getCategory(List<List<Pentomino.ColumnName>>) -
Method in class org.apache.hadoop.examples.dancing.Pentomino
- Find whether the solution has the x in the upper left quadrant, the
x-midline, the y-midline or in the center.
- getChecksumFile(Path) -
Method in class org.apache.hadoop.fs.ChecksumFileSystem
- Return the name of the checksum file associated with a file.
- getChecksumFileLength(Path, long) -
Method in class org.apache.hadoop.fs.ChecksumFileSystem
- Return the length of the checksum file given the size of the
actual file.
- getChecksumLength(long, int) -
Static method in class org.apache.hadoop.fs.ChecksumFileSystem
- Calculate the length of the checksum file in bytes.
- getChecksumSize() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getChecksumType() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getChunkPosition(long) -
Method in class org.apache.hadoop.fs.FSInputChecker
- Return the position of the beginning of the chunk containing pos.
- getClass(String, Class<?>) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the name property as a Class.
- getClass(String, Class<? extends U>, Class<U>) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the name property as a Class
implementing the interface specified by xface.
- getClass(byte) -
Method in class org.apache.hadoop.io.AbstractMapWritable
-
- getClass(String, Configuration) -
Static method in class org.apache.hadoop.io.WritableName
- Return the class for a name.
- getClassByName(String) -
Method in class org.apache.hadoop.conf.Configuration
- Load a class by name.
- getClassByName(String) -
Static method in class org.apache.hadoop.contrib.utils.join.DataJoinJob
-
- getClassLoader() -
Method in class org.apache.hadoop.conf.Configuration
- Get the ClassLoader for this job.
- getClassName() -
Method in exception org.apache.hadoop.ipc.RemoteException
-
- getClientVersion() -
Method in exception org.apache.hadoop.ipc.RPC.VersionMismatch
- Get the client's preferred version.
- getClosest(WritableComparable, Writable) -
Method in class org.apache.hadoop.io.MapFile.Reader
- Finds the record that is the closest match to the specified key.
- getClusterNick() -
Method in class org.apache.hadoop.streaming.StreamJob
-
- getClusterStatus() -
Method in class org.apache.hadoop.mapred.JobClient
- Get status information about the Map-Reduce cluster.
- getClusterStatus() -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Get the current status of the cluster
- getClusterStatus() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getCodec(Path) -
Method in class org.apache.hadoop.io.compress.CompressionCodecFactory
- Find the relevant compression codec for the given file based on its
filename suffix.
- getCodecClasses(Configuration) -
Static method in class org.apache.hadoop.io.compress.CompressionCodecFactory
- Get the list of codecs listed in the configuration
- getColumn() -
Method in class org.apache.hadoop.hbase.HLogEdit
-
- getColumn() -
Method in class org.apache.hadoop.hbase.HStoreKey
-
- getColumn() -
Method in class org.apache.hadoop.hbase.io.BatchOperation
-
- getColumn() -
Method in class org.apache.hadoop.hbase.shell.generated.Parser
-
- getColumn() -
Method in class org.apache.hadoop.hbase.shell.generated.SimpleCharStream
- Deprecated.
- getColumn(int) -
Method in class org.apache.hadoop.hbase.shell.InsertCommand
-
- getColumn() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
- Deprecated.
- getColumnDescriptor(String, Map<String, Object>) -
Method in class org.apache.hadoop.hbase.shell.SchemaModificationCommand
- Given a column name and column spec, returns an instance of
HColumnDescriptor representing the column spec.
- getColumnList(HBaseAdmin, HTable) -
Method in class org.apache.hadoop.hbase.shell.DeleteCommand
-
- getColumnName(int) -
Method in class org.apache.hadoop.examples.dancing.DancingLinks
- Get the name of a given column as a string
- getColumns() -
Method in class org.apache.hadoop.hbase.shell.generated.Parser
-
- getColumns(HBaseAdmin, boolean) -
Method in class org.apache.hadoop.hbase.shell.SelectCommand
- Make sense of the supplied list of columns.
- getCombinerClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the user-defined combiner class used to combine map-outputs
before being sent to the reducers.
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.DoubleValueSum
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMax
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMin
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueSum
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMax
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMin
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.UniqValueCount
-
- getCombinerOutput() -
Method in interface org.apache.hadoop.mapred.lib.aggregate.ValueAggregator
-
- getCombinerOutput() -
Method in class org.apache.hadoop.mapred.lib.aggregate.ValueHistogram
-
- getCommand() -
Method in class org.apache.hadoop.hbase.shell.BasicCommand
- Basic commands are their own factories.
- getCommand() -
Method in interface org.apache.hadoop.hbase.shell.CommandFactory
-
- getCommandLine() -
Method in class org.apache.hadoop.util.GenericOptionsParser
- Returns the commons-cli CommandLine object
to process the parsed arguments.
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.AlterCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.BasicCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.ClearCommand
-
- getCommandType() -
Method in interface org.apache.hadoop.hbase.shell.Command
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.CreateCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.DeleteCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.DisableCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.DropCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.EnableCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.ExitCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.FsCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.InsertCommand
-
- getCommandType() -
Method in class org.apache.hadoop.hbase.shell.JarCommand
-
- getCompression() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getCompressionCodec() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns the compression codec of data in this file.
- getCompressionCodec() -
Method in class org.apache.hadoop.io.SequenceFile.Writer
- Returns the compression codec of data in this file.
- getCompressionType() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getCompressionType(Configuration) -
Static method in class org.apache.hadoop.io.SequenceFile
- Get the compression type for the reduce outputs
- getCompressMapOutput() -
Method in class org.apache.hadoop.mapred.JobConf
- Should the outputs of the maps be compressed?
- getCompressorType() -
Method in interface org.apache.hadoop.io.compress.CompressionCodec
- Get the type of Compressor needed by this CompressionCodec.
- getCompressorType() -
Method in class org.apache.hadoop.io.compress.DefaultCodec
-
- getCompressorType() -
Method in class org.apache.hadoop.io.compress.GzipCodec
-
- getCompressorType() -
Method in class org.apache.hadoop.io.compress.LzoCodec
-
- getCompressOutput(JobConf) -
Static method in class org.apache.hadoop.mapred.OutputFormatBase
- Is the job output compressed?
- getConf() -
Method in interface org.apache.hadoop.conf.Configurable
- Return the configuration used by this object.
- getConf() -
Method in class org.apache.hadoop.conf.Configured
-
- getConf() -
Method in class org.apache.hadoop.fs.FilterFileSystem
-
- getConf() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getConf() -
Method in class org.apache.hadoop.io.AbstractMapWritable
-
- getConf() -
Method in class org.apache.hadoop.io.compress.DefaultCodec
-
- getConf() -
Method in class org.apache.hadoop.io.compress.LzoCodec
-
- getConf() -
Method in class org.apache.hadoop.io.GenericWritable
-
- getConf() -
Method in class org.apache.hadoop.io.ObjectWritable
-
- getConf() -
Method in class org.apache.hadoop.mapred.SequenceFileInputFilter.FilterBase
-
- getConf() -
Method in class org.apache.hadoop.net.SocksSocketFactory
-
- getConf() -
Method in class org.apache.hadoop.tools.Logalyzer.LogComparator
-
- getConf() -
Method in class org.apache.hadoop.util.CopyFiles
-
- getConf() -
Method in class org.apache.hadoop.util.ToolBase
- Deprecated.
- getConfResourceAsInputStream(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get an input stream attached to the configuration resource with the
given name.
- getConfResourceAsReader(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get a Reader attached to the configuration resource with the
given name.
- getConnection(Configuration) -
Static method in class org.apache.hadoop.hbase.HConnectionManager
- Get the connection object for the instance specified by the configuration.
If no current connection exists, create a new connection for that instance.
- getConnection() -
Method in class org.apache.hadoop.hbase.HTable
-
- getContentLength(Path) -
Method in class org.apache.hadoop.dfs.ChecksumDistributedFileSystem
-
- getContentLength(Path) -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getContentLength(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Return the number of bytes of the given path.
If f is a file, return the size of the file;
if f is a directory, return the size of the directory tree.
- getContentLength(Path) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getContext(String) -
Method in class org.apache.hadoop.metrics.ContextFactory
- Returns the named MetricsContext instance, constructing it if necessary
using the factory's current configuration attributes.
- getContext(String) -
Static method in class org.apache.hadoop.metrics.MetricsUtil
- Utility method to return the named context.
- getContext() -
Method in class org.apache.hadoop.streaming.PipeMapRed
-
- getContextFactory() -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Returns the factory by which this context was created.
- getContextName() -
Method in interface org.apache.hadoop.metrics.MetricsContext
- Returns the context name.
- getContextName() -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Returns the context name.
- getCorruptFiles() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the number of corrupted files.
- getCount() -
Method in class org.apache.hadoop.record.Buffer
- Get the current count of the buffer.
- getCounter(Enum) -
Method in class org.apache.hadoop.mapred.Counters
- Returns current value of the specified counter, or 0 if the counter
does not exist.
- getCounter(String) -
Method in class org.apache.hadoop.mapred.Counters.Group
- Returns the value of the specified counter, or 0 if the counter does
not exist.
- getCounterNames() -
Method in class org.apache.hadoop.mapred.Counters.Group
- Returns the counters for this group, with their names localized.
- getCounters() -
Method in interface org.apache.hadoop.mapred.RunningJob
- Gets the counters for this job.
- getCounters() -
Method in class org.apache.hadoop.mapred.TaskReport
- A table of counters.
- getCurrentSplit(JobConf) -
Static method in class org.apache.hadoop.streaming.StreamUtil
-
- getCurrentTrashDir() -
Method in class org.apache.hadoop.fs.FsShell
- Returns the Trash object associated with this shell.
- getCurrentValue(Writable) -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Get the 'value' corresponding to the last read 'key'.
- getCurrentValue(V) -
Method in class org.apache.hadoop.mapred.SequenceFileRecordReader
-
- getData() -
Method in class org.apache.hadoop.contrib.utils.join.TaggedMapOutput
-
- getData() -
Method in class org.apache.hadoop.io.DataOutputBuffer
- Returns the current contents of the buffer.
- getDataNode() -
Static method in class org.apache.hadoop.dfs.DataNode
- Return the DataNode object
- getDatanodeReport() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- A formatted string for reporting the status of the DataNode.
- getDatanodeReport(FSConstants.DatanodeReportType) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getDataNodeStats() -
Method in class org.apache.hadoop.dfs.ChecksumDistributedFileSystem
- Return statistics for each datanode.
- getDataNodeStats() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Return statistics for each datanode.
- getDate() -
Static method in class org.apache.hadoop.util.VersionInfo
- The date that Hadoop was compiled.
- getDeclaredClass() -
Method in class org.apache.hadoop.io.ObjectWritable
- Return the class this is meant to be.
- getDecompressorType() -
Method in interface org.apache.hadoop.io.compress.CompressionCodec
- Get the type of Decompressor needed by this CompressionCodec.
- getDecompressorType() -
Method in class org.apache.hadoop.io.compress.DefaultCodec
-
- getDecompressorType() -
Method in class org.apache.hadoop.io.compress.GzipCodec
-
- getDecompressorType() -
Method in class org.apache.hadoop.io.compress.LzoCodec
-
- getDefaultBlockSize() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getDefaultBlockSize() -
Method in class org.apache.hadoop.fs.FileSystem
- Return the number of bytes that large input files should optimally
be split into to minimize i/o time.
- getDefaultBlockSize() -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Return the number of bytes that large input files should optimally
be split into to minimize i/o time.
- getDefaultBlockSize() -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getDefaultExtension() -
Method in interface org.apache.hadoop.io.compress.CompressionCodec
- Get the default filename extension for this kind of compression.
- getDefaultExtension() -
Method in class org.apache.hadoop.io.compress.DefaultCodec
-
- getDefaultExtension() -
Method in class org.apache.hadoop.io.compress.GzipCodec
-
- getDefaultExtension() -
Method in class org.apache.hadoop.io.compress.LzoCodec
- Get the default filename extension for this kind of compression.
- getDefaultHost(String, String) -
Static method in class org.apache.hadoop.net.DNS
- Returns the default (first) host name associated by the provided
nameserver with the address bound to the specified network interface
- getDefaultHost(String) -
Static method in class org.apache.hadoop.net.DNS
- Returns the default (first) host name associated by the default
nameserver with the address bound to the specified network interface
- getDefaultIP(String) -
Static method in class org.apache.hadoop.net.DNS
- Returns the first available IP address associated with the provided
network interface
- getDefaultReplication() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getDefaultReplication() -
Method in class org.apache.hadoop.fs.FileSystem
- Get the default replication.
- getDefaultReplication() -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Get the default replication.
- getDefaultReplication() -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getDefaultSocketFactory(Configuration) -
Static method in class org.apache.hadoop.net.NetUtils
- Get the default socket factory as specified by the configuration
parameter hadoop.rpc.socket.factory.default
- getDependants() -
Method in class org.apache.hadoop.hbase.generated.master.hql_jsp
-
- getDependants() -
Method in class org.apache.hadoop.hbase.generated.master.master_jsp
-
- getDependants() -
Method in class org.apache.hadoop.hbase.generated.regionserver.regionserver_jsp
-
- getDependingJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getDescription() -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Description of the upgrade object for displaying.
- getDfsUsed() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- The space used by the data node.
- getDfsUsed() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem.DiskStatus
-
- getDiagnostics() -
Method in class org.apache.hadoop.mapred.TaskReport
- A list of error messages.
- getDigest() -
Method in class org.apache.hadoop.io.MD5Hash
- Returns the digest bytes.
- getDirPath() -
Method in class org.apache.hadoop.fs.DF
-
- getDirPath() -
Method in class org.apache.hadoop.fs.DU
-
- getDiskStatus() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Return the disk usage of the filesystem, including total capacity,
used space, and remaining space
- getDisplayName() -
Method in class org.apache.hadoop.mapred.Counters.Group
- Returns the localized name of the group.
- getDisplayName(String) -
Method in class org.apache.hadoop.mapred.Counters.Group
- Returns the localized name of the specified counter.
- getDistance(Node, Node) -
Method in class org.apache.hadoop.net.NetworkTopology
- Return the distance between two nodes.
It is assumed that the distance from one node to its parent is 1.
The distance between two nodes is calculated by summing up their distances
to their closest common ancestor.
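The getDistance contract above (each link to a parent counts as 1, and the distance is the sum of both nodes' hops to their closest common ancestor) can be sketched without Hadoop's NetworkTopology; TreeNode here is a hypothetical stand-in for org.apache.hadoop.net.Node:

```java
import java.util.*;

// Hypothetical TreeNode with a parent pointer, standing in for a
// NetworkTopology node (datacenter / rack / host hierarchy).
class TreeNode {
    final TreeNode parent;
    TreeNode(TreeNode parent) { this.parent = parent; }

    // Distance = a's hops to the closest common ancestor
    //          + b's hops to that same ancestor; each parent link counts 1.
    static int distance(TreeNode a, TreeNode b) {
        // Record every ancestor of a (including a itself) with its depth from a.
        Map<TreeNode, Integer> hopsFromA = new IdentityHashMap<>();
        int d = 0;
        for (TreeNode n = a; n != null; n = n.parent) hopsFromA.put(n, d++);
        // Walk up from b until we meet one of a's ancestors.
        d = 0;
        for (TreeNode n = b; n != null; n = n.parent, d++) {
            Integer up = hopsFromA.get(n);
            if (up != null) return up + d;
        }
        return -1;  // no common ancestor: nodes are in different trees
    }
}
```

For a host/rack/datacenter tree this yields the familiar values: 0 from a host to itself, 2 between two hosts on one rack, 4 across racks.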
- getDoubleValue(Object) -
Method in class org.apache.hadoop.contrib.utils.join.JobBase
-
- getDU(File) -
Static method in class org.apache.hadoop.fs.FileUtil
- Takes an input dir and returns the disk usage (du) of that local directory.
- getEditLogSize() -
Method in class org.apache.hadoop.dfs.NameNode
- Returns the size of the current edit log.
- getEmptier() -
Method in class org.apache.hadoop.fs.Trash
- Return a Runnable that periodically empties the trash.
- getEndColumn() -
Method in class org.apache.hadoop.hbase.shell.generated.SimpleCharStream
-
- getEndColumn() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getEndKey() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getEndKey() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getEndLine() -
Method in class org.apache.hadoop.hbase.shell.generated.SimpleCharStream
-
- getEndLine() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getEndRow() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getEntry(MapFile.Reader[], Partitioner<K, V>, K, V) -
Static method in class org.apache.hadoop.mapred.MapFileOutputFormat
- Get an entry from output generated by this class.
- getError() -
Static method in class org.apache.hadoop.metrics.jvm.EventCounter
-
- getEventId() -
Method in class org.apache.hadoop.mapred.TaskCompletionEvent
- Returns event Id.
- getExcludedHosts() -
Method in class org.apache.hadoop.util.HostsFileReader
-
- getExecString() -
Method in class org.apache.hadoop.fs.DF
-
- getExecString() -
Method in class org.apache.hadoop.fs.DU
-
- getExecString() -
Method in class org.apache.hadoop.fs.ShellCommand
- Return an array containing the command name and its parameters.
- getExecutable(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Get the URI of the application's executable.
- getFactor() -
Method in class org.apache.hadoop.io.SequenceFile.Sorter
- Get the number of streams to merge at once.
- getFactory(Class) -
Static method in class org.apache.hadoop.io.WritableFactories
- Define a factory for a class.
- getFactory() -
Static method in class org.apache.hadoop.metrics.ContextFactory
- Returns the singleton ContextFactory instance, constructing it if
necessary.
- getFailedJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- getFamilies() -
Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getFatal() -
Static method in class org.apache.hadoop.metrics.jvm.EventCounter
-
- getFetchFailedMaps() -
Method in class org.apache.hadoop.mapred.ReduceTaskStatus
-
- getFile(String, String) -
Method in class org.apache.hadoop.conf.Configuration
- Get a local file name under a directory named in dirsProp with
the given path.
- getFile() -
Method in class org.apache.hadoop.mapred.FileSplit
- Deprecated. Call
FileSplit.getPath()
instead.
- getFileCacheHints(Path, long, long) -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getFileCacheHints(Path, long, long) -
Method in class org.apache.hadoop.fs.FileSystem
- Return a 2D array of size 1x1 or greater, containing hostnames
where portions of the given file can be found.
- getFileCacheHints(Path, long, long) -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Return a 2D array of size 1x1 or greater, containing hostnames
where portions of the given file can be found.
- getFileCacheHints(Path, long, long) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
- Return null if the file doesn't exist; otherwise, get the
locations of the various chunks of the file from KFS.
- getFileCacheHints(Path, long, long) -
Method in class org.apache.hadoop.mapred.PhasedFileSystem
- Deprecated.
- getFileClassPaths(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get the file entries in classpath as an array of Path
- getFileInfo(String) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getFileLength() -
Method in class org.apache.hadoop.dfs.LocatedBlocks
-
- getFileName() -
Method in class org.apache.hadoop.metrics.file.FileContext
- Returns the configured file name, or null.
- getFiles(PathFilter) -
Method in class org.apache.hadoop.fs.InMemoryFileSystem
-
- getFileStatus(Path) -
Method in class org.apache.hadoop.dfs.ChecksumDistributedFileSystem
- Returns the stat information about the file.
- getFileStatus(Path) -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Returns the stat information about the file.
- getFileStatus(Path) -
Method in class org.apache.hadoop.dfs.HftpFileSystem
-
- getFileStatus(Path) -
Method in class org.apache.hadoop.fs.FileSystem
-
- getFileStatus(Path) -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Get file status.
- getFileStatus(Path) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getFileStatus(Path) -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
-
- getFileStatus(Path) -
Method in class org.apache.hadoop.fs.s3.S3FileSystem
- FileStatus for S3 file systems.
- getFilesystem() -
Method in class org.apache.hadoop.fs.DF
-
- getFileSystem(Configuration) -
Method in class org.apache.hadoop.fs.Path
- Return the FileSystem that owns this Path.
- getFilesystem() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getFileSystem() -
Method in class org.apache.hadoop.mapred.TaskTracker
- Return the DFS filesystem
- getFilesystemName() -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- A MapReduce system always operates on a single filesystem.
- getFilesystemName() -
Method in class org.apache.hadoop.mapred.JobTracker
- Grab the local fs name
- getFileTimestamps(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get the timestamps of the files
- getFileType() -
Method in class org.apache.hadoop.fs.s3.INode
-
- getFinishTime() -
Method in class org.apache.hadoop.mapred.TaskReport
- Get finish time of task.
- getFlippable() -
Method in class org.apache.hadoop.examples.dancing.Pentomino.Piece
-
- getFloat(String, float) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the
name
property as a float
.
- getFormattedTimeWithDiff(DateFormat, long, long) -
Static method in class org.apache.hadoop.util.StringUtils
- Formats time in ms and appends difference (finishTime - startTime)
as returned by formatTimeDiff().
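The getFormattedTimeWithDiff entry above describes combining a formatted timestamp with an elapsed-time suffix. A self-contained sketch of that idea follows; the helper names and the exact "Xhrs, Ymins, Zsec" layout are illustrative assumptions, not the StringUtils API itself.

```java
import java.text.SimpleDateFormat;
import java.util.Date;

class TimeWithDiff {
    // Render a millisecond difference as hours, minutes and seconds.
    static String formatTimeDiff(long finishTime, long startTime) {
        long secs = (finishTime - startTime) / 1000;
        return String.format("%dhrs, %dmins, %dsec",
                secs / 3600, (secs % 3600) / 60, secs % 60);
    }

    // Format the finish timestamp and, when a start time is known,
    // append the elapsed time in parentheses.
    static String formattedTimeWithDiff(SimpleDateFormat fmt,
                                        long finishTime, long startTime) {
        if (finishTime == 0) {
            return "";                       // e.g. task not finished yet
        }
        String out = fmt.format(new Date(finishTime));
        if (startTime != 0) {
            out += " (" + formatTimeDiff(finishTime, startTime) + ")";
        }
        return out;
    }

    public static void main(String[] args) {
        SimpleDateFormat fmt = new SimpleDateFormat("d-MMM-yyyy HH:mm:ss");
        long start = 1_000_000L;
        long finish = start + 3_725_000L;    // 1h 2m 5s later
        System.out.println(formattedTimeWithDiff(fmt, finish, start));
    }
}
```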
- getFs() -
Method in class org.apache.hadoop.mapred.JobClient
- Get a filesystem handle.
- getFsEditName() -
Method in class org.apache.hadoop.dfs.NameNode
- Returns the name of the edits file
- getFsImageName() -
Method in class org.apache.hadoop.dfs.NameNode
- Returns the name of the fsImage file
- getFsImageNameCheckpoint() -
Method in class org.apache.hadoop.dfs.NameNode
- Returns the name of the fsImage file uploaded by periodic
checkpointing
- getFSSize() -
Method in class org.apache.hadoop.fs.InMemoryFileSystem
-
- getFull(HStoreKey) -
Method in class org.apache.hadoop.hbase.HMemcache
- Return all the available columns for the given key.
- getGroup(String) -
Method in class org.apache.hadoop.mapred.Counters
- Returns the named counter group, or an empty group if there is none
with the specified name.
- getGroupNames() -
Method in class org.apache.hadoop.mapred.Counters
- Returns the names of all counter classes.
- getHadoopClientHome() -
Method in class org.apache.hadoop.streaming.StreamJob
-
- getHost() -
Method in class org.apache.hadoop.dfs.DatanodeID
-
- getHost() -
Method in class org.apache.hadoop.streaming.Environment
-
- getHostName() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
-
- getHostname() -
Static method in class org.apache.hadoop.util.StringUtils
- Return hostname without throwing exception.
- getHosts(String, String) -
Static method in class org.apache.hadoop.net.DNS
- Returns all the host names associated by the provided nameserver with the
address bound to the specified network interface
- getHosts(String) -
Static method in class org.apache.hadoop.net.DNS
- Returns all the host names associated by the default nameserver with the
address bound to the specified network interface
- getHosts() -
Method in class org.apache.hadoop.util.HostsFileReader
-
- getHRegionConnection(HServerAddress) -
Method in interface org.apache.hadoop.hbase.HConnection
- Establishes a connection to the region server at the specified address.
- getHRegionInfo(byte[]) -
Static method in class org.apache.hadoop.hbase.util.Writables
-
- getHRegionInfoOrNull(byte[]) -
Static method in class org.apache.hadoop.hbase.util.Writables
-
- getId() -
Method in class org.apache.hadoop.fs.s3.Block
-
- getId(Class) -
Method in class org.apache.hadoop.io.AbstractMapWritable
-
- GetImage() -
Method in class org.apache.hadoop.hbase.shell.generated.SimpleCharStream
-
- GetImage() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- GetImageServlet - Class in org.apache.hadoop.dfs
- This class is used in Namesystem's jetty to retrieve a file.
- GetImageServlet() -
Constructor for class org.apache.hadoop.dfs.GetImageServlet
-
- getIndexInterval() -
Method in class org.apache.hadoop.io.MapFile.Writer
- The number of entries that are added before an index entry is added.
- getInetSocketAddress() -
Method in class org.apache.hadoop.hbase.HServerAddress
-
- getInfo() -
Static method in class org.apache.hadoop.metrics.jvm.EventCounter
-
- getInfoPort() -
Method in class org.apache.hadoop.dfs.DatanodeID
-
- getInfoPort() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getInfoPort() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getInputFormat() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
InputFormat
implementation for the map-reduce job,
defaults to TextInputFormat
if not specified explicitly.
- getInputKeyClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Deprecated. Call
RecordReader.createKey()
.
- getInputPaths() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the list of input
Path
s for the map-reduce job.
- getInputSplit() -
Method in interface org.apache.hadoop.mapred.Reporter
- Get the
InputSplit
object for a map.
- getInputValueClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Deprecated. Call
RecordReader.createValue()
.
- getInt(String, int) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the
name
property as an int
.
- getInterfaceName() -
Method in exception org.apache.hadoop.ipc.RPC.VersionMismatch
- Get the interface name
- getInternalFilter() -
Method in class org.apache.hadoop.hbase.filter.WhileMatchRowFilter
- Returns the internal filter being wrapped
- getIPs(String) -
Static method in class org.apache.hadoop.net.DNS
- Returns all the IPs associated with the provided interface, if any, in
textual form.
- getIsJavaMapper(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Check whether the job is using a Java Mapper.
- getIsJavaRecordReader(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Check whether the job is using a Java RecordReader
- getIsJavaRecordWriter(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Will the reduce use a Java RecordWriter?
- getIsJavaReducer(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Check whether the job is using a Java Reducer.
- getIsMap() -
Method in class org.apache.hadoop.mapred.MapTaskStatus
-
- getIsMap() -
Method in class org.apache.hadoop.mapred.ReduceTaskStatus
-
- getJar() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the user jar for the map-reduce job.
- getJob(String) -
Method in class org.apache.hadoop.mapred.JobClient
- Get an
RunningJob
object to track an ongoing job.
- getJob(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getJobClient() -
Method in class org.apache.hadoop.mapred.TaskTracker
- The connection to the JobTracker, used by the TaskRunner
for locating remote files.
- getJobConf() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getJobCounters(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Grab the current job counters
- getJobCounters(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getJobEndNotificationURI() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the uri to be invoked in-order to send a notification after the job
has completed (success/failure).
- getJobFile() -
Method in class org.apache.hadoop.mapred.JobProfile
- Get the configuration file for the job.
- getJobFile() -
Method in interface org.apache.hadoop.mapred.RunningJob
- Get the path of the submitted job configuration.
- getJobID() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getJobId() -
Method in class org.apache.hadoop.mapred.JobProfile
- Get the job id.
- getJobId() -
Method in class org.apache.hadoop.mapred.JobStatus
-
- getJobID() -
Method in interface org.apache.hadoop.mapred.RunningJob
- Get the job identifier.
- getJobName() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the user-specified job name.
- getJobName() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getJobName() -
Method in class org.apache.hadoop.mapred.JobProfile
- Get the user-specified job name.
- getJobName() -
Method in interface org.apache.hadoop.mapred.RunningJob
- Get the name of the job.
- getJobPriority() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
JobPriority
for this job.
- getJobProfile(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Grab a handle to a job that is already known to the JobTracker.
- getJobProfile(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getJobStatus(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Grab a handle to a job that is already known to the JobTracker.
- getJobStatus(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getJobTrackerHostPort() -
Method in class org.apache.hadoop.streaming.StreamJob
-
- getJobTrackerMachine() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getJobTrackerState() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Get the current state of the
JobTracker
,
as JobTracker.State
- getKeepCommandFile(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Does the user want to keep the command file for debugging? If this is
true, pipes will write a copy of the command data to a file in the
task directory named "downlink.data", which may be used to run the C++
program under the debugger.
- getKeepFailedTaskFiles() -
Method in class org.apache.hadoop.mapred.JobConf
- Should the temporary files for failed tasks be kept?
- getKeepTaskFilesPattern() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the regular expression that is matched against the task names
to see if we need to keep the files.
- getKey() -
Method in interface org.apache.hadoop.io.SequenceFile.Sorter.RawKeyValueIterator
- Gets the current raw key
- getKey() -
Method in class org.apache.hadoop.io.SequenceFile.Sorter.SegmentDescriptor
- Returns the stored rawKey
- getKeyClass() -
Method in class org.apache.hadoop.io.MapFile.Reader
- Returns the class of keys in this file.
- getKeyClass() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns the class of keys in this file.
- getKeyClass() -
Method in class org.apache.hadoop.io.SequenceFile.Writer
- Returns the class of keys in this file.
- getKeyClass() -
Method in class org.apache.hadoop.io.WritableComparator
- Returns the WritableComparable implementation class.
- getKeyClass() -
Method in class org.apache.hadoop.mapred.KeyValueLineRecordReader
-
- getKeyClass() -
Method in class org.apache.hadoop.mapred.SequenceFileRecordReader
- The class of key that must be passed to
SequenceFileRecordReader.next(WritableComparable,Writable)
.
- getLastUpdate() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- The time when this information was accurate.
- getLen() -
Method in class org.apache.hadoop.fs.FileStatus
-
- getLength(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Deprecated. Use getFileStatus() instead
- getLength(Path) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
- Deprecated.
- getLength() -
Method in class org.apache.hadoop.fs.s3.Block
-
- getLength() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
- Get the total number of bytes in the data of the
InputSplit
.
- getLength() -
Method in class org.apache.hadoop.io.DataInputBuffer
- Returns the length of the input.
- getLength() -
Method in class org.apache.hadoop.io.DataOutputBuffer
- Returns the length of the valid data currently in the buffer.
- getLength() -
Method in class org.apache.hadoop.io.SequenceFile.Writer
- Returns the current length of the output file.
- getLength() -
Method in class org.apache.hadoop.io.Text
- Returns the number of bytes in the byte array
- getLength() -
Method in class org.apache.hadoop.io.UTF8
- Deprecated. The number of bytes in the encoded string.
- getLength() -
Method in class org.apache.hadoop.mapred.FileSplit
- The number of bytes in the file to process.
- getLength() -
Method in interface org.apache.hadoop.mapred.InputSplit
- Get the total number of bytes in the data of the
InputSplit
.
- getLength() -
Method in class org.apache.hadoop.mapred.MultiFileSplit
-
- getLength(int) -
Method in class org.apache.hadoop.mapred.MultiFileSplit
- Returns the length of the ith Path
- getLengths() -
Method in class org.apache.hadoop.mapred.MultiFileSplit
- Returns an array containing the lengths of the files in
the split
- getLevel() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- Return this node's level in the tree.
- getLevel() -
Method in interface org.apache.hadoop.net.Node
- Return this node's level in the tree.
- getLevel() -
Method in class org.apache.hadoop.net.NodeBase
- Return this node's level in the tree.
- getLine() -
Method in class org.apache.hadoop.hbase.shell.generated.SimpleCharStream
- Deprecated.
- getLine() -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
- Deprecated.
- getListenerAddress() -
Method in class org.apache.hadoop.ipc.Server
- Return the socket (ip+port) on which the RPC server is listening.
- getListing(String) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getLiteralValues() -
Method in class org.apache.hadoop.hbase.shell.generated.Parser
-
- getLoad() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getLoad() -
Method in class org.apache.hadoop.hbase.HServerLoad
-
- getLoadNativeLibraries(JobConf) -
Method in class org.apache.hadoop.util.NativeCodeLoader
- Return whether native hadoop libraries, if present, can be used for this job.
- getLocal(Configuration) -
Static method in class org.apache.hadoop.fs.FileSystem
- Get the local file system.
- getLocalCache(URI, Configuration, Path, boolean, long, Path) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Get the locally cached file or archive; it could either be
previously cached (and valid) or copy it from the
FileSystem
now.
- getLocalCacheArchives(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Return the path array of the localized caches
- getLocalCacheFiles(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Return the path array of the localized files
- getLocalDirs() -
Method in class org.apache.hadoop.mapred.JobConf
-
- getLocalJobFilePath(String) -
Static method in class org.apache.hadoop.mapred.JobHistory.JobInfo
- Get the path of the locally stored job file
- getLocalJobFilePath(String) -
Static method in class org.apache.hadoop.mapred.JobTracker
- Get the localized job file path on the job tracker's local file system.
- getLocalPath(String, String) -
Method in class org.apache.hadoop.conf.Configuration
- Get a local file under a directory named by dirsProp with
the given path.
- getLocalPath(String) -
Method in class org.apache.hadoop.mapred.JobConf
- Constructs a local file name.
- getLocalPathForWrite(String, Configuration) -
Method in class org.apache.hadoop.fs.LocalDirAllocator
- Get a path from the local FS.
- getLocalPathForWrite(String, long, Configuration) -
Method in class org.apache.hadoop.fs.LocalDirAllocator
- Get a path from the local FS.
- getLocalPathToRead(String, Configuration) -
Method in class org.apache.hadoop.fs.LocalDirAllocator
- Get a path from the local FS for reading.
- getLocatedBlocks() -
Method in class org.apache.hadoop.dfs.LocatedBlocks
- Get located blocks.
- getLocations() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
- Get the list of hostnames where the input split is located.
- getLocations() -
Method in class org.apache.hadoop.mapred.FileSplit
-
- getLocations() -
Method in interface org.apache.hadoop.mapred.InputSplit
- Get the list of hostnames where the input split is located.
- getLocations() -
Method in class org.apache.hadoop.mapred.MultiFileSplit
-
- getLockid() -
Method in class org.apache.hadoop.hbase.io.BatchUpdate
-
- getLog() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getLong(String, long) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the
name
property as a long
.
- getLongValue(Object) -
Method in class org.apache.hadoop.contrib.utils.join.JobBase
-
- getMapCompletionEvents(String, int, int) -
Method in class org.apache.hadoop.mapred.TaskTracker
-
- getMapOutputCompressionType() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
SequenceFile.CompressionType
for the map outputs.
- getMapOutputCompressorClass(Class<? extends CompressionCodec>) -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
CompressionCodec
for compressing the map outputs.
- getMapOutputKeyClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the key class for the map output data.
- getMapOutputValueClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the value class for the map output data.
- getMapperClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
Mapper
class for the job.
- getMapredJobID() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getMapRunnerClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
MapRunnable
class for the job.
- getMapTaskReports(String) -
Method in class org.apache.hadoop.mapred.JobClient
- Get the information of the current state of the map tasks of a job.
- getMapTaskReports(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Grab a bunch of info on the map tasks that make up the job
- getMapTaskReports(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getMapTasks() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Get the number of currently running map tasks in the cluster.
- getMaster() -
Method in class org.apache.hadoop.hbase.HBaseAdmin
-
- getMaster() -
Method in interface org.apache.hadoop.hbase.HConnection
-
- getMasterAddress() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getMaxBufferedDeleteTerms() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getMaxBufferedDocs() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getMaxFieldLength() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getMaxMapAttempts() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the configured number of maximum attempts that will be made to run a
map task, as specified by the
mapred.map.max.attempts
property.
- getMaxMapTaskFailuresPercent() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the maximum percentage of map tasks that can fail without
the job being aborted.
- getMaxMergeDocs() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getMaxReduceAttempts() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the configured number of maximum attempts that will be made to run a
reduce task, as specified by the
mapred.reduce.max.attempts
property.
- getMaxReduceTaskFailuresPercent() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the maximum percentage of reduce tasks that can fail without
the job being aborted.
- getMaxTaskFailuresPerTracker() -
Method in class org.apache.hadoop.mapred.JobConf
- Expert: Get the maximum no. of failures of a given job per tasktracker.
- getMaxTasks() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Get the maximum capacity for running tasks in the cluster.
- getMaxValueLength() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getMaxVersions() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getMemory() -
Method in class org.apache.hadoop.io.SequenceFile.Sorter
- Get the total amount of buffer memory, in bytes.
- getMergeFactor() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getMessage() -
Method in exception org.apache.hadoop.hbase.shell.generated.ParseException
- This method has the standard behavior when this object has been
created using the standard constructors.
- getMessage() -
Method in error org.apache.hadoop.hbase.shell.generated.TokenMgrError
- You can also modify the body of this method to customize your error messages.
- getMessage() -
Method in exception org.apache.hadoop.mapred.InvalidInputException
- Get a summary message of the problems found.
- getMessage() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getMessage() -
Method in exception org.apache.hadoop.record.compiler.generated.ParseException
- This method has the standard behavior when this object has been
created using the standard constructors.
- getMessage() -
Method in error org.apache.hadoop.record.compiler.generated.TokenMgrError
- You can also modify the body of this method to customize your error messages.
- getMetadata() -
Method in class org.apache.hadoop.io.SequenceFile.Metadata
-
- getMetadata() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns the metadata object of the file
- getMetric(String) -
Method in class org.apache.hadoop.metrics.spi.OutputRecord
- Returns the metric object which can be a Float, Integer, Short or Byte.
- getMetricNames() -
Method in class org.apache.hadoop.metrics.spi.OutputRecord
- Returns the set of metric names.
- getMissingIds() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return a list of missing block names (as list of Strings).
- getMissingSize() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return total size of missing data, in bytes.
- getModificationTime() -
Method in class org.apache.hadoop.fs.FileStatus
- Get the modification time of the file.
- getMount() -
Method in class org.apache.hadoop.fs.DF
-
- getMsg() -
Method in class org.apache.hadoop.hbase.HMsg
- Accessor
- getMsg() -
Method in class org.apache.hadoop.hbase.shell.ReturnMsg
-
- getName() -
Method in class org.apache.hadoop.dfs.DatanodeID
-
- getName() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Deprecated.
- getName() -
Method in class org.apache.hadoop.examples.dancing.Pentomino.Piece
-
- getName() -
Method in class org.apache.hadoop.fs.FileSystem
- Deprecated. call #getUri() instead.
- getName() -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Deprecated. call #getUri() instead.
- getName() -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
- Deprecated.
- getName() -
Method in class org.apache.hadoop.fs.Path
- Returns the final component of this path.
- getName() -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
- Deprecated.
- getName() -
Method in class org.apache.hadoop.fs.s3.S3FileSystem
-
- getName() -
Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getName() -
Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getName(Class) -
Static method in class org.apache.hadoop.io.WritableName
- Return the name for a class.
- getName() -
Method in class org.apache.hadoop.mapred.Counters.Group
- Returns raw name of the group.
- getName() -
Method in class org.apache.hadoop.mapred.PhasedFileSystem
- Deprecated.
- getName() -
Method in interface org.apache.hadoop.net.Node
- Return this node's name
- getName() -
Method in class org.apache.hadoop.net.NodeBase
- Return this node's name
- getNamed(String, Configuration) -
Static method in class org.apache.hadoop.fs.FileSystem
- Deprecated. call #get(URI,Configuration) instead.
- getNamenode() -
Method in class org.apache.hadoop.dfs.DataNode
- Return the namenode's identifier
- getNameNodeAddr() -
Method in class org.apache.hadoop.dfs.DataNode
-
- getNameNodeAddress() -
Method in class org.apache.hadoop.dfs.NameNode
- Returns the address on which the NameNode is listening.
- getNameNodeMetrics() -
Static method in class org.apache.hadoop.dfs.NameNode
-
- getNetworkLocation() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- Return the rack name of this node.
- getNetworkLocation() -
Method in interface org.apache.hadoop.net.Node
- Return the string representation of this node's network location
- getNetworkLocation() -
Method in class org.apache.hadoop.net.NodeBase
- Return this node's network location
- getNewJobId() -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Allocate a name for the job.
- getNewJobId() -
Method in class org.apache.hadoop.mapred.JobTracker
- Allocates a new JobId string.
- getNextToken() -
Method in class org.apache.hadoop.hbase.shell.generated.Parser
-
- getNextToken() -
Method in class org.apache.hadoop.hbase.shell.generated.ParserTokenManager
-
- getNextToken() -
Method in class org.apache.hadoop.record.compiler.generated.Rcc
-
- getNextToken() -
Method in class org.apache.hadoop.record.compiler.generated.RccTokenManager
-
- getNullContext(String) -
Static method in class org.apache.hadoop.metrics.ContextFactory
- Returns a "null" context - one which does nothing.
- getNumber() -
Method in class org.apache.hadoop.metrics.spi.MetricValue
-
- getNumberColumns() -
Method in class org.apache.hadoop.examples.dancing.DancingLinks
- Get the number of columns.
- getNumberOfRegions() -
Method in class org.apache.hadoop.hbase.HServerLoad
-
- getNumberOfRequests() -
Method in class org.apache.hadoop.hbase.HServerLoad
-
- getNumBytesInSum() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getNumFiles(PathFilter) -
Method in class org.apache.hadoop.fs.InMemoryFileSystem
-
- getNumMapTasks() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the configured number of map tasks for this job.
- getNumOfLeaves() -
Method in class org.apache.hadoop.net.NetworkTopology
- Return the total number of nodes
- getNumOfRacks() -
Method in class org.apache.hadoop.net.NetworkTopology
- Return the total number of racks
- getNumPaths() -
Method in class org.apache.hadoop.mapred.MultiFileSplit
- Returns the number of Paths in the split
- getNumReduceTasks() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the configured number of reduce tasks for this job.
- getObject(String) -
Method in class org.apache.hadoop.conf.Configuration
- Deprecated. A side map of Configuration to Object should be used instead.
- getOnlineMetaRegions() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getOnlineRegions() -
Method in class org.apache.hadoop.hbase.HRegionServer
-
- getOp() -
Method in class org.apache.hadoop.hbase.io.BatchOperation
-
- getOut() -
Method in class org.apache.hadoop.hbase.shell.BasicCommand
-
- getOut() -
Method in class org.apache.hadoop.hbase.shell.formatter.AsciiTableFormatter
-
- getOut() -
Method in class org.apache.hadoop.hbase.shell.formatter.HtmlTableFormatter
-
- getOut() -
Method in interface org.apache.hadoop.hbase.shell.TableFormatter
-
- getOutputCompressionType(JobConf) -
Static method in class org.apache.hadoop.mapred.SequenceFileOutputFormat
- Get the
SequenceFile.CompressionType
for the output SequenceFile
.
- getOutputCompressorClass(JobConf, Class<? extends CompressionCodec>) -
Static method in class org.apache.hadoop.mapred.OutputFormatBase
- Get the
CompressionCodec
for compressing the job outputs.
- getOutputFormat() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
OutputFormat
implementation for the map-reduce job,
defaults to TextOutputFormat
if not specified explicitly.
- getOutputKeyClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the key class for the job output data.
- getOutputKeyComparator() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
WritableComparable
comparator used to compare keys.
- getOutputPath() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
Path
to the output directory for the map-reduce job.
- getOutputValueClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the value class for job outputs.
- getOutputValueGroupingComparator() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the user defined
WritableComparable
comparator for
grouping keys of inputs to the reduce.
- getOverReplicatedBlocks() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the number of over-replicated blocks.
- getParent() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- Return this node's parent
- getParent() -
Method in class org.apache.hadoop.fs.Path
- Returns the parent of a path or null if at root.
- getParent() -
Method in interface org.apache.hadoop.net.Node
- Return this node's parent
- getParent() -
Method in class org.apache.hadoop.net.NodeBase
- Return this node's parent
- getPartition(IntWritable, IntWritable, int) -
Method in class org.apache.hadoop.examples.SleepJob
-
- getPartition(K2, V2, int) -
Method in class org.apache.hadoop.mapred.lib.HashPartitioner
- Use
Object.hashCode()
to partition.
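The HashPartitioner entry above says it uses Object.hashCode() to partition. A minimal sketch of that idea, assuming the conventional mask-then-modulo formula (the class name PartitionSketch is ours, not Hadoop's):

```java
// Hedged sketch of hash-based partitioning as described above: the mask with
// Integer.MAX_VALUE keeps the result non-negative even when hashCode() is
// negative, so every key lands in [0, numPartitions).
public class PartitionSketch {
    public static int getPartition(Object key, int numPartitions) {
        return (key.hashCode() & Integer.MAX_VALUE) % numPartitions;
    }

    public static void main(String[] args) {
        // Equal keys always map to the same partition (hence the same reducer).
        System.out.println(getPartition("foo", 4));
    }
}
```

Because the mapping depends only on the key's hash, all records sharing a key are routed to one reducer, which is what grouping in the reduce phase relies on.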
- getPartition(K2, V2, int) -
Method in class org.apache.hadoop.mapred.lib.KeyFieldBasedPartitioner
- Use
Object.hashCode()
to partition.
- getPartition(K2, V2, int) -
Method in interface org.apache.hadoop.mapred.Partitioner
- Get the partition number for a given key (hence record) given the total
number of partitions i.e.
- getPartitionerClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
Partitioner
used to partition Mapper
-outputs
to be sent to the Reducer
s.
- getPath() -
Method in class org.apache.hadoop.fs.FileStatus
-
- getPath() -
Method in class org.apache.hadoop.mapred.FileSplit
- The file containing this split's data.
- getPath(int) -
Method in class org.apache.hadoop.mapred.MultiFileSplit
- Returns the ith Path
- getPath(Node) -
Static method in class org.apache.hadoop.net.NodeBase
- Return this node's path
- getPaths() -
Method in class org.apache.hadoop.mapred.MultiFileSplit
- Returns all the Paths in the split
- getPercentUsed() -
Method in class org.apache.hadoop.fs.DF
-
- getPercentUsed() -
Method in class org.apache.hadoop.fs.InMemoryFileSystem
-
- getPeriod() -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Returns the timer period.
- getPlatformName() -
Static method in class org.apache.hadoop.util.PlatformName
- Get the complete platform as per the Java VM.
- getPort() -
Method in class org.apache.hadoop.dfs.DatanodeID
-
- getPort() -
Method in class org.apache.hadoop.hbase.HServerAddress
-
- getPort() -
Method in class org.apache.hadoop.hbase.util.InfoServer
- Get the port that the server is on
- getPort() -
Method in class org.apache.hadoop.mapred.StatusHttpServer
- Get the port that the server is on
- getPos() -
Method in class org.apache.hadoop.fs.BufferedFSInputStream
-
- getPos() -
Method in exception org.apache.hadoop.fs.ChecksumException
-
- getPos() -
Method in class org.apache.hadoop.fs.FSDataInputStream
-
- getPos() -
Method in class org.apache.hadoop.fs.FSDataOutputStream
-
- getPos() -
Method in class org.apache.hadoop.fs.FSInputChecker
-
- getPos() -
Method in class org.apache.hadoop.fs.FSInputStream
- Return the current offset from the start of the file
- getPos() -
Method in interface org.apache.hadoop.fs.Seekable
- Return the current offset from the start of the file
- getPos() -
Method in class org.apache.hadoop.mapred.KeyValueLineRecordReader
-
- getPos() -
Method in class org.apache.hadoop.mapred.LineRecordReader
-
- getPos() -
Method in interface org.apache.hadoop.mapred.RecordReader
- Returns the current position in the input.
- getPos() -
Method in class org.apache.hadoop.mapred.SequenceFileAsTextRecordReader
-
- getPos() -
Method in class org.apache.hadoop.mapred.SequenceFileRecordReader
-
- getPos() -
Method in class org.apache.hadoop.streaming.StreamBaseRecordReader
- Returns the current position in the input.
- getPosition() -
Method in class org.apache.hadoop.io.DataInputBuffer
- Returns the current position in the input.
- getPosition() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Return the current byte position in the input file.
- getPreferredBlockSize(String) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getProblems() -
Method in exception org.apache.hadoop.mapred.InvalidInputException
- Get the complete list of the problems reported.
- getProgress() -
Method in interface org.apache.hadoop.io.SequenceFile.Sorter.RawKeyValueIterator
- Gets the Progress object; this has a float (0.0 - 1.0)
indicating the bytes processed by the iterator so far
- getProgress() -
Method in class org.apache.hadoop.mapred.KeyValueLineRecordReader
-
- getProgress() -
Method in class org.apache.hadoop.mapred.LineRecordReader
- Get the progress within the split
- getProgress() -
Method in interface org.apache.hadoop.mapred.RecordReader
- How much of the input has the
RecordReader
consumed i.e.
- getProgress() -
Method in class org.apache.hadoop.mapred.SequenceFileAsTextRecordReader
-
- getProgress() -
Method in class org.apache.hadoop.mapred.SequenceFileRecordReader
- Return the progress within the input split
- getProgress() -
Method in class org.apache.hadoop.mapred.TaskReport
- The amount completed, between zero and one.
- getProgress() -
Method in class org.apache.hadoop.streaming.StreamBaseRecordReader
-
- getProtocolVersion(String, long) -
Method in class org.apache.hadoop.dfs.NameNode
-
- getProtocolVersion(String, long) -
Method in class org.apache.hadoop.hbase.HMaster
- Return protocol version corresponding to protocol interface.
- getProtocolVersion(String, long) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Return protocol version corresponding to protocol interface.
- getProtocolVersion(String, long) -
Method in interface org.apache.hadoop.ipc.VersionedProtocol
- Return protocol version corresponding to protocol interface.
- getProtocolVersion(String, long) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getProtocolVersion(String, long) -
Method in class org.apache.hadoop.mapred.TaskTracker
-
- getProxy(Class<?>, long, InetSocketAddress, Configuration, SocketFactory) -
Static method in class org.apache.hadoop.ipc.RPC
- Construct a client-side proxy object that implements the named protocol,
talking to a server at the named address.
- getProxy(Class<?>, long, InetSocketAddress, Configuration) -
Static method in class org.apache.hadoop.ipc.RPC
- Construct a client-side proxy object with the default SocketFactory
- getQueryStr() -
Method in class org.apache.hadoop.hbase.shell.generated.Parser
-
- getRaw(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get the value of the
name
property, without doing
variable expansion.
- getRawCapacity() -
Method in class org.apache.hadoop.dfs.ChecksumDistributedFileSystem
- Return the total raw capacity of the filesystem, disregarding
replication.
- getRawCapacity() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Return the total raw capacity of the filesystem, disregarding
replication.
- getRawFileSystem() -
Method in class org.apache.hadoop.fs.ChecksumFileSystem
- Get the raw file system
- getRawUsed() -
Method in class org.apache.hadoop.dfs.ChecksumDistributedFileSystem
- Return the total raw used space in the filesystem, disregarding
replication.
- getRawUsed() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
- Return the total raw used space in the filesystem, disregarding
replication.
- getReader(FileSystem, Filter) -
Method in class org.apache.hadoop.hbase.HStoreFile
- Get reader for the store file map file.
- getReaders(FileSystem, Path, Configuration) -
Static method in class org.apache.hadoop.mapred.MapFileOutputFormat
- Open the output generated by this format.
- getReaders(Configuration, Path) -
Static method in class org.apache.hadoop.mapred.SequenceFileOutputFormat
- Open the output generated by this format.
- getReadyJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- getRecordName() -
Method in interface org.apache.hadoop.metrics.MetricsRecord
- Returns the record name.
- getRecordName() -
Method in class org.apache.hadoop.metrics.spi.MetricsRecordImpl
- Returns the record name.
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.hbase.mapred.TableInputFormat
- Get the
RecordReader
for the given InputSplit
.
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.FileInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in interface org.apache.hadoop.mapred.InputFormat
- Get the
RecordReader
for the given InputSplit
.
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.KeyValueTextInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.MultiFileInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.SequenceFileAsTextInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.SequenceFileInputFilter
- Create a record reader for the given split
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.SequenceFileInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.mapred.TextInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) -
Method in class org.apache.hadoop.streaming.StreamInputFormat
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.hbase.mapred.IndexOutputFormat
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.hbase.mapred.TableOutputFormat
- Get the
RecordWriter
for the given job.
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.mapred.lib.NullOutputFormat
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.mapred.MapFileOutputFormat
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in interface org.apache.hadoop.mapred.OutputFormat
- Get the
RecordWriter
for the given job.
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.mapred.OutputFormatBase
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.mapred.SequenceFileOutputFormat
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) -
Method in class org.apache.hadoop.mapred.TextOutputFormat
-
- getReducerClass() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the
Reducer
class for the job.
- getReduceTaskReports(String) -
Method in class org.apache.hadoop.mapred.JobClient
- Get the information of the current state of the reduce tasks of a job.
- getReduceTaskReports(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Grab a bunch of info on the reduce tasks that make up the job
- getReduceTaskReports(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getReduceTasks() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Get the number of currently running reduce tasks in the cluster.
- getRegion(Text) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Protected utility method for safely obtaining an HRegion handle.
- getRegion(Text, boolean) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Protected utility method for safely obtaining an HRegion handle.
- getRegionDir() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getRegionDir(Path, Text) -
Static method in class org.apache.hadoop.hbase.HRegion
- Computes the Path of the HRegion
- getRegionId() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getRegionId() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getRegionInfo() -
Method in class org.apache.hadoop.hbase.HMsg
- Accessor
- getRegionInfo() -
Method in class org.apache.hadoop.hbase.HRegion
- Returns a HRegionInfo object for this region
- getRegionInfo(Text) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Get metainfo about an HRegion
- getRegionInfo() -
Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getRegionInfo(Text) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Get metainfo about an HRegion
- getRegionName() -
Method in class org.apache.hadoop.hbase.HMaster.MetaRegion
-
- getRegionName() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getRegionName() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getRegionsToCheck() -
Method in class org.apache.hadoop.hbase.HRegionServer
-
- getRemaining() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- The raw free space.
- getRemaining() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem.DiskStatus
-
- getRemainingArgs() -
Method in class org.apache.hadoop.util.GenericOptionsParser
- Returns an array of Strings containing only application-specific arguments.
- getRemoteAddress() -
Static method in class org.apache.hadoop.ipc.Server
- Returns remote address as a string when invoked inside an RPC.
- getRemoteIp() -
Static method in class org.apache.hadoop.ipc.Server
- Returns the remote side IP address when invoked inside an RPC;
returns null in case of an error.
- getReplication() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the intended replication factor, against which the over/under-
replicated blocks are counted.
- getReplication() -
Method in class org.apache.hadoop.fs.FileStatus
- Get the replication factor of a file.
- getReplication(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Deprecated. Use getFileStatus() instead
- getReplication(Path) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
- Deprecated.
- getReplicationFactor() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the actual replication factor.
- getReport() -
Method in class org.apache.hadoop.contrib.utils.join.JobBase
- Log the counters
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.DoubleValueSum
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMax
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMin
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueSum
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMax
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMin
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.UniqValueCount
-
- getReport() -
Method in interface org.apache.hadoop.mapred.lib.aggregate.ValueAggregator
-
- getReport() -
Method in class org.apache.hadoop.mapred.lib.aggregate.ValueHistogram
-
- getReportDetails() -
Method in class org.apache.hadoop.mapred.lib.aggregate.ValueHistogram
-
- getReportItems() -
Method in class org.apache.hadoop.mapred.lib.aggregate.ValueHistogram
-
- getRequestCount() -
Method in class org.apache.hadoop.hbase.HRegionServer
-
- getResource(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get the
URL
for the named resource.
- getRevision() -
Static method in class org.apache.hadoop.util.VersionInfo
- Get the subversion revision number for the root directory
- getRootDir() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getRootDir() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getRootRegionLocation() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getRotations() -
Method in class org.apache.hadoop.examples.dancing.Pentomino.Piece
-
- getRow(Text, Text) -
Method in interface org.apache.hadoop.hbase.HRegionInterface
- Get all the data for the specified row
- getRow(Text, Text) -
Method in class org.apache.hadoop.hbase.HRegionServer
- Get all the data for the specified row
- getRow() -
Method in class org.apache.hadoop.hbase.HStoreKey
-
- getRow(Text) -
Method in class org.apache.hadoop.hbase.HTable
- Get all the data for the specified row
- getRow() -
Method in class org.apache.hadoop.hbase.io.BatchUpdate
-
- getRow() -
Method in class org.apache.hadoop.hbase.shell.InsertCommand
-
- getRowkeyName() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getRunnable() -
Method in class org.apache.hadoop.util.Daemon
-
- getRunningJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- getRunningJobs() -
Method in class org.apache.hadoop.mapred.JobTracker
- Version that is called from a timer thread, and therefore needs to be
careful to synchronize.
- getRunState() -
Method in class org.apache.hadoop.mapred.JobStatus
-
- getSafeModeText() -
Method in class org.apache.hadoop.dfs.JspHelper
-
- getScanner(Text[], Text, long, RowFilterInterface) -
Method in class org.apache.hadoop.hbase.HRegion
- Return an iterator that scans over the HRegion, returning the indicated
columns for only the rows that match the data filter.
- getSerializedLength() -
Method in class org.apache.hadoop.fs.s3.INode
-
- getServer() -
Method in class org.apache.hadoop.hbase.HMaster.MetaRegion
-
- getServer(Object, String, int, Configuration) -
Static method in class org.apache.hadoop.ipc.RPC
- Construct a server for a protocol implementation instance listening on a
port and address.
- getServer(Object, String, int, int, boolean, Configuration) -
Static method in class org.apache.hadoop.ipc.RPC
- Construct a server for a protocol implementation instance listening on a
port and address.
- getServerAddress() -
Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getServerAddress() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getServerInfo() -
Method in class org.apache.hadoop.hbase.HRegionServer
-
- getServersToLoad() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getServersToServerInfo() -
Method in class org.apache.hadoop.hbase.HMaster
-
- getServerVersion() -
Method in exception org.apache.hadoop.ipc.RPC.VersionMismatch
- Get the server's agreed to version.
- getSessionId() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the user-specified session identifier.
- getShape(boolean, int) -
Method in class org.apache.hadoop.examples.dancing.Pentomino.Piece
-
- getShuffleFinishTime() -
Method in class org.apache.hadoop.mapred.MapTaskStatus
-
- getShuffleFinishTime() -
Method in class org.apache.hadoop.mapred.ReduceTaskStatus
-
- getSimilarityName() -
Method in class org.apache.hadoop.hbase.mapred.IndexConfiguration
-
- getSize() -
Method in class org.apache.hadoop.hbase.HMemcache
-
- getSize() -
Method in class org.apache.hadoop.hbase.HStoreKey
-
- getSize() -
Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- getSize() -
Method in class org.apache.hadoop.io.BytesWritable
- Get the current size of the buffer.
- getSocketFactory(Configuration, Class<?>) -
Static method in class org.apache.hadoop.net.NetUtils
- Get the socket factory for the given class according to its
configuration parameter
hadoop.rpc.socket.factory.class.<ClassName>.
- getSocketFactoryFromProperty(Configuration, String) -
Static method in class org.apache.hadoop.net.NetUtils
- Get the socket factory corresponding to the given proxy URI.
- getSortFinishTime() -
Method in class org.apache.hadoop.mapred.MapTaskStatus
-
- getSortFinishTime() -
Method in class org.apache.hadoop.mapred.ReduceTaskStatus
-
- getSpace(int) -
Static method in class org.apache.hadoop.streaming.StreamUtil
-
- getSpeculativeExecution() -
Method in class org.apache.hadoop.mapred.JobConf
- Should speculative execution be used for this job?
Defaults to
true
.
- getSplits(int) -
Method in class org.apache.hadoop.examples.dancing.Pentomino
- Generate a list of prefixes to a given depth
- getSplits(JobConf, int) -
Method in class org.apache.hadoop.hbase.mapred.TableInputFormat
- A split will be created for each HRegion of the input table
- getSplits(JobConf, int) -
Method in class org.apache.hadoop.mapred.FileInputFormat
- Splits files returned by
FileInputFormat.listPaths(JobConf)
when
they're too big.
- getSplits(JobConf, int) -
Method in interface org.apache.hadoop.mapred.InputFormat
- Logically split the set of input files for the job.
- getSplits(JobConf, int) -
Method in class org.apache.hadoop.mapred.MultiFileInputFormat
-
- getStart() -
Method in class org.apache.hadoop.mapred.FileSplit
- The position of the first byte in the file to process.
- getStartCode() -
Method in class org.apache.hadoop.hbase.HServerInfo
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.HMaster.MetaRegion
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getStartKey() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getStartKeys() -
Method in class org.apache.hadoop.hbase.HTable
- Gets the starting row key for every region in the currently open table
- getStartRow() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getStartTime() -
Method in class org.apache.hadoop.mapred.JobStatus
-
- getStartTime() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getStartTime() -
Method in class org.apache.hadoop.mapred.TaskReport
- Get start time of task.
- getState() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
-
- getState() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- getState() -
Method in class org.apache.hadoop.mapred.TaskReport
- The most recent state, reported by a
Reporter
.
- getStats() -
Method in class org.apache.hadoop.dfs.NameNode
-
- getStatusText(boolean) -
Method in class org.apache.hadoop.dfs.UpgradeStatusReport
- Get upgradeStatus data as text for reporting.
- getStopRowKey() -
Method in class org.apache.hadoop.hbase.filter.StopRowFilter
- An accessor for the stopRowKey
- getStorageID() -
Method in class org.apache.hadoop.dfs.DatanodeID
-
- getStringLiteral() -
Method in class org.apache.hadoop.hbase.shell.generated.Parser
-
- getStrings(String) -
Method in class org.apache.hadoop.conf.Configuration
- Get the comma delimited values of the
name
property as
an array of String
s.
- getStrings(String) -
Static method in class org.apache.hadoop.util.StringUtils
- Returns an ArrayList of strings
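Configuration.getStrings and StringUtils.getStrings above both turn a comma-delimited property value into an array. A hedged sketch of that splitting, assuming tokenization on "," with whitespace trimmed (the class name SplitSketch is ours; Hadoop's exact null/empty handling may differ):

```java
import java.util.ArrayList;
import java.util.StringTokenizer;

// Sketch of comma-delimited value splitting as described in the index:
// tokenize the raw property value on "," and trim whitespace per token.
public class SplitSketch {
    public static String[] getStrings(String str) {
        if (str == null) {
            return null;               // assumption: null in, null out
        }
        StringTokenizer tokenizer = new StringTokenizer(str, ",");
        ArrayList<String> values = new ArrayList<String>();
        while (tokenizer.hasMoreTokens()) {
            values.add(tokenizer.nextToken().trim());
        }
        return values.toArray(new String[0]);
    }

    public static void main(String[] args) {
        for (String s : getStrings("a, b ,c")) {
            System.out.println(s);
        }
    }
}
```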
- getSuccessfulJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- GetSuffix(int) -
Method in class org.apache.hadoop.hbase.shell.generated.SimpleCharStream
-
- GetSuffix(int) -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getSum() -
Method in class org.apache.hadoop.mapred.lib.aggregate.DoubleValueSum
-
- getSum() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueSum
-
- getSymlink(Configuration) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- This method checks to see if symlinks are to be created for the
localized cache files in the current working directory
- getSystemDir() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the system directory where job-specific files are to be placed.
- getTableDesc() -
Method in class org.apache.hadoop.hbase.HRegion
-
- getTableDesc() -
Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getTableName() -
Method in class org.apache.hadoop.hbase.HTable
-
- getTableName() -
Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getTableNameFromRegionName(Text) -
Static method in class org.apache.hadoop.hbase.HRegionInfo
- Extracts table name prefix from a region name.
- getTableServers(Text) -
Method in interface org.apache.hadoop.hbase.HConnection
- Gets the servers of the given table.
- getTabSize(int) -
Method in class org.apache.hadoop.hbase.shell.generated.SimpleCharStream
-
- getTabSize(int) -
Method in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- getTag() -
Method in class org.apache.hadoop.contrib.utils.join.TaggedMapOutput
-
- getTag(String) -
Method in class org.apache.hadoop.metrics.spi.OutputRecord
- Returns a tag object, which can be a String, Integer, Short or Byte.
- getTagNames() -
Method in class org.apache.hadoop.metrics.spi.OutputRecord
- Returns the set of tag names
- getTask(String) -
Method in class org.apache.hadoop.mapred.TaskTracker
- Called upon startup by the child process, to fetch Task data.
- getTaskAttempts() -
Method in class org.apache.hadoop.mapred.JobHistory.Task
- Returns all task attempts for this task.
- getTaskCompletionEvents(String, int, int) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Get task completion events for the jobid, starting from fromEventId.
- getTaskCompletionEvents(String, int, int) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getTaskCompletionEvents(int) -
Method in interface org.apache.hadoop.mapred.RunningJob
- Get events indicating completion (success/failure) of component tasks.
- getTaskDiagnostics(String, String, String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Get the diagnostics for a given task in a given job
- getTaskDiagnostics(String, String, String) -
Method in class org.apache.hadoop.mapred.JobTracker
- Get the diagnostics for a given task
- getTaskId() -
Method in class org.apache.hadoop.mapred.TaskCompletionEvent
- Returns task id.
- getTaskId() -
Method in class org.apache.hadoop.mapred.TaskLogAppender
- Getter/Setter methods for log4j.
- getTaskId() -
Method in class org.apache.hadoop.mapred.TaskReport
- The id of the task.
- getTaskInfo(JobConf) -
Static method in class org.apache.hadoop.streaming.StreamUtil
-
- getTaskLogFile(String, TaskLog.LogName) -
Static method in class org.apache.hadoop.mapred.TaskLog
-
- getTaskLogLength(JobConf) -
Static method in class org.apache.hadoop.mapred.TaskLog
- Get the desired maximum length of task's logs.
- getTaskOutputFilter(JobConf) -
Static method in class org.apache.hadoop.mapred.JobClient
- Get the task output filter out of the JobConf.
- getTaskOutputFilter() -
Method in class org.apache.hadoop.mapred.JobClient
- Deprecated.
- getTaskStatus() -
Method in class org.apache.hadoop.mapred.TaskCompletionEvent
- Returns enum Status.SUCCESS or Status.FAILURE.
- getTaskTracker(String) -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getTaskTrackerHttp() -
Method in class org.apache.hadoop.mapred.TaskCompletionEvent
- http location of the tasktracker where this task ran.
- getTaskTrackerMetrics() -
Method in class org.apache.hadoop.mapred.TaskTracker
-
- getTaskTrackerReportAddress() -
Method in class org.apache.hadoop.mapred.TaskTracker
- Return the port to which the tasktracker is bound
- getTaskTrackers() -
Method in class org.apache.hadoop.mapred.ClusterStatus
- Get the number of task trackers in the cluster.
- getTimestamp(Configuration, URI) -
Static method in class org.apache.hadoop.filecache.DistributedCache
- Returns mtime of a given cache file on hdfs.
- getTimestamp() -
Method in class org.apache.hadoop.hbase.HLogEdit
-
- getTimestamp() -
Method in class org.apache.hadoop.hbase.HStoreKey
-
- getToken(int) -
Method in class org.apache.hadoop.hbase.shell.generated.Parser
-
- getToken(int) -
Method in class org.apache.hadoop.record.compiler.generated.Rcc
-
- getTotalBlocks() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the total number of blocks in the scanned area.
- getTotalDirs() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return total number of directories encountered during this scan.
- getTotalFiles() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return total number of files encountered during this scan.
- getTotalLogFileSize() -
Method in class org.apache.hadoop.mapred.TaskLogAppender
-
- getTotalSize() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return total size of scanned data, in bytes.
- getTotalSubmissions() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getTrackerIdentifier() -
Method in class org.apache.hadoop.mapred.JobTracker
- Get the unique identifier (i.e.
- getTrackerPort() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- getTrackingURL() -
Method in interface org.apache.hadoop.mapred.RunningJob
- Get the URL where some job progress information will be displayed.
- getType() -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Get the type of the software component, which this object is upgrading.
- getType() -
Method in class org.apache.hadoop.hbase.shell.ReturnMsg
-
- getTypes() -
Method in class org.apache.hadoop.io.GenericWritable
- Return all classes that may be wrapped.
- getUnderReplicatedBlocks() -
Method in class org.apache.hadoop.dfs.NamenodeFsck.FsckResult
- Return the number of under-replicated blocks.
- getUniqueItems() -
Method in class org.apache.hadoop.mapred.lib.aggregate.UniqValueCount
-
- getUpgradeStatus() -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Upgrade status determines a percentage of the work done out of the total
amount required by the upgrade.
- getUpgradeStatus() -
Method in class org.apache.hadoop.dfs.UpgradeStatusReport
- Get upgradeStatus as a percentage of the total upgrade done.
- getUpgradeStatusReport(boolean) -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Get status report for the upgrade.
- getUri() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getUri() -
Method in class org.apache.hadoop.dfs.HftpFileSystem
-
- getUri() -
Method in class org.apache.hadoop.fs.FileSystem
- Returns a URI whose scheme and authority identify this FileSystem.
- getUri() -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Returns a URI whose scheme and authority identify this FileSystem.
- getUri() -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getUri() -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
-
- getUri() -
Method in class org.apache.hadoop.fs.s3.S3FileSystem
-
- getURIs(String, String) -
Method in class org.apache.hadoop.streaming.StreamJob
- Get the URIs of all the files/caches
- getURL() -
Method in class org.apache.hadoop.mapred.JobProfile
- Get the link to the web-ui for details of the job.
- getUrl() -
Static method in class org.apache.hadoop.util.VersionInfo
- Get the subversion URL for the root Hadoop directory.
- getUsed() -
Method in class org.apache.hadoop.fs.DF
-
- getUsed() -
Method in class org.apache.hadoop.fs.DU
-
- getUsed() -
Method in class org.apache.hadoop.fs.FileSystem
- Return the total size of all files in the filesystem.
- getUser() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the reported username for this job.
- getUser() -
Method in class org.apache.hadoop.mapred.JobProfile
- Get the user id.
- getUser() -
Static method in class org.apache.hadoop.util.VersionInfo
- The user that compiled Hadoop.
- getUsername() -
Method in class org.apache.hadoop.mapred.JobStatus
-
- getVal() -
Method in class org.apache.hadoop.hbase.HLogEdit
-
- getVal() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMax
-
- getVal() -
Method in class org.apache.hadoop.mapred.lib.aggregate.LongValueMin
-
- getVal() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMax
-
- getVal() -
Method in class org.apache.hadoop.mapred.lib.aggregate.StringValueMin
-
- getValue() -
Method in class org.apache.hadoop.dfs.DataChecksum
-
- getValue() -
Method in class org.apache.hadoop.hbase.io.BatchOperation
-
- getValue(int) -
Method in class org.apache.hadoop.hbase.shell.InsertCommand
-
- getValue() -
Method in interface org.apache.hadoop.io.SequenceFile.Sorter.RawKeyValueIterator
- Gets the current raw value
- getValueClass() -
Method in class org.apache.hadoop.io.ArrayWritable
-
- getValueClass() -
Method in class org.apache.hadoop.io.MapFile.Reader
- Returns the class of values in this file.
- getValueClass() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns the class of values in this file.
- getValueClass() -
Method in class org.apache.hadoop.io.SequenceFile.Writer
- Returns the class of values in this file.
- getValueClass() -
Method in class org.apache.hadoop.mapred.SequenceFileRecordReader
- The class of value that must be passed to
SequenceFileRecordReader.next(WritableComparable,Writable).
- getVersion() -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Get the layout version of the upgrade object.
- getVersion() -
Method in class org.apache.hadoop.dfs.UpgradeStatusReport
- Get the layout version of the currently running upgrade.
- getVersion() -
Method in interface org.apache.hadoop.fs.s3.FileSystemStore
-
- getVersion() -
Method in class org.apache.hadoop.io.VersionedWritable
- Return the version number of the current implementation.
- getVersion() -
Static method in class org.apache.hadoop.util.VersionInfo
- Get the Hadoop version.
- getVIntSize(long) -
Static method in class org.apache.hadoop.io.WritableUtils
- Get the encoded length of an integer stored in a variable-length format.
- getVIntSize(long) -
Static method in class org.apache.hadoop.record.Utils
- Get the encoded length of an integer stored in a variable-length format.
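The getVIntSize entries above share one rule: values in [-112, 127] fit in a single byte, and everything else takes one length/sign byte plus the bytes needed for the magnitude. A plain-Java sketch of that rule (the constants follow WritableUtils' zero-compressed format; treat this as an illustration, not the library source):

```java
// Sketch of the variable-length integer size rule used by
// org.apache.hadoop.io.WritableUtils (zero-compressed format).
public class VIntSize {
    public static int getVIntSize(long i) {
        // Small values are stored directly in a single byte.
        if (i >= -112 && i <= 127) {
            return 1;
        }
        // Negative values are stored as the complement of their magnitude.
        if (i < 0) {
            i ^= -1L;
        }
        // One length/sign byte plus the bytes needed for the magnitude.
        int dataBits = 64 - Long.numberOfLeadingZeros(i);
        return (dataBits + 7) / 8 + 1;
    }

    public static void main(String[] args) {
        System.out.println(getVIntSize(0));              // 1
        System.out.println(getVIntSize(128));            // 2
        System.out.println(getVIntSize(Long.MAX_VALUE)); // 9
    }
}
```

The maximum encoded size is 9 bytes: one header byte plus the 8 bytes of a full long.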
- getWaitingJobs() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
-
- getWarn() -
Static method in class org.apache.hadoop.metrics.jvm.EventCounter
-
- getWeight() -
Method in class org.onelab.filter.Key
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.dfs.DistributedFileSystem
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.dfs.HftpFileSystem
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.fs.FileSystem
- Get the current working directory for the given file system
- getWorkingDirectory() -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Get the current working directory for the given file system
- getWorkingDirectory() -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.fs.s3.S3FileSystem
-
- getWorkingDirectory() -
Method in class org.apache.hadoop.mapred.JobConf
- Get the current working directory for the default file system.
- getWritable(byte[], Writable) -
Static method in class org.apache.hadoop.hbase.util.Writables
- Set bytes into the passed Writable by calling its
Writable.readFields(java.io.DataInput)
.
- getWriter(FileSystem, SequenceFile.CompressionType, Filter) -
Method in class org.apache.hadoop.hbase.HStoreFile
- Get a store file writer.
- getXceiverCount() -
Method in class org.apache.hadoop.dfs.DatanodeInfo
- Number of active connections.
- getZlibCompressor(Configuration) -
Static method in class org.apache.hadoop.io.compress.zlib.ZlibFactory
- Return the appropriate implementation of the zlib compressor.
- getZlibCompressorType(Configuration) -
Static method in class org.apache.hadoop.io.compress.zlib.ZlibFactory
- Return the appropriate type of the zlib compressor.
- getZlibDecompressor(Configuration) -
Static method in class org.apache.hadoop.io.compress.zlib.ZlibFactory
- Return the appropriate implementation of the zlib decompressor.
- getZlibDecompressorType(Configuration) -
Static method in class org.apache.hadoop.io.compress.zlib.ZlibFactory
- Return the appropriate type of the zlib decompressor.
- globPaths(Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Return all the files that match filePattern and are not checksum
files.
- globPaths(Path, PathFilter) -
Method in class org.apache.hadoop.fs.FileSystem
- Glob all the file names that match filePattern
and are accepted by filter.
- go() -
Method in class org.apache.hadoop.streaming.StreamJob
- This is the method that actually
initializes the job conf and submits the job
to the JobTracker.
- goodClassOrNull(String, String) -
Static method in class org.apache.hadoop.streaming.StreamUtil
- It may seem strange to silently switch behaviour when a String
is not a classname; the reason is simplified Usage:
- Grep - Class in org.apache.hadoop.examples
-
- GROUP_COLUMNS -
Static variable in class org.apache.hadoop.hbase.mapred.GroupingTableMap
- JobConf parameter to specify the columns used to produce the key passed to
collect from the map phase
- GroupingTableMap - Class in org.apache.hadoop.hbase.mapred
- Extracts grouping columns from an input record.
- GroupingTableMap() -
Constructor for class org.apache.hadoop.hbase.mapred.GroupingTableMap
- default constructor
- GT_TKN -
Static variable in interface org.apache.hadoop.record.compiler.generated.RccConstants
-
- GzipCodec - Class in org.apache.hadoop.io.compress
- This class creates gzip compressors/decompressors.
- GzipCodec() -
Constructor for class org.apache.hadoop.io.compress.GzipCodec
-
- GzipCodec.GzipInputStream - Class in org.apache.hadoop.io.compress
-
- GzipCodec.GzipInputStream(InputStream) -
Constructor for class org.apache.hadoop.io.compress.GzipCodec.GzipInputStream
-
- GzipCodec.GzipInputStream(DecompressorStream) -
Constructor for class org.apache.hadoop.io.compress.GzipCodec.GzipInputStream
- Allow subclasses to directly set the inflater stream.
- GzipCodec.GzipOutputStream - Class in org.apache.hadoop.io.compress
- A bridge that wraps around a DeflaterOutputStream to make it
a CompressionOutputStream.
- GzipCodec.GzipOutputStream(OutputStream) -
Constructor for class org.apache.hadoop.io.compress.GzipCodec.GzipOutputStream
-
- GzipCodec.GzipOutputStream(CompressorStream) -
Constructor for class org.apache.hadoop.io.compress.GzipCodec.GzipOutputStream
- Allow children types to put a different type in here.
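The GzipCodec entries above describe a bridge from java.util.zip's deflater streams to Hadoop's CompressionOutputStream API. The underlying compress/decompress roundtrip can be sketched with the JDK alone (no Hadoop classes; this illustrates the wrapped streams, not the codec itself):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    // Compress a byte array into gzip format.
    public static byte[] compress(byte[] data) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        }
        return bos.toByteArray();
    }

    // Decompress a gzip-format byte array back to the original bytes.
    public static byte[] decompress(byte[] gzipped) throws Exception {
        try (GZIPInputStream gz =
                 new GZIPInputStream(new ByteArrayInputStream(gzipped))) {
            return gz.readAllBytes();
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] original = "hello, hadoop".getBytes(StandardCharsets.UTF_8);
        byte[] back = decompress(compress(original));
        System.out.println(new String(back, StandardCharsets.UTF_8)); // hello, hadoop
    }
}
```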
- sleeper -
Variable in class org.apache.hadoop.hbase.HMaster
-
- Sleeper - Class in org.apache.hadoop.hbase.util
- Sleeper for current thread.
- Sleeper(int, AtomicBoolean) -
Constructor for class org.apache.hadoop.hbase.util.Sleeper
-
- SleepJob - Class in org.apache.hadoop.examples
- Dummy class for testing the MR framework.
- SleepJob() -
Constructor for class org.apache.hadoop.examples.SleepJob
-
- SocksSocketFactory - Class in org.apache.hadoop.net
- Specialized SocketFactory to create sockets with a SOCKS proxy
- SocksSocketFactory() -
Constructor for class org.apache.hadoop.net.SocksSocketFactory
- Default empty constructor (for use with the reflection API).
- SocksSocketFactory(Proxy) -
Constructor for class org.apache.hadoop.net.SocksSocketFactory
- Constructor with a supplied Proxy
- solution(List<List<ColumnName>>) -
Method in interface org.apache.hadoop.examples.dancing.DancingLinks.SolutionAcceptor
- A callback to return a solution to the application.
- solve(int[], DancingLinks.SolutionAcceptor<ColumnName>) -
Method in class org.apache.hadoop.examples.dancing.DancingLinks
- Given a prefix, find solutions under it.
- solve(DancingLinks.SolutionAcceptor<ColumnName>) -
Method in class org.apache.hadoop.examples.dancing.DancingLinks
- Solve a complete problem
- solve(int[]) -
Method in class org.apache.hadoop.examples.dancing.Pentomino
- Find all of the solutions that start with the given prefix.
- solve() -
Method in class org.apache.hadoop.examples.dancing.Pentomino
- Find all of the solutions to the puzzle.
- solve() -
Method in class org.apache.hadoop.examples.dancing.Sudoku
-
- Sort - Class in org.apache.hadoop.examples
- This is the trivial map/reduce program that does absolutely nothing
other than use the framework to fragment and sort the input values.
- Sort() -
Constructor for class org.apache.hadoop.examples.Sort
-
- sort(Path[], Path, boolean) -
Method in class org.apache.hadoop.io.SequenceFile.Sorter
- Perform a file sort from a set of input files into an output file.
- sort(Path, Path) -
Method in class org.apache.hadoop.io.SequenceFile.Sorter
- The backwards compatible interface to sort.
- sortAndIterate(Path[], Path, boolean) -
Method in class org.apache.hadoop.io.SequenceFile.Sorter
- Perform a file sort from a set of input files and return an iterator.
- sortedLeases -
Variable in class org.apache.hadoop.hbase.Leases
-
- SortedMapWritable - Class in org.apache.hadoop.io
- A Writable SortedMap.
- SortedMapWritable() -
Constructor for class org.apache.hadoop.io.SortedMapWritable
- default constructor.
- SortedMapWritable(SortedMapWritable) -
Constructor for class org.apache.hadoop.io.SortedMapWritable
- Copy constructor.
- sortNodeList(ArrayList<DatanodeDescriptor>, String, String) -
Method in class org.apache.hadoop.dfs.JspHelper
-
- SOURCE_TAGS_FIELD -
Static variable in class org.apache.hadoop.contrib.utils.join.DataJoinReducerBase
-
- specialConstructor -
Variable in exception org.apache.hadoop.hbase.shell.generated.ParseException
- This variable determines which constructor was used to create
this object and thereby affects the semantics of the
"getMessage" method (see below).
- specialConstructor -
Variable in exception org.apache.hadoop.record.compiler.generated.ParseException
- This variable determines which constructor was used to create
this object and thereby affects the semantics of the
"getMessage" method (see below).
- specialToken -
Variable in class org.apache.hadoop.hbase.shell.generated.Token
- This field is used to access special tokens that occur prior to this
token, but after the immediately preceding regular (non-special) token.
- specialToken -
Variable in class org.apache.hadoop.record.compiler.generated.Token
- This field is used to access special tokens that occur prior to this
token, but after the immediately preceding regular (non-special) token.
- split(int) -
Method in class org.apache.hadoop.examples.dancing.DancingLinks
- Generate a list of row choices to cover the first moves.
- splitKeyVal(byte[], int, int, Text, Text, int) -
Static method in class org.apache.hadoop.streaming.UTF8ByteArrayUtils
- Split a UTF-8 byte array into key and value,
assuming the delimiter is at splitpos.
- splitKeyVal(byte[], Text, Text, int) -
Static method in class org.apache.hadoop.streaming.UTF8ByteArrayUtils
- Split a UTF-8 byte array into key and value,
assuming the delimiter is at splitpos.
- splitOrCompactLock -
Variable in class org.apache.hadoop.hbase.HRegionServer
-
- StandardSocketFactory - Class in org.apache.hadoop.net
- Specialized SocketFactory to create standard (direct, no-proxy) sockets.
- StandardSocketFactory() -
Constructor for class org.apache.hadoop.net.StandardSocketFactory
- Default empty constructor (for use with the reflection API).
- start() -
Method in class org.apache.hadoop.hbase.Leases
- Starts the lease monitor
- start() -
Method in class org.apache.hadoop.hbase.util.InfoServer
- Start the server.
- start() -
Method in class org.apache.hadoop.ipc.Server
- Starts the service.
- start() -
Method in class org.apache.hadoop.mapred.StatusHttpServer
- Start the server.
- startBatchUpdate(Text) -
Method in class org.apache.hadoop.hbase.HTable
- Deprecated. Batch operations are now the default; startBatchUpdate is now
implemented by HTable.startUpdate(Text).
- STARTING -
Static variable in interface org.apache.hadoop.hbase.shell.generated.ParserConstants
-
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.fs.ChecksumFileSystem
-
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.fs.FileSystem
- Returns a local File that the user can write output to.
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.fs.FilterFileSystem
- Returns a local File that the user can write output to.
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.fs.kfs.KosmosFileSystem
-
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.fs.RawLocalFileSystem
-
- startLocalOutput(Path, Path) -
Method in class org.apache.hadoop.mapred.PhasedFileSystem
- Deprecated.
- startMap(String) -
Method in class org.apache.hadoop.record.BinaryRecordInput
-
- startMap(TreeMap, String) -
Method in class org.apache.hadoop.record.BinaryRecordOutput
-
- startMap(String) -
Method in class org.apache.hadoop.record.CsvRecordInput
-
- startMap(TreeMap, String) -
Method in class org.apache.hadoop.record.CsvRecordOutput
-
- startMap(String) -
Method in interface org.apache.hadoop.record.RecordInput
- Check the mark for start of the serialized map.
- startMap(TreeMap, String) -
Method in interface org.apache.hadoop.record.RecordOutput
- Mark the start of a map to be serialized.
- startMap(String) -
Method in class org.apache.hadoop.record.XmlRecordInput
-
- startMap(TreeMap, String) -
Method in class org.apache.hadoop.record.XmlRecordOutput
-
- startMonitoring() -
Method in class org.apache.hadoop.metrics.file.FileContext
- Starts or restarts monitoring by opening, in append mode, the
file specified by the fileName attribute, if specified.
- startMonitoring() -
Method in interface org.apache.hadoop.metrics.MetricsContext
- Starts or restarts monitoring, the emitting of metrics records as they are
updated.
- startMonitoring() -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Starts or restarts monitoring, the emitting of metrics records.
- startMonitoring() -
Method in class org.apache.hadoop.metrics.spi.NullContext
- Do-nothing version of startMonitoring
- startNextPhase() -
Method in class org.apache.hadoop.util.Progress
- Called during execution to move to the next phase at this level in the
tree.
- startNotifier() -
Static method in class org.apache.hadoop.mapred.JobEndNotifier
-
- startRecord(String) -
Method in class org.apache.hadoop.record.BinaryRecordInput
-
- startRecord(Record, String) -
Method in class org.apache.hadoop.record.BinaryRecordOutput
-
- startRecord(String) -
Method in class org.apache.hadoop.record.CsvRecordInput
-
- startRecord(Record, String) -
Method in class org.apache.hadoop.record.CsvRecordOutput
-
- startRecord(String) -
Method in interface org.apache.hadoop.record.RecordInput
- Check the mark for start of the serialized record.
- startRecord(Record, String) -
Method in interface org.apache.hadoop.record.RecordOutput
- Mark the start of a record to be serialized.
- startRecord(String) -
Method in class org.apache.hadoop.record.XmlRecordInput
-
- startRecord(Record, String) -
Method in class org.apache.hadoop.record.XmlRecordOutput
-
- startTracker(JobConf) -
Static method in class org.apache.hadoop.mapred.JobTracker
- Start the JobTracker with given configuration.
- startUpdate(Text) -
Method in class org.apache.hadoop.hbase.HRegion
- The caller wants to apply a series of writes to a single row in the
HRegion.
- startUpdate(Text, Text) -
Method in class org.apache.hadoop.hbase.HRegionServer
-
- startUpdate(Text) -
Method in class org.apache.hadoop.hbase.HTable
- Start an atomic row insertion/update.
- startUpdate(Text) -
Method in class org.apache.hadoop.hbase.io.BatchUpdate
- Start a batch row insertion/update.
- startUpgrade() -
Method in interface org.apache.hadoop.dfs.Upgradeable
- Prepare for the upgrade.
- startupShutdownMessage(Class, String[], Log) -
Static method in class org.apache.hadoop.util.StringUtils
- Print a log message for starting up and shutting down
- startVector(String) -
Method in class org.apache.hadoop.record.BinaryRecordInput
-
- startVector(ArrayList, String) -
Method in class org.apache.hadoop.record.BinaryRecordOutput
-
- startVector(String) -
Method in class org.apache.hadoop.record.CsvRecordInput
-
- startVector(ArrayList, String) -
Method in class org.apache.hadoop.record.CsvRecordOutput
-
- startVector(String) -
Method in interface org.apache.hadoop.record.RecordInput
- Check the mark for start of the serialized vector.
- startVector(ArrayList, String) -
Method in interface org.apache.hadoop.record.RecordOutput
- Mark the start of a vector to be serialized.
- startVector(String) -
Method in class org.apache.hadoop.record.XmlRecordInput
-
- startVector(ArrayList, String) -
Method in class org.apache.hadoop.record.XmlRecordOutput
-
- stateChangeLog -
Static variable in class org.apache.hadoop.dfs.NameNode
-
- staticFlag -
Static variable in class org.apache.hadoop.hbase.shell.generated.SimpleCharStream
-
- staticFlag -
Static variable in class org.apache.hadoop.record.compiler.generated.SimpleCharStream
-
- StatusHttpServer - Class in org.apache.hadoop.mapred
- Create a Jetty embedded server to answer http requests.
- StatusHttpServer(String, String, int, boolean) -
Constructor for class org.apache.hadoop.mapred.StatusHttpServer
- Create a status server on the given port.
- StatusHttpServer.StackServlet - Class in org.apache.hadoop.mapred
- A very simple servlet to serve up a text representation of the current
stack traces.
- StatusHttpServer.StackServlet() -
Constructor for class org.apache.hadoop.mapred.StatusHttpServer.StackServlet
-
- StatusHttpServer.TaskGraphServlet - Class in org.apache.hadoop.mapred
- The servlet that outputs svg graphics for map / reduce task
statuses
- StatusHttpServer.TaskGraphServlet() -
Constructor for class org.apache.hadoop.mapred.StatusHttpServer.TaskGraphServlet
-
- statusUpdate(String, TaskStatus) -
Method in class org.apache.hadoop.mapred.TaskTracker
- Called periodically to report Task progress, from 0.0 to 1.0.
- STILL_WAITING -
Static variable in interface org.apache.hadoop.dfs.FSConstants
-
- stop() -
Method in class org.apache.hadoop.dfs.NameNode
- Stop all NameNode threads and wait for all to finish.
- stop -
Variable in class org.apache.hadoop.hbase.Chore
-
- stop -
Variable in class org.apache.hadoop.hbase.Leases
-
- stop() -
Method in class org.apache.hadoop.hbase.util.InfoServer
- Stop the server.
- stop() -
Method in class org.apache.hadoop.ipc.Client
- Stop all threads related to this client.
- stop() -
Method in class org.apache.hadoop.ipc.Server
- Stops the service.
- stop() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
- Set the thread state to STOPPING so that the
thread will stop when it wakes up.
- stop() -
Method in class org.apache.hadoop.mapred.StatusHttpServer
- Stop the server.
- stopClient() -
Static method in class org.apache.hadoop.ipc.RPC
- Stop all RPC client connections
- stopMonitoring() -
Method in class org.apache.hadoop.metrics.file.FileContext
- Stops monitoring, closing the file.
- stopMonitoring() -
Method in interface org.apache.hadoop.metrics.MetricsContext
- Stops monitoring.
- stopMonitoring() -
Method in class org.apache.hadoop.metrics.spi.AbstractMetricsContext
- Stops monitoring.
- stopNotifier() -
Static method in class org.apache.hadoop.mapred.JobEndNotifier
-
- stopRequested -
Variable in class org.apache.hadoop.hbase.HRegionServer
-
- StopRowFilter - Class in org.apache.hadoop.hbase.filter
- Implementation of RowFilterInterface that filters out rows greater than or
equal to a specified rowKey.
- StopRowFilter() -
Constructor for class org.apache.hadoop.hbase.filter.StopRowFilter
- Default constructor, filters nothing.
- StopRowFilter(Text) -
Constructor for class org.apache.hadoop.hbase.filter.StopRowFilter
- Constructor that takes a stopRowKey on which to filter
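The StopRowFilter entries above describe a filter that rejects rows greater than or equal to a given stop key. A minimal sketch of that comparison over plain Strings (the real class operates on Hadoop Text row keys; the class name here is hypothetical):

```java
public class StopRowFilterSketch {
    private final String stopRowKey;

    // A null stop key filters nothing, matching the default constructor.
    public StopRowFilterSketch(String stopRowKey) {
        this.stopRowKey = stopRowKey;
    }

    // Returns true when the row should be filtered out:
    // rows at or past the stop key are rejected.
    public boolean filter(String rowKey) {
        return stopRowKey != null && rowKey.compareTo(stopRowKey) >= 0;
    }

    public static void main(String[] args) {
        StopRowFilterSketch f = new StopRowFilterSketch("row-500");
        System.out.println(f.filter("row-100")); // false (kept)
        System.out.println(f.filter("row-500")); // true  (filtered)
    }
}
```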
- stopTracker() -
Method in class org.apache.hadoop.mapred.JobTracker
-
- storageID -
Variable in class org.apache.hadoop.dfs.DatanodeID
-
- storeBlock(Block, File) -
Method in interface org.apache.hadoop.fs.s3.FileSystemStore
-
- storeINode(Path, INode) -
Method in interface org.apache.hadoop.fs.s3.FileSystemStore
-
- StreamBaseRecordReader - Class in org.apache.hadoop.streaming
- Shared functionality for hadoopStreaming formats.
- StreamBaseRecordReader(FSDataInputStream, FileSplit, Reporter, JobConf, FileSystem) -
Constructor for class org.apache.hadoop.streaming.StreamBaseRecordReader
-
- streamBlockInAscii(InetSocketAddress, long, long, long, long, JspWriter) -
Method in class org.apache.hadoop.dfs.JspHelper
-
- StreamFile - Class in org.apache.hadoop.dfs
-
- StreamFile() -
Constructor for class org.apache.hadoop.dfs.StreamFile
-
- StreamInputFormat - Class in org.apache.hadoop.streaming
- An input format that selects a RecordReader based on a JobConf property.
- StreamInputFormat() -
Constructor for class org.apache.hadoop.streaming.StreamInputFormat
-
- StreamJob - Class in org.apache.hadoop.streaming
- All the client-side work happens here.
- StreamJob(String[], boolean) -
Constructor for class org.apache.hadoop.streaming.StreamJob
-
- StreamLineRecordReader - Class in org.apache.hadoop.streaming
- Deprecated.
- StreamLineRecordReader(Configuration, FileSplit) -
Constructor for class org.apache.hadoop.streaming.StreamLineRecordReader
- Deprecated.
- StreamOutputFormat - Class in org.apache.hadoop.streaming
- Deprecated.
- StreamOutputFormat() -
Constructor for class org.apache.hadoop.streaming.StreamOutputFormat
- Deprecated.
- StreamSequenceRecordReader - Class in org.apache.hadoop.streaming
- Deprecated.
- StreamSequenceRecordReader(Configuration, FileSplit) -
Constructor for class org.apache.hadoop.streaming.StreamSequenceRecordReader
- Deprecated.
- StreamUtil - Class in org.apache.hadoop.streaming
- Utilities not available elsewhere in Hadoop.
- StreamUtil() -
Constructor for class org.apache.hadoop.streaming.StreamUtil
-
- StreamXmlRecordReader - Class in org.apache.hadoop.streaming
- A way to interpret XML fragments as Mapper input records.
- StreamXmlRecordReader(FSDataInputStream, FileSplit, Reporter, JobConf, FileSystem) -
Constructor for class org.apache.hadoop.streaming.StreamXmlRecordReader
-
- STRING_LITERAL -
Static variable in interface org.apache.hadoop.hbase.shell.generated.ParserConstants
-
- STRING_VALUE_MAX -
Static variable in class org.apache.hadoop.mapred.lib.aggregate.ValueAggregatorBaseDescriptor
-
- STRING_VALUE_MIN -
Static variable in class org.apache.hadoop.mapred.lib.aggregate.ValueAggregatorBaseDescriptor
-
- stringifyException(Throwable) -
Static method in class org.apache.hadoop.util.StringUtils
- Make a string representation of the exception.
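stringifyException's job, capturing a full stack trace in a String, can be done with the JDK's StringWriter; a minimal sketch (the Hadoop utility is similar in spirit, though its details may differ):

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class Stringify {
    // Render a throwable's stack trace into a String.
    public static String stringifyException(Throwable t) {
        StringWriter sw = new StringWriter();
        t.printStackTrace(new PrintWriter(sw, true));
        return sw.toString();
    }

    public static void main(String[] args) {
        String s = stringifyException(new IllegalStateException("boom"));
        System.out.println(s.contains("IllegalStateException")); // true
        System.out.println(s.contains("boom"));                  // true
    }
}
```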
- stringifySolution(int, int, List<List<Pentomino.ColumnName>>) -
Static method in class org.apache.hadoop.examples.dancing.Pentomino
- Convert a solution to the puzzle returned by the model into a string
that represents the placement of the pieces onto the board.
- stringToBytes(String) -
Static method in class org.apache.hadoop.hbase.util.Writables
- Converts a string to a byte array in a consistent manner.
- stringToPath(String[]) -
Static method in class org.apache.hadoop.util.StringUtils
-
- stringToURI(String[]) -
Static method in class org.apache.hadoop.util.StringUtils
-
- StringUtils - Class in org.apache.hadoop.util
- General string utilities.
- StringUtils() -
Constructor for class org.apache.hadoop.util.StringUtils
-
- StringValueMax - Class in org.apache.hadoop.mapred.lib.aggregate
- This class implements a value aggregator that maintains the largest of
a sequence of strings.
- StringValueMax() -
Constructor for class org.apache.hadoop.mapred.lib.aggregate.StringValueMax
- the default constructor
- StringValueMin - Class in org.apache.hadoop.mapred.lib.aggregate
- This class implements a value aggregator that maintains the smallest of
a sequence of strings.
- StringValueMin() -
Constructor for class org.apache.hadoop.mapred.lib.aggregate.StringValueMin
- the default constructor
- subMap(WritableComparable, WritableComparable) -
Method in class org.apache.hadoop.io.SortedMapWritable
-
- submit() -
Method in class org.apache.hadoop.mapred.jobcontrol.Job
- Submit this job to mapred.
- submitAndMonitorJob() -
Method in class org.apache.hadoop.streaming.StreamJob
-
- submitJob(String) -
Method in class org.apache.hadoop.mapred.JobClient
- Submit a job to the MR system.
- submitJob(JobConf) -
Method in class org.apache.hadoop.mapred.JobClient
- Submit a job to the MR system.
- submitJob(String) -
Method in interface org.apache.hadoop.mapred.JobSubmissionProtocol
- Submit a Job for execution.
- submitJob(String) -
Method in class org.apache.hadoop.mapred.JobTracker
- JobTracker.submitJob() kicks off a new job.
- submitJob(JobConf) -
Static method in class org.apache.hadoop.mapred.pipes.Submitter
- Submit a job to the map/reduce cluster.
- Submitter - Class in org.apache.hadoop.mapred.pipes
- The main entry point and job submitter.
- Submitter() -
Constructor for class org.apache.hadoop.mapred.pipes.Submitter
-
- SUCCEEDED -
Static variable in class org.apache.hadoop.mapred.JobStatus
-
- SUCCESS -
Static variable in class org.apache.hadoop.mapred.jobcontrol.Job
-
- Sudoku - Class in org.apache.hadoop.examples.dancing
- This class uses the dancing links algorithm from Knuth to solve sudoku
puzzles.
- Sudoku(InputStream) -
Constructor for class org.apache.hadoop.examples.dancing.Sudoku
- Set up a puzzle board to the given size.
- Sudoku.ColumnName - Interface in org.apache.hadoop.examples.dancing
- This interface is a marker class for the columns created for the
Sudoku solver.
- suffix(String) -
Method in class org.apache.hadoop.fs.Path
- Adds a suffix to the final name in the path.
- sum(Counters, Counters) -
Static method in class org.apache.hadoop.mapred.Counters
- Convenience method for computing the sum of two sets of counters.
- suspend() -
Method in class org.apache.hadoop.mapred.jobcontrol.JobControl
- Suspend the running thread.
- SwitchTo(int) -
Method in class org.apache.hadoop.hbase.shell.generated.ParserTokenManager
-
- SwitchTo(int) -
Method in class org.apache.hadoop.record.compiler.generated.RccTokenManager
-
- symLink(String, String) -
Static method in class org.apache.hadoop.fs.FileUtil
- Create a symbolic (soft) link between a src and destination,
on a local disk only.
- sync(long) -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Seek to the next sync mark past a given position.
- sync() -
Method in class org.apache.hadoop.io.SequenceFile.Writer
- Create a sync point.
- SYNC_INTERVAL -
Static variable in class org.apache.hadoop.io.SequenceFile
- The number of bytes between sync points.
- syncSeen() -
Method in class org.apache.hadoop.io.SequenceFile.Reader
- Returns true iff the previous call to next passed a sync mark.