Uses of Interface
org.apache.hadoop.io.Writable

Packages that use Writable
org.apache.hadoop.contrib.utils.join   
org.apache.hadoop.dfs A distributed implementation of FileSystem. 
org.apache.hadoop.examples Hadoop example code. 
org.apache.hadoop.examples.dancing This package is a distributed implementation of Knuth's dancing links algorithm that can run under Hadoop. 
org.apache.hadoop.hbase Provides HBase, the Hadoop simple database. 
org.apache.hadoop.hbase.filter   
org.apache.hadoop.hbase.io   
org.apache.hadoop.hbase.mapred   
org.apache.hadoop.io Generic i/o code for use when reading and writing data to the network, to databases, and to files. 
org.apache.hadoop.ipc Tools to help define network clients and servers. 
org.apache.hadoop.mapred A system for scalable, fault-tolerant, distributed computation over large data collections. 
org.apache.hadoop.mapred.lib Library of generally useful mappers, reducers, and partitioners. 
org.apache.hadoop.mapred.lib.aggregate Classes for performing various counting and aggregations. 
org.apache.hadoop.record Hadoop record I/O contains classes and a record description language translator for simplifying serialization and deserialization of records in a language-neutral manner. 
org.apache.hadoop.streaming   
org.apache.hadoop.tools   
org.apache.hadoop.util Common utilities. 
org.onelab.filter   
 

Uses of Writable in org.apache.hadoop.contrib.utils.join
 

Classes in org.apache.hadoop.contrib.utils.join that implement Writable
 class TaggedMapOutput
          This abstract class serves as the base class for the values that flow from the mappers to the reducers in a data join job.
 

Methods in org.apache.hadoop.contrib.utils.join that return Writable
abstract  Writable TaggedMapOutput.getData()
           
 

Methods in org.apache.hadoop.contrib.utils.join with parameters of type Writable
protected abstract  TaggedMapOutput DataJoinMapperBase.generateTaggedMapOutput(Writable value)
          Generate a tagged map output value.
 void DataJoinReducerBase.map(WritableComparable arg0, Writable arg1, OutputCollector arg2, Reporter arg3)
           
 void DataJoinMapperBase.map(WritableComparable key, Writable value, OutputCollector output, Reporter reporter)
           
 
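Putting the two hooks above together: a data-join mapper returns a TaggedMapOutput subclass from generateTaggedMapOutput(Writable), and the reducer later unwraps it through getData(). The class below is an illustrative sketch only; it assumes the inherited tag field is a protected Text named tag, as in the contrib data-join examples, and wraps a single Text record.

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.contrib.utils.join.TaggedMapOutput;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.Writable;

    /** Minimal TaggedMapOutput carrying one Text record (sketch). */
    public class TaggedTextRecord extends TaggedMapOutput {
      private Text data = new Text();

      public TaggedTextRecord() {
      }

      public TaggedTextRecord(Text data) {
        this.data = data;
      }

      public Writable getData() {
        return data;
      }

      public void write(DataOutput out) throws IOException {
        this.tag.write(out);       // assumed protected Text field inherited from the base class
        data.write(out);
      }

      public void readFields(DataInput in) throws IOException {
        this.tag.readFields(in);
        data.readFields(in);
      }
    }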

Uses of Writable in org.apache.hadoop.dfs
 

Classes in org.apache.hadoop.dfs that implement Writable
 class DatanodeDescriptor
          DatanodeDescriptor tracks stats on a given DataNode, such as available storage capacity, last update time, etc., and maintains a set of blocks stored on the datanode.
 class DatanodeID
          DatanodeID is composed of the data node name (hostname:portNumber) and the ID of the data storage it currently manages.
 class DatanodeInfo
          DatanodeInfo represents the status of a DataNode.
 class LocatedBlocks
          Collection of blocks with their locations and the file length.
 class UpgradeStatusReport
          Base class for reporting the status of a distributed upgrade.
 

Uses of Writable in org.apache.hadoop.examples
 

Methods in org.apache.hadoop.examples with parameters of type Writable
 void WordCount.MapClass.map(WritableComparable key, Writable value, OutputCollector output, Reporter reporter)
           
 void PiEstimator.PiMapper.map(WritableComparable key, Writable val, OutputCollector out, Reporter reporter)
          Map method.
 
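Both map methods above follow the same pattern: convert the incoming Writable value, then emit intermediate pairs through the OutputCollector. A minimal word-count style sketch against the raw-typed signature shown here (the key is ignored and the value is assumed to be a line of Text):

    import java.io.IOException;
    import java.util.StringTokenizer;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.Writable;
    import org.apache.hadoop.io.WritableComparable;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    /** Emits (word, 1) for every token in the input line. */
    public class TokenMapper extends MapReduceBase implements Mapper {
      private static final IntWritable ONE = new IntWritable(1);
      private final Text word = new Text();

      public void map(WritableComparable key, Writable value,
                      OutputCollector output, Reporter reporter) throws IOException {
        StringTokenizer itr = new StringTokenizer(value.toString());
        while (itr.hasMoreTokens()) {
          word.set(itr.nextToken());
          output.collect(word, ONE);   // both key and value are Writables
        }
      }
    }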

Uses of Writable in org.apache.hadoop.examples.dancing
 

Methods in org.apache.hadoop.examples.dancing with parameters of type Writable
 void DistributedPentomino.PentMap.map(WritableComparable key, Writable value, OutputCollector output, Reporter reporter)
          Break the prefix string into moves (a sequence of integer row ids that will be selected for each column in order).
 

Uses of Writable in org.apache.hadoop.hbase
 

Classes in org.apache.hadoop.hbase that implement Writable
 class BloomFilterDescriptor
          Supplied as a parameter to HColumnDescriptor to specify what kind of bloom filter to use for a column, and its configuration parameters.
 class HColumnDescriptor
          An HColumnDescriptor contains information about a column family, such as the number of versions, compression settings, etc.
 class HLogEdit
          A log value.
 class HLogKey
          A Key for an entry in the change log.
 class HMsg
          HMsg is for communicating instructions between the HMaster and the HRegionServers.
 class HRegionInfo
          HRegion information.
 class HServerAddress
          HServerAddress is a "label" for an HBase server that combines the host name and port number.
 class HServerInfo
          HServerInfo contains metadata about an HRegionServer, including details about the source machine and load statistics.
 class HStoreFile
          Each HStore maintains a bunch of different data files.
 class HStoreKey
          A key for a stored row.
 class HTableDescriptor
          HTableDescriptor contains the name of an HTable, and its column families.
 

Uses of Writable in org.apache.hadoop.hbase.filter
 

Subinterfaces of Writable in org.apache.hadoop.hbase.filter
 interface RowFilterInterface
          Interface used for row-level filters applied to HRegion.HScanner scan results during calls to next().
 

Classes in org.apache.hadoop.hbase.filter that implement Writable
 class PageRowFilter
          Implementation of RowFilterInterface that limits results to a specific page size.
 class RegExpRowFilter
          Implementation of RowFilterInterface that can filter by rowkey regular expression and/or individual column values (equals comparison only).
 class RowFilterSet
          Implementation of RowFilterInterface that represents a set of RowFilters which will be evaluated with a specified boolean operator AND/OR.
 

Uses of Writable in org.apache.hadoop.hbase.io
 

Classes in org.apache.hadoop.hbase.io that implement Writable
 class BatchOperation
          A batch update operation.
 class BatchUpdate
          A Writable object that contains a series of BatchOperations. There is one BatchUpdate object per server, so a series of batch operations can result in multiple BatchUpdate objects if the batch contains rows served by multiple region servers.
 class ImmutableBytesWritable
          A byte sequence that is usable as a key or value.
 class KeyedData
          KeyedData is just a data pair.
 class KeyedDataArrayWritable
          Wraps an array of KeyedData items as a Writable.
 

Uses of Writable in org.apache.hadoop.hbase.mapred
 

Classes in org.apache.hadoop.hbase.mapred that implement Writable
 class TableSplit
          A table split corresponds to a key range [low, high).
 

Methods in org.apache.hadoop.hbase.mapred with parameters of type Writable
 void TableMap.map(WritableComparable key, Writable value, OutputCollector output, Reporter reporter)
          Maps a single input record read from an HBase table.
 void TableOutputFormat.TableRecordWriter.write(WritableComparable key, Writable value)
          Expects the key to be of type Text and the value to be of type KeyedDataArrayWritable.
 

Uses of Writable in org.apache.hadoop.io
 

Subinterfaces of Writable in org.apache.hadoop.io
 interface WritableComparable
          An interface which extends both Writable and Comparable.
 

Classes in org.apache.hadoop.io that implement Writable
 class ArrayWritable
          A Writable for arrays containing instances of a class.
 class BooleanWritable
          A WritableComparable for booleans.
 class BytesWritable
          A byte sequence that is usable as a key or value.
 class CompressedWritable
          A base-class for Writables which store themselves compressed and lazily inflate on field access.
 class FloatWritable
          A WritableComparable for floats.
 class GenericWritable
          A wrapper for Writable instances.
 class IntWritable
          A WritableComparable for ints.
 class LongWritable
          A WritableComparable for longs.
 class MD5Hash
          A Writable for MD5 hash values.
 class NullWritable
          Singleton Writable with no data.
 class ObjectWritable
          A polymorphic Writable that writes an instance along with its class name.
static class SequenceFile.Metadata
          The class encapsulating the metadata of a file.
 class Text
          This class stores text using standard UTF8 encoding.
 class TwoDArrayWritable
          A Writable for 2D arrays containing a matrix of instances of a class.
 class UTF8
          Deprecated. Replaced by Text.
 class VersionedWritable
          A base class for Writables that provides version checking.
 class VIntWritable
          A WritableComparable for integer values stored in variable-length format.
 class VLongWritable
          A WritableComparable for longs in a variable-length format.
 
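Every class in this list ultimately implements the same two-method contract from Writable: write(DataOutput) serializes the fields, and readFields(DataInput) reads them back in exactly the same order. A minimal custom Writable, as an illustrative sketch (not part of the library), that pairs a count with a label:

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.Writable;

    /** A Writable holding a count and a label, written in a fixed field order. */
    public class LabeledCount implements Writable {
      private int count;
      private Text label = new Text();

      public void set(int count, String label) {
        this.count = count;
        this.label.set(label);
      }

      public void write(DataOutput out) throws IOException {
        out.writeInt(count);        // fields must be written...
        label.write(out);
      }

      public void readFields(DataInput in) throws IOException {
        count = in.readInt();       // ...and read back in the same order
        label.readFields(in);
      }
    }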

Methods in org.apache.hadoop.io that return Writable
static Writable WritableUtils.clone(Writable orig, JobConf conf)
          Make a copy of a writable object using serialization to a buffer.
 Writable[][] TwoDArrayWritable.get()
           
 Writable GenericWritable.get()
          Return the wrapped instance.
 Writable[] ArrayWritable.get()
           
 Writable ArrayFile.Reader.get(long n, Writable value)
          Return the nth value in the file.
 Writable MapFile.Reader.get(WritableComparable key, Writable val)
          Return the value for the named key, or null if none exists.
 Writable WritableFactory.newInstance()
          Return a new instance.
static Writable WritableFactories.newInstance(Class c)
          Create a new instance of a class with a defined factory.
static Writable WritableFactories.newInstance(Class c, Configuration conf)
          Create a new instance of a class with a defined factory.
 Writable ArrayFile.Reader.next(Writable value)
          Read and return the next value in the file.
 
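WritableFactories.newInstance(Class) is the reflective entry point used when the framework has to create a Writable from a class name alone (ObjectWritable uses it when deserializing); a class can register a WritableFactory for itself so that it is constructed through the factory rather than a bare no-argument constructor. A sketch of the conventional static-initializer registration pattern (the class itself is hypothetical):

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Writable;
    import org.apache.hadoop.io.WritableFactories;
    import org.apache.hadoop.io.WritableFactory;

    /** Illustrative Writable that registers a factory for reflective creation. */
    public class FactoryBackedWritable implements Writable {
      static {
        // WritableFactories.newInstance(FactoryBackedWritable.class) will now use this factory.
        WritableFactories.setFactory(FactoryBackedWritable.class, new WritableFactory() {
          public Writable newInstance() {
            return new FactoryBackedWritable();
          }
        });
      }

      private long value;

      public void write(DataOutput out) throws IOException {
        out.writeLong(value);
      }

      public void readFields(DataInput in) throws IOException {
        value = in.readLong();
      }
    }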

Methods in org.apache.hadoop.io with parameters of type Writable
 void ArrayFile.Writer.append(Writable value)
          Append a value to the file.
 void MapFile.Writer.append(WritableComparable key, Writable val)
          Append a key/value pair to the map.
 void SequenceFile.Writer.append(Writable key, Writable val)
          Append a key/value pair.
static Writable WritableUtils.clone(Writable orig, JobConf conf)
          Make a copy of a writable object using serialization to a buffer.
 Writable ArrayFile.Reader.get(long n, Writable value)
          Return the nth value in the file.
 Writable MapFile.Reader.get(WritableComparable key, Writable val)
          Return the value for the named key, or null if none exists.
 WritableComparable MapFile.Reader.getClosest(WritableComparable key, Writable val)
          Finds the record that is the closest match to the specified key.
 void SequenceFile.Reader.getCurrentValue(Writable val)
          Get the 'value' corresponding to the last read 'key'.
 Writable ArrayFile.Reader.next(Writable value)
          Read and return the next value in the file.
 boolean SequenceFile.Reader.next(Writable key)
          Read the next key in the file into key, skipping its value.
 boolean MapFile.Reader.next(WritableComparable key, Writable val)
          Read the next key/value pair in the map into key and val.
 boolean SequenceFile.Reader.next(Writable key, Writable val)
          Read the next key/value pair in the file into key and val.
 void GenericWritable.set(Writable obj)
          Set the instance that is wrapped.
 void ArrayWritable.set(Writable[] values)
           
 void TwoDArrayWritable.set(Writable[][] values)
           
 
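The SequenceFile and MapFile calls above are the most common way these Writables reach disk. A sketch of a write-then-read round trip, assuming the SequenceFile.createWriter(fs, conf, path, keyClass, valueClass) overload and a hypothetical local path:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.Text;

    public class SequenceFileRoundTrip {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("numbers.seq");   // hypothetical path

        // Write a few key/value pairs.
        SequenceFile.Writer writer =
            SequenceFile.createWriter(fs, conf, path, Text.class, IntWritable.class);
        for (int i = 0; i < 3; i++) {
          writer.append(new Text("key" + i), new IntWritable(i));
        }
        writer.close();

        // Read them back by repeatedly filling reusable Writable instances.
        SequenceFile.Reader reader = new SequenceFile.Reader(fs, path, conf);
        Text key = new Text();
        IntWritable val = new IntWritable();
        while (reader.next(key, val)) {
          System.out.println(key + "\t" + val.get());
        }
        reader.close();
      }
    }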

Constructors in org.apache.hadoop.io with parameters of type Writable
ArrayWritable(Class valueClass, Writable[] values)
           
TwoDArrayWritable(Class valueClass, Writable[][] values)
           
 
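The element class passed to these constructors is what lets ArrayWritable and TwoDArrayWritable re-create their members during readFields. A small illustrative sketch:

    import org.apache.hadoop.io.ArrayWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.Writable;

    public class ArrayWritableDemo {
      public static void main(String[] args) {
        // The element class is required so readFields() can instantiate the members.
        Writable[] values = { new Text("alpha"), new Text("beta") };
        ArrayWritable array = new ArrayWritable(Text.class, values);
        for (Writable w : array.get()) {
          System.out.println(w);
        }
      }
    }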

Uses of Writable in org.apache.hadoop.ipc
 

Methods in org.apache.hadoop.ipc that return Writable
 Writable RPC.Server.call(Writable param)
           
abstract  Writable Server.call(Writable param)
          Called for each call.
 Writable[] Client.call(Writable[] params, InetSocketAddress[] addresses)
          Makes a set of calls in parallel.
 Writable Client.call(Writable param, InetSocketAddress address)
          Make a call, passing param, to the IPC server running at address, returning the value.
 

Methods in org.apache.hadoop.ipc with parameters of type Writable
 Writable RPC.Server.call(Writable param)
           
abstract  Writable Server.call(Writable param)
          Called for each call.
 Writable[] Client.call(Writable[] params, InetSocketAddress[] addresses)
          Makes a set of calls in parallel.
 Writable Client.call(Writable param, InetSocketAddress address)
          Make a call, passing param, to the IPC server running at address, returning the value.
 
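On the wire, a Writable is the unit of exchange in both directions: the client serializes the parameter, Server.call(Writable) receives it, and the returned Writable travels back the same way. A hedged sketch of the client side, assuming the Client(Class, Configuration) constructor in which the class argument names the expected response type, and a hypothetical server listening on localhost:9000:

    import java.net.InetSocketAddress;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Writable;
    import org.apache.hadoop.ipc.Client;

    public class EchoClient {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumption: responses from this server are LongWritable instances.
        Client client = new Client(LongWritable.class, conf);
        InetSocketAddress addr = new InetSocketAddress("localhost", 9000);   // hypothetical endpoint
        Writable response = client.call(new LongWritable(42L), addr);
        System.out.println("server replied: " + response);
        client.stop();
      }
    }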

Uses of Writable in org.apache.hadoop.mapred
 

Subinterfaces of Writable in org.apache.hadoop.mapred
 interface InputSplit
          The description of the data for a single map task.
 

Classes in org.apache.hadoop.mapred that implement Writable
 class ClusterStatus
          Summarizes the size and current state of the cluster.
 class Counters
          A set of named counters.
 class FileSplit
          A section of an input file.
 class JobProfile
          A JobProfile is a MapReduce primitive.
 class JobStatus
          Describes the current status of a job.
 class MultiFileSplit
          A sub-collection of input files.
 class TaskCompletionEvent
          This is used to track task completion events on the job tracker.
 class TaskReport
          A report on the state of a task.
 

Methods in org.apache.hadoop.mapred that return Writable
 Writable SequenceFileRecordReader.createValue()
           
 Writable SequenceFileAsTextRecordReader.createValue()
           
 Writable LineRecordReader.createValue()
           
 Writable RecordReader.createValue()
          Create an object of the appropriate type to be used as the value.
static Writable MapFileOutputFormat.getEntry(MapFile.Reader[] readers, Partitioner partitioner, WritableComparable key, Writable value)
          Get an entry from output generated by this class.
 

Methods in org.apache.hadoop.mapred that return types with arguments of type Writable
 Class<? extends Writable> JobConf.getMapOutputValueClass()
          Get the value class for the map output data.
 Class<? extends Writable> JobConf.getOutputValueClass()
           
 

Methods in org.apache.hadoop.mapred with parameters of type Writable
 boolean SequenceFileInputFilter.Filter.accept(Writable key)
          Filter function: decides whether a record should be filtered out.
 boolean SequenceFileInputFilter.RegexFilter.accept(Writable key)
          Filtering method: returns true if the key matches the regex, false otherwise.
 boolean SequenceFileInputFilter.PercentFilter.accept(Writable key)
          Filtering method: returns true if record# % frequency == 0, false otherwise.
 boolean SequenceFileInputFilter.MD5Filter.accept(Writable key)
          Filtering method: returns true if MD5(key) % frequency == 0, false otherwise.
 void OutputCollector.collect(WritableComparable key, Writable value)
          Adds a key/value pair to the output.
protected  void SequenceFileRecordReader.getCurrentValue(Writable value)
           
static Writable MapFileOutputFormat.getEntry(MapFile.Reader[] readers, Partitioner partitioner, WritableComparable key, Writable value)
          Get an entry from output generated by this class.
 int Partitioner.getPartition(WritableComparable key, Writable value, int numPartitions)
          Returns the partition number for a given entry, given the total number of partitions.
 void Mapper.map(WritableComparable key, Writable value, OutputCollector output, Reporter reporter)
          Maps a single input key/value pair into intermediate key/value pairs.
protected  boolean SequenceFileRecordReader.next(Writable key)
           
 boolean SequenceFileRecordReader.next(Writable key, Writable value)
           
 boolean SequenceFileAsTextRecordReader.next(Writable key, Writable value)
          Read key/value pair in a line.
 boolean LineRecordReader.next(Writable key, Writable value)
          Read a line.
 boolean KeyValueLineRecordReader.next(Writable key, Writable value)
          Read key/value pair in a line.
 boolean RecordReader.next(Writable key, Writable value)
          Reads the next key/value pair.
 void TextOutputFormat.LineRecordWriter.write(WritableComparable key, Writable value)
           
 void RecordWriter.write(WritableComparable key, Writable value)
          Writes a key/value pair.
 
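Of the hooks above, Partitioner.getPartition is the one that decides which reduce task receives each intermediate key/value pair. A sketch in the spirit of HashPartitioner (it assumes Partitioner extends JobConfigurable, so an empty configure method is supplied):

    import org.apache.hadoop.io.Writable;
    import org.apache.hadoop.io.WritableComparable;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.Partitioner;

    /** Routes each key to a reduce task by its hash code; the value is ignored. */
    public class KeyHashPartitioner implements Partitioner {

      public void configure(JobConf job) {
        // No configuration needed for this sketch.
      }

      public int getPartition(WritableComparable key, Writable value, int numPartitions) {
        // Mask off the sign bit so the modulus is never negative.
        return (key.hashCode() & Integer.MAX_VALUE) % numPartitions;
      }
    }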

Method parameters in org.apache.hadoop.mapred with type arguments of type Writable
 void JobConf.setMapOutputValueClass(Class<? extends Writable> theClass)
          Set the value class for the map output data.
 void JobConf.setOutputValueClass(Class<? extends Writable> theClass)
           
 
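These setters tell the framework which Writable classes to instantiate when deserializing map output and final output records; the getters earlier in this section are their read side. A typical configuration sketch (setMapOutputKeyClass and setOutputKeyClass are the companion key-side setters on JobConf, assumed from the same API):

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;

    public class OutputTypeSetup {
      public static void main(String[] args) {
        JobConf conf = new JobConf();
        // Intermediate (map output) types, consumed by the shuffle and the combiner/reducer.
        conf.setMapOutputKeyClass(Text.class);
        conf.setMapOutputValueClass(IntWritable.class);
        // Final output types written by the reducer.
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(LongWritable.class);
      }
    }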

Uses of Writable in org.apache.hadoop.mapred.lib
 

Methods in org.apache.hadoop.mapred.lib with parameters of type Writable
 int KeyFieldBasedPartitioner.getPartition(WritableComparable key, Writable value, int numReduceTasks)
          Use Object.hashCode() to partition.
 int HashPartitioner.getPartition(WritableComparable key, Writable value, int numReduceTasks)
          Use Object.hashCode() to partition.
 void TokenCountMapper.map(WritableComparable key, Writable value, OutputCollector output, Reporter reporter)
           
 void RegexMapper.map(WritableComparable key, Writable value, OutputCollector output, Reporter reporter)
           
 void InverseMapper.map(WritableComparable key, Writable value, OutputCollector output, Reporter reporter)
          The inverse function.
 void IdentityMapper.map(WritableComparable key, Writable val, OutputCollector output, Reporter reporter)
          The identity function.
 void FieldSelectionMapReduce.map(WritableComparable key, Writable val, OutputCollector output, Reporter reporter)
          The identity function.
 
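These library classes are meant to be dropped into a job unchanged; the job only declares the Writable output types and plugs in the mapper and reducer classes. A sketch of a token-counting job assembled from the library (LongSumReducer is assumed here as the companion reducer from the same package; input/output paths and job submission are omitted):

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.lib.LongSumReducer;
    import org.apache.hadoop.mapred.lib.TokenCountMapper;

    public class LibraryTokenCount {
      public static void main(String[] args) {
        JobConf conf = new JobConf(LibraryTokenCount.class);
        conf.setJobName("library-token-count");
        conf.setMapperClass(TokenCountMapper.class);   // emits (token, 1)
        conf.setReducerClass(LongSumReducer.class);    // sums the counts per token
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(LongWritable.class);
        // Input/output paths and JobClient.runJob(conf) omitted from this sketch.
      }
    }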

Uses of Writable in org.apache.hadoop.mapred.lib.aggregate
 

Methods in org.apache.hadoop.mapred.lib.aggregate with parameters of type Writable
 void ValueAggregatorReducer.map(WritableComparable arg0, Writable arg1, OutputCollector arg2, Reporter arg3)
          Do nothing.
 void ValueAggregatorMapper.map(WritableComparable key, Writable value, OutputCollector output, Reporter reporter)
          The map function.
 void ValueAggregatorCombiner.map(WritableComparable arg0, Writable arg1, OutputCollector arg2, Reporter arg3)
          Do nothing.
 

Uses of Writable in org.apache.hadoop.record
 

Classes in org.apache.hadoop.record that implement Writable
 class Record
          Abstract class that is extended by generated classes.
 

Uses of Writable in org.apache.hadoop.streaming
 

Methods in org.apache.hadoop.streaming that return Writable
 Writable StreamBaseRecordReader.createValue()
           
 

Methods in org.apache.hadoop.streaming with parameters of type Writable
 void PipeMapper.map(WritableComparable key, Writable value, OutputCollector output, Reporter reporter)
           
 boolean StreamXmlRecordReader.next(Writable key, Writable value)
           
abstract  boolean StreamBaseRecordReader.next(Writable key, Writable value)
          Read a record.
 

Uses of Writable in org.apache.hadoop.tools
 

Methods in org.apache.hadoop.tools with parameters of type Writable
 void Logalyzer.LogRegexMapper.map(WritableComparable key, Writable value, OutputCollector output, Reporter reporter)
           
 

Uses of Writable in org.apache.hadoop.util
 

Methods in org.apache.hadoop.util with parameters of type Writable
 void CopyFiles.FSCopyFilesMapper.map(WritableComparable key, Writable value, OutputCollector out, Reporter reporter)
          Map method.
 void CopyFiles.HTTPCopyFilesMapper.map(WritableComparable key, Writable val, OutputCollector out, Reporter reporter)
           
 

Uses of Writable in org.onelab.filter
 

Classes in org.onelab.filter that implement Writable
 class BloomFilter
          Implements a Bloom filter, as defined by Bloom in 1970.
 class CountingBloomFilter
          Implements a counting Bloom filter, as defined by Fan et al.
 class DynamicBloomFilter
          Implements a dynamic Bloom filter, as defined in the INFOCOM 2006 paper.
 class Filter
          Defines the general behavior of a filter.
 class Key
          The general behavior of a key that must be stored in a filter.
 class RetouchedBloomFilter
          Implements a retouched Bloom filter, as defined in the CoNEXT 2006 paper.
 
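All of these filters share the add(Key)/membershipTest(Key) contract defined by Filter; a membership test can return a false positive but never a false negative. A hedged sketch of basic usage, assuming a BloomFilter(int vectorSize, int nbHash) constructor and a Key(byte[]) constructor as exposed by this package:

    import org.onelab.filter.BloomFilter;
    import org.onelab.filter.Key;

    public class BloomFilterDemo {
      public static void main(String[] args) {
        // Assumed constructor arguments: bit-vector size and number of hash functions.
        BloomFilter filter = new BloomFilter(1024, 4);
        filter.add(new Key("row-0001".getBytes()));

        // Added keys always test true; absent keys usually test false (false positives possible).
        System.out.println(filter.membershipTest(new Key("row-0001".getBytes())));   // true
        System.out.println(filter.membershipTest(new Key("row-9999".getBytes())));   // most likely false
      }
    }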



Copyright © 2006 The Apache Software Foundation