org.apache.pig.backend.hadoop.executionengine.mapReduceLayer
Class SliceWrapper
java.lang.Object
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.SliceWrapper
- All Implemented Interfaces:
- org.apache.hadoop.io.Writable, org.apache.hadoop.mapred.InputSplit
public class SliceWrapper
extends Object
implements org.apache.hadoop.mapred.InputSplit
Wraps a Slice in an InputSplit so it's usable by Hadoop.
Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
SliceWrapper
public SliceWrapper()
SliceWrapper
public SliceWrapper(Slice slice,
PigContext context,
int index,
org.apache.hadoop.fs.FileSystem fs,
ArrayList<OperatorKey> targetOps)
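A minimal construction sketch based on the parameterized constructor above. The import locations for Slice, PigContext, and OperatorKey are assumptions (they are not spelled out on this page), and the slice, context, index, and targetOps arguments stand in for objects built elsewhere in the Pig execution engine.

import java.io.IOException;
import java.util.ArrayList;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.mapred.JobConf;

import org.apache.pig.Slice;                     // assumed package for Slice
import org.apache.pig.impl.PigContext;           // assumed package for PigContext
import org.apache.pig.impl.plan.OperatorKey;     // assumed package for OperatorKey
import org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.SliceWrapper;

public class SliceWrapperConstructionSketch {
    public static SliceWrapper wrap(Slice slice,
                                    PigContext context,
                                    int index,
                                    ArrayList<OperatorKey> targetOps) throws IOException {
        // Obtain a FileSystem handle for the data the slice refers to.
        FileSystem fs = FileSystem.get(new JobConf());

        // Wrap the Slice so Hadoop can treat it as an InputSplit.
        return new SliceWrapper(slice, context, index, fs, targetOps);
    }
}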
getIndex
public int getIndex()
getLength
public long getLength()
throws IOException
- Specified by:
getLength
in interface org.apache.hadoop.mapred.InputSplit
- Throws:
IOException
getLocations
public String[] getLocations()
throws IOException
- Specified by:
getLocations
in interface org.apache.hadoop.mapred.InputSplit
- Throws:
IOException
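A short sketch of reading the split metadata that SliceWrapper exposes through the InputSplit interface: getLength() reports the split's size in bytes and getLocations() lists the hosts where the underlying data is available. The split argument is a placeholder for a SliceWrapper obtained elsewhere.

import java.io.IOException;

import org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.SliceWrapper;

public class SplitMetadataSketch {
    public static void describe(SliceWrapper split) throws IOException {
        // Size of the split's data in bytes.
        System.out.println("length (bytes): " + split.getLength());

        // Hosts where the split's data is located.
        for (String host : split.getLocations()) {
            System.out.println("location: " + host);
        }
    }
}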
getJobConf
public org.apache.hadoop.mapred.JobConf getJobConf()
makeReader
public org.apache.hadoop.mapred.RecordReader<org.apache.hadoop.io.Text,Tuple> makeReader(org.apache.hadoop.mapred.JobConf job)
throws IOException
- Throws:
IOException
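A hedged sketch of pulling records through the RecordReader that makeReader(JobConf) returns, assuming getJobConf() yields a JobConf suitable for this split. The createKey/createValue/next/close calls are the standard classic-mapred RecordReader API; nothing else here is taken from this page.

import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RecordReader;

import org.apache.pig.data.Tuple;
import org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.SliceWrapper;

public class MakeReaderSketch {
    public static void dump(SliceWrapper split) throws IOException {
        JobConf job = split.getJobConf();
        RecordReader<Text, Tuple> reader = split.makeReader(job);
        try {
            Text key = reader.createKey();
            Tuple value = reader.createValue();
            // Read tuples until the wrapped slice is exhausted.
            while (reader.next(key, value)) {
                System.out.println(value);
            }
        } finally {
            reader.close();
        }
    }
}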
readFields
public void readFields(DataInput is)
throws IOException
- Specified by:
readFields
in interface org.apache.hadoop.io.Writable
- Throws:
IOException
write
public void write(DataOutput os)
throws IOException
- Specified by:
write
in interface org.apache.hadoop.io.Writable
- Throws:
IOException
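A minimal sketch of the Writable round trip Hadoop performs when shipping a split to a task: write(DataOutput) serializes the wrapper and readFields(DataInput) restores it into an instance created with the no-arg constructor. This assumes no configuration beyond the two documented calls is required.

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

import org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.SliceWrapper;

public class WritableRoundTripSketch {
    public static SliceWrapper roundTrip(SliceWrapper original) throws IOException {
        // Serialize with write(DataOutput).
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        original.write(new DataOutputStream(bytes));

        // Deserialize with readFields(DataInput) into a fresh instance.
        SliceWrapper copy = new SliceWrapper();
        copy.readFields(new DataInputStream(new ByteArrayInputStream(bytes.toByteArray())));
        return copy;
    }
}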
getWrapped
public Slice getWrapped()
- Returns:
- the wrapped Slice
Copyright © The Apache Software Foundation