org.apache.hadoop.hbase.filter
Class WhileMatchRowFilter

java.lang.Object
  extended by org.apache.hadoop.hbase.filter.WhileMatchRowFilter
All Implemented Interfaces:
RowFilterInterface, org.apache.hadoop.io.Writable

public class WhileMatchRowFilter
extends Object
implements RowFilterInterface

WhileMatchRowFilter is a wrapper filter that filters out everything after the first filtered row. Once the nested filter returns true from either of its filter(..) methods or from filterNotNull(SortedMap), this wrapper's filterAllRemaining() returns true, and all filtering methods thereafter defer to the result of filterAllRemaining().
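The latching behavior described above can be sketched with a minimal stand-in. SimpleRowFilter and WhileMatch below are illustrative simplifications, not the actual RowFilterInterface or WhileMatchRowFilter API:

```java
// Minimal sketch of the while-match wrapping semantics: once the nested filter
// filters a row, everything after it is filtered. Illustrative names only.
public class WhileMatchSketch {
    // Stand-in for the nested filter: returns true when a row is filtered out.
    interface SimpleRowFilter {
        boolean filter(String rowKey);
    }

    static class WhileMatch implements SimpleRowFilter {
        private final SimpleRowFilter nested;
        private boolean filterAllRemaining = false;

        WhileMatch(SimpleRowFilter nested) { this.nested = nested; }

        boolean filterAllRemaining() { return filterAllRemaining; }

        @Override
        public boolean filter(String rowKey) {
            if (filterAllRemaining) return true;  // defer to the latched state
            if (nested.filter(rowKey)) {
                filterAllRemaining = true;        // latch on the first filtered row
            }
            return filterAllRemaining;
        }
    }

    public static void main(String[] args) {
        // Nested filter passes rows starting with "a", filters everything else.
        WhileMatch f = new WhileMatch(row -> !row.startsWith("a"));
        System.out.println(f.filter("apple"));      // false: row passes
        System.out.println(f.filter("banana"));     // true: first filtered row, latches
        System.out.println(f.filter("avocado"));    // true: would pass, but latched
        System.out.println(f.filterAllRemaining()); // true
    }
}
```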


Constructor Summary
WhileMatchRowFilter()
          Default constructor, filters nothing.
WhileMatchRowFilter(RowFilterInterface filter)
          Constructor
 
Method Summary
 boolean filter(org.apache.hadoop.io.Text rowKey)
          Filters on just a row key.
 boolean filter(org.apache.hadoop.io.Text rowKey, org.apache.hadoop.io.Text colKey, byte[] data)
          Filters on row key and/or a column key.
 boolean filterAllRemaining()
          Returns true once the nested filter has filtered out a row (returned true on a call to one of its filtering methods).
 boolean filterNotNull(SortedMap<org.apache.hadoop.io.Text,byte[]> columns)
          Filters a row if a column expected to be null has a non-null value, or if a column the filter has a criterion for is missing from the row.
 RowFilterInterface getInternalFilter()
          Returns the internal filter being wrapped
 boolean processAlways()
          Returns whether or not the filter should always be processed in any filtering call.
 void readFields(DataInput in)
          
 void reset()
          Resets the state of the filter.
 void rowProcessed(boolean filtered, org.apache.hadoop.io.Text rowKey)
          Called to let filter know the final decision (to pass or filter) on a given row.
 void validate(org.apache.hadoop.io.Text[] columns)
          Validates that this filter applies only to a subset of the given columns.
 void write(DataOutput out)
          
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

WhileMatchRowFilter

public WhileMatchRowFilter()
Default constructor; filters nothing. It is required, however, for RPC deserialization.


WhileMatchRowFilter

public WhileMatchRowFilter(RowFilterInterface filter)
Constructor

Parameters:
filter - the filter to wrap
Method Detail

getInternalFilter

public RowFilterInterface getInternalFilter()
Returns the internal filter being wrapped

Returns:
the internal filter

reset

public void reset()
Resets the state of the filter. Used prior to the start of a Region scan.

Specified by:
reset in interface RowFilterInterface

processAlways

public boolean processAlways()
Returns whether or not the filter should always be processed in any filtering call. This precaution is necessary for filters that maintain state and need to be updated according to their response to filtering calls (see WhileMatchRowFilter for an example). At times, filters nested in RowFilterSets may or may not be called because the RowFilterSet determines a result as fast as possible. Returning true for processAlways() ensures that the filter will always be called.

Specified by:
processAlways in interface RowFilterInterface
Returns:
whether or not to always process the filter
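The interaction described above can be sketched with a short-circuiting OR-style filter set that still invokes stateful members. This is an illustrative simplification, not the actual RowFilterSet code; SimpleRowFilter and filterWithOrSet are assumed names:

```java
// Sketch of why processAlways() matters: once an OR-style set knows its result,
// it skips remaining members UNLESS they ask to always be processed, so that
// stateful filters (like a while-match wrapper) can still update their state.
import java.util.List;

public class ProcessAlwaysSketch {
    interface SimpleRowFilter {
        boolean filter(String rowKey);  // true = filter the row out
        default boolean processAlways() { return false; }
    }

    // OR semantics: the row is filtered only if every member filters it.
    static boolean filterWithOrSet(List<SimpleRowFilter> filters, String rowKey) {
        boolean resultKnown = false;
        boolean filtered = true;
        for (SimpleRowFilter f : filters) {
            if (resultKnown && !f.processAlways()) continue; // skip stateless members
            boolean memberFiltered = f.filter(rowKey);
            if (!memberFiltered) {
                filtered = false;
                resultKnown = true;  // OR: one passing member decides the row
            }
        }
        return filtered;
    }
}
```

A member whose processAlways() returns true is called even after the set's result is decided, which is exactly the guarantee a state-maintaining filter needs.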

filterAllRemaining

public boolean filterAllRemaining()
Returns true once the nested filter has filtered out a row (returned true on a call to one of its filtering methods). Until then it returns false.

Specified by:
filterAllRemaining in interface RowFilterInterface
Returns:
true if the nested filter has returned true on a filter call; false otherwise.

filter

public boolean filter(org.apache.hadoop.io.Text rowKey)
Filters on just a row key.

Specified by:
filter in interface RowFilterInterface
Returns:
true if given row key is filtered and row should not be processed.

filter

public boolean filter(org.apache.hadoop.io.Text rowKey,
                      org.apache.hadoop.io.Text colKey,
                      byte[] data)
Filters on row key and/or a column key.

Specified by:
filter in interface RowFilterInterface
Parameters:
rowKey - row key to filter on. May be null for no filtering of row key.
colKey - column whose data will be filtered
data - column value
Returns:
true if row filtered and should not be processed.

filterNotNull

public boolean filterNotNull(SortedMap<org.apache.hadoop.io.Text,byte[]> columns)
Filters a row if: 1) the given row (columns) has a columnKey expected to be null AND the value associated with that columnKey is non-null, or 2) the filter has a criterion for a particular columnKey, but that columnKey is not present in the given row (columns). Note that filterNotNull does not care whether the values associated with a columnKey match. Also note that a "null value" associated with a columnKey is expressed as HConstants.DELETE_BYTES.

Specified by:
filterNotNull in interface RowFilterInterface
Returns:
true if the null/non-null criteria are not met.
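The two-part rule above can be sketched as a small standalone method. The names (FilterNotNullSketch, DELETED, the two criterion sets) are illustrative assumptions, with DELETED standing in for HConstants.DELETE_BYTES:

```java
// Sketch of the filterNotNull rule: a row is filtered when a column expected to
// be null carries a real value, or when a column with a criterion is missing.
import java.util.Arrays;
import java.util.Map;
import java.util.Set;

public class FilterNotNullSketch {
    static final byte[] DELETED = new byte[0];  // stand-in "null value" marker

    // nullColumns: columns whose value is expected to be null.
    // criterionColumns: columns the filter has some criterion for.
    static boolean filterNotNull(Map<String, byte[]> row,
                                 Set<String> nullColumns,
                                 Set<String> criterionColumns) {
        for (String col : nullColumns) {
            byte[] value = row.get(col);
            if (value != null && !Arrays.equals(value, DELETED)) {
                return true;  // expected null, found a real value: filter the row
            }
        }
        for (String col : criterionColumns) {
            if (!row.containsKey(col)) {
                return true;  // criterion column absent from the row: filter
            }
        }
        return false;
    }
}
```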

rowProcessed

public void rowProcessed(boolean filtered,
                         org.apache.hadoop.io.Text rowKey)
Called to let the filter know the final decision (to pass or filter) on a given row. Without HScanner calling this, the filter does not know whether a row ultimately passed filtering: even if this filter passed the row itself, other filters may have failed it, e.g. when this filter is a member of a RowFilterSet with an OR operator.

Specified by:
rowProcessed in interface RowFilterInterface
See Also:
RowFilterSet

validate

public void validate(org.apache.hadoop.io.Text[] columns)
Validates that this filter applies only to a subset of the given columns. This check is performed before the scanner is opened, because filtering on a column depends on that column being retrieved within the HRegion. Criteria on columns that are not part of a scanner's column list would be ignored (in the case of null-value filters, every row would pass the filter). Since that behavior would be undefined from the user's point of view, it is not permitted.

Specified by:
validate in interface RowFilterInterface

readFields

public void readFields(DataInput in)
                throws IOException

Specified by:
readFields in interface org.apache.hadoop.io.Writable
Throws:
IOException

write

public void write(DataOutput out)
           throws IOException

Specified by:
write in interface org.apache.hadoop.io.Writable
Throws:
IOException


Copyright © 2008 The Apache Software Foundation