Deprecated API


Contents
Deprecated Interfaces
org.apache.hadoop.hbase.filter.RowFilterInterface
          Use filters that are rooted on Filter instead. 
org.apache.hadoop.hbase.client.Scanner
          See ResultScanner 
org.apache.hadoop.hbase.mapred.TableMap
           
org.apache.hadoop.hbase.mapred.TableReduce
           
 
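The interface deprecations above point from the old RowFilterInterface/Scanner pair to the new Filter and ResultScanner types. A minimal migration sketch follows; the table name, column family, and row prefix are illustrative assumptions, not values from this listing, and running it requires a live HBase 0.20 cluster.

```java
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.PrefixFilter;
import org.apache.hadoop.hbase.util.Bytes;

public class FilterMigration {
  public static void main(String[] args) throws Exception {
    // "mytable", "cf", and "row-prefix" are hypothetical names.
    HTable table = new HTable(new HBaseConfiguration(), "mytable");

    // Old style returned a Scanner from getScanner(columns, startRow, filter)
    // with a RowFilterInterface-based filter. New style: configure a Scan
    // with a Filter and iterate the ResultScanner it yields.
    Scan scan = new Scan();
    scan.addFamily(Bytes.toBytes("cf"));
    scan.setFilter(new PrefixFilter(Bytes.toBytes("row-prefix")));

    ResultScanner scanner = table.getScanner(scan);
    try {
      for (Result result : scanner) {
        System.out.println(Bytes.toString(result.getRow()));
      }
    } finally {
      scanner.close(); // always release the scanner
    }
  }
}
```

ResultScanner implements Iterable, so the for-each loop replaces the old Scanner's explicit next() calls.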

Deprecated Classes
org.apache.hadoop.hbase.io.BatchOperation
          As of hbase 0.20.0, replaced by new Get/Put/Delete/Result-based API. 
org.apache.hadoop.hbase.io.BatchUpdate
          As of hbase 0.20.0, replaced by new Get/Put/Delete/Result-based API. 
org.apache.hadoop.hbase.mapred.BuildTableIndex
           
org.apache.hadoop.hbase.io.Cell
          As of hbase 0.20.0, replaced by new Get/Put/Delete/Result-based API. 
org.apache.hadoop.hbase.filter.ColumnValueFilter
          Use filters that are rooted on Filter instead. 
org.apache.hadoop.hbase.rest.Dispatcher
          Use the org.apache.hadoop.hbase.stargate hbase contrib instead. 
org.apache.hadoop.hbase.mapred.Driver
           
org.apache.hadoop.hbase.mapred.GroupingTableMap
           
org.apache.hadoop.hbase.mapred.HRegionPartitioner
           
org.apache.hadoop.hbase.HStoreKey
          Replaced by KeyValue. 
org.apache.hadoop.hbase.mapred.IdentityTableMap
           
org.apache.hadoop.hbase.mapred.IdentityTableReduce
           
org.apache.hadoop.hbase.filter.InclusiveStopRowFilter
          Use filters that are rooted on Filter instead. 
org.apache.hadoop.hbase.mapred.IndexConfiguration
           
org.apache.hadoop.hbase.mapred.IndexOutputFormat
           
org.apache.hadoop.hbase.mapred.IndexTableReduce
           
org.apache.hadoop.hbase.mapred.LuceneDocumentWrapper
           
org.apache.hadoop.hbase.filter.PageRowFilter
          Use filters that are rooted on Filter instead. 
org.apache.hadoop.hbase.filter.PrefixRowFilter
          Use filters that are rooted on Filter instead. 
org.apache.hadoop.hbase.filter.RegExpRowFilter
          This class does not work well in the new KeyValue world. Use filters based on the new Filter instead. 
org.apache.hadoop.hbase.mapred.RowCounter
           
org.apache.hadoop.hbase.filter.RowFilterSet
          Use filters that are rooted on Filter instead. 
org.apache.hadoop.hbase.io.RowResult
          As of hbase 0.20.0, replaced by new Get/Put/Delete/Result-based API. 
org.apache.hadoop.hbase.filter.StopRowFilter
          Use filters that are rooted on Filter instead. 
org.apache.hadoop.hbase.mapred.TableInputFormat
           
org.apache.hadoop.hbase.mapred.TableInputFormatBase
           
org.apache.hadoop.hbase.mapred.TableMapReduceUtil
           
org.apache.hadoop.hbase.mapred.TableOutputFormat
           
org.apache.hadoop.hbase.mapred.TableSplit
           
org.apache.hadoop.hbase.filter.WhileMatchRowFilter
          Use filters that are rooted on Filter instead. 
 
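Several of the classes above (BatchOperation, BatchUpdate, Cell, RowResult) were replaced by the Get/Put/Delete/Result-based API in 0.20.0. A minimal sketch of the write-path change follows; the table, family, and qualifier names are illustrative assumptions, and the code needs a running cluster.

```java
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class PutMigration {
  public static void main(String[] args) throws Exception {
    // "mytable", "cf", "qual" are hypothetical names.
    HTable table = new HTable(new HBaseConfiguration(), "mytable");

    // Old style (deprecated):
    //   BatchUpdate bu = new BatchUpdate("row1");
    //   bu.put("cf:qual", Bytes.toBytes("value"));
    //   table.commit(bu);

    // New style: build a Put keyed by row, add family/qualifier/value cells.
    Put put = new Put(Bytes.toBytes("row1"));
    put.add(Bytes.toBytes("cf"), Bytes.toBytes("qual"), Bytes.toBytes("value"));
    table.put(put);
  }
}
```

Note the family and qualifier are now separate byte arrays rather than the old "family:qualifier" string form.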

Deprecated Enums
org.apache.hadoop.hbase.HColumnDescriptor.CompressionType
          Compression now specifies which compression library to use rather than what to compress. 
 

Deprecated Methods
org.apache.hadoop.hbase.HServerLoad.addRegionInfo(byte[], int, int, int, int, int)
          Use HServerLoad.addRegionInfo(RegionLoad) 
org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler.atomicIncrement(byte[], byte[], byte[], long)
           
org.apache.hadoop.hbase.client.HTable.checkAndSave(BatchUpdate, HbaseMapWritable, RowLock)
          As of hbase 0.20.0, replaced by HTable.checkAndPut(byte[], byte[], byte[], byte[], org.apache.hadoop.hbase.client.Put) 
org.apache.hadoop.hbase.client.HTable.commit(BatchUpdate)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) or HTable.put(Put) 
org.apache.hadoop.hbase.client.HTable.commit(BatchUpdate, RowLock)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) or HTable.put(Put) 
org.apache.hadoop.hbase.client.HTable.commit(List)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) or HTable.put(List) 
org.apache.hadoop.hbase.HStoreKey.compareTo(HStoreKey)
          Use Comparators instead; this method can give wrong results. 
org.apache.hadoop.hbase.rest.RowModel.delete(byte[], byte[])
           
org.apache.hadoop.hbase.rest.RowModel.delete(byte[], byte[], byte[][])
           
org.apache.hadoop.hbase.rest.TimestampModel.delete(byte[], byte[], byte[][], long)
           
org.apache.hadoop.hbase.rest.TimestampModel.delete(byte[], byte[], long)
           
org.apache.hadoop.hbase.client.HTable.deleteAll(byte[])
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAll(byte[], byte[])
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAll(byte[], byte[], long)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAll(byte[], byte[], long, RowLock)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAll(byte[], long)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAll(String)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAll(String, long)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAll(String, String)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAll(String, String, long)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAllByRegex(byte[], String, long)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAllByRegex(byte[], String, long, RowLock)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAllByRegex(String, String)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteAllByRegex(String, String, long)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteFamily(byte[], byte[])
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteFamily(byte[], byte[], long)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteFamily(byte[], byte[], long, RowLock)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteFamily(String, String)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteFamily(String, String, long)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteFamilyByRegex(byte[], String)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteFamilyByRegex(byte[], String, long)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteFamilyByRegex(byte[], String, long, RowLock)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteFamilyByRegex(String, String)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.deleteFamilyByRegex(String, String, long)
          As of hbase 0.20.0, replaced by HTable.delete(Delete) 
org.apache.hadoop.hbase.client.HTable.exists(byte[])
          As of hbase 0.20.0, replaced by HTable.exists(Get) 
org.apache.hadoop.hbase.client.HTable.exists(byte[], byte[])
          As of hbase 0.20.0, replaced by HTable.exists(Get) 
org.apache.hadoop.hbase.client.HTable.exists(byte[], byte[], long)
          As of hbase 0.20.0, replaced by HTable.exists(Get) 
org.apache.hadoop.hbase.client.HTable.exists(byte[], byte[], long, RowLock)
          As of hbase 0.20.0, replaced by HTable.exists(Get) 
org.apache.hadoop.hbase.filter.RowFilterInterface.filterColumn(byte[], byte[], byte[])
          Use RowFilterInterface.filterColumn(byte[], int, int, byte[], int, int, byte[], int, int) instead. 
org.apache.hadoop.hbase.filter.RowFilterInterface.filterRowKey(byte[])
          Use RowFilterInterface.filterRowKey(byte[], int, int) instead. 
org.apache.hadoop.hbase.client.HTable.get(byte[], byte[])
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.rest.RowModel.get(byte[], byte[])
           
org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler.get(byte[], byte[], byte[])
           
org.apache.hadoop.hbase.rest.RowModel.get(byte[], byte[], byte[][])
           
org.apache.hadoop.hbase.rest.TimestampModel.get(byte[], byte[], byte[][], long)
           
org.apache.hadoop.hbase.rest.RowModel.get(byte[], byte[], byte[][], long)
           
org.apache.hadoop.hbase.rest.TimestampModel.get(byte[], byte[], byte[], long)
           
org.apache.hadoop.hbase.rest.TimestampModel.get(byte[], byte[], byte[], long, int)
           
org.apache.hadoop.hbase.client.HTable.get(byte[], byte[], int)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.rest.RowModel.get(byte[], byte[], long)
           
org.apache.hadoop.hbase.client.HTable.get(byte[], byte[], long, int)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.get(String, String)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.get(String, String, int)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.get(String, String, long, int)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getClosestRowBefore(byte[], byte[])
          As of hbase 0.20.0, replaced by HTable.getRowOrBefore(byte[], byte[]) 
org.apache.hadoop.hbase.migration.nineteen.regionserver.HStoreFile.getFilterDir(Path, int, byte[])
           
org.apache.hadoop.hbase.client.Scan.getOldFilter()
            
org.apache.hadoop.hbase.client.HTable.getRow(byte[])
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(byte[], byte[][])
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(byte[], byte[][], int)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(byte[], byte[][], long)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(byte[], byte[][], long, int, RowLock)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(byte[], int)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(byte[], long)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(byte[], long, int)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(String)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(String, int)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(String, long)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(String, long, int)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(String, String[])
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(String, String[], int)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(String, String[], long)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getRow(String, String[], long, int, RowLock)
          As of hbase 0.20.0, replaced by HTable.get(Get) 
org.apache.hadoop.hbase.client.HTable.getScanner(byte[][])
          As of hbase 0.20.0, replaced by HTable.getScanner(Scan) 
org.apache.hadoop.hbase.client.HTable.getScanner(byte[][], byte[])
          As of hbase 0.20.0, replaced by HTable.getScanner(Scan) 
org.apache.hadoop.hbase.client.HTable.getScanner(byte[][], byte[], byte[])
          As of hbase 0.20.0, replaced by HTable.getScanner(Scan) 
org.apache.hadoop.hbase.client.HTable.getScanner(byte[][], byte[], byte[], long)
          As of hbase 0.20.0, replaced by HTable.getScanner(Scan) 
org.apache.hadoop.hbase.client.HTable.getScanner(byte[][], byte[], long)
          As of hbase 0.20.0, replaced by HTable.getScanner(Scan) 
org.apache.hadoop.hbase.client.HTable.getScanner(byte[][], byte[], long, RowFilterInterface)
          As of hbase 0.20.0, replaced by HTable.getScanner(Scan) 
org.apache.hadoop.hbase.client.HTable.getScanner(byte[][], byte[], RowFilterInterface)
          As of hbase 0.20.0, replaced by HTable.getScanner(Scan) 
org.apache.hadoop.hbase.client.HTable.getScanner(String[])
          As of hbase 0.20.0, replaced by HTable.getScanner(Scan) 
org.apache.hadoop.hbase.client.HTable.getScanner(String[], String)
          As of hbase 0.20.0, replaced by HTable.getScanner(Scan) 
org.apache.hadoop.hbase.client.HTable.getScanner(String[], String, long, RowFilterInterface)
          As of hbase 0.20.0, replaced by HTable.getScanner(Scan) 
org.apache.hadoop.hbase.client.HTable.getScanner(String[], String, String, long)
          As of hbase 0.20.0, replaced by HTable.getScanner(Scan) 
org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler.getVer(byte[], byte[], byte[], int)
           
org.apache.hadoop.hbase.thrift.ThriftServer.HBaseHandler.getVerTs(byte[], byte[], byte[], long, int)
           
org.apache.hadoop.hbase.client.HTablePool.newHTable(String)
          Use createHTable 
org.apache.hadoop.hbase.filter.RowFilterInterface.rowProcessed(boolean, byte[])
          Use RowFilterInterface.rowProcessed(boolean, byte[], int, int) instead. 
org.apache.hadoop.hbase.filter.RegExpRowFilter.setColumnFilter(byte[], byte[])
          Column filtering has been replaced by ColumnValueFilter. Specify a value that must be matched for the given column. 
org.apache.hadoop.hbase.filter.RegExpRowFilter.setColumnFilters(Map)
          Column filtering has been replaced by ColumnValueFilter. Set column filters for a number of columns. 
org.apache.hadoop.hbase.client.Scan.setOldFilter(RowFilterInterface)
            
 
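The bulk of the method deprecations above follow one pattern: the many typed HTable overloads (get/getRow, deleteAll/deleteFamily, getScanner, checkAndSave) collapse into single methods taking a Get, Delete, Scan, or Put. A combined migration sketch follows; all table, row, family, and qualifier names are illustrative assumptions, and the code needs a running 0.20 cluster.

```java
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Delete;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

public class HTableMigration {
  public static void main(String[] args) throws Exception {
    HTable table = new HTable(new HBaseConfiguration(), "mytable");
    byte[] row = Bytes.toBytes("row1");
    byte[] cf = Bytes.toBytes("cf");
    byte[] qual = Bytes.toBytes("qual");

    // HTable.get(byte[], byte[]) / getRow(...) -> HTable.get(Get)
    Get get = new Get(row);
    get.addColumn(cf, qual);
    Result result = table.get(get);
    byte[] value = result.getValue(cf, qual);

    // HTable.deleteAll(...) / deleteFamily(...) -> HTable.delete(Delete)
    Delete delete = new Delete(row);
    delete.deleteColumns(cf, qual); // all versions of one column
    delete.deleteFamily(cf);        // or an entire family
    table.delete(delete);

    // HTable.getScanner(byte[][], ...) -> HTable.getScanner(Scan)
    Scan scan = new Scan(Bytes.toBytes("startRow"), Bytes.toBytes("stopRow"));
    scan.addColumn(cf, qual);
    ResultScanner scanner = table.getScanner(scan);
    try {
      for (Result r : scanner) {
        // process each Result
      }
    } finally {
      scanner.close();
    }

    // HTable.checkAndSave(...) -> HTable.checkAndPut(...)
    Put put = new Put(row);
    put.add(cf, qual, Bytes.toBytes("newValue"));
    boolean updated = table.checkAndPut(row, cf, qual, value, put);
  }
}
```

The timestamp, version-count, and RowLock variants of the old overloads map onto setters on Get/Delete/Scan (for example setTimeStamp and setMaxVersions) rather than extra parameters.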

Deprecated Constructors
org.apache.hadoop.hbase.filter.RegExpRowFilter(String, Map)
          Column filtering has been replaced by ColumnValueFilter. Constructor that takes a row key regular expression to filter on. 
 
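For the row-key-regex half of the deprecated RegExpRowFilter(String, Map) constructor, a RowFilter paired with a RegexStringComparator from the new filter package can serve as a replacement. A hedged sketch, assuming the regex and names below (they are illustrative, not from this listing):

```java
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.CompareFilter;
import org.apache.hadoop.hbase.filter.RegexStringComparator;
import org.apache.hadoop.hbase.filter.RowFilter;

public class RegexMigration {
  public static void main(String[] args) {
    // Old style: new RegExpRowFilter("row-[0-9]+", columnFilters)
    // New style: match row keys against a regular expression with a
    // RowFilter; column-value checks move to ColumnValueFilter.
    RowFilter filter = new RowFilter(CompareFilter.CompareOp.EQUAL,
        new RegexStringComparator("row-[0-9]+"));
    Scan scan = new Scan();
    scan.setFilter(filter);
  }
}
```

The column-filter Map argument has no single equivalent; per-column value checks are expressed as separate filters, as the deprecation notes above indicate.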



Copyright © 2009 The Apache Software Foundation