hbase/hbase-examples
Nick Dimiduk 8556e2598e HBASE-12728 buffered writes substantially less useful after removal of HTablePool (Solomon Duskis and Nick Dimiduk)
In our pre-1.0 API, HTable is considered a light-weight object that is
consumed by a single thread at a time. The HTablePool class provided a means
of sharing multiple HTable instances across a number of threads. As an
optimization, HTable managed a "write buffer", accumulating edits and sending
a "batch" all at once. By default the batch was sent as the last step in
invocations of put(Put) and put(List<Put>). The user could disable this
automatic flushing of the write buffer, retaining edits locally and only
sending the whole "batch" once the write buffer had filled or when the
flushCommits() method was invoked explicitly. Explicit or implicit batch
writing was controlled by the setAutoFlushTo(boolean) method. A value of true
(the default) caused the write buffer to be flushed at the completion of a
call to put(Put) or put(List<Put>). A value of false allowed for explicit
buffer management. HTable also exposed the buffer to consumers via
getWriteBuffer().
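
For illustration only, a minimal sketch of that pre-1.0 pattern; the class
name, table name, and "f:q" column below are placeholders:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class BufferedPutsOldApi {
      public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "example_table");
        table.setAutoFlushTo(false);          // keep edits in the local write buffer
        try {
          for (int i = 0; i < 1000; i++) {
            Put put = new Put(Bytes.toBytes("row-" + i));
            put.add(Bytes.toBytes("f"), Bytes.toBytes("q"), Bytes.toBytes(i));
            table.put(put);                   // buffered locally until flushed
          }
          table.flushCommits();               // send the accumulated batch explicitly
        } finally {
          table.close();
        }
      }
    }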

The combination of HTable with setAutoFlushTo(false) and the HTablePool
provided a convenient mechanism by which multiple "Put-producing" threads
could share a common write buffer. Both HTablePool and HTable are deprecated,
and they are officially replaced in the new 1.0 API by Table and
BufferedMutator. Table, which replaces HTable, no longer exposes explicit
write-buffer management. Instead, explicit buffer management is exposed via
BufferedMutator. BufferedMutator is safe for concurrent use. Where code would
previously retrieve and return HTables from an HTablePool, that code now
creates and shares a single BufferedMutator instance across all threads.
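
For illustration only, a minimal sketch of the corresponding 1.0-style usage,
with several producer threads sharing one BufferedMutator; the class name,
thread count, table name, and column below are placeholders:

    import java.io.IOException;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.BufferedMutator;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class SharedBufferedMutator {
      public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             BufferedMutator mutator =
                 connection.getBufferedMutator(TableName.valueOf("example_table"))) {
          ExecutorService pool = Executors.newFixedThreadPool(4);
          for (int t = 0; t < 4; t++) {
            final int id = t;
            pool.submit(() -> {
              for (int i = 0; i < 1000; i++) {
                Put put = new Put(Bytes.toBytes("row-" + id + "-" + i));
                put.addColumn(Bytes.toBytes("f"), Bytes.toBytes("q"), Bytes.toBytes(i));
                try {
                  mutator.mutate(put);        // safe to call from multiple threads
                } catch (IOException e) {
                  throw new RuntimeException(e);
                }
              }
            });
          }
          pool.shutdown();
          pool.awaitTermination(5, TimeUnit.MINUTES);
          mutator.flush();                    // push anything still in the buffer
        }
      }
    }

A BufferedMutatorParams instance can be passed to getBufferedMutator() instead
of a bare TableName to tune the write-buffer size shared by all the threads.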

Conflicts:
	hbase-client/src/main/java/org/apache/hadoop/hbase/client/HTable.java
	hbase-client/src/test/java/org/apache/hadoop/hbase/client/TestAsyncProcess.java
	hbase-it/src/test/java/org/apache/hadoop/hbase/test/IntegrationTestBigLinkedList.java
	hbase-it/src/test/java/org/apache/hadoop/hbase/test/IntegrationTestBigLinkedListWithVisibility.java
	hbase-it/src/test/java/org/apache/hadoop/hbase/test/IntegrationTestLoadAndVerify.java
	hbase-it/src/test/java/org/apache/hadoop/hbase/trace/IntegrationTestSendTraceRequests.java
	hbase-rest/src/test/java/org/apache/hadoop/hbase/rest/PerformanceEvaluation.java
	hbase-rest/src/test/java/org/apache/hadoop/hbase/rest/client/TestRemoteTable.java
	hbase-server/src/main/java/org/apache/hadoop/hbase/mapreduce/MultiTableOutputFormat.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/TestMultiVersions.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestClientPushback.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestCloneSnapshotFromClient.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestFromClientSide.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestMultiParallel.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestRestoreSnapshotFromClient.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestRpcControllerFactory.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/master/TestMaster.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestFSErrorsExposed.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestRegionServerMetrics.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestScannerWithBulkload.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/wal/TestLogRolling.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/replication/TestReplicationChangingPeerRegionservers.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/replication/TestReplicationWithTags.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/security/visibility/TestVisibilityLabelsReplication.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/snapshot/SnapshotTestingUtils.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/snapshot/TestFlushSnapshotFromClient.java
	hbase-server/src/test/java/org/apache/hadoop/hbase/snapshot/TestRestoreFlushSnapshotFromClient.java

README.txt

Example code.

* org.apache.hadoop.hbase.mapreduce.SampleUploader
    Demonstrates uploading data from text files (presumably stored in HDFS) to HBase.

* org.apache.hadoop.hbase.mapreduce.IndexBuilder
    Demonstrates map/reduce with a table as the source and other tables as the sink.
    You can generate sample data for this MR job via hbase-examples/src/main/ruby/index-builder-setup.rb.


* Thrift examples
    Sample clients of the HBase ThriftServer. They perform the same actions, implemented in
    C++, Java, Ruby, PHP, Perl, and Python. Pre-generated Thrift code for HBase is included
    so the examples can be compiled and run without Thrift installed.
    If desired, the code can be re-generated as follows:
    thrift --gen cpp --gen java --gen rb --gen py --gen php --gen perl \
        ${HBASE_ROOT}/hbase-server/src/main/resources/org/apache/hadoop/hbase/thrift/Hbase.thrift
    and placed back at the corresponding paths.

    Before you run any Thrift examples, you need a running HBase Thrift server.
    If you start one locally (bin/hbase thrift start), the default port is 9090.

    * Java: org.apache.hadoop.hbase.thrift.DemoClient (jar under lib/).
      1. Set up the classpath with all the necessary jars, for example:
        for f in `find . -name "libthrift-*.jar" -or -name "slf4j-*.jar" -or -name "log4j-*.jar"`; do
          HBASE_EXAMPLE_CLASSPATH=${HBASE_EXAMPLE_CLASSPATH}:$f;
        done
      2. If the HBase server is not secure, or authentication is not enabled for the Thrift server, execute:
      {java -cp hbase-examples-[VERSION].jar:${HBASE_EXAMPLE_CLASSPATH} org.apache.hadoop.hbase.thrift.DemoClient <host> <port>}
      3. If the HBase server is secure and authentication is enabled for the Thrift server, run kinit first, then execute:
      {java -cp hbase-examples-[VERSION].jar:${HBASE_EXAMPLE_CLASSPATH} org.apache.hadoop.hbase.thrift.DemoClient <host> <port> true}
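      A minimal, unauthenticated connection sketch against the thrift1 (Hbase.thrift)
      API follows; the class name is hypothetical, and it assumes the generated classes
      and libthrift are already on the classpath:
        import java.nio.ByteBuffer;
        import java.nio.charset.StandardCharsets;
        import org.apache.hadoop.hbase.thrift.generated.Hbase;
        import org.apache.thrift.protocol.TBinaryProtocol;
        import org.apache.thrift.protocol.TProtocol;
        import org.apache.thrift.transport.TSocket;
        import org.apache.thrift.transport.TTransport;

        public class ThriftListTables {
          public static void main(String[] args) throws Exception {
            String host = args.length > 0 ? args[0] : "localhost";
            int port = args.length > 1 ? Integer.parseInt(args[1]) : 9090;
            TTransport transport = new TSocket(host, port);
            transport.open();                 // connect to the running Thrift server
            TProtocol protocol = new TBinaryProtocol(transport, true, true);
            Hbase.Client client = new Hbase.Client(protocol);
            // Print the names of the tables visible through the Thrift gateway.
            for (ByteBuffer name : client.getTableNames()) {
              System.out.println(StandardCharsets.UTF_8.decode(name.duplicate()));
            }
            transport.close();
          }
        }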

    * Ruby: hbase-examples/src/main/ruby/DemoClient.rb
      1. Modify the import path in the file to point to {$THRIFT_HOME}/lib/rb/lib.
      2. Execute {ruby DemoClient.rb} (or {ruby DemoClient.rb <host> <port>}).

    * Python: hbase-examples/src/main/python/DemoClient.py
      1. Modify the added system path in the file to point to {$THRIFT_HOME}/lib/py/build/lib.[YOUR SYSTEM]
      2. Execute {python DemoClient.py <host> <port>}.

    * PHP: hbase-examples/src/main/php/DemoClient.php
      1. Modify the THRIFT_HOME path in the file to point to the actual {$THRIFT_HOME}.
      2. Execute {php DemoClient.php}.
      3. Starting with Thrift 0.9.0, if Thrift.php complains about files it cannot include, go to the Thrift root
        and copy the contents of php/lib/Thrift under lib/php/src. Thrift.php includes, from under the same root,
        both TStringUtils.php, which is only present in src/, and other files that are only present under lib/;
        copying brings them all under the same root (src/).
        If you know better about PHP and Thrift, please feel free to fix this.

    * Perl: hbase-examples/src/main/perl/DemoClient.pl
      1. Modify the "use lib" path in the file to point to {$THRIFT_HOME}/lib/perl/lib.
      2. Use CPAN to install the Bit::Vector and Class::Accessor modules if they are not present (see the Thrift Perl README if more modules are missing).
      3. Execute {perl DemoClient.pl}.

    * CPP: hbase-examples/src/main/cpp/DemoClient.cpp
      1. Make sure you have the Boost and Thrift C++ libraries; modify the Makefile if necessary.
        The most recent version of Thrift (0.9.0 as of this writing) can be downloaded from http://thrift.apache.org/download/.
        Boost can be found at http://www.boost.org/users/download/.
      2. Execute {make}.
      3. Execute {./DemoClient}.