HBASE-19023 Replace hbase-server with hbase-mapreduce for HBase and MapReduce chapter

RowCounter and other related HBase MapReduce classes were moved to the
hbase-mapreduce module by HBASE-18640, which left this chapter out of
date. This fix replaces hbase-server with hbase-mapreduce in the
affected commands.
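For example, the documented RowCounter invocation changes from the old
hbase-server jar to the new hbase-mapreduce jar (VERSION stands in for
the actual release; both forms are taken from the diff below):

[source,bash]
----
# Old command, before HBASE-18640 moved the MapReduce classes:
$ HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/lib/hbase-server-VERSION.jar rowcounter usertable
# Corrected command, using the hbase-mapreduce jar:
$ HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/lib/hbase-mapreduce-VERSION.jar rowcounter usertable
----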

This change also moves RowCounter_Counters.properties to the
hbase-mapreduce module.

JIRA https://issues.apache.org/jira/browse/HBASE-19023

Signed-off-by: tedyu <yuzhihong@gmail.com>
Authored by TAK LON WU on 2017-12-01 15:25:59 -08:00; committed by tedyu
parent b9f1f5a17c
commit b2f9b7bc19
5 changed files with 7 additions and 7 deletions


@@ -70,7 +70,7 @@ This example assumes you use a BASH-compatible shell.
 [source,bash]
 ----
-$ HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/lib/hbase-server-VERSION.jar rowcounter usertable
+$ HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/lib/hbase-mapreduce-VERSION.jar rowcounter usertable
 ----
 When the command runs, internally, the HBase JAR finds the dependencies it needs and adds them to the MapReduce job configuration.
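A quick, optional sanity check (a sketch that assumes the JDK `jar` tool is on the PATH and the lib/ layout shown above) to confirm which bundled jar now ships the RowCounter class:

[source,bash]
----
# List the jar's contents and look for the RowCounter class;
# it should appear under org/apache/hadoop/hbase/mapreduce/.
jar tf ${HBASE_HOME}/lib/hbase-mapreduce-VERSION.jar | grep RowCounter
----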
@@ -98,7 +98,7 @@ If this occurs, try modifying the command as follows, so that it uses the HBase
 [source,bash]
 ----
-$ HADOOP_CLASSPATH=${HBASE_BUILD_HOME}/hbase-server/target/hbase-server-VERSION-SNAPSHOT.jar:`${HBASE_BUILD_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_BUILD_HOME}/hbase-server/target/hbase-server-VERSION-SNAPSHOT.jar rowcounter usertable
+$ HADOOP_CLASSPATH=${HBASE_BUILD_HOME}/hbase-mapreduce/target/hbase-mapreduce-VERSION-SNAPSHOT.jar:`${HBASE_BUILD_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_BUILD_HOME}/hbase-mapreduce/target/hbase-mapreduce-VERSION-SNAPSHOT.jar rowcounter usertable
 ----
 ====
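If you need that SNAPSHOT jar in the first place, a minimal sketch (assuming a Maven build tree rooted at ${HBASE_BUILD_HOME}) that builds just the hbase-mapreduce module and its in-tree dependencies:

[source,bash]
----
cd ${HBASE_BUILD_HOME}
# -pl limits the build to the hbase-mapreduce module, -am also builds the
# modules it depends on; skip tests to get the jar quickly.
mvn -pl hbase-mapreduce -am -DskipTests package
ls hbase-mapreduce/target/hbase-mapreduce-*-SNAPSHOT.jar
----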
@@ -194,7 +194,7 @@ To learn about the bundled MapReduce jobs, run the following command.
 [source,bash]
 ----
-$ ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-server-VERSION.jar
+$ ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-mapreduce-VERSION.jar
 An example program must be given as the first argument.
 Valid program names are:
   copytable: Export a table from local cluster to peer cluster
@@ -210,7 +210,7 @@ To run one of the jobs, model your command after the following example.
 [source,bash]
 ----
-$ ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-server-VERSION.jar rowcounter myTable
+$ ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-mapreduce-VERSION.jar rowcounter myTable
 ----
 == HBase as a MapReduce Job Data Source and Data Sink
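As an alternative to exporting HADOOP_CLASSPATH yourself, the `hbase` launcher can run a bundled job class directly and assemble the classpath for you (a sketch, assuming ${HBASE_HOME}/bin is usable on the submitting host):

[source,bash]
----
# Run RowCounter through the hbase launcher; it resolves the HBase jars itself.
${HBASE_HOME}/bin/hbase org.apache.hadoop.hbase.mapreduce.RowCounter myTable
----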


@@ -551,7 +551,7 @@ For ImportTsv to use this input file, the command line needs to look like this:
 ----
-HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-server-VERSION.jar importtsv -Dimporttsv.columns=HBASE_ROW_KEY,d:c1,d:c2 -Dimporttsv.bulk.output=hdfs://storefileoutput datatsv hdfs://inputfile
+HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-mapreduce-VERSION.jar importtsv -Dimporttsv.columns=HBASE_ROW_KEY,d:c1,d:c2 -Dimporttsv.bulk.output=hdfs://storefileoutput datatsv hdfs://inputfile
 ----
 \... and in this example the first column is the rowkey, which is why the HBASE_ROW_KEY is used.
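For context, a minimal sketch of preparing the input and finishing the bulk load (it assumes an HDFS client on the PATH, the paths used above, and that the jar's bundled completebulkload program is available):

[source,bash]
----
# One tab-separated row matching -Dimporttsv.columns=HBASE_ROW_KEY,d:c1,d:c2:
# the first field is the row key, the next two land in d:c1 and d:c2.
printf 'row1\tvalue1\tvalue2\n' > datatsv.tsv
${HADOOP_HOME}/bin/hadoop fs -put datatsv.tsv hdfs://inputfile

# After the importtsv job above has written HFiles under hdfs://storefileoutput,
# load them into the datatsv table.
HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-mapreduce-VERSION.jar completebulkload hdfs://storefileoutput datatsv
----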
@@ -1432,7 +1432,7 @@ The `VerifyReplication` MapReduce job, which is included in HBase, performs a sy
 +
 [source,bash]
 ----
-$ HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` "${HADOOP_HOME}/bin/hadoop" jar "${HBASE_HOME}/hbase-server-VERSION.jar" verifyrep --starttime=<timestamp> --endtime=<timestamp> --families=<myFam> <ID> <tableName>
+$ HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` "${HADOOP_HOME}/bin/hadoop" jar "${HBASE_HOME}/hbase-mapreduce-VERSION.jar" verifyrep --starttime=<timestamp> --endtime=<timestamp> --families=<myFam> <ID> <tableName>
 ----
 +
 The `VerifyReplication` command prints out `GOODROWS` and `BADROWS` counters to indicate rows that did and did not replicate correctly.
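To pull those two counters out of the console output, a small sketch (assuming the job client prints its counters to the terminal, as it normally does):

[source,bash]
----
# Capture the job output and show only the replication verification counters.
HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` "${HADOOP_HOME}/bin/hadoop" jar "${HBASE_HOME}/hbase-mapreduce-VERSION.jar" verifyrep --starttime=<timestamp> --endtime=<timestamp> --families=<myFam> <ID> <tableName> 2>&1 | tee verifyrep.log
grep -E 'GOODROWS|BADROWS' verifyrep.log
----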


@@ -755,7 +755,7 @@ For example (substitute VERSION with your HBase version):
 [source,bourne]
 ----
-HADOOP_CLASSPATH=`hbase classpath` hadoop jar $HBASE_HOME/hbase-server-VERSION.jar rowcounter usertable
+HADOOP_CLASSPATH=`hbase classpath` hadoop jar $HBASE_HOME/hbase-mapreduce-VERSION.jar rowcounter usertable
 ----
 See http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/mapreduce/package-summary.html#classpath for more information on HBase MapReduce jobs and classpaths.
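When debugging classpath problems like the one this example addresses, it can help to inspect what the wrapper actually emits (a sketch; `hbase classpath` is the same helper used above, and `hbase mapredcp`, if your release ships it, prints the narrower MapReduce-only dependency set):

[source,bash]
----
# Show the full classpath the wrapper would export, one entry per line,
# and check that the hbase-mapreduce jar is on it.
hbase classpath | tr ':' '\n' | grep hbase-mapreduce
# If available in your release, the minimal MapReduce dependency set:
hbase mapredcp | tr ':' '\n'
----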