MAPREDUCE-5943. Separate mapred commands from CommandManual.apt.vm (Akira AJISAKA via aw)

git-svn-id: https://svn.apache.org/repos/asf/hadoop/common/branches/branch-2@1617807 13f79535-47bb-0310-9956-ffa450edef68
This commit is contained in:
Allen Wittenauer 2014-08-13 19:36:08 +00:00
parent 90d4cf83cc
commit 93e518be3a
4 changed files with 242 additions and 117 deletions


@ -81,36 +81,15 @@ User Commands
* <<<archive>>>
Creates a hadoop archive. More information can be found at Hadoop
Archives.
Usage: <<<hadoop archive -archiveName NAME <src>* <dest> >>>
*-------------------+-------------------------------------------------------+
||COMMAND_OPTION || Description
*-------------------+-------------------------------------------------------+
| -archiveName NAME | Name of the archive to be created.
*-------------------+-------------------------------------------------------+
| src | Filesystem pathnames which work as usual with regular
| expressions.
*-------------------+-------------------------------------------------------+
| dest | Destination directory which would contain the archive.
*-------------------+-------------------------------------------------------+
Creates a hadoop archive. More information can be found at
{{{../../hadoop-mapreduce-client/hadoop-mapreduce-client-core/HadoopArchives.html}
Hadoop Archives Guide}}.
* <<<distcp>>>
Copies files or directories recursively. More information can be found at
Hadoop DistCp Guide.
Usage: <<<hadoop distcp <srcurl> <desturl> >>>
*-------------------+--------------------------------------------+
||COMMAND_OPTION || Description
*-------------------+--------------------------------------------+
| srcurl | Source Url
*-------------------+--------------------------------------------+
| desturl | Destination Url
*-------------------+--------------------------------------------+
{{{../../hadoop-mapreduce-client/hadoop-mapreduce-client-core/DistCp.html}
Hadoop DistCp Guide}}.
* <<<fs>>>
@ -142,103 +121,21 @@ User Commands
* <<<job>>>
Command to interact with MapReduce jobs.
Usage: <<<hadoop job [GENERIC_OPTIONS] [-submit <job-file>] | [-status <job-id>] | [-counter <job-id> <group-name> <counter-name>] | [-kill <job-id>] | [-events <job-id> <from-event-#> <#-of-events>] | [-history [all] <jobOutputDir>] | [-list [all]] | [-kill-task <task-id>] | [-fail-task <task-id>] | [-set-priority <job-id> <priority>]>>>
*------------------------------+---------------------------------------------+
|| COMMAND_OPTION || Description
*------------------------------+---------------------------------------------+
| -submit <job-file> | Submits the job.
*------------------------------+---------------------------------------------+
| -status <job-id> | Prints the map and reduce completion
| percentage and all job counters.
*------------------------------+---------------------------------------------+
| -counter <job-id> <group-name> <counter-name> | Prints the counter value.
*------------------------------+---------------------------------------------+
| -kill <job-id> | Kills the job.
*------------------------------+---------------------------------------------+
| -events <job-id> <from-event-#> <#-of-events> | Prints the events' details
| received by jobtracker for the given range.
*------------------------------+---------------------------------------------+
| -history [all] <jobOutputDir> | Prints job details, failed and killed tip
| details. More details about the job such as
| successful tasks and task attempts made for
| each task can be viewed by specifying the [all]
| option.
*------------------------------+---------------------------------------------+
| -list [all] | Displays jobs which are yet to complete.
| <<<-list all>>> displays all jobs.
*------------------------------+---------------------------------------------+
| -kill-task <task-id> | Kills the task. Killed tasks are NOT counted
| against failed attempts.
*------------------------------+---------------------------------------------+
| -fail-task <task-id> | Fails the task. Failed tasks are counted
| against failed attempts.
*------------------------------+---------------------------------------------+
| -set-priority <job-id> <priority> | Changes the priority of the job. Allowed
| priority values are VERY_HIGH, HIGH, NORMAL,
| LOW, VERY_LOW
*------------------------------+---------------------------------------------+
Deprecated. Use
{{{../../hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapredCommands.html#job}
<<<mapred job>>>}} instead.
* <<<pipes>>>
Runs a pipes job.
Usage: <<<hadoop pipes [-conf <path>] [-jobconf <key=value>, <key=value>,
...] [-input <path>] [-output <path>] [-jar <jar file>] [-inputformat
<class>] [-map <class>] [-partitioner <class>] [-reduce <class>] [-writer
<class>] [-program <executable>] [-reduces <num>]>>>
*----------------------------------------+------------------------------------+
|| COMMAND_OPTION || Description
*----------------------------------------+------------------------------------+
| -conf <path> | Configuration for job
*----------------------------------------+------------------------------------+
| -jobconf <key=value>, <key=value>, ... | Add/override configuration for job
*----------------------------------------+------------------------------------+
| -input <path> | Input directory
*----------------------------------------+------------------------------------+
| -output <path> | Output directory
*----------------------------------------+------------------------------------+
| -jar <jar file> | Jar filename
*----------------------------------------+------------------------------------+
| -inputformat <class> | InputFormat class
*----------------------------------------+------------------------------------+
| -map <class> | Java Map class
*----------------------------------------+------------------------------------+
| -partitioner <class> | Java Partitioner
*----------------------------------------+------------------------------------+
| -reduce <class> | Java Reduce class
*----------------------------------------+------------------------------------+
| -writer <class> | Java RecordWriter
*----------------------------------------+------------------------------------+
| -program <executable> | Executable URI
*----------------------------------------+------------------------------------+
| -reduces <num> | Number of reduces
*----------------------------------------+------------------------------------+
Deprecated. Use
{{{../../hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapredCommands.html#pipes}
<<<mapred pipes>>>}} instead.
* <<<queue>>>
Command to interact with and view Job Queue information.
Usage: <<<hadoop queue [-list] | [-info <job-queue-name> [-showJobs]] | [-showacls]>>>
*-----------------+-----------------------------------------------------------+
|| COMMAND_OPTION || Description
*-----------------+-----------------------------------------------------------+
| -list | Gets the list of Job Queues configured in the system, along
| with the scheduling information associated with the job queues.
*-----------------+-----------------------------------------------------------+
| -info <job-queue-name> [-showJobs] | Displays the job queue information and
| associated scheduling information of particular job queue.
| If the <<<-showJobs>>> option is present, a list of jobs
| submitted to the particular job queue is displayed.
*-----------------+-----------------------------------------------------------+
| -showacls | Displays the queue name and associated queue operations
| allowed for the current user. The list consists of only
| those queues to which the user has access.
*-----------------+-----------------------------------------------------------+
Deprecated. Use
{{{../../hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapredCommands.html#queue}
<<<mapred queue>>>}} instead.
* <<<version>>>


@ -44,6 +44,9 @@ Release 2.6.0 - UNRELEASED
MAPREDUCE-5944. Remove MRv1 commands from CommandsManual.apt.vm
(Akira AJISAKA via aw)
MAPREDUCE-5943. Separate mapred commands from CommandManual.apt.vm
(Akira AJISAKA via aw)
Release 2.5.0 - UNRELEASED
INCOMPATIBLE CHANGES


@ -0,0 +1,224 @@
~~ Licensed to the Apache Software Foundation (ASF) under one or more
~~ contributor license agreements. See the NOTICE file distributed with
~~ this work for additional information regarding copyright ownership.
~~ The ASF licenses this file to You under the Apache License, Version 2.0
~~ (the "License"); you may not use this file except in compliance with
~~ the License. You may obtain a copy of the License at
~~
~~ http://www.apache.org/licenses/LICENSE-2.0
~~
~~ Unless required by applicable law or agreed to in writing, software
~~ distributed under the License is distributed on an "AS IS" BASIS,
~~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~~ See the License for the specific language governing permissions and
~~ limitations under the License.
---
MapReduce Commands Guide
---
---
${maven.build.timestamp}
MapReduce Commands Guide
%{toc|section=1|fromDepth=2|toDepth=4}
* Overview
MapReduce commands are invoked by the <<<bin/mapred>>> script. Running the
script without any arguments prints the description for all commands.
Usage: <<<mapred [--config confdir] COMMAND>>>
MapReduce has an option parsing framework that supports parsing generic
options as well as running classes.
*-------------------------+---------------------------------------------------+
|| COMMAND_OPTIONS || Description |
*-------------------------+---------------------------------------------------+
| --config confdir | Overrides the default configuration directory. Default
| | is $\{HADOOP_PREFIX\}/conf.
*-------------------------+---------------------------------------------------+
| COMMAND COMMAND_OPTIONS | Various commands with their options are described
| | in the following sections. The commands have been
| | grouped into {{User Commands}} and
| | {{Administration Commands}}.
*-------------------------+---------------------------------------------------+
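For illustration only, a hypothetical invocation that points the client at an
alternate configuration directory before running a subcommand (the directory
path is an assumption, not a documented default):

  <<<mapred --config /etc/hadoop/conf job -list>>>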
* User Commands
Commands useful for users of a hadoop cluster.
** <<<pipes>>>
Runs a pipes job.
Usage: <<<mapred pipes [-conf <path>] [-jobconf <key=value>, <key=value>,
...] [-input <path>] [-output <path>] [-jar <jar file>] [-inputformat
<class>] [-map <class>] [-partitioner <class>] [-reduce <class>] [-writer
<class>] [-program <executable>] [-reduces <num>]>>>
*----------------------------------------+------------------------------------+
|| COMMAND_OPTION || Description
*----------------------------------------+------------------------------------+
| -conf <path> | Configuration for job
*----------------------------------------+------------------------------------+
| -jobconf <key=value>, <key=value>, ... | Add/override configuration for job
*----------------------------------------+------------------------------------+
| -input <path> | Input directory
*----------------------------------------+------------------------------------+
| -output <path> | Output directory
*----------------------------------------+------------------------------------+
| -jar <jar file> | Jar filename
*----------------------------------------+------------------------------------+
| -inputformat <class> | InputFormat class
*----------------------------------------+------------------------------------+
| -map <class> | Java Map class
*----------------------------------------+------------------------------------+
| -partitioner <class> | Java Partitioner
*----------------------------------------+------------------------------------+
| -reduce <class> | Java Reduce class
*----------------------------------------+------------------------------------+
| -writer <class> | Java RecordWriter
*----------------------------------------+------------------------------------+
| -program <executable> | Executable URI
*----------------------------------------+------------------------------------+
| -reduces <num> | Number of reduces
*----------------------------------------+------------------------------------+
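As an illustrative sketch only, a pipes submission might look like the
following; the configuration file, input and output directories, and
executable path are hypothetical:

  <<<mapred pipes -conf wordcount.xml -input /user/alice/in -output /user/alice/out -program bin/wordcount>>>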
** <<<job>>>
Command to interact with MapReduce jobs.
Usage: <<<mapred job
| [{{{../../hadoop-project-dist/hadoop-common/CommandsManual.html#Generic_Options}GENERIC_OPTIONS}}]
| [-submit <job-file>]
| [-status <job-id>]
| [-counter <job-id> <group-name> <counter-name>]
| [-kill <job-id>]
| [-events <job-id> <from-event-#> <#-of-events>]
| [-history [all] <jobOutputDir>] | [-list [all]]
| [-kill-task <task-id>] | [-fail-task <task-id>]
| [-set-priority <job-id> <priority>]>>>
*------------------------------+---------------------------------------------+
|| COMMAND_OPTION || Description
*------------------------------+---------------------------------------------+
| -submit <job-file> | Submits the job.
*------------------------------+---------------------------------------------+
| -status <job-id> | Prints the map and reduce completion
| percentage and all job counters.
*------------------------------+---------------------------------------------+
| -counter <job-id> <group-name> <counter-name> | Prints the counter value.
*------------------------------+---------------------------------------------+
| -kill <job-id> | Kills the job.
*------------------------------+---------------------------------------------+
| -events <job-id> <from-event-#> <#-of-events> | Prints the events' details
| received by jobtracker for the given range.
*------------------------------+---------------------------------------------+
| -history [all] <jobOutputDir> | Prints job details, failed and killed tip
| details. More details about the job such as
| successful tasks and task attempts made for
| each task can be viewed by specifying the
| [all] option.
*------------------------------+---------------------------------------------+
| -list [all] | Displays jobs which are yet to complete.
| <<<-list all>>> displays all jobs.
*------------------------------+---------------------------------------------+
| -kill-task <task-id> | Kills the task. Killed tasks are NOT counted
| against failed attempts.
*------------------------------+---------------------------------------------+
| -fail-task <task-id> | Fails the task. Failed tasks are counted
| against failed attempts.
*------------------------------+---------------------------------------------+
| -set-priority <job-id> <priority> | Changes the priority of the job. Allowed
| priority values are VERY_HIGH, HIGH, NORMAL,
| LOW, VERY_LOW
*------------------------------+---------------------------------------------+
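For example, to list every job and then query one of them (the job id below
is an illustrative placeholder, not real output):

  <<<mapred job -list all>>>

  <<<mapred job -status job_1400000000000_0001>>>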
** <<<queue>>>
Command to interact with and view Job Queue information.
Usage: <<<mapred queue [-list] | [-info <job-queue-name> [-showJobs]]
| [-showacls]>>>
*-----------------+-----------------------------------------------------------+
|| COMMAND_OPTION || Description
*-----------------+-----------------------------------------------------------+
| -list | Gets the list of Job Queues configured in the system, along
| with the scheduling information associated with the job
| queues.
*-----------------+-----------------------------------------------------------+
| -info <job-queue-name> [-showJobs] | Displays the job queue information and
| associated scheduling information of particular job queue.
| If the <<<-showJobs>>> option is present, a list of jobs
| submitted to the particular job queue is displayed.
*-----------------+-----------------------------------------------------------+
| -showacls | Displays the queue name and associated queue operations
| allowed for the current user. The list consists of only
| those queues to which the user has access.
*-----------------+-----------------------------------------------------------+
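For example, to display the scheduling information and submitted jobs of a
single queue (the queue name <<<default>>> is assumed here for illustration):

  <<<mapred queue -info default -showJobs>>>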
** <<<classpath>>>
Prints the class path needed to get the Hadoop jar and the required
libraries.
Usage: <<<mapred classpath>>>
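For example, the printed value can be captured to set the classpath for a
client JVM; this shell snippet is a sketch, not part of the command itself:

  <<<export CLASSPATH=$(mapred classpath)>>>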
** <<<distcp>>>
Copies files or directories recursively. More information can be found at
{{{./DistCp.html}Hadoop DistCp Guide}}.
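A minimal illustrative copy between two clusters; the NameNode URIs and paths
are placeholders:

  <<<mapred distcp hdfs://nn1:8020/foo/bar hdfs://nn2:8020/bar/foo>>>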
** <<<archive>>>
Creates a hadoop archive. More information can be found at
{{{./HadoopArchives.html}Hadoop Archives Guide}}.
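An illustrative invocation, following the form described in the Hadoop
Archives Guide; the archive name and paths are placeholders:

  <<<mapred archive -archiveName foo.har -p /user/hadoop dir1 dir2 /user/hadoop/archives>>>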
* Administration Commands
Commands useful for administrators of a hadoop cluster.
** <<<historyserver>>>
Starts the JobHistoryServer.
Usage: <<<mapred historyserver>>>
** <<<hsadmin>>>
Runs a MapReduce hsadmin client to execute JobHistoryServer administrative
commands.
Usage: <<<mapred hsadmin
[-refreshUserToGroupsMappings] |
[-refreshSuperUserGroupsConfiguration] |
[-refreshAdminAcls] |
[-refreshLoadedJobCache] |
[-refreshLogRetentionSettings] |
[-refreshJobRetentionSettings] |
[-getGroups [username]] | [-help [cmd]]>>>
*-----------------+-----------------------------------------------------------+
|| COMMAND_OPTION || Description
*-----------------+-----------------------------------------------------------+
| -refreshUserToGroupsMappings | Refresh user-to-groups mappings
*-----------------+-----------------------------------------------------------+
| -refreshSuperUserGroupsConfiguration | Refresh superuser proxy groups mappings
*-----------------+-----------------------------------------------------------+
| -refreshAdminAcls | Refresh acls for administration of Job history server
*-----------------+-----------------------------------------------------------+
| -refreshLoadedJobCache | Refresh loaded job cache of Job history server
*-----------------+-----------------------------------------------------------+
| -refreshJobRetentionSettings | Refresh job history period, job cleaner settings
*-----------------+-----------------------------------------------------------+
| -refreshLogRetentionSettings | Refresh log retention period and log retention
| | check interval
*-----------------+-----------------------------------------------------------+
| -getGroups [username] | Get the groups which given user belongs to
*-----------------+-----------------------------------------------------------+
| -help [cmd] | Displays help for the given command or all commands if none is
| | specified.
*-----------------+-----------------------------------------------------------+
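For example, to reload the log retention settings without restarting the
server, or to check group resolution for a user (the username is a
placeholder):

  <<<mapred hsadmin -refreshLogRetentionSettings>>>

  <<<mapred hsadmin -getGroups alice>>>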


@ -94,6 +94,7 @@
<menu name="MapReduce" inherit="top">
<item name="MapReduce Tutorial" href="hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduceTutorial.html"/>
<item name="MapReduce Commands Reference" href="hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapredCommands.html"/>
<item name="Compatibilty between Hadoop 1.x and Hadoop 2.x" href="hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduce_Compatibility_Hadoop1_Hadoop2.html"/>
<item name="Encrypted Shuffle" href="hadoop-mapreduce-client/hadoop-mapreduce-client-core/EncryptedShuffle.html"/>
<item name="Pluggable Shuffle/Sort" href="hadoop-mapreduce-client/hadoop-mapreduce-client-core/PluggableShuffleAndPluggableSort.html"/>