Jason van Zyl 2005-10-13 02:44:14 +00:00
parent 1481438ecd
commit a06a4da5ce
7 changed files with 402 additions and 586 deletions

View File

@ -1,13 +1,386 @@
------
Introduction to the Lifecycle
Introduction to the Build Lifecycle
------
Jason van Zyl
Brett Porter
------
12 October 2005
16 June 2005
------
Introduction to the Lifecycle
Introduction to the Build Lifecycle
* Build Lifecycle Basics
Maven 2.0 is based around the central concept of a build lifecycle. What this means is that the process for building
and distributing a particular artifact is clearly defined.
For the person building a project, this means that it is only necessary to learn a small set of commands to build any
Maven project, and the POM will ensure they get the results they desire.
The most common lifecycle phases that would be executed on a project are the following (a complete list of the lifecycle phases is given below):
* <<<validate>>> - validate the project is correct and all necessary information is available
* <<<compile>>> - compile the source code of the project
* <<<test>>> - test the compiled source code using a suitable unit testing framework. These tests should not
require the code be packaged or deployed
* <<<package>>> - take the compiled code and package it in its distributable format, such as a JAR.
* <<<integration-test>>> - process and deploy the package if necessary into an environment where integration tests
can be run
* <<<verify>>> - run any checks to verify the package is valid and meets quality criteria
* <<<install>>> - install the package into the local repository, for use as a dependency in other projects locally
* <<<deploy>>> - done in an integration or release environment, copies the final package to the remote repository
for sharing with other developers and projects.
Note that for each of these steps, all previous steps are always executed, so you only need to specify the last one
you desire on the command line. For example:
-------
m2 install
-------
This command will compile, test, package, verify and install the package into the local repository when run.
There are more phases in the lifecycle, which will be discussed in the following sections.
It should also be noted that the same command can be used in a multi-module scenario. For example:
------
m2 clean:clean install
------
This command will traverse into all of the subprojects and run <<<clean:clean>>>, then <<<install>>> (including all of
the prior steps).
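In such a multi-module build, the top-level project typically uses <<<pom>>> packaging and lists its modules. A minimal sketch (the coordinates and module names below are placeholders, not part of any real project):

----
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.mycompany.app</groupId>
  <artifactId>app-parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <!-- each module is a subdirectory containing its own POM -->
    <module>app-core</module>
    <module>app-webapp</module>
  </modules>
</project>
----

Running <<<m2 install>>> from this directory builds <<<app-core>>> and <<<app-webapp>>> in turn, applying the lifecycle to each.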
* Setting up your Project to Use the Build Lifecycle
The build lifecycle is simple enough to use, but when you are constructing a Maven build for a project, how do you go
about assigning tasks to each of those build phases?
** Packaging
The first, and most common way, is to set the <<<packaging>>> for your project. This defaults to <<<jar>>>, so whether
you have specifically done it or not, this has already happened. Each packaging contains a list of goals to bind to
a particular phase. For example, a JAR will add the following bindings to the lifecycle:
*------------------------------+---------------------------------------------------------------------------------------+
| <<<process-resources>>> | <<<resources:resources>>>
*------------------------------+---------------------------------------------------------------------------------------+
| <<<compile>>> | <<<compiler:compile>>>
*------------------------------+---------------------------------------------------------------------------------------+
| <<<process-test-resources>>> | <<<resources:testResources>>>
*------------------------------+---------------------------------------------------------------------------------------+
| <<<test-compile>>> | <<<compiler:testCompile>>>
*------------------------------+---------------------------------------------------------------------------------------+
| <<<test>>> | <<<surefire:test>>>
*------------------------------+---------------------------------------------------------------------------------------+
| <<<package>>> | <<<jar:jar>>>
*------------------------------+---------------------------------------------------------------------------------------+
| <<<install>>> | <<<install:install>>>
*------------------------------+---------------------------------------------------------------------------------------+
| <<<deploy>>> | <<<deploy:deploy>>>
*------------------------------+---------------------------------------------------------------------------------------+
This is an almost standard set of bindings; however, some packagings handle them differently. For example, a project
that is purely metadata (packaging <<<pom>>>) only binds the <<<install>>> and <<<deploy>>> phases.
Note that for some packaging types to be available, you may also need to include a particular plugin in your
<<<build>>> section (as described in the next section). One example of a plugin that requires this is the Plexus plugin,
which provides a <<<plexus-application>>> and <<<plexus-service>>> packaging.
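Declaring the packaging itself is a one-line addition to the POM; in this trivial sketch, <<<jar>>> simply restates the default:

----
...
<packaging>jar</packaging>
...
----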
** Plugins
The second way to add goals to phases is to configure plugins in your project. As you will see in later sections,
plugins contain information that indicates which lifecycle phase to bind each goal to. Note that adding the plugin on its own is not
enough information - you must also specify the goals you want to run as part of your build.
The goals that are configured will be added to the goals already bound to the lifecycle from the packaging selected.
If more than one goal is bound to a particular phase, the order used is that those from the packaging are executed
first, followed by those configured in the POM. Note that you can use the <<<executions>>> element to gain more
control over the order of particular goals.
For example, the Modello plugin always binds <<<modello:java>>> to the <<<generate-sources>>> phase. So to use the
Modello plugin and have it generate sources from a model and incorporate that into the build, you would add the
following to your POM in the <<<plugins>>> section of <<<build>>>:
----
...
<plugin>
<groupId>org.codehaus.modello</groupId>
<artifactId>modello-maven-plugin</artifactId>
<executions>
<execution>
<configuration>
<model>maven.mdo</model>
<modelVersion>4.0.0</modelVersion>
</configuration>
<goals>
<goal>java</goal>
</goals>
</execution>
</executions>
</plugin>
...
----
You might be wondering why that <<<executions>>> element is there. That is so that you can run the same goal multiple times
with different configuration if needed. Separate executions can also be given an ID so that during inheritance or the
application of profiles you can control whether goal configuration is merged or turned into an additional execution.
When multiple executions are given that match a particular phase, they are executed in the order specified in the POM,
with inherited executions running first.
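For instance, giving each execution its own ID might look like the following sketch; the IDs and the extra <<<xpp3-reader>>> goal are illustrative additions to the Modello example above, not something the plugin requires:

----
...
<executions>
  <execution>
    <!-- the ID lets inherited POMs or profiles merge with or override this execution -->
    <id>generate-model-classes</id>
    <configuration>
      <model>maven.mdo</model>
      <modelVersion>4.0.0</modelVersion>
    </configuration>
    <goals>
      <goal>java</goal>
    </goals>
  </execution>
  <execution>
    <id>generate-model-readers</id>
    <configuration>
      <model>maven.mdo</model>
      <modelVersion>4.0.0</modelVersion>
    </configuration>
    <goals>
      <goal>xpp3-reader</goal>
    </goals>
  </execution>
</executions>
...
----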
Now, in the case of <<<modello:java>>>, it only makes sense in the <<<generate-sources>>> phase. But some goals can be
used in more than one phase, and there may not be a sensible default. For those, you can specify the phase yourself.
For example, let's say you have a goal <<<touch:timestamp>>> that echoes the current time to a file, and you want it to
run in the <<<process-test-resources>>> phase to indicate when the tests were started. This would be configured like
so:
----
...
<plugin>
<groupId>com.mycompany.example</groupId>
<artifactId>touch-maven-plugin</artifactId>
<executions>
<execution>
<phase>process-test-resources</phase>
<configuration>
<file>${project.output.directory}/timestamp.txt</file>
</configuration>
<goals>
<goal>timestamp</goal>
</goals>
</execution>
</executions>
</plugin>
...
----
* Build Lifecycle Phase Reference
The following lists all build lifecycle phases, which are executed in the order given up to the point of the one specified.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<validate>>> | validate the project is correct and all necessary information is available.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<generate-sources>>> | generate any source code for inclusion in compilation.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<process-sources>>> | process the source code, for example to filter any values.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<generate-resources>>> | generate resources for inclusion in the package.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<process-resources>>> | copy and process the resources into the destination directory, ready for packaging.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<compile>>> | compile the source code of the project.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<process-classes>>> | post-process the generated files from compilation, for example to do bytecode enhancement on Java classes.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<generate-test-sources>>> | generate any test source code for inclusion in compilation.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<process-test-sources>>> | process the test source code, for example to filter any values.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<generate-test-resources>>> | create resources for testing.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<process-test-resources>>> | copy and process the resources into the test destination directory.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<test-compile>>>            | compile the test source code into the test destination directory.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<test>>> | run tests using a suitable unit testing framework. These tests should not require the code be packaged or deployed.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<package>>> | take the compiled code and package it in its distributable format, such as a JAR.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<integration-test>>> | process and deploy the package if necessary into an environment where integration tests can be run.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<verify>>> | run any checks to verify the package is valid and meets quality criteria.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<install>>> | install the package into the local repository, for use as a dependency in other projects locally.
*-------------------------------+--------------------------------------------------------------------------------------+
| <<<deploy>>> | done in an integration or release environment, copies the final package to the remote repository for sharing with other developers and projects.
*-------------------------------+--------------------------------------------------------------------------------------+
* How the Build Lifecycle Affects Plugin Developers
The build lifecycle ensures that plugin developers only need to make their individual goal (a "mojo") perform a single
task with a simple set of inputs and outputs, and then have that goal bound to the appropriate stage of the build.
Information is passed between goals only through the project object - for example by adding new compile source roots,
changing the location of the classes directory after processing, and so on.
There are 3 ways that a plugin can interact with the build lifecycle: by binding a mojo to a particular phase, by
specifying an alternate packaging and appropriate lifecycle bindings, or by forking a parallel lifecycle.
** Binding a Mojo to a Phase
If the mojo is participating in a part of the normal build, usually the plugin developer will bind that mojo to a
particular phase, using the following syntax in the mojo level declaration:
----
@phase generate-sources
----
<<Note:>> <Some plugin languages have different ways of specifying mojo level declarations.
Please refer to the specific plugin development documentation for more information.>
Once this is specified, it will automatically be registered when the goal is listed in the project POM, as described
previously.
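In a Java mojo, these declarations live in the class-level javadoc. A minimal sketch (the class name and <<<@goal>>> name are invented for illustration):

----
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;

/**
 * Generates sources from a model file.
 *
 * @goal generate
 * @phase generate-sources
 */
public class GenerateSourcesMojo
    extends AbstractMojo
{
    public void execute()
        throws MojoExecutionException
    {
        // the @phase tag above binds this goal to generate-sources by default
        getLog().info( "Generating sources..." );
    }
}
----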
** Specifying a New Packaging
If your plugin is intended to provide a unique artifact type, then you will need to provide not only the goal to
package it with, but also a mapping of how the default lifecycle should behave.
This is currently achieved by adding a Plexus descriptor to your plugin (or modifying it if it already exists).
This file is <<<META-INF/plexus/components.xml>>> in the plugin JAR. The following is an example of configuring a
set of phases for the Plexus plugin itself to register the <<<plexus-application>>> type:
----
<component-set>
<components>
<component>
<role>org.apache.maven.lifecycle.mapping.LifecycleMapping</role>
<role-hint>plexus-application</role-hint>
<implementation>org.apache.maven.lifecycle.mapping.DefaultLifecycleMapping</implementation>
<configuration>
<phases>
<process-resources>resources:resources</process-resources>
<compile>compiler:compile</compile>
<process-test-resources>resources:testResources</process-test-resources>
<test-compile>compiler:testCompile</test-compile>
<test>surefire:test</test>
<package>plexus:app</package>
<install>install:install</install>
<deploy>deploy:deploy</deploy>
</phases>
</configuration>
</component>
</components>
</component-set>
----
In this example, the <<<role-hint>>> is used to specify the packaging, and the <<<role>>> of
<<<org.apache.maven.lifecycle.mapping.LifecycleMapping>>> indicates this is a lifecycle mapping for that packaging.
<<<implementation>>> is required, and while you can provide your own, the default given above should suit the standard case.
The phases to bind are listed in the <<<configuration>>> element, and each phase listed can have one goal associated
with it for that particular packaging.
Once this is included in the JAR, the plugin needs to be added to the project to make the packaging available from
that project. In addition to listing the plugin, you must specify that it provides extensions:
----
...
<packaging>plexus-application</packaging>
...
<plugin>
<groupId>org.codehaus.plexus</groupId>
<artifactId>plexus-maven-plugin</artifactId>
<extensions>true</extensions>
</plugin>
...
----
Setting the extensions flag is also necessary if you provide custom artifact type handlers (closely related to
providing a packaging).
** Forking a Parallel Lifecycle
While lots of mojos will participate in the standard lifecycle, there are just as many that are used in other
scenarios. These are mojos that are executed standalone from the command line (such as <<<idea:idea>>>), or individual
reports in the site building process.
However, sometimes these goals require that a particular task has already been performed - for instance, the IDEA
plugin must ensure sources have been generated to properly construct its module files. If the goal were participating
in the lifecycle, it would easily do this by ensuring it occurred after the phase it depended on having run. Since
this isn't the case, it must have a way to first execute that task.
Additionally, even goals participating in the build lifecycle might need to perform a task with different parameters
from those already used, without the output affecting the current build (for example, running
<<<clover:check>>> to execute the tests against modified sources and fail the build if a certain coverage ratio is not achieved).
For these reasons, mojos are capable of forking a new lifecycle. The lifecycle will be a normal build lifecycle,
a clone of the one currently being used (including any additional bindings from the POM), executed up until the point
specified by the mojo.
For example, the <<<idea:idea>>> mojo specifies the following in the mojo level declarations to call the source
generation:
----
@execute phase="generate-sources"
----
But what happens if <<<generate-sources>>> has already been run in this build? In the current version of Maven, there
is no way to tell whether the previous execution used the same inputs and outputs as the current mojo requires, so the task
(and any preceding ones, if taken from the lifecycle) must be run again.
For this reason, if your plugin does any intensive work, it is important to first check whether it is
necessary to perform the tasks again, perhaps by using timestamp checking or a similar technique. As an example,
the compiler plugin only recompiles changed source files, so it can very efficiently be run multiple times in a build
if necessary.
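As a rough illustration of the timestamp technique (the class and method here are invented helpers, not part of any Maven API):

----
import java.io.File;

public final class BuildStaleness
{
    private BuildStaleness()
    {
    }

    /**
     * Returns true when the output is missing or older than the input,
     * meaning the expensive task needs to run again.
     */
    public static boolean isStale( File input, File output )
    {
        return !output.exists() || output.lastModified() < input.lastModified();
    }
}
----

A mojo would call <<<isStale( inputFile, outputFile )>>> at the start of <<<execute()>>> and return immediately when it is false.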
When the lifecycle is forked, the project object being used is also cloned. In this way, modifications made to the
project as part of the execution, such as the addition of a new source root, will not affect the original build.
When the lifecycle finishes executing and control is passed to the original mojo, it can access that project using
the expression <<<${executedProject}>>>. For example:
----
/**
* @parameter expression="${executedProject}"
*/
private MavenProject executedProject;
----
This project instance can be used by the mojo to obtain results, and to propagate any changes it sees fit into the
original build.
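For instance, the mojo might read the source roots produced by the forked build (a sketch that assumes the <<<executedProject>>> field from the previous example and runs inside <<<execute()>>>):

----
// iterate over the compile source roots that the forked lifecycle produced
for ( java.util.Iterator i = executedProject.getCompileSourceRoots().iterator(); i.hasNext(); )
{
    String sourceRoot = (String) i.next();
    getLog().info( "Forked build added source root: " + sourceRoot );
}
----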
Finally, when forking the new lifecycle, it is possible to augment it on top of the changes already made by the
packaging and the plugins in the POM.
For example, consider the Clover plugin. If <<<clover:check>>> were to be run from the command line, the plugin
would need to fork the lifecycle, executing the <<<test>>> phase. But, it would also need to add some configuration
and bind the <<<clover:compiler>>> goal to the <<<generate-sources>>> phase.
This can be achieved by including the following file as <<<META-INF/maven/lifecycle.xml>>> in the plugin JAR:
----
<lifecycles>
<lifecycle>
<id>clover</id>
<phases>
<phase>
<id>generate-sources</id>
<executions>
<execution>
<configuration>
<debug>true</debug>
</configuration>
<goals>
<goal>compiler</goal>
</goals>
</execution>
</executions>
</phase>
</phases>
</lifecycle>
</lifecycles>
----
Here, the <<<executions>>> element is present in a similar way to a plugin declaration in the POM. It can be used
to bind a goal one or more times to a particular phase, as well as to specify configuration. Note that any configuration
already given to that plugin in the POM outside a specific execution will also be applied.
The lifecycle ID given here (<<<clover>>>) can then be used in the mojo to specify what to overlay on the forked
lifecycle when executing it, using the following mojo level declaration:
----
@execute phase="test" lifecycle="clover"
----
For more information about plugin development in general, see the
{{{developers/plugin-overview.html} Developer's Section}}.
The content from the current lifecycle document will be incorporated here.
~~How do I make my own lifecycle

View File

@ -1,4 +1,11 @@
------
Guide to the APT Format
------
Jason van Zyl
------
12 October 2005
------
The APT format
~~~~~~~~~~~~~~

View File

@ -21,6 +21,8 @@ Documentation
* Mini Guides
* {{{mini/guide-apt-format.html}Guide to the APT Format}}
* {{{mini/guide-assemblies.html}Guide to Creating Assemblies}}
* {{{mini/guide-bash-m2-completion.html}Guide to Maven 2.x auto completion using BASH}}
@ -29,6 +31,8 @@ Documentation
* {{{mini/guide-coping-with-sun-jars.html}Guide to Coping with Sun JARs}}
* {{{mini/guide-creating-archetypes.html}Guide to Creating Archetypes}}
* {{{mini/guide-deploy-ftp.html}Guide to deploying with FTP}}
* {{{mini/guide-deployment-security-settings.html}Guide to Deployment and Security Settings}}
@ -55,6 +59,8 @@ Documentation
* {{{mini/guide-multiple-repositories.html}Guide to using Multiple Repositories}}
* {{{mini/guide-pom-properties.html}Guide to using POM properties}}
* {{{mini/guide-proxies.html}Guide to using proxies}}
* {{{mini/guide-test-customization.html}Guide to test customization}}
@ -70,9 +76,11 @@ Documentation
* {{{introduction/introduction-to-dependency-management.html}Introduction to Dependency Management}}
* {{{introduction/introduction-to-dependency-mechanism.html}Introduction to the Dependency Mechanism}}
* {{{introduction/introduction-to-repositories.html}Introduction to Repositories}}
* {{{introduction/introduction-to-the-lifecycle.html}Introduction to the Lifecycle}}
* {{{introduction/introduction-to-the-lifecycle.html}Introduction to the Build Lifecycle}}
* {{{introduction/introduction-to-the-pom.html}Introduction to the POM}}

View File

@ -1,386 +0,0 @@
------
Build Lifecycle
------
Brett Porter
------
16 June 2005
------
Build Lifecycle

View File

@ -18,7 +18,8 @@
<menu name="Installing">
<item name="Download" href="/download.html"/>
<item name="Install" href="/download.html#installation"/>
<item name="Configuration" href="/configuration.html"/>
<item name="Getting Started" href="/guides/getting-started/index.html"/>
<item name="Documentation" href="/guides/toc.html"/>
<item name="Release Notes" href="/release-notes.html"/>
</menu>
@ -30,6 +31,8 @@
<item name="Road Map" href="/roadmap.html"/>
<item name="Powered By" href="/powered-by-m2.html"/>
</menu>
<!--
<menu name="User's Guide">
<item name="Getting Started" href="/getting-started.html"/>
<item name="Build Lifecycle" href="/lifecycle.html"/>
@ -37,6 +40,8 @@
<item name="Dependency Mechanism" href="/dependency-mechanism.html"/>
<item name="Creating a Site" href="/site.html"/>
</menu>
-->
<menu name="Plugin Developers">
<item name="Plugin Development Guide" href="/developers/plugin-development-guide.html"/>
</menu>

View File

@ -1,191 +0,0 @@
<document>
<properties>
<title>Configuring Maven</title>
<author email="brett@apache.org">Brett Porter</author>
</properties>
<body>
<section name="Configuring Maven">
<p>
Maven configuration occurs at 3 levels:
</p>
<ul>
<li>
<i>Project</i>
- most static configuration occurs in
<code>pom.xml</code>
</li>
<li>
<i>Installation</i>
- this is configuration added once for a Maven installation
</li>
<li>
<i>User</i>
- this is configuration specific to a particular user
</li>
</ul>
<p>
The separation is quite clear - the project defines information that applies to the project, no matter who is
building it, while the others both define settings for the current environment.
</p>
<p>
<b>Note:</b>
the installation and user configuration cannot be used to add shared project information -
for example, setting
<code>&lt;organization&gt;</code>
or
<code>&lt;distributionManagement&gt;</code>
company-wide.
For this, you should have your projects inherit from a company-wide parent
<code>pom.xml</code>
.
<!-- TODO: versioning doc that discusses this -->
</p>
<p>
You can specify your user configuration in
<code>${user.home}/.m2/settings.xml</code>
. A
<a href="maven-settings/settings.html">full reference</a>
to the
configuration file is available. This section will show how to make some common configurations.
Note that the file is not required - defaults will be used if it is not found.
</p>
<h4>Configuring your Local Repository</h4>
<p>
The local repository is part of a profile in your user configuration. You can have multiple profiles, with one
set to active so that you can switch environments.
</p>
<source><![CDATA[
<settings>
.
.
<localRepository>/path/to/local/repo</localRepository>
.
.]]></source>
<p>
The local repository must be an absolute path.
</p>
<h4>Configuring a Proxy</h4>
<p>
You can configure a proxy to use for some or all of your HTTP requests in Maven 2.0. The username and
password are only required if your proxy requires basic authentication (note that later releases may support
storing your passwords in a secured keystore - in the meantime, please ensure your
<code>settings.xml</code>
file is secured with permissions appropriate for your operating system).
</p>
<p>
The
<code>nonProxyHosts</code>
setting accepts wild cards, and each host name is separated by a
<code>|</code>
character. This matches the
<a href="http://java.sun.com/j2se/1.4.2/docs/guide/net/properties.html">JDK configuration</a>
equivalent.
</p>
<source><![CDATA[
<settings>
.
.
<proxies>
<proxy>
<active>true</active>
<protocol>http</protocol>
<host>proxy.somewhere.com</host>
<port>8080</port>
<username>proxyuser</username>
<password>somepassword</password>
<nonProxyHosts>www.google.com|*.somewhere.com</nonProxyHosts>
</proxy>
</proxies>
.
.]]></source>
<h4>Security and Deployment Settings</h4>
<p>
Repositories to deploy to are defined in a project in the
<code>&lt;distributionManagement&gt;</code>
section.
However, you cannot put your username, password, or other security settings in that project. For that reason,
you should add a server definition to your own settings with an
<code>id</code>
that matches that of the
deployment repository in the project.
</p>
<p>
In addition, some repositories may require authorisation to download from, so the corresponding settings can
be specified in a
<code>server</code>
element in the same way.
</p>
<p>
Which settings are required will depend on the type of repository you are deploying to. As of the first release,
only SCP deployments and file deployments are supported by default, so only the following SCP configuration
is needed:
</p>
<source><![CDATA[
<settings>
.
.
<servers>
<server>
<id>repo1</id>
<username>repouser</username>
<!-- other optional elements:
<password>my_login_password</password>
<privateKey>/path/to/identity</privateKey> (default is ~/.ssh/id_dsa)
<passphrase>my_key_passphrase</passphrase>
-->
</server>
</servers>
.
.]]></source>
<h4>Using Mirrors for Repositories</h4>
<p>
Repositories are declared inside a project, which means that if you have your own custom repositories, those
sharing your project easily get the right settings out of the box. However, you may want to use an alternative
mirror for a particular repository without changing the project files.
</p>
<p>
Some reasons to use a mirror are:
</p>
<ul>
<li>There is a synchronized mirror on the internet that is geographically closer and faster</li>
<li>You want to replace a particular repository with your own internal repository which you have greater
control over</li>
<li>You want to run maven-proxy to provide a local cache to a mirror and need to use its URL instead</li>
</ul>
<p>
To configure a mirror of a given repository, you provide it in your settings file, giving the new repository
its own
<code>id</code>
and
<code>url</code>
, and specify the
<code>mirrorOf</code>
setting that is the ID of
the repository you are using a mirror of. For example, the ID of the main Maven repository included by default
is
<code>central</code>
, so to use an Australian mirror, you would configure the following:
</p>
<source><![CDATA[
<settings>
.
.
<mirrors>
<mirror>
<id>planetmirror</id>
<name>Australian Mirror of http://repo1.maven.org/maven2/</name>
<url>http://public.planetmirror.com/maven2/</url>
<mirrorOf>central</mirrorOf>
</mirror>
</mirrors>
.
.]]></source>
<p>
<i>Please note:</i>
this particular feature is not actually set up for Maven 2 yet, so this should be treated as an
example only.
</p>
</section>
</body>
</document>