Switch build system to Gradle

See #13930
Ryan Ernst 2015-10-29 11:40:19 -07:00
parent 1194cd0bdc
commit c86100f636
80 changed files with 4191 additions and 90 deletions

.gitignore vendored

@@ -9,7 +9,7 @@ logs/
.DS_Store
build/
target/
*-execution-hints.log
**/.local*
docs/html/
docs/build.log
/tmp/


@@ -76,9 +76,7 @@ Contributing to the Elasticsearch codebase
**Repository:** [https://github.com/elastic/elasticsearch](https://github.com/elastic/elasticsearch)
Make sure you have [Maven](http://maven.apache.org) installed, as Elasticsearch uses it as its build system. Integration with IntelliJ and Eclipse should work out of the box. Eclipse users can automatically configure their IDE by running `mvn eclipse:eclipse` and then importing the project into their workspace: `File > Import > Existing project into workspace` and make sure to select `Search for nested projects...` option as Elasticsearch is a multi-module maven project. Additionally you will want to ensure that Eclipse is using 2048m of heap by modifying `eclipse.ini` accordingly to avoid GC overhead errors. Please make sure the [m2e-connector](http://marketplace.eclipse.org/content/m2e-connector-maven-dependency-plugin) is not installed in your Eclipse distribution as it will interfere with setup performed by `mvn eclipse:eclipse`.
Elasticsearch also works perfectly with Eclipse's [m2e](http://www.eclipse.org/m2e/). Once you've installed m2e you can import Elasticsearch as an `Existing Maven Project`.
Make sure you have [Gradle](http://gradle.org) installed, as Elasticsearch uses it as its build system. Integration with IntelliJ and Eclipse should work out of the box. Eclipse users can automatically configure their IDE by running `gradle eclipse` and then importing the project into their workspace: `File > Import > Existing project into workspace` and make sure to select the `Search for nested projects...` option as Elasticsearch is a multi-module Gradle project. Additionally you will want to ensure that Eclipse is using 2048m of heap by modifying `eclipse.ini` accordingly to avoid GC overhead errors.
Please follow these formatting guidelines:
@@ -92,15 +90,15 @@ To create a distribution from the source, simply run:
```sh
cd elasticsearch/
mvn clean package -DskipTests
gradle assemble
```
You will find the newly built packages under: `./target/releases/`.
You will find the newly built packages under: `./distribution/build/distributions/`.
Before submitting your changes, run the test suite to make sure that nothing is broken, with:
```sh
mvn clean test -Dtests.slow=true
gradle check
```
Source: [Contributing to elasticsearch](https://www.elastic.co/contributing-to-elasticsearch/)


@@ -200,10 +200,9 @@ We have just covered a very small portion of what Elasticsearch is all about. Fo
h3. Building from Source
Elasticsearch uses "Maven":http://maven.apache.org for its build system.
Elasticsearch uses "Gradle":http://gradle.org for its build system. You'll need to have a modern version of Gradle installed - 2.6 should do.
In order to create a distribution, simply run the @mvn clean package
-DskipTests@ command in the cloned directory.
In order to create a distribution, simply run the @gradle build@ command in the cloned directory.
The distribution for each project will be created under the @target/releases@ directory in that project.


@@ -13,7 +13,7 @@ To create a distribution without running the tests, simply run the
following:
-----------------------------
mvn clean package -DskipTests
gradle assemble
-----------------------------
== Other test options
@@ -35,7 +35,7 @@ Use local transport (default since 1.3):
Alternatively, you can set the `ES_TEST_LOCAL` environment variable:
-------------------------------------
export ES_TEST_LOCAL=true && mvn test
export ES_TEST_LOCAL=true && gradle test
-------------------------------------
=== Running Elasticsearch from a checkout
@@ -55,20 +55,20 @@ run it using Maven:
Run a single test case (variants)
----------------------------------------------------------
mvn test -Dtests.class=org.elasticsearch.package.ClassName
mvn test "-Dtests.class=*.ClassName"
gradle test -Dtests.class=org.elasticsearch.package.ClassName
gradle test "-Dtests.class=*.ClassName"
----------------------------------------------------------
Run all tests in a package and sub-packages
----------------------------------------------------
mvn test "-Dtests.class=org.elasticsearch.package.*"
gradle test "-Dtests.class=org.elasticsearch.package.*"
----------------------------------------------------
Run any test methods that contain 'esi' (like: ...r*esi*ze...).
-------------------------------
mvn test "-Dtests.method=*esi*"
gradle test "-Dtests.method=*esi*"
-------------------------------
You can also filter tests by certain annotations ie:
@@ -81,7 +81,7 @@ You can also filter tests by certain annotations ie:
Those annotation names can be combined into a filter expression like:
------------------------------------------------
mvn test -Dtests.filter="@nightly and not @backwards"
gradle test -Dtests.filter="@nightly and not @backwards"
------------------------------------------------
to run all nightly tests but not the ones that are backwards tests. `tests.filter` supports
@@ -89,7 +89,7 @@ the boolean operators `and, or, not` and grouping ie:
---------------------------------------------------------------
mvn test -Dtests.filter="@nightly and not(@badapple or @backwards)"
gradle test -Dtests.filter="@nightly and not(@badapple or @backwards)"
---------------------------------------------------------------
=== Seed and repetitions.
@@ -97,7 +97,7 @@ mvn test -Dtests.filter="@nightly and not(@badapple or @backwards)"
Run with a given seed (seed is a hex-encoded long).
------------------------------
mvn test -Dtests.seed=DEADBEEF
gradle test -Dtests.seed=DEADBEEF
------------------------------
=== Repeats _all_ tests of ClassName N times.
@@ -106,7 +106,7 @@ Every test repetition will have a different method seed
(derived from a single random master seed).
--------------------------------------------------
mvn test -Dtests.iters=N -Dtests.class=*.ClassName
gradle test -Dtests.iters=N -Dtests.class=*.ClassName
--------------------------------------------------
=== Repeats _all_ tests of ClassName N times.
@@ -115,7 +115,7 @@ Every test repetition will have exactly the same master (0xdead) and
method-level (0xbeef) seed.
------------------------------------------------------------------------
mvn test -Dtests.iters=N -Dtests.class=*.ClassName -Dtests.seed=DEAD:BEEF
gradle test -Dtests.iters=N -Dtests.class=*.ClassName -Dtests.seed=DEAD:BEEF
------------------------------------------------------------------------
=== Repeats a given test N times
@@ -125,14 +125,14 @@ ie: testFoo[0], testFoo[1], etc... so using testmethod or tests.method
ending in a glob is necessary to ensure iterations are run).
-------------------------------------------------------------------------
mvn test -Dtests.iters=N -Dtests.class=*.ClassName -Dtests.method=mytest*
gradle test -Dtests.iters=N -Dtests.class=*.ClassName -Dtests.method=mytest*
-------------------------------------------------------------------------
Repeats N times but skips any tests after the first failure or M initial failures.
-------------------------------------------------------------
mvn test -Dtests.iters=N -Dtests.failfast=true -Dtestcase=...
mvn test -Dtests.iters=N -Dtests.maxfailures=M -Dtestcase=...
gradle test -Dtests.iters=N -Dtests.failfast=true -Dtestcase=...
gradle test -Dtests.iters=N -Dtests.maxfailures=M -Dtestcase=...
-------------------------------------------------------------
=== Test groups.
@@ -142,9 +142,9 @@ Test groups can be enabled or disabled (true/false).
Default value provided below in [brackets].
------------------------------------------------------------------
mvn test -Dtests.nightly=[false] - nightly test group (@Nightly)
mvn test -Dtests.weekly=[false] - weekly tests (@Weekly)
mvn test -Dtests.awaitsfix=[false] - known issue (@AwaitsFix)
gradle test -Dtests.nightly=[false] - nightly test group (@Nightly)
gradle test -Dtests.weekly=[false] - weekly tests (@Weekly)
gradle test -Dtests.awaitsfix=[false] - known issue (@AwaitsFix)
------------------------------------------------------------------
=== Load balancing and caches.
@@ -154,7 +154,7 @@ By default, the tests run sequentially on a single forked JVM.
To run with more forked JVMs than the default use:
----------------------------
mvn test -Dtests.jvms=8 test
gradle test -Dtests.jvms=8
----------------------------
Don't count hypercores for CPU-intense tests and leave some slack
@@ -167,7 +167,7 @@ It is possible to provide a version that allows to adapt the tests behaviour
to older features or bugs that have been changed or fixed in the meantime.
-----------------------------------------
mvn test -Dtests.compatibility=1.0.0
gradle test -Dtests.compatibility=1.0.0
-----------------------------------------
@@ -176,50 +176,50 @@ mvn test -Dtests.compatibility=1.0.0
Run all tests without stopping on errors (inspect log files).
-----------------------------------------
mvn test -Dtests.haltonfailure=false test
gradle test -Dtests.haltonfailure=false
-----------------------------------------
Run more verbose output (slave JVM parameters, etc.).
----------------------
mvn test -verbose test
gradle test -verbose
----------------------
Change the default suite timeout to 5 seconds for all
tests (note the exclamation mark).
---------------------------------------
mvn test -Dtests.timeoutSuite=5000! ...
gradle test -Dtests.timeoutSuite=5000! ...
---------------------------------------
Change the logging level of ES (not mvn)
Change the logging level of ES (not gradle)
--------------------------------
mvn test -Des.logger.level=DEBUG
gradle test -Des.logger.level=DEBUG
--------------------------------
Print all the logging output from the test runs to the commandline
even if tests are passing.
------------------------------
mvn test -Dtests.output=always
gradle test -Dtests.output=always
------------------------------
Configure the heap size.
------------------------------
mvn test -Dtests.heap.size=512m
gradle test -Dtests.heap.size=512m
------------------------------
Pass arbitrary jvm arguments.
------------------------------
# specify heap dump path
mvn test -Dtests.jvm.argline="-XX:HeapDumpPath=/path/to/heapdumps"
gradle test -Dtests.jvm.argline="-XX:HeapDumpPath=/path/to/heapdumps"
# enable gc logging
mvn test -Dtests.jvm.argline="-verbose:gc"
gradle test -Dtests.jvm.argline="-verbose:gc"
# enable security debugging
mvn test -Dtests.jvm.argline="-Djava.security.debug=access,failure"
gradle test -Dtests.jvm.argline="-Djava.security.debug=access,failure"
------------------------------
== Backwards Compatibility Tests
@@ -230,7 +230,7 @@ To run backwards compatibility tests untar or unzip a release and run the tests
with the following command:
---------------------------------------------------------------------------
mvn test -Dtests.filter="@backwards" -Dtests.bwc.version=x.y.z -Dtests.bwc.path=/path/to/elasticsearch -Dtests.security.manager=false
gradle test -Dtests.filter="@backwards" -Dtests.bwc.version=x.y.z -Dtests.bwc.path=/path/to/elasticsearch -Dtests.security.manager=false
---------------------------------------------------------------------------
Note that backwards tests must be run with security manager disabled.
@@ -238,7 +238,7 @@ If the elasticsearch release is placed under `./backwards/elasticsearch-x.y.z` t
can be omitted:
---------------------------------------------------------------------------
mvn test -Dtests.filter="@backwards" -Dtests.bwc.version=x.y.z -Dtests.security.manager=false
gradle test -Dtests.filter="@backwards" -Dtests.bwc.version=x.y.z -Dtests.security.manager=false
---------------------------------------------------------------------------
To setup the bwc test environment execute the following steps (provided you are
@@ -250,19 +250,25 @@ $ curl -O https://download.elasticsearch.org/elasticsearch/elasticsearch/elastic
$ tar -xzf elasticsearch-1.2.1.tar.gz
---------------------------------------------------------------------------
== Running integration tests
== Running verification tasks
To run the integration tests:
To run all verification tasks, including static checks, unit tests, and integration tests:
---------------------------------------------------------------------------
mvn verify
gradle check
---------------------------------------------------------------------------
Note that this will also run the unit tests first. If you want to just
run the integration tests only (because you are debugging them):
Note that this will also run the unit tests and precommit tasks first. If you want to just
run the integration tests (because you are debugging them):
---------------------------------------------------------------------------
mvn verify -Dskip.unit.tests
gradle integTest
---------------------------------------------------------------------------
If you want to just run the precommit checks:
---------------------------------------------------------------------------
gradle precommit
---------------------------------------------------------------------------
== Testing the REST layer
@@ -278,7 +284,7 @@ The REST tests are run automatically when executing the maven test command. To r
REST tests use the following command:
---------------------------------------------------------------------------
mvn verify -Dtests.filter="@Rest" -Dskip.unit.tests=true
gradle integTest -Dtests.filter="@Rest"
---------------------------------------------------------------------------
`RestNIT` are the executable test classes that runs all the
@@ -303,20 +309,6 @@ comma separated list of nodes to connect to (e.g. localhost:9300). A transport c
be created based on that and used for all the before|after test operations, and to extract
the http addresses of the nodes so that REST requests can be sent to them.
== Skip validate
To disable validation step (forbidden API or `// NOCOMMIT`) use
---------------------------------------------------------------------------
mvn test -Dvalidate.skip=true
---------------------------------------------------------------------------
You can also skip this by using the "dev" profile:
---------------------------------------------------------------------------
mvn test -Pdev
---------------------------------------------------------------------------
== Testing scripts
The simplest way to test scripts and the packaged distributions is to use
@@ -334,7 +326,7 @@ vagrant plugin install vagrant-cachier
. Validate your installed dependencies:
-------------------------------------
mvn -Dtests.vagrant -pl qa/vagrant validate
gradle :qa:vagrant:validate
-------------------------------------
. Download the VMs. Since Maven or ant or something eats the progress reports

build.gradle Normal file

@@ -0,0 +1,165 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import com.bmuschko.gradle.nexus.NexusPlugin
buildscript {
repositories {
mavenCentral()
}
dependencies {
classpath 'com.bmuschko:gradle-nexus-plugin:2.3.1'
}
}
// common maven publishing configuration
subprojects {
plugins.withType(NexusPlugin).whenPluginAdded {
modifyPom {
project {
url 'https://github.com/elastic/elasticsearch'
inceptionYear '2009'
scm {
url 'https://github.com/elastic/elasticsearch'
connection 'scm:https://elastic@github.com/elastic/elasticsearch'
developerConnection 'scm:git://github.com/elastic/elasticsearch.git'
}
licenses {
license {
name 'The Apache Software License, Version 2.0'
url 'http://www.apache.org/licenses/LICENSE-2.0.txt'
distribution 'repo'
}
}
}
}
extraArchive {
javadoc = false
tests = false
}
// we have our own username/password prompts so that they only happen once
// TODO: add gpg signing prompts
project.gradle.taskGraph.whenReady { taskGraph ->
if (taskGraph.allTasks.any { it.name == 'uploadArchives' }) {
Console console = System.console()
if (project.hasProperty('nexusUsername') == false) {
String nexusUsername = console.readLine('\nNexus username: ')
project.rootProject.allprojects.each {
it.ext.nexusUsername = nexusUsername
}
}
if (project.hasProperty('nexusPassword') == false) {
String nexusPassword = new String(console.readPassword('\nNexus password: '))
project.rootProject.allprojects.each {
it.ext.nexusPassword = nexusPassword
}
}
}
}
}
}
if (hasProperty('projectsPrefix') == false) {
allprojects {
project.ext['projectsPrefix'] = ''
}
}
allprojects {
// injecting groovy property variables into all projects
project.ext {
// minimum java 8
sourceCompatibility = JavaVersion.VERSION_1_8
targetCompatibility = sourceCompatibility
luceneSnapshotRevision = '1710880'
// dependency versions that are used in more than one place
versions = [
lucene: "5.4.0-snapshot-${luceneSnapshotRevision}",
randomizedrunner: '2.2.0',
httpclient: '4.3.6'
]
}
}
subprojects {
repositories {
mavenCentral()
maven {
name 'sonatype-snapshots'
url 'http://oss.sonatype.org/content/repositories/snapshots/'
}
maven {
name 'lucene-snapshots'
url "http://s3.amazonaws.com/download.elasticsearch.org/lucenesnapshots/${luceneSnapshotRevision}"
}
}
// include license and notice in jars
gradle.projectsEvaluated {
tasks.withType(Jar) {
into('META-INF') {
from project.rootProject.rootDir
include 'LICENSE.txt'
include 'NOTICE.txt'
}
}
}
configurations {
all {
resolutionStrategy {
//failOnVersionConflict()
dependencySubstitution {
substitute module("org.elasticsearch:rest-api-spec:${version}") with project("${projectsPrefix}:rest-api-spec")
substitute module("org.elasticsearch:elasticsearch:${version}") with project("${projectsPrefix}:core")
substitute module("org.elasticsearch:test-framework:${version}") with project("${projectsPrefix}:test-framework")
substitute module("org.elasticsearch.distribution.zip:elasticsearch:${version}") with project("${projectsPrefix}:distribution:zip")
}
}
}
}
}
// IDE configuration
allprojects {
apply plugin: 'idea'
apply plugin: 'eclipse'
// TODO: similar for intellij
eclipse {
classpath {
defaultOutputDir = new File(project.buildDir, 'eclipse')
}
}
}
idea {
if (project != null) {
// could be null, if this project is attached to another...
project {
languageLevel = sourceCompatibility
vcs = 'Git'
}
}
}

buildSrc/build.gradle Normal file

@@ -0,0 +1,61 @@
import org.apache.tools.ant.filters.ReplaceTokens
plugins {
id 'groovy'
id 'com.bmuschko.nexus' version '2.3.1'
}
// TODO: move common IDE configuration to a common file to include
apply plugin: 'idea'
apply plugin: 'eclipse'
/*idea {
project {
languageLevel = '1.8'
vcs = 'Git'
}
}*/
group = 'org.elasticsearch.gradle'
archivesBaseName = 'build-tools'
repositories {
mavenCentral()
maven {
name 'sonatype-snapshots'
url "https://oss.sonatype.org/content/repositories/snapshots/"
}
}
dependencies {
compile gradleApi()
compile localGroovy()
compile 'com.carrotsearch.randomizedtesting:junit4-ant:2.2.0'
compile('junit:junit:4.11') {
transitive = false
}
compile 'com.netflix.nebula:gradle-extra-configurations-plugin:3.0.3'
compile 'de.thetaphi:forbiddenapis:2.0'
}
Properties props = new Properties()
props.load(project.file('../gradle.properties').newDataInputStream())
version = props.getProperty('version')
processResources {
inputs.file('../gradle.properties')
filter ReplaceTokens, tokens: [
'version': props.getProperty('version')
]
}
extraArchive {
javadoc = false
tests = false
}
eclipse {
classpath {
defaultOutputDir = new File(file('build'), 'eclipse')
}
}


@@ -0,0 +1,53 @@
package com.carrotsearch.gradle.randomizedtesting
import com.carrotsearch.ant.tasks.junit4.SuiteBalancer
import com.carrotsearch.ant.tasks.junit4.balancers.ExecutionTimeBalancer
import com.carrotsearch.ant.tasks.junit4.listeners.ExecutionTimesReport
import org.apache.tools.ant.types.FileSet
class BalancersConfiguration {
// parent task, so executionTime can register an additional listener
RandomizedTestingTask task
List<SuiteBalancer> balancers = new ArrayList<>()
void executionTime(Map<String,Object> properties) {
ExecutionTimeBalancer balancer = new ExecutionTimeBalancer()
FileSet fileSet = new FileSet()
Object filename = properties.remove('cacheFilename')
if (filename == null) {
throw new IllegalArgumentException('cacheFilename is required for executionTime balancer')
}
fileSet.setIncludes(filename.toString())
File cacheDir = task.project.projectDir
Object dir = properties.remove('cacheDir')
if (dir != null) {
cacheDir = new File(dir.toString())
}
fileSet.setDir(cacheDir)
balancer.add(fileSet)
int historySize = 10
Object size = properties.remove('historySize')
if (size instanceof Integer) {
historySize = (Integer)size
} else if (size != null) {
throw new IllegalArgumentException('historySize must be an integer')
}
ExecutionTimesReport listener = new ExecutionTimesReport()
listener.setFile(new File(cacheDir, filename.toString()))
listener.setHistoryLength(historySize)
if (properties.isEmpty() == false) {
throw new IllegalArgumentException('Unknown properties for executionTime balancer: ' + properties.keySet())
}
task.listenersConfig.listeners.add(listener)
balancers.add(balancer)
}
void custom(SuiteBalancer balancer) {
balancers.add(balancer)
}
}
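The `executionTime` method above follows a defensive pattern for map-based options: pull each recognized key out of the map, validate its type, and fail loudly if anything is left over. A minimal Java sketch of the same pattern (the `BalancerOptions` class and its field names are illustrative, not part of the plugin):

```java
import java.util.Map;

// Sketch of the option-parsing style used by executionTime(): remove known
// keys from the map, validate types, and reject unrecognized leftovers.
class BalancerOptions {
    String cacheFilename;
    int historySize = 10;   // same default as the Groovy code

    static BalancerOptions parse(Map<String, Object> props) {
        BalancerOptions opts = new BalancerOptions();
        Object filename = props.remove("cacheFilename");
        if (filename == null) {
            throw new IllegalArgumentException("cacheFilename is required");
        }
        opts.cacheFilename = filename.toString();
        Object size = props.remove("historySize");
        if (size instanceof Integer) {
            opts.historySize = (Integer) size;
        } else if (size != null) {
            throw new IllegalArgumentException("historySize must be an integer");
        }
        // anything still in the map was not a recognized option
        if (!props.isEmpty()) {
            throw new IllegalArgumentException("Unknown properties: " + props.keySet());
        }
        return opts;
    }
}
```

Because each known key is removed as it is consumed, a single `isEmpty()` check at the end catches misspelled or unsupported options.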


@@ -0,0 +1,25 @@
package com.carrotsearch.gradle.randomizedtesting
import com.carrotsearch.ant.tasks.junit4.listeners.AggregatedEventListener
import com.carrotsearch.ant.tasks.junit4.listeners.antxml.AntXmlReport
class ListenersConfiguration {
RandomizedTestingTask task
List<AggregatedEventListener> listeners = new ArrayList<>()
void junitReport(Map<String, Object> props) {
AntXmlReport reportListener = new AntXmlReport()
Object dir = props == null ? null : props.get('dir')
if (dir != null) {
reportListener.setDir(task.project.file(dir))
} else {
reportListener.setDir(new File(task.project.buildDir, 'reports' + File.separator + "${task.name}Junit"))
}
listeners.add(reportListener)
}
void custom(AggregatedEventListener listener) {
listeners.add(listener)
}
}


@@ -0,0 +1,64 @@
package com.carrotsearch.gradle.randomizedtesting
import org.gradle.api.logging.LogLevel
import org.gradle.api.logging.Logger
/**
* Writes data passed to this stream as log messages.
*
* The stream will be flushed whenever a newline is detected.
* Allows setting an optional prefix before each line of output.
*/
public class LoggingOutputStream extends OutputStream {
/** The starting length of the buffer */
static final int DEFAULT_BUFFER_LENGTH = 4096
/** The buffer of bytes sent to the stream */
byte[] buffer = new byte[DEFAULT_BUFFER_LENGTH]
/** Offset of the start of unwritten data in the buffer */
int start = 0
/** Offset of the end (semi-open) of unwritten data in the buffer */
int end = 0
/** Logger to write stream data to */
Logger logger
/** Prefix to add before each line of output */
String prefix = ""
/** Log level to write log messages to */
LogLevel level
void write(final int b) throws IOException {
if (b == 0) return;
if (b == (int)('\n' as char)) {
// always flush with newlines instead of adding to the buffer
flush()
return
}
if (end == buffer.length) {
if (start != 0) {
// first try shifting the used buffer back to the beginning to make space
System.arraycopy(buffer, start, buffer, 0, end - start)
end -= start
start = 0
} else {
// buffer is genuinely full: extend it
final int newBufferLength = buffer.length + DEFAULT_BUFFER_LENGTH;
final byte[] newBuffer = new byte[newBufferLength];
System.arraycopy(buffer, 0, newBuffer, 0, buffer.length);
buffer = newBuffer;
}
}
buffer[end++] = (byte) b;
}
void flush() {
if (end == start) return
logger.log(level, prefix + new String(buffer, start, end - start));
start = end
}
}
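The class above buffers bytes and emits a log message whenever a newline arrives, prefixing each line. As a rough model of that flush-on-newline behavior (a simplified Java sketch that collects lines into a list instead of writing to a Gradle `Logger`):

```java
import java.util.ArrayList;
import java.util.List;

// Simplified model of LoggingOutputStream: accumulate bytes and emit a
// prefixed "log line" each time a newline is seen. Empty lines are skipped,
// matching the end == start check in the Groovy flush().
class LineBuffer {
    private final StringBuilder pending = new StringBuilder();
    final List<String> lines = new ArrayList<>();
    final String prefix;

    LineBuffer(String prefix) { this.prefix = prefix; }

    void write(int b) {
        if (b == 0) return;                  // ignore NUL bytes, as the stream does
        if (b == '\n') { flush(); return; }  // newline triggers an immediate flush
        pending.append((char) b);
    }

    void flush() {
        if (pending.length() == 0) return;   // nothing buffered, nothing to log
        lines.add(prefix + pending);
        pending.setLength(0);
    }
}
```

Feeding it `"hello\nworld"` and then calling `flush()` yields two prefixed lines; the trailing `flush()` is what picks up output that did not end with a newline.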


@@ -0,0 +1,47 @@
package com.carrotsearch.gradle.randomizedtesting
import com.carrotsearch.ant.tasks.junit4.JUnit4
import org.gradle.api.AntBuilder
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.Task
import org.gradle.api.plugins.JavaBasePlugin
import org.gradle.api.tasks.TaskContainer
import org.gradle.api.tasks.testing.Test
class RandomizedTestingPlugin implements Plugin<Project> {
void apply(Project project) {
replaceTestTask(project.tasks)
configureAnt(project.ant)
}
static void replaceTestTask(TaskContainer tasks) {
Test oldTestTask = tasks.findByPath('test')
if (oldTestTask == null) {
// no test task, ok, user will use testing task on their own
return
}
tasks.remove(oldTestTask)
Map properties = [
name: 'test',
type: RandomizedTestingTask,
dependsOn: oldTestTask.dependsOn,
group: JavaBasePlugin.VERIFICATION_GROUP,
description: 'Runs unit tests with the randomized testing framework'
]
RandomizedTestingTask newTestTask = tasks.create(properties)
newTestTask.classpath = oldTestTask.classpath
newTestTask.testClassesDir = oldTestTask.testClassesDir
// hack so check task depends on custom test
Task checkTask = tasks.findByPath('check')
checkTask.dependsOn.remove(oldTestTask)
checkTask.dependsOn.add(newTestTask)
}
static void configureAnt(AntBuilder ant) {
ant.project.addTaskDefinition('junit4:junit4', JUnit4.class)
}
}


@@ -0,0 +1,248 @@
package com.carrotsearch.gradle.randomizedtesting
import com.carrotsearch.ant.tasks.junit4.ListenersList
import com.carrotsearch.ant.tasks.junit4.listeners.AggregatedEventListener
import groovy.xml.NamespaceBuilder
import org.apache.tools.ant.RuntimeConfigurable
import org.apache.tools.ant.UnknownElement
import org.gradle.api.DefaultTask
import org.gradle.api.file.FileCollection
import org.gradle.api.file.FileTreeElement
import org.gradle.api.internal.tasks.options.Option
import org.gradle.api.specs.Spec
import org.gradle.api.tasks.*
import org.gradle.api.tasks.util.PatternFilterable
import org.gradle.api.tasks.util.PatternSet
import org.gradle.logging.ProgressLoggerFactory
import org.gradle.util.ConfigureUtil
import javax.inject.Inject
class RandomizedTestingTask extends DefaultTask {
PatternFilterable patternSet = new PatternSet()
// TODO: change to "executable" to match gradle test params?
@Optional
@Input
String jvm = 'java'
@Optional
@Input
File workingDir = new File(project.buildDir, 'testrun' + File.separator + name)
@Optional
@Input
FileCollection classpath
@Input
String parallelism = '1'
@InputDirectory
File testClassesDir
@Optional
@Input
boolean haltOnFailure = true
@Optional
@Input
boolean shuffleOnSlave = true
@Optional
@Input
boolean enableAssertions = true
@Optional
@Input
boolean enableSystemAssertions = true
TestLoggingConfiguration testLoggingConfig = new TestLoggingConfiguration()
BalancersConfiguration balancersConfig = new BalancersConfiguration(task: this)
ListenersConfiguration listenersConfig = new ListenersConfiguration(task: this)
List<String> jvmArgs = new ArrayList<>()
Map<String, String> systemProperties = new HashMap<>()
RandomizedTestingTask() {
outputs.upToDateWhen {false} // randomized tests are never up to date
listenersConfig.listeners.add(new TestProgressLogger(factory: getProgressLoggerFactory()))
listenersConfig.listeners.add(new TestReportLogger(logger: logger, config: testLoggingConfig))
}
@Inject
ProgressLoggerFactory getProgressLoggerFactory() {
throw new UnsupportedOperationException();
}
void jvmArgs(Iterable<String> arguments) {
jvmArgs.addAll(arguments)
}
void jvmArg(String argument) {
jvmArgs.add(argument)
}
void systemProperty(String property, String value) {
systemProperties.put(property, value)
}
void include(String... includes) {
this.patternSet.include(includes);
}
void include(Iterable<String> includes) {
this.patternSet.include(includes);
}
void include(Spec<FileTreeElement> includeSpec) {
this.patternSet.include(includeSpec);
}
void include(Closure includeSpec) {
this.patternSet.include(includeSpec);
}
void exclude(String... excludes) {
this.patternSet.exclude(excludes);
}
void exclude(Iterable<String> excludes) {
this.patternSet.exclude(excludes);
}
void exclude(Spec<FileTreeElement> excludeSpec) {
this.patternSet.exclude(excludeSpec);
}
void exclude(Closure excludeSpec) {
this.patternSet.exclude(excludeSpec);
}
@Input
void testLogging(Closure closure) {
ConfigureUtil.configure(closure, testLoggingConfig)
}
@Input
void balancers(Closure closure) {
ConfigureUtil.configure(closure, balancersConfig)
}
@Input
void listeners(Closure closure) {
ConfigureUtil.configure(closure, listenersConfig)
}
@Option(
option = "tests",
description = "Sets test class or method name to be included. This is for IDEs. Use -Dtests.class and -Dtests.method"
)
void setTestNameIncludePattern(String testNamePattern) {
// This is only implemented to give support for IDEs running tests. There are 3 patterns expected:
// * An exact test class and method
// * An exact test class
// * A package name prefix, ending with .*
// There is no way to distinguish the first two without looking at classes, so we use the rule
// that class names start with an uppercase letter...
// TODO: this doesn't work yet, but not sure why...intellij says it is using --tests, and this work from the command line...
String[] parts = testNamePattern.split('\\.')
String lastPart = parts[parts.length - 1]
String classname
String methodname = null
if (lastPart.equals('*') || lastPart.charAt(0).isUpperCase()) {
// package name or class name, just pass through
classname = testNamePattern
} else {
// method name, need to separate
methodname = lastPart
classname = testNamePattern.substring(0, testNamePattern.length() - lastPart.length() - 1)
}
ant.setProperty('tests.class', classname)
if (methodname != null) {
ant.setProperty('tests.method', methodname)
}
}
// TODO: add leaveTemporary
// TODO: add jvmOutputAction?
// TODO: add ifNoTests!
@TaskAction
void executeTests() {
Map attributes = [
jvm: jvm,
parallelism: parallelism,
heartbeat: testLoggingConfig.slowTests.heartbeat,
dir: workingDir,
tempdir: new File(workingDir, 'temp'),
haltOnFailure: haltOnFailure,
shuffleOnSlave: shuffleOnSlave
]
def junit4 = NamespaceBuilder.newInstance(ant, 'junit4')
junit4.junit4(attributes) {
classpath {
pathElement(path: classpath.asPath)
}
if (enableAssertions) {
jvmarg(value: '-ea')
}
if (enableSystemAssertions) {
jvmarg(value: '-esa')
}
for (String arg : jvmArgs) {
jvmarg(value: arg)
}
fileset(dir: testClassesDir) {
for (String includePattern : patternSet.getIncludes()) {
include(name: includePattern)
}
for (String excludePattern : patternSet.getExcludes()) {
exclude(name: excludePattern)
}
}
for (Map.Entry<String, String> prop : systemProperties) {
sysproperty key: prop.getKey(), value: prop.getValue()
}
makeListeners()
}
}
static class ListenersElement extends UnknownElement {
AggregatedEventListener[] listeners
ListenersElement() {
super('listeners')
setNamespace('junit4')
setQName('listeners')
}
public void handleChildren(Object realThing, RuntimeConfigurable wrapper) {
assert realThing instanceof ListenersList
ListenersList list = (ListenersList)realThing
for (AggregatedEventListener listener : listeners) {
list.addConfigured(listener)
}
}
}
/**
* Makes an ant xml element for 'listeners' just as AntBuilder would, except configuring
* the element adds the already created children.
*/
def makeListeners() {
def context = ant.getAntXmlContext()
def parentWrapper = context.currentWrapper()
def parent = parentWrapper.getProxy()
UnknownElement element = new ListenersElement(listeners: listenersConfig.listeners)
element.setProject(context.getProject())
element.setRealThing(logger)
((UnknownElement)parent).addChild(element)
RuntimeConfigurable wrapper = new RuntimeConfigurable(element, element.getQName())
parentWrapper.addChild(wrapper)
return wrapper.getProxy()
}
}

View File

@ -0,0 +1,14 @@
package com.carrotsearch.gradle.randomizedtesting
class SlowTestsConfiguration {
int heartbeat = 0
int summarySize = 0
void heartbeat(int heartbeat) {
this.heartbeat = heartbeat
}
void summarySize(int summarySize) {
this.summarySize = summarySize
}
}

View File

@ -0,0 +1,14 @@
package com.carrotsearch.gradle.randomizedtesting
class StackTraceFiltersConfiguration {
List<String> patterns = new ArrayList<>()
List<String> contains = new ArrayList<>()
void regex(String pattern) {
patterns.add(pattern)
}
void contains(String contain) {
contains.add(contain)
}
}

View File

@ -0,0 +1,16 @@
package com.carrotsearch.gradle.randomizedtesting
import org.gradle.util.ConfigureUtil
class TestLoggingConfiguration {
SlowTestsConfiguration slowTests = new SlowTestsConfiguration()
StackTraceFiltersConfiguration stackTraceFilters = new StackTraceFiltersConfiguration()
void slowTests(Closure closure) {
ConfigureUtil.configure(closure, slowTests)
}
void stackTraceFilters(Closure closure) {
ConfigureUtil.configure(closure, stackTraceFilters)
}
}

View File

@ -0,0 +1,69 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package com.carrotsearch.gradle.randomizedtesting
import com.carrotsearch.ant.tasks.junit4.JUnit4
import com.carrotsearch.ant.tasks.junit4.dependencies.com.google.common.eventbus.Subscribe
import com.carrotsearch.ant.tasks.junit4.events.aggregated.AggregatedStartEvent
import com.carrotsearch.ant.tasks.junit4.events.aggregated.AggregatedSuiteResultEvent
import com.carrotsearch.ant.tasks.junit4.listeners.AggregatedEventListener
import org.gradle.logging.ProgressLogger
import org.gradle.logging.ProgressLoggerFactory
import org.junit.runner.Description
import java.util.concurrent.atomic.AtomicInteger
import static com.carrotsearch.ant.tasks.junit4.FormattingUtils.formatDurationInSeconds
class TestProgressLogger implements AggregatedEventListener {
/** Factory to build a progress logger when testing starts */
ProgressLoggerFactory factory
ProgressLogger progressLogger
int totalSuites;
AtomicInteger suitesCompleted = new AtomicInteger()
AtomicInteger testsCompleted = new AtomicInteger()
AtomicInteger testsFailed = new AtomicInteger()
AtomicInteger testsIgnored = new AtomicInteger()
@Subscribe
void onStart(AggregatedStartEvent e) throws IOException {
totalSuites = e.getSuiteCount();
progressLogger = factory.newOperation(TestProgressLogger)
progressLogger.setDescription('Randomized test runner')
progressLogger.started()
progressLogger.progress('Starting JUnit4 with ' + e.getSlaveCount() + ' jvms')
}
@Subscribe
void onSuiteResult(AggregatedSuiteResultEvent e) throws IOException {
final int suitesCompleted = suitesCompleted.incrementAndGet();
final int testsCompleted = testsCompleted.addAndGet(e.getDescription().testCount())
final int testsFailed = testsFailed.addAndGet(e.getErrorCount() + e.getFailureCount())
final int testsIgnored = testsIgnored.addAndGet(e.getIgnoredCount())
Description description = e.getDescription()
String suiteName = description.getDisplayName();
suiteName = suiteName.substring(suiteName.lastIndexOf('.') + 1);
progressLogger.progress('Suites [' + suitesCompleted + '/' + totalSuites + '], Tests [' + testsCompleted + '|' + testsFailed + '|' + testsIgnored + '], ' + suiteName + ' on J' + e.getSlave().id + ' in ' + formatDurationInSeconds(e.getExecutionTime()))
}
@Override
void setOuter(JUnit4 junit) {}
}

View File

@ -0,0 +1,373 @@
package com.carrotsearch.gradle.randomizedtesting
import com.carrotsearch.ant.tasks.junit4.JUnit4
import com.carrotsearch.ant.tasks.junit4.Pluralize
import com.carrotsearch.ant.tasks.junit4.TestsSummaryEventListener
import com.carrotsearch.ant.tasks.junit4.dependencies.com.google.common.base.Strings
import com.carrotsearch.ant.tasks.junit4.dependencies.com.google.common.eventbus.Subscribe
import com.carrotsearch.ant.tasks.junit4.events.*
import com.carrotsearch.ant.tasks.junit4.events.aggregated.*
import com.carrotsearch.ant.tasks.junit4.events.mirrors.FailureMirror
import com.carrotsearch.ant.tasks.junit4.listeners.AggregatedEventListener
import com.carrotsearch.ant.tasks.junit4.listeners.StackTraceFilter
import org.apache.tools.ant.filters.TokenFilter
import org.gradle.api.logging.LogLevel
import org.gradle.api.logging.Logger
import org.junit.runner.Description
import java.util.concurrent.atomic.AtomicInteger
import static com.carrotsearch.ant.tasks.junit4.FormattingUtils.*
class TestReportLogger extends TestsSummaryEventListener implements AggregatedEventListener {
static final String FAILURE_MARKER = " <<< FAILURES!"
/** Status names column. */
static EnumMap<TestStatus, String> statusNames;
static {
statusNames = new EnumMap<>(TestStatus.class);
for (TestStatus s : TestStatus.values()) {
statusNames.put(s,
s == TestStatus.IGNORED_ASSUMPTION
? "IGNOR/A" : s.toString());
}
}
JUnit4 owner
/** Logger to write the report to */
Logger logger
TestLoggingConfiguration config
/** Forked concurrent JVM count. */
int forkedJvmCount
/** Format line for JVM ID string. */
String jvmIdFormat
/** Summarize the first N failures at the end. */
int showNumFailuresAtEnd = 3
/** Output stream that logs messages to the given logger */
LoggingOutputStream outStream
LoggingOutputStream errStream
/** Display mode for output streams. */
static enum OutputMode {
/** Always display the output emitted from tests. */
ALWAYS,
/**
* Display the output only if a test/suite failed. This requires internal buffering
* so the output will be shown only after a test completes.
*/
ONERROR,
/** Don't display the output, even on test failures. */
NEVER
}
OutputMode outputMode = OutputMode.ONERROR
/** A list of failed tests, if to be displayed at the end. */
List<Description> failedTests = new ArrayList<>()
/** Stack trace filters. */
StackTraceFilter stackFilter = new StackTraceFilter()
Map<String, Long> suiteTimes = new HashMap<>()
boolean slowTestsFound = false
int totalSuites
AtomicInteger suitesCompleted = new AtomicInteger()
@Subscribe
void onStart(AggregatedStartEvent e) throws IOException {
this.totalSuites = e.getSuiteCount();
StringBuilder info = new StringBuilder('==> Test Info: ')
info.append('seed=' + owner.getSeed() + '; ')
info.append(Pluralize.pluralize(e.getSlaveCount(), 'jvm') + '=' + e.getSlaveCount() + '; ')
info.append(Pluralize.pluralize(e.getSuiteCount(), 'suite') + '=' + e.getSuiteCount())
logger.lifecycle(info.toString())
forkedJvmCount = e.getSlaveCount();
jvmIdFormat = " J%-" + (1 + (int) Math.floor(Math.log10(forkedJvmCount))) + "d";
outStream = new LoggingOutputStream(logger: logger, level: LogLevel.ERROR, prefix: " 1> ")
errStream = new LoggingOutputStream(logger: logger, level: LogLevel.ERROR, prefix: " 2> ")
for (String contains : config.stackTraceFilters.contains) {
TokenFilter.ContainsString containsFilter = new TokenFilter.ContainsString()
containsFilter.setContains(contains)
stackFilter.addContainsString(containsFilter)
}
for (String pattern : config.stackTraceFilters.patterns) {
TokenFilter.ContainsRegex regexFilter = new TokenFilter.ContainsRegex()
regexFilter.setPattern(pattern)
stackFilter.addContainsRegex(regexFilter)
}
}
@Subscribe
void onChildBootstrap(ChildBootstrap e) throws IOException {
logger.info("Started J" + e.getSlave().id + " PID(" + e.getSlave().getPidString() + ").");
}
@Subscribe
void onHeartbeat(HeartBeatEvent e) throws IOException {
logger.warn("HEARTBEAT J" + e.getSlave().id + " PID(" + e.getSlave().getPidString() + "): " +
formatTime(e.getCurrentTime()) + ", stalled for " +
formatDurationInSeconds(e.getNoEventDuration()) + " at: " +
(e.getDescription() == null ? "<unknown>" : formatDescription(e.getDescription())))
slowTestsFound = true
}
@Subscribe
void onQuit(AggregatedQuitEvent e) throws IOException {
if (showNumFailuresAtEnd > 0 && !failedTests.isEmpty()) {
List<Description> sublist = this.failedTests
StringBuilder b = new StringBuilder()
b.append('Tests with failures')
if (sublist.size() > showNumFailuresAtEnd) {
sublist = sublist.subList(0, showNumFailuresAtEnd)
b.append(" (first " + showNumFailuresAtEnd + " out of " + failedTests.size() + ")")
}
b.append(':\n')
for (Description description : sublist) {
b.append(" - ").append(formatDescription(description, true)).append('\n')
}
logger.warn(b.toString())
}
if (config.slowTests.summarySize > 0) {
List<Map.Entry<String, Long>> sortedSuiteTimes = new ArrayList<>(suiteTimes.entrySet())
Collections.sort(sortedSuiteTimes, new Comparator<Map.Entry<String, Long>>() {
@Override
int compare(Map.Entry<String, Long> o1, Map.Entry<String, Long> o2) {
return o2.value - o1.value // sort descending
}
})
LogLevel level = slowTestsFound ? LogLevel.WARN : LogLevel.INFO
int numToLog = Math.min(config.slowTests.summarySize, sortedSuiteTimes.size())
logger.log(level, 'Slow Tests Summary:')
for (int i = 0; i < numToLog; ++i) {
logger.log(level, String.format(Locale.ENGLISH, '%6.2fs | %s',
sortedSuiteTimes.get(i).value / 1000.0,
sortedSuiteTimes.get(i).key));
}
logger.log(level, '') // extra vertical separation
}
if (failedTests.isEmpty()) {
// summary is already printed for failures
logger.lifecycle('==> Test Summary: ' + getResult().toString())
}
}
@Subscribe
void onSuiteStart(AggregatedSuiteStartedEvent e) throws IOException {
if (isPassthrough()) {
SuiteStartedEvent evt = e.getSuiteStartedEvent();
emitSuiteStart(LogLevel.INFO, evt.getDescription());
}
}
@Subscribe
void onOutput(PartialOutputEvent e) throws IOException {
if (isPassthrough() && logger.isInfoEnabled()) {
// We only allow passthrough output if there is one JVM.
switch (e.getEvent().getType()) {
case EventType.APPEND_STDERR:
((IStreamEvent) e.getEvent()).copyTo(errStream);
break;
case EventType.APPEND_STDOUT:
((IStreamEvent) e.getEvent()).copyTo(outStream);
break;
default:
break;
}
}
}
@Subscribe
void onTestResult(AggregatedTestResultEvent e) throws IOException {
if (isPassthrough() && e.getStatus() != TestStatus.OK) {
flushOutput();
emitStatusLine(LogLevel.ERROR, e, e.getStatus(), e.getExecutionTime());
}
if (!e.isSuccessful()) {
failedTests.add(e.getDescription());
}
}
@Subscribe
void onSuiteResult(AggregatedSuiteResultEvent e) throws IOException {
try {
final int completed = suitesCompleted.incrementAndGet();
if (e.isSuccessful() && e.getTests().isEmpty()) {
return;
}
if (config.slowTests.summarySize > 0) {
suiteTimes.put(e.getDescription().getDisplayName(), e.getExecutionTime())
}
LogLevel level = e.isSuccessful() ? LogLevel.INFO : LogLevel.ERROR
// We must emit buffered test and stream events (in case of failures).
if (!isPassthrough()) {
emitSuiteStart(level, e.getDescription())
emitBufferedEvents(level, e)
}
// Emit a synthetic failure for suite-level errors, if any.
if (!e.getFailures().isEmpty()) {
emitStatusLine(level, e, TestStatus.ERROR, 0)
}
if (!e.getFailures().isEmpty()) {
failedTests.add(e.getDescription())
}
emitSuiteEnd(level, e, completed)
} catch (Exception exc) {
logger.lifecycle('EXCEPTION: ', exc)
}
}
/** Suite prologue. */
void emitSuiteStart(LogLevel level, Description description) throws IOException {
logger.log(level, 'Suite: ' + description.getDisplayName());
}
void emitBufferedEvents(LogLevel level, AggregatedSuiteResultEvent e) throws IOException {
if (outputMode == OutputMode.NEVER) {
return
}
final IdentityHashMap<TestFinishedEvent,AggregatedTestResultEvent> eventMap = new IdentityHashMap<>();
for (AggregatedTestResultEvent tre : e.getTests()) {
eventMap.put(tre.getTestFinishedEvent(), tre)
}
final boolean emitOutput = outputMode == OutputMode.ALWAYS && isPassthrough() == false ||
outputMode == OutputMode.ONERROR && e.isSuccessful() == false
for (IEvent event : e.getEventStream()) {
switch (event.getType()) {
case EventType.APPEND_STDOUT:
if (emitOutput) ((IStreamEvent) event).copyTo(outStream);
break;
case EventType.APPEND_STDERR:
if (emitOutput) ((IStreamEvent) event).copyTo(errStream);
break;
case EventType.TEST_FINISHED:
assert eventMap.containsKey(event)
final AggregatedTestResultEvent aggregated = eventMap.get(event);
if (aggregated.getStatus() != TestStatus.OK) {
flushOutput();
emitStatusLine(level, aggregated, aggregated.getStatus(), aggregated.getExecutionTime());
}
break;
default:
break;
}
}
if (emitOutput) {
flushOutput()
}
}
void emitSuiteEnd(LogLevel level, AggregatedSuiteResultEvent e, int suitesCompleted) throws IOException {
final StringBuilder b = new StringBuilder();
b.append(String.format(Locale.ENGLISH, 'Completed [%d/%d]%s in %.2fs, ',
suitesCompleted,
totalSuites,
e.getSlave().slaves > 1 ? ' on J' + e.getSlave().id : '',
e.getExecutionTime() / 1000.0d));
b.append(e.getTests().size()).append(Pluralize.pluralize(e.getTests().size(), ' test'));
int failures = e.getFailureCount();
if (failures > 0) {
b.append(', ').append(failures).append(Pluralize.pluralize(failures, ' failure'));
}
int errors = e.getErrorCount();
if (errors > 0) {
b.append(', ').append(errors).append(Pluralize.pluralize(errors, ' error'));
}
int ignored = e.getIgnoredCount();
if (ignored > 0) {
b.append(', ').append(ignored).append(' skipped');
}
if (!e.isSuccessful()) {
b.append(FAILURE_MARKER);
}
b.append('\n')
logger.log(level, b.toString());
}
/** Emit status line for an aggregated event. */
void emitStatusLine(LogLevel level, AggregatedResultEvent result, TestStatus status, long timeMillis) throws IOException {
final StringBuilder line = new StringBuilder();
line.append(Strings.padEnd(statusNames.get(status), 8, ' ' as char))
line.append(formatDurationInSeconds(timeMillis))
if (forkedJvmCount > 1) {
line.append(String.format(Locale.ENGLISH, jvmIdFormat, result.getSlave().id))
}
line.append(' | ')
line.append(formatDescription(result.getDescription()))
if (!result.isSuccessful()) {
line.append(FAILURE_MARKER)
}
logger.log(level, line.toString())
PrintWriter writer = new PrintWriter(new LoggingOutputStream(logger: logger, level: level, prefix: ' > '))
if (status == TestStatus.IGNORED && result instanceof AggregatedTestResultEvent) {
writer.write('Cause: ')
writer.write(((AggregatedTestResultEvent) result).getCauseForIgnored())
writer.flush()
}
final List<FailureMirror> failures = result.getFailures();
if (!failures.isEmpty()) {
int count = 0;
for (FailureMirror fm : failures) {
count++;
if (fm.isAssumptionViolation()) {
writer.write(String.format(Locale.ENGLISH,
'Assumption #%d: %s',
count, fm.getMessage() == null ? '(no message)' : fm.getMessage()));
} else {
writer.write(String.format(Locale.ENGLISH,
'Throwable #%d: %s',
count,
stackFilter.apply(fm.getTrace())));
}
}
writer.flush()
}
}
void flushOutput() throws IOException {
outStream.flush()
errStream.flush()
}
/** Returns true if output should be logged immediately. Only relevant when running with INFO log level. */
boolean isPassthrough() {
return forkedJvmCount == 1 && outputMode == OutputMode.ALWAYS && logger.isInfoEnabled()
}
@Override
void setOuter(JUnit4 task) {
owner = task
}
}
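The JVM id column built in `onStart` above sizes itself to the forked JVM count: `floor(log10(n)) + 1` is the number of decimal digits in `n`, and the `%-Nd` conversion left-justifies the id in that column. A minimal standalone sketch of that computation (the class and method names here are illustrative, not part of the plugin):

```java
import java.util.Locale;

// Illustrative sketch of the jvmIdFormat computation from TestReportLogger.onStart:
// the column is exactly wide enough for the largest JVM index, left-justified.
public class JvmIdFormat {
    static String format(int forkedJvmCount) {
        // floor(log10(n)) + 1 == number of decimal digits in n (for n >= 1)
        return " J%-" + (1 + (int) Math.floor(Math.log10(forkedJvmCount))) + "d";
    }

    public static void main(String[] args) {
        // 8 JVMs -> one-character column; 12 JVMs -> two-character column
        System.out.println(format(8));                                    // " J%-1d"
        System.out.println(String.format(Locale.ENGLISH, format(12), 3)); // " J3 "
    }
}
```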

View File

@ -0,0 +1,135 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle
import org.elasticsearch.gradle.precommit.PrecommitTasks
import org.gradle.api.JavaVersion
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.Task
import org.gradle.api.tasks.compile.JavaCompile
/**
* Encapsulates build configuration for elasticsearch projects.
*/
class BuildPlugin implements Plugin<Project> {
@Override
void apply(Project project) {
project.pluginManager.apply('java')
project.pluginManager.apply('carrotsearch.randomizedtesting')
configureCompile(project)
configureTest(project)
PrecommitTasks.configure(project)
}
/** Adds compiler settings to the project */
static void configureCompile(Project project) {
project.afterEvaluate {
// fail on all javac warnings
project.tasks.withType(JavaCompile) {
options.compilerArgs << '-Werror' << '-Xlint:all' << '-Xdoclint:all/private' << '-Xdoclint:-missing'
options.encoding = 'UTF-8'
}
}
}
/** Returns a closure of common configuration shared by unit and integration tests. */
static Closure commonTestConfig(Project project) {
return {
jvm System.getProperty("java.home") + File.separator + 'bin' + File.separator + 'java'
parallelism System.getProperty('tests.jvms', 'auto')
// TODO: why are we not passing maxmemory to junit4?
jvmArg '-Xmx' + System.getProperty('tests.heap.size', '512m')
jvmArg '-Xms' + System.getProperty('tests.heap.size', '512m')
if (JavaVersion.current().isJava7()) {
// some tests need a large permgen, but that only exists on java 7
jvmArg '-XX:MaxPermSize=128m'
}
jvmArg '-XX:MaxDirectMemorySize=512m'
jvmArg '-XX:+HeapDumpOnOutOfMemoryError'
File heapdumpDir = new File(project.buildDir, 'heapdump')
heapdumpDir.mkdirs()
jvmArg '-XX:HeapDumpPath=' + heapdumpDir
// we use './temp' since this is per JVM and tests are forbidden from writing to CWD
systemProperty 'java.io.tmpdir', './temp'
systemProperty 'java.awt.headless', 'true'
systemProperty 'tests.maven', 'true' // TODO: rename this once we've switched to gradle!
systemProperty 'tests.artifact', project.name
systemProperty 'tests.task', path
systemProperty 'tests.security.manager', 'true'
// default test sysprop values
systemProperty 'tests.ifNoTests', 'fail'
systemProperty 'es.logger.level', 'WARN'
for (Map.Entry<String, String> property : System.properties.entrySet()) {
if (property.getKey().startsWith('tests.') ||
property.getKey().startsWith('es.')) {
systemProperty property.getKey(), property.getValue()
}
}
// System assertions (-esa) are disabled for now because of what looks like a
// JDK bug triggered by Groovy on JDK7. We should look at re-enabling system
// assertions when we upgrade to a new version of Groovy (currently 2.4.4) or
// require JDK8. See https://issues.apache.org/jira/browse/GROOVY-7528.
enableSystemAssertions false
testLogging {
slowTests {
heartbeat 10
summarySize 5
}
stackTraceFilters {
// custom filters: we carefully only omit test infra noise here
contains '.SlaveMain.'
regex(/^(\s+at )(org\.junit\.)/)
// also includes anonymous classes inside these two:
regex(/^(\s+at )(com\.carrotsearch\.randomizedtesting\.RandomizedRunner)/)
regex(/^(\s+at )(com\.carrotsearch\.randomizedtesting\.ThreadLeakControl)/)
regex(/^(\s+at )(com\.carrotsearch\.randomizedtesting\.rules\.)/)
regex(/^(\s+at )(org\.apache\.lucene\.util\.TestRule)/)
regex(/^(\s+at )(org\.apache\.lucene\.util\.AbstractBeforeAfterRule)/)
}
}
balancers {
executionTime cacheFilename: ".local-${project.version}-${name}-execution-times.log"
}
listeners {
junitReport()
}
exclude '**/*$*.class'
}
}
/** Configures the test task */
static Task configureTest(Project project) {
Task test = project.tasks.getByName('test')
test.configure(commonTestConfig(project))
test.configure {
include '**/*Tests.class'
}
return test
}
}
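`commonTestConfig` forwards only system properties whose keys start with `tests.` or `es.` into the forked test JVMs. The filtering step can be sketched in isolation like this (class and method names are made up for the example):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of the sysprop passthrough loop in commonTestConfig:
// only "tests.*" and "es.*" properties reach the forked test JVMs.
public class SyspropFilter {
    static Map<String, String> forwarded(Map<String, String> sysprops) {
        Map<String, String> out = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : sysprops.entrySet()) {
            if (e.getKey().startsWith("tests.") || e.getKey().startsWith("es.")) {
                out.put(e.getKey(), e.getValue());
            }
        }
        return out;
    }
}
```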

View File

@ -0,0 +1,35 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle
/**
* Accessor for properties about the version of elasticsearch this was built with.
*/
class ElasticsearchProperties {
static final String version
static {
Properties props = new Properties()
InputStream propsStream = ElasticsearchProperties.class.getResourceAsStream('/elasticsearch.properties')
if (propsStream == null) {
throw new RuntimeException('/elasticsearch.properties resource missing')
}
props.load(propsStream)
version = props.getProperty('version')
}
}

View File

@ -0,0 +1,45 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle
import org.apache.tools.ant.filters.ReplaceTokens
import org.gradle.api.file.CopySpec
/**
* Gradle provides "expansion" functionality using groovy's SimpleTemplateEngine.
* However, that engine substitutes the form {@code $foo} (no curlies), and the rest
* tests rely on that form for substitutions supplied by the test runner.
*
* This class provides a helper to do maven filtering, where only the form {@code $\{foo\}} is supported.
*
* TODO: we should get rid of this hack, and make the rest tests use some other identifier
* for builtin vars
*/
class MavenFilteringHack {
/**
* Adds a filter to the given copy spec that will substitute maven variables.
* @param copySpec the copy spec to add the filter to
* @param substitutions map of variable names to their replacement values
*/
static void filter(CopySpec copySpec, Map substitutions) {
Map mavenSubstitutions = substitutions.collectEntries() {
key, value -> ["{${key}".toString(), value.toString()]
}
copySpec.filter(ReplaceTokens, tokens: mavenSubstitutions, beginToken: '$', endToken: '}')
}
}
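The net effect of the mapping above: each key `foo` becomes the ant ReplaceTokens token `{foo`, which together with `beginToken: '$'` and `endToken: '}'` matches exactly the maven-style `${foo}` while leaving the bare `$foo` form untouched. A standalone sketch of that substitution behavior (plain string replacement standing in for the ant filter, names chosen for the example):

```java
import java.util.Map;

// Standalone sketch of the substitution MavenFilteringHack configures:
// only the maven-style ${foo} form is rewritten; bare $foo is left alone.
public class MavenFilterSketch {
    static String filter(String line, Map<String, String> substitutions) {
        for (Map.Entry<String, String> e : substitutions.entrySet()) {
            line = line.replace("${" + e.getKey() + "}", e.getValue());
        }
        return line;
    }
}
```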

View File

@ -0,0 +1,109 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle.plugin
import nebula.plugin.extraconfigurations.ProvidedBasePlugin
import org.elasticsearch.gradle.BuildPlugin
import org.elasticsearch.gradle.ElasticsearchProperties
import org.elasticsearch.gradle.test.RestIntegTestTask
import org.gradle.api.Project
import org.gradle.api.Task
import org.gradle.api.tasks.bundling.Zip
/**
* Encapsulates build configuration for an Elasticsearch plugin.
*/
class PluginBuildPlugin extends BuildPlugin {
@Override
void apply(Project project) {
super.apply(project)
project.pluginManager.apply(ProvidedBasePlugin)
// TODO: add target compatibility (java version) to elasticsearch properties and set for the project
configureDependencies(project)
// this afterEvaluate must happen before the afterEvaluate added by integTest configure,
// so that the file name resolution for installing the plugin will be setup
project.afterEvaluate {
project.jar.configure {
baseName project.pluginProperties.extension.name
}
project.bundlePlugin.configure {
baseName project.pluginProperties.extension.name
}
project.integTest.configure {
dependsOn project.bundlePlugin
cluster {
plugin 'installPlugin', project.bundlePlugin.outputs.files
}
}
}
Task bundle = configureBundleTask(project)
RestIntegTestTask.configure(project)
project.configurations.archives.artifacts.removeAll { it.archiveTask.is project.jar }
project.configurations.getByName('default').extendsFrom = []
project.artifacts {
archives bundle
'default' bundle
}
}
static void configureDependencies(Project project) {
String elasticsearchVersion = ElasticsearchProperties.version
project.dependencies {
provided "org.elasticsearch:elasticsearch:${elasticsearchVersion}"
testCompile "org.elasticsearch:test-framework:${elasticsearchVersion}"
// we "upgrade" these optional deps to provided for plugins, since they will run
// with a full elasticsearch server that includes optional deps
// TODO: remove duplication of version here with core...
provided 'com.spatial4j:spatial4j:0.4.1'
provided 'com.vividsolutions:jts:1.13'
provided 'com.github.spullara.mustache.java:compiler:0.9.1'
provided "log4j:log4j:1.2.17"
provided "log4j:apache-log4j-extras:1.2.17"
provided "org.slf4j:slf4j-api:1.6.2"
provided 'net.java.dev.jna:jna:4.1.0'
}
}
static Task configureBundleTask(Project project) {
PluginPropertiesTask buildProperties = project.tasks.create(name: 'pluginProperties', type: PluginPropertiesTask)
File pluginMetadata = project.file("src/main/plugin-metadata")
project.processTestResources {
from buildProperties
from pluginMetadata
}
Task bundle = project.tasks.create(name: 'bundlePlugin', type: Zip, dependsOn: [project.jar, buildProperties])
bundle.configure {
from buildProperties
from pluginMetadata
from project.jar
from bundle.project.configurations.runtime - bundle.project.configurations.provided
from('src/main/packaging') // TODO: move all config/bin/_size/etc into packaging
from('src/main') {
include 'config/**'
include 'bin/**'
}
from('src/site') {
include '_site/**'
}
}
project.assemble.dependsOn(bundle)
return bundle
}
}

View File

@ -0,0 +1,56 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle.plugin
import org.gradle.api.Project
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.Optional
/**
* A container for plugin properties that will be written to the plugin descriptor, for easy
* manipulation in the gradle DSL.
*/
class PluginPropertiesExtension {
@Input
String name
@Input
String version
@Input
String description
@Input
boolean jvm = true
@Input
String classname
@Input
boolean site = false
@Input
boolean isolated = true
PluginPropertiesExtension(Project project) {
name = project.name
version = project.version
}
}

View File

@ -0,0 +1,95 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle.plugin
import org.elasticsearch.gradle.ElasticsearchProperties
import org.gradle.api.DefaultTask
import org.gradle.api.InvalidUserDataException
import org.gradle.api.tasks.OutputFile
import org.gradle.api.tasks.TaskAction
/**
* Creates a plugin descriptor.
*
* TODO: copy the example properties file to plugin documentation
*/
class PluginPropertiesTask extends DefaultTask {
PluginPropertiesExtension extension
Map<String, String> properties = new HashMap<>()
PluginPropertiesTask() {
extension = project.extensions.create('esplugin', PluginPropertiesExtension, project)
project.afterEvaluate {
if (extension.description == null) {
throw new InvalidUserDataException('description is a required setting for esplugin')
}
if (extension.jvm && extension.classname == null) {
throw new InvalidUserDataException('classname is a required setting for esplugin with jvm=true')
}
if (extension.jvm) {
dependsOn(project.classes) // so we can check for the classname
}
fillProperties()
configure {
inputs.properties(properties)
}
}
}
@OutputFile
File propertiesFile = new File(project.buildDir, "plugin" + File.separator + "plugin-descriptor.properties")
void fillProperties() {
// TODO: need to copy the templated plugin-descriptor with a dependent task, since copy requires a file (not uri)
properties = [
'name': extension.name,
'description': extension.description,
'version': extension.version,
'elasticsearch.version': ElasticsearchProperties.version,
'jvm': extension.jvm as String,
'site': extension.site as String
]
if (extension.jvm) {
properties['classname'] = extension.classname
properties['isolated'] = extension.isolated as String
properties['java.version'] = project.targetCompatibility as String
}
}
@TaskAction
void buildProperties() {
if (extension.jvm) {
File classesDir = project.sourceSets.main.output.classesDir
File classFile = new File(classesDir, extension.classname.replace('.', File.separator) + '.class')
if (classFile.exists() == false) {
throw new InvalidUserDataException('classname ' + extension.classname + ' does not exist')
}
if (extension.isolated == false) {
logger.warn('Disabling isolation is deprecated and will be removed in the future')
}
}
Properties props = new Properties()
for (Map.Entry<String, String> prop : properties) {
props.put(prop.getKey(), prop.getValue())
}
props.store(propertiesFile.newWriter(), null)
}
}
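For a jvm plugin, the descriptor written by `buildProperties` ends up looking roughly like this (every value below is a made-up example, not output from a real build):

```properties
name=example-plugin
description=An example plugin
version=3.0.0-SNAPSHOT
elasticsearch.version=3.0.0-SNAPSHOT
jvm=true
site=false
classname=org.example.ExamplePlugin
isolated=true
java.version=1.8
```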

View File

@ -0,0 +1,189 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle.precommit
import org.gradle.api.DefaultTask
import org.gradle.api.GradleException
import org.gradle.api.InvalidUserDataException
import org.gradle.api.Project
import org.gradle.api.Task
import org.gradle.api.file.FileCollection
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.InputDirectory
import org.gradle.api.tasks.InputFiles
import org.gradle.api.tasks.StopActionException
import org.gradle.api.tasks.TaskAction
import org.gradle.api.tasks.VerificationTask
import java.nio.file.Files
import java.security.MessageDigest
import java.util.regex.Matcher
import java.util.regex.Pattern
class DependencyLicensesTask extends DefaultTask {
static final String SHA_EXTENSION = '.sha1'
static Task configure(Project project, Closure closure) {
DependencyLicensesTask task = project.tasks.create(type: DependencyLicensesTask, name: 'dependencyLicenses')
UpdateShasTask update = project.tasks.create(type: UpdateShasTask, name: 'updateShas')
update.parentTask = task
task.configure(closure)
project.check.dependsOn(task)
return task
}
@InputFiles
FileCollection dependencies
@InputDirectory
File licensesDir = new File(project.projectDir, 'licenses')
LinkedHashMap<String, String> mappings = new LinkedHashMap<>()
@Input
void mapping(Map<String, String> props) {
String from = props.get('from')
if (from == null) {
throw new InvalidUserDataException('Missing "from" setting for license name mapping')
}
String to = props.get('to')
if (to == null) {
throw new InvalidUserDataException('Missing "to" setting for license name mapping')
}
mappings.put(from, to)
}
@TaskAction
void checkDependencies() {
// TODO: empty license dir (or error when dir exists and no deps)
if (licensesDir.exists() == false && dependencies.isEmpty() == false) {
throw new GradleException("Licenses dir ${licensesDir} does not exist, but there are dependencies")
}
// order is the same for keys and values iteration since we use a linked hashmap
List<String> mapped = new ArrayList<>(mappings.values())
Pattern mappingsPattern = Pattern.compile('(' + mappings.keySet().join(')|(') + ')')
Map<String, Integer> licenses = new HashMap<>()
Map<String, Integer> notices = new HashMap<>()
Set<File> shaFiles = new HashSet<File>()
licensesDir.eachFile {
String name = it.getName()
if (name.endsWith(SHA_EXTENSION)) {
shaFiles.add(it)
} else if (name.endsWith('-LICENSE') || name.endsWith('-LICENSE.txt')) {
// TODO: why do we support suffix of LICENSE *and* LICENSE.txt??
licenses.put(name, 0)
} else if (name.contains('-NOTICE')) { // '-NOTICE.txt' also matches this
notices.put(name, 0)
}
}
for (File dependency : dependencies) {
String jarName = dependency.getName()
logger.info("Checking license/notice/sha for " + jarName)
checkSha(dependency, jarName, shaFiles)
String name = jarName - ~/\-\d+.*/
Matcher match = mappingsPattern.matcher(name)
if (match.matches()) {
int i = 0
while (i < match.groupCount() && match.group(i + 1) == null) ++i;
logger.info("Mapped dependency name ${name} to ${mapped.get(i)} for license check")
name = mapped.get(i)
}
checkFile(name, jarName, licenses, 'LICENSE')
checkFile(name, jarName, notices, 'NOTICE')
}
licenses.each { license, count ->
if (count == 0) {
throw new GradleException("Unused license ${license}")
}
}
notices.each { notice, count ->
if (count == 0) {
throw new GradleException("Unused notice ${notice}")
}
}
if (shaFiles.isEmpty() == false) {
throw new GradleException("Unused sha files found: \n${shaFiles.join('\n')}")
}
}
void checkSha(File jar, String jarName, Set<File> shaFiles) {
File shaFile = new File(licensesDir, jarName + SHA_EXTENSION)
if (shaFile.exists() == false) {
throw new GradleException("Missing SHA for ${jarName}. Run 'gradle updateShas' to create")
}
// TODO: shouldn't have to trim, sha files should not have trailing newline
String expectedSha = shaFile.getText('UTF-8').trim()
String sha = MessageDigest.getInstance("SHA-1").digest(jar.getBytes()).encodeHex().toString()
if (expectedSha.equals(sha) == false) {
throw new GradleException("SHA has changed! Expected ${expectedSha} for ${jarName} but got ${sha}. " +
"\nThis usually indicates a corrupt dependency cache or artifacts changed upstream." +
"\nEither wipe your cache, fix the upstream artifact, or delete ${shaFile} and run updateShas")
}
shaFiles.remove(shaFile)
}
void checkFile(String name, String jarName, Map<String, Integer> counters, String type) {
String fileName = "${name}-${type}"
Integer count = counters.get(fileName)
if (count == null) {
// try the other suffix...TODO: get rid of this, just support ending in .txt
fileName = "${fileName}.txt"
count = counters.get(fileName)
}
if (count == null) {
throw new GradleException("Missing ${type} for ${jarName}, expected in ${fileName}")
}
counters.put(fileName, count + 1)
}
static class UpdateShasTask extends DefaultTask {
DependencyLicensesTask parentTask
@TaskAction
void updateShas() {
Set<File> shaFiles = new HashSet<File>()
parentTask.licensesDir.eachFile {
String name = it.getName()
if (name.endsWith(SHA_EXTENSION)) {
shaFiles.add(it)
}
}
for (File dependency : parentTask.dependencies) {
String jarName = dependency.getName()
File shaFile = new File(parentTask.licensesDir, jarName + SHA_EXTENSION)
if (shaFile.exists() == false) {
logger.lifecycle("Adding sha for ${jarName}")
String sha = MessageDigest.getInstance("SHA-1").digest(dependency.getBytes()).encodeHex().toString()
shaFile.setText(sha, 'UTF-8')
} else {
shaFiles.remove(shaFile)
}
}
shaFiles.each { shaFile ->
logger.lifecycle("Removing unused sha ${shaFile.getName()}")
Files.delete(shaFile.toPath())
}
}
}
}
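A build script would typically configure the task created by `DependencyLicensesTask.configure` along these lines; the configuration name and regex below are illustrative assumptions:

```groovy
// in a module's build.gradle, after the dependencyLicenses task has been registered
dependencyLicenses {
    dependencies = project.configurations.runtime  // jars whose licenses and shas must be present
    mapping from: /lucene-.*/, to: 'lucene'        // many lucene-* jars share one LICENSE/NOTICE pair
}
```

Each jar `foo-1.2.3.jar` is then expected to have `licenses/foo-1.2.3.jar.sha1`, plus `foo-LICENSE(.txt)` and `foo-NOTICE(.txt)` after name mapping.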

@@ -0,0 +1,108 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle.precommit
import org.gradle.api.DefaultTask
import org.gradle.api.file.FileCollection
import org.gradle.api.tasks.InputFiles
import org.gradle.api.tasks.OutputFiles
import org.gradle.api.tasks.SourceSet
import org.gradle.api.tasks.TaskAction
import org.gradle.api.tasks.util.PatternFilterable
import org.gradle.api.tasks.util.PatternSet
import java.util.regex.Pattern
/**
* Checks for patterns in source files for the project which are forbidden.
*/
class ForbiddenPatternsTask extends DefaultTask {
Map<String,String> patterns = new LinkedHashMap<>()
PatternFilterable filesFilter = new PatternSet()
ForbiddenPatternsTask() {
// we always include all source files, and exclude what should not be checked
filesFilter.include('**')
// exclude known binary extensions
filesFilter.exclude('**/*.gz')
filesFilter.exclude('**/*.ico')
filesFilter.exclude('**/*.jar')
filesFilter.exclude('**/*.zip')
filesFilter.exclude('**/*.jks')
filesFilter.exclude('**/*.crt')
filesFilter.exclude('**/*.png')
// TODO: add compile and test compile outputs as this tasks outputs, so we don't rerun when source files haven't changed
}
/** Adds a file glob pattern to be excluded */
void exclude(String... excludes) {
this.filesFilter.exclude(excludes)
}
/** Adds pattern to forbid */
void rule(Map<String,String> props) {
String name = props.get('name')
if (name == null) {
throw new IllegalArgumentException('Missing [name] for invalid pattern rule')
}
String pattern = props.get('pattern')
if (pattern == null) {
throw new IllegalArgumentException('Missing [pattern] for invalid pattern rule')
}
// TODO: fail if pattern contains a newline, it won't work (currently)
patterns.put(name, pattern)
}
/** Returns the files this task will check */
@InputFiles
FileCollection files() {
List<FileCollection> collections = new ArrayList<>()
for (SourceSet sourceSet : project.sourceSets) {
collections.add(sourceSet.allSource.matching(filesFilter))
}
return project.files(collections.toArray())
}
@TaskAction
void checkInvalidPatterns() {
Pattern allPatterns = Pattern.compile('(' + patterns.values().join(')|(') + ')')
List<String> failures = new ArrayList<>()
for (File f : files()) {
f.eachLine('UTF-8') { line, lineNumber ->
if (allPatterns.matcher(line).find()) {
addErrorMessages(failures, f, (String)line, (int)lineNumber)
}
}
}
if (failures.isEmpty() == false) {
throw new IllegalArgumentException('Found invalid patterns:\n' + failures.join('\n'))
}
}
// iterate through patterns to find the right ones for nice error messages
void addErrorMessages(List<String> failures, File f, String line, int lineNumber) {
String path = project.getRootProject().projectDir.toURI().relativize(f.toURI()).toString()
for (Map.Entry<String,String> pattern : patterns.entrySet()) {
if (Pattern.compile(pattern.value).matcher(line).find()) {
failures.add('- ' + pattern.key + ' on line ' + lineNumber + ' of ' + path)
}
}
}
}
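Beyond the default `nocommit` and tab rules wired up in `PrecommitTasks`, a module can exclude files and add its own rules; a sketch (the rule name and pattern are examples, not part of this commit):

```groovy
// in a module's build.gradle; patterns must be single-line regexes (no newlines, per the TODO above)
forbiddenPatterns {
    exclude '**/*.json'  // skip generated fixtures
    rule name: 'wildcard-imports', pattern: /import\s+\w+(\.\w+)*\.\*/
}
```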

@@ -0,0 +1,93 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle.precommit
import org.gradle.api.Project
import org.gradle.api.Task
import org.gradle.api.plugins.JavaBasePlugin
import org.gradle.api.tasks.TaskContainer
/**
* Validation tasks which should be run before committing. These run before tests.
*/
class PrecommitTasks {
/** Adds a precommit task, which depends on non-test verification tasks. */
static void configure(Project project) {
List precommitTasks = [
configureForbiddenApis(project),
configureForbiddenPatterns(project.tasks)]
Map precommitOptions = [
name: 'precommit',
group: JavaBasePlugin.VERIFICATION_GROUP,
description: 'Runs all non-test checks.',
dependsOn: precommitTasks
]
Task precommit = project.tasks.create(precommitOptions)
project.check.dependsOn(precommit)
// delay ordering relative to test tasks, since they may not be setup yet
project.afterEvaluate {
Task test = project.tasks.findByName('test')
if (test != null) {
test.mustRunAfter(precommit)
}
Task integTest = project.tasks.findByName('integTest')
if (integTest != null) {
integTest.mustRunAfter(precommit)
}
}
}
static Task configureForbiddenApis(Project project) {
project.pluginManager.apply('de.thetaphi.forbiddenapis')
project.forbiddenApis {
internalRuntimeForbidden = true
failOnUnsupportedJava = false
bundledSignatures = ['jdk-unsafe', 'jdk-deprecated']
signaturesURLs = [getClass().getResource('/forbidden/all-signatures.txt')]
suppressAnnotations = ['**.SuppressForbidden']
}
project.tasks.findByName('forbiddenApisMain').configure {
bundledSignatures += ['jdk-system-out']
signaturesURLs += [
getClass().getResource('/forbidden/core-signatures.txt'),
getClass().getResource('/forbidden/third-party-signatures.txt')]
}
project.tasks.findByName('forbiddenApisTest').configure {
signaturesURLs += [getClass().getResource('/forbidden/test-signatures.txt')]
}
Task forbiddenApis = project.tasks.findByName('forbiddenApis')
forbiddenApis.group = "" // clear group, so this does not show up under verification tasks
return forbiddenApis
}
static Task configureForbiddenPatterns(TaskContainer tasks) {
Map options = [
name: 'forbiddenPatterns',
type: ForbiddenPatternsTask,
description: 'Checks source files for invalid patterns like nocommits or tabs',
]
return tasks.create(options) {
rule name: 'nocommit', pattern: /nocommit/
rule name: 'tab', pattern: /\t/
}
}
}

@@ -0,0 +1,63 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle.test
import org.gradle.api.file.FileCollection
import org.gradle.api.tasks.Input
/** Configuration for an elasticsearch cluster, used for integration tests. */
class ClusterConfiguration {
@Input
int numNodes = 1
@Input
int httpPort = 9400
@Input
int transportPort = 9500
Map<String, String> systemProperties = new HashMap<>()
@Input
void systemProperty(String property, String value) {
systemProperties.put(property, value)
}
LinkedHashMap<String, Object[]> setupCommands = new LinkedHashMap<>()
@Input
void plugin(String name, FileCollection file) {
setupCommands.put(name, ['bin/plugin', 'install', new LazyFileUri(file: file)])
}
static class LazyFileUri {
FileCollection file
@Override
String toString() {
return file.singleFile.toURI().toURL().toString();
}
}
@Input
void setupCommand(String name, Object... args) {
setupCommands.put(name, args)
}
}

@@ -0,0 +1,199 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle.test
import org.apache.tools.ant.taskdefs.condition.Os
import org.elasticsearch.gradle.ElasticsearchProperties
import org.gradle.api.DefaultTask
import org.gradle.api.GradleException
import org.gradle.api.Project
import org.gradle.api.Task
import org.gradle.api.tasks.Copy
import org.gradle.api.tasks.Delete
import org.gradle.api.tasks.Exec
/**
* A helper for creating tasks that build a cluster for use by another task, and tear the cluster down when that task finishes.
*/
class ClusterFormationTasks {
/**
* Adds dependent tasks to the given task to start a cluster with the given configuration.
* Also adds a finalize task to stop the cluster.
*/
static void setup(Project project, Task task, ClusterConfiguration config) {
if (task.getEnabled() == false) {
// no need for cluster formation if the task won't run!
return
}
addZipConfiguration(project)
File clusterDir = new File(project.buildDir, 'cluster' + File.separator + task.name)
if (config.numNodes == 1) {
addNodeStartupTasks(project, task, config, clusterDir)
addNodeStopTask(project, task, clusterDir)
} else {
for (int i = 0; i < config.numNodes; ++i) {
File nodeDir = new File(clusterDir, "node${i}")
addNodeStartupTasks(project, task, config, nodeDir)
addNodeStopTask(project, task, nodeDir)
}
}
}
static void addNodeStartupTasks(Project project, Task task, ClusterConfiguration config, File baseDir) {
String clusterName = "${task.path.replace(':', '_').substring(1)}"
File home = new File(baseDir, "elasticsearch-${ElasticsearchProperties.version}")
List setupDependsOn = [project.configurations.elasticsearchZip]
setupDependsOn.addAll(task.dependsOn)
Task setup = project.tasks.create(name: task.name + '#setup', type: Copy, dependsOn: setupDependsOn) {
from { project.zipTree(project.configurations.elasticsearchZip.singleFile) }
into baseDir
}
// chain setup tasks to maintain their order
setup = project.tasks.create(name: "${task.name}#clean", type: Delete, dependsOn: setup) {
delete new File(home, 'plugins'), new File(home, 'data'), new File(home, 'logs')
}
setup = project.tasks.create(name: "${task.name}#configure", type: DefaultTask, dependsOn: setup) << {
File configFile = new File(home, 'config' + File.separator + 'elasticsearch.yml')
logger.info("Configuring ${configFile}")
configFile.setText("cluster.name: ${clusterName}", 'UTF-8')
}
for (Map.Entry<String, Object[]> command : config.setupCommands.entrySet()) {
Task nextSetup = project.tasks.create(name: "${task.name}#${command.getKey()}", type: Exec, dependsOn: setup) {
workingDir home
if (Os.isFamily(Os.FAMILY_WINDOWS)) {
executable 'cmd'
args '/C', 'call'
} else {
executable 'sh'
}
args command.getValue()
// only show output on failure, when not in info or debug mode
if (logger.isInfoEnabled() == false) {
standardOutput = new ByteArrayOutputStream()
errorOutput = standardOutput
ignoreExitValue = true
doLast {
if (execResult.exitValue != 0) {
logger.error(standardOutput.toString())
throw new GradleException("Process '${command.getValue().join(' ')}' finished with non-zero exit value ${execResult.exitValue}")
}
}
}
}
setup = nextSetup
}
File pidFile = pidFile(baseDir)
List esArgs = [
"-Des.http.port=${config.httpPort}",
"-Des.transport.tcp.port=${config.transportPort}",
"-Des.pidfile=${pidFile}",
"-Des.path.repo=${home}/repo",
"-Des.path.shared_data=${home}/../",
]
esArgs.addAll(config.systemProperties.collect {key, value -> "-D${key}=${value}"})
Closure esPostStartActions = { ant, logger ->
ant.waitfor(maxwait: '30', maxwaitunit: 'second', checkevery: '500', checkeveryunit: 'millisecond', timeoutproperty: "failed${task.name}#start") {
and {
resourceexists {
file file: pidFile.toString()
}
http(url: "http://localhost:${config.httpPort}")
}
}
if (ant.properties.containsKey("failed${task.name}#start".toString())) {
new File(home, 'logs' + File.separator + clusterName + '.log').eachLine {
line -> logger.error(line)
}
throw new GradleException('Failed to start elasticsearch')
}
}
Task start;
if (Os.isFamily(Os.FAMILY_WINDOWS)) {
// elasticsearch.bat is spawned as it has no daemon mode
start = project.tasks.create(name: "${task.name}#start", type: DefaultTask, dependsOn: setup) << {
// Fall back to Ant exec task as Gradle Exec task does not support spawning yet
ant.exec(executable: 'cmd', spawn: true, dir: home) {
(['/C', 'call', 'bin/elasticsearch'] + esArgs).each { arg(value: it) }
}
esPostStartActions(ant, logger)
}
} else {
start = project.tasks.create(name: "${task.name}#start", type: Exec, dependsOn: setup) {
workingDir home
executable 'sh'
args 'bin/elasticsearch', '-d' // daemonize!
args esArgs
errorOutput = new ByteArrayOutputStream()
doLast {
if (errorOutput.toString().isEmpty() == false) {
logger.error(errorOutput.toString())
new File(home, 'logs' + File.separator + clusterName + '.log').eachLine {
line -> logger.error(line)
}
throw new GradleException('Failed to start elasticsearch')
}
esPostStartActions(ant, logger)
}
}
}
task.dependsOn(start)
}
static void addNodeStopTask(Project project, Task task, File baseDir) {
LazyPidReader pidFile = new LazyPidReader(pidFile: pidFile(baseDir))
Task stop = project.tasks.create(name: task.name + '#stop', type: Exec) {
if (Os.isFamily(Os.FAMILY_WINDOWS)) {
executable 'Taskkill'
args '/PID', pidFile, '/F'
} else {
executable 'kill'
args '-9', pidFile
}
doLast {
// TODO: wait for pid to close, or kill -9 and fail
}
}
task.finalizedBy(stop)
}
/** Delays reading a pid file until needing to use the pid */
static class LazyPidReader {
File pidFile
@Override
String toString() {
return pidFile.text.stripMargin()
}
}
static File pidFile(File dir) {
return new File(dir, 'es.pid')
}
static void addZipConfiguration(Project project) {
String elasticsearchVersion = ElasticsearchProperties.version
project.configurations {
elasticsearchZip
}
project.dependencies {
elasticsearchZip "org.elasticsearch.distribution.zip:elasticsearch:${elasticsearchVersion}@zip"
}
}
}

@@ -0,0 +1,90 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle.test
import com.carrotsearch.gradle.randomizedtesting.RandomizedTestingTask
import org.elasticsearch.gradle.BuildPlugin
import org.gradle.api.Project
import org.gradle.api.Task
import org.gradle.api.plugins.JavaBasePlugin
import org.gradle.api.tasks.Input
import org.gradle.util.ConfigureUtil
/**
* Runs integration tests, but first starts an ES cluster,
* and passes the ES cluster info as parameters to the tests.
*/
class RestIntegTestTask extends RandomizedTestingTask {
ClusterConfiguration clusterConfig = new ClusterConfiguration()
@Input
boolean includePackaged = false
static RestIntegTestTask configure(Project project) {
Map integTestOptions = [
name: 'integTest',
type: RestIntegTestTask,
dependsOn: 'testClasses',
group: JavaBasePlugin.VERIFICATION_GROUP,
description: 'Runs rest tests against an elasticsearch cluster.'
]
RestIntegTestTask integTest = project.tasks.create(integTestOptions)
integTest.configure(BuildPlugin.commonTestConfig(project))
integTest.configure {
include '**/*IT.class'
systemProperty 'tests.rest.load_packaged', 'false'
}
RandomizedTestingTask test = project.tasks.findByName('test')
if (test != null) {
integTest.classpath = test.classpath
integTest.testClassesDir = test.testClassesDir
integTest.mustRunAfter(test)
}
project.check.dependsOn(integTest)
RestSpecHack.configureDependencies(project)
project.afterEvaluate {
integTest.dependsOn(RestSpecHack.configureTask(project, integTest.includePackaged))
}
return integTest
}
RestIntegTestTask() {
project.afterEvaluate {
Task test = project.tasks.findByName('test')
if (test != null) {
mustRunAfter(test)
}
ClusterFormationTasks.setup(project, this, clusterConfig)
configure {
parallelism '1'
systemProperty 'tests.cluster', "localhost:${clusterConfig.transportPort}"
}
}
}
@Input
void cluster(Closure closure) {
ConfigureUtil.configure(closure, clusterConfig)
}
ClusterConfiguration getCluster() {
return clusterConfig
}
}
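Putting `ClusterConfiguration` and this task together, a qa project might configure its test cluster as in the following sketch (the ports, plugin name, and setup command are illustrative assumptions):

```groovy
// integTest is the task name registered by RestIntegTestTask.configure
integTest {
    includePackaged = true  // also copy rest-api-spec/test/** (see RestSpecHack)
    cluster {
        numNodes = 2
        httpPort = 9600
        transportPort = 9700
        systemProperty 'es.script.inline', 'on'
        setupCommand 'installIcu', 'bin/plugin', 'install', 'analysis-icu'
    }
}
```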

@@ -0,0 +1,75 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle.test
import org.elasticsearch.gradle.ElasticsearchProperties
import org.gradle.api.Project
import org.gradle.api.Task
import org.gradle.api.tasks.Copy
/**
* The rest-api-spec tests are loaded from the classpath. However, they
* currently must be available on the local filesystem. This class encapsulates
* setting up tasks to copy the rest spec api to test resources.
*/
class RestSpecHack {
/**
* Sets dependencies needed to copy the rest spec.
* @param project The project to add rest spec dependency to
*/
static void configureDependencies(Project project) {
project.configurations {
restSpec
}
project.dependencies {
restSpec "org.elasticsearch:rest-api-spec:${ElasticsearchProperties.version}"
}
}
/**
* Creates a task to copy the rest spec files.
*
* @param project The project to add the copy task to
* @param includePackagedTests true if the packaged tests should be copied, false otherwise
*/
static Task configureTask(Project project, boolean includePackagedTests) {
Map copyRestSpecProps = [
name : 'copyRestSpec',
type : Copy,
dependsOn: [project.configurations.restSpec, 'processTestResources']
]
Task copyRestSpec = project.tasks.create(copyRestSpecProps) {
from { project.zipTree(project.configurations.restSpec.singleFile) }
include 'rest-api-spec/api/**'
if (includePackagedTests) {
include 'rest-api-spec/test/**'
}
into project.sourceSets.test.output.resourcesDir
}
project.idea {
module {
if (scopes.TEST != null) {
// TODO: need to add the TEST scope somehow for rest test plugin...
scopes.TEST.plus.add(project.configurations.restSpec)
}
}
}
return copyRestSpec
}
}

@@ -0,0 +1,59 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.gradle.test
import com.carrotsearch.gradle.randomizedtesting.RandomizedTestingTask
import org.elasticsearch.gradle.ElasticsearchProperties
import org.gradle.api.Plugin
import org.gradle.api.Project
/** Configures the build to have a rest integration test. */
class RestTestPlugin implements Plugin<Project> {
@Override
void apply(Project project) {
project.pluginManager.apply('java-base')
project.pluginManager.apply('carrotsearch.randomizedtesting')
project.pluginManager.apply('idea')
// remove some unnecessary tasks for a qa test
project.tasks.removeAll { it.name in ['assemble', 'buildDependents'] }
// only setup tests to build
project.sourceSets {
test
}
project.dependencies {
testCompile "org.elasticsearch:test-framework:${ElasticsearchProperties.version}"
}
RandomizedTestingTask integTest = RestIntegTestTask.configure(project)
RestSpecHack.configureDependencies(project)
integTest.configure {
classpath = project.sourceSets.test.runtimeClasspath
testClassesDir project.sourceSets.test.output.classesDir
}
project.eclipse {
classpath {
sourceSets = [project.sourceSets.test]
}
}
}
}
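Given the `implementation-class` mappings in the plugin descriptors below, applying the whole rest-test setup from a qa project reduces to a few lines; a sketch (the plugin id is an assumption based on typical Gradle plugin-descriptor naming):

```groovy
// build.gradle of a rest qa project
apply plugin: 'elasticsearch.rest-test' // resolves to RestTestPlugin via META-INF/gradle-plugins

integTest {
    cluster {
        systemProperty 'es.node.mode', 'network'
    }
}
```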

@@ -0,0 +1 @@
implementation-class=com.carrotsearch.gradle.randomizedtesting.RandomizedTestingPlugin

@@ -0,0 +1 @@
implementation-class=org.elasticsearch.gradle.BuildPlugin

@@ -0,0 +1 @@
implementation-class=org.elasticsearch.gradle.plugin.PluginBuildPlugin

@@ -0,0 +1 @@
implementation-class=org.elasticsearch.gradle.test.RestTestPlugin

@@ -0,0 +1 @@
version=@version@

@@ -0,0 +1,92 @@
# Licensed to Elasticsearch under one or more contributor
# license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright
# ownership. Elasticsearch licenses this file to you under
# the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on
# an 'AS IS' BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
# either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
@defaultMessage Convert to URI
java.net.URL#getPath()
java.net.URL#getFile()
@defaultMessage Usage of getLocalHost is discouraged
java.net.InetAddress#getLocalHost()
@defaultMessage Use java.nio.file instead of java.io.File API
java.util.jar.JarFile
java.util.zip.ZipFile
java.io.File
java.io.FileInputStream
java.io.FileOutputStream
java.io.PrintStream#<init>(java.lang.String,java.lang.String)
java.io.PrintWriter#<init>(java.lang.String,java.lang.String)
java.util.Formatter#<init>(java.lang.String,java.lang.String,java.util.Locale)
java.io.RandomAccessFile
java.nio.file.Path#toFile()
@defaultMessage Don't use deprecated lucene apis
org.apache.lucene.index.DocsEnum
org.apache.lucene.index.DocsAndPositionsEnum
org.apache.lucene.queries.TermFilter
org.apache.lucene.queries.TermsFilter
org.apache.lucene.search.TermRangeFilter
org.apache.lucene.search.NumericRangeFilter
org.apache.lucene.search.PrefixFilter
java.nio.file.Paths @ Use org.elasticsearch.common.io.PathUtils.get() instead.
java.nio.file.FileSystems#getDefault() @ use org.elasticsearch.common.io.PathUtils.getDefaultFileSystem() instead.
@defaultMessage Specify a location for the temp file/directory instead.
java.nio.file.Files#createTempDirectory(java.lang.String,java.nio.file.attribute.FileAttribute[])
java.nio.file.Files#createTempFile(java.lang.String,java.lang.String,java.nio.file.attribute.FileAttribute[])
@defaultMessage Don't use java serialization - this can break BWC without noticing it
java.io.ObjectOutputStream
java.io.ObjectOutput
java.io.ObjectInputStream
java.io.ObjectInput
java.nio.file.Files#isHidden(java.nio.file.Path) @ Dependent on the operating system, use FileSystemUtils.isHidden instead
java.nio.file.Files#getFileStore(java.nio.file.Path) @ Use org.elasticsearch.env.Environment.getFileStore() instead, impacted by JDK-8034057
java.nio.file.Files#isWritable(java.nio.file.Path) @ Use org.elasticsearch.env.Environment.isWritable() instead, impacted by JDK-8034057
@defaultMessage Resolve hosts explicitly to the address(es) you want with InetAddress.
java.net.InetSocketAddress#<init>(java.lang.String,int)
java.net.Socket#<init>(java.lang.String,int)
java.net.Socket#<init>(java.lang.String,int,java.net.InetAddress,int)
@defaultMessage Don't bind to wildcard addresses. Be specific.
java.net.DatagramSocket#<init>()
java.net.DatagramSocket#<init>(int)
java.net.InetSocketAddress#<init>(int)
java.net.MulticastSocket#<init>()
java.net.MulticastSocket#<init>(int)
java.net.ServerSocket#<init>(int)
java.net.ServerSocket#<init>(int,int)
@defaultMessage use NetworkAddress format/formatAddress to print IP or IP+ports
java.net.InetAddress#toString()
java.net.InetAddress#getHostAddress()
java.net.Inet4Address#getHostAddress()
java.net.Inet6Address#getHostAddress()
java.net.InetSocketAddress#toString()
@defaultMessage avoid DNS lookups by accident: if you have a valid reason, then @SuppressWarnings with that reason so it's completely clear
java.net.InetAddress#getHostName()
java.net.InetAddress#getCanonicalHostName()
java.net.InetSocketAddress#getHostName() @ Use getHostString() instead, which avoids a DNS lookup
@defaultMessage Do not violate java's access system
java.lang.reflect.AccessibleObject#setAccessible(boolean)
java.lang.reflect.AccessibleObject#setAccessible(java.lang.reflect.AccessibleObject[], boolean)


@ -0,0 +1,85 @@
# Licensed to Elasticsearch under one or more contributor
# license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright
# ownership. Elasticsearch licenses this file to you under
# the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on
# an 'AS IS' BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
# either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
# For third-party dependencies, please put signatures in third-party.txt instead of here.
@defaultMessage spawns threads with vague names; use a custom thread factory and name threads so that you can tell (by its name) which executor it is associated with
java.util.concurrent.Executors#newFixedThreadPool(int)
java.util.concurrent.Executors#newSingleThreadExecutor()
java.util.concurrent.Executors#newCachedThreadPool()
java.util.concurrent.Executors#newSingleThreadScheduledExecutor()
java.util.concurrent.Executors#newScheduledThreadPool(int)
java.util.concurrent.Executors#defaultThreadFactory()
java.util.concurrent.Executors#privilegedThreadFactory()
java.lang.Character#codePointBefore(char[],int) @ Implicit start offset is error-prone when the char[] is a buffer and the first chars are random chars
java.lang.Character#codePointAt(char[],int) @ Implicit end offset is error-prone when the char[] is a buffer and the last chars are random chars
java.io.StringReader#<init>(java.lang.String) @ Use FastStringReader instead
@defaultMessage Reference management is tricky, leave it to SearcherManager
org.apache.lucene.index.IndexReader#decRef()
org.apache.lucene.index.IndexReader#incRef()
org.apache.lucene.index.IndexReader#tryIncRef()
@defaultMessage Pass the precision step from the mappings explicitly instead
org.apache.lucene.search.NumericRangeQuery#newDoubleRange(java.lang.String,java.lang.Double,java.lang.Double,boolean,boolean)
org.apache.lucene.search.NumericRangeQuery#newFloatRange(java.lang.String,java.lang.Float,java.lang.Float,boolean,boolean)
org.apache.lucene.search.NumericRangeQuery#newIntRange(java.lang.String,java.lang.Integer,java.lang.Integer,boolean,boolean)
org.apache.lucene.search.NumericRangeQuery#newLongRange(java.lang.String,java.lang.Long,java.lang.Long,boolean,boolean)
org.apache.lucene.search.NumericRangeFilter#newDoubleRange(java.lang.String,java.lang.Double,java.lang.Double,boolean,boolean)
org.apache.lucene.search.NumericRangeFilter#newFloatRange(java.lang.String,java.lang.Float,java.lang.Float,boolean,boolean)
org.apache.lucene.search.NumericRangeFilter#newIntRange(java.lang.String,java.lang.Integer,java.lang.Integer,boolean,boolean)
org.apache.lucene.search.NumericRangeFilter#newLongRange(java.lang.String,java.lang.Long,java.lang.Long,boolean,boolean)
@defaultMessage Only use wait / notify when really needed; try to use concurrency primitives, latches or callbacks instead.
java.lang.Object#wait()
java.lang.Object#wait(long)
java.lang.Object#wait(long,int)
java.lang.Object#notify()
java.lang.Object#notifyAll()
@defaultMessage Beware of the behavior of this method on MIN_VALUE
java.lang.Math#abs(int)
java.lang.Math#abs(long)
@defaultMessage Please do not try to stop the world
java.lang.System#gc()
@defaultMessage Use Channels.* methods to write to channels. Do not write directly.
java.nio.channels.WritableByteChannel#write(java.nio.ByteBuffer)
java.nio.channels.FileChannel#write(java.nio.ByteBuffer, long)
java.nio.channels.GatheringByteChannel#write(java.nio.ByteBuffer[], int, int)
java.nio.channels.GatheringByteChannel#write(java.nio.ByteBuffer[])
java.nio.channels.ReadableByteChannel#read(java.nio.ByteBuffer)
java.nio.channels.ScatteringByteChannel#read(java.nio.ByteBuffer[])
java.nio.channels.ScatteringByteChannel#read(java.nio.ByteBuffer[], int, int)
java.nio.channels.FileChannel#read(java.nio.ByteBuffer, long)
@defaultMessage Use Lucene.parseLenient instead; it strips off the minor version
org.apache.lucene.util.Version#parseLeniently(java.lang.String)
@defaultMessage Spawns a new thread which is solely under Lucene's control; use ThreadPool#estimatedTimeInMillisCounter instead
org.apache.lucene.search.TimeLimitingCollector#getGlobalTimerThread()
org.apache.lucene.search.TimeLimitingCollector#getGlobalCounter()
@defaultMessage Don't interrupt threads; use FutureUtils#cancel(Future&lt;T&gt;) instead
java.util.concurrent.Future#cancel(boolean)
@defaultMessage Don't try reading from paths that are not configured in Environment, resolve from Environment instead
org.elasticsearch.common.io.PathUtils#get(java.lang.String, java.lang.String[])
org.elasticsearch.common.io.PathUtils#get(java.net.URI)
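These signature files are consumed by the forbidden-apis checker at build time: each plain line bans a class, method, or constructor, and `@defaultMessage` sets the error text for the entries that follow it. As a rough sketch (the plugin id, task name, and file path below are assumptions for illustration, not taken from this commit), wiring a signatures file into a Gradle build looks like:

```groovy
// hypothetical wiring of a forbidden-apis signatures file;
// the real build configures this inside its own build plugin
apply plugin: 'de.thetaphi.forbiddenapis'

forbiddenApisMain {
    // fail the build on any call matching the signatures above
    signaturesFiles = files('buildSrc/src/main/resources/forbidden/jdk-signatures.txt')
}
```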


@ -0,0 +1,23 @@
# Licensed to Elasticsearch under one or more contributor
# license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright
# ownership. Elasticsearch licenses this file to you under
# the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on
# an 'AS IS' BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
# either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
com.carrotsearch.randomizedtesting.RandomizedTest#globalTempDir() @ Use newTempDirPath() instead
com.carrotsearch.randomizedtesting.annotations.Seed @ Don't commit hardcoded seeds
com.carrotsearch.randomizedtesting.annotations.Repeat @ Don't commit hardcoded repeats
org.apache.lucene.codecs.Codec#setDefault(org.apache.lucene.codecs.Codec) @ Use the SuppressCodecs("*") annotation instead
org.apache.lucene.util.LuceneTestCase$Slow @ Don't write slow tests
org.junit.Ignore @ Use AwaitsFix instead


@ -0,0 +1,66 @@
# Licensed to Elasticsearch under one or more contributor
# license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright
# ownership. Elasticsearch licenses this file to you under
# the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on
# an 'AS IS' BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
# either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
@defaultMessage unsafe encoders/decoders have problems in the lzf compress library. Use variants of encode/decode functions which take Encoder/Decoder.
com.ning.compress.lzf.impl.UnsafeChunkEncoders#createEncoder(int)
com.ning.compress.lzf.impl.UnsafeChunkEncoders#createNonAllocatingEncoder(int)
com.ning.compress.lzf.impl.UnsafeChunkEncoders#createEncoder(int, com.ning.compress.BufferRecycler)
com.ning.compress.lzf.impl.UnsafeChunkEncoders#createNonAllocatingEncoder(int, com.ning.compress.BufferRecycler)
com.ning.compress.lzf.impl.UnsafeChunkDecoder#<init>()
com.ning.compress.lzf.parallel.CompressTask
com.ning.compress.lzf.util.ChunkEncoderFactory#optimalInstance()
com.ning.compress.lzf.util.ChunkEncoderFactory#optimalInstance(int)
com.ning.compress.lzf.util.ChunkEncoderFactory#optimalNonAllocatingInstance(int)
com.ning.compress.lzf.util.ChunkEncoderFactory#optimalInstance(com.ning.compress.BufferRecycler)
com.ning.compress.lzf.util.ChunkEncoderFactory#optimalInstance(int, com.ning.compress.BufferRecycler)
com.ning.compress.lzf.util.ChunkEncoderFactory#optimalNonAllocatingInstance(int, com.ning.compress.BufferRecycler)
com.ning.compress.lzf.util.ChunkDecoderFactory#optimalInstance()
com.ning.compress.lzf.util.LZFFileInputStream#<init>(java.io.File)
com.ning.compress.lzf.util.LZFFileInputStream#<init>(java.io.FileDescriptor)
com.ning.compress.lzf.util.LZFFileInputStream#<init>(java.lang.String)
com.ning.compress.lzf.util.LZFFileOutputStream#<init>(java.io.File)
com.ning.compress.lzf.util.LZFFileOutputStream#<init>(java.io.File, boolean)
com.ning.compress.lzf.util.LZFFileOutputStream#<init>(java.io.FileDescriptor)
com.ning.compress.lzf.util.LZFFileOutputStream#<init>(java.lang.String)
com.ning.compress.lzf.util.LZFFileOutputStream#<init>(java.lang.String, boolean)
com.ning.compress.lzf.LZFEncoder#encode(byte[])
com.ning.compress.lzf.LZFEncoder#encode(byte[], int, int)
com.ning.compress.lzf.LZFEncoder#encode(byte[], int, int, com.ning.compress.BufferRecycler)
com.ning.compress.lzf.LZFEncoder#appendEncoded(byte[], int, int, byte[], int)
com.ning.compress.lzf.LZFEncoder#appendEncoded(byte[], int, int, byte[], int, com.ning.compress.BufferRecycler)
com.ning.compress.lzf.LZFCompressingInputStream#<init>(java.io.InputStream)
com.ning.compress.lzf.LZFDecoder#fastDecoder()
com.ning.compress.lzf.LZFDecoder#decode(byte[])
com.ning.compress.lzf.LZFDecoder#decode(byte[], int, int)
com.ning.compress.lzf.LZFDecoder#decode(byte[], byte[])
com.ning.compress.lzf.LZFDecoder#decode(byte[], int, int, byte[])
com.ning.compress.lzf.LZFInputStream#<init>(java.io.InputStream)
com.ning.compress.lzf.LZFInputStream#<init>(java.io.InputStream, boolean)
com.ning.compress.lzf.LZFInputStream#<init>(java.io.InputStream, com.ning.compress.BufferRecycler)
com.ning.compress.lzf.LZFInputStream#<init>(java.io.InputStream, com.ning.compress.BufferRecycler, boolean)
com.ning.compress.lzf.LZFOutputStream#<init>(java.io.OutputStream)
com.ning.compress.lzf.LZFOutputStream#<init>(java.io.OutputStream, com.ning.compress.BufferRecycler)
com.ning.compress.lzf.LZFUncompressor#<init>(com.ning.compress.DataHandler)
com.ning.compress.lzf.LZFUncompressor#<init>(com.ning.compress.DataHandler, com.ning.compress.BufferRecycler)
@defaultMessage Constructing a DateTime without a time zone is dangerous
org.joda.time.DateTime#<init>()
org.joda.time.DateTime#<init>(long)
org.joda.time.DateTime#<init>(int, int, int, int, int)
org.joda.time.DateTime#<init>(int, int, int, int, int, int)
org.joda.time.DateTime#<init>(int, int, int, int, int, int, int)
org.joda.time.DateTime#now()
org.joda.time.DateTimeZone#getDefault()


@ -0,0 +1,76 @@
# Elasticsearch plugin descriptor file
# This file must exist as 'plugin-descriptor.properties' at
# the root directory of all plugins.
#
# A plugin can be 'site', 'jvm', or both.
#
### example site plugin for "foo":
#
# foo.zip <-- zip file for the plugin, with this structure:
# _site/ <-- the contents that will be served
# plugin-descriptor.properties <-- example contents below:
#
# site=true
# description=My cool plugin
# version=1.0
#
### example jvm plugin for "foo"
#
# foo.zip <-- zip file for the plugin, with this structure:
# <arbitrary name1>.jar <-- classes, resources, dependencies
# <arbitrary nameN>.jar <-- any number of jars
# plugin-descriptor.properties <-- example contents below:
#
# jvm=true
# classname=foo.bar.BazPlugin
# description=My cool plugin
# version=2.0
# elasticsearch.version=2.0
# java.version=1.7
#
### mandatory elements for all plugins:
#
# 'description': simple summary of the plugin
description=${project.description}
#
# 'version': plugin's version
version=${project.version}
#
# 'name': the plugin name
name=${elasticsearch.plugin.name}
### mandatory elements for site plugins:
#
# 'site': set to true to indicate contents of the _site/
# directory in the root of the plugin should be served.
site=${elasticsearch.plugin.site}
#
### mandatory elements for jvm plugins :
#
# 'jvm': true if the 'classname' class should be loaded
# from jar files in the root directory of the plugin.
# Note that only jar files in the root directory are
# added to the classpath for the plugin! If you need
# other resources, package them into a resources jar.
jvm=${elasticsearch.plugin.jvm}
#
# 'classname': the name of the class to load, fully-qualified.
classname=${elasticsearch.plugin.classname}
#
# 'java.version' version of java the code is built against
# use the system property java.specification.version
# version string must be a sequence of nonnegative decimal integers
# separated by "."'s and may have leading zeros
java.version=${java.target.version}
#
# 'elasticsearch.version' version of elasticsearch compiled against
elasticsearch.version=${elasticsearch.version}
#
### deprecated elements for jvm plugins :
#
# 'isolated': true if the plugin should have its own classloader.
# passing false is deprecated, and only intended to support plugins
# that have hard dependencies against each other. If this is
# not specified, then the plugin is isolated by default.
isolated=${elasticsearch.plugin.isolated}
#
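For reference, a filled-in descriptor for a hypothetical jvm plugin (every value below is invented for illustration; real plugins get the `${...}` placeholders above substituted at build time) might look like:

```properties
description=Example analysis plugin
version=1.0.0
name=example-analysis
site=false
jvm=true
classname=org.example.plugin.ExamplePlugin
java.version=1.8
elasticsearch.version=3.0.0
isolated=true
```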

133
core/build.gradle Normal file

@ -0,0 +1,133 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import com.carrotsearch.gradle.randomizedtesting.RandomizedTestingTask
import org.elasticsearch.gradle.BuildPlugin
import org.elasticsearch.gradle.test.RestSpecHack
apply plugin: 'elasticsearch.build'
apply plugin: 'com.bmuschko.nexus'
apply plugin: 'nebula.optional-base'
archivesBaseName = 'elasticsearch'
versions << [
jackson: '2.6.2',
log4j: '1.2.17',
slf4j: '1.6.2'
]
dependencies {
// lucene
compile "org.apache.lucene:lucene-core:${versions.lucene}"
compile "org.apache.lucene:lucene-backward-codecs:${versions.lucene}"
compile "org.apache.lucene:lucene-analyzers-common:${versions.lucene}"
compile "org.apache.lucene:lucene-queries:${versions.lucene}"
compile "org.apache.lucene:lucene-memory:${versions.lucene}"
compile "org.apache.lucene:lucene-highlighter:${versions.lucene}"
compile "org.apache.lucene:lucene-queryparser:${versions.lucene}"
compile "org.apache.lucene:lucene-suggest:${versions.lucene}"
compile "org.apache.lucene:lucene-join:${versions.lucene}"
compile "org.apache.lucene:lucene-spatial:${versions.lucene}"
compile 'org.elasticsearch:securesm:1.0'
// utilities
compile 'commons-cli:commons-cli:1.3.1'
compile 'com.carrotsearch:hppc:0.7.1'
// time handling, remove with java 8 time
compile 'joda-time:joda-time:2.8.2'
// joda 2.0 moved to using volatile fields for datetime
// When updating to a new version, make sure to update our copy of BaseDateTime
compile 'org.joda:joda-convert:1.2'
// json and yaml
compile "com.fasterxml.jackson.core:jackson-core:${versions.jackson}"
compile "com.fasterxml.jackson.dataformat:jackson-dataformat-smile:${versions.jackson}"
compile(group: 'com.fasterxml.jackson.dataformat', name: 'jackson-dataformat-yaml', version: versions.jackson) {
exclude group: 'com.fasterxml.jackson.core', module: 'jackson-databind'
}
compile "com.fasterxml.jackson.dataformat:jackson-dataformat-cbor:${versions.jackson}"
// network stack
compile 'io.netty:netty:3.10.5.Final'
// compression of transport protocol
compile 'com.ning:compress-lzf:1.0.2'
// percentiles aggregation
compile 'com.tdunning:t-digest:3.0'
// percentile ranks aggregation
compile 'org.hdrhistogram:HdrHistogram:2.1.6'
// lucene spatial
compile 'com.spatial4j:spatial4j:0.5', optional
compile 'com.vividsolutions:jts:1.13', optional
// templating
compile 'com.github.spullara.mustache.java:compiler:0.9.1', optional
// logging
compile "log4j:log4j:${versions.log4j}", optional
compile "log4j:apache-log4j-extras:${versions.log4j}", optional
compile "org.slf4j:slf4j-api:${versions.slf4j}", optional
compile 'net.java.dev.jna:jna:4.1.0', optional
// TODO: remove these test deps and just depend on test-framework
testCompile(group: 'junit', name: 'junit', version: '4.11') {
transitive = false
}
testCompile "com.carrotsearch.randomizedtesting:randomizedtesting-runner:${versions.randomizedrunner}"
testCompile("org.apache.lucene:lucene-test-framework:${versions.lucene}") {
exclude group: 'com.carrotsearch.randomizedtesting', module: 'junit4-ant'
}
testCompile(group: 'org.hamcrest', name: 'hamcrest-all', version: '1.3') {
exclude group: 'org.hamcrest', module: 'hamcrest-core'
}
testCompile 'com.google.jimfs:jimfs:1.0'
testCompile "org.apache.httpcomponents:httpclient:${versions.httpclient}"
}
compileJava.options.compilerArgs << "-Xlint:-cast,-deprecation,-fallthrough,-overrides,-rawtypes,-serial,-try,-unchecked"
compileTestJava.options.compilerArgs << "-Xlint:-cast,-deprecation,-fallthrough,-overrides,-rawtypes,-serial,-try,-unchecked"
forbiddenPatterns {
exclude '**/*.json'
exclude '**/*.jmx'
exclude '**/org/elasticsearch/cluster/routing/shard_routes.txt'
}
task integTest(type: RandomizedTestingTask,
group: JavaBasePlugin.VERIFICATION_GROUP,
description: 'Multi-node tests',
dependsOn: test.dependsOn) {
configure(BuildPlugin.commonTestConfig(project))
classpath = project.test.classpath
testClassesDir = project.test.testClassesDir
include '**/*IT.class'
}
check.dependsOn integTest
integTest.mustRunAfter test
RestSpecHack.configureDependencies(project)
Task copyRestSpec = RestSpecHack.configureTask(project, true)
integTest.dependsOn copyRestSpec
test.dependsOn copyRestSpec
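With the task wiring above, integration tests (`**/*IT.class`) run separately from unit tests, and `check` runs both. Typical invocations look roughly like this (the `:core:` project paths are illustrative):

```shell
gradle :core:test       # unit tests only
gradle :core:integTest  # multi-node tests matching **/*IT.class
gradle :core:check      # both, since check.dependsOn integTest
```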


@ -107,7 +107,6 @@ public class AnalysisModuleTests extends ESTestCase {
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.V_0_90_0)
.build();
AnalysisService analysisService2 = getAnalysisService(settings2);
-        // indicesanalysisservice always has the current version
IndicesAnalysisService indicesAnalysisService2 = injector.getInstance(IndicesAnalysisService.class);
assertThat(indicesAnalysisService2.analyzer("default"), is(instanceOf(NamedAnalyzer.class)));


@ -76,40 +76,25 @@ public class ReproduceInfoPrinter extends RunListener {
return;
}
-        final StringBuilder b = new StringBuilder();
-        if (inVerifyPhase()) {
-            b.append("REPRODUCE WITH: mvn verify -Pdev -Dskip.unit.tests");
-        } else {
-            b.append("REPRODUCE WITH: mvn test -Pdev");
-        }
-        String project = System.getProperty("tests.project");
-        if (project != null) {
-            b.append(" -pl " + project);
-        }
-        MavenMessageBuilder mavenMessageBuilder = new MavenMessageBuilder(b);
-        mavenMessageBuilder.appendAllOpts(failure.getDescription());
+        final StringBuilder b = new StringBuilder("REPRODUCE WITH: gradle ");
+        String task = System.getProperty("tests.task");
+        // TODO: enforce (intellij still runs the runner?) or use default "test" but that wont' work for integ
+        b.append(task);
+        GradleMessageBuilder gradleMessageBuilder = new GradleMessageBuilder(b);
+        gradleMessageBuilder.appendAllOpts(failure.getDescription());
         //Rest tests are a special case as they allow for additional parameters
         if (failure.getDescription().getTestClass().isAnnotationPresent(Rest.class)) {
-            mavenMessageBuilder.appendRestTestsProperties();
+            gradleMessageBuilder.appendRestTestsProperties();
         }
         System.err.println(b.toString());
     }
-    protected TraceFormatting traces() {
-        TraceFormatting traces = new TraceFormatting();
-        try {
-            traces = RandomizedContext.current().getRunner().getTraceFormatting();
-        } catch (IllegalStateException e) {
-            // Ignore if no context.
-        }
-        return traces;
-    }
-    protected static class MavenMessageBuilder extends ReproduceErrorMessageBuilder {
+    protected static class GradleMessageBuilder extends ReproduceErrorMessageBuilder {
-        public MavenMessageBuilder(StringBuilder b) {
+        public GradleMessageBuilder(StringBuilder b) {
             super(b);
         }
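The GradleMessageBuilder assembles the reproduction line from system properties such as `tests.task`, so a failing test now prints a Gradle command instead of a Maven one. The printed line has roughly this shape (seed and class values below are illustrative):

```
REPRODUCE WITH: gradle test -Dtests.seed=1A2B3C4D5E6F -Dtests.class=org.elasticsearch.example.ExampleTests
```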


@ -94,7 +94,7 @@ public final class FileUtils {
String newPath = optionalPathPrefix + "/" + path;
file = findFile(fileSystem, newPath, optionalFileSuffix);
if (!lenientExists(file)) {
-            throw new NoSuchFileException(path);
+            throw new NoSuchFileException("path prefix: " + optionalPathPrefix + ", path: " + path + ", file suffix: " + optionalFileSuffix);
}
}
return file;


@ -0,0 +1,9 @@
es.logger.level=INFO
log4j.rootLogger=${es.logger.level}, out
log4j.logger.org.apache.http=INFO, out
log4j.additivity.org.apache.http=false
log4j.appender.out=org.apache.log4j.ConsoleAppender
log4j.appender.out.layout=org.apache.log4j.PatternLayout
log4j.appender.out.layout.conversionPattern=[%d{ISO8601}][%-5p][%-25c] %m%n

12
dev-tools/build.gradle Normal file

@ -0,0 +1,12 @@
apply plugin: 'groovy'
repositories {
mavenCentral()
}
dependencies {
compile gradleApi()
compile localGroovy()
//compile group: 'com.carrotsearch.randomizedtesting', name: 'junit4-ant', version: '2.1.16'
}

201
distribution/build.gradle Normal file

@ -0,0 +1,201 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import org.apache.tools.ant.filters.FixCrLfFilter
import org.elasticsearch.gradle.precommit.DependencyLicensesTask
import org.elasticsearch.gradle.MavenFilteringHack
// for deb/rpm
buildscript {
repositories {
maven {
url "https://plugins.gradle.org/m2/"
}
}
dependencies {
classpath 'com.netflix.nebula:gradle-ospackage-plugin:3.1.0'
}
}
allprojects {
project.ext {
// this is common configuration for distributions, but we also add it here for the license check to use
deps = project("${projectsPrefix}:core").configurations.runtime.copyRecursive().exclude(module: 'slf4j-api')
}
}
subprojects {
/*****************************************************************************
* Maven config *
*****************************************************************************/
// note: the group must be correct before applying the nexus plugin, or it will capture the wrong value...
project.group = "org.elasticsearch.distribution.${project.name}"
apply plugin: 'com.bmuschko.nexus'
// we must create our own install task, because it is only added when the java plugin is added
task install(type: Upload, description: "Installs the 'archives' artifacts into the local Maven repository.", group: 'Upload') {
configuration = configurations.archives
MavenRepositoryHandlerConvention repositoriesHandler = (MavenRepositoryHandlerConvention)getRepositories().getConvention().getPlugin(MavenRepositoryHandlerConvention);
repositoriesHandler.mavenInstaller();
}
// TODO: the map needs to be an input of the tasks, so that when it changes, the task will re-run...
/*****************************************************************************
* Properties to expand when copying packaging files *
*****************************************************************************/
project.ext {
expansions = [
'project.version': version,
'project.parent.artifactId': 'distributions',
// Default values for min/max heap memory allocated to elasticsearch java process
'packaging.elasticsearch.heap.min': '256m',
'packaging.elasticsearch.heap.max': '1g',
'project.build.finalName': "elasticsearch-${version}",
// Default configuration directory and file to use in bin/plugin script
'packaging.plugin.default.config.dir': '$ES_HOME/config',
'packaging.plugin.default.config.file': '$ES_HOME/config/elasticsearch.yml',
'packaging.env.file': '',
// TODO: do we really need this marker? the tgz and zip are exactly the same,
// we should not need to specify twice just to change this
'packaging.type': 'tar.gz',
]
/*****************************************************************************
* Common files in all distributions *
*****************************************************************************/
libFiles = copySpec {
into 'lib'
from project("${projectsPrefix}:core").jar
from deps
}
configFiles = copySpec {
from '../src/main/resources/config'
}
commonFiles = copySpec {
// everything except windows files, and config is separate
from '../src/main/resources'
exclude 'bin/*.bat'
exclude 'bin/*.exe'
exclude 'config/**'
filesMatching('bin/*') { it.setMode(0755) }
}
}
}
/*****************************************************************************
* Zip and tgz configuration *
*****************************************************************************/
configure(subprojects.findAll { it.name == 'zip' || it.name == 'tar' }) {
project.ext.archivesFiles = copySpec {
into("elasticsearch-${version}") {
with libFiles
into('config') {
with configFiles
}
with copySpec {
with commonFiles
from('../src/main/resources') {
include 'bin/*.bat'
filter(FixCrLfFilter, eol: FixCrLfFilter.CrLf.newInstance('crlf'))
}
MavenFilteringHack.filter(it, expansions)
}
from('../src/main/resources') {
include 'bin/*.exe'
}
}
}
}
/*****************************************************************************
* Deb and rpm configuration *
*****************************************************************************/
// ospackage supports adding empty dirs with directory() to rpm, but not deb...yet
// https://github.com/nebula-plugins/gradle-ospackage-plugin/issues/115
// however, even adding just for rpm doesn't seem to work...
// gradle may also get native support https://issues.gradle.org/browse/GRADLE-1671
// in the meantime, we hack this by copying an empty dir
// TODO: HACK DOES NOT WORK
/*ext.emptyDir = new File(project.buildDir, 'empty')
Closure emptyDirSpec() {
return {
from emptyDir
addParentDirs false
createDirectoryEntry true
}
}
task createEmptyDir << {
emptyDir.mkdirs()
}
buildRpm.dependsOn createEmptyDir
buildDeb.dependsOn createEmptyDir
*/
/*****************************************************************************
* Deb and rpm configuration *
*****************************************************************************/
configure(subprojects.findAll { it.name == 'deb' || it.name == 'rpm' }) {
apply plugin: 'nebula.ospackage-base'
ospackage {
packageName = 'elasticsearch'
// TODO: '-' is an illegal character in rpm version...redline croaks
version = '3.0.0'
into '/usr/share/elasticsearch'
user 'root'
permissionGroup 'root'
with libFiles
with copySpec {
with commonFiles
// TODO: omit LICENSE.txt file on deb??
}
into('/etc/elasticsearch') {
with configFiles
//into('scripts', emptyDirSpec())
createDirectoryEntry = true
includeEmptyDirs = true
}
directory('/etc/elasticsearch/scripts')
}
if (project.name == 'deb') {
task buildDeb(type: Deb) {
dependsOn deps
}
artifacts {
archives buildDeb
}
} else if (project.name == 'rpm') {
task buildRpm(type: Rpm) {
dependsOn deps
}
artifacts {
archives buildRpm
}
}
}
// TODO: dependency checks should really be when building the jar itself, which would remove the need
// for this hackery and instead we can do this inside the BuildPlugin
task check(group: 'Verification', description: 'Runs all checks.') {} // dummy task!
DependencyLicensesTask.configure(project) {
dependsOn = [deps]
dependencies = deps
mapping from: /lucene-.*/, to: 'lucene'
mapping from: /jackson-.*/, to: 'jackson'
}


@ -0,0 +1,4 @@
/*task buildDeb(type: Deb) {
dependsOn deps
}*/


@ -0,0 +1,4 @@
/*task buildRpm(type: Rpm) {
dependsOn deps
}*/


@ -0,0 +1,10 @@
task buildTar(type: Tar, dependsOn: deps) {
baseName = 'elasticsearch'
with archivesFiles
compression = Compression.GZIP
}
artifacts {
archives buildTar
}


@ -0,0 +1,11 @@
task buildZip(type: Zip, dependsOn: deps) {
baseName = 'elasticsearch'
with archivesFiles
}
artifacts {
'default' buildZip
archives buildZip
}


@ -0,0 +1,2 @@
confi

2
gradle.properties Normal file

@ -0,0 +1,2 @@
group=org.elasticsearch
version=3.0.0-SNAPSHOT


@ -0,0 +1,34 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
description 'The ICU Analysis plugin integrates Lucene ICU module into elasticsearch, adding ICU relates analysis components.'
classname 'org.elasticsearch.plugin.analysis.icu.AnalysisICUPlugin'
}
dependencies {
compile "org.apache.lucene:lucene-analyzers-icu:${versions.lucene}"
}
dependencyLicenses {
mapping from: /lucene-.*/, to: 'lucene'
}
compileJava.options.compilerArgs << "-Xlint:-deprecation"


@ -0,0 +1,32 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Japanese (kuromoji) Analysis plugin integrates the Lucene kuromoji analysis module into elasticsearch.'
  classname 'org.elasticsearch.plugin.analysis.kuromoji.AnalysisKuromojiPlugin'
}

dependencies {
  compile "org.apache.lucene:lucene-analyzers-kuromoji:${versions.lucene}"
}

dependencyLicenses {
  mapping from: /lucene-.*/, to: 'lucene'
}

@@ -0,0 +1,34 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Phonetic Analysis plugin integrates phonetic token filter analysis with elasticsearch.'
  classname 'org.elasticsearch.plugin.analysis.AnalysisPhoneticPlugin'
}

dependencies {
  compile "org.apache.lucene:lucene-analyzers-phonetic:${versions.lucene}"
}

dependencyLicenses {
  mapping from: /lucene-.*/, to: 'lucene'
}

compileJava.options.compilerArgs << "-Xlint:-rawtypes,-unchecked"

@@ -0,0 +1,32 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Smart Chinese Analysis plugin integrates the Lucene Smart Chinese analysis module into elasticsearch.'
  classname 'org.elasticsearch.plugin.analysis.smartcn.AnalysisSmartChinesePlugin'
}

dependencies {
  compile "org.apache.lucene:lucene-analyzers-smartcn:${versions.lucene}"
}

dependencyLicenses {
  mapping from: /lucene-.*/, to: 'lucene'
}

@@ -0,0 +1,31 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Stempel (Polish) Analysis plugin integrates the Lucene stempel (Polish) analysis module into elasticsearch.'
  classname 'org.elasticsearch.plugin.analysis.stempel.AnalysisStempelPlugin'
}

dependencies {
  compile "org.apache.lucene:lucene-analyzers-stempel:${versions.lucene}"
}

dependencyLicenses {
  mapping from: /lucene-.*/, to: 'lucene'
}

plugins/build.gradle Normal file
@@ -0,0 +1,32 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import org.elasticsearch.gradle.precommit.DependencyLicensesTask

subprojects {
  group = 'org.elasticsearch.plugin'
  apply plugin: 'elasticsearch.esplugin'
  apply plugin: 'com.bmuschko.nexus'

  Task dependencyLicensesTask = DependencyLicensesTask.configure(project) {
    dependencies = project.configurations.runtime - project.configurations.provided
  }
  project.precommit.dependsOn(dependencyLicensesTask)
}

@@ -0,0 +1,24 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Delete By Query plugin allows deleting documents in Elasticsearch with a single query.'
  classname 'org.elasticsearch.plugin.deletebyquery.DeleteByQueryPlugin'
}

@@ -0,0 +1,48 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Azure Discovery plugin allows using the Azure API for the unicast discovery mechanism.'
  classname 'org.elasticsearch.plugin.discovery.azure.AzureDiscoveryPlugin'
}

dependencies {
  compile('com.microsoft.azure:azure-management-compute:0.7.0') {
    exclude group: 'stax', module: 'stax-api'
  }
  compile('com.microsoft.azure:azure-management:0.7.0') {
    exclude group: 'stax', module: 'stax-api'
  }
  compile "org.apache.httpcomponents:httpclient:${versions.httpclient}"
}

dependencyLicenses {
  mapping from: /azure-.*/, to: 'azure'
  mapping from: /jackson-.*/, to: 'jackson'
  mapping from: /jersey-.*/, to: 'jersey'
  mapping from: /jaxb-.*/, to: 'jaxb'
  mapping from: /stax-.*/, to: 'stax'
}

compileJava.options.compilerArgs << '-Xlint:-path,-serial,-static,-unchecked'
// TODO: why is deprecation needed here but not in maven....?
compileJava.options.compilerArgs << '-Xlint:-deprecation'
// TODO: and why does this static not show up in maven...
compileTestJava.options.compilerArgs << '-Xlint:-static'

@@ -0,0 +1,40 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The EC2 discovery plugin allows using the AWS API for the unicast discovery mechanism.'
  classname 'org.elasticsearch.plugin.discovery.ec2.Ec2DiscoveryPlugin'
}

dependencies {
  compile 'com.amazonaws:aws-java-sdk-ec2:1.10.19'
  compile "org.apache.httpcomponents:httpclient:${versions.httpclient}"
}

dependencyLicenses {
  mapping from: /aws-java-sdk-.*/, to: 'aws-java-sdk'
  mapping from: /jackson-.*/, to: 'jackson'
}

compileJava.options.compilerArgs << '-Xlint:-rawtypes'

test {
  // this is needed for insecure plugins, remove if possible!
  systemProperty 'tests.artifact', project.name
}

@@ -0,0 +1,23 @@
esplugin {
  description 'The Google Compute Engine (GCE) Discovery plugin allows using the GCE API for the unicast discovery mechanism.'
  classname 'org.elasticsearch.plugin.discovery.gce.GceDiscoveryPlugin'
}

dependencies {
  compile('com.google.apis:google-api-services-compute:v1-rev71-1.20.0') {
    exclude group: 'com.google.guava', module: 'guava-jdk5'
  }
  compile "org.apache.httpcomponents:httpclient:${versions.httpclient}"
}

dependencyLicenses {
  mapping from: /google-.*/, to: 'google'
}

compileJava.options.compilerArgs << '-Xlint:-rawtypes,-unchecked'

test {
  // this is needed for insecure plugins, remove if possible!
  systemProperty 'tests.artifact', project.name
}

@@ -0,0 +1,25 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Multicast Discovery plugin allows discovering other nodes using multicast requests.'
  classname 'org.elasticsearch.plugin.discovery.multicast.MulticastDiscoveryPlugin'
}

compileJava.options.compilerArgs << "-Xlint:-deprecation"

@@ -0,0 +1,29 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'Demonstrates all the pluggable Java entry points in Elasticsearch'
  classname 'org.elasticsearch.plugin.example.JvmExamplePlugin'
}

// no unit tests
test.enabled = false

compileJava.options.compilerArgs << "-Xlint:-rawtypes"

@@ -0,0 +1,35 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'Lucene expressions integration for Elasticsearch'
  classname 'org.elasticsearch.script.expression.ExpressionPlugin'
}

dependencies {
  compile "org.apache.lucene:lucene-expressions:${versions.lucene}"
}

dependencyLicenses {
  mapping from: /lucene-.*/, to: 'lucene'
}

compileJava.options.compilerArgs << '-Xlint:-rawtypes'
compileTestJava.options.compilerArgs << '-Xlint:-rawtypes'

@@ -0,0 +1,37 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'Groovy scripting integration for Elasticsearch'
  classname 'org.elasticsearch.script.groovy.GroovyPlugin'
}

dependencies {
  compile 'org.codehaus.groovy:groovy-all:2.4.4:indy'
}

compileJava.options.compilerArgs << '-Xlint:-rawtypes,-unchecked,-cast,-deprecation'
compileTestJava.options.compilerArgs << '-Xlint:-rawtypes,-unchecked,-cast,-deprecation'

integTest {
  cluster {
    systemProperty 'es.script.inline', 'on'
    systemProperty 'es.script.indexed', 'on'
  }
}

@@ -0,0 +1,38 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The JavaScript language plugin allows using JavaScript as the language of scripts to execute.'
  classname 'org.elasticsearch.plugin.javascript.JavaScriptPlugin'
}

dependencies {
  compile 'org.mozilla:rhino:1.7R4'
}

compileJava.options.compilerArgs << "-Xlint:-rawtypes,-unchecked"
compileTestJava.options.compilerArgs << "-Xlint:-rawtypes,-unchecked"

integTest {
  cluster {
    systemProperty 'es.script.inline', 'on'
    systemProperty 'es.script.indexed', 'on'
  }
}

@@ -0,0 +1,38 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Python language plugin allows using Python as the language of scripts to execute.'
  classname 'org.elasticsearch.plugin.python.PythonPlugin'
}

dependencies {
  compile 'org.python:jython-standalone:2.7.0'
}

compileJava.options.compilerArgs << "-Xlint:-unchecked"
compileTestJava.options.compilerArgs << "-Xlint:-unchecked"

integTest {
  cluster {
    systemProperty 'es.script.inline', 'on'
    systemProperty 'es.script.indexed', 'on'
  }
}

@@ -0,0 +1,25 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Mapper Murmur3 plugin allows computing hashes of a field\'s values at index time and storing them in the index.'
  classname 'org.elasticsearch.plugin.mapper.MapperMurmur3Plugin'
}

compileJava.options.compilerArgs << "-Xlint:-rawtypes"

@@ -0,0 +1,24 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Mapper Size plugin allows documents to record their uncompressed size at index time.'
  classname 'org.elasticsearch.plugin.mapper.MapperSizePlugin'
}

@@ -0,0 +1,40 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Azure Repository plugin adds support for Azure storage repositories.'
  classname 'org.elasticsearch.plugin.repository.azure.AzureRepositoryPlugin'
}

dependencies {
  compile('com.microsoft.azure:azure-storage:2.0.0') {
    exclude group: 'org.slf4j', module: 'slf4j-api'
  }
}

dependencyLicenses {
  mapping from: /azure-.*/, to: 'azure'
  mapping from: /jackson-.*/, to: 'jackson'
  mapping from: /jersey-.*/, to: 'jersey'
  mapping from: /jaxb-.*/, to: 'jaxb'
  mapping from: /stax-.*/, to: 'stax'
}

compileJava.options.compilerArgs << '-Xlint:-deprecation,-serial'

@@ -0,0 +1,40 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The S3 repository plugin adds S3 repositories.'
  classname 'org.elasticsearch.plugin.repository.s3.S3RepositoryPlugin'
}

dependencies {
  compile 'com.amazonaws:aws-java-sdk-s3:1.10.19'
  compile "org.apache.httpcomponents:httpclient:${versions.httpclient}"
}

dependencyLicenses {
  mapping from: /aws-java-sdk-.*/, to: 'aws-java-sdk'
  mapping from: /jackson-.*/, to: 'jackson'
}

compileJava.options.compilerArgs << '-Xlint:-deprecation,-rawtypes'

test {
  // this is needed for insecure plugins, remove if possible!
  systemProperty 'tests.artifact', project.name
}

@@ -0,0 +1,27 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'Demonstrates how to serve resources via elasticsearch.'
  jvm false
  site true
}

// no unit tests
test.enabled = false

@@ -0,0 +1,24 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
esplugin {
  description 'The Store SMB plugin adds support for SMB stores.'
  classname 'org.elasticsearch.plugin.store.smb.SMBStorePlugin'
}

@@ -0,0 +1,22 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
apply plugin: 'elasticsearch.rest-test'
// TODO: this test works, but it isn't really a rest test...should we have another plugin for "non rest test that just needs N clusters?"

@@ -0,0 +1,47 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import org.elasticsearch.gradle.MavenFilteringHack

apply plugin: 'elasticsearch.rest-test'

ext.pluginCount = 0
for (Project subproj : project.rootProject.subprojects) {
  if (subproj.path.startsWith(':plugins:')) {
    integTest {
      def bundlePlugin = subproj.tasks.findByName('bundlePlugin')
      String camelName = subproj.name.replaceAll(/-(\w)/) { _, c -> c.toUpperCase() }
      dependsOn bundlePlugin
      cluster {
        plugin "install${camelName.capitalize()}", bundlePlugin.outputs.files
      }
    }
    pluginCount += 1
  }
}

ext.expansions = [
  'expected.plugin.count': pluginCount
]

processTestResources {
  inputs.properties(expansions)
  MavenFilteringHack.filter(it, expansions)
}
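The loop above derives an install task name from each plugin's directory name: `replaceAll(/-(\w)/)` camel-cases the hyphenated name, and `capitalize()` prepares it for the `install` prefix. A self-contained Java sketch of that conversion (class and method names are illustrative, not part of the build):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PluginTaskName {
    // Mirrors the Groovy replaceAll(/-(\w)/) { _, c -> c.toUpperCase() }:
    // each "-x" becomes "X", so "analysis-icu" -> "analysisIcu".
    static String camel(String name) {
        Matcher m = Pattern.compile("-(\\w)").matcher(name);
        return m.replaceAll(mr -> mr.group(1).toUpperCase());
    }

    // "install" + capitalized camel name, as in the cluster { plugin ... } line.
    static String installTaskName(String pluginName) {
        String camelName = camel(pluginName);
        return "install" + Character.toUpperCase(camelName.charAt(0)) + camelName.substring(1);
    }

    public static void main(String[] args) {
        System.out.println(installTaskName("analysis-icu"));
        System.out.println(installTaskName("delete-by-query"));
    }
}
```

So the `analysis-icu` subproject yields an `installAnalysisIcu` task wired to that plugin's `bundlePlugin` output.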

@@ -0,0 +1 @@
apply plugin: 'java'

settings.gradle Normal file
@@ -0,0 +1,42 @@
rootProject.name = 'elasticsearch'

String[] projects = [
  'rest-api-spec',
  'core',
  'distribution:zip',
  'distribution:tar',
  'distribution:deb',
  'distribution:rpm',
  'test-framework',
  'plugins:analysis-icu',
  'plugins:analysis-kuromoji',
  'plugins:analysis-phonetic',
  'plugins:analysis-smartcn',
  'plugins:analysis-stempel',
  'plugins:delete-by-query',
  'plugins:discovery-azure',
  'plugins:discovery-ec2',
  'plugins:discovery-gce',
  'plugins:discovery-multicast',
  'plugins:lang-expression',
  'plugins:lang-groovy',
  'plugins:lang-javascript',
  'plugins:lang-python',
  'plugins:mapper-murmur3',
  'plugins:mapper-size',
  'plugins:repository-azure',
  'plugins:repository-s3',
  'plugins:jvm-example',
  'plugins:site-example',
  'plugins:store-smb',
  'qa:smoke-test-client',
  'qa:smoke-test-plugins'
]

if (hasProperty('elasticsearch.projectsPrefix')) {
  String prefix = getProperty('elasticsearch.projectsPrefix')
  projects = projects.collect { "${prefix}:${it}" }
}
include projects
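When the `elasticsearch.projectsPrefix` property is set (e.g. with Gradle's `-P` flag), every project path is re-rooted under that prefix before being included. The mapping is plain string concatenation; a Java sketch (the class name and the `es` prefix value are illustrative):

```java
import java.util.List;
import java.util.stream.Collectors;

public class ProjectPaths {
    // Mirrors the Groovy: projects = projects.collect { "${prefix}:${it}" }
    static List<String> withPrefix(List<String> projects, String prefix) {
        return projects.stream()
            .map(p -> prefix + ":" + p)
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // "es" stands in for a hypothetical -Pelasticsearch.projectsPrefix value
        System.out.println(withPrefix(List.of("core", "plugins:analysis-icu"), "es"));
    }
}
```

Without the property, the paths are included unchanged, so `:core` stays `:core`; with a prefix it becomes, say, `:es:core`.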

@@ -0,0 +1,69 @@
apply plugin: 'java'
apply plugin: 'com.bmuschko.nexus'

dependencies {
  // TODO: change to elasticsearch core jar dep, and use dependency substitution to point at the core project
  compile "org.elasticsearch:elasticsearch:${version}"
  compile(group: 'junit', name: 'junit', version: '4.11') {
    exclude group: 'org.hamcrest', module: 'hamcrest-core'
  }
  compile "com.carrotsearch.randomizedtesting:randomizedtesting-runner:${versions.randomizedrunner}"
  compile("org.apache.lucene:lucene-test-framework:${versions.lucene}") {
    exclude group: 'com.carrotsearch.randomizedtesting', module: 'junit4-ant'
  }
  compile(group: 'org.hamcrest', name: 'hamcrest-all', version: '1.3') {
    exclude group: 'org.hamcrest', module: 'hamcrest-core'
  }
  compile "com.google.jimfs:jimfs:1.0"
  compile "org.apache.httpcomponents:httpclient:${versions.httpclient}"
}

// HACK: this is temporary until we have moved to gradle, at which
// point we can physically move the test framework files to this project
project.ext {
  srcDir = new File(project.buildDir, 'src')
  coreDir = new File(project("${projectsPrefix}:core").projectDir, 'src' + File.separator + 'test')
}
sourceSets.main.java.srcDir(new File(srcDir, "java"))
sourceSets.main.resources.srcDir(new File(srcDir, "resources"))

task copySourceFiles(type: Sync) {
  from(coreDir) {
    include 'resources/log4j.properties'
    include 'java/org/elasticsearch/test/**'
    include 'java/org/elasticsearch/bootstrap/BootstrapForTesting.java'
    include 'java/org/elasticsearch/bootstrap/MockPluginPolicy.java'
    include 'java/org/elasticsearch/common/cli/CliToolTestCase.java'
    include 'java/org/elasticsearch/cluster/MockInternalClusterInfoService.java'
    include 'java/org/elasticsearch/cluster/routing/TestShardRouting.java'
    include 'java/org/elasticsearch/index/MockEngineFactoryPlugin.java'
    include 'java/org/elasticsearch/search/MockSearchService.java'
    include 'java/org/elasticsearch/search/aggregations/bucket/AbstractTermsTestCase.java'
    include 'java/org/elasticsearch/search/aggregations/bucket/script/NativeSignificanceScoreScriptNoParams.java'
    include 'java/org/elasticsearch/search/aggregations/bucket/script/NativeSignificanceScoreScriptWithParams.java'
    include 'java/org/elasticsearch/search/aggregations/bucket/script/TestScript.java'
    include 'java/org/elasticsearch/search/aggregations/metrics/AbstractNumericTestCase.java'
    include 'java/org/elasticsearch/percolator/PercolatorTestUtil.java'
    include 'java/org/elasticsearch/cache/recycler/MockPageCacheRecycler.java'
    include 'java/org/elasticsearch/common/util/MockBigArrays.java'
    include 'java/org/elasticsearch/node/NodeMocksPlugin.java'
    include 'java/org/elasticsearch/node/MockNode.java'
    include 'java/org/elasticsearch/common/io/PathUtilsForTesting.java'
    // unit tests for yaml suite parser & rest spec parser need to be excluded
    exclude 'java/org/elasticsearch/test/rest/test/**'
    // unit tests for test framework classes
    exclude 'java/org/elasticsearch/test/test/**'
    // no geo (requires optional deps)
    exclude 'java/org/elasticsearch/test/hamcrest/ElasticsearchGeoAssertions.java'
    exclude 'java/org/elasticsearch/test/geo/RandomShapeGenerator.java'
    // this mock is just for a single logging test
    exclude 'java/org/elasticsearch/test/MockLogAppender.java'
  }
  into srcDir
}
compileJava.dependsOn copySourceFiles
compileJava.options.compilerArgs << "-Xlint:-cast,-deprecation,-fallthrough,-overrides,-rawtypes,-serial,-try,-unchecked"