This reworks how we configure the `shadow` plugin in the build. The major
change is that we no longer bundle dependencies in the `compile` configuration;
instead we bundle dependencies in the new `bundle` configuration. This feels
more right because it is a little more "opt in" rather than "opt out", and the
name of the `bundle` configuration is a little more obvious.
As a neat side effect of this, the `runtimeElements` configuration used when
one project depends on another now contains exactly the dependencies needed
to run the project so you no longer need to reference projects that use the
shadow plugin like this:
```
testCompile project(path: ':client:rest-high-level', configuration: 'shadow')
```
You can instead use the much more normal:
```
testCompile "org.elasticsearch.client:elasticsearch-rest-high-level-client:${version}"
```
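For a project that shades dependencies, the declaration now looks roughly like this (a sketch; the `:x-pack:protocol` path is only an illustrative example):
```
dependencies {
    // shaded into the jar via the new, opt-in configuration
    bundle project(':x-pack:protocol')
    // a regular dependency is no longer swept into the shadow jar just for being on `compile`
    compile "org.elasticsearch:elasticsearch:${version}"
}
```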
Elasticsearch versions earlier than 6.4.0 cannot properly run in a
FIPS 140 JVM. This commit ensures that we use a non-FIPS JVM for
nodes that we spin up in BWC tests even when we're testing FIPS.
* Scripted metric aggregations: add deprecation warning and system property to control legacy params
Scripted metric aggregation params._agg/_aggs are replaced by state/states context variables. By default the old params are still present, and a deprecation warning is emitted when Scripted Metric Aggregations are used. A new system property can be used to disable the legacy params. This functionality will be removed in a future revision.
* Fix minor style issue and docs test failure
* Disable deprecated params._agg/_aggs in tests and revise tests to use state/states instead
* Add integration test covering deprecated scripted metrics aggs params._agg/_aggs access
* Disable deprecated params._agg/_aggs in docs integration tests and revise stored scripts to use state/states instead
* Revert unnecessary migrations doc change
A relevant note should be added in the changes destined for 7.0; this PR is going to be backported to 6.x.
* Replace deprecated _agg param bwc integration test with a couple of unit tests
* Fix compatibility test after merge
* Rename backwards compatibility system property per code review feedback
* Tweak deprecation warning text per review feedback
Add tests for build-tools to make sure example plugins build stand-alone using it.
This will catch issues such as referencing files from the buildSrc directly, breaking external uses of build-tools.
* Implement Version in java
- This allows moving all .java files out of .groovy.
- Will prevent Eclipse from getting tangled up in this setup
- Makes it possible to use Version from Java
* PR review comments
* Cluster formation plugin with reference counting
```
> Task :plugins:ingest-user-agent:listElasticSearchClusters
Starting cluster: myTestCluster
* myTestCluster: /home/alpar/work/elastic/elasticsearch/plugins/ingest-user-agent/foo
Asked to unClaimAndStop myTestCluster, since cluster still has 1 claim it will not be stopped
> Task :plugins:ingest-user-agent:testme UP-TO-DATE
Stopping myTestCluster, since no of claims is 0
```
- Meant to automatically manage the clusters' lifecycle
- Add integration test for cluster formation
* Fix rebase
* Change to `useCluster` method on task
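A rough sketch of how a task is expected to claim a cluster with `useCluster` (the DSL container and property names here are illustrative, not the final API):
```
elasticSearchClusters {
    myTestCluster {
        distribution = 'ZIP'   // illustrative property
    }
}

task testme {
    useCluster elasticSearchClusters.myTestCluster   // adds a claim; the cluster stops once claims reach 0
    doLast {
        // talk to the running cluster here
    }
}
```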
* Add a task to run forbiddenapis using cli
Add a task that offers an equivalent check to the forbidden APIs plugin,
but runs it using the forbiddenAPIs CLI instead.
This isn't wired into precommit yet, and doesn't work for projects
that require specific signatures, etc. It's meant to show how this can
be used. The next step is to make a custom task type and configure it
based on the project extension from the plugin and make some minor
adjustments to some build scripts, as we can't be 100% compatible with
that, at least due to how additional signatures are passed.
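Roughly, the CLI-based task boils down to a `JavaExec` invocation along these lines (the configuration name, the class-output path, and the exact CLI flags are assumptions to double-check against the forbiddenapis CLI help):
```
task forbiddenApisCli(type: JavaExec) {
    dependsOn compileJava
    classpath = configurations.forbiddenApisCliJar   // assumed configuration holding the de.thetaphi:forbiddenapis jar
    main = 'de.thetaphi.forbiddenapis.cli.CliMain'
    args '-c', sourceSets.main.compileClasspath.asPath,
         '-d', "${buildDir}/classes/java/main",
         // no `--target`, so bundled signatures need the version suffix spelled out
         '-b', 'jdk-unsafe-1.8',
         '-b', 'jdk-deprecated-1.8'
}
```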
Notes:
- there's no `--target` for the CLI version so we have to pass in
specific bundled signature names
- the cli task already wires to `runtimeJavaHome`
- no equivalent for `failOnUnsupportedJava = false` but doesn't seem to
be a problem. Tested with Java 8 to 11
- there's no way to pass additional signatures as URL, these will have
to be on the file system, and can't be resources on the cp unless we
rely on how forbiddenapis is implemented and mimic them as bundled
signatures.
- the average of 3 runs is 4% slower using the CLI for :server.
( `./gradlew clean :server:forbiddenApis` vs `./gradlew clean
:server:forbiddenApisCli`)
- up-to-date checks don't work on the cli task yet, that will happen
with the custom task.
See also: #31715
The upcoming ML log structure finder functionality will use these
libraries, and it makes sense to use the same versions that are
being used elsewhere in Elasticsearch. This is especially true
with icu4j, which is pretty big.
Enhance reproduction line with info about JDKs
Provide the ability to control compiler and Java versions just by
passing a property. The actual Java home comes from the
`JAVA<major>_HOME` env vars that we already require.
This works better with the Gradle daemon as well.
Output is also changed a bit.
for `-Druntime.java=8 -Dcompiler.java=9`:
```
=======================================
Elasticsearch Build Hamster says Hello!
Gradle Version : 4.9
OS Info : Linux 4.17.8-1-ARCH (amd64)
Compiler JDK Version : 11 (Oracle Corporation 11-ea [OpenJDK 64-Bit Server VM 11-ea+22])
Runtime JDK Version : 11 (Oracle Corporation 11-ea [OpenJDK 64-Bit Server VM 11-ea+22])
Gradle JDK Version : 10 (Oracle Corporation 10.0.1 [OpenJDK 64-Bit Server VM 10.0.1+10])
Compiler java.home : /home/alpar/opt/jdk-11-ea22/
Runtime java.home : /home/alpar/opt/jdk-11-ea22/
Gradle java.home : /usr/lib/jvm/java-10-openjdk
Random Testing Seed : EA858533191E8DFB
=======================================
```
Without configuration:
```
=======================================
Elasticsearch Build Hamster says Hello!
=======================================
Gradle Version : 4.9
OS Info : Linux 4.17.8-1-ARCH (amd64)
JDK Version : 10 (Oracle Corporation 10.0.1 [OpenJDK 64-Bit Server VM 10.0.1+10])
JAVA_HOME : /usr/lib/jvm/java-10-openjdk
Random Testing Seed : 4BD5B2A839C8FCA1
=======================================
```
Here's what a reproduction line will look like (test made to fail):
```
./gradlew :modules:lang-painless:test -Dtests.seed=2DA2379065A4EEAB -Dtests.class=org.elasticsearch.painless.AdditionTests -Dtests.method="testInt" -Dtests.security.manager=true -Dtests.locale=es-PE -Dtests.timezone=WET -Dcompiler.java=10 -Druntime.java=10
```
This commit adds two pieces. The first is a small set of documentation providing
instructions on how to get setup to run context examples. This will require a download
similar to how Kibana works for some of the examples. The second is an ingest processor
example using the downloaded data. More examples will follow, ideally one per PR.
This also adds a set of tests to individually test each script as a unit test.
Currently, snippets in lists cannot be rendered correctly as a console command because the console command requires a line continuation '+'. This allows snippets to have a line continuation between the snippet and the // CONSOLE.
This commit adds the elastic repo, alongside maven central, to any
plugin using BuildPlugin. This is necessary now because the default
distribution, which the test framework uses by default, is now only
hosted on elastic maven. While inside the elasticsearch build this does
not matter, those that build external plugins with our build-tools could
have tests fail to find the distribution dependency.
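For external plugin builds the effect is that BuildPlugin now adds something like the following alongside maven central (the URL is the public Elastic artifacts repository):
```
repositories {
    mavenCentral()
    maven {
        url "https://artifacts.elastic.co/maven"
    }
}
```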
This commit adds a boolean system property, `es.scripting.use_java_time`,
which controls the concrete return type used by doc values within
scripts. The return type of accessing doc values for a date field is
changed to Object, essentially duck typing the type to allow
co-existence during the transition from joda time to java time.
The main highlight is the removal of the reclaim_deletes_weight in the TieredMergePolicy.
The es setting index.merge.policy.reclaim_deletes_weight is deprecated in this commit and the value is ignored. The new merge policy setting setDeletesPctAllowed should be added in a follow up.
When we added the `java-gradle-plugin` to `buildSrc` it added a second
task to generate the pom that duplicates the publishing work that we
configure in `BuildPlugin`. Not only does it duplicate the pom, it
creates a pom that is missing things like `name` and `description` which
are required for publishing to maven central.
This change disables the duplicate pom generation.
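The fix amounts to switching the generated task off; something along these lines (the exact task name produced by the `java-gradle-plugin`/`maven-publish` combination is an assumption):
```
// in buildSrc/build.gradle
tasks.matching { it.name == 'generatePomFileForPluginMavenPublication' }.all {
    enabled = false
}
```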
These are collected from a number of open PRs and are required to
improve existing tests and write more readable future tests.
I am extracting them to their own PR hoping to be able to merge and use
them sooner.
* Determine the minimum gradle version based on the wrapper
This is restrictive and forces users of the plugin to move together with
us, but without integration tests it's close to impossible to make sure
that the claimed compatibility is really there.
If we do want to offer more flexibility, we should add those tests
first. A rough sketch of the wrapper-based check follows this list.
* Track gradle version in individual file
* PR review
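Concretely, the wrapper-based check might look something like this (illustrative parsing, not the exact implementation):
```
import org.gradle.util.GradleVersion

Properties wrapperProps = new Properties()
file('gradle/wrapper/gradle-wrapper.properties').withInputStream { wrapperProps.load(it) }
// e.g. distributionUrl=https\://services.gradle.org/distributions/gradle-4.9-all.zip
String minimumGradleVersion = (wrapperProps.getProperty('distributionUrl') =~ /gradle-([0-9.]+)-/)[0][1]
if (GradleVersion.current() < GradleVersion.version(minimumGradleVersion)) {
    throw new GradleException("Gradle ${minimumGradleVersion} or later is required")
}
```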
This bundles the x-pack:protocol project into the x-pack:plugin:core
project because we'd like folks to consider it an implementation detail
of our build rather than a separate artifact to be managed and depended
on. It is now bundled into both x-pack:plugin:core and
client:rest-high-level. To make this work I had to fix a few things.
Firstly, I had to make PluginBuildPlugin work with the shadow plugin.
In that case we have to bundle only the `shadow` dependencies and the
shadow jar.
Secondly, every reference to x-pack:plugin:core has to use the `shadow`
configuration. Without that the reference is missing all of the
un-shadowed dependencies. I tried to make it so that applying the shadow
plugin automatically redefines the `default` configuration to mirror the
`shadow` configuration which would allow us to use bare project references
to the x-pack:plugin:core project but I couldn't make it work. It'd *look*
like it works but then fail for transitive dependencies anyway. I think
it is still a good thing to do but I don't have the willpower to do it
now.
Finally, I had to fix an issue where Eclipse and IntelliJ didn't properly
reference shadowed transitive dependencies. Neither IDE supports shadowing
natively so they have to reference the shadowed projects. We fix this by
detecting `shadow` dependencies when in "Intellij mode" or "Eclipse mode"
and adding `runtime` dependencies to the same target. This convinces
IntelliJ and Eclipse to play nice.
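In practice that means references to core look like the following (the `isEclipse`/`isIdea` names stand in for whatever flags the build uses to detect an IDE import):
```
dependencies {
    compile project(path: ':x-pack:plugin:core', configuration: 'shadow')
    if (isEclipse || isIdea) {
        // IDEs don't resolve shadowed transitive dependencies, so also wire the un-shadowed runtime
        compile project(path: ':x-pack:plugin:core', configuration: 'runtime')
    }
}
```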
* Complete changes for running IT in a fips JVM
- Mute :x-pack:qa:sql:security:ssl:integTest as it
cannot run in FIPS 140 JVM until the SQL CLI supports key/cert.
- Set default JVM keystore/truststore password in top level build
script for all integTest tasks in a FIPS 140 JVM
- Changed top level x-pack build script to use keys and certificates
for trust/key material when spinning up clusters for IT
Throw an exception for doc['field'].value
if this document is missing a value for the field.
After deprecation changes have been backported to 6.x,
make this a default behaviour in 7.0
Closes #29286
In 1.x and 2.x, plugins were published to maven and the plugin
installer downloaded them from there. This was later changed to install
from the download service, and in 5.0 plugin zips were no longer
published to maven. However, the build still currently produces an
unused pom file. This is troublesome in the special case when the main
jar of a plugin needs to be published (and thus needs a pom file of
the same name).
closes #31946
This commit moves additional unit test runners from being dependencies
of the test task to dependencies of check. Without this change,
reproduce lines are incorrect due to the additional test runner not
matching any of the reproduce class/method info.
closes #31964
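In build-script terms the change is only about which lifecycle task the extra runner hangs off of; with a hypothetical runner task name:
```
// before: the runner was a dependency of `test`, so it polluted reproduce lines
// test.dependsOn namingConventionsCheck
// after: run it as part of `check` instead
check.dependsOn namingConventionsCheck
```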
Moves the customizations to the build to produce nice shadow jars and
javadocs into common build code, mostly BuildPlugin with a little into
the root build.gradle file. This means that any project that applies the
shadow plugin will automatically be set up just like the high level rest
client:
* The non-shadow jar will not be built
* The shadow jar will not have a "classifier"
* Tests will run against the shadow jar
* Javadoc will include all of the shadowed classes
* Service files in `META-INF/services` will be merged
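Roughly, the common code now applies configuration of this shape to any project using the shadow plugin (a sketch of the effect, not the exact implementation):
```
shadowJar {
    classifier = null        // the shadow jar is the artifact, so it carries no "-all" classifier
    mergeServiceFiles()      // merge META-INF/services entries from the bundled dependencies
}
```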
We have been encountering name mismatches between API defined in our
REST spec and method names that have been added to the high-level REST
client. We should check this automatically to prevent further mismatches,
and correct all the current ones.
This commit adds a test for this and corrects the issues found by it.
With this commit we disable the real-memory circuit breaker in REST
tests as this breaker is based on real memory usage over which we have
no (full) control in tests and the REST client is not yet ready to retry
on circuit breaker exceptions.
This is only meant as a temporary measure to avoid spurious test
failures while we ensure that the REST client can handle those
situations appropriately.
Closes #32050
Relates #31767
Relates #31986
Relates #32074
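The knob being turned off is the cluster setting backing the new breaker; in a test cluster that looks something like this (using the usual integTestCluster closure):
```
integTestCluster {
    setting 'indices.breaker.total.use_real_memory', 'false'
}
```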
Implement buildSrc Version in java
- This allows moving all .java files out of .groovy.
- Will prevent Eclipse from getting tangled up in this setup
- Makes it possible to use Version from Java
Recreates the rest of the bats packaging tests for the tar distribution
in the java packaging test project, with support for both tar and zip
packaging, both oss and default flavors, and on Linux and Windows. Most
tests are followed fairly closely, some have either been dropped if
unnecessary or folded into others if convenient.
* Handle missing values in painless
Throw an exception for `doc['field'].value`
if this document is missing a value for the `field`.
For 7.0:
This is the default behaviour from 7.0
For 6.x:
To enable this behavior from 6.x, a user can set a jvm.option:
`-Des.script.exception_for_missing_value=true` on a node.
If a user does not enable this behavior, a deprecation warning is logged on start up.
Closes #29286
* Upgrade bouncycastle
Required to fix
`bcprov-jdk15on-1.55.jar; invalid manifest format `
on jdk 11
* Downgrade bouncycastle to avoid invalid manifest
* Add checksum for new jars
* Update tika permissions for jdk 11
* Mute test failing on jdk 11
* Add JDK11 to CI
* Thread#stop(Throwable) was removed
http://mail.openjdk.java.net/pipermail/core-libs-dev/2018-June/053536.html
* Disable failing tests #31456
* Temporarily disable doc tests
To see if there are other failures on JDK11
* Only blacklist specific doc tests
* Disable only failing tests in ingest attachment plugin
* Mute failing HDFS tests #31498
* Mute failing lang-painless tests #31500
* Fix backwards compatibility builds
Fix JAVA version to 10 for ES 6.3
* Add 6.x to bwc -> java10
* Prefix out and err from buildBwcVersion for readability
```
> Task :distribution:bwc:next-bugfix-snapshot:buildBwcVersion
[bwc] :buildSrc:compileJava
[bwc] WARNING: An illegal reflective access operation has occurred
[bwc] WARNING: Illegal reflective access by org.codehaus.groovy.reflection.CachedClass (file:/home/alpar/.gradle/wrapper/dists/gradle-4.5-all/cg9lyzfg3iwv6fa00os9gcgj4/gradle-4.5/lib/groovy-all-2.4.12.jar) to method java.lang.Object.finalize()
[bwc] WARNING: Please consider reporting this to the maintainers of org.codehaus.groovy.reflection.CachedClass
[bwc] WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
[bwc] WARNING: All illegal access operations will be denied in a future release
[bwc] :buildSrc:compileGroovy
[bwc] :buildSrc:writeVersionProperties
[bwc] :buildSrc:processResources
[bwc] :buildSrc:classes
[bwc] :buildSrc:jar
```
* Also set RUNTIME_JAVA_HOME for bwcBuild
So that we can make sure it's not too new for the build to understand.
* Align bouncycastle dependency
* Fix painless array tests
closes #31500
* Update jar checksums
* Keep 8/10 runtime/compile until consensus builds on 11
* Only skip failing tests if running on Java 11
* Failures are dependent on the compile java version, not the runtime
* Condition doc test exceptions on compiler java version as well
* Disable hdfs tests based on runtime java
* Set runtime java to minimum supported for bwc
* PR review
* Add comment with ticket for forbidden apis
* remove explicit wrapper task
It's created by Gradle and triggers a deprecation warning
Simplify configuration
* Upgrade shadow plugin to get rid of Gradle deprecation
* Move compile configuration to base plugin
Solves Gradle deprecation warning from earlier Gradle versions
* Enable stable publishing in the Gradle build
* Replace usage of deprecated property
* bump Gradle version in build compare
* Remove deprecation warnings to prepare for Gradle 5
Gradle replaced `project.sourceSets.main.output.classesDir` of type
`File` with `project.sourceSets.main.output.classesDirs` of type
`FileCollection`
(see [SourceSetOutput](https://github.com/gradle/gradle/blob/master/subprojects/plugins/src/main/java/org/gradle/api/tasks/SourceSetOutput.java))
Build output is now stored in a per-language folder.
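The API change itself is small; the work is in the places that assumed a single directory:
```
// before (old Gradle API, now deprecated): a single File
// def classesDir = sourceSets.main.output.classesDir
// after: a FileCollection with one directory per language
def classesDirs = sourceSets.main.output.classesDirs
classesDirs.each { dir -> println "classes live in ${dir}" }
```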
There are a few places where we use that; here they are and how each was
fixed:
- Randomized Test execution
- look in all test folders ( pass the multi dir configuration to the
ant runner )
- DRY the task configuration by introducing `basedOn` for
`RandomizedTestingTask` DSL
- Extend the naming convention test to support passing in multiple
directories
- Fix the standalone test plugin, the dirs were not passed through;
checked with a debugger and the statement had no effect due to a
missing `=`.
Closes #30354
* Only check Java tests, PR feedback
- Name checker was run for Groovy tests that don't adhere to the same
conventions, causing the check to fail
- implement PR feedback
* Replace `add` with `addAll`
This worked because the list is passed to `project.files` that does the
right thing.
* Revert "Only check Java tests, PR feedback"
This reverts commit 9bd9389875d8b88aadb50df57a45cd0d2b073241.
* Remove `basedOn` helper
* Bring some changes back
Previous revert accidentally reverted too much
* Fix negation
* add back public
* revert name check changes
* Revert "revert name check changes"
This reverts commit a2800c0b363168339ea65e2a79ec8256e5883e6d.
* Pass all dirs to name check
Only run on Java for build-tools, this is safe because it's a self test.
It needs more work before we could pass in the Groovy classes as well as
these inherit from `GroovyTestCase`
* remove self tests from name check
The self tests complicate the task setup and disable real checks on
build-tools.
With this change there are no more self tests, and the build-tools tests
adhere to the conventions.
The self test will be replaced by gradle test kit, thus the addition of
the Gradle plugin builder plugin.
* First test to run a Gradle build
* Add tests that replace the name check self test
* Clean up integ test base class
* Always run tests
* Align with test naming conventions
* Make integ. test case inherit from unit test case
The check requires this
* Remove `import static org.junit.Assert.*`
* Move to Gradle 4.8 RC1
* Use latest version of plugin
The current does not work with Gradle 4.8 RC1
* Switch to Gradle GA
* Add and configure build compare plugin
* add work-around for https://github.com/gradle/gradle/issues/5692
* work around https://github.com/gradle/gradle/issues/5696
* Make use of Gradle build compare with reference project
* Make the manifest more compare friendly
* Clear the manifest in compare friendly mode
* Remove animalsniffer from buildscript classpath
* Fix javadoc errors
* Fix doc issues
* reference Gradle issues in comments
* Conditionally configure build compare
* Fix some more doclint issues
* fix typo in build script
* Add sanity check to make sure the test task was replaced
Relates to #31324. It seems like Gradle has inconsistent behavior and
the task is not always replaced.
* Include number of non conforming tasks in the exception.
* No longer replace test task, create implicit instead
Closes #31324. The issue has full context in comments.
With this change the `test` task becomes nothing more than an alias for `utest`.
Some of the stand alone tests that had a `test` task now have `integTest`, and a
few of them that used to have `integTest` to run multiple tests now only
have `check`.
This will also help separate unit/micro tests from integration tests.
* Revert "No longer replace test task, create implicit instead"
This reverts commit f1ebaf7d93e4a0a19e751109bf620477dc35023c.
* Fix replacement of the test task
Based on information from gradle/gradle#5730, replace the task taking
into account the task providers.
Closes #31324.
* Only apply build compare plugin if needed
* Make sure test runs before integTest
* Fix doclint after merge
* PR review comments
* Switch to Gradle 4.8.1 and remove workaround
* PR review comments
* Consolidate task ordering
Skips tests that require xpack if we run the doc build without xpack. So
this should work:
```
./gradlew -p docs check -Dtests.distribution=oss-zip
```
This is implemented by detecting parts of the doc that look like:
```
[testenv="basic"]
```
Relates to #30665
* remove left-over comment
* make use of the property for plugins
* skip installing modules if these exist in the distribution
* Log the distribution being run
* Don't allow running with integ-tests-zip passed externally
* top level x-pack/qa can't run with oss distro
* Add support for matching objects in lists
Makes it possible to have a key that points to a list and assert that a
certain object is present in the list. All keys have to be present and
values have to match. The objects in the source list may have additional
fields.
example:
```
match: { 'nodes.$master.plugins': { name: ingest-attachment } }
```
* Update plugin and module tests to work with other distributions
Some of the tests expected that the integration tests will always be run
with the `integ-test-zip` distribution so that there will be no other
plugins loaded.
With this change, we check for the presence of the plugin without
assuming exclusivity.
* Allow modules to run on other distros as well
To match the behavior of tests.distribution
* Add and use a new `contains` assertion
Replaces the previous changes that caused `match` to do a partial match.
* Implement PR review comments
This switches the docs tests from the `oss-zip` distribution to the
`zip` distribution so they have xpack installed and configured with the
default basic license. The goal is to be able to merge the
`x-pack/docs` directory into the `docs` directory, marking the x-pack
docs with some kind of marker. This is the first step in that process.
This also enables `-Dtests.distribution` support for the `docs`
directory so you can run the tests against the `oss-zip` distribution
with something like
```
./gradlew -p docs check -Dtests.distribution=oss-zip
```
We can set up Jenkins to run both.
Relates to #30665
The goal of this commit is to address unknown licenses when producing
the dependencies info report. We have two different checks that we run
on licenses. The first check is whether or not we have stashed a copy of
the license text for a dependency in the repository. The second is to
map every dependency to a license type (e.g., BSD 3-clause). The problem
here is that the way we were handling licenses in the second check
differs from how we handle licenses in the first check. The first check
works by finding a license file with the name of the artifact followed
by the text -LICENSE.txt. Yet in some cases we allow mapping an artifact
name to another name used to check for the license (e.g., we map
lucene-.* to lucene, and opensaml-.* to shibboleth). The second check
understood the first way of looking for a license file but not the
second way. So in this commit we teach the second check about the
mappings from artifact names to license names. We do this by copying the
configuration from the dependencyLicenses task to the dependenciesInfo
task and then reusing the code from the first check in the second
check. There were some other challenges here though. For example,
dependenciesInfo was checking too many dependencies. For now, we should
only be checking direct dependencies and leaving transitive dependencies
from another org.elasticsearch artifact to that artifact (we want to do
this differently in a follow-up). We also want to disable
dependenciesInfo for projects that we do not publish, users only care
about licenses they might be exposed to if they use our assembled
products. With all of the changes in this commit we have eliminated all
unknown licenses. A follow-up will enforce that when we add a new
dependency it does not get mapped to unknown, these will be forbidden in
the future. Therefore, with this change and earlier changes we are left
with no unknown licenses and two custom licenses; custom here means it
does not map to an SPDX license type. Those two licenses are xz and
ldapsdk. A future change will not allow additional custom licenses
unless they are explicitly whitelisted. This ensures that if a new
dependency is added it is mapped to an SPDX license or mapped to custom
because it does not have an SPDX license.
Most of our license file names strip the version off the artifact name
when deducing the license filename. However, the version on the GCS SDK
(google-api-services-storage) does not match the usual format and
instead starts with a vee. This means that the license filename for this
license ended up carrying the version and we should not do that. This
commit adjusts the regex that deduces the license filename to account for
this case, and adjusts the google-api-services-storage license files
accordingly.
This commit enhances the license detection that we have for various
licenses. Here we improve the detection for all licenses (especially the
Apache 2.0 License), the BSD 2-clause license, the MIT (with
attribution) license, and we add detection for the BSD 3-clause
license. One way that we achieved this improvement is by changing how
the license files are read so that rather than reading them as a
multi-line string which ended up represented as "[line1, line2, line3,
...]" internally, we read the full bytes of the license text and replace
all whitespace with a single space so the license text is now loaded as
"line1 line2 line3". For the MIT license we add the actual license text
and remove the "MIT" string as not all copies of the license clearly
indicate that the text is the MIT license. We take a similar strategy
for the BSD-2 and BSD-3 clause licenses. With this change, we reduce the
number of "custom" licenses in the codebase from 31 to 2. The two
remaining appear to be truly custom licenses, not carrying licenses
identifiable by SPDX. A follow-up will address "unknown" licenses.
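The normalization described above is essentially the following (a simplified sketch; `licenseFile` stands in for the stashed license copy being checked):
```
// read the whole license file and collapse every whitespace run to a single space
String licenseText = licenseFile.getText('UTF-8').replaceAll(/\s+/, ' ').trim()
// detection then matches against known license texts normalized the same way
boolean isApache2 = licenseText.contains('Apache License Version 2.0')
```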
Use all running nodes as unicast seeds in the rolling restart tests to
avoid a race between pinging and the tests. Without this if the tests
are too fast then when a new node comes up and pings its single
configured seed node that node *might* not have a ping from the other
running node.
This commit adds a new writeBlobAtomic() method to the BlobContainer
interface that can be implemented by repository implementations which
support atomic write operations.
When the BlobContainer implementation does not provide a specific
implementation of writeBlobAtomic(), then the writeBlob() method is used.
Related to #30680
This snapshot includes:
- LUCENE-8341: Record soft deletes in SegmentCommitInfo which will resolve #30851
- LUCENE-8335: Enforce soft-deletes field up-front
Today when executing REST tests we take full responsibility for cluster
configuration. Yet, there are use cases for bringing your own cluster to
the REST tests. This commit is a small first step towards that effort by
skipping creating the cluster if the tests.rest.cluster and tests.cluster
system properties are set. In this case, the user takes full
responsibility for configuring the cluster as expected by the REST
tests. This step is by no means meant to be perfect or complete, only a
baby step.
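In the build this boils down to a guard like the following around cluster formation (sketch):
```
boolean externalCluster = System.getProperty('tests.rest.cluster') != null &&
                          System.getProperty('tests.cluster') != null
if (externalCluster == false) {
    // create and manage the test cluster as before
} else {
    // the caller brought their own cluster and takes full responsibility for configuring it
}
```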
This commit ensures the upgrade_is_oss indicator for
the packaging tests is always deleted before each run. It works by
moving the check on version which skips the task into the doFirst block,
replacing the onlyIf.
closes #30682
Ports the first couple tests for archive distributions from the old bats
project to the new java project that includes windows platforms,
consolidating them into one test method that tests that the
distributions can be extracted and their contents verified. Includes the
zip distributions which were not tested in the bats project.
The new snapshot includes LUCENE-8324 which fixes a missing checkpoint
after a fully deleted segment is dropped on flush. This snapshot should
resolve the failed tests in the CorruptedFileIT suite.
Closes #30741 Closes #30577
* disable annotation processor for docs
Could not find evidence that the log4j annotation processor is used.
The compiler flag enables the Gradle 5.0 behavior (sketched after this list).
Closes #30476
* Disable annotation processors for all tests
* remove redundant `-proc:none` already handled by required plugins
* Revert unintentional changes
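The flag in question gets applied to the compile tasks, roughly:
```
// explicitly opt out of annotation processing, matching the Gradle 5.0 default behavior
tasks.withType(JavaCompile) {
    options.compilerArgs << '-proc:none'
}
```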
Meta plugins existed only for a short time, in order to enable breaking
up x-pack into multiple plugins. However, now that x-pack is no longer
installed as a plugin, the need for them has disappeared. This commit
removes the meta plugins infrastructure.
Adds windows server 2012r2 and 2016 vagrant boxes to packaging tests.
They can only be used if IDs for their images are specified, which are
passed to gradle and then to vagrant via env variables. Adds options
to the project property `vagrant.boxes` to choose between linux and
windows boxes.
Bats tests are run only on linux boxes, and portable packaging tests run
on all boxes. Platform tests are only run on linux boxes since they are
not being maintained.
For #26741
This commit changes the default out-of-the-box configuration for the
number of shards from five to one. We think this will help address a
common problem of oversharding. For users with time-based indices that
need a different default, this can be managed with index templates. For
users with non-time-based indices that find they need to re-shard with
the split API in place they no longer need to resort only to
reindexing.
Since this has the impact of changing the default number of shards used
in REST tests, we want to ensure that we still have coverage for issues
that could arise from multiple shards. As such, we randomize (rarely)
the default number of shards in REST tests to two. This is managed via a
global index template. However, some tests check the templates that are
in the cluster state during the test. Since this template is randomly
there, we need a way for tests to skip adding the template used to set
the number of shards to two. For this we add the default_shards feature
skip. To avoid having to write our docs in a complicated way because
sometimes they might be behind one shard, and sometimes they might be
behind two shards we apply the default_shards feature skip to all docs
tests. That is, these tests will always run with the default number of
shards (one).
This commit adds the ability to specify a plugin from maven for a
test cluster to use. Currently, only local projects may be used as
plugins, except when testing bwc, where the coordinates of the project
are used. However, that assumes all projects always keep the same
coordinates, or are even still plugins, which is no longer the case for
x-pack. The full cluster and rolling restart tests are changed to use
this new method when pulling x-pack versions before 6.3.0.
We have a pile of documentation describing how to rebuild the built in
language analyzers and, previously, our documentation testing framework
made sure that the examples successfully built *an* analyzer but they
didn't assert that the analyzer built by the documentation matches the
built in analyzer. Unsurprisingly, some of the examples aren't quite
right.
This adds a mechanism that tests the analyzers built by the docs.
The mechanism is fairly simple and brutal but it seems to be working:
build a hundred random unicode sequences and send them through the
`_analyze` API with the rebuilt analyzer and then again through the
built in analyzer. Then make sure both APIs return the same results.
Each of these calls to `_analyze` takes about 20ms on my laptop which
seems fine.
This is a follow-up to our previous change to only fork javac if needed
to respect the Java compiler home (via JAVA_HOME). This commit makes the
same change for groovyc: we only fork groovyc if the JDK for Gradle is
not the JDK specified for the compiler (via JAVA_HOME).
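The shape of the check is roughly this (a sketch; how the compiler JDK is resolved in the real build may differ):
```
File compilerJavaHome = new File(System.getenv('JAVA_HOME'))
File gradleJavaHome = new File(System.getProperty('java.home'))
tasks.withType(GroovyCompile) {
    // only fork a separate groovyc process when Gradle's own JDK is not the configured compiler JDK
    groovyOptions.fork = (compilerJavaHome.canonicalFile != gradleJavaHome.canonicalFile)
}
```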
We started forking javac to avoid GC overhead when running builds. Yet,
we do not seem to have this problem anymore and not forking leads to a
substantial speed improvement. This commit stops forking javac.
Upgrade to lucene-7.4.0-snapshot-1ed95c097b
This version contains:
* An Analyzer for Korean
* An IntervalQuery and IntervalsSource that retrieve minimum intervals of positional queries.
* A new API to retrieve matches (offsets and positions) of a query for a single document.
* Support for soft deletes in the index writer.
* A fixed shingle filter that handles index time synonyms.
* Support for emoji sequence in ICUTokenizer (with an upgrade to icu 61.1)
Uses a filter on the copy task for the eclipse settings files to
replace the token @@LICENSE_HEADER_TEXT@@ with the correct license
header from the new buildSrc/src/main/resources/license-headers
directory
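The filtering is a plain per-line replacement on the copy task, along these lines (the task name, source path, and `licenseHeaderText` variable are illustrative; the latter would be loaded from the new directory):
```
task copyEclipseSettings(type: Copy) {
    from 'buildSrc/src/main/resources/eclipse.settings'   // illustrative source location
    into '.settings'
    filter { String line ->
        line.replace('@@LICENSE_HEADER_TEXT@@', licenseHeaderText)
    }
}
```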
xpack core contains a fork of `Cron` from quartz whose javadoc has a
`<table>` with non-html5 compatible stuff. This html5ifies the table and
switches the `:x-pack:plugin:core` project to building javadoc with
HTML5.
[test] add java packaging test project
Adds a project for building and running packaging tests written in java
for portability. The vagrant tasks use jars on the packagingTest
configuration, which are built in the same project. No tests are added
yet.
Corresponding changes are not made to :x-pack:qa:vagrant because the
java packaging tests will all be consolidated into one project.
For #26741
`javadoc` will switch from defaulting to html4 to html5 in "a future
release". We should get ahead of it so we're not surprised. Also, HTML5
is the future! Er, the present. Anyway, this follows up from #30220 to
make the Javadoc for two of the four remaining projects HTML5
compatible.
This *mostly* silences `javadoc`'s warning about defaulting to
generating html4 files by enabling generating html5 file for the
projects for which that works. It didn't work in a half dozen projects,
about half of which I've fixed in this PR, entirely by replacing
`<tt>thing</tt>` with `{@code thing}`.
There are a few remaining projects that contain javadoc with invalid
html5. I'll fix those projects in a followup.
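Enabling the html5 output is a one-liner per project, using the standard javadoc option:
```
tasks.withType(Javadoc) {
    // emit HTML5 instead of the html4 default, which also silences the javadoc warning
    options.addBooleanOption('html5', true)
}
```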
Add the oss tar distribution to the packaging test plugin. Test the oss
tar distribution in the core packaging tests, and the non-oss tar
distribution in the x-pack packaging tests.
Today we update index settings directly via IndexService instead of the
cluster state in IndexServiceTests. However, those changes will be lost
if there is a cluster state update. In general, we should update index
settings via the client and limit direct usage to only special tests.
This commit replaces direct usages with the updateSettings api of the client.
Closes #24491
Today when forking setup commands we do not set JAVA_HOME. This means
that we might not use a version of Java compatible with the version of
Java the command is expecting to run on (for example, 5.6 nodes would
expect JDK 8, and this is true even for their setup commands). This
commit sets JAVA_HOME when configuring setup command tasks.
This commit moves the apache and elastic license files into a new
root level `licenses` directory and rewrites the top level LICENSE.txt
to clarify the repository has a mix of apache and elastic licensed code.
With the switch to X-Pack as a module, we lost production of POMs for
the JARs that we publish, and did not have a license/notice file in the
zip archives nor the exploded module. This commit ensures that we
generate these POMs, and license/notice files.
This commit makes x-pack a module and adds it to the default
distribution. It also creates distributions for zip, tar, deb and rpm
which contain only oss code.
This commit moves the checks that JAVAX_HOME (where X is the java version
number) exists to the end of gradle's configuration phase, and bases them
on whether the tasks needing the java home are configured to execute.
relates #29519
This commit is a minor cleanup of a code block in NodeInfo.groovy. We
remove an unused variable, make the formatting of the code consistent,
and cast a property that is typed as an Object to a String to avoid an
annoying IDE warning.
Some build tasks require older JDKs. For example, the BWC build tasks
for older versions of Elasticsearch require older JDKs. It is onerous to
require these be configured when merely compiling Elasticsearch, the
requirement that they be strictly set to appropriate values should only
be enforced if these tasks are going to be executed. To address this, we
lazy configure these tasks.
Today we have a nodeVersion property on the NodeInfo class that we use
to carry around information about a standalone node that we will start
during tests. This property is a String which we usually end up parsing
to a Version anyway to do various checks on it. This can end up
happening a lot during configuration so it would be more efficient and
safer to have this already be strongly-typed as a Version and parsed
from a String only once for each instance of NodeInfo. Therefore, this
commit makes NodeInfo#nodeVersion strongly-typed as a Version.
There are some scenarios where the license on a source file is one that
is compatible with our projects yet we do not want to add the license to
the list of approved license headers (to keep the number of files with
that compatible license contained). This commit adds the ability to
exclude a file from the license check.
Today we have JAVA_HOME for the compiler Java home and RUNTIME_JAVA_HOME
for the test Java home. However, when we compile BWC nodes and run them,
neither of these Java homes might be the version that was suitable for
that BWC node (e.g., 5.6 requires JDK 8 to compile and to run). This
commit adds support for the environment variables JAVA\d+_HOME and uses
the appropriate Java home based on the version of the node being
started. We even do this for reindex-from-old which requires JDK 7 for
these very old nodes. Note that these environment variables are not
required if not running BWC tests, and they are strictly required if
running BWC tests.
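Resolving the right Java home for a node then becomes a lookup keyed on the major version it needs, roughly:
```
// sketch: pick the Java home for a BWC node from the JAVA<major>_HOME environment variables
String javaHomeFor(int javaMajorVersion) {
    String home = System.getenv("JAVA${javaMajorVersion}_HOME")
    if (home == null) {
        throw new GradleException("JAVA${javaMajorVersion}_HOME must be set to run BWC tests against nodes that need Java ${javaMajorVersion}")
    }
    return home
}
```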
This commit introduces built in support for adding files to the
keystore when configuring the integration test cluster for a project.
In order to use this support, simply add `keystoreFile` followed by the
secure setting name and the path to the source file inside the
integTestCluster closure for a project. The built in support will
handle the creation of the keystore and the addition of the file to the
keystore.
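Usage per the description above; the secure setting name and source path here are just placeholders:
```
integTestCluster {
    keystoreFile 'xpack.security.transport.ssl.secure_key_passphrase', 'src/test/resources/passphrase.txt'
}
```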
This change moves the -ea and -esa options that enable assertions for
test nodes before the cluster-specific JVM arguments on the Java command
line. This opens up the possibility for the cluster-specific JVM
arguments to disable assertions for one particular package or class,
which can be useful in BWC testing where incorrect assertions cannot be
removed from released versions of the product.
Today we have a silent batch mode in the install plugin command when
standard input is closed or there is no tty. It appears that
historically this was useful when running tests where we want to accept
plugin permissions without having to acknowledge them. Now that we have
an explicit batch mode flag, this use-case is removed. The motivation
for removing this now is that there is another place where silent batch
mode arises and that is when a user attempts to install a plugin inside
a Docker container without keeping standard input open and attaching a
tty. In this case, the install plugin command will treat the situation
as a silent batch mode and therefore the user will never have the chance
to acknowledge the additional permissions required by a plugin. This
commit removes this silent batch mode in favor of using the --batch flag
when running tests and requiring the user to take explicit action to
acknowledge the additional permissions (either by leaving standard input
open and attaching a tty, or by passing the --batch flags themselves).
Note that with this change the user will now see a null pointer
exception when they try to install a plugin in a Docker container
without keeping standard input open and attaching a tty. This will be
addressed in an immediate follow-up, but because the implications of
that change are larger, they should be handled separately from this one.
Correctly set up classpath/dependencies and fix the checkstyle task that was partly broken because of the delayed setup of Java 9 sourcesets. This also cleans up packaging of META-INF. It also prepares the forbiddenapis 2.6 upgrade
relates #29292
* Begin moving XContent to a separate lib/artifact
This commit moves a large portion of the XContent code from the `server` project
to the `libs/xcontent` project. For the pieces that have been moved, some
helpers have been duplicated to allow them to be decoupled from ES helper
classes. In addition, `Booleans` and `CheckedFunction` have been moved to the
`elasticsearch-core` project.
This decoupling is a move so that we can eventually make things like the
high-level REST client not rely on the entire ES jar, only the parts it needs.
There are some pieces that are still not decoupled, in particular some of the
XContent tests still remain in the server project, this is because they test a
large portion of the pluggable xcontent pieces through
`XContentElasticsearchException`. They may be decoupled in future work.
Additionally, there may be more pieces that we want to move to the xcontent lib
in the future that are not part of this PR, this is a starting point.
Relates to #28504
The sysprop repos.mavenLocal may be used to add the local .m2 maven
repository for testing snapshots of locally built dependencies.
Unfortunately this has to be checked in two different places (they cannot
be shared, due to buildSrc being built essentially as a separate
project), and the casing of the string sysprop lookups did not align.
This commit fixes BuildPlugin's checking of repos.mavenLocal to use the
correct casing (camelCase, to match the gradle dsl element).
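The effect of the property, in both places that check it, is roughly:
```
if (Boolean.parseBoolean(System.getProperty('repos.mavenLocal'))) {
    repositories {
        // expose the local ~/.m2 repository for testing locally built snapshot dependencies
        mavenLocal()
    }
}
```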
When a module or plugin registers that it has a client JAR, we copy
artifacts like the Javadoc and sources JARs as the JARs for the client
as well (with -client added to the name). I previously had to disable
the Javadoc task on JDK 10 due to a bug in bin/javadoc. After JDK 10
went GA without a fix for this bug, I added workaround to fix the
Javadoc task on JDK 10. However, I made a mistake reverting the
previously skipped Javadocs tasks and missed the one that copies the
Javadoc JAR for client JARs. This commit fixes that issue.
The vagrant test plugin adds tasks for the groovy packaging tests,
which run after the bats packaging test tasks. Rename the 'bats'
configuration to 'packaging' and remove the option to inherit
archives from this configuration.
This commit reenables the Javadoc tasks on JDK 10. To reenable these
tasks, we have to workaround a bug in JDK 10 which trips on some deeply
nested anonymous classes that we have in the codebase (and are fine
as-is, this is not a problem with this code). The workaround is to
remove the compiled classes from the classpath. This has been reported
upstream and the workaround was suggested there (see the code comment).
We need to configure the Java 9 checkstyle task to depend on the
checkstyle configuration task or the task could run before the
checkstyle conf has been copied leading to runtime failures. We have to
do this after projects have been evaluated because the configuration of
these tasks can occur before the Java 9 source set has been added to a
project.
This commit fixes the directory name bundled plugins are added under
within a meta plugin to be the configured name of the bundled plugin,
instead of the project name.
This commit creates the copyRestSpec task for rest integ tests
immediately on creation of the RestIntegTestTask instead of lazily in
afterEvaluate. This allows other projects to add additional rest specs
to be copied, instead of needing to create another parallel copy task.
Adds support for triple quoted strings to the documentation test
generator. Kibana's CONSOLE tool has supported them for a year but we
were unable to use them in Elasticsearch's docs because the process that
converts example snippets into tests couldn't handle this. This change
adds code to convert them into standard JSON so we can pass them to
Elasticsearch.
This commit enhances the error messages reported when JAVA_HOME and
RUNTIME_JAVA_HOME are not correctly set to point towards the minimum
compiler and minimum runtime JDKs that are expected by the builds. The
previous error message would say:
Java 1.9 or above is required to build Elasticsearch
which is confusing if the user does have a JDK 9 installation and is
even the version that they have on their path yet they have JAVA_HOME
pointing to another JDK installation. The error message reported after
this change is:
the environment variable JAVA_HOME must be set to a JDK installation directory for Java 1.9 but is [/usr/java/jdk-8] corresponding to [1.8]
As we have factored Elasticsearch into smaller libraries, we have ended
up in a situation that some of the dependencies of Elasticsearch are not
available to code that depends on these smaller libraries but not server
Elasticsearch. This is a good thing, this was one of the goals of
separating Elasticsearch into smaller libraries, to shed some of the
dependencies from other components of the system. However, this now
means that simple utility methods from Lucene that we rely on are no
longer available everywhere. This commit copies IOUtils (with some small
formatting changes for our codebase) into the fold so that other
components of the system can rely on these methods where they no longer
depend on Lucene.
This commit removes the ability to specify that a plugin requires the
keystore and instead creates the keystore on package installation or
when Elasticsearch is started for the first time. The reason that we opt
to create the keystore on package installation is to ensure that the
keystore has the correct permissions (the package installation scripts
run as root as opposed to Elasticsearch running as the elasticsearch
user) and to enable removing the keystore on package removal if the
keystore is not modified.
Log4j2 provides a wide range of logging methods. Our code typically only uses a subset of them. In particular, uses of the methods trace|debug|info|warn|error|fatal(Object) or trace|debug|info|warn|error|fatal(Object, Throwable) have all been wrong, leading to not properly logging the provided message. To prevent these issues in the future, the corresponding Logger methods have been blacklisted.
This allows us to remove another dependency in the decoupling of the XContent
code. Rather than move this class over or decouple it, it can simply be removed.
Relates tangentially to #28504
Running any randomized testing task within Elasticsearch currently fails
if a project has zero tests. This was supposed to be overrideable, but
it was always set to 'fail', and the system property to override was
passed down to the test runner, but never read there. This commit
changes the value of the ifNoTests setting to randomized runner to be
read from system properties and continue to default to 'fail'.
This commit fixes the test progress logging to not produce an NPE when
there are no tests run. The onQuit method is always called, but onStart
would not be called if no tests match the test patterns.
It is only a comment, but can confuse those reading the code
Used 6.0 as an arbitrary elasticsearch.version value since it is the version that required Java 8
This commit specifies that the working directory of the destroy task for
destroying test VMs is the root of the build. This is necessary in case
the build was run from a sub-directory; the Vagrant command would then
not be able to locate the Vagrantfile for the VMs in question.
Today we do not destroy Vagrant boxes before tests. This is because
constantly reprovisioning these boxes is time-consuming. Yet, not
destroying these boxes can lead to state being left around that impacts
subsequent test runs. To address this, we now always destroy these boxes
before tests and provide a flag to set if this is not desired while
iterating locally.
Applying the rest test gradle plugin already uses the zip distribution
by default, so specifying it explicitly is not necessary. These are
leftovers from before zip was the default for rest tests.