Mirror of https://github.com/honeymoose/OpenSearch.git (synced 2025-02-25 22:36:20 +00:00)
Merge branch 'master' into feature/seq_no
* master: (904 commits)
  Removes unused methods in the o/e/common/Strings class.
  Add note regarding thread stack size on Windows
  painless: restore accidentally removed test
  Documented fuzzy_transpositions in match query
  Add not-null precondition check in BulkRequest
  Build: Make run task you full zip distribution
  Build: More pom generation improvements
  Add test for wrong array index
  Take return type from "after" field.
  painless: build descriptor of array and field load/store in code; fix array index to adapt type not DEF
  Build: Add developer info to generated pom files
  painless: improve exception stacktraces
  painless: Rename the dynamic call site factory to DefBootstrap and make the inner class very short (PIC = Polymorphic Inline Cache)
  Remove dead code.
  Avoid race while retiring executors
  Allow only a single extension for a scripting engine
  Adding REST tests to ensure key_as_string behavior stays consistent
  [test] Set logging to 11 on reindex test
  [TEST] increase logger level until we know what is going on
  Don't allow `fuzziness` for `multi_match` types cross_fields, phrase and phrase_prefix
  ...
Commit: 15d3d74444
.github/PULL_REQUEST_TEMPLATE.md (vendored, new file, 13 lines added)
@@ -0,0 +1,13 @@
<!--
Thank you for your interest in and contributing to Elasticsearch! There
are a few simple things to check before submitting your pull request
that can help with the review process. You should delete these items
from your submission, but they are here to help bring them to your
attention.
-->

- Have you signed the [contributor license agreement](https://www.elastic.co/contributor-agreement)?
- Have you followed the [contributor guidelines](https://github.com/elastic/elasticsearch/blob/master/.github/CONTRIBUTING.md)?
- If submitting code, have you built your formula locally prior to submission with `gradle check`?
- If submitting code, is your pull request against master? Unless there is a good reason otherwise, we prefer pull requests against master and will backport as needed.
- If submitting code, have you checked that your submission is for an [OS that we support](https://www.elastic.co/support/matrix#show_os)?
CONTRIBUTING.md

@@ -54,7 +54,7 @@ Once your changes and tests are ready to submit for review:
1. Test your changes

Run the test suite to make sure that nothing is broken. See the
- [TESTING](../TESTING.asciidoc) file for help running tests.
+ [TESTING](TESTING.asciidoc) file for help running tests.

2. Sign the Contributor License Agreement
@@ -76,7 +76,31 @@ Contributing to the Elasticsearch codebase

**Repository:** [https://github.com/elastic/elasticsearch](https://github.com/elastic/elasticsearch)

- Make sure you have [Gradle](http://gradle.org) installed, as Elasticsearch uses it as its build system. Integration with IntelliJ and Eclipse should work out of the box. Eclipse users can automatically configure their IDE: `gradle eclipse` then `File: Import: Existing Projects into Workspace`. Select the option `Search for nested projects`. Additionally you will want to ensure that Eclipse is using 2048m of heap by modifying `eclipse.ini` accordingly to avoid GC overhead errors.
+ Make sure you have [Gradle](http://gradle.org) installed, as
+ Elasticsearch uses it as its build system.
+
+ Eclipse users can automatically configure their IDE: `gradle eclipse`
+ then `File: Import: Existing Projects into Workspace`. Select the
+ option `Search for nested projects`. Additionally you will want to
+ ensure that Eclipse is using 2048m of heap by modifying `eclipse.ini`
+ accordingly to avoid GC overhead errors.
+
+ IntelliJ users can automatically configure their IDE: `gradle idea`
+ then `File->New Project From Existing Sources`. Point to the root of
+ the source directory, select
+ `Import project from external model->Gradle`, enable
+ `Use auto-import`.
+
+ The Elasticsearch codebase makes heavy use of Java `assert`s and the
+ test runner requires that assertions be enabled within the JVM. This
+ can be accomplished by passing the flag `-ea` to the JVM on startup.
+
+ For IntelliJ, go to
+ `Run->Edit Configurations...->Defaults->JUnit->VM options` and input
+ `-ea`.
+
+ For Eclipse, go to `Preferences->Java->Installed JREs` and add `-ea` to
+ `VM Arguments`.
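If you run tests straight from Gradle rather than from the IDE, the same effect can also be had from the build script. The following is an illustrative sketch only (not part of this commit), using Gradle's standard `Test` task API:

```groovy
// build.gradle (sketch): enable JVM assertions for Gradle-run tests
tasks.withType(Test) {
    jvmArgs '-ea' // the test runner expects assertions to be enabled
}
```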
Please follow these formatting guidelines:
@@ -95,12 +119,10 @@ cd elasticsearch/
gradle assemble
```

- You will find the newly built packages under: `./distribution/build/distributions/`.
+ You will find the newly built packages under: `./distribution/(deb|rpm|tar|zip)/build/distributions/`.

Before submitting your changes, run the test suite to make sure that nothing is broken, with:

```sh
gradle check
```

Source: [Contributing to elasticsearch](https://www.elastic.co/contributing-to-elasticsearch/)
NOTICE.txt

@@ -1,5 +1,5 @@
Elasticsearch
- Copyright 2009-2015 Elasticsearch
+ Copyright 2009-2016 Elasticsearch

This product includes software developed by The Apache Software
Foundation (http://www.apache.org/).
README.textile

@@ -9,7 +9,7 @@ Elasticsearch is a distributed RESTful search engine built for the cloud. Featur
* Distributed and Highly Available Search Engine.
** Each index is fully sharded with a configurable number of shards.
** Each shard can have one or more replicas.
- ** Read / Search operations performed on either one of the replica shard.
+ ** Read / Search operations performed on any of the replica shards.
* Multi Tenant with Multi Types.
** Support for more than one index.
** Support for more than one type per index.
@@ -50,19 +50,19 @@ h3. Indexing
Let's try and index some twitter like information. First, let's create a twitter user, and add some tweets (the @twitter@ index will be created automatically):

<pre>
- curl -XPUT 'http://localhost:9200/twitter/user/kimchy' -d '{ "name" : "Shay Banon" }'
+ curl -XPUT 'http://localhost:9200/twitter/user/kimchy?pretty' -d '{ "name" : "Shay Banon" }'

- curl -XPUT 'http://localhost:9200/twitter/tweet/1' -d '
+ curl -XPUT 'http://localhost:9200/twitter/tweet/1?pretty' -d '
{
    "user": "kimchy",
-    "postDate": "2009-11-15T13:12:00",
+    "post_date": "2009-11-15T13:12:00",
    "message": "Trying out Elasticsearch, so far so good?"
}'

- curl -XPUT 'http://localhost:9200/twitter/tweet/2' -d '
+ curl -XPUT 'http://localhost:9200/twitter/tweet/2?pretty' -d '
{
    "user": "kimchy",
-    "postDate": "2009-11-15T14:12:12",
+    "post_date": "2009-11-15T14:12:12",
    "message": "Another tweet, will it be indexed?"
}'
</pre>
@@ -101,7 +101,7 @@ Just for kicks, let's get all the documents stored (we should see the user as we
curl -XGET 'http://localhost:9200/twitter/_search?pretty=true' -d '
{
    "query" : {
-        "matchAll" : {}
+        "match_all" : {}
    }
}'
</pre>
@@ -113,7 +113,7 @@ curl -XGET 'http://localhost:9200/twitter/_search?pretty=true' -d '
{
    "query" : {
        "range" : {
-            "postDate" : { "from" : "2009-11-15T13:00:00", "to" : "2009-11-15T14:00:00" }
+            "post_date" : { "from" : "2009-11-15T13:00:00", "to" : "2009-11-15T14:00:00" }
        }
    }
}'
@@ -130,19 +130,19 @@ Elasticsearch supports multiple indices, as well as multiple types per index. In
Another way to define our simple twitter system is to have a different index per user (note, though that each index has an overhead). Here is the indexing curl's in this case:

<pre>
- curl -XPUT 'http://localhost:9200/kimchy/info/1' -d '{ "name" : "Shay Banon" }'
+ curl -XPUT 'http://localhost:9200/kimchy/info/1?pretty' -d '{ "name" : "Shay Banon" }'

- curl -XPUT 'http://localhost:9200/kimchy/tweet/1' -d '
+ curl -XPUT 'http://localhost:9200/kimchy/tweet/1?pretty' -d '
{
    "user": "kimchy",
-    "postDate": "2009-11-15T13:12:00",
+    "post_date": "2009-11-15T13:12:00",
    "message": "Trying out Elasticsearch, so far so good?"
}'

- curl -XPUT 'http://localhost:9200/kimchy/tweet/2' -d '
+ curl -XPUT 'http://localhost:9200/kimchy/tweet/2?pretty' -d '
{
    "user": "kimchy",
-    "postDate": "2009-11-15T14:12:12",
+    "post_date": "2009-11-15T14:12:12",
    "message": "Another tweet, will it be indexed?"
}'
</pre>
@@ -152,11 +152,11 @@ The above will index information into the @kimchy@ index, with two types, @info@
Complete control on the index level is allowed. As an example, in the above case, we would want to change from the default 5 shards with 1 replica per index, to only 1 shard with 1 replica per index (== per twitter user). Here is how this can be done (the configuration can be in yaml as well):

<pre>
- curl -XPUT http://localhost:9200/another_user/ -d '
+ curl -XPUT http://localhost:9200/another_user?pretty -d '
{
    "index" : {
-        "numberOfShards" : 1,
-        "numberOfReplicas" : 1
+        "number_of_shards" : 1,
+        "number_of_replicas" : 1
    }
}'
</pre>
@@ -168,7 +168,7 @@ index (twitter user), for example:
curl -XGET 'http://localhost:9200/kimchy,another_user/_search?pretty=true' -d '
{
    "query" : {
-        "matchAll" : {}
+        "match_all" : {}
    }
}'
</pre>
@@ -179,7 +179,7 @@ Or on all the indices:
curl -XGET 'http://localhost:9200/_search?pretty=true' -d '
{
    "query" : {
-        "matchAll" : {}
+        "match_all" : {}
    }
}'
</pre>
@@ -222,7 +222,7 @@ h1. License
<pre>
This software is licensed under the Apache License, version 2 ("ALv2"), quoted below.

- Copyright 2009-2015 Elasticsearch <https://www.elastic.co>
+ Copyright 2009-2016 Elasticsearch <https://www.elastic.co>

Licensed under the Apache License, Version 2.0 (the "License"); you may not
use this file except in compliance with the License. You may obtain a copy of
TESTING.asciidoc

@@ -345,20 +345,21 @@ gradle :qa:vagrant:checkVagrantVersion
-------------------------------------

. Download and smoke test the VMs with `gradle vagrantSmokeTest` or
- `gradle vagrantSmokeTestAllDistros`. The first time you run this it will
+ `gradle -Pvagrant.boxes=all vagrantSmokeTest`. The first time you run this it will
download the base images and provision the boxes and immediately quit. If you
run this again it'll skip the download step.

- . Run the tests with `gradle checkPackages`. This will cause gradle to build
+ . Run the tests with `gradle packagingTest`. This will cause gradle to build
the tar, zip, and deb packages and all the plugins. It will then run the tests
on ubuntu-1404 and centos-7. We chose those two distributions as the default
because they cover deb and rpm packaging and SysVinit and systemd.

- You can run on all the VMs by running `gradle checkPackagesAllDistros`. You can
- run a particular VM with a command like `gradle checkOel7`. See `gradle tasks`
- for a list. Its important to know that if you ctrl-c any of these `gradle`
- commands then the boxes will remain running and you'll have to terminate them
- with `vagrant halt`.
+ You can run on all the VMs by running `gradle -Pvagrant.boxes=all packagingTest`.
+ You can run a particular VM with a command like
+ `gradle -Pvagrant.boxes=oel-7 packagingTest`. See `gradle tasks` for a complete
+ list of available vagrant boxes for testing. It's important to know that if you
+ ctrl-c any of these `gradle` commands then the boxes will remain running and
+ you'll have to terminate them with 'gradle stop'.
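For quick reference, the invocations described above (commands as named in this file, gathered here only for convenience) are:

-------------------------------------
gradle vagrantSmokeTest
gradle -Pvagrant.boxes=all vagrantSmokeTest
gradle packagingTest
gradle -Pvagrant.boxes=all packagingTest
gradle -Pvagrant.boxes=oel-7 packagingTest
-------------------------------------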
All the regular vagrant commands should just work so you can get a shell in a
VM running trusty by running
@@ -387,7 +388,7 @@ We're missing the following because our tests are very linux/bash centric:

* Windows Server 2012

- Its important to think of VMs like cattle. If they become lame you just shoot
+ It's important to think of VMs like cattle. If they become lame you just shoot
them and let vagrant reprovision them. Say you've hosed your precise VM:

----------------------------------------------------
@@ -432,7 +433,7 @@ and in another window:

----------------------------------------------------
vagrant up centos-7 --provider virtualbox && vagrant ssh centos-7
- cd $RPM
+ cd $TESTROOT
sudo bats $BATS/*rpm*.bats
----------------------------------------------------
@@ -440,7 +441,7 @@ If you wanted to retest all the release artifacts on a single VM you could:

-------------------------------------------------
gradle prepareTestRoot
- vagrant up trusty --provider virtualbox && vagrant ssh trusty
+ vagrant up ubuntu-1404 --provider virtualbox && vagrant ssh ubuntu-1404
cd $TESTROOT
sudo bats $BATS/*.bats
-------------------------------------------------
build.gradle (69 changed lines)
@@ -18,12 +18,40 @@
|
||||
*/
|
||||
|
||||
import com.bmuschko.gradle.nexus.NexusPlugin
|
||||
import org.eclipse.jgit.lib.Repository
|
||||
import org.eclipse.jgit.lib.RepositoryBuilder
|
||||
import org.gradle.plugins.ide.eclipse.model.SourceFolder
|
||||
import org.apache.tools.ant.taskdefs.condition.Os
|
||||
|
||||
// common maven publishing configuration
|
||||
subprojects {
|
||||
group = 'org.elasticsearch'
|
||||
version = org.elasticsearch.gradle.VersionProperties.elasticsearch
|
||||
description = "Elasticsearch subproject ${project.path}"
|
||||
|
||||
// we only use maven publish to add tasks for pom generation
|
||||
plugins.withType(MavenPublishPlugin).whenPluginAdded {
|
||||
publishing {
|
||||
publications {
|
||||
// add license information to generated poms
|
||||
all {
|
||||
pom.withXml { XmlProvider xml ->
|
||||
Node node = xml.asNode()
|
||||
node.appendNode('inceptionYear', '2009')
|
||||
|
||||
Node license = node.appendNode('licenses').appendNode('license')
|
||||
license.appendNode('name', 'The Apache Software License, Version 2.0')
|
||||
license.appendNode('url', 'http://www.apache.org/licenses/LICENSE-2.0.txt')
|
||||
license.appendNode('distribution', 'repo')
|
||||
|
||||
Node developer = node.appendNode('developers').appendNode('developer')
|
||||
developer.appendNode('name', 'Elastic')
|
||||
developer.appendNode('url', 'http://www.elastic.co')
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
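// For reference, a sketch of the fragment the pom.withXml block above appends
// to each generated POM (derived from the appendNode calls; exact element
// order and formatting may differ):
/*
<inceptionYear>2009</inceptionYear>
<licenses>
  <license>
    <name>The Apache Software License, Version 2.0</name>
    <url>http://www.apache.org/licenses/LICENSE-2.0.txt</url>
    <distribution>repo</distribution>
  </license>
</licenses>
<developers>
  <developer>
    <name>Elastic</name>
    <url>http://www.elastic.co</url>
  </developer>
</developers>
*/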
|
||||
|
||||
plugins.withType(NexusPlugin).whenPluginAdded {
|
||||
modifyPom {
|
||||
@ -50,21 +78,37 @@ subprojects {
|
||||
javadoc = true
|
||||
tests = false
|
||||
}
|
||||
nexus {
|
||||
String buildSnapshot = System.getProperty('build.snapshot', 'true')
|
||||
if (buildSnapshot == 'false') {
|
||||
Repository repo = new RepositoryBuilder().findGitDir(new File('.')).build()
|
||||
String shortHash = repo.resolve('HEAD')?.name?.substring(0,7)
|
||||
repositoryUrl = project.hasProperty('build.repository') ? project.property('build.repository') : "file://${System.getenv('HOME')}/elasticsearch-releases/${version}-${shortHash}/"
|
||||
}
|
||||
}
|
||||
// we have our own username/password prompts so that they only happen once
|
||||
// TODO: add gpg signing prompts
|
||||
// TODO: add gpg signing prompts, which is tricky, as the buildDeb/buildRpm tasks are executed before this code block
|
||||
project.gradle.taskGraph.whenReady { taskGraph ->
|
||||
if (taskGraph.allTasks.any { it.name == 'uploadArchives' }) {
|
||||
Console console = System.console()
|
||||
if (project.hasProperty('nexusUsername') == false) {
|
||||
String nexusUsername = console.readLine('\nNexus username: ')
|
||||
// no need for username/password on local deploy
|
||||
if (project.nexus.repositoryUrl.startsWith('file://')) {
|
||||
project.rootProject.allprojects.each {
|
||||
it.ext.nexusUsername = nexusUsername
|
||||
it.ext.nexusUsername = 'foo'
|
||||
it.ext.nexusPassword = 'bar'
|
||||
}
|
||||
}
|
||||
if (project.hasProperty('nexusPassword') == false) {
|
||||
String nexusPassword = new String(console.readPassword('\nNexus password: '))
|
||||
project.rootProject.allprojects.each {
|
||||
it.ext.nexusPassword = nexusPassword
|
||||
} else {
|
||||
if (project.hasProperty('nexusUsername') == false) {
|
||||
String nexusUsername = console.readLine('\nNexus username: ')
|
||||
project.rootProject.allprojects.each {
|
||||
it.ext.nexusUsername = nexusUsername
|
||||
}
|
||||
}
|
||||
if (project.hasProperty('nexusPassword') == false) {
|
||||
String nexusPassword = new String(console.readPassword('\nNexus password: '))
|
||||
project.rootProject.allprojects.each {
|
||||
it.ext.nexusPassword = nexusPassword
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
@ -181,6 +225,10 @@ allprojects {
|
||||
outputDir = file('build-idea/classes/main')
|
||||
testOutputDir = file('build-idea/classes/test')
|
||||
|
||||
// also ignore other possible build dirs
|
||||
excludeDirs += file('build')
|
||||
excludeDirs += file('build-eclipse')
|
||||
|
||||
iml {
|
||||
// fix so that Gradle idea plugin properly generates support for resource folders
|
||||
// see also https://issues.gradle.org/browse/GRADLE-2975
|
||||
@ -227,6 +275,9 @@ allprojects {
|
||||
// Name all the non-root projects after their path so that paths get grouped together when imported into eclipse.
|
||||
if (path != ':') {
|
||||
eclipse.project.name = path
|
||||
if (Os.isFamily(Os.FAMILY_WINDOWS)) {
|
||||
eclipse.project.name = eclipse.project.name.replace(':', '_')
|
||||
}
|
||||
}
|
||||
|
||||
plugins.withType(JavaBasePlugin) {
|
||||
|
buildSrc/.gitignore (vendored, new file, 1 line added)
@@ -0,0 +1 @@
|
||||
build-bootstrap/
|
buildSrc/build.gradle
@@ -1,5 +1,3 @@
|
||||
import java.nio.file.Files
|
||||
|
||||
/*
|
||||
* Licensed to Elasticsearch under one or more contributor
|
||||
* license agreements. See the NOTICE file distributed with
|
||||
@ -19,25 +17,21 @@ import java.nio.file.Files
|
||||
* under the License.
|
||||
*/
|
||||
|
||||
// we must use buildscript + apply so that an external plugin
|
||||
// can apply this file, since the plugins directive is not
|
||||
// supported through file includes
|
||||
buildscript {
|
||||
repositories {
|
||||
jcenter()
|
||||
}
|
||||
dependencies {
|
||||
classpath 'com.bmuschko:gradle-nexus-plugin:2.3.1'
|
||||
}
|
||||
}
|
||||
import java.nio.file.Files
|
||||
|
||||
apply plugin: 'groovy'
|
||||
apply plugin: 'com.bmuschko.nexus'
|
||||
// TODO: move common IDE configuration to a common file to include
|
||||
apply plugin: 'idea'
|
||||
apply plugin: 'eclipse'
|
||||
|
||||
group = 'org.elasticsearch.gradle'
|
||||
archivesBaseName = 'build-tools'
|
||||
|
||||
if (project == rootProject) {
|
||||
// change the build dir used during build init, so that doing a clean
|
||||
// won't wipe out the buildscript jar
|
||||
buildDir = 'build-bootstrap'
|
||||
}
|
||||
|
||||
/*****************************************************************************
|
||||
* Propagating version.properties to the rest of the build *
|
||||
*****************************************************************************/
|
||||
|
||||
Properties props = new Properties()
|
||||
props.load(project.file('version.properties').newDataInputStream())
|
||||
@ -51,35 +45,10 @@ if (snapshot) {
|
||||
props.put("elasticsearch", version);
|
||||
}
|
||||
|
||||
|
||||
repositories {
|
||||
mavenCentral()
|
||||
maven {
|
||||
name 'sonatype-snapshots'
|
||||
url "https://oss.sonatype.org/content/repositories/snapshots/"
|
||||
}
|
||||
jcenter()
|
||||
}
|
||||
|
||||
dependencies {
|
||||
compile gradleApi()
|
||||
compile localGroovy()
|
||||
compile "com.carrotsearch.randomizedtesting:junit4-ant:${props.getProperty('randomizedrunner')}"
|
||||
compile("junit:junit:${props.getProperty('junit')}") {
|
||||
transitive = false
|
||||
}
|
||||
compile 'com.netflix.nebula:gradle-extra-configurations-plugin:3.0.3'
|
||||
compile 'com.netflix.nebula:gradle-info-plugin:3.0.3'
|
||||
compile 'org.eclipse.jgit:org.eclipse.jgit:3.2.0.201312181205-r'
|
||||
compile 'com.perforce:p4java:2012.3.551082' // THIS IS SUPPOSED TO BE OPTIONAL IN THE FUTURE....
|
||||
compile 'de.thetaphi:forbiddenapis:2.0'
|
||||
compile 'com.bmuschko:gradle-nexus-plugin:2.3.1'
|
||||
compile 'org.apache.rat:apache-rat:0.11'
|
||||
}
|
||||
|
||||
File tempPropertiesFile = new File(project.buildDir, "version.properties")
|
||||
task writeVersionProperties {
|
||||
inputs.properties(props)
|
||||
outputs.file(tempPropertiesFile)
|
||||
doLast {
|
||||
OutputStream stream = Files.newOutputStream(tempPropertiesFile.toPath());
|
||||
try {
|
||||
@ -95,31 +64,77 @@ processResources {
|
||||
from tempPropertiesFile
|
||||
}
|
||||
|
||||
extraArchive {
|
||||
javadoc = false
|
||||
tests = false
|
||||
/*****************************************************************************
|
||||
* Dependencies used by the entire build *
|
||||
*****************************************************************************/
|
||||
|
||||
repositories {
|
||||
jcenter()
|
||||
}
|
||||
|
||||
idea {
|
||||
module {
|
||||
inheritOutputDirs = false
|
||||
outputDir = file('build-idea/classes/main')
|
||||
testOutputDir = file('build-idea/classes/test')
|
||||
dependencies {
|
||||
compile gradleApi()
|
||||
compile localGroovy()
|
||||
compile "com.carrotsearch.randomizedtesting:junit4-ant:${props.getProperty('randomizedrunner')}"
|
||||
compile("junit:junit:${props.getProperty('junit')}") {
|
||||
transitive = false
|
||||
}
|
||||
compile 'com.netflix.nebula:gradle-extra-configurations-plugin:3.0.3'
|
||||
compile 'com.netflix.nebula:nebula-publishing-plugin:4.4.4'
|
||||
compile 'com.netflix.nebula:gradle-info-plugin:3.0.3'
|
||||
compile 'org.eclipse.jgit:org.eclipse.jgit:3.2.0.201312181205-r'
|
||||
compile 'com.perforce:p4java:2012.3.551082' // THIS IS SUPPOSED TO BE OPTIONAL IN THE FUTURE....
|
||||
compile 'de.thetaphi:forbiddenapis:2.0'
|
||||
compile 'com.bmuschko:gradle-nexus-plugin:2.3.1'
|
||||
compile 'org.apache.rat:apache-rat:0.11'
|
||||
}
|
||||
|
||||
|
||||
/*****************************************************************************
|
||||
* Bootstrap repositories *
|
||||
*****************************************************************************/
|
||||
// this will only happen when buildSrc is built on its own during build init
|
||||
if (project == rootProject) {
|
||||
|
||||
repositories {
|
||||
mavenCentral()
|
||||
maven {
|
||||
name 'sonatype-snapshots'
|
||||
url "https://oss.sonatype.org/content/repositories/snapshots/"
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
eclipse {
|
||||
classpath {
|
||||
defaultOutputDir = file('build-eclipse')
|
||||
/*****************************************************************************
|
||||
* Normal project checks *
|
||||
*****************************************************************************/
|
||||
|
||||
// this happens when included as a normal project in the build, which we do
|
||||
// to enforce precommit checks like forbidden apis, as well as setup publishing
|
||||
if (project != rootProject) {
|
||||
apply plugin: 'elasticsearch.build'
|
||||
apply plugin: 'nebula.maven-base-publish'
|
||||
apply plugin: 'nebula.maven-scm'
|
||||
|
||||
// groovydoc succeeds, but has some weird internal exception...
|
||||
groovydoc.enabled = false
|
||||
|
||||
// build-tools is not ready for primetime with these...
|
||||
dependencyLicenses.enabled = false
|
||||
forbiddenApisMain.enabled = false
|
||||
jarHell.enabled = false
|
||||
loggerUsageCheck.enabled = false
|
||||
thirdPartyAudit.enabled = false
|
||||
|
||||
// test for elasticsearch.build tries to run with ES...
|
||||
test.enabled = false
|
||||
|
||||
// TODO: re-enable once randomizedtesting gradle code is published and removed from here
|
||||
licenseHeaders.enabled = false
|
||||
|
||||
forbiddenPatterns {
|
||||
exclude '**/*.wav'
|
||||
// the file that actually defines nocommit
|
||||
exclude '**/ForbiddenPatternsTask.groovy'
|
||||
}
|
||||
}
|
||||
|
||||
task copyEclipseSettings(type: Copy) {
|
||||
from project.file('src/main/resources/eclipse.settings')
|
||||
into '.settings'
|
||||
}
|
||||
// otherwise .settings is not nuked entirely
|
||||
tasks.cleanEclipse {
|
||||
delete '.settings'
|
||||
}
|
||||
tasks.eclipse.dependsOn(cleanEclipse, copyEclipseSettings)
|
||||
|
BuildPlugin.groovy
@@ -19,6 +19,7 @@
|
||||
package org.elasticsearch.gradle
|
||||
|
||||
import nebula.plugin.extraconfigurations.ProvidedBasePlugin
|
||||
import nebula.plugin.publishing.maven.MavenBasePublishPlugin
|
||||
import org.elasticsearch.gradle.precommit.PrecommitTasks
|
||||
import org.gradle.api.GradleException
|
||||
import org.gradle.api.JavaVersion
|
||||
@ -33,6 +34,8 @@ import org.gradle.api.artifacts.ProjectDependency
|
||||
import org.gradle.api.artifacts.ResolvedArtifact
|
||||
import org.gradle.api.artifacts.dsl.RepositoryHandler
|
||||
import org.gradle.api.artifacts.maven.MavenPom
|
||||
import org.gradle.api.publish.maven.MavenPublication
|
||||
import org.gradle.api.publish.maven.tasks.GenerateMavenPom
|
||||
import org.gradle.api.tasks.bundling.Jar
|
||||
import org.gradle.api.tasks.compile.JavaCompile
|
||||
import org.gradle.internal.jvm.Jvm
|
||||
@ -54,7 +57,7 @@ class BuildPlugin implements Plugin<Project> {
|
||||
project.pluginManager.apply('java')
|
||||
project.pluginManager.apply('carrotsearch.randomized-testing')
|
||||
// these plugins add lots of info to our jars
|
||||
configureJarManifest(project) // jar config must be added before info broker
|
||||
configureJars(project) // jar config must be added before info broker
|
||||
project.pluginManager.apply('nebula.info-broker')
|
||||
project.pluginManager.apply('nebula.info-basic')
|
||||
project.pluginManager.apply('nebula.info-java')
|
||||
@ -68,6 +71,7 @@ class BuildPlugin implements Plugin<Project> {
|
||||
configureConfigurations(project)
|
||||
project.ext.versions = VersionProperties.versions
|
||||
configureCompile(project)
|
||||
configurePomGeneration(project)
|
||||
|
||||
configureTest(project)
|
||||
configurePrecommit(project)
|
||||
@ -100,9 +104,12 @@ class BuildPlugin implements Plugin<Project> {
|
||||
println " OS Info : ${System.getProperty('os.name')} ${System.getProperty('os.version')} (${System.getProperty('os.arch')})"
|
||||
if (gradleJavaVersionDetails != javaVersionDetails) {
|
||||
println " JDK Version (gradle) : ${gradleJavaVersionDetails}"
|
||||
println " JAVA_HOME (gradle) : ${gradleJavaHome}"
|
||||
println " JDK Version (compile) : ${javaVersionDetails}"
|
||||
println " JAVA_HOME (compile) : ${javaHome}"
|
||||
} else {
|
||||
println " JDK Version : ${gradleJavaVersionDetails}"
|
||||
println " JAVA_HOME : ${gradleJavaHome}"
|
||||
}
|
||||
|
||||
// enforce gradle version
|
||||
@ -225,7 +232,7 @@ class BuildPlugin implements Plugin<Project> {
|
||||
*/
|
||||
static void configureConfigurations(Project project) {
|
||||
// we are not shipping these jars, we act like dumb consumers of these things
|
||||
if (project.path.startsWith(':test:fixtures')) {
|
||||
if (project.path.startsWith(':test:fixtures') || project.path == ':build-tools') {
|
||||
return
|
||||
}
|
||||
// fail on any conflicting dependency versions
|
||||
@ -263,44 +270,7 @@ class BuildPlugin implements Plugin<Project> {
|
||||
|
||||
// add exclusions to the pom directly, for each of the transitive deps of this project's deps
|
||||
project.modifyPom { MavenPom pom ->
|
||||
pom.withXml { XmlProvider xml ->
|
||||
// first find if we have dependencies at all, and grab the node
|
||||
NodeList depsNodes = xml.asNode().get('dependencies')
|
||||
if (depsNodes.isEmpty()) {
|
||||
return
|
||||
}
|
||||
|
||||
// check each dependency for any transitive deps
|
||||
for (Node depNode : depsNodes.get(0).children()) {
|
||||
String groupId = depNode.get('groupId').get(0).text()
|
||||
String artifactId = depNode.get('artifactId').get(0).text()
|
||||
String version = depNode.get('version').get(0).text()
|
||||
|
||||
// collect the transitive deps now that we know what this dependency is
|
||||
String depConfig = transitiveDepConfigName(groupId, artifactId, version)
|
||||
Configuration configuration = project.configurations.findByName(depConfig)
|
||||
if (configuration == null) {
|
||||
continue // we did not make this dep non-transitive
|
||||
}
|
||||
Set<ResolvedArtifact> artifacts = configuration.resolvedConfiguration.resolvedArtifacts
|
||||
if (artifacts.size() <= 1) {
|
||||
// this dep has no transitive deps (or the only artifact is itself)
|
||||
continue
|
||||
}
|
||||
|
||||
// we now know we have something to exclude, so add the exclusion elements
|
||||
Node exclusions = depNode.appendNode('exclusions')
|
||||
for (ResolvedArtifact transitiveArtifact : artifacts) {
|
||||
ModuleVersionIdentifier transitiveDep = transitiveArtifact.moduleVersion.id
|
||||
if (transitiveDep.group == groupId && transitiveDep.name == artifactId) {
|
||||
continue; // don't exclude the dependency itself!
|
||||
}
|
||||
Node exclusion = exclusions.appendNode('exclusion')
|
||||
exclusion.appendNode('groupId', transitiveDep.group)
|
||||
exclusion.appendNode('artifactId', transitiveDep.name)
|
||||
}
|
||||
}
|
||||
}
|
||||
pom.withXml(removeTransitiveDependencies(project))
|
||||
}
|
||||
}
|
||||
|
||||
@ -329,6 +299,70 @@ class BuildPlugin implements Plugin<Project> {
|
||||
}
|
||||
}
|
||||
|
||||
/** Returns a closure which can be used with a MavenPom for removing transitive dependencies. */
|
||||
private static Closure removeTransitiveDependencies(Project project) {
|
||||
// TODO: remove this when enforcing gradle 2.13+, it now properly handles exclusions
|
||||
return { XmlProvider xml ->
|
||||
// first find if we have dependencies at all, and grab the node
|
||||
NodeList depsNodes = xml.asNode().get('dependencies')
|
||||
if (depsNodes.isEmpty()) {
|
||||
return
|
||||
}
|
||||
|
||||
// check each dependency for any transitive deps
|
||||
for (Node depNode : depsNodes.get(0).children()) {
|
||||
String groupId = depNode.get('groupId').get(0).text()
|
||||
String artifactId = depNode.get('artifactId').get(0).text()
|
||||
String version = depNode.get('version').get(0).text()
|
||||
|
||||
// collect the transitive deps now that we know what this dependency is
|
||||
String depConfig = transitiveDepConfigName(groupId, artifactId, version)
|
||||
Configuration configuration = project.configurations.findByName(depConfig)
|
||||
if (configuration == null) {
|
||||
continue // we did not make this dep non-transitive
|
||||
}
|
||||
Set<ResolvedArtifact> artifacts = configuration.resolvedConfiguration.resolvedArtifacts
|
||||
if (artifacts.size() <= 1) {
|
||||
// this dep has no transitive deps (or the only artifact is itself)
|
||||
continue
|
||||
}
|
||||
|
||||
// we now know we have something to exclude, so add the exclusion elements
|
||||
Node exclusions = depNode.appendNode('exclusions')
|
||||
for (ResolvedArtifact transitiveArtifact : artifacts) {
|
||||
ModuleVersionIdentifier transitiveDep = transitiveArtifact.moduleVersion.id
|
||||
if (transitiveDep.group == groupId && transitiveDep.name == artifactId) {
|
||||
continue; // don't exclude the dependency itself!
|
||||
}
|
||||
Node exclusion = exclusions.appendNode('exclusion')
|
||||
exclusion.appendNode('groupId', transitiveDep.group)
|
||||
exclusion.appendNode('artifactId', transitiveDep.name)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/** Configures generation of maven poms. */
|
||||
public static void configurePomGeneration(Project project) {
|
||||
project.plugins.withType(MavenBasePublishPlugin.class).whenPluginAdded {
|
||||
project.publishing {
|
||||
publications {
|
||||
all { MavenPublication publication -> // we only deal with maven
|
||||
// add exclusions to the pom directly, for each of the transitive deps of this project's deps
|
||||
publication.pom.withXml(removeTransitiveDependencies(project))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
project.tasks.withType(GenerateMavenPom.class) { GenerateMavenPom t ->
|
||||
// place the pom next to the jar it is for
|
||||
t.destination = new File(project.buildDir, "distributions/${project.archivesBaseName}-${project.version}.pom")
|
||||
// build poms with assemble
|
||||
project.assemble.dependsOn(t)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/** Adds compiler settings to the project */
|
||||
static void configureCompile(Project project) {
|
||||
project.ext.compactProfile = 'compact3'
|
||||
@ -352,13 +386,21 @@ class BuildPlugin implements Plugin<Project> {
|
||||
}
|
||||
options.encoding = 'UTF-8'
|
||||
//options.incremental = true
|
||||
|
||||
// gradle ignores target/source compatibility when it is "unnecessary", but since to compile with
|
||||
// java 9, gradle is running in java 8, it incorrectly thinks it is unnecessary
|
||||
assert minimumJava == JavaVersion.VERSION_1_8
|
||||
options.compilerArgs << '-target' << '1.8' << '-source' << '1.8'
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/** Adds additional manifest info to jars */
|
||||
static void configureJarManifest(Project project) {
|
||||
/** Adds additional manifest info to jars, and adds source and javadoc jars */
|
||||
static void configureJars(Project project) {
|
||||
project.tasks.withType(Jar) { Jar jarTask ->
|
||||
// we put all our distributable files under distributions
|
||||
jarTask.destinationDir = new File(project.buildDir, 'distributions')
|
||||
// fixup the jar manifest
|
||||
jarTask.doFirst {
|
||||
boolean isSnapshot = VersionProperties.elasticsearch.endsWith("-SNAPSHOT");
|
||||
String version = VersionProperties.elasticsearch;
|
||||
|
LoggedExec.groovy
@@ -26,14 +26,17 @@ import org.gradle.api.tasks.Exec
|
||||
* A wrapper around gradle's Exec task to capture output and log on error.
|
||||
*/
|
||||
class LoggedExec extends Exec {
|
||||
|
||||
protected ByteArrayOutputStream output = new ByteArrayOutputStream()
|
||||
|
||||
LoggedExec() {
|
||||
if (logger.isInfoEnabled() == false) {
|
||||
standardOutput = new ByteArrayOutputStream()
|
||||
errorOutput = standardOutput
|
||||
standardOutput = output
|
||||
errorOutput = output
|
||||
ignoreExitValue = true
|
||||
doLast {
|
||||
if (execResult.exitValue != 0) {
|
||||
standardOutput.toString('UTF-8').eachLine { line -> logger.error(line) }
|
||||
output.toString('UTF-8').eachLine { line -> logger.error(line) }
|
||||
throw new GradleException("Process '${executable} ${args.join(' ')}' finished with non-zero exit value ${execResult.exitValue}")
|
||||
}
|
||||
}
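// A sketch (not part of the diff) of how LoggedExec can be used from a build
// script once buildSrc is on the classpath; the task name and the command
// used here are assumptions chosen for illustration.
task listFiles(type: org.elasticsearch.gradle.LoggedExec) {
    executable = 'ls'   // properties inherited from Gradle's Exec task
    args '-la'          // output is buffered and only logged when the command fails
}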
|
||||
|
DocsTestPlugin.groovy (new file)
@@ -0,0 +1,65 @@
|
||||
/*
|
||||
* Licensed to Elasticsearch under one or more contributor
|
||||
* license agreements. See the NOTICE file distributed with
|
||||
* this work for additional information regarding copyright
|
||||
* ownership. Elasticsearch licenses this file to you under
|
||||
* the Apache License, Version 2.0 (the "License"); you may
|
||||
* not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing,
|
||||
* software distributed under the License is distributed on an
|
||||
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
* KIND, either express or implied. See the License for the
|
||||
* specific language governing permissions and limitations
|
||||
* under the License.
|
||||
*/
|
||||
package org.elasticsearch.gradle.doc
|
||||
|
||||
import org.elasticsearch.gradle.test.RestTestPlugin
|
||||
import org.gradle.api.Project
|
||||
import org.gradle.api.Task
|
||||
|
||||
/**
|
||||
* Sets up tests for documentation.
|
||||
*/
|
||||
public class DocsTestPlugin extends RestTestPlugin {
|
||||
|
||||
@Override
|
||||
public void apply(Project project) {
|
||||
super.apply(project)
|
||||
Task listSnippets = project.tasks.create('listSnippets', SnippetsTask)
|
||||
listSnippets.group 'Docs'
|
||||
listSnippets.description 'List each snippet'
|
||||
listSnippets.perSnippet { println(it.toString()) }
|
||||
|
||||
Task listConsoleCandidates = project.tasks.create(
|
||||
'listConsoleCandidates', SnippetsTask)
|
||||
listConsoleCandidates.group 'Docs'
|
||||
listConsoleCandidates.description
|
||||
'List snippets that probably should be marked // CONSOLE'
|
||||
listConsoleCandidates.perSnippet {
|
||||
if (
|
||||
it.console // Already marked, nothing to do
|
||||
|| it.testResponse // It is a response
|
||||
) {
|
||||
return
|
||||
}
|
||||
List<String> languages = [
|
||||
// These languages should almost always be marked console
|
||||
'js', 'json',
|
||||
// These are often curl commands that should be converted but
|
||||
// are probably false positives
|
||||
'sh', 'shell',
|
||||
]
|
||||
if (false == languages.contains(it.language)) {
|
||||
return
|
||||
}
|
||||
println(it.toString())
|
||||
}
|
||||
|
||||
project.tasks.create('buildRestTests', RestTestsFromSnippetsTask)
|
||||
}
|
||||
}
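// Usage note (derived from the task registrations above): once this plugin is
// applied to a docs project, `gradle listSnippets`, `gradle listConsoleCandidates`
// and `gradle buildRestTests` become available.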
|
RestTestsFromSnippetsTask.groovy (new file)
@@ -0,0 +1,212 @@
|
||||
/*
|
||||
* Licensed to Elasticsearch under one or more contributor
|
||||
* license agreements. See the NOTICE file distributed with
|
||||
* this work for additional information regarding copyright
|
||||
* ownership. Elasticsearch licenses this file to you under
|
||||
* the Apache License, Version 2.0 (the "License"); you may
|
||||
* not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing,
|
||||
* software distributed under the License is distributed on an
|
||||
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
* KIND, either express or implied. See the License for the
|
||||
* specific language governing permissions and limitations
|
||||
* under the License.
|
||||
*/
|
||||
|
||||
package org.elasticsearch.gradle.doc
|
||||
|
||||
import org.elasticsearch.gradle.doc.SnippetsTask.Snippet
|
||||
import org.gradle.api.InvalidUserDataException
|
||||
import org.gradle.api.tasks.Input
|
||||
import org.gradle.api.tasks.OutputDirectory
|
||||
|
||||
import java.nio.file.Files
|
||||
import java.nio.file.Path
|
||||
import java.util.regex.Matcher
|
||||
|
||||
/**
|
||||
* Generates REST tests for each snippet marked // TEST.
|
||||
*/
|
||||
public class RestTestsFromSnippetsTask extends SnippetsTask {
|
||||
@Input
|
||||
Map<String, String> setups = new HashMap()
|
||||
|
||||
/**
|
||||
* Root directory of the tests being generated. To make rest tests happy
|
||||
* we generate them in a testRoot() which is contained in this directory.
|
||||
*/
|
||||
@OutputDirectory
|
||||
File testRoot = project.file('build/rest')
|
||||
|
||||
public RestTestsFromSnippetsTask() {
|
||||
project.afterEvaluate {
|
||||
// Wait to set this so testRoot can be customized
|
||||
project.sourceSets.test.output.dir(testRoot, builtBy: this)
|
||||
}
|
||||
TestBuilder builder = new TestBuilder()
|
||||
doFirst { outputRoot().delete() }
|
||||
perSnippet builder.&handleSnippet
|
||||
doLast builder.&finishLastTest
|
||||
}
|
||||
|
||||
/**
|
||||
* Root directory containing all the files generated by this task. It is
|
||||
* contained within testRoot.
|
||||
*/
|
||||
File outputRoot() {
|
||||
return new File(testRoot, '/rest-api-spec/test')
|
||||
}
|
||||
|
||||
private class TestBuilder {
|
||||
private static final String SYNTAX = {
|
||||
String method = /(?<method>GET|PUT|POST|HEAD|OPTIONS|DELETE)/
|
||||
String pathAndQuery = /(?<pathAndQuery>[^\n]+)/
|
||||
String badBody = /GET|PUT|POST|HEAD|OPTIONS|DELETE|#/
|
||||
String body = /(?<body>(?:\n(?!$badBody)[^\n]+)+)/
|
||||
String nonComment = /$method\s+$pathAndQuery$body?/
|
||||
String comment = /(?<comment>#.+)/
|
||||
/(?:$comment|$nonComment)\n+/
|
||||
}()
|
||||
|
||||
/**
|
||||
* The file in which we saw the last snippet that made a test.
|
||||
*/
|
||||
Path lastDocsPath
|
||||
|
||||
/**
|
||||
* The file we're building.
|
||||
*/
|
||||
PrintWriter current
|
||||
|
||||
/**
|
||||
* Called each time a snippet is encountered. Tracks the snippets and
|
||||
* calls buildTest to actually build the test.
|
||||
*/
|
||||
void handleSnippet(Snippet snippet) {
|
||||
if (snippet.testSetup) {
|
||||
setup(snippet)
|
||||
return
|
||||
}
|
||||
if (snippet.testResponse) {
|
||||
response(snippet)
|
||||
return
|
||||
}
|
||||
if (snippet.test || snippet.console) {
|
||||
test(snippet)
|
||||
return
|
||||
}
|
||||
// Must be an unmarked snippet....
|
||||
}
|
||||
|
||||
private void test(Snippet test) {
|
||||
setupCurrent(test)
|
||||
|
||||
if (false == test.continued) {
|
||||
current.println('---')
|
||||
current.println("\"$test.start\":")
|
||||
}
|
||||
if (test.skipTest) {
|
||||
current.println(" - skip:")
|
||||
current.println(" features: always_skip")
|
||||
current.println(" reason: $test.skipTest")
|
||||
}
|
||||
if (test.setup != null) {
|
||||
String setup = setups[test.setup]
|
||||
if (setup == null) {
|
||||
throw new InvalidUserDataException("Couldn't find setup "
|
||||
+ "for $test")
|
||||
}
|
||||
current.println(setup)
|
||||
}
|
||||
|
||||
body(test)
|
||||
}
|
||||
|
||||
private void response(Snippet response) {
|
||||
current.println(" - response_body: |")
|
||||
response.contents.eachLine { current.println(" $it") }
|
||||
}
|
||||
|
||||
void emitDo(String method, String pathAndQuery,
|
||||
String body, String catchPart) {
|
||||
def (String path, String query) = pathAndQuery.tokenize('?')
|
||||
current.println(" - do:")
|
||||
if (catchPart != null) {
|
||||
current.println(" catch: $catchPart")
|
||||
}
|
||||
current.println(" raw:")
|
||||
current.println(" method: $method")
|
||||
current.println(" path: \"$path\"")
|
||||
if (query != null) {
|
||||
for (String param: query.tokenize('&')) {
|
||||
def (String name, String value) = param.tokenize('=')
|
||||
current.println(" $name: \"$value\"")
|
||||
}
|
||||
}
|
||||
if (body != null) {
|
||||
// Throw out the leading newline we get from parsing the body
|
||||
body = body.substring(1)
|
||||
current.println(" body: |")
|
||||
body.eachLine { current.println(" $it") }
|
||||
}
|
||||
}
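// For orientation (an illustrative sketch, not output captured from a real
// run; indentation is approximate): for a snippet like
// `GET /twitter/_search?pretty=true` with a JSON body, the printlns above
// emit a YAML "do" section along these lines:
/*
  - do:
      raw:
        method: GET
        path: "twitter/_search"
        pretty: "true"
        body: |
          { "query": { "match_all": {} } }
*/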
|
||||
|
||||
private void setup(Snippet setup) {
|
||||
if (lastDocsPath == setup.path) {
|
||||
throw new InvalidUserDataException("$setup: wasn't first")
|
||||
}
|
||||
setupCurrent(setup)
|
||||
current.println('---')
|
||||
current.println("setup:")
|
||||
body(setup)
|
||||
}
|
||||
|
||||
private void body(Snippet snippet) {
|
||||
parse("$snippet", snippet.contents, SYNTAX) { matcher, last ->
|
||||
if (matcher.group("comment") != null) {
|
||||
// Comment
|
||||
return
|
||||
}
|
||||
String method = matcher.group("method")
|
||||
String pathAndQuery = matcher.group("pathAndQuery")
|
||||
String body = matcher.group("body")
|
||||
String catchPart = last ? snippet.catchPart : null
|
||||
if (pathAndQuery.startsWith('/')) {
|
||||
// Leading '/'s break the generated paths
|
||||
pathAndQuery = pathAndQuery.substring(1)
|
||||
}
|
||||
emitDo(method, pathAndQuery, body, catchPart)
|
||||
}
|
||||
}
|
||||
|
||||
private PrintWriter setupCurrent(Snippet test) {
|
||||
if (lastDocsPath == test.path) {
|
||||
return
|
||||
}
|
||||
finishLastTest()
|
||||
lastDocsPath = test.path
|
||||
|
||||
// Make the destination file:
|
||||
// Shift the path into the destination directory tree
|
||||
Path dest = outputRoot().toPath().resolve(test.path)
|
||||
// Replace the extension
|
||||
String fileName = dest.getName(dest.nameCount - 1)
|
||||
dest = dest.parent.resolve(fileName.replace('.asciidoc', '.yaml'))
|
||||
|
||||
// Now setup the writer
|
||||
Files.createDirectories(dest.parent)
|
||||
current = dest.newPrintWriter('UTF-8')
|
||||
}
|
||||
|
||||
void finishLastTest() {
|
||||
if (current != null) {
|
||||
current.close()
|
||||
current = null
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
SnippetsTask.groovy (new file)
@@ -0,0 +1,308 @@
|
||||
/*
|
||||
* Licensed to Elasticsearch under one or more contributor
|
||||
* license agreements. See the NOTICE file distributed with
|
||||
* this work for additional information regarding copyright
|
||||
* ownership. Elasticsearch licenses this file to you under
|
||||
* the Apache License, Version 2.0 (the "License"); you may
|
||||
* not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing,
|
||||
* software distributed under the License is distributed on an
|
||||
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
* KIND, either express or implied. See the License for the
|
||||
* specific language governing permissions and limitations
|
||||
* under the License.
|
||||
*/
|
||||
|
||||
package org.elasticsearch.gradle.doc
|
||||
|
||||
import org.gradle.api.DefaultTask
|
||||
import org.gradle.api.InvalidUserDataException
|
||||
import org.gradle.api.file.ConfigurableFileTree
|
||||
import org.gradle.api.tasks.InputFiles
|
||||
import org.gradle.api.tasks.TaskAction
|
||||
|
||||
import java.nio.file.Path
|
||||
import java.util.regex.Matcher
|
||||
|
||||
/**
|
||||
* A task which will run a closure on each snippet in the documentation.
|
||||
*/
|
||||
public class SnippetsTask extends DefaultTask {
|
||||
private static final String SCHAR = /(?:\\\/|[^\/])/
|
||||
private static final String SUBSTITUTION = /s\/($SCHAR+)\/($SCHAR*)\//
|
||||
private static final String CATCH = /catch:\s*((?:\/[^\/]+\/)|[^ \]]+)/
|
||||
private static final String SKIP = /skip:([^\]]+)/
|
||||
private static final String SETUP = /setup:([^ \]]+)/
|
||||
private static final String TEST_SYNTAX =
|
||||
/(?:$CATCH|$SUBSTITUTION|$SKIP|(continued)|$SETUP) ?/
|
||||
|
||||
/**
|
||||
* Action to take on each snippet. Called with a single parameter, an
|
||||
* instance of Snippet.
|
||||
*/
|
||||
Closure perSnippet
|
||||
|
||||
/**
|
||||
* The docs to scan. Defaults to every file in the directory except the
|
||||
* build.gradle file because that is appropriate for Elasticsearch's docs
|
||||
* directory.
|
||||
*/
|
||||
@InputFiles
|
||||
ConfigurableFileTree docs = project.fileTree(project.projectDir) {
|
||||
// No snippets in the build file
|
||||
exclude 'build.gradle'
|
||||
// That is where the snippets go, not where they come from!
|
||||
exclude 'build'
|
||||
}
|
||||
|
||||
@TaskAction
|
||||
public void executeTask() {
|
||||
/*
|
||||
* Walks each line of each file, building snippets as it encounters
|
||||
* the lines that make up the snippet.
|
||||
*/
|
||||
for (File file: docs) {
|
||||
String lastLanguage
|
||||
int lastLanguageLine
|
||||
Snippet snippet = null
|
||||
StringBuilder contents = null
|
||||
List substitutions = null
|
||||
Closure emit = {
|
||||
snippet.contents = contents.toString()
|
||||
contents = null
|
||||
if (substitutions != null) {
|
||||
substitutions.each { String pattern, String subst ->
|
||||
/*
|
||||
* $body is really common but it looks like a
|
||||
* backreference so we just escape it here to make the
|
||||
* tests cleaner.
|
||||
*/
|
||||
subst = subst.replace('$body', '\\$body')
|
||||
// \n is a new line....
|
||||
subst = subst.replace('\\n', '\n')
|
||||
snippet.contents = snippet.contents.replaceAll(
|
||||
pattern, subst)
|
||||
}
|
||||
substitutions = null
|
||||
}
|
||||
perSnippet(snippet)
|
||||
snippet = null
|
||||
}
|
||||
file.eachLine('UTF-8') { String line, int lineNumber ->
|
||||
Matcher matcher
|
||||
if (line ==~ /-{4,}\s*/) { // Four dashes looks like a snippet
|
||||
if (snippet == null) {
|
||||
Path path = docs.dir.toPath().relativize(file.toPath())
|
||||
snippet = new Snippet(path: path, start: lineNumber)
|
||||
if (lastLanguageLine == lineNumber - 1) {
|
||||
snippet.language = lastLanguage
|
||||
}
|
||||
} else {
|
||||
snippet.end = lineNumber
|
||||
}
|
||||
return
|
||||
}
|
||||
matcher = line =~ /\[source,(\w+)]\s*/
|
||||
if (matcher.matches()) {
|
||||
lastLanguage = matcher.group(1)
|
||||
lastLanguageLine = lineNumber
|
||||
return
|
||||
}
|
||||
if (line ==~ /\/\/\s*AUTOSENSE\s*/) {
|
||||
throw new InvalidUserDataException("AUTOSENSE has been " +
|
||||
"replaced by CONSOLE. Use that instead at " +
|
||||
"$file:$lineNumber")
|
||||
}
|
||||
if (line ==~ /\/\/\s*CONSOLE\s*/) {
|
||||
if (snippet == null) {
|
||||
throw new InvalidUserDataException("CONSOLE not " +
|
||||
"paired with a snippet at $file:$lineNumber")
|
||||
}
|
||||
snippet.console = true
|
||||
return
|
||||
}
|
||||
matcher = line =~ /\/\/\s*TEST(\[(.+)\])?\s*/
|
||||
if (matcher.matches()) {
|
||||
if (snippet == null) {
|
||||
throw new InvalidUserDataException("TEST not " +
|
||||
"paired with a snippet at $file:$lineNumber")
|
||||
}
|
||||
snippet.test = true
|
||||
if (matcher.group(2) != null) {
|
||||
String loc = "$file:$lineNumber"
|
||||
parse(loc, matcher.group(2), TEST_SYNTAX) {
|
||||
if (it.group(1) != null) {
|
||||
snippet.catchPart = it.group(1)
|
||||
return
|
||||
}
|
||||
if (it.group(2) != null) {
|
||||
if (substitutions == null) {
|
||||
substitutions = []
|
||||
}
|
||||
substitutions.add([it.group(2), it.group(3)])
|
||||
return
|
||||
}
|
||||
if (it.group(4) != null) {
|
||||
snippet.skipTest = it.group(4)
|
||||
return
|
||||
}
|
||||
if (it.group(5) != null) {
|
||||
snippet.continued = true
|
||||
return
|
||||
}
|
||||
if (it.group(6) != null) {
|
||||
snippet.setup = it.group(6)
|
||||
return
|
||||
}
|
||||
throw new InvalidUserDataException(
|
||||
"Invalid test marker: $line")
|
||||
}
|
||||
}
|
||||
return
|
||||
}
|
||||
matcher = line =~ /\/\/\s*TESTRESPONSE(\[(.+)\])?\s*/
|
||||
if (matcher.matches()) {
|
||||
if (snippet == null) {
|
||||
throw new InvalidUserDataException("TESTRESPONSE not " +
|
||||
"paired with a snippet at $file:$lineNumber")
|
||||
}
|
||||
snippet.testResponse = true
|
||||
if (matcher.group(2) != null) {
|
||||
if (substitutions == null) {
|
||||
substitutions = []
|
||||
}
|
||||
String loc = "$file:$lineNumber"
|
||||
parse(loc, matcher.group(2), /$SUBSTITUTION ?/) {
|
||||
substitutions.add([it.group(1), it.group(2)])
|
||||
}
|
||||
}
|
||||
return
|
||||
}
|
||||
if (line ==~ /\/\/\s*TESTSETUP\s*/) {
|
||||
snippet.testSetup = true
|
||||
return
|
||||
}
|
||||
if (snippet == null) {
|
||||
// Outside
|
||||
return
|
||||
}
|
||||
if (snippet.end == Snippet.NOT_FINISHED) {
|
||||
// Inside
|
||||
if (contents == null) {
|
||||
contents = new StringBuilder()
|
||||
}
|
||||
// We don't need the annotations
|
||||
line = line.replaceAll(/<\d+>/, '')
|
||||
// Nor any trailing spaces
|
||||
line = line.replaceAll(/\s+$/, '')
|
||||
contents.append(line).append('\n')
|
||||
return
|
||||
}
|
||||
// Just finished
|
||||
emit()
|
||||
}
|
||||
if (snippet != null) emit()
|
||||
}
|
||||
}
|
||||
|
||||
static class Snippet {
|
||||
static final int NOT_FINISHED = -1
|
||||
|
||||
/**
|
||||
* Path to the file containing this snippet. Relative to docs.dir of the
|
||||
* SnippetsTask that created it.
|
||||
*/
|
||||
Path path
|
||||
int start
|
||||
int end = NOT_FINISHED
|
||||
String contents
|
||||
|
||||
boolean console = false
|
||||
boolean test = false
|
||||
boolean testResponse = false
|
||||
boolean testSetup = false
|
||||
String skipTest = null
|
||||
boolean continued = false
|
||||
String language = null
|
||||
String catchPart = null
|
||||
String setup = null
|
||||
|
||||
@Override
|
||||
public String toString() {
|
||||
String result = "$path[$start:$end]"
|
||||
if (language != null) {
|
||||
result += "($language)"
|
||||
}
|
||||
if (console) {
|
||||
result += '// CONSOLE'
|
||||
}
|
||||
if (test) {
|
||||
result += '// TEST'
|
||||
if (catchPart) {
|
||||
result += "[catch: $catchPart]"
|
||||
}
|
||||
if (skipTest) {
|
||||
result += "[skip=$skipTest]"
|
||||
}
|
||||
if (continued) {
|
||||
result += '[continued]'
|
||||
}
|
||||
if (setup) {
|
||||
result += "[setup:$setup]"
|
||||
}
|
||||
}
|
||||
if (testResponse) {
|
||||
result += '// TESTRESPONSE'
|
||||
}
|
||||
if (testSetup) {
|
||||
result += '// TESTSETUP'
|
||||
}
|
||||
return result
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Repeatedly match the pattern to the string, calling the closure with the
|
||||
* matchers each time there is a match. If there are characters that don't
|
||||
* match then blow up. If the closure takes two parameters then the second
|
||||
* one is "is this the last match?".
|
||||
*/
|
||||
protected parse(String location, String s, String pattern, Closure c) {
|
||||
if (s == null) {
|
||||
return // Silly null, only real stuff gets to match!
|
||||
}
|
||||
Matcher m = s =~ pattern
|
||||
int offset = 0
|
||||
Closure extraContent = { message ->
|
||||
StringBuilder cutOut = new StringBuilder()
|
||||
cutOut.append(s[offset - 6..offset - 1])
|
||||
cutOut.append('*')
|
||||
cutOut.append(s[offset..Math.min(offset + 5, s.length() - 1)])
|
||||
String cutOutNoNl = cutOut.toString().replace('\n', '\\n')
|
||||
throw new InvalidUserDataException("$location: Extra content "
|
||||
+ "$message ('$cutOutNoNl') matching [$pattern]: $s")
|
||||
}
|
||||
while (m.find()) {
|
||||
if (m.start() != offset) {
|
||||
extraContent("between [$offset] and [${m.start()}]")
|
||||
}
|
||||
offset = m.end()
|
||||
if (c.maximumNumberOfParameters == 1) {
|
||||
c(m)
|
||||
} else {
|
||||
c(m, offset == s.length())
|
||||
}
|
||||
}
|
||||
if (offset == 0) {
|
||||
throw new InvalidUserDataException("$location: Didn't match "
|
||||
+ "$pattern: $s")
|
||||
}
|
||||
if (offset != s.length()) {
|
||||
extraContent("after [$offset]")
|
||||
}
|
||||
}
|
||||
}
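// For orientation (an illustrative sketch, not taken from the docs): the
// parser above recognizes asciidoc snippets shaped like the following, where
// the trailing comment markers (// CONSOLE, // TEST[...], // TESTRESPONSE[...],
// // TESTSETUP) drive the flags set on each Snippet:
/*
[source,js]
----
GET /_count
----
// CONSOLE
// TEST[continued]
*/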
|
PluginBuildPlugin.groovy
@@ -18,11 +18,13 @@
|
||||
*/
|
||||
package org.elasticsearch.gradle.plugin
|
||||
|
||||
import nebula.plugin.publishing.maven.MavenBasePublishPlugin
|
||||
import nebula.plugin.publishing.maven.MavenManifestPlugin
|
||||
import nebula.plugin.publishing.maven.MavenScmPlugin
|
||||
import org.elasticsearch.gradle.BuildPlugin
|
||||
import org.elasticsearch.gradle.test.RestIntegTestTask
|
||||
import org.elasticsearch.gradle.test.RunTask
|
||||
import org.gradle.api.Project
|
||||
import org.gradle.api.artifacts.Dependency
|
||||
import org.gradle.api.tasks.SourceSet
|
||||
import org.gradle.api.tasks.bundling.Zip
|
||||
|
||||
@ -50,6 +52,7 @@ public class PluginBuildPlugin extends BuildPlugin {
|
||||
} else {
|
||||
project.integTest.clusterConfig.plugin(name, project.bundlePlugin.outputs.files)
|
||||
project.tasks.run.clusterConfig.plugin(name, project.bundlePlugin.outputs.files)
|
||||
addPomGeneration(project)
|
||||
}
|
||||
|
||||
project.namingConventions {
|
||||
@ -125,4 +128,32 @@ public class PluginBuildPlugin extends BuildPlugin {
|
||||
project.configurations.getByName('default').extendsFrom = []
|
||||
project.artifacts.add('default', bundle)
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds the plugin jar and zip as publications.
|
||||
*/
|
||||
protected static void addPomGeneration(Project project) {
|
||||
project.plugins.apply(MavenBasePublishPlugin.class)
|
||||
project.plugins.apply(MavenScmPlugin.class)
|
||||
|
||||
project.publishing {
|
||||
publications {
|
||||
nebula {
|
||||
artifact project.bundlePlugin
|
||||
pom.withXml {
|
||||
// overwrite the name/description in the pom nebula set up
|
||||
Node root = asNode()
|
||||
for (Node node : root.children()) {
|
||||
if (node.name() == 'name') {
|
||||
node.setValue(project.pluginProperties.extension.name)
|
||||
} else if (node.name() == 'description') {
|
||||
node.setValue(project.pluginProperties.extension.description)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
}
|
||||
|
PluginPropertiesTask.groovy
@@ -57,11 +57,13 @@ class PluginPropertiesTask extends Copy {
|
||||
// configure property substitution
|
||||
from(templateFile)
|
||||
into(generatedResourcesDir)
|
||||
expand(generateSubstitutions())
|
||||
Map<String, String> properties = generateSubstitutions()
|
||||
expand(properties)
|
||||
inputs.properties(properties)
|
||||
}
|
||||
}
|
||||
|
||||
Map generateSubstitutions() {
|
||||
Map<String, String> generateSubstitutions() {
|
||||
def stringSnap = { version ->
|
||||
if (version.endsWith("-SNAPSHOT")) {
|
||||
return version.substring(0, version.length() - 9)
|
||||
|
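The point of this change is that expand() alone does not make the substitution values part of the task's inputs, so editing them would not re-run the copy. Registering the map with inputs.properties fixes the up-to-date check. A minimal, self-contained sketch of the same pattern in an arbitrary Copy task (paths and keys invented):

    task generateDescriptor(type: Copy) {
        Map<String, String> substitutions = [version: project.version.toString()]
        from 'src/templates/plugin-descriptor.properties'
        into "$buildDir/generated-resources"
        expand(substitutions)
        inputs.properties(substitutions)  // changing a value now invalidates the task
    }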
@@ -21,7 +21,6 @@ package org.elasticsearch.gradle.precommit

import org.elasticsearch.gradle.LoggedExec
import org.gradle.api.file.FileCollection
import org.gradle.api.tasks.InputFile
import org.gradle.api.tasks.OutputFile

/**
@@ -35,14 +34,12 @@ public class JarHellTask extends LoggedExec {
     * inputs (ie the jars/class files).
     */
    @OutputFile
    public File successMarker = new File(project.buildDir, 'markers/jarHell')

    /** The classpath to run jarhell check on, defaults to the test runtime classpath */
    @InputFile
    public FileCollection classpath = project.sourceSets.test.runtimeClasspath
    File successMarker = new File(project.buildDir, 'markers/jarHell')

    public JarHellTask() {
        project.afterEvaluate {
            FileCollection classpath = project.sourceSets.test.runtimeClasspath
            inputs.files(classpath)
            dependsOn(classpath)
            description = "Runs CheckJarHell on ${classpath}"
            executable = new File(project.javaHome, 'bin/java')
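Like several tasks in this change, JarHellTask relies on a "marker" file so that a task with no natural file output can still take part in Gradle's up-to-date checking. A generic sketch of that idiom, with an invented task and file name:

    task checkSomething {
        File marker = new File(buildDir, 'markers/checkSomething')
        inputs.files(configurations.testRuntime)   // whatever the check actually reads
        outputs.file(marker)
        doLast {
            // ... run the check, throw on failure ...
            marker.parentFile.mkdirs()
            marker.setText('', 'UTF-8')            // touch the marker only on success
        }
    }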
@@ -22,6 +22,8 @@ import org.apache.rat.anttasks.Report
import org.apache.rat.anttasks.SubstringLicenseMatcher
import org.apache.rat.license.SimpleLicenseFamily
import org.elasticsearch.gradle.AntTask
import org.gradle.api.file.FileCollection
import org.gradle.api.tasks.OutputFile
import org.gradle.api.tasks.SourceSet

import java.nio.file.Files
@@ -33,8 +35,22 @@ import java.nio.file.Files
 */
public class LicenseHeadersTask extends AntTask {

    @OutputFile
    File reportFile = new File(project.buildDir, 'reports/licenseHeaders/rat.log')

    /**
     * The list of java files to check. protected so the afterEvaluate closure in the
     * constructor can write to it.
     */
    protected List<FileCollection> javaFiles

    LicenseHeadersTask() {
        description = "Checks sources for missing, incorrect, or unacceptable license headers"
        // Delay resolving the dependencies until after evaluation so we pick up generated sources
        project.afterEvaluate {
            javaFiles = project.sourceSets.collect({it.allJava})
            inputs.files(javaFiles)
        }
    }

    @Override
@@ -43,17 +59,13 @@ public class LicenseHeadersTask extends AntTask {
        ant.project.addDataTypeDefinition('substringMatcher', SubstringLicenseMatcher)
        ant.project.addDataTypeDefinition('approvedLicense', SimpleLicenseFamily)

        // create a file for the log to go to under reports/
        File reportDir = new File(project.buildDir, "reports/licenseHeaders")
        reportDir.mkdirs()
        File reportFile = new File(reportDir, "rat.log")
        Files.deleteIfExists(reportFile.toPath())

        // run rat, going to the file
        List<FileCollection> input = javaFiles
        ant.ratReport(reportFile: reportFile.absolutePath, addDefaultLicenseMatchers: true) {
            // checks all the java sources (allJava)
            for (SourceSet set : project.sourceSets) {
                for (File dir : set.allJava.srcDirs) {
            for (FileCollection dirSet : input) {
                for (File dir: dirSet.srcDirs) {
                    // sometimes these dirs don't exist, e.g. site-plugin has no actual java src/main...
                    if (dir.exists()) {
                        ant.fileset(dir: dir)
@@ -85,12 +97,12 @@ public class LicenseHeadersTask extends AntTask {
                // parsers generated by antlr
                pattern(substring: "ANTLR GENERATED CODE")
            }

            // approved categories
            approvedLicense(familyName: "Apache")
            approvedLicense(familyName: "Generated")
            approvedLicense(familyName: "Generated")
        }

        // check the license file for any errors, this should be fast.
        boolean zeroUnknownLicenses = false
        boolean foundProblemsWithFiles = false
@@ -98,12 +110,12 @@ public class LicenseHeadersTask extends AntTask {
            if (line.startsWith("0 Unknown Licenses")) {
                zeroUnknownLicenses = true
            }

            if (line.startsWith(" !")) {
                foundProblemsWithFiles = true
            }
        }

        if (zeroUnknownLicenses == false || foundProblemsWithFiles) {
            // print the unapproved license section, usually its all you need to fix problems.
            int sectionNumber = 0
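In the Rat report above, substring matchers assign files to a license family and approvedLicense marks that family as acceptable, so matching files are not counted as unknown licenses. A compressed sketch of that pairing using the same Ant data types the task registers; the attribute values are illustrative, not the build's full list:

    ant.ratReport(reportFile: reportFile.absolutePath, addDefaultLicenseMatchers: true) {
        fileset(dir: 'src/main/java')
        substringMatcher(licenseFamilyCategory: 'GEN  ', licenseFamilyName: 'Generated') {
            pattern(substring: 'ANTLR GENERATED CODE')
        }
        approvedLicense(familyName: 'Generated')
    }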
@@ -68,6 +68,7 @@ public class NamingConventionsTask extends LoggedExec {
         */
        project.afterEvaluate {
            doFirst {
                args('-Djna.nosys=true')
                args('-cp', classpath.asPath, 'org.elasticsearch.test.NamingConventionsCheck')
                if (skipIntegTestInDisguise) {
                    args('--skip-integ-tests-in-disguise')
@@ -87,16 +87,43 @@ class PrecommitTasks {
    }

    private static Task configureCheckstyle(Project project) {
        // Always copy the checkstyle configuration files to 'buildDir/checkstyle' since the resources could be located in a jar
        // file. If the resources are located in a jar, Gradle will fail when it tries to turn the URL into a file
        URL checkstyleConfUrl = PrecommitTasks.getResource("/checkstyle.xml")
        URL checkstyleSuppressionsUrl = PrecommitTasks.getResource("/checkstyle_suppressions.xml")
        File checkstyleDir = new File(project.buildDir, "checkstyle")
        File checkstyleSuppressions = new File(checkstyleDir, "checkstyle_suppressions.xml")
        File checkstyleConf = new File(checkstyleDir, "checkstyle.xml");
        Task copyCheckstyleConf = project.tasks.create("copyCheckstyleConf")

        // configure inputs and outputs so up to date works properly
        copyCheckstyleConf.outputs.files(checkstyleSuppressions, checkstyleConf)
        if ("jar".equals(checkstyleConfUrl.getProtocol())) {
            JarURLConnection jarURLConnection = (JarURLConnection) checkstyleConfUrl.openConnection()
            copyCheckstyleConf.inputs.file(jarURLConnection.getJarFileURL())
        } else if ("file".equals(checkstyleConfUrl.getProtocol())) {
            copyCheckstyleConf.inputs.files(checkstyleConfUrl.getFile(), checkstyleSuppressionsUrl.getFile())
        }

        copyCheckstyleConf.doLast {
            checkstyleDir.mkdirs()
            // withStream will close the output stream and IOGroovyMethods#getBytes reads the InputStream fully and closes it
            new FileOutputStream(checkstyleConf).withStream {
                it.write(checkstyleConfUrl.openStream().getBytes())
            }
            new FileOutputStream(checkstyleSuppressions).withStream {
                it.write(checkstyleSuppressionsUrl.openStream().getBytes())
            }
        }

        Task checkstyleTask = project.tasks.create('checkstyle')
        // Apply the checkstyle plugin to create `checkstyleMain` and `checkstyleTest`. It only
        // creates them if there is main or test code to check and it makes `check` depend
        // on them. But we want `precommit` to depend on `checkstyle` which depends on them so
        // we have to swap them.
        project.pluginManager.apply('checkstyle')
        URL checkstyleSuppressions = PrecommitTasks.getResource('/checkstyle_suppressions.xml')
        project.checkstyle {
            config = project.resources.text.fromFile(
                PrecommitTasks.getResource('/checkstyle.xml'), 'UTF-8')
            config = project.resources.text.fromFile(checkstyleConf, 'UTF-8')
            configProperties = [
                suppressions: checkstyleSuppressions
            ]
@@ -106,6 +133,7 @@ class PrecommitTasks {
            if (task != null) {
                project.tasks['check'].dependsOn.remove(task)
                checkstyleTask.dependsOn(task)
                task.dependsOn(copyCheckstyleConf)
                task.inputs.file(checkstyleSuppressions)
            }
        }
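The copy step exists because the checkstyle configuration may be loaded from inside the build-tools jar, and a jar: URL cannot be handed to tooling that expects a plain file. The underlying idiom is simply streaming a classpath resource into a real file first; a standalone sketch of just that piece (resource and target paths illustrative):

    URL resource = getClass().getResource('/checkstyle.xml')          // may point inside a jar
    File target = new File(buildDir, 'checkstyle/checkstyle.xml')
    target.parentFile.mkdirs()
    new FileOutputStream(target).withStream { out ->
        out.write(resource.openStream().getBytes())                   // Groovy reads and closes the stream
    }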
@@ -27,6 +27,9 @@ import org.apache.tools.ant.Project;
import org.elasticsearch.gradle.AntTask;
import org.gradle.api.artifacts.Configuration;
import org.gradle.api.file.FileCollection;
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.InputFiles
import org.gradle.api.tasks.OutputFile

import java.nio.file.FileVisitResult;
import java.nio.file.Files;
@@ -40,17 +43,62 @@ import java.util.regex.Pattern;
 * Basic static checking to keep tabs on third party JARs
 */
public class ThirdPartyAuditTask extends AntTask {

    // patterns for classes to exclude, because we understand their issues
    private String[] excludes = new String[0];

    private List<String> excludes = [];

    /**
     * Input for the task. See javadoc for {@link getJars} for more. Protected
     * so the afterEvaluate closure in the constructor can write it.
     */
    protected FileCollection jars;

    /**
     * Classpath against which to run the third party audit. Protected so the
     * afterEvaluate closure in the constructor can write it.
     */
    protected FileCollection classpath;

    /**
     * We use a simple "marker" file that we touch when the task succeeds
     * as the task output. This is compared against the modified time of the
     * inputs (ie the jars/class files).
     */
    @OutputFile
    File successMarker = new File(project.buildDir, 'markers/thirdPartyAudit')

    ThirdPartyAuditTask() {
        // we depend on this because its the only reliable configuration
        // this probably makes the build slower: gradle you suck here when it comes to configurations, you pay the price.
        dependsOn(project.configurations.testCompile);
        description = "Checks third party JAR bytecode for missing classes, use of internal APIs, and other horrors'";

        project.afterEvaluate {
            Configuration configuration = project.configurations.findByName('runtime');
            if (configuration == null) {
                // some projects apparently do not have 'runtime'? what a nice inconsistency,
                // basically only serves to waste time in build logic!
                configuration = project.configurations.findByName('testCompile');
            }
            assert configuration != null;
            classpath = configuration

            // we only want third party dependencies.
            jars = configuration.fileCollection({ dependency ->
                dependency.group.startsWith("org.elasticsearch") == false
            });

            // we don't want provided dependencies, which we have already scanned. e.g. don't
            // scan ES core's dependencies for every single plugin
            Configuration provided = project.configurations.findByName('provided')
            if (provided != null) {
                jars -= provided
            }
            inputs.files(jars)
            onlyIf { jars.isEmpty() == false }
        }
    }

    /**
     * classes that should be excluded from the scan,
     * e.g. because we know what sheisty stuff those particular classes are up to.
@@ -61,21 +109,22 @@ public class ThirdPartyAuditTask extends AntTask {
                throw new IllegalArgumentException("illegal third party audit exclusion: '" + s + "', wildcards are not permitted!");
            }
        }
        excludes = classes;
        excludes = classes.sort();
    }

    /**
     * Returns current list of exclusions.
     */
    public String[] getExcludes() {
    @Input
    public List<String> getExcludes() {
        return excludes;
    }

    // yes, we parse Uwe Schindler's errors to find missing classes, and to keep a continuous audit. Just don't let him know!
    static final Pattern MISSING_CLASS_PATTERN =
        Pattern.compile(/WARNING: The referenced class '(.*)' cannot be loaded\. Please fix the classpath\!/);

    static final Pattern VIOLATION_PATTERN =

    static final Pattern VIOLATION_PATTERN =
        Pattern.compile(/\s\sin ([a-zA-Z0-9\$\.]+) \(.*\)/);

    // we log everything and capture errors and handle them with our whitelist
@@ -124,32 +173,8 @@ public class ThirdPartyAuditTask extends AntTask {

    @Override
    protected void runAnt(AntBuilder ant) {
        Configuration configuration = project.configurations.findByName('runtime');
        if (configuration == null) {
            // some projects apparently do not have 'runtime'? what a nice inconsistency,
            // basically only serves to waste time in build logic!
            configuration = project.configurations.findByName('testCompile');
        }
        assert configuration != null;
        ant.project.addTaskDefinition('thirdPartyAudit', de.thetaphi.forbiddenapis.ant.AntTask);

        // we only want third party dependencies.
        FileCollection jars = configuration.fileCollection({ dependency ->
            dependency.group.startsWith("org.elasticsearch") == false
        });

        // we don't want provided dependencies, which we have already scanned. e.g. don't
        // scan ES core's dependencies for every single plugin
        Configuration provided = project.configurations.findByName('provided');
        if (provided != null) {
            jars -= provided;
        }

        // no dependencies matched, we are done
        if (jars.isEmpty()) {
            return;
        }

        // print which jars we are going to scan, always
        // this is not the time to try to be succinct! Forbidden will print plenty on its own!
        Set<String> names = new TreeSet<>();
@@ -171,26 +196,23 @@ public class ThirdPartyAuditTask extends AntTask {
        }

        // convert exclusion class names to binary file names
        String[] excludedFiles = new String[excludes.length];
        for (int i = 0; i < excludes.length; i++) {
            excludedFiles[i] = excludes[i].replace('.', '/') + ".class";
        }
        Set<String> excludedSet = new TreeSet<>(Arrays.asList(excludedFiles));
        List<String> excludedFiles = excludes.collect {it.replace('.', '/') + ".class"}
        Set<String> excludedSet = new TreeSet<>(excludedFiles);

        // jarHellReprise
        Set<String> sheistySet = getSheistyClasses(tmpDir.toPath());

        try {
        try {
            ant.thirdPartyAudit(internalRuntimeForbidden: false,
                                failOnUnsupportedJava: false,
                                failOnMissingClasses: false,
                                signaturesFile: new File(getClass().getResource('/forbidden/third-party-audit.txt').toURI()),
                                classpath: configuration.asPath) {
                                classpath: classpath.asPath) {
                fileset(dir: tmpDir)
            }
        } catch (BuildException ignore) {}

        EvilLogger evilLogger = null;
        EvilLogger evilLogger = null;
        for (BuildListener listener : ant.project.getBuildListeners()) {
            if (listener instanceof EvilLogger) {
                evilLogger = (EvilLogger) listener;
@@ -228,6 +250,8 @@ public class ThirdPartyAuditTask extends AntTask {

        // clean up our mess (if we succeed)
        ant.delete(dir: tmpDir.getAbsolutePath());

        successMarker.setText("", 'UTF-8')
    }

    /**
@@ -235,11 +259,11 @@ public class ThirdPartyAuditTask extends AntTask {
     */
    private Set<String> getSheistyClasses(Path root) {
        // system.parent = extensions loader.
        // note: for jigsaw, this evilness will need modifications (e.g. use jrt filesystem!).
        // note: for jigsaw, this evilness will need modifications (e.g. use jrt filesystem!).
        // but groovy/gradle needs to work at all first!
        ClassLoader ext = ClassLoader.getSystemClassLoader().getParent();
        assert ext != null;

        Set<String> sheistySet = new TreeSet<>();
        Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
            @Override
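With the exclusions now sorted and exposed through an @Input-annotated getter, editing the list invalidates the task instead of silently reusing a stale result. A rough sketch of how a consuming build script might set it; the class names are placeholders, not real exclusions:

    thirdPartyAudit.excludes = [
        'com.example.shaded.UnsafeHolder',      // understood: uses internal JDK APIs
        'com.example.legacy.MissingDependency', // understood: optional dependency not on the classpath
    ]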
@@ -258,7 +258,7 @@ class ClusterFormationTasks {
                'path.repo' : "${node.sharedDir}/repo",
                'path.shared_data' : "${node.sharedDir}/",
                // Define a node attribute so we can test that it exists
                'node.testattr' : 'test',
                'node.attr.testattr' : 'test',
                'repositories.url.allowed_urls': 'http://snapshot.test*'
        ]
        esConfig['http.port'] = node.config.httpPort
@@ -391,6 +391,22 @@ class ClusterFormationTasks {
        return configureExecTask(name, project, setup, node, args)
    }

    /** Wrapper for command line argument: surrounds comma with double quotes **/
    private static class EscapeCommaWrapper {

        Object arg

        public String toString() {
            String s = arg.toString()

            // Surround strings that contain a comma with double quotes
            if (s.indexOf(',') != -1) {
                return "\"${s}\""
            }
            return s
        }
    }

    /** Adds a task to execute a command to help setup the cluster */
    static Task configureExecTask(String name, Project project, Task setup, NodeInfo node, Object[] execArgs) {
        return project.tasks.create(name: name, type: LoggedExec, dependsOn: setup) {
@@ -398,10 +414,12 @@ class ClusterFormationTasks {
            if (Os.isFamily(Os.FAMILY_WINDOWS)) {
                executable 'cmd'
                args '/C', 'call'
                // On Windows the comma character is considered a parameter separator:
                // arguments are wrapped in an ExecArgWrapper that escapes commas
                args execArgs.collect { a -> new EscapeCommaWrapper(arg: a) }
            } else {
                executable 'sh'
                commandLine execArgs
            }
            args execArgs
        }
    }

@@ -432,7 +450,7 @@ class ClusterFormationTasks {
        // gradle task options are not processed until the end of the configuration phase
        if (node.config.debug) {
            println 'Running elasticsearch in debug mode, suspending until connected on port 8000'
            node.env['JAVA_OPTS'] = '-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000'
            node.env['ES_JAVA_OPTS'] = '-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000'
        }

        node.getCommandString().eachLine { line -> logger.info(line) }
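EscapeCommaWrapper only changes how an argument renders when the command line is turned into a string, which is enough to keep Windows' cmd from treating commas as argument separators. A tiny self-contained illustration of the intended behaviour:

    class EscapeCommaWrapper {
        Object arg
        String toString() {
            String s = arg.toString()
            return s.contains(',') ? "\"${s}\"" : s
        }
    }

    assert new EscapeCommaWrapper(arg: 'foo,bar').toString() == '"foo,bar"'
    assert new EscapeCommaWrapper(arg: 'plain').toString() == 'plain'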
@@ -19,7 +19,6 @@
package org.elasticsearch.gradle.test

import org.apache.tools.ant.taskdefs.condition.Os
import org.elasticsearch.gradle.VersionProperties
import org.gradle.api.InvalidUserDataException
import org.gradle.api.Project
import org.gradle.api.Task
@@ -129,18 +128,18 @@ class NodeInfo {
            args.add("${esScript}")
        }

        env = [
            'JAVA_HOME' : project.javaHome,
            'ES_GC_OPTS': config.jvmArgs // we pass these with the undocumented gc opts so the argline can set gc, etc
        ]
        env = [ 'JAVA_HOME' : project.javaHome ]
        args.addAll("-E", "es.node.portsfile=true")
        env.put('ES_JAVA_OPTS', config.systemProperties.collect { key, value -> "-D${key}=${value}" }.join(" "))
        String collectedSystemProperties = config.systemProperties.collect { key, value -> "-D${key}=${value}" }.join(" ")
        String esJavaOpts = config.jvmArgs.isEmpty() ? collectedSystemProperties : collectedSystemProperties + " " + config.jvmArgs
        env.put('ES_JAVA_OPTS', esJavaOpts)
        for (Map.Entry<String, String> property : System.properties.entrySet()) {
            if (property.getKey().startsWith('es.')) {
                args.add("-E")
                args.add("${property.getKey()}=${property.getValue()}")
            }
        }
        env.put('ES_JVM_OPTIONS', new File(confDir, 'jvm.options'))
        args.addAll("-E", "es.path.conf=${confDir}")
        if (Os.isFamily(Os.FAMILY_WINDOWS)) {
            args.add('"') // end the entire command, quoted
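The NodeInfo change folds the cluster's system properties and any configured JVM arguments into a single ES_JAVA_OPTS value instead of the old ES_GC_OPTS split. A small worked example of the merge; keys and values are invented:

    Map<String, String> systemProperties = ['es.foo': 'bar', 'tests.seed': 'ABC123']
    String jvmArgs = '-Xmx512m'

    String collected = systemProperties.collect { key, value -> "-D${key}=${value}" }.join(' ')
    String esJavaOpts = jvmArgs.isEmpty() ? collected : collected + ' ' + jvmArgs
    assert esJavaOpts == '-Des.foo=bar -Dtests.seed=ABC123 -Xmx512m'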
@@ -19,6 +19,7 @@
package org.elasticsearch.gradle.vagrant

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.TaskAction
import org.gradle.logging.ProgressLoggerFactory
import org.gradle.process.internal.ExecAction
@@ -30,41 +31,22 @@ import javax.inject.Inject
 * Runs bats over vagrant. Pretty much like running it using Exec but with a
 * nicer output formatter.
 */
class BatsOverVagrantTask extends DefaultTask {
    String command
    String boxName
    ExecAction execAction
public class BatsOverVagrantTask extends VagrantCommandTask {

    BatsOverVagrantTask() {
        execAction = getExecActionFactory().newExecAction()
    }
    @Input
    String command

    @Inject
    ProgressLoggerFactory getProgressLoggerFactory() {
        throw new UnsupportedOperationException();
    }
    BatsOverVagrantTask() {
        project.afterEvaluate {
            args 'ssh', boxName, '--command', command
        }
    }

    @Inject
    ExecActionFactory getExecActionFactory() {
        throw new UnsupportedOperationException();
    }

    void boxName(String boxName) {
        this.boxName = boxName
    }

    void command(String command) {
        this.command = command
    }

    @TaskAction
    void exec() {
        // It'd be nice if --machine-readable were, well, nice
        execAction.commandLine(['vagrant', 'ssh', boxName, '--command', command])
        execAction.setStandardOutput(new TapLoggerOutputStream(
            command: command,
            factory: getProgressLoggerFactory(),
            logger: logger))
        execAction.execute();
    }
    @Override
    protected OutputStream createLoggerOutputStream() {
        return new TapLoggerOutputStream(
            command: commandLine.join(' '),
            factory: getProgressLoggerFactory(),
            logger: logger)
    }
}
@@ -19,9 +19,11 @@
package org.elasticsearch.gradle.vagrant

import com.carrotsearch.gradle.junit4.LoggingOutputStream
import groovy.transform.PackageScope
import org.gradle.api.GradleScriptException
import org.gradle.api.logging.Logger
import org.gradle.logging.ProgressLogger
import org.gradle.logging.ProgressLoggerFactory

import java.util.regex.Matcher

@@ -35,73 +37,77 @@ import java.util.regex.Matcher
 * There is a Tap4j project but we can't use it because it wants to parse the
 * entire TAP stream at once and won't parse it stream-wise.
 */
class TapLoggerOutputStream extends LoggingOutputStream {
    ProgressLogger progressLogger
    Logger logger
    int testsCompleted = 0
    int testsFailed = 0
    int testsSkipped = 0
    Integer testCount
    String countsFormat
public class TapLoggerOutputStream extends LoggingOutputStream {
    private final ProgressLogger progressLogger
    private boolean isStarted = false
    private final Logger logger
    private int testsCompleted = 0
    private int testsFailed = 0
    private int testsSkipped = 0
    private Integer testCount
    private String countsFormat

    TapLoggerOutputStream(Map args) {
        logger = args.logger
        progressLogger = args.factory.newOperation(VagrantLoggerOutputStream)
        progressLogger.setDescription("TAP output for $args.command")
        progressLogger.started()
        progressLogger.progress("Starting $args.command...")
    }

    void flush() {
        if (end == start) return
        line(new String(buffer, start, end - start))
        start = end
    }

    void line(String line) {
        // System.out.print "===> $line\n"
        if (testCount == null) {
            try {
                testCount = line.split('\\.').last().toInteger()
                def length = (testCount as String).length()
                countsFormat = "%0${length}d"
                countsFormat = "[$countsFormat|$countsFormat|$countsFormat/$countsFormat]"
                return
            } catch (Exception e) {
                throw new GradleScriptException(
                    'Error parsing first line of TAP stream!!', e)
            }
        }
        Matcher m = line =~ /(?<status>ok|not ok) \d+(?<skip> # skip (?<skipReason>\(.+\))?)? \[(?<suite>.+)\] (?<test>.+)/
        if (!m.matches()) {
            /* These might be failure report lines or comments or whatever. Its hard
               to tell and it doesn't matter. */
            logger.warn(line)
            return
        }
        boolean skipped = m.group('skip') != null
        boolean success = !skipped && m.group('status') == 'ok'
        String skipReason = m.group('skipReason')
        String suiteName = m.group('suite')
        String testName = m.group('test')

        String status
        if (skipped) {
            status = "SKIPPED"
            testsSkipped++
        } else if (success) {
            status = " OK"
            testsCompleted++
        } else {
            status = " FAILED"
            testsFailed++
    TapLoggerOutputStream(Map args) {
        logger = args.logger
        progressLogger = args.factory.newOperation(VagrantLoggerOutputStream)
        progressLogger.setDescription("TAP output for `${args.command}`")
    }

        String counts = sprintf(countsFormat,
            [testsCompleted, testsFailed, testsSkipped, testCount])
        progressLogger.progress("Tests $counts, $status [$suiteName] $testName")
        if (!success) {
            logger.warn(line)
    @Override
    public void flush() {
        if (isStarted == false) {
            progressLogger.started()
            isStarted = true
        }
        if (end == start) return
        line(new String(buffer, start, end - start))
        start = end
    }

    void line(String line) {
        // System.out.print "===> $line\n"
        if (testCount == null) {
            try {
                testCount = line.split('\\.').last().toInteger()
                def length = (testCount as String).length()
                countsFormat = "%0${length}d"
                countsFormat = "[$countsFormat|$countsFormat|$countsFormat/$countsFormat]"
                return
            } catch (Exception e) {
                throw new GradleScriptException(
                    'Error parsing first line of TAP stream!!', e)
            }
        }
        Matcher m = line =~ /(?<status>ok|not ok) \d+(?<skip> # skip (?<skipReason>\(.+\))?)? \[(?<suite>.+)\] (?<test>.+)/
        if (!m.matches()) {
            /* These might be failure report lines or comments or whatever. Its hard
               to tell and it doesn't matter. */
            logger.warn(line)
            return
        }
        boolean skipped = m.group('skip') != null
        boolean success = !skipped && m.group('status') == 'ok'
        String skipReason = m.group('skipReason')
        String suiteName = m.group('suite')
        String testName = m.group('test')

        String status
        if (skipped) {
            status = "SKIPPED"
            testsSkipped++
        } else if (success) {
            status = " OK"
            testsCompleted++
        } else {
            status = " FAILED"
            testsFailed++
        }

        String counts = sprintf(countsFormat,
            [testsCompleted, testsFailed, testsSkipped, testCount])
        progressLogger.progress("Tests $counts, $status [$suiteName] $testName")
        if (!success) {
            logger.warn(line)
        }
    }
}
}
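The named-group regex above is what classifies each TAP line into status, optional skip reason, suite, and test name. A quick worked example against an invented TAP line:

    import java.util.regex.Matcher

    String sample = 'ok 42 # skip (requires systemd) [packaging] starts elasticsearch on boot'
    Matcher m = sample =~ /(?<status>ok|not ok) \d+(?<skip> # skip (?<skipReason>\(.+\))?)? \[(?<suite>.+)\] (?<test>.+)/
    assert m.matches()
    assert m.group('status') == 'ok'
    assert m.group('skipReason') == '(requires systemd)'
    assert m.group('suite') == 'packaging'
    assert m.group('test') == 'starts elasticsearch on boot'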
@@ -18,11 +18,10 @@
 */
package org.elasticsearch.gradle.vagrant

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction
import org.apache.commons.io.output.TeeOutputStream
import org.elasticsearch.gradle.LoggedExec
import org.gradle.api.tasks.Input
import org.gradle.logging.ProgressLoggerFactory
import org.gradle.process.internal.ExecAction
import org.gradle.process.internal.ExecActionFactory

import javax.inject.Inject

@@ -30,43 +29,30 @@ import javax.inject.Inject
 * Runs a vagrant command. Pretty much like Exec task but with a nicer output
 * formatter and defaults to `vagrant` as first part of commandLine.
 */
class VagrantCommandTask extends DefaultTask {
    List<Object> commandLine
    String boxName
    ExecAction execAction
public class VagrantCommandTask extends LoggedExec {

    VagrantCommandTask() {
        execAction = getExecActionFactory().newExecAction()
    }
    @Input
    String boxName

    @Inject
    ProgressLoggerFactory getProgressLoggerFactory() {
        throw new UnsupportedOperationException();
    }
    public VagrantCommandTask() {
        executable = 'vagrant'
        project.afterEvaluate {
            // It'd be nice if --machine-readable were, well, nice
            standardOutput = new TeeOutputStream(standardOutput, createLoggerOutputStream())
        }
    }

    @Inject
    ExecActionFactory getExecActionFactory() {
        throw new UnsupportedOperationException();
    }
    protected OutputStream createLoggerOutputStream() {
        return new VagrantLoggerOutputStream(
            command: commandLine.join(' '),
            factory: getProgressLoggerFactory(),
            /* Vagrant tends to output a lot of stuff, but most of the important
               stuff starts with ==> $box */
            squashedPrefix: "==> $boxName: ")
    }

    void boxName(String boxName) {
        this.boxName = boxName
    }

    void commandLine(Object... commandLine) {
        this.commandLine = commandLine
    }

    @TaskAction
    void exec() {
        // It'd be nice if --machine-readable were, well, nice
        execAction.commandLine(['vagrant'] + commandLine)
        execAction.setStandardOutput(new VagrantLoggerOutputStream(
            command: commandLine.join(' '),
            factory: getProgressLoggerFactory(),
            /* Vagrant tends to output a lot of stuff, but most of the important
               stuff starts with ==> $box */
            squashedPrefix: "==> $boxName: "))
        execAction.execute();
    }
    @Inject
    ProgressLoggerFactory getProgressLoggerFactory() {
        throw new UnsupportedOperationException();
    }
}
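By extending LoggedExec and teeing standardOutput, the task keeps the raw vagrant output available to LoggedExec while also feeding it through the progress-logger formatter. A minimal sketch of the tee idea on its own, using the same commons-io class imported above; the box name and message are invented:

    import org.apache.commons.io.output.TeeOutputStream

    ByteArrayOutputStream captured = new ByteArrayOutputStream()
    OutputStream both = new TeeOutputStream(System.out, captured)
    both.write('==> testbox: Running provisioner\n'.bytes)
    both.flush()
    assert captured.toString().startsWith('==> testbox:')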
@@ -19,7 +19,9 @@
package org.elasticsearch.gradle.vagrant

import com.carrotsearch.gradle.junit4.LoggingOutputStream
import org.gradle.api.logging.Logger
import org.gradle.logging.ProgressLogger
import org.gradle.logging.ProgressLoggerFactory

/**
 * Adapts an OutputStream being written to by vagrant into a ProcessLogger. It
@@ -42,79 +44,60 @@ import org.gradle.logging.ProgressLogger
 * to catch so it can render the output like
 * "Heading text > stdout from the provisioner".
 */
class VagrantLoggerOutputStream extends LoggingOutputStream {
    static final String HEADING_PREFIX = '==> '
public class VagrantLoggerOutputStream extends LoggingOutputStream {
    private static final String HEADING_PREFIX = '==> '

    ProgressLogger progressLogger
    String squashedPrefix
    String lastLine = ''
    boolean inProgressReport = false
    String heading = ''
    private final ProgressLogger progressLogger
    private boolean isStarted = false
    private String squashedPrefix
    private String lastLine = ''
    private boolean inProgressReport = false
    private String heading = ''

    VagrantLoggerOutputStream(Map args) {
        progressLogger = args.factory.newOperation(VagrantLoggerOutputStream)
        progressLogger.setDescription("Vagrant $args.command")
        progressLogger.started()
        progressLogger.progress("Starting vagrant $args.command...")
        squashedPrefix = args.squashedPrefix
    }

    void flush() {
        if (end == start) return
        line(new String(buffer, start, end - start))
        start = end
    }

    void line(String line) {
        // debugPrintLine(line) // Uncomment me to log every incoming line
        if (line.startsWith('\r\u001b')) {
            /* We don't want to try to be a full terminal emulator but we want to
               keep the escape sequences from leaking and catch _some_ of the
               meaning. */
            line = line.substring(2)
            if ('[K' == line) {
                inProgressReport = true
            }
            return
    VagrantLoggerOutputStream(Map args) {
        progressLogger = args.factory.newOperation(VagrantLoggerOutputStream)
        progressLogger.setDescription("Vagrant output for `$args.command`")
        squashedPrefix = args.squashedPrefix
    }
        if (line.startsWith(squashedPrefix)) {
            line = line.substring(squashedPrefix.length())
            inProgressReport = false
            lastLine = line
            if (line.startsWith(HEADING_PREFIX)) {
                line = line.substring(HEADING_PREFIX.length())
                heading = line + ' > '
            } else {
                line = heading + line
            }
        } else if (inProgressReport) {
            inProgressReport = false
            line = lastLine + line
        } else {
            return
        }
        // debugLogLine(line) // Uncomment me to log every line we add to the logger
        progressLogger.progress(line)
    }

    void debugPrintLine(line) {
        System.out.print '----------> '
        for (int i = start; i < end; i++) {
            switch (buffer[i] as char) {
            case ' '..'~':
                System.out.print buffer[i] as char
                break
            default:
                System.out.print '%'
                System.out.print Integer.toHexString(buffer[i])
            }
    @Override
    public void flush() {
        if (isStarted == false) {
            progressLogger.started()
            isStarted = true
        }
        if (end == start) return
        line(new String(buffer, start, end - start))
        start = end
    }
        System.out.print '\n'
    }

    void debugLogLine(line) {
        System.out.print '>>>>>>>>>>> '
        System.out.print line
        System.out.print '\n'
    }
    void line(String line) {
        if (line.startsWith('\r\u001b')) {
            /* We don't want to try to be a full terminal emulator but we want to
               keep the escape sequences from leaking and catch _some_ of the
               meaning. */
            line = line.substring(2)
            if ('[K' == line) {
                inProgressReport = true
            }
            return
        }
        if (line.startsWith(squashedPrefix)) {
            line = line.substring(squashedPrefix.length())
            inProgressReport = false
            lastLine = line
            if (line.startsWith(HEADING_PREFIX)) {
                line = line.substring(HEADING_PREFIX.length())
                heading = line + ' > '
            } else {
                line = heading + line
            }
        } else if (inProgressReport) {
            inProgressReport = false
            line = lastLine + line
        } else {
            return
        }
        progressLogger.progress(line)
    }
}
@@ -0,0 +1,20 @@
#
# Licensed to Elasticsearch under one or more contributor
# license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright
# ownership. Elasticsearch licenses this file to you under
# the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#

implementation-class=org.elasticsearch.gradle.doc.DocsTestPlugin
@ -7,23 +7,20 @@
|
||||
<!-- On Windows, Checkstyle matches files using \ path separator -->
|
||||
|
||||
<!-- These files are generated by ANTLR so its silly to hold them to our rules. -->
|
||||
<suppress files="org[/\\]elasticsearch[/\\]painless[/\\]PainlessLexer\.java" checks="." />
|
||||
<suppress files="org[/\\]elasticsearch[/\\]painless[/\\]PainlessParser(|BaseVisitor|Visitor)\.java" checks="." />
|
||||
<suppress files="org[/\\]elasticsearch[/\\]painless[/\\]antlr[/\\]PainlessLexer\.java" checks="." />
|
||||
<suppress files="org[/\\]elasticsearch[/\\]painless[/\\]antlr[/\\]PainlessParser(|BaseVisitor|Visitor)\.java" checks="." />
|
||||
|
||||
<!-- Hopefully temporary suppression of LineLength on files that don't pass it. We should remove these when we the
|
||||
files start to pass. -->
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]apache[/\\]lucene[/\\]queries[/\\]BlendedTermQuery.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]apache[/\\]lucene[/\\]queries[/\\]ExtendedCommonTermsQuery.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]apache[/\\]lucene[/\\]queryparser[/\\]classic[/\\]MapperQueryParser.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]apache[/\\]lucene[/\\]search[/\\]postingshighlight[/\\]CustomPostingsHighlighter.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]apache[/\\]lucene[/\\]search[/\\]vectorhighlight[/\\]CustomFieldQuery.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]Version.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]Action.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ActionModule.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ActionRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ReplicationResponse.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]health[/\\]ClusterHealthRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]health[/\\]ClusterHealthResponse.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]health[/\\]TransportClusterHealthAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]hotthreads[/\\]NodesHotThreadsRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]hotthreads[/\\]TransportNodesHotThreadsAction.java" checks="LineLength" />
|
||||
@ -31,8 +28,6 @@
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]info[/\\]TransportNodesInfoAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]stats[/\\]NodesStatsRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]stats[/\\]TransportNodesStatsAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]tasks[/\\]list[/\\]ListTasksResponse.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]tasks[/\\]list[/\\]TransportListTasksAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]repositories[/\\]delete[/\\]DeleteRepositoryRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]repositories[/\\]delete[/\\]TransportDeleteRepositoryAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]repositories[/\\]get[/\\]GetRepositoriesRequestBuilder.java" checks="LineLength" />
|
||||
@ -65,7 +60,6 @@
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]snapshots[/\\]status[/\\]TransportSnapshotsStatusAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]state[/\\]ClusterStateRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]state[/\\]TransportClusterStateAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]stats[/\\]ClusterStatsIndices.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]stats[/\\]ClusterStatsNodeResponse.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]stats[/\\]ClusterStatsRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]stats[/\\]TransportClusterStatsAction.java" checks="LineLength" />
|
||||
@ -157,11 +151,6 @@
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]delete[/\\]DeleteRequest.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]delete[/\\]TransportDeleteAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]explain[/\\]TransportExplainAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]fieldstats[/\\]FieldStats.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]fieldstats[/\\]FieldStatsRequest.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]fieldstats[/\\]FieldStatsRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]fieldstats[/\\]FieldStatsResponse.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]fieldstats[/\\]TransportFieldStatsTransportAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]get[/\\]GetRequest.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]get[/\\]MultiGetRequest.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]get[/\\]TransportGetAction.java" checks="LineLength" />
|
||||
@ -187,22 +176,17 @@
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]IngestActionFilter.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]IngestProxyActionFilter.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]PutPipelineTransportAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]SimulateDocumentBaseResult.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]SimulateExecutionService.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]SimulatePipelineRequest.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]SimulatePipelineRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]SimulatePipelineTransportAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]percolate[/\\]MultiPercolateRequest.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]percolate[/\\]MultiPercolateRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]percolate[/\\]PercolateRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]percolate[/\\]PercolateResponse.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]percolate[/\\]PercolateShardResponse.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]percolate[/\\]TransportMultiPercolateAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]percolate[/\\]TransportPercolateAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]percolate[/\\]TransportShardMultiPercolateAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]search[/\\]MultiSearchRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]search[/\\]SearchPhaseExecutionException.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]search[/\\]SearchRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]search[/\\]SearchResponse.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]search[/\\]ShardSearchFailure.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]search[/\\]TransportClearScrollAction.java" checks="LineLength" />
|
||||
@ -211,10 +195,8 @@
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]ActionFilter.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]AutoCreateIndex.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]DelegatingActionListener.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]DestructiveOperations.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]HandledTransportAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]IndicesOptions.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]ThreadedActionListener.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]ToXContentToBytes.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]broadcast[/\\]BroadcastOperationRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]broadcast[/\\]BroadcastRequest.java" checks="LineLength" />
|
||||
@ -226,23 +208,18 @@
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]master[/\\]MasterNodeOperationRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]master[/\\]MasterNodeReadOperationRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]master[/\\]TransportMasterNodeAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]master[/\\]TransportMasterNodeReadAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]master[/\\]info[/\\]ClusterInfoRequest.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]master[/\\]info[/\\]ClusterInfoRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]master[/\\]info[/\\]TransportClusterInfoAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]nodes[/\\]NodesOperationRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]nodes[/\\]TransportNodesAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]replication[/\\]ReplicationRequest.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]replication[/\\]ReplicationRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]replication[/\\]TransportBroadcastReplicationAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]replication[/\\]TransportReplicationAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]single[/\\]instance[/\\]InstanceShardOperationRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]single[/\\]instance[/\\]TransportInstanceSingleOperationAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]single[/\\]shard[/\\]SingleShardOperationRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]single[/\\]shard[/\\]SingleShardRequest.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]single[/\\]shard[/\\]TransportSingleShardAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]tasks[/\\]TasksRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]tasks[/\\]TransportTasksAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]termvectors[/\\]MultiTermVectorsRequest.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]termvectors[/\\]MultiTermVectorsRequestBuilder.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]termvectors[/\\]TermVectorsRequest.java" checks="LineLength" />
|
||||
@ -264,13 +241,11 @@
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]bootstrap[/\\]JarHell.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]bootstrap[/\\]Seccomp.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]bootstrap[/\\]Security.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cache[/\\]recycler[/\\]PageCacheRecycler.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]ElasticsearchClient.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]FilterClient.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]node[/\\]NodeClient.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]support[/\\]AbstractClient.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]transport[/\\]TransportClient.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]transport[/\\]TransportClientNodesService.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]transport[/\\]support[/\\]TransportProxyClient.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]ClusterModule.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]ClusterState.java" checks="LineLength" />
|
||||
@ -281,7 +256,6 @@
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]InternalClusterInfoService.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]LocalNodeMasterListener.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]SnapshotsInProgress.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]action[/\\]index[/\\]MappingUpdatedAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]action[/\\]index[/\\]NodeIndexDeletedAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]action[/\\]index[/\\]NodeMappingRefreshAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]action[/\\]shard[/\\]ShardStateAction.java" checks="LineLength" />
|
||||
@ -302,7 +276,6 @@
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]metadata[/\\]MetaDataMappingService.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]metadata[/\\]MetaDataUpdateSettingsService.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]metadata[/\\]RepositoriesMetaData.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]node[/\\]DiscoveryNode.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]node[/\\]DiscoveryNodes.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]IndexRoutingTable.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]IndexShardRoutingTable.java" checks="LineLength" />
@@ -326,17 +299,6 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]command[/\\]CancelAllocationCommand.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]command[/\\]MoveAllocationCommand.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]AllocationDeciders.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]AwarenessAllocationDecider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]ClusterRebalanceAllocationDecider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]ConcurrentRebalanceAllocationDecider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]DiskThresholdDecider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]EnableAllocationDecider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]FilterAllocationDecider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]NodeVersionAllocationDecider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]SameShardAllocationDecider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]ShardsLimitAllocationDecider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]SnapshotInProgressAllocationDecider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]ThrottlingAllocationDecider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]service[/\\]InternalClusterService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]Base64.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]Booleans.java" checks="LineLength" />
@@ -349,9 +311,6 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]cli[/\\]CheckFileCommand.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]collect[/\\]ImmutableOpenIntMap.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]geo[/\\]GeoDistance.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]geo[/\\]builders[/\\]LineStringBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]geo[/\\]builders[/\\]PolygonBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]geo[/\\]builders[/\\]ShapeBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]inject[/\\]DefaultConstructionProxyFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]inject[/\\]InjectorImpl.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]inject[/\\]internal[/\\]ConstructionContext.java" checks="LineLength" />
@@ -360,7 +319,6 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]io[/\\]Channels.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]joda[/\\]Joda.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]lucene[/\\]Lucene.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]lucene[/\\]all[/\\]AllTermQuery.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]lucene[/\\]index[/\\]ElasticsearchDirectoryReader.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]lucene[/\\]index[/\\]FilterableTermsEnum.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]lucene[/\\]index[/\\]FreqTermsEnum.java" checks="LineLength" />
@@ -374,12 +332,6 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]network[/\\]NetworkService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]recycler[/\\]Recyclers.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]rounding[/\\]Rounding.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]settings[/\\]AbstractScopedSettings.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]settings[/\\]ClusterSettings.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]settings[/\\]IndexScopedSettings.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]settings[/\\]Setting.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]settings[/\\]Settings.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]settings[/\\]loader[/\\]XContentSettingsLoader.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]unit[/\\]ByteSizeValue.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]unit[/\\]TimeValue.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]util[/\\]BigArrays.java" checks="LineLength" />
@@ -390,7 +342,6 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]util[/\\]concurrent[/\\]PrioritizedEsThreadPoolExecutor.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]util[/\\]concurrent[/\\]ThreadBarrier.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]util[/\\]concurrent[/\\]ThreadContext.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]xcontent[/\\]ObjectParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]xcontent[/\\]XContentBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]xcontent[/\\]XContentFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]xcontent[/\\]XContentHelper.java" checks="LineLength" />
@@ -409,14 +360,10 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]fd[/\\]NodesFaultDetection.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]membership[/\\]MembershipAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]ping[/\\]ZenPing.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]ping[/\\]unicast[/\\]UnicastZenPing.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]publish[/\\]PendingClusterStatesQueue.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]publish[/\\]PublishClusterStateAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]env[/\\]ESFileStore.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]env[/\\]Environment.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]env[/\\]NodeEnvironment.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]AsyncShardFetch.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]DanglingIndicesState.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]GatewayAllocator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]GatewayMetaState.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]GatewayService.java" checks="LineLength" />
@@ -425,32 +372,21 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]PrimaryShardAllocator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]ReplicaShardAllocator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]TransportNodesListGatewayMetaState.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]TransportNodesListGatewayStartedShards.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]http[/\\]HttpTransportSettings.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]http[/\\]netty[/\\]HttpRequestHandler.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]http[/\\]netty[/\\]NettyHttpChannel.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]http[/\\]netty[/\\]NettyHttpServerTransport.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]AlreadyExpiredException.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]CompositeIndexEventListener.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]IndexModule.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]IndexSettings.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]IndexingSlowLog.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]MergePolicyConfig.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]MergeSchedulerConfig.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]NodeServicesProvider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]SearchSlowLog.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]analysis[/\\]AnalysisRegistry.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]analysis[/\\]AnalysisService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]analysis[/\\]CommonGramsTokenFilterFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]analysis[/\\]CustomAnalyzerProvider.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]analysis[/\\]EdgeNGramTokenizerFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]analysis[/\\]NumericDoubleAnalyzer.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]analysis[/\\]ShingleTokenFilterFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]analysis[/\\]StemmerOverrideTokenFilterFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]analysis[/\\]StopTokenFilterFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]analysis[/\\]compound[/\\]HyphenationCompoundWordTokenFilterFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]cache[/\\]bitset[/\\]BitsetFilterCache.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]cache[/\\]request[/\\]ShardRequestCache.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]codec[/\\]PerFieldMappingPostingFormatCodec.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]engine[/\\]ElasticsearchConcurrentMergeScheduler.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]engine[/\\]Engine.java" checks="LineLength" />
@@ -464,16 +400,13 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]fieldcomparator[/\\]FloatValuesComparatorSource.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]fieldcomparator[/\\]LongValuesComparatorSource.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]ordinals[/\\]GlobalOrdinalsBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]ordinals[/\\]GlobalOrdinalsIndexFieldData.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]ordinals[/\\]InternalGlobalOrdinalsIndexFieldData.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]ordinals[/\\]MultiOrdinals.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]ordinals[/\\]OrdinalsBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]ordinals[/\\]SinglePackedOrdinals.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]plain[/\\]AbstractAtomicParentChildFieldData.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]plain[/\\]AbstractIndexGeoPointFieldData.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]plain[/\\]AbstractIndexOrdinalsFieldData.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]plain[/\\]BinaryDVIndexFieldData.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]plain[/\\]GeoPointArrayIndexFieldData.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]plain[/\\]PagedBytesIndexFieldData.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]plain[/\\]ParentChildIndexFieldData.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]plain[/\\]SortedNumericDVIndexFieldData.java" checks="LineLength" />
@@ -494,18 +427,17 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]ParseContext.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]ParsedDocument.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]CompletionFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]DateFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]DoubleFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]FloatFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]NumberFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]LegacyDateFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]LegacyDoubleFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]LegacyFloatFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]LegacyNumberFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]StringFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]TokenCountFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]LegacyTokenCountFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]TypeParsers.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]geo[/\\]BaseGeoPointFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]geo[/\\]GeoPointFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]geo[/\\]GeoPointFieldMapperLegacy.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]geo[/\\]GeoShapeFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]internal[/\\]AllFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]internal[/\\]FieldNamesFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]internal[/\\]IdFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]internal[/\\]IndexFieldMapper.java" checks="LineLength" />
@@ -525,68 +457,9 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]percolator[/\\]PercolatorFieldMapper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]percolator[/\\]PercolatorQueriesRegistry.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]AbstractQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]BoolQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]BoostingQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]CommonTermsQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]CommonTermsQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]ConstantScoreQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]ExistsQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]ExistsQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]FieldMaskingSpanQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]FieldMaskingSpanQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]GeoBoundingBoxQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]GeoDistanceQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]GeoDistanceQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]GeoDistanceRangeQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]GeoPolygonQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]GeoPolygonQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]GeoShapeQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]GeohashCellQuery.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]HasChildQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]HasChildQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]HasParentQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]HasParentQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]IdsQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]MatchAllQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]MatchNoneQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]MatchQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]MoreLikeThisQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]MultiMatchQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]NestedQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]Operator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]PrefixQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]QueryBuilders.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]QueryShardContext.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]QueryStringQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]QueryStringQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]QueryValidationException.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]RangeQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]RangeQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]RegexpQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]ScriptQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]SimpleQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]SimpleQueryStringParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]SpanContainingQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]SpanContainingQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]SpanMultiTermQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]SpanMultiTermQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]SpanTermQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]SpanWithinQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]SpanWithinQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]TemplateQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]TermQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]TermsQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]TypeQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]WildcardQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]WildcardQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]functionscore[/\\]DecayFunctionBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]functionscore[/\\]FunctionScoreQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]functionscore[/\\]FunctionScoreQueryParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]functionscore[/\\]ScoreFunctionBuilders.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]functionscore[/\\]fieldvaluefactor[/\\]FieldValueFactorFunctionParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]functionscore[/\\]random[/\\]RandomScoreFunctionParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]functionscore[/\\]script[/\\]ScriptScoreFunctionBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]functionscore[/\\]script[/\\]ScriptScoreFunctionParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]support[/\\]InnerHitsQueryParserHelper.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]support[/\\]QueryParsers.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]search[/\\]MatchQuery.java" checks="LineLength" />
@@ -602,7 +475,6 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]shard[/\\]ShardStateMetaData.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]shard[/\\]StoreRecovery.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]shard[/\\]TranslogRecoveryPerformer.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]similarity[/\\]SimilarityService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]snapshots[/\\]blobstore[/\\]BlobStoreIndexShardRepository.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]snapshots[/\\]blobstore[/\\]BlobStoreIndexShardSnapshots.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]store[/\\]IndexStore.java" checks="LineLength" />
@@ -632,28 +504,21 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]recovery[/\\]RecoveryFailedException.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]recovery[/\\]RecoverySettings.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]recovery[/\\]RecoverySource.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]recovery[/\\]RecoverySourceHandler.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]recovery[/\\]RecoveryState.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]recovery[/\\]StartRecoveryRequest.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]store[/\\]IndicesStore.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]store[/\\]TransportNodesListShardStoreMetaData.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]ttl[/\\]IndicesTTLService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]IngestMetadata.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]PipelineExecutionService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]PipelineStore.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]core[/\\]CompoundProcessor.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]core[/\\]IngestDocument.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]core[/\\]Pipeline.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]processor[/\\]ConvertProcessor.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]monitor[/\\]fs[/\\]FsService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]monitor[/\\]jvm[/\\]DeadlockAnalyzer.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]monitor[/\\]jvm[/\\]GcNames.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]monitor[/\\]jvm[/\\]HotThreads.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]monitor[/\\]jvm[/\\]JvmGcMonitorService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]monitor[/\\]jvm[/\\]JvmService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]monitor[/\\]jvm[/\\]JvmStats.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]monitor[/\\]os[/\\]OsService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]monitor[/\\]process[/\\]ProcessService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]node[/\\]Node.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]node[/\\]internal[/\\]InternalSettingsPreparer.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]percolator[/\\]PercolatorQuery.java" checks="LineLength" />
@@ -671,13 +536,11 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]repositories[/\\]fs[/\\]FsRepository.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]repositories[/\\]uri[/\\]URLIndexShardRepository.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]repositories[/\\]uri[/\\]URLRepository.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]BaseRestHandler.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]BytesRestResponse.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]RestController.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]admin[/\\]cluster[/\\]health[/\\]RestClusterHealthAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]info[/\\]RestNodesInfoAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]stats[/\\]RestNodesStatsAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]admin[/\\]cluster[/\\]reroute[/\\]RestClusterRerouteAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]admin[/\\]cluster[/\\]settings[/\\]RestClusterGetSettingsAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]admin[/\\]cluster[/\\]settings[/\\]RestClusterUpdateSettingsAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]admin[/\\]cluster[/\\]state[/\\]RestClusterStateAction.java" checks="LineLength" />
@@ -698,21 +561,16 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]bulk[/\\]RestBulkAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]cat[/\\]RestCountAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]cat[/\\]RestIndicesAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]cat[/\\]RestNodeAttrsAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]cat[/\\]RestNodesAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]cat[/\\]RestPendingClusterTasksAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]cat[/\\]RestShardsAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]cat[/\\]RestThreadPoolAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]fieldstats[/\\]RestFieldStatsAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]get[/\\]RestMultiGetAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]index[/\\]RestIndexAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]main[/\\]RestMainAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]percolate[/\\]RestPercolateAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]script[/\\]RestDeleteIndexedScriptAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]script[/\\]RestPutIndexedScriptAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]search[/\\]RestClearScrollAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]search[/\\]RestMultiSearchAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]search[/\\]RestSearchAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]search[/\\]RestSearchScrollAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]suggest[/\\]RestSuggestAction.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]support[/\\]RestActions.java" checks="LineLength" />
@@ -727,10 +585,8 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]script[/\\]ScriptSettings.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]script[/\\]Template.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]MultiValueMode.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]SearchModule.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]SearchService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]AggregatorFactories.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]AggregatorFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]InternalAggregation.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]InternalMultiBucketAggregation.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]ValuesSourceAggregationBuilder.java" checks="LineLength" />
@@ -743,10 +599,8 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]filters[/\\]FiltersParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]filters[/\\]InternalFilters.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]geogrid[/\\]GeoHashGridAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]geogrid[/\\]GeoHashGridParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]global[/\\]GlobalAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]global[/\\]InternalGlobal.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]histogram[/\\]DateHistogramParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]histogram[/\\]HistogramAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]missing[/\\]InternalMissing.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]missing[/\\]MissingAggregator.java" checks="LineLength" />
@@ -755,10 +609,7 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]nested[/\\]ReverseNestedAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]range[/\\]InternalRange.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]range[/\\]RangeAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]range[/\\]date[/\\]DateRangeParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]range[/\\]date[/\\]InternalDateRange.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]range[/\\]geodistance[/\\]GeoDistanceParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]range[/\\]geodistance[/\\]InternalGeoDistance.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]range[/\\]ipv4[/\\]InternalIPv4Range.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]sampler[/\\]DiversifiedBytesHashSamplerAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]sampler[/\\]DiversifiedMapSamplerAggregator.java" checks="LineLength" />
@@ -769,15 +620,11 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]GlobalOrdinalsSignificantTermsAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]InternalSignificantTerms.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]SignificantLongTerms.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]SignificantLongTermsAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]SignificantStringTerms.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]SignificantStringTermsAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]SignificantTermsAggregatorFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]SignificantTermsParametersParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]SignificantTermsParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]UnmappedSignificantTerms.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]heuristics[/\\]GND.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]heuristics[/\\]JLHScore.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]heuristics[/\\]NXYSignificanceHeuristic.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]heuristics[/\\]PercentageScore.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]significant[/\\]heuristics[/\\]ScriptHeuristic.java" checks="LineLength" />
@@ -795,39 +642,23 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]terms[/\\]TermsAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]terms[/\\]TermsAggregatorFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]terms[/\\]TermsParametersParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]terms[/\\]TermsParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]terms[/\\]UnmappedTerms.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]terms[/\\]support[/\\]IncludeExclude.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]ValuesSourceMetricsAggregationBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]cardinality[/\\]CardinalityAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]cardinality[/\\]CardinalityAggregatorFactory.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]cardinality[/\\]HyperLogLogPlusPlus.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]geobounds[/\\]GeoBoundsAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]geobounds[/\\]InternalGeoBounds.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]percentiles[/\\]AbstractPercentilesParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]percentiles[/\\]tdigest[/\\]AbstractTDigestPercentilesAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]percentiles[/\\]tdigest[/\\]TDigestPercentileRanksAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]percentiles[/\\]tdigest[/\\]TDigestPercentilesAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]scripted[/\\]InternalScriptedMetric.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]scripted[/\\]ScriptedMetricAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]scripted[/\\]ScriptedMetricParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]stats[/\\]StatsAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]stats[/\\]extended[/\\]ExtendedStatsAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]stats[/\\]extended[/\\]ExtendedStatsParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]stats[/\\]extended[/\\]InternalExtendedStats.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]metrics[/\\]tophits[/\\]TopHitsAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]pipeline[/\\]BucketHelpers.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]pipeline[/\\]bucketmetrics[/\\]BucketMetricsParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]pipeline[/\\]bucketmetrics[/\\]avg[/\\]AvgBucketPipelineAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]pipeline[/\\]bucketscript[/\\]BucketScriptPipelineAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]pipeline[/\\]cumulativesum[/\\]CumulativeSumPipelineAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]pipeline[/\\]derivative[/\\]DerivativePipelineAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]pipeline[/\\]derivative[/\\]InternalDerivative.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]pipeline[/\\]having[/\\]BucketSelectorPipelineAggregator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]support[/\\]AggregationContext.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]support[/\\]AggregationPath.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]support[/\\]GeoPointParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]support[/\\]ValueType.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]support[/\\]ValuesSourceParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]support[/\\]format[/\\]ValueFormat.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]support[/\\]format[/\\]ValueParser.java" checks="LineLength" />
@@ -842,14 +673,9 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]fetch[/\\]FetchSubPhaseParseElement.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]fetch[/\\]explain[/\\]ExplainFetchSubPhase.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]fetch[/\\]fielddata[/\\]FieldDataFieldsParseElement.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]fetch[/\\]innerhits[/\\]InnerHitsContext.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]fetch[/\\]innerhits[/\\]InnerHitsFetchSubPhase.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]fetch[/\\]innerhits[/\\]InnerHitsParseElement.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]fetch[/\\]script[/\\]ScriptFieldsParseElement.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]fetch[/\\]source[/\\]FetchSourceContext.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]highlight[/\\]AbstractHighlighterBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]highlight[/\\]FastVectorHighlighter.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]highlight[/\\]HighlightBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]highlight[/\\]HighlightPhase.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]highlight[/\\]HighlightUtils.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]highlight[/\\]HighlighterParseElement.java" checks="LineLength" />
@@ -866,55 +692,37 @@
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]lookup[/\\]FieldLookup.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]lookup[/\\]LeafDocLookup.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]lookup[/\\]LeafFieldsLookup.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]profile[/\\]ProfileResult.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]query[/\\]QueryPhase.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]rescore[/\\]QueryRescorer.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]rescore[/\\]RescoreParseElement.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]searchafter[/\\]SearchAfterBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]sort[/\\]GeoDistanceSortParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]sort[/\\]ScriptSortParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]sort[/\\]SortParseElement.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]SuggestBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]SuggestContextParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]SuggestUtils.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]Suggesters.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]completion[/\\]CompletionSuggestParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]completion[/\\]context[/\\]CategoryContextMapping.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]completion[/\\]context[/\\]ContextMapping.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]completion[/\\]context[/\\]GeoContextMapping.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]completion[/\\]context[/\\]GeoQueryContext.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]phrase[/\\]CandidateScorer.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]phrase[/\\]DirectCandidateGenerator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]phrase[/\\]LaplaceScorer.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]phrase[/\\]LinearInterpoatingScorer.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]phrase[/\\]NoisyChannelSpellChecker.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]phrase[/\\]PhraseSuggestParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]phrase[/\\]StupidBackoffScorer.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]phrase[/\\]WordScorer.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]term[/\\]TermSuggestParser.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]snapshots[/\\]RestoreService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]snapshots[/\\]SnapshotInfo.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]snapshots[/\\]SnapshotShardFailure.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]snapshots[/\\]SnapshotShardsService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]snapshots[/\\]SnapshotsService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]threadpool[/\\]ThreadPool.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]PlainTransportFuture.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]RequestHandlerRegistry.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]Transport.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]TransportChannelResponseHandler.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]TransportService.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]netty[/\\]NettyTransport.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]apache[/\\]lucene[/\\]queries[/\\]BlendedTermQueryTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]apache[/\\]lucene[/\\]search[/\\]postingshighlight[/\\]CustomPostingsHighlighterTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]ESExceptionTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]NamingConventionTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]VersionTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ListenerActionIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]RejectionActionIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]HotThreadsIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]health[/\\]ClusterHealthResponsesTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]tasks[/\\]TasksIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]tasks[/\\]TransportTasksActionTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]repositories[/\\]RepositoryBlocksIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]settings[/\\]SettingsUpdaterTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]snapshots[/\\]SnapshotBlocksIT.java" checks="LineLength" />
@@ -928,15 +736,12 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]indices[/\\]shards[/\\]IndicesShardStoreRequestIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]indices[/\\]shards[/\\]IndicesShardStoreResponseTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]indices[/\\]template[/\\]put[/\\]MetaDataIndexTemplateServiceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]indices[/\\]upgrade[/\\]UpgradeIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]bulk[/\\]BulkProcessorIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]bulk[/\\]BulkRequestTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]bulk[/\\]RetryTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]fieldstats[/\\]FieldStatsRequestTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]get[/\\]MultiGetShardRequestTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]BulkRequestModifierTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]IngestProxyActionFilterTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]SimulateDocumentSimpleResultTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]SimulateExecutionServiceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]SimulatePipelineRequestParsingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]ingest[/\\]SimulatePipelineResponseTests.java" checks="LineLength" />
@@ -950,8 +755,6 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]broadcast[/\\]node[/\\]TransportBroadcastByNodeActionTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]master[/\\]TransportMasterNodeActionTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]replication[/\\]BroadcastReplicationTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]replication[/\\]ClusterStateCreationUtils.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]replication[/\\]TransportReplicationActionTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]support[/\\]single[/\\]instance[/\\]TransportInstanceSingleOperationActionTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]termvectors[/\\]AbstractTermVectorsTestCase.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]termvectors[/\\]GetTermVectorsCheckDocFreqIT.java" checks="LineLength" />
@@ -967,15 +770,9 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]bwcompat[/\\]RecoveryWithUnsupportedIndicesIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]bwcompat[/\\]RestoreBackwardsCompatIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]AbstractClientHeadersTestCase.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]transport[/\\]FailAndRetryMockTransport.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]transport[/\\]TransportClientHeadersTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]transport[/\\]TransportClientIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]transport[/\\]TransportClientNodesServiceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]client[/\\]transport[/\\]TransportClientRetryIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]ClusterHealthIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]ClusterInfoServiceIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]ClusterModuleTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]ClusterServiceIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]ClusterStateDiffIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]ClusterStateTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]DiskUsageTests.java" checks="LineLength" />
@@ -996,7 +793,6 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]allocation[/\\]SimpleAllocationIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]health[/\\]ClusterIndexHealthTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]health[/\\]ClusterStateHealthTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]health[/\\]RoutingTableGenerator.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]metadata[/\\]AutoExpandReplicasTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]metadata[/\\]DateMathExpressionResolverTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]metadata[/\\]HumanReadableIndexSettingsTests.java" checks="LineLength" />
@@ -1039,7 +835,6 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]RebalanceAfterActiveTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]ReplicaAllocatedAfterPrimaryTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]RoutingNodesIntegrityTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]SameShardRoutingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]ShardVersioningTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]ShardsLimitAllocationTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]SingleShardNoReplicasRoutingTests.java" checks="LineLength" />
@@ -1048,12 +843,10 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]TenShardsOneReplicaRoutingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]ThrottlingAllocationTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]UpdateNumberOfReplicasTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]DiskThresholdDeciderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]DiskThresholdDeciderUnitTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]EnableAllocationDeciderIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]allocation[/\\]decider[/\\]EnableAllocationTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]serialization[/\\]ClusterSerializationTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]serialization[/\\]ClusterStateToStringTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]serialization[/\\]DiffableTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]settings[/\\]ClusterSettingsIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]shards[/\\]ClusterSearchShardsIT.java" checks="LineLength" />
@@ -1064,19 +857,12 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]blobstore[/\\]FsBlobStoreTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]breaker[/\\]MemoryCircuitBreakerTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]geo[/\\]ShapeBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]geo[/\\]builders[/\\]AbstractShapeBuilderTestCase.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]geo[/\\]builders[/\\]EnvelopeBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]geo[/\\]builders[/\\]PolygonBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]hash[/\\]MessageDigestsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]inject[/\\]ModuleTestCase.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]io[/\\]stream[/\\]BytesStreamsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]lucene[/\\]index[/\\]FreqTermsEnumTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]lucene[/\\]uid[/\\]VersionsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]network[/\\]CidrsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]rounding[/\\]TimeZoneRoundingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]settings[/\\]ScopedSettingsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]settings[/\\]SettingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]transport[/\\]BoundTransportAddressTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]unit[/\\]DistanceUnitTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]unit[/\\]FuzzinessTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]util[/\\]BigArraysTests.java" checks="LineLength" />
@@ -1084,34 +870,24 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]util[/\\]LongObjectHashMapTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]util[/\\]concurrent[/\\]EsExecutorsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]util[/\\]concurrent[/\\]PrioritizedExecutorsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]xcontent[/\\]ObjectParserTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]xcontent[/\\]XContentFactoryTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]xcontent[/\\]builder[/\\]XContentBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]xcontent[/\\]cbor[/\\]JsonVsCborTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]xcontent[/\\]smile[/\\]JsonVsSmileTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]xcontent[/\\]support[/\\]filtering[/\\]AbstractFilteringJsonGeneratorTestCase.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]xcontent[/\\]support[/\\]filtering[/\\]FilterPathGeneratorFilteringTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]consistencylevel[/\\]WriteConsistencyLevelIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]deps[/\\]joda[/\\]SimpleJodaTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]deps[/\\]lucene[/\\]VectorHighlighterTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]BlockingClusterStatePublishResponseHandlerTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]DiscoveryWithServiceDisruptionsIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]ZenFaultDetectionTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]ZenUnicastDiscoveryIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]NodeJoinControllerTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]ZenDiscoveryIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]ZenDiscoveryUnitTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]ping[/\\]unicast[/\\]UnicastZenPingIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]publish[/\\]PendingClusterStatesQueueTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]zen[/\\]publish[/\\]PublishClusterStateActionTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]document[/\\]DocumentActionsIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]env[/\\]EnvironmentTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]env[/\\]NodeEnvironmentTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]explain[/\\]ExplainActionIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]fieldstats[/\\]FieldStatsIntegrationIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]fieldstats[/\\]FieldStatsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]AsyncShardFetchTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]GatewayMetaStateTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]GatewayModuleTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]GatewayServiceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]GatewayTests.java" checks="LineLength" />
@@ -1127,8 +903,6 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]gateway[/\\]ReusePeerRecoverySharedTest.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]get[/\\]GetActionIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]http[/\\]netty[/\\]NettyHttpServerPipeliningTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]http[/\\]netty[/\\]NettyPipeliningDisabledIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]http[/\\]netty[/\\]NettyPipeliningEnabledIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]IndexModuleTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]IndexServiceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]IndexWithShadowReplicasIT.java" checks="LineLength" />
@@ -1156,16 +930,12 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]BinaryDVFieldDataTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]DuelFieldDataTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]FieldDataCacheTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]FilterFieldDataTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]IndexFieldDataServiceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]PagedBytesStringFieldDataTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]ParentChildFieldDataTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]fielddata[/\\]SortedSetDVStringFieldDataTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]DocumentFieldMapperTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]DynamicMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]FieldTypeTestCase.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]MapperServiceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]all[/\\]SimpleAllMapperTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]binary[/\\]BinaryMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]boost[/\\]CustomBoostMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]boost[/\\]FieldLevelBoostTests.java" checks="LineLength" />
@@ -1175,8 +945,8 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]BooleanFieldMapperTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]CompletionFieldTypeTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]MultiFieldCopyToMapperTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]TokenCountFieldMapperTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]date[/\\]SimpleDateMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]core[/\\]LegacyTokenCountFieldMapperTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]date[/\\]LegacyDateMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]dynamictemplate[/\\]genericstore[/\\]GenericStoreDynamicTemplateTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]dynamictemplate[/\\]pathmatch[/\\]PathMatchDynamicTemplateTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]dynamictemplate[/\\]simple[/\\]SimpleDynamicTemplatesTests.java" checks="LineLength" />
@@ -1192,12 +962,12 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]index[/\\]IndexTypeMapperTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]internal[/\\]FieldNamesFieldMapperTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]internal[/\\]TypeFieldMapperTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]ip[/\\]SimpleIpMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]ip[/\\]LegacyIpMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]merge[/\\]TestMergeMapperTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]multifield[/\\]MultiFieldTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]multifield[/\\]merge[/\\]JavaMultiFieldMergeTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]nested[/\\]NestedMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]numeric[/\\]SimpleNumericTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]numeric[/\\]LegacyNumericTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]object[/\\]NullValueObjectMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]object[/\\]SimpleObjectMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]parent[/\\]ParentMappingTests.java" checks="LineLength" />
@@ -1218,21 +988,15 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]BoostingQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]CommonTermsQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]FieldMaskingSpanQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]GeoBoundingBoxQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]GeoDistanceQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]HasChildQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]HasParentQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]MoreLikeThisQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]MultiMatchQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]QueryStringQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]RandomQueryBuilder.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]RangeQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]ScoreModeTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]SpanMultiTermQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]SpanNotQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]functionscore[/\\]FunctionScoreEquivalenceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]functionscore[/\\]FunctionScoreQueryBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]functionscore[/\\]FunctionScoreTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]plugin[/\\]CustomQueryParserIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]query[/\\]support[/\\]QueryInnerHitsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]search[/\\]MultiMatchQueryTests.java" checks="LineLength" />
@@ -1261,8 +1025,6 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]IndicesLifecycleListenerIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]IndicesLifecycleListenerSingleNodeTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]IndicesOptionsIntegrationIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]IndicesServiceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]analysis[/\\]PreBuiltAnalyzerIntegrationIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]analyze[/\\]AnalyzeActionIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]analyze[/\\]HunspellServiceIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]indices[/\\]exists[/\\]indices[/\\]IndicesExistsIT.java" checks="LineLength" />
@@ -1295,7 +1057,6 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]PipelineExecutionServiceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]PipelineStoreTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]core[/\\]CompoundProcessorTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]core[/\\]IngestDocumentTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]core[/\\]PipelineFactoryTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]core[/\\]ValueSourceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]processor[/\\]AbstractStringProcessorTestCase.java" checks="LineLength" />
@@ -1324,7 +1085,6 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]BytesRestResponseTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]CorsRegexDefaultIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]CorsRegexIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]NoOpClient.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]RestControllerTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]util[/\\]RestUtilsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]routing[/\\]AliasResolveRoutingIT.java" checks="LineLength" />
@@ -1339,7 +1099,6 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]script[/\\]ScriptServiceTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]script[/\\]ScriptSettingsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]MultiValueModeTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]SearchModuleTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]SearchWithRejectionsIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]MissingValueIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]ChildrenIT.java" checks="LineLength" />
@@ -1376,14 +1135,9 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]child[/\\]ChildQuerySearchIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]child[/\\]ParentFieldLoadingIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]fetch[/\\]FetchSubPhasePluginIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]functionscore[/\\]DecayFunctionScoreIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]functionscore[/\\]ExplainableScriptIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]functionscore[/\\]FunctionScoreBackwardCompatibilityIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]functionscore[/\\]QueryRescorerIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]geo[/\\]GeoBoundingBoxIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]geo[/\\]GeoFilterIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]geo[/\\]GeoShapeQueryTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]highlight[/\\]HighlightBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]highlight[/\\]HighlighterSearchIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]innerhits[/\\]InnerHitsIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]matchedqueries[/\\]MatchedQueriesIT.java" checks="LineLength" />
@@ -1397,7 +1151,6 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]scroll[/\\]DuelScrollIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]scroll[/\\]SearchScrollIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]scroll[/\\]SearchScrollWithFailingNodesIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]searchafter[/\\]SearchAfterBuilderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]searchafter[/\\]SearchAfterIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]simple[/\\]SimpleSearchIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]sort[/\\]SortParserTests.java" checks="LineLength" />
@@ -1406,7 +1159,6 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]CustomSuggester.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]completion[/\\]CategoryContextMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]completion[/\\]GeoContextMappingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]phrase[/\\]DirectCandidateGeneratorTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]suggest[/\\]phrase[/\\]NoisyChannelSpellCheckerTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]similarity[/\\]SimilarityIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]snapshots[/\\]AbstractSnapshotIntegTestCase.java" checks="LineLength" />
@@ -1424,16 +1176,6 @@
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]threadpool[/\\]ThreadPoolSerializationTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]threadpool[/\\]UpdateThreadPoolSettingsTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]timestamp[/\\]SimpleTimestampIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]AbstractSimpleTransportTestCase.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]ActionNamesIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]ContextAndHeaderTransportIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]NettySizeHeaderFrameDecoderTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]local[/\\]SimpleLocalTransportTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]netty[/\\]NettyScheduledPingTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]netty[/\\]NettyTransportIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]netty[/\\]NettyTransportMultiPortIntegrationIT.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]netty[/\\]NettyTransportMultiPortTests.java" checks="LineLength" />
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]netty[/\\]SimpleNettyTransportTests.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]tribe[/\\]TribeIT.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]ttl[/\\]SimpleTTLIT.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]update[/\\]UpdateIT.java" checks="LineLength" />
|
||||
@ -1454,14 +1196,11 @@
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]BulkTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]DoubleTermsTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]EquivalenceTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]FunctionScoreTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]GeoDistanceTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]HDRPercentileRanksTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]HDRPercentilesTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]HistogramTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]IPv4RangeTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]IndexLookupTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]IndexedScriptTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]IndicesRequestTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]LongTermsTests.java" checks="LineLength" />
|
||||
<suppress files="modules[/\\]lang-groovy[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]messy[/\\]tests[/\\]MinDocCountTests.java" checks="LineLength" />
|
||||
@ -1501,17 +1240,12 @@
|
||||
<suppress files="plugins[/\\]delete-by-query[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]deletebyquery[/\\]IndexDeleteByQueryResponseTests.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]delete-by-query[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]deletebyquery[/\\]TransportDeleteByQueryActionTests.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]delete-by-query[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]plugin[/\\]deletebyquery[/\\]DeleteByQueryTests.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-azure[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cloud[/\\]azure[/\\]management[/\\]AzureComputeService.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-azure[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]azure[/\\]AzureUnicastHostsProvider.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-azure[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cloud[/\\]azure[/\\]AbstractAzureTestCase.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-azure[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]azure[/\\]AzureMinimumMasterNodesTests.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-azure[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]azure[/\\]AzureSimpleTests.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-azure[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]azure[/\\]AzureTwoStartedNodesTests.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-ec2[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cloud[/\\]aws[/\\]AwsEc2ServiceImpl.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-ec2[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]ec2[/\\]AwsEc2UnicastHostsProvider.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-ec2[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]cloud[/\\]aws[/\\]AbstractAwsTestCase.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-ec2[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]ec2[/\\]AmazonEC2Mock.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-gce[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]gce[/\\]GceUnicastHostsProvider.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]discovery-gce[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]discovery[/\\]gce[/\\]GceNetworkTests.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]ingest-geoip[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]geoip[/\\]GeoIpProcessor.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]ingest-geoip[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]ingest[/\\]geoip[/\\]GeoIpProcessorFactoryTests.java" checks="LineLength" />
|
||||
@ -1543,7 +1277,6 @@
|
||||
<suppress files="plugins[/\\]mapper-size[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]mapper[/\\]size[/\\]SizeMappingTests.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]repository-azure[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cloud[/\\]azure[/\\]blobstore[/\\]AzureBlobContainer.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]repository-azure[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cloud[/\\]azure[/\\]blobstore[/\\]AzureBlobStore.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]repository-azure[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cloud[/\\]azure[/\\]storage[/\\]AzureStorageService.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]repository-azure[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cloud[/\\]azure[/\\]storage[/\\]AzureStorageServiceImpl.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]repository-azure[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cloud[/\\]azure[/\\]storage[/\\]AzureStorageSettings.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]repository-azure[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]repositories[/\\]azure[/\\]AzureRepository.java" checks="LineLength" />
|
||||
@ -1576,7 +1309,6 @@
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]cluster[/\\]routing[/\\]TestShardRouting.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]cli[/\\]CliToolTestCase.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]util[/\\]MockBigArrays.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]MockSearchService.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]aggregations[/\\]bucket[/\\]script[/\\]NativeSignificanceScoreScriptWithParams.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]BackgroundIndexer.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]CompositeTestCluster.java" checks="LineLength" />
|
||||
@ -1600,10 +1332,8 @@
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]disruption[/\\]SlowClusterStateProcessing.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]engine[/\\]AssertingSearcher.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]engine[/\\]MockEngineSupport.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]engine[/\\]MockInternalEngine.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]hamcrest[/\\]ElasticsearchAssertions.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]junit[/\\]listeners[/\\]LoggingListener.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]junit[/\\]rule[/\\]RepeatOnExceptionRule.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]rest[/\\]ESRestTestCase.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]rest[/\\]RestTestExecutionContext.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]rest[/\\]client[/\\]RestClient.java" checks="LineLength" />
|
||||
@ -1626,24 +1356,15 @@
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]rest[/\\]support[/\\]FileUtils.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]store[/\\]MockFSDirectoryService.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]store[/\\]MockFSIndexStore.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]transport[/\\]AssertingLocalTransport.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]transport[/\\]CapturingTransport.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]transport[/\\]MockTransportService.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]rest[/\\]test[/\\]FileUtilsTests.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]rest[/\\]test[/\\]JsonPathTests.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]rest[/\\]test[/\\]RestTestParserTests.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]test[/\\]InternalTestClusterTests.java" checks="LineLength" />
|
||||
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]action[/\\]admin[/\\]cluster[/\\]node[/\\]tasks[/\\]list[/\\]TaskInfo.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]cli[/\\]CliTool.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]settings[/\\]SettingsModule.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]action[/\\]admin[/\\]indices[/\\]settings[/\\]RestGetSettingsAction.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]tribe[/\\]TribeService.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]transport[/\\]TransportModuleTests.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]search[/\\]sort[/\\]GeoDistanceSortBuilderIT.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]rest[/\\]CorsNotSetIT.java" checks="LineLength" />
|
||||
<suppress files="core[/\\]src[/\\]test[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]settings[/\\]SettingsModuleTests.java" checks="LineLength" />
|
||||
<suppress files="plugins[/\\]store-smb[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]index[/\\]store[/\\]SmbDirectoryWrapper.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]test[/\\]tasks[/\\]MockTaskManager.java" checks="LineLength" />
|
||||
<suppress files="test[/\\]framework[/\\]src[/\\]main[/\\]java[/\\]org[/\\]elasticsearch[/\\]common[/\\]inject[/\\]ModuleTestCase.java" checks="LineLength" />
|
||||
</suppressions>
|
||||
|
@ -28,3 +28,8 @@ java.security.MessageDigest#clone() @ use org.elasticsearch.common.hash.MessageD

@defaultMessage this should not have been added to lucene in the first place
org.apache.lucene.index.IndexReader#getCombinedCoreAndDeletesKey()

@defaultMessage Soon to be removed
org.apache.lucene.document.FieldType#numericType()

org.apache.lucene.document.InetAddressPoint#newPrefixQuery(java.lang.String, java.net.InetAddress, int) @LUCENE-7232
@ -1,5 +1,5 @@
elasticsearch = 5.0.0-alpha1
lucene = 6.0.0-snapshot-f0aa4fc
elasticsearch = 5.0.0
lucene = 6.0.0

# optional dependencies
spatial4j = 0.6
@ -219,7 +219,7 @@ h1. License
<pre>
This software is licensed under the Apache License, version 2 ("ALv2"), quoted below.

Copyright 2009-2015 Elasticsearch <https://www.elastic.co>
Copyright 2009-2016 Elasticsearch <https://www.elastic.co>

Licensed under the Apache License, Version 2.0 (the "License"); you may not
use this file except in compliance with the License. You may obtain a copy of
@ -24,6 +24,16 @@ import org.elasticsearch.gradle.BuildPlugin
apply plugin: 'elasticsearch.build'
apply plugin: 'com.bmuschko.nexus'
apply plugin: 'nebula.optional-base'
apply plugin: 'nebula.maven-base-publish'
apply plugin: 'nebula.maven-scm'

publishing {
  publications {
    nebula {
      artifactId 'elasticsearch'
    }
  }
}

archivesBaseName = 'elasticsearch'
@ -0,0 +1,117 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.lucene.document;

import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.Arrays;

import org.apache.lucene.search.Query;
import org.apache.lucene.util.NumericUtils;
import org.elasticsearch.common.SuppressForbidden;

/**
 * Forked utility methods from Lucene's InetAddressPoint until LUCENE-7232 and
 * LUCENE-7234 are released.
 */
// TODO: remove me when we upgrade to Lucene 6.1
@SuppressForbidden(reason="uses InetAddress.getHostAddress")
public final class XInetAddressPoint {

    private XInetAddressPoint() {}

    /** The minimum value that an ip address can hold. */
    public static final InetAddress MIN_VALUE;
    /** The maximum value that an ip address can hold. */
    public static final InetAddress MAX_VALUE;
    static {
        MIN_VALUE = InetAddressPoint.decode(new byte[InetAddressPoint.BYTES]);
        byte[] maxValueBytes = new byte[InetAddressPoint.BYTES];
        Arrays.fill(maxValueBytes, (byte) 0xFF);
        MAX_VALUE = InetAddressPoint.decode(maxValueBytes);
    }

    /**
     * Return the {@link InetAddress} that compares immediately greater than
     * {@code address}.
     * @throws ArithmeticException if the provided address is the
     *         {@link #MAX_VALUE maximum ip address}
     */
    public static InetAddress nextUp(InetAddress address) {
        if (address.equals(MAX_VALUE)) {
            throw new ArithmeticException("Overflow: there is no greater InetAddress than "
                    + address.getHostAddress());
        }
        byte[] delta = new byte[InetAddressPoint.BYTES];
        delta[InetAddressPoint.BYTES-1] = 1;
        byte[] nextUpBytes = new byte[InetAddressPoint.BYTES];
        NumericUtils.add(InetAddressPoint.BYTES, 0, InetAddressPoint.encode(address), delta, nextUpBytes);
        return InetAddressPoint.decode(nextUpBytes);
    }

    /**
     * Return the {@link InetAddress} that compares immediately less than
     * {@code address}.
     * @throws ArithmeticException if the provided address is the
     *         {@link #MIN_VALUE minimum ip address}
     */
    public static InetAddress nextDown(InetAddress address) {
        if (address.equals(MIN_VALUE)) {
            throw new ArithmeticException("Underflow: there is no smaller InetAddress than "
                    + address.getHostAddress());
        }
        byte[] delta = new byte[InetAddressPoint.BYTES];
        delta[InetAddressPoint.BYTES-1] = 1;
        byte[] nextDownBytes = new byte[InetAddressPoint.BYTES];
        NumericUtils.subtract(InetAddressPoint.BYTES, 0, InetAddressPoint.encode(address), delta, nextDownBytes);
        return InetAddressPoint.decode(nextDownBytes);
    }

    /**
     * Create a prefix query for matching a CIDR network range.
     *
     * @param field field name. must not be {@code null}.
     * @param value any host address
     * @param prefixLength the network prefix length for this address. This is also known as the subnet mask in the context of IPv4
     *                     addresses.
     * @throws IllegalArgumentException if {@code field} is null, or prefixLength is invalid.
     * @return a query matching documents with addresses contained within this network
     */
    // TODO: remove me when we upgrade to Lucene 6.0.1
    public static Query newPrefixQuery(String field, InetAddress value, int prefixLength) {
        if (value == null) {
            throw new IllegalArgumentException("InetAddress must not be null");
        }
        if (prefixLength < 0 || prefixLength > 8 * value.getAddress().length) {
            throw new IllegalArgumentException("illegal prefixLength '" + prefixLength
                    + "'. Must be 0-32 for IPv4 ranges, 0-128 for IPv6 ranges");
        }
        // create the lower value by zeroing out the host portion, upper value by filling it with all ones.
        byte lower[] = value.getAddress();
        byte upper[] = value.getAddress();
        for (int i = prefixLength; i < 8 * lower.length; i++) {
            int m = 1 << (7 - (i & 7));
            lower[i >> 3] &= ~m;
            upper[i >> 3] |= m;
        }
        try {
            return InetAddressPoint.newRangeQuery(field, InetAddress.getByAddress(lower), InetAddress.getByAddress(upper));
        } catch (UnknownHostException e) {
            throw new AssertionError(e); // values are coming from InetAddress
        }
    }
}
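The newPrefixQuery helper above reduces a CIDR prefix to an inclusive range query by zeroing the host bits for the lower bound and setting them for the upper bound. A minimal usage sketch, assuming Lucene and this class are on the classpath; the "ip" field name and the address are made up for illustration:

import java.net.InetAddress;

import org.apache.lucene.document.XInetAddressPoint;
import org.apache.lucene.search.Query;

public class XInetAddressPointExample {
    public static void main(String[] args) throws Exception {
        // Matches documents whose "ip" point field lies inside 192.168.1.0/24,
        // i.e. the range [192.168.1.0, 192.168.1.255] after masking the 8 host bits.
        Query cidr = XInetAddressPoint.newPrefixQuery("ip", InetAddress.getByName("192.168.1.0"), 24);
        System.out.println(cidr);
    }
}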
core/src/main/java/org/apache/lucene/index/XPointValues.java (new file, 130 lines)
@ -0,0 +1,130 @@
|
||||
/*
|
||||
* Licensed to Elasticsearch under one or more contributor
|
||||
* license agreements. See the NOTICE file distributed with
|
||||
* this work for additional information regarding copyright
|
||||
* ownership. Elasticsearch licenses this file to you under
|
||||
* the Apache License, Version 2.0 (the "License"); you may
|
||||
* not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing,
|
||||
* software distributed under the License is distributed on an
|
||||
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
* KIND, either express or implied. See the License for the
|
||||
* specific language governing permissions and limitations
|
||||
* under the License.
|
||||
*/
|
||||
|
||||
package org.apache.lucene.index;
|
||||
import org.apache.lucene.util.StringHelper;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
/**
|
||||
* Forked utility methods from Lucene's PointValues until LUCENE-7257 is released.
|
||||
*/
|
||||
public class XPointValues {
|
||||
/** Return the cumulated number of points across all leaves of the given
|
||||
* {@link IndexReader}. Leaves that do not have points for the given field
|
||||
* are ignored.
|
||||
* @see PointValues#size(String) */
|
||||
public static long size(IndexReader reader, String field) throws IOException {
|
||||
long size = 0;
|
||||
for (LeafReaderContext ctx : reader.leaves()) {
|
||||
FieldInfo info = ctx.reader().getFieldInfos().fieldInfo(field);
|
||||
if (info == null || info.getPointDimensionCount() == 0) {
|
||||
continue;
|
||||
}
|
||||
PointValues values = ctx.reader().getPointValues();
|
||||
size += values.size(field);
|
||||
}
|
||||
return size;
|
||||
}
|
||||
|
||||
/** Return the cumulated number of docs that have points across all leaves
|
||||
* of the given {@link IndexReader}. Leaves that do not have points for the
|
||||
* given field are ignored.
|
||||
* @see PointValues#getDocCount(String) */
|
||||
public static int getDocCount(IndexReader reader, String field) throws IOException {
|
||||
int count = 0;
|
||||
for (LeafReaderContext ctx : reader.leaves()) {
|
||||
FieldInfo info = ctx.reader().getFieldInfos().fieldInfo(field);
|
||||
if (info == null || info.getPointDimensionCount() == 0) {
|
||||
continue;
|
||||
}
|
||||
PointValues values = ctx.reader().getPointValues();
|
||||
count += values.getDocCount(field);
|
||||
}
|
||||
return count;
|
||||
}
|
||||
|
||||
/** Return the minimum packed values across all leaves of the given
|
||||
* {@link IndexReader}. Leaves that do not have points for the given field
|
||||
* are ignored.
|
||||
* @see PointValues#getMinPackedValue(String) */
|
||||
public static byte[] getMinPackedValue(IndexReader reader, String field) throws IOException {
|
||||
byte[] minValue = null;
|
||||
for (LeafReaderContext ctx : reader.leaves()) {
|
||||
FieldInfo info = ctx.reader().getFieldInfos().fieldInfo(field);
|
||||
if (info == null || info.getPointDimensionCount() == 0) {
|
||||
continue;
|
||||
}
|
||||
PointValues values = ctx.reader().getPointValues();
|
||||
byte[] leafMinValue = values.getMinPackedValue(field);
|
||||
if (leafMinValue == null) {
|
||||
continue;
|
||||
}
|
||||
if (minValue == null) {
|
||||
minValue = leafMinValue.clone();
|
||||
} else {
|
||||
final int numDimensions = values.getNumDimensions(field);
|
||||
final int numBytesPerDimension = values.getBytesPerDimension(field);
|
||||
for (int i = 0; i < numDimensions; ++i) {
|
||||
int offset = i * numBytesPerDimension;
|
||||
if (StringHelper.compare(numBytesPerDimension, leafMinValue, offset, minValue, offset) < 0) {
|
||||
System.arraycopy(leafMinValue, offset, minValue, offset, numBytesPerDimension);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
return minValue;
|
||||
}
|
||||
|
||||
/** Return the maximum packed values across all leaves of the given
|
||||
* {@link IndexReader}. Leaves that do not have points for the given field
|
||||
* are ignored.
|
||||
* @see PointValues#getMaxPackedValue(String) */
|
||||
public static byte[] getMaxPackedValue(IndexReader reader, String field) throws IOException {
|
||||
byte[] maxValue = null;
|
||||
for (LeafReaderContext ctx : reader.leaves()) {
|
||||
FieldInfo info = ctx.reader().getFieldInfos().fieldInfo(field);
|
||||
if (info == null || info.getPointDimensionCount() == 0) {
|
||||
continue;
|
||||
}
|
||||
PointValues values = ctx.reader().getPointValues();
|
||||
byte[] leafMaxValue = values.getMaxPackedValue(field);
|
||||
if (leafMaxValue == null) {
|
||||
continue;
|
||||
}
|
||||
if (maxValue == null) {
|
||||
maxValue = leafMaxValue.clone();
|
||||
} else {
|
||||
final int numDimensions = values.getNumDimensions(field);
|
||||
final int numBytesPerDimension = values.getBytesPerDimension(field);
|
||||
for (int i = 0; i < numDimensions; ++i) {
|
||||
int offset = i * numBytesPerDimension;
|
||||
if (StringHelper.compare(numBytesPerDimension, leafMaxValue, offset, maxValue, offset) > 0) {
|
||||
System.arraycopy(leafMaxValue, offset, maxValue, offset, numBytesPerDimension);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
return maxValue;
|
||||
}
|
||||
|
||||
/** Default constructor */
|
||||
private XPointValues() {
|
||||
}
|
||||
}
|
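The helpers above walk every leaf reader and aggregate its per-field point statistics, skipping leaves that have no points for the field. A minimal sketch of calling them, assuming an existing Directory and a hypothetical points-indexed "timestamp" field:

import java.io.IOException;

import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.XPointValues;
import org.apache.lucene.store.Directory;

public class XPointValuesExample {
    // Prints how many points, and how many point-carrying docs, the "timestamp" field has.
    static void printPointStats(Directory dir) throws IOException {
        try (IndexReader reader = DirectoryReader.open(dir)) {
            long points = XPointValues.size(reader, "timestamp");
            int docs = XPointValues.getDocCount(reader, "timestamp");
            System.out.println(points + " points across " + docs + " docs");
        }
    }
}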
@ -237,6 +237,10 @@ public abstract class BlendedTermQuery extends Query {
        return newCtx;
    }

    public List<Term> getTerms() {
        return Arrays.asList(terms);
    }

    @Override
    public String toString(String field) {
        StringBuilder builder = new StringBuilder("blended(terms:[");
@ -22,6 +22,7 @@ package org.apache.lucene.queryparser.classic;
|
||||
import org.apache.lucene.analysis.Analyzer;
|
||||
import org.apache.lucene.analysis.TokenStream;
|
||||
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
|
||||
import org.apache.lucene.analysis.tokenattributes.PositionIncrementAttribute;
|
||||
import org.apache.lucene.index.Term;
|
||||
import org.apache.lucene.search.BooleanClause;
|
||||
import org.apache.lucene.search.BooleanQuery;
|
||||
@ -32,6 +33,7 @@ import org.apache.lucene.search.MatchNoDocsQuery;
|
||||
import org.apache.lucene.search.MultiPhraseQuery;
|
||||
import org.apache.lucene.search.PhraseQuery;
|
||||
import org.apache.lucene.search.Query;
|
||||
import org.apache.lucene.search.SynonymQuery;
|
||||
import org.apache.lucene.util.IOUtils;
|
||||
import org.apache.lucene.util.automaton.RegExp;
|
||||
import org.elasticsearch.common.lucene.search.Queries;
|
||||
@ -39,6 +41,7 @@ import org.elasticsearch.common.unit.Fuzziness;
|
||||
import org.elasticsearch.index.mapper.MappedFieldType;
|
||||
import org.elasticsearch.index.mapper.MapperService;
|
||||
import org.elasticsearch.index.mapper.core.DateFieldMapper;
|
||||
import org.elasticsearch.index.mapper.core.LegacyDateFieldMapper;
|
||||
import org.elasticsearch.index.query.QueryShardContext;
|
||||
import org.elasticsearch.index.query.support.QueryParsers;
|
||||
|
||||
@ -105,7 +108,8 @@ public class MapperQueryParser extends QueryParser {
|
||||
}
|
||||
|
||||
/**
|
||||
* We override this one so we can get the fuzzy part to be treated as string, so people can do: "age:10~5" or "timestamp:2012-10-10~5d"
|
||||
* We override this one so we can get the fuzzy part to be treated as string,
|
||||
* so people can do: "age:10~5" or "timestamp:2012-10-10~5d"
|
||||
*/
|
||||
@Override
|
||||
Query handleBareFuzzy(String qfield, Token fuzzySlop, String termImage) throws ParseException {
|
||||
@ -164,8 +168,7 @@ public class MapperQueryParser extends QueryParser {
|
||||
clauses.add(new BooleanClause(applyBoost(mField, q), BooleanClause.Occur.SHOULD));
|
||||
}
|
||||
}
|
||||
if (clauses.size() == 0) // happens for stopwords
|
||||
return null;
|
||||
if (clauses.isEmpty()) return null; // happens for stopwords
|
||||
return getBooleanQueryCoordDisabled(clauses);
|
||||
}
|
||||
} else {
|
||||
@ -215,7 +218,8 @@ public class MapperQueryParser extends QueryParser {
|
||||
}
|
||||
if (currentFieldType != null) {
|
||||
Query query = null;
|
||||
if (currentFieldType.useTermQueryWithQueryString()) {
|
||||
if (currentFieldType.tokenized() == false) {
|
||||
// this might be a structured field like a numeric
|
||||
try {
|
||||
query = currentFieldType.termQuery(queryText, context);
|
||||
} catch (RuntimeException e) {
|
||||
@ -266,8 +270,7 @@ public class MapperQueryParser extends QueryParser {
|
||||
clauses.add(new BooleanClause(applyBoost(mField, q), BooleanClause.Occur.SHOULD));
|
||||
}
|
||||
}
|
||||
if (clauses.size() == 0) // happens for stopwords
|
||||
return null;
|
||||
if (clauses.isEmpty()) return null; // happens for stopwords
|
||||
return getBooleanQueryCoordDisabled(clauses);
|
||||
}
|
||||
} else {
|
||||
@ -276,7 +279,8 @@ public class MapperQueryParser extends QueryParser {
|
||||
}
|
||||
|
||||
@Override
|
||||
protected Query getRangeQuery(String field, String part1, String part2, boolean startInclusive, boolean endInclusive) throws ParseException {
|
||||
protected Query getRangeQuery(String field, String part1, String part2,
|
||||
boolean startInclusive, boolean endInclusive) throws ParseException {
|
||||
if ("*".equals(part1)) {
|
||||
part1 = null;
|
||||
}
|
||||
@ -317,23 +321,26 @@ public class MapperQueryParser extends QueryParser {
|
||||
clauses.add(new BooleanClause(applyBoost(mField, q), BooleanClause.Occur.SHOULD));
|
||||
}
|
||||
}
|
||||
if (clauses.size() == 0) // happens for stopwords
|
||||
return null;
|
||||
if (clauses.isEmpty()) return null; // happens for stopwords
|
||||
return getBooleanQueryCoordDisabled(clauses);
|
||||
}
|
||||
}
|
||||
|
||||
private Query getRangeQuerySingle(String field, String part1, String part2, boolean startInclusive, boolean endInclusive) {
|
||||
private Query getRangeQuerySingle(String field, String part1, String part2,
|
||||
boolean startInclusive, boolean endInclusive) {
|
||||
currentFieldType = context.fieldMapper(field);
|
||||
if (currentFieldType != null) {
|
||||
if (lowercaseExpandedTerms && !currentFieldType.isNumeric()) {
|
||||
if (lowercaseExpandedTerms && currentFieldType.tokenized()) {
|
||||
part1 = part1 == null ? null : part1.toLowerCase(locale);
|
||||
part2 = part2 == null ? null : part2.toLowerCase(locale);
|
||||
}
|
||||
|
||||
try {
|
||||
Query rangeQuery;
|
||||
if (currentFieldType instanceof DateFieldMapper.DateFieldType && settings.timeZone() != null) {
|
||||
if (currentFieldType instanceof LegacyDateFieldMapper.DateFieldType && settings.timeZone() != null) {
|
||||
LegacyDateFieldMapper.DateFieldType dateFieldType = (LegacyDateFieldMapper.DateFieldType) this.currentFieldType;
|
||||
rangeQuery = dateFieldType.rangeQuery(part1, part2, startInclusive, endInclusive, settings.timeZone(), null);
|
||||
} else if (currentFieldType instanceof DateFieldMapper.DateFieldType && settings.timeZone() != null) {
|
||||
DateFieldMapper.DateFieldType dateFieldType = (DateFieldMapper.DateFieldType) this.currentFieldType;
|
||||
rangeQuery = dateFieldType.rangeQuery(part1, part2, startInclusive, endInclusive, settings.timeZone(), null);
|
||||
} else {
|
||||
@ -392,7 +399,8 @@ public class MapperQueryParser extends QueryParser {
|
||||
currentFieldType = context.fieldMapper(field);
|
||||
if (currentFieldType != null) {
|
||||
try {
|
||||
return currentFieldType.fuzzyQuery(termStr, Fuzziness.build(minSimilarity), fuzzyPrefixLength, settings.fuzzyMaxExpansions(), FuzzyQuery.defaultTranspositions);
|
||||
return currentFieldType.fuzzyQuery(termStr, Fuzziness.build(minSimilarity),
|
||||
fuzzyPrefixLength, settings.fuzzyMaxExpansions(), FuzzyQuery.defaultTranspositions);
|
||||
} catch (RuntimeException e) {
|
||||
if (settings.lenient()) {
|
||||
return null;
|
||||
@ -407,7 +415,8 @@ public class MapperQueryParser extends QueryParser {
|
||||
protected Query newFuzzyQuery(Term term, float minimumSimilarity, int prefixLength) {
|
||||
String text = term.text();
|
||||
int numEdits = FuzzyQuery.floatToEdits(minimumSimilarity, text.codePointCount(0, text.length()));
|
||||
FuzzyQuery query = new FuzzyQuery(term, numEdits, prefixLength, settings.fuzzyMaxExpansions(), FuzzyQuery.defaultTranspositions);
|
||||
FuzzyQuery query = new FuzzyQuery(term, numEdits, prefixLength,
|
||||
settings.fuzzyMaxExpansions(), FuzzyQuery.defaultTranspositions);
|
||||
QueryParsers.setRewriteMethod(query, settings.fuzzyRewriteMethod());
|
||||
return query;
|
||||
}
|
||||
@ -444,8 +453,7 @@ public class MapperQueryParser extends QueryParser {
|
||||
clauses.add(new BooleanClause(applyBoost(mField, q), BooleanClause.Occur.SHOULD));
|
||||
}
|
||||
}
|
||||
if (clauses.size() == 0) // happens for stopwords
|
||||
return null;
|
||||
if (clauses.isEmpty()) return null; // happens for stopwords
|
||||
return getBooleanQueryCoordDisabled(clauses);
|
||||
}
|
||||
} else {
|
||||
@ -463,7 +471,7 @@ public class MapperQueryParser extends QueryParser {
|
||||
setAnalyzer(context.getSearchAnalyzer(currentFieldType));
|
||||
}
|
||||
Query query = null;
|
||||
if (currentFieldType.useTermQueryWithQueryString()) {
|
||||
if (currentFieldType.tokenized() == false) {
|
||||
query = currentFieldType.prefixQuery(termStr, multiTermRewriteMethod, context);
|
||||
}
|
||||
if (query == null) {
|
||||
@ -486,7 +494,7 @@ public class MapperQueryParser extends QueryParser {
|
||||
if (!settings.analyzeWildcard()) {
|
||||
return super.getPrefixQuery(field, termStr);
|
||||
}
|
||||
List<String> tlist;
|
||||
List<List<String> > tlist;
|
||||
// get Analyzer from superclass and tokenize the term
|
||||
TokenStream source = null;
|
||||
try {
|
||||
@ -497,7 +505,9 @@ public class MapperQueryParser extends QueryParser {
|
||||
return super.getPrefixQuery(field, termStr);
|
||||
}
|
||||
tlist = new ArrayList<>();
|
||||
List<String> currentPos = new ArrayList<>();
|
||||
CharTermAttribute termAtt = source.addAttribute(CharTermAttribute.class);
|
||||
PositionIncrementAttribute posAtt = source.addAttribute(PositionIncrementAttribute.class);
|
||||
|
||||
while (true) {
|
||||
try {
|
||||
@ -505,7 +515,14 @@ public class MapperQueryParser extends QueryParser {
|
||||
} catch (IOException e) {
|
||||
break;
|
||||
}
|
||||
tlist.add(termAtt.toString());
|
||||
if (currentPos.isEmpty() == false && posAtt.getPositionIncrement() > 0) {
|
||||
tlist.add(currentPos);
|
||||
currentPos = new ArrayList<>();
|
||||
}
|
||||
currentPos.add(termAtt.toString());
|
||||
}
|
||||
if (currentPos.isEmpty() == false) {
|
||||
tlist.add(currentPos);
|
||||
}
|
||||
} finally {
|
||||
if (source != null) {
|
||||
@ -513,16 +530,45 @@ public class MapperQueryParser extends QueryParser {
|
||||
}
|
||||
}
|
||||
|
||||
if (tlist.size() == 1) {
|
||||
return super.getPrefixQuery(field, tlist.get(0));
|
||||
} else {
|
||||
// build a boolean query with prefix on each one...
|
||||
List<BooleanClause> clauses = new ArrayList<>();
|
||||
for (String token : tlist) {
|
||||
clauses.add(new BooleanClause(super.getPrefixQuery(field, token), BooleanClause.Occur.SHOULD));
|
||||
}
|
||||
return getBooleanQueryCoordDisabled(clauses);
|
||||
if (tlist.size() == 0) {
|
||||
return null;
|
||||
}
|
||||
|
||||
if (tlist.size() == 1 && tlist.get(0).size() == 1) {
|
||||
return super.getPrefixQuery(field, tlist.get(0).get(0));
|
||||
}
|
||||
|
||||
// build a boolean query with prefix on the last position only.
|
||||
List<BooleanClause> clauses = new ArrayList<>();
|
||||
for (int pos = 0; pos < tlist.size(); pos++) {
|
||||
List<String> plist = tlist.get(pos);
|
||||
boolean isLastPos = (pos == tlist.size() - 1);
|
||||
Query posQuery;
|
||||
if (plist.size() == 1) {
|
||||
if (isLastPos) {
|
||||
posQuery = super.getPrefixQuery(field, plist.get(0));
|
||||
} else {
|
||||
posQuery = newTermQuery(new Term(field, plist.get(0)));
|
||||
}
|
||||
} else if (isLastPos == false) {
|
||||
// build a synonym query for terms in the same position.
|
||||
Term[] terms = new Term[plist.size()];
|
||||
for (int i = 0; i < plist.size(); i++) {
|
||||
terms[i] = new Term(field, plist.get(i));
|
||||
}
|
||||
posQuery = new SynonymQuery(terms);
|
||||
} else {
|
||||
List<BooleanClause> innerClauses = new ArrayList<>();
|
||||
for (String token : plist) {
|
||||
innerClauses.add(new BooleanClause(super.getPrefixQuery(field, token),
|
||||
BooleanClause.Occur.SHOULD));
|
||||
}
|
||||
posQuery = getBooleanQueryCoordDisabled(innerClauses);
|
||||
}
|
||||
clauses.add(new BooleanClause(posQuery,
|
||||
getDefaultOperator() == Operator.AND ? BooleanClause.Occur.MUST : BooleanClause.Occur.SHOULD));
|
||||
}
|
||||
return getBooleanQuery(clauses);
|
||||
}
|
||||
|
||||
@Override
|
||||
@ -574,8 +620,7 @@ public class MapperQueryParser extends QueryParser {
|
||||
clauses.add(new BooleanClause(applyBoost(mField, q), BooleanClause.Occur.SHOULD));
|
||||
}
|
||||
}
|
||||
if (clauses.size() == 0) // happens for stopwords
|
||||
return null;
|
||||
if (clauses.isEmpty()) return null; // happens for stopwords
|
||||
return getBooleanQueryCoordDisabled(clauses);
|
||||
}
|
||||
} else {
|
||||
@ -703,8 +748,7 @@ public class MapperQueryParser extends QueryParser {
|
||||
clauses.add(new BooleanClause(applyBoost(mField, q), BooleanClause.Occur.SHOULD));
|
||||
}
|
||||
}
|
||||
if (clauses.size() == 0) // happens for stopwords
|
||||
return null;
|
||||
if (clauses.isEmpty()) return null; // happens for stopwords
|
||||
return getBooleanQueryCoordDisabled(clauses);
|
||||
}
|
||||
} else {
|
||||
@ -722,8 +766,9 @@ public class MapperQueryParser extends QueryParser {
|
||||
setAnalyzer(context.getSearchAnalyzer(currentFieldType));
|
||||
}
|
||||
Query query = null;
|
||||
if (currentFieldType.useTermQueryWithQueryString()) {
|
||||
query = currentFieldType.regexpQuery(termStr, RegExp.ALL, maxDeterminizedStates, multiTermRewriteMethod, context);
|
||||
if (currentFieldType.tokenized() == false) {
|
||||
query = currentFieldType.regexpQuery(termStr, RegExp.ALL,
|
||||
maxDeterminizedStates, multiTermRewriteMethod, context);
|
||||
}
|
||||
if (query == null) {
|
||||
query = super.getRegexpQuery(field, termStr);
|
||||
@ -740,7 +785,7 @@ public class MapperQueryParser extends QueryParser {
|
||||
setAnalyzer(oldAnalyzer);
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/**
|
||||
* @deprecated review all use of this, don't rely on coord
|
||||
*/
|
||||
|
File diff suppressed because it is too large
@ -0,0 +1,267 @@
|
||||
/*
|
||||
* Licensed to Elasticsearch under one or more contributor
|
||||
* license agreements. See the NOTICE file distributed with
|
||||
* this work for additional information regarding copyright
|
||||
* ownership. Elasticsearch licenses this file to you under
|
||||
* the Apache License, Version 2.0 (the "License"); you may
|
||||
* not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing,
|
||||
* software distributed under the License is distributed on an
|
||||
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
* KIND, either express or implied. See the License for the
|
||||
* specific language governing permissions and limitations
|
||||
* under the License.
|
||||
*/
|
||||
package org.apache.lucene.search.suggest.analyzing;
|
||||
|
||||
import org.apache.lucene.analysis.Analyzer;
|
||||
import org.apache.lucene.analysis.TokenStreamToAutomaton;
|
||||
import org.apache.lucene.util.BytesRef;
|
||||
import org.apache.lucene.util.IntsRef;
|
||||
import org.apache.lucene.util.UnicodeUtil;
|
||||
import org.apache.lucene.util.automaton.Automata;
|
||||
import org.apache.lucene.util.automaton.Automaton;
|
||||
import org.apache.lucene.util.automaton.FiniteStringsIterator;
|
||||
import org.apache.lucene.util.automaton.LevenshteinAutomata;
|
||||
import org.apache.lucene.util.automaton.Operations;
|
||||
import org.apache.lucene.util.automaton.UTF32ToUTF8;
|
||||
import org.apache.lucene.util.fst.FST;
|
||||
import org.apache.lucene.util.fst.PairOutputs;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.ArrayList;
|
||||
import java.util.List;
|
||||
|
||||
import static org.apache.lucene.util.automaton.Operations.DEFAULT_MAX_DETERMINIZED_STATES;
|
||||
|
||||
/**
|
||||
* Implements a fuzzy {@link AnalyzingSuggester}. The similarity measurement is
|
||||
* based on the Damerau-Levenshtein (optimal string alignment) algorithm, though
|
||||
* you can explicitly choose classic Levenshtein by passing <code>false</code>
|
||||
* for the <code>transpositions</code> parameter.
|
||||
* <p>
|
||||
* At most, this query will match terms up to
|
||||
* {@value org.apache.lucene.util.automaton.LevenshteinAutomata#MAXIMUM_SUPPORTED_DISTANCE}
|
||||
* edits. Higher distances are not supported. Note that the
|
||||
* fuzzy distance is measured in "byte space" on the bytes
|
||||
* returned by the {@link org.apache.lucene.analysis.TokenStream}'s {@link
|
||||
* org.apache.lucene.analysis.tokenattributes.TermToBytesRefAttribute}, usually UTF8. By default
|
||||
* the analyzed bytes must be at least 3 {@link
|
||||
* #DEFAULT_MIN_FUZZY_LENGTH} bytes before any edits are
|
||||
* considered. Furthermore, the first 1 {@link
|
||||
* #DEFAULT_NON_FUZZY_PREFIX} byte is not allowed to be
|
||||
* edited. We allow up to 1 {@link
|
||||
* #DEFAULT_MAX_EDITS} edit.
|
||||
* If {@link #unicodeAware} parameter in the constructor is set to true, maxEdits,
|
||||
* minFuzzyLength, transpositions and nonFuzzyPrefix are measured in Unicode code
|
||||
* points (actual letters) instead of bytes.*
|
||||
*
|
||||
* <p>
|
||||
* NOTE: This suggester does not boost suggestions that
|
||||
* required no edits over suggestions that did require
|
||||
* edits. This is a known limitation.
|
||||
*
|
||||
* <p>
|
||||
* Note: complex query analyzers can have a significant impact on the lookup
|
||||
* performance. It's recommended to not use analyzers that drop or inject terms
|
||||
* like synonyms to keep the complexity of the prefix intersection low for good
|
||||
* lookup performance. At index time, complex analyzers can safely be used.
|
||||
* </p>
|
||||
*/
|
||||
public final class XFuzzySuggester extends XAnalyzingSuggester {
|
||||
private final int maxEdits;
|
||||
private final boolean transpositions;
|
||||
private final int nonFuzzyPrefix;
|
||||
private final int minFuzzyLength;
|
||||
private final boolean unicodeAware;
|
||||
|
||||
/**
|
||||
* Measure maxEdits, minFuzzyLength, transpositions and nonFuzzyPrefix
|
||||
* parameters in Unicode code points (actual letters)
|
||||
* instead of bytes.
|
||||
*/
|
||||
public static final boolean DEFAULT_UNICODE_AWARE = false;
|
||||
|
||||
/**
|
||||
* The default minimum length of the key passed to {@link
|
||||
* #lookup} before any edits are allowed.
|
||||
*/
|
||||
public static final int DEFAULT_MIN_FUZZY_LENGTH = 3;
|
||||
|
||||
/**
|
||||
* The default prefix length where edits are not allowed.
|
||||
*/
|
||||
public static final int DEFAULT_NON_FUZZY_PREFIX = 1;
|
||||
|
||||
/**
|
||||
* The default maximum number of edits for fuzzy
|
||||
* suggestions.
|
||||
*/
|
||||
public static final int DEFAULT_MAX_EDITS = 1;
|
||||
|
||||
/**
|
||||
* The default transposition value passed to {@link org.apache.lucene.util.automaton.LevenshteinAutomata}
|
||||
*/
|
||||
public static final boolean DEFAULT_TRANSPOSITIONS = true;
|
||||
|
||||
/**
|
||||
* Creates a {@link FuzzySuggester} instance initialized with default values.
|
||||
*
|
||||
* @param analyzer the analyzer used for this suggester
|
||||
*/
|
||||
public XFuzzySuggester(Analyzer analyzer) {
|
||||
this(analyzer, analyzer);
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a {@link FuzzySuggester} instance with an index & a query analyzer initialized with default values.
|
||||
*
|
||||
* @param indexAnalyzer
|
||||
* Analyzer that will be used for analyzing suggestions while building the index.
|
||||
* @param queryAnalyzer
|
||||
* Analyzer that will be used for analyzing query text during lookup
|
||||
*/
|
||||
public XFuzzySuggester(Analyzer indexAnalyzer, Analyzer queryAnalyzer) {
|
||||
this(indexAnalyzer, null, queryAnalyzer, EXACT_FIRST | PRESERVE_SEP, 256, -1,
|
||||
DEFAULT_MAX_EDITS, DEFAULT_TRANSPOSITIONS,
|
||||
DEFAULT_NON_FUZZY_PREFIX, DEFAULT_MIN_FUZZY_LENGTH, DEFAULT_UNICODE_AWARE,
|
||||
null, false, 0, SEP_LABEL, PAYLOAD_SEP, END_BYTE, HOLE_CHARACTER);
|
||||
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a {@link FuzzySuggester} instance.
|
||||
*
|
||||
* @param indexAnalyzer Analyzer that will be used for
|
||||
* analyzing suggestions while building the index.
|
||||
* @param queryAnalyzer Analyzer that will be used for
|
||||
* analyzing query text during lookup
|
||||
* @param options see {@link #EXACT_FIRST}, {@link #PRESERVE_SEP}
|
||||
* @param maxSurfaceFormsPerAnalyzedForm Maximum number of
|
||||
* surface forms to keep for a single analyzed form.
|
||||
* When there are too many surface forms we discard the
|
||||
* lowest weighted ones.
|
||||
* @param maxGraphExpansions Maximum number of graph paths
|
||||
* to expand from the analyzed form. Set this to -1 for
|
||||
* no limit.
|
||||
* @param maxEdits must be >= 0 and <= {@link org.apache.lucene.util.automaton.LevenshteinAutomata#MAXIMUM_SUPPORTED_DISTANCE} .
|
||||
* @param transpositions <code>true</code> if transpositions should be treated as a primitive
|
||||
* edit operation. If this is false, comparisons will implement the classic
|
||||
* Levenshtein algorithm.
|
||||
* @param nonFuzzyPrefix length of common (non-fuzzy) prefix (see default {@link #DEFAULT_NON_FUZZY_PREFIX}
|
||||
* @param minFuzzyLength minimum length of lookup key before any edits are allowed (see default {@link #DEFAULT_MIN_FUZZY_LENGTH})
|
||||
* @param sepLabel separation label
|
||||
* @param payloadSep payload separator byte
|
||||
* @param endByte end byte marker byte
|
||||
*/
|
||||
public XFuzzySuggester(Analyzer indexAnalyzer, Automaton queryPrefix, Analyzer queryAnalyzer,
|
||||
int options, int maxSurfaceFormsPerAnalyzedForm, int maxGraphExpansions,
|
||||
int maxEdits, boolean transpositions, int nonFuzzyPrefix, int minFuzzyLength,
|
||||
boolean unicodeAware, FST<PairOutputs.Pair<Long, BytesRef>> fst, boolean hasPayloads,
|
||||
int maxAnalyzedPathsForOneInput, int sepLabel, int payloadSep, int endByte, int holeCharacter) {
|
||||
super(indexAnalyzer, queryPrefix, queryAnalyzer, options, maxSurfaceFormsPerAnalyzedForm, maxGraphExpansions,
|
||||
true, fst, hasPayloads, maxAnalyzedPathsForOneInput, sepLabel, payloadSep, endByte, holeCharacter);
|
||||
if (maxEdits < 0 || maxEdits > LevenshteinAutomata.MAXIMUM_SUPPORTED_DISTANCE) {
|
||||
throw new IllegalArgumentException(
|
||||
"maxEdits must be between 0 and " + LevenshteinAutomata.MAXIMUM_SUPPORTED_DISTANCE);
|
||||
}
|
||||
if (nonFuzzyPrefix < 0) {
|
||||
throw new IllegalArgumentException("nonFuzzyPrefix must not be >= 0 (got " + nonFuzzyPrefix + ")");
|
||||
}
|
||||
if (minFuzzyLength < 0) {
|
||||
throw new IllegalArgumentException("minFuzzyLength must not be >= 0 (got " + minFuzzyLength + ")");
|
||||
}
|
||||
|
||||
this.maxEdits = maxEdits;
|
||||
this.transpositions = transpositions;
|
||||
this.nonFuzzyPrefix = nonFuzzyPrefix;
|
||||
this.minFuzzyLength = minFuzzyLength;
|
||||
this.unicodeAware = unicodeAware;
|
||||
}
|
||||
|
||||
@Override
|
||||
protected List<FSTUtil.Path<PairOutputs.Pair<Long,BytesRef>>> getFullPrefixPaths(
|
||||
List<FSTUtil.Path<PairOutputs.Pair<Long,BytesRef>>> prefixPaths, Automaton lookupAutomaton,
|
||||
FST<PairOutputs.Pair<Long,BytesRef>> fst)
|
||||
throws IOException {
|
||||
|
||||
// TODO: right now there's no penalty for fuzzy/edits,
|
||||
// ie a completion whose prefix matched exactly what the
|
||||
// user typed gets no boost over completions that
|
||||
// required an edit, which get no boost over completions
|
||||
// requiring two edits. I suspect a multiplicative
|
||||
// factor is appropriate (eg, say a fuzzy match must be at
|
||||
// least 2X better weight than the non-fuzzy match to
|
||||
// "compete") ... in which case I think the wFST needs
|
||||
// to be log weights or something ...
|
||||
|
||||
Automaton levA = convertAutomaton(toLevenshteinAutomata(lookupAutomaton));
|
||||
/*
|
||||
Writer w = new OutputStreamWriter(new FileOutputStream("out.dot"), "UTF-8");
|
||||
w.write(levA.toDot());
|
||||
w.close();
|
||||
System.out.println("Wrote LevA to out.dot");
|
||||
*/
|
||||
return FSTUtil.intersectPrefixPaths(levA, fst);
|
||||
}
|
||||
|
||||
@Override
|
||||
protected Automaton convertAutomaton(Automaton a) {
|
||||
if (unicodeAware) {
|
||||
// FLORIAN EDIT: get converted Automaton from superclass
|
||||
Automaton utf8automaton = new UTF32ToUTF8().convert(super.convertAutomaton(a));
|
||||
// This automaton should not blow up during determinize:
|
||||
utf8automaton = Operations.determinize(utf8automaton, Integer.MAX_VALUE);
|
||||
return utf8automaton;
|
||||
} else {
|
||||
return super.convertAutomaton(a);
|
||||
}
|
||||
}
|
||||
|
||||
@Override
|
||||
public TokenStreamToAutomaton getTokenStreamToAutomaton() {
|
||||
final TokenStreamToAutomaton tsta = super.getTokenStreamToAutomaton();
|
||||
tsta.setUnicodeArcs(unicodeAware);
|
||||
return tsta;
|
||||
}
|
||||
|
||||
Automaton toLevenshteinAutomata(Automaton automaton) {
|
||||
List<Automaton> subs = new ArrayList<>();
|
||||
FiniteStringsIterator finiteStrings = new FiniteStringsIterator(automaton);
|
||||
for (IntsRef string; (string = finiteStrings.next()) != null;) {
|
||||
if (string.length <= nonFuzzyPrefix || string.length < minFuzzyLength) {
|
||||
subs.add(Automata.makeString(string.ints, string.offset, string.length));
|
||||
} else {
|
||||
int ints[] = new int[string.length-nonFuzzyPrefix];
|
||||
System.arraycopy(string.ints, string.offset+nonFuzzyPrefix, ints, 0, ints.length);
|
||||
// TODO: maybe add alphaMin to LevenshteinAutomata,
|
||||
// and pass 1 instead of 0? We probably don't want
|
||||
// to allow the trailing dedup bytes to be
|
||||
// edited... but then 0 byte is "in general" allowed
|
||||
// on input (but not in UTF8).
|
||||
LevenshteinAutomata lev = new LevenshteinAutomata(
|
||||
ints, unicodeAware ? Character.MAX_CODE_POINT : 255, transpositions);
|
||||
subs.add(lev.toAutomaton(maxEdits, UnicodeUtil.newString(string.ints, string.offset, nonFuzzyPrefix)));
|
||||
}
|
||||
}
|
||||
|
||||
if (subs.isEmpty()) {
|
||||
// automaton is empty, there is no accepted paths through it
|
||||
return Automata.makeEmpty(); // matches nothing
|
||||
} else if (subs.size() == 1) {
|
||||
// no synonyms or anything: just a single path through the tokenstream
|
||||
return subs.get(0);
|
||||
} else {
|
||||
// multiple paths: this is really scary! is it slow?
|
||||
// maybe we should not do this and throw UOE?
|
||||
Automaton a = Operations.union(subs);
|
||||
// TODO: we could call toLevenshteinAutomata() before det?
|
||||
// this only happens if you have multiple paths anyway (e.g. synonyms)
|
||||
return Operations.determinize(a, DEFAULT_MAX_DETERMINIZED_STATES);
|
||||
}
|
||||
}
|
||||
}
|
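A sketch of driving this suggester through the standard Lookup API; the StandardAnalyzer choice and the suggestions.txt dictionary (one "surface form<TAB>weight" entry per line) are assumptions for illustration, not part of this change:

import java.io.FileInputStream;
import java.util.List;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.search.suggest.FileDictionary;
import org.apache.lucene.search.suggest.Lookup.LookupResult;
import org.apache.lucene.search.suggest.analyzing.XFuzzySuggester;

public class XFuzzySuggesterExample {
    public static void main(String[] args) throws Exception {
        XFuzzySuggester suggester = new XFuzzySuggester(new StandardAnalyzer());
        // Build the FST from a weighted dictionary file.
        suggester.build(new FileDictionary(new FileInputStream("suggestions.txt")));
        // One edit outside the first byte is tolerated once the key reaches 3 bytes,
        // so "elasticsaerch" can still reach an "elasticsearch" entry.
        List<LookupResult> hits = suggester.lookup("elasticsaerch", false, 5);
        for (LookupResult hit : hits) {
            System.out.println(hit.key + " -> " + hit.value);
        }
    }
}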
@ -0,0 +1,124 @@
|
||||
/*
|
||||
* Licensed to the Apache Software Foundation (ASF) under one or more
|
||||
* contributor license agreements. See the NOTICE file distributed with
|
||||
* this work for additional information regarding copyright ownership.
|
||||
* The ASF licenses this file to You under the Apache License, Version 2.0
|
||||
* (the "License"); you may not use this file except in compliance with
|
||||
* the License. You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.lucene.spatial.geopoint.search;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.search.BooleanClause;
import org.apache.lucene.search.BooleanQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.spatial.geopoint.document.GeoPointField.TermEncoding;

/** Implements a point distance range query on a GeoPoint field. This is based on
 * {@code org.apache.lucene.spatial.geopoint.search.GeoPointDistanceQuery} and is implemented using a
 * {@code org.apache.lucene.search.BooleanClause.MUST_NOT} clause to exclude any points that fall within
 * minRadiusMeters from the provided point.
 * <p>
 * NOTE: this query does not correctly support multi-value docs (see: https://issues.apache.org/jira/browse/LUCENE-7126)
 * <br>
 * TODO: remove this per ISSUE #17658
 **/
public final class XGeoPointDistanceRangeQuery extends GeoPointDistanceQuery {
  /** minimum distance range (in meters) from lat, lon center location, maximum is inherited */
  protected final double minRadiusMeters;

  /**
   * Constructs a query for all {@link org.apache.lucene.spatial.geopoint.document.GeoPointField} types within a minimum / maximum
   * distance (in meters) range from a given point
   */
  public XGeoPointDistanceRangeQuery(final String field, final double centerLat, final double centerLon,
                                     final double minRadiusMeters, final double maxRadiusMeters) {
    this(field, TermEncoding.PREFIX, centerLat, centerLon, minRadiusMeters, maxRadiusMeters);
  }

  /**
   * Constructs a query for all {@link org.apache.lucene.spatial.geopoint.document.GeoPointField} types within a minimum / maximum
   * distance (in meters) range from a given point. Accepts an optional
   * {@link org.apache.lucene.spatial.geopoint.document.GeoPointField.TermEncoding}
   */
  public XGeoPointDistanceRangeQuery(final String field, final TermEncoding termEncoding, final double centerLat, final double centerLon,
                                     final double minRadiusMeters, final double maxRadius) {
    super(field, termEncoding, centerLat, centerLon, maxRadius);
    this.minRadiusMeters = minRadiusMeters;
  }

  @Override
  public Query rewrite(IndexReader reader) {
    Query q = super.rewrite(reader);
    if (minRadiusMeters == 0.0) {
      return q;
    }

    // add an exclusion query
    BooleanQuery.Builder bqb = new BooleanQuery.Builder();

    // create a new exclusion query
    GeoPointDistanceQuery exclude = new GeoPointDistanceQuery(field, termEncoding, centerLat, centerLon, minRadiusMeters);
    // full map search
//    if (radiusMeters >= GeoProjectionUtils.SEMIMINOR_AXIS) {
//      bqb.add(new BooleanClause(new GeoPointInBBoxQuery(this.field, -180.0, -90.0, 180.0, 90.0), BooleanClause.Occur.MUST));
//    } else {
      bqb.add(new BooleanClause(q, BooleanClause.Occur.MUST));
//    }
    bqb.add(new BooleanClause(exclude, BooleanClause.Occur.MUST_NOT));

    return bqb.build();
  }

  @Override
  public String toString(String field) {
    final StringBuilder sb = new StringBuilder();
    sb.append(getClass().getSimpleName());
    sb.append(':');
    if (!this.field.equals(field)) {
      sb.append(" field=");
      sb.append(this.field);
      sb.append(':');
    }
    return sb.append( " Center: [")
        .append(centerLat)
        .append(',')
        .append(centerLon)
        .append(']')
        .append(" From Distance: ")
        .append(minRadiusMeters)
        .append(" m")
        .append(" To Distance: ")
        .append(radiusMeters)
        .append(" m")
        .append(" Lower Left: [")
        .append(minLat)
        .append(',')
        .append(minLon)
        .append(']')
        .append(" Upper Right: [")
        .append(maxLat)
        .append(',')
        .append(maxLon)
        .append("]")
        .toString();
  }

  /** getter method for minimum distance */
  public double getMinRadiusMeters() {
    return this.minRadiusMeters;
  }

  /** getter method for maximum distance */
  public double getMaxRadiusMeters() {
    return this.radiusMeters;
  }
}
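
A minimal usage sketch of the class above (not part of the commit; the field name, coordinates and radii are illustrative). It shows what rewrite() produces for a non-zero minimum radius:

  // Ring query: hits between 1 km and 5 km around a point indexed on field "location".
  Query ring = new XGeoPointDistanceRangeQuery("location", 52.52, 13.40, 1000.0, 5000.0);
  // When the searcher rewrites it, the result is roughly:
  //   BooleanQuery( MUST(points within 5000 m), MUST_NOT(points within 1000 m) )
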
@ -128,4 +128,33 @@ public class Build {
    public String toString() {
        return "[" + shortHash + "][" + date + "]";
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) {
            return true;
        }
        if (o == null || getClass() != o.getClass()) {
            return false;
        }

        Build build = (Build) o;

        if (isSnapshot != build.isSnapshot) {
            return false;
        }
        if (!shortHash.equals(build.shortHash)) {
            return false;
        }
        return date.equals(build.date);

    }

    @Override
    public int hashCode() {
        int result = (isSnapshot ? 1 : 0);
        result = 31 * result + shortHash.hashCode();
        result = 31 * result + date.hashCode();
        return result;
    }
}
@ -19,10 +19,12 @@

package org.elasticsearch;

import org.elasticsearch.action.support.replication.ReplicationOperation;
import org.elasticsearch.cluster.action.shard.ShardStateAction;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.io.stream.Writeable;
import org.elasticsearch.common.logging.LoggerMessageFormat;
import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
@ -46,7 +48,7 @@ import static org.elasticsearch.cluster.metadata.IndexMetaData.INDEX_UUID_NA_VAL
/**
 * A base class for all elasticsearch exceptions.
 */
public class ElasticsearchException extends RuntimeException implements ToXContent {
public class ElasticsearchException extends RuntimeException implements ToXContent, Writeable {

    public static final String REST_EXCEPTION_SKIP_CAUSE = "rest.exception.cause.skip";
    public static final String REST_EXCEPTION_SKIP_STACK_TRACE = "rest.exception.stacktrace.skip";
@ -234,6 +236,7 @@ public class ElasticsearchException extends RuntimeException implements ToXConte
        }
    }

    @Override
    public void writeTo(StreamOutput out) throws IOException {
        out.writeOptionalString(this.getMessage());
        out.writeThrowable(this.getCause());
@ -412,7 +415,8 @@ public class ElasticsearchException extends RuntimeException implements ToXConte
        if (simpleName.startsWith("Elasticsearch")) {
            simpleName = simpleName.substring("Elasticsearch".length());
        }
        return Strings.toUnderscoreCase(simpleName);
        // TODO: do we really need to make the exception name in underscore casing?
        return toUnderscoreCase(simpleName);
    }

    @Override
@ -595,8 +599,7 @@ public class ElasticsearchException extends RuntimeException implements ToXConte
                org.elasticsearch.common.util.concurrent.EsRejectedExecutionException::new, 59),
        EARLY_TERMINATION_EXCEPTION(org.elasticsearch.common.lucene.Lucene.EarlyTerminationException.class,
                org.elasticsearch.common.lucene.Lucene.EarlyTerminationException::new, 60),
        ROUTING_VALIDATION_EXCEPTION(org.elasticsearch.cluster.routing.RoutingValidationException.class,
                org.elasticsearch.cluster.routing.RoutingValidationException::new, 61),
        // 61 used to be for RoutingValidationException
        NOT_SERIALIZABLE_EXCEPTION_WRAPPER(org.elasticsearch.common.io.stream.NotSerializableExceptionWrapper.class,
                org.elasticsearch.common.io.stream.NotSerializableExceptionWrapper::new, 62),
        ALIAS_FILTER_PARSING_EXCEPTION(org.elasticsearch.indices.AliasFilterParsingException.class,
@ -696,8 +699,8 @@ public class ElasticsearchException extends RuntimeException implements ToXConte
                org.elasticsearch.index.translog.TranslogException::new, 115),
        PROCESS_CLUSTER_EVENT_TIMEOUT_EXCEPTION(org.elasticsearch.cluster.metadata.ProcessClusterEventTimeoutException.class,
                org.elasticsearch.cluster.metadata.ProcessClusterEventTimeoutException::new, 116),
        RETRY_ON_PRIMARY_EXCEPTION(org.elasticsearch.action.support.replication.TransportReplicationAction.RetryOnPrimaryException.class,
                org.elasticsearch.action.support.replication.TransportReplicationAction.RetryOnPrimaryException::new, 117),
        RETRY_ON_PRIMARY_EXCEPTION(ReplicationOperation.RetryOnPrimaryException.class,
                ReplicationOperation.RetryOnPrimaryException::new, 117),
        ELASTICSEARCH_TIMEOUT_EXCEPTION(org.elasticsearch.ElasticsearchTimeoutException.class,
                org.elasticsearch.ElasticsearchTimeoutException::new, 118),
        QUERY_PHASE_EXECUTION_EXCEPTION(org.elasticsearch.search.query.QueryPhaseExecutionException.class,
@ -845,4 +848,39 @@ public class ElasticsearchException extends RuntimeException implements ToXConte
    interface FunctionThatThrowsIOException<T, R> {
        R apply(T t) throws IOException;
    }

    // lower cases and adds underscores to transitions in a name
    private static String toUnderscoreCase(String value) {
        StringBuilder sb = new StringBuilder();
        boolean changed = false;
        for (int i = 0; i < value.length(); i++) {
            char c = value.charAt(i);
            if (Character.isUpperCase(c)) {
                if (!changed) {
                    // copy it over here
                    for (int j = 0; j < i; j++) {
                        sb.append(value.charAt(j));
                    }
                    changed = true;
                    if (i == 0) {
                        sb.append(Character.toLowerCase(c));
                    } else {
                        sb.append('_');
                        sb.append(Character.toLowerCase(c));
                    }
                } else {
                    sb.append('_');
                    sb.append(Character.toLowerCase(c));
                }
            } else {
                if (changed) {
                    sb.append(c);
                }
            }
        }
        if (!changed) {
            return value;
        }
        return sb.toString();
    }
}
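
The private toUnderscoreCase above replaces the removed call to Strings.toUnderscoreCase: it lower-cases a CamelCase simple name and inserts an underscore at each case transition, returning the input unchanged if it contains no upper-case letter. An illustrative check (inputs chosen here, not part of the commit):

  // toUnderscoreCase("TimeoutException")    -> "timeout_exception"
  // toUnderscoreCase("already_underscored") -> "already_underscored" (returned unchanged)
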
@ -34,12 +34,11 @@ import java.io.IOException;
 */
@SuppressWarnings("deprecation")
public class Version {

    // The logic for ID is: XXYYZZAA, where XX is major version, YY is minor version, ZZ is revision, and AA is alpha/beta/rc indicator
    // AA values below 25 are for alpha builder (since 5.0), and above 25 and below 50 are beta builds, and below 99 are RC builds, with 99 indicating a release
    // the (internal) format of the id is there so we can easily do after/before checks on the id


    /*
     * The logic for ID is: XXYYZZAA, where XX is major version, YY is minor version, ZZ is revision, and AA is alpha/beta/rc indicator AA
     * values below 25 are for alpha builder (since 5.0), and above 25 and below 50 are beta builds, and below 99 are RC builds, with 99
     * indicating a release the (internal) format of the id is there so we can easily do after/before checks on the id
     */
    public static final int V_2_0_0_beta1_ID = 2000001;
    public static final Version V_2_0_0_beta1 = new Version(V_2_0_0_beta1_ID, org.apache.lucene.util.Version.LUCENE_5_2_1);
    public static final int V_2_0_0_beta2_ID = 2000002;
@ -62,11 +61,21 @@ public class Version {
    public static final Version V_2_2_0 = new Version(V_2_2_0_ID, org.apache.lucene.util.Version.LUCENE_5_4_1);
    public static final int V_2_2_1_ID = 2020199;
    public static final Version V_2_2_1 = new Version(V_2_2_1_ID, org.apache.lucene.util.Version.LUCENE_5_4_1);
    public static final int V_2_2_2_ID = 2020299;
    public static final Version V_2_2_2 = new Version(V_2_2_2_ID, org.apache.lucene.util.Version.LUCENE_5_4_1);
    public static final int V_2_3_0_ID = 2030099;
    public static final Version V_2_3_0 = new Version(V_2_3_0_ID, org.apache.lucene.util.Version.LUCENE_5_5_0);
    public static final int V_2_3_1_ID = 2030199;
    public static final Version V_2_3_1 = new Version(V_2_3_1_ID, org.apache.lucene.util.Version.LUCENE_5_5_0);
    public static final int V_2_3_2_ID = 2030299;
    public static final Version V_2_3_2 = new Version(V_2_3_2_ID, org.apache.lucene.util.Version.LUCENE_5_5_0);
    public static final int V_5_0_0_alpha1_ID = 5000001;
    public static final Version V_5_0_0_alpha1 = new Version(V_5_0_0_alpha1_ID, org.apache.lucene.util.Version.LUCENE_6_0_0);
    public static final Version CURRENT = V_5_0_0_alpha1;
    public static final int V_5_0_0_alpha2_ID = 5000002;
    public static final Version V_5_0_0_alpha2 = new Version(V_5_0_0_alpha2_ID, org.apache.lucene.util.Version.LUCENE_6_0_0);
    public static final int V_5_0_0_ID = 5000099;
    public static final Version V_5_0_0 = new Version(V_5_0_0_ID, org.apache.lucene.util.Version.LUCENE_6_0_0);
    public static final Version CURRENT = V_5_0_0;

    static {
        assert CURRENT.luceneVersion.equals(org.apache.lucene.util.Version.LATEST) : "Version must be upgraded to ["
@ -79,10 +88,20 @@ public class Version {

    public static Version fromId(int id) {
        switch (id) {
            case V_5_0_0_ID:
                return V_5_0_0;
            case V_5_0_0_alpha2_ID:
                return V_5_0_0_alpha2;
            case V_5_0_0_alpha1_ID:
                return V_5_0_0_alpha1;
            case V_2_3_2_ID:
                return V_2_3_2;
            case V_2_3_1_ID:
                return V_2_3_1;
            case V_2_3_0_ID:
                return V_2_3_0;
            case V_2_2_2_ID:
                return V_2_2_2;
            case V_2_2_1_ID:
                return V_2_2_1;
            case V_2_2_0_ID:
@ -113,12 +132,15 @@ public class Version {
    /**
     * Return the {@link Version} of Elasticsearch that has been used to create an index given its settings.
     *
     * @throws IllegalStateException if the given index settings doesn't contain a value for the key {@value IndexMetaData#SETTING_VERSION_CREATED}
     * @throws IllegalStateException if the given index settings doesn't contain a value for the key
     *         {@value IndexMetaData#SETTING_VERSION_CREATED}
     */
    public static Version indexCreated(Settings indexSettings) {
        final Version indexVersion = indexSettings.getAsVersion(IndexMetaData.SETTING_VERSION_CREATED, null);
        if (indexVersion == null) {
            throw new IllegalStateException("[" + IndexMetaData.SETTING_VERSION_CREATED + "] is not present in the index settings for index with uuid: [" + indexSettings.get(IndexMetaData.SETTING_INDEX_UUID) + "]");
            throw new IllegalStateException(
                "[" + IndexMetaData.SETTING_VERSION_CREATED + "] is not present in the index settings for index with uuid: ["
                    + indexSettings.get(IndexMetaData.SETTING_INDEX_UUID) + "]");
        }
        return indexVersion;
    }
@ -147,7 +169,8 @@ public class Version {
        }
        String[] parts = version.split("\\.|\\-");
        if (parts.length < 3 || parts.length > 4) {
            throw new IllegalArgumentException("the version needs to contain major, minor, and revision, and optionally the build: " + version);
            throw new IllegalArgumentException(
                "the version needs to contain major, minor, and revision, and optionally the build: " + version);
        }

        try {
@ -231,7 +254,8 @@ public class Version {

    @SuppressForbidden(reason = "System.out.*")
    public static void main(String[] args) {
        System.out.println("Version: " + Version.CURRENT + ", Build: " + Build.CURRENT.shortHash() + "/" + Build.CURRENT.date() + ", JVM: " + JvmInfo.jvmInfo().version());
        System.out.println("Version: " + Version.CURRENT + ", Build: " + Build.CURRENT.shortHash() + "/" + Build.CURRENT.date() + ", JVM: "
            + JvmInfo.jvmInfo().version());
    }

    @Override
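
To make the XXYYZZAA comment above concrete: V_2_3_2_ID = 2030299 decodes to major 2, minor 3, revision 2 with release marker 99, and V_5_0_0_alpha1_ID = 5000001 to 5.0.0 with alpha marker 1. A small illustrative decoder (not part of Version; shown only to document the scheme):

  // Decodes the XXYYZZAA id scheme used by the constants above.
  static String describe(int id) {
      int major = id / 1000000;
      int minor = (id / 10000) % 100;
      int revision = (id / 100) % 100;
      int marker = id % 100; // <25 alpha, 25-49 beta, 50-98 rc, 99 release
      return major + "." + minor + "." + revision + " (marker " + marker + ")";
  }
  // describe(2030299) -> "2.3.2 (marker 99)"    describe(5000001) -> "5.0.0 (marker 1)"
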
@ -24,16 +24,21 @@ import org.elasticsearch.transport.BaseTransportResponseHandler;
import org.elasticsearch.transport.TransportException;
import org.elasticsearch.transport.TransportResponse;

import java.util.Objects;
import java.util.function.Supplier;

/**
 * A simple base class for action response listeners, defaulting to using the SAME executor (as its
 * very common on response handlers).
 */
public abstract class ActionListenerResponseHandler<Response extends TransportResponse> extends BaseTransportResponseHandler<Response> {
public class ActionListenerResponseHandler<Response extends TransportResponse> extends BaseTransportResponseHandler<Response> {

    private final ActionListener<Response> listener;
    private final Supplier<Response> responseSupplier;

    public ActionListenerResponseHandler(ActionListener<Response> listener) {
        this.listener = listener;
    public ActionListenerResponseHandler(ActionListener<Response> listener, Supplier<Response> responseSupplier) {
        this.listener = Objects.requireNonNull(listener);
        this.responseSupplier = Objects.requireNonNull(responseSupplier);
    }

    @Override
@ -46,6 +51,11 @@ public abstract class ActionListenerResponseHandler<Response extends TransportRe
        listener.onFailure(e);
    }

    @Override
    public Response newInstance() {
        return responseSupplier.get();
    }

    @Override
    public String executor() {
        return ThreadPool.Names.SAME;
@ -19,6 +19,8 @@

package org.elasticsearch.action;

import org.elasticsearch.action.admin.cluster.allocation.ClusterAllocationExplainAction;
import org.elasticsearch.action.admin.cluster.allocation.TransportClusterAllocationExplainAction;
import org.elasticsearch.action.admin.cluster.health.ClusterHealthAction;
import org.elasticsearch.action.admin.cluster.health.TransportClusterHealthAction;
import org.elasticsearch.action.admin.cluster.node.hotthreads.NodesHotThreadsAction;
@ -145,12 +147,12 @@ import org.elasticsearch.action.get.TransportMultiGetAction;
import org.elasticsearch.action.get.TransportShardMultiGetAction;
import org.elasticsearch.action.index.IndexAction;
import org.elasticsearch.action.index.TransportIndexAction;
import org.elasticsearch.action.indexedscripts.delete.DeleteIndexedScriptAction;
import org.elasticsearch.action.indexedscripts.delete.TransportDeleteIndexedScriptAction;
import org.elasticsearch.action.indexedscripts.get.GetIndexedScriptAction;
import org.elasticsearch.action.indexedscripts.get.TransportGetIndexedScriptAction;
import org.elasticsearch.action.indexedscripts.put.PutIndexedScriptAction;
import org.elasticsearch.action.indexedscripts.put.TransportPutIndexedScriptAction;
import org.elasticsearch.action.admin.cluster.storedscripts.DeleteStoredScriptAction;
import org.elasticsearch.action.admin.cluster.storedscripts.TransportDeleteStoredScriptAction;
import org.elasticsearch.action.admin.cluster.storedscripts.GetStoredScriptAction;
import org.elasticsearch.action.admin.cluster.storedscripts.TransportGetStoredScriptAction;
import org.elasticsearch.action.admin.cluster.storedscripts.PutStoredScriptAction;
import org.elasticsearch.action.admin.cluster.storedscripts.TransportPutStoredScriptAction;
import org.elasticsearch.action.ingest.IngestActionFilter;
import org.elasticsearch.action.ingest.IngestProxyActionFilter;
import org.elasticsearch.action.ingest.DeletePipelineAction;
@ -161,6 +163,8 @@ import org.elasticsearch.action.ingest.PutPipelineAction;
import org.elasticsearch.action.ingest.PutPipelineTransportAction;
import org.elasticsearch.action.ingest.SimulatePipelineAction;
import org.elasticsearch.action.ingest.SimulatePipelineTransportAction;
import org.elasticsearch.action.main.MainAction;
import org.elasticsearch.action.main.TransportMainAction;
import org.elasticsearch.action.percolate.MultiPercolateAction;
import org.elasticsearch.action.percolate.PercolateAction;
import org.elasticsearch.action.percolate.TransportMultiPercolateAction;
@ -257,12 +261,14 @@ public class ActionModule extends AbstractModule {
        bind(ActionFilters.class).asEagerSingleton();
        bind(AutoCreateIndex.class).asEagerSingleton();
        bind(DestructiveOperations.class).asEagerSingleton();
        registerAction(MainAction.INSTANCE, TransportMainAction.class);
        registerAction(NodesInfoAction.INSTANCE, TransportNodesInfoAction.class);
        registerAction(NodesStatsAction.INSTANCE, TransportNodesStatsAction.class);
        registerAction(NodesHotThreadsAction.INSTANCE, TransportNodesHotThreadsAction.class);
        registerAction(ListTasksAction.INSTANCE, TransportListTasksAction.class);
        registerAction(CancelTasksAction.INSTANCE, TransportCancelTasksAction.class);

        registerAction(ClusterAllocationExplainAction.INSTANCE, TransportClusterAllocationExplainAction.class);
        registerAction(ClusterStatsAction.INSTANCE, TransportClusterStatsAction.class);
        registerAction(ClusterStateAction.INSTANCE, TransportClusterStateAction.class);
        registerAction(ClusterHealthAction.INSTANCE, TransportClusterHealthAction.class);
@ -334,9 +340,9 @@ public class ActionModule extends AbstractModule {
        registerAction(RenderSearchTemplateAction.INSTANCE, TransportRenderSearchTemplateAction.class);

        //Indexed scripts
        registerAction(PutIndexedScriptAction.INSTANCE, TransportPutIndexedScriptAction.class);
        registerAction(GetIndexedScriptAction.INSTANCE, TransportGetIndexedScriptAction.class);
        registerAction(DeleteIndexedScriptAction.INSTANCE, TransportDeleteIndexedScriptAction.class);
        registerAction(PutStoredScriptAction.INSTANCE, TransportPutStoredScriptAction.class);
        registerAction(GetStoredScriptAction.INSTANCE, TransportGetStoredScriptAction.class);
        registerAction(DeleteStoredScriptAction.INSTANCE, TransportDeleteStoredScriptAction.class);

        registerAction(FieldStatsAction.INSTANCE, TransportFieldStatsTransportAction.class);

@ -22,7 +22,6 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.xcontent.StatusToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentBuilderString;
import org.elasticsearch.index.seqno.SequenceNumbersService;
import org.elasticsearch.index.shard.ShardId;
import org.elasticsearch.rest.RestStatus;
@ -59,7 +58,6 @@ public abstract class DocWriteResponse extends ReplicationResponse implements St
        return this.shardId.getIndexName();
    }


    /**
     * The exact shard the document was changed in.
     */
@ -122,12 +120,12 @@ public abstract class DocWriteResponse extends ReplicationResponse implements St
    }

    static final class Fields {
        static final XContentBuilderString _INDEX = new XContentBuilderString("_index");
        static final XContentBuilderString _TYPE = new XContentBuilderString("_type");
        static final XContentBuilderString _ID = new XContentBuilderString("_id");
        static final XContentBuilderString _VERSION = new XContentBuilderString("_version");
        static final XContentBuilderString _SHARD_ID = new XContentBuilderString("_shard_id");
        static final XContentBuilderString _SEQ_NO = new XContentBuilderString("_seq_no");
        static final String _INDEX = "_index";
        static final String _TYPE = "_type";
        static final String _ID = "_id";
        static final String _VERSION = "_version";
        static final String _SHARD_ID = "_shard_id";
        static final String _SEQ_NO = "_seq_no";
    }

    @Override
@ -26,10 +26,8 @@ package org.elasticsearch.action;

public interface RealtimeRequest {

    /**
     * @param realtime Controls whether this request should be realtime by reading from the translog. If <code>null</code>
     *                 is specified then whether the operation will be realtime depends on the api of the concrete request
     *                 subclass.
     * @param realtime Controls whether this request should be realtime by reading from the translog.
     */
    <R extends RealtimeRequest> R realtime(Boolean realtime);
    <R extends RealtimeRequest> R realtime(boolean realtime);

}
@ -28,7 +28,6 @@ import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.io.stream.Streamable;
import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentBuilderString;
import org.elasticsearch.index.shard.ShardId;
import org.elasticsearch.rest.RestStatus;

@ -280,24 +279,24 @@ public class ReplicationResponse extends ActionResponse {

        private static class Fields {

            private static final XContentBuilderString _INDEX = new XContentBuilderString("_index");
            private static final XContentBuilderString _SHARD = new XContentBuilderString("_shard");
            private static final XContentBuilderString _NODE = new XContentBuilderString("_node");
            private static final XContentBuilderString REASON = new XContentBuilderString("reason");
            private static final XContentBuilderString STATUS = new XContentBuilderString("status");
            private static final XContentBuilderString PRIMARY = new XContentBuilderString("primary");
            private static final String _INDEX = "_index";
            private static final String _SHARD = "_shard";
            private static final String _NODE = "_node";
            private static final String REASON = "reason";
            private static final String STATUS = "status";
            private static final String PRIMARY = "primary";

        }
    }

    private static class Fields {

        private static final XContentBuilderString _SHARDS = new XContentBuilderString("_shards");
        private static final XContentBuilderString TOTAL = new XContentBuilderString("total");
        private static final XContentBuilderString SUCCESSFUL = new XContentBuilderString("successful");
        private static final XContentBuilderString PENDING = new XContentBuilderString("pending");
        private static final XContentBuilderString FAILED = new XContentBuilderString("failed");
        private static final XContentBuilderString FAILURES = new XContentBuilderString("failures");
        private static final String _SHARDS = "_shards";
        private static final String TOTAL = "total";
        private static final String SUCCESSFUL = "successful";
        private static final String PENDING = "pending";
        private static final String FAILED = "failed";
        private static final String FAILURES = "failures";

    }
}
@ -37,7 +37,7 @@ import static org.elasticsearch.ExceptionsHelper.detailedMessage;
 *
 * The class is final due to serialization limitations
 */
public final class TaskOperationFailure implements Writeable<TaskOperationFailure>, ToXContent {
public final class TaskOperationFailure implements Writeable, ToXContent {

    private final String nodeId;

@ -47,6 +47,16 @@ public final class TaskOperationFailure implements Writeable<TaskOperationFailur

    private final RestStatus status;

    public TaskOperationFailure(String nodeId, long taskId, Throwable t) {
        this.nodeId = nodeId;
        this.taskId = taskId;
        this.reason = t;
        status = ExceptionsHelper.status(t);
    }

    /**
     * Read from a stream.
     */
    public TaskOperationFailure(StreamInput in) throws IOException {
        nodeId = in.readString();
        taskId = in.readLong();
@ -54,11 +64,12 @@ public final class TaskOperationFailure implements Writeable<TaskOperationFailur
        status = RestStatus.readFrom(in);
    }

    public TaskOperationFailure(String nodeId, long taskId, Throwable t) {
        this.nodeId = nodeId;
        this.taskId = taskId;
        this.reason = t;
        status = ExceptionsHelper.status(t);
    @Override
    public void writeTo(StreamOutput out) throws IOException {
        out.writeString(nodeId);
        out.writeLong(taskId);
        out.writeThrowable(reason);
        RestStatus.writeTo(out, status);
    }

    public String getNodeId() {
@ -81,19 +92,6 @@ public final class TaskOperationFailure implements Writeable<TaskOperationFailur
        return reason;
    }

    @Override
    public TaskOperationFailure readFrom(StreamInput in) throws IOException {
        return new TaskOperationFailure(in);
    }

    @Override
    public void writeTo(StreamOutput out) throws IOException {
        out.writeString(nodeId);
        out.writeLong(taskId);
        out.writeThrowable(reason);
        RestStatus.writeTo(out, status);
    }

    @Override
    public String toString() {
        return "[" + nodeId + "][" + taskId + "] failed, reason [" + getReason() + "]";
@ -49,11 +49,7 @@ public class TransportActionNodeProxy<Request extends ActionRequest, Response ex
            listener.onFailure(validationException);
            return;
        }
        transportService.sendRequest(node, action.name(), request, transportOptions, new ActionListenerResponseHandler<Response>(listener) {
            @Override
            public Response newInstance() {
                return action.newResponse();
            }
        });
        transportService.sendRequest(node, action.name(), request, transportOptions,
            new ActionListenerResponseHandler<>(listener, action::newResponse));
    }
}
@ -0,0 +1,48 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.elasticsearch.action.admin.cluster.allocation;

import org.elasticsearch.action.Action;
import org.elasticsearch.client.ElasticsearchClient;

/**
 * Action for explaining shard allocation for a shard in the cluster
 */
public class ClusterAllocationExplainAction extends Action<ClusterAllocationExplainRequest,
                                                           ClusterAllocationExplainResponse,
                                                           ClusterAllocationExplainRequestBuilder> {

    public static final ClusterAllocationExplainAction INSTANCE = new ClusterAllocationExplainAction();
    public static final String NAME = "cluster:monitor/allocation/explain";

    private ClusterAllocationExplainAction() {
        super(NAME);
    }

    @Override
    public ClusterAllocationExplainResponse newResponse() {
        return new ClusterAllocationExplainResponse();
    }

    @Override
    public ClusterAllocationExplainRequestBuilder newRequestBuilder(ElasticsearchClient client) {
        return new ClusterAllocationExplainRequestBuilder(client, this);
    }
}
@ -0,0 +1,177 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.elasticsearch.action.admin.cluster.allocation;

import org.elasticsearch.ElasticsearchParseException;
import org.elasticsearch.action.ActionRequestValidationException;
import org.elasticsearch.action.support.master.MasterNodeRequest;
import org.elasticsearch.common.Nullable;
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.ParseFieldMatcher;
import org.elasticsearch.common.ParseFieldMatcherSupplier;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.xcontent.ObjectParser;
import org.elasticsearch.common.xcontent.XContentParser;

import java.io.IOException;

import static org.elasticsearch.action.ValidateActions.addValidationError;

/**
 * A request to explain the allocation of a shard in the cluster
 */
public class ClusterAllocationExplainRequest extends MasterNodeRequest<ClusterAllocationExplainRequest> {

    private static ObjectParser<ClusterAllocationExplainRequest, ParseFieldMatcherSupplier> PARSER = new ObjectParser(
            "cluster/allocation/explain");
    static {
        PARSER.declareString(ClusterAllocationExplainRequest::setIndex, new ParseField("index"));
        PARSER.declareInt(ClusterAllocationExplainRequest::setShard, new ParseField("shard"));
        PARSER.declareBoolean(ClusterAllocationExplainRequest::setPrimary, new ParseField("primary"));
    }

    private String index;
    private Integer shard;
    private Boolean primary;
    private boolean includeYesDecisions = false;

    /** Explain the first unassigned shard */
    public ClusterAllocationExplainRequest() {
        this.index = null;
        this.shard = null;
        this.primary = null;
    }

    /**
     * Create a new allocation explain request. If {@code primary} is false, the first unassigned replica
     * will be picked for explanation. If no replicas are unassigned, the first assigned replica will
     * be explained.
     */
    public ClusterAllocationExplainRequest(String index, int shard, boolean primary) {
        this.index = index;
        this.shard = shard;
        this.primary = primary;
    }

    @Override
    public ActionRequestValidationException validate() {
        ActionRequestValidationException validationException = null;
        if (this.useAnyUnassignedShard() == false) {
            if (this.index == null) {
                validationException = addValidationError("index must be specified", validationException);
            }
            if (this.shard == null) {
                validationException = addValidationError("shard must be specified", validationException);
            }
            if (this.primary == null) {
                validationException = addValidationError("primary must be specified", validationException);
            }
        }
        return validationException;
    }

    /**
     * Returns {@code true} iff the first unassigned shard is to be used
     */
    public boolean useAnyUnassignedShard() {
        return this.index == null && this.shard == null && this.primary == null;
    }

    public ClusterAllocationExplainRequest setIndex(String index) {
        this.index = index;
        return this;
    }

    @Nullable
    public String getIndex() {
        return this.index;
    }

    public ClusterAllocationExplainRequest setShard(Integer shard) {
        this.shard = shard;
        return this;
    }

    @Nullable
    public Integer getShard() {
        return this.shard;
    }

    public ClusterAllocationExplainRequest setPrimary(Boolean primary) {
        this.primary = primary;
        return this;
    }

    @Nullable
    public Boolean isPrimary() {
        return this.primary;
    }

    public void includeYesDecisions(boolean includeYesDecisions) {
        this.includeYesDecisions = includeYesDecisions;
    }

    /** Returns true if all decisions should be included. Otherwise only "NO" and "THROTTLE" decisions are returned */
    public boolean includeYesDecisions() {
        return this.includeYesDecisions;
    }

    @Override
    public String toString() {
        StringBuilder sb = new StringBuilder("ClusterAllocationExplainRequest[");
        if (this.useAnyUnassignedShard()) {
            sb.append("useAnyUnassignedShard=true");
        } else {
            sb.append("index=").append(index);
            sb.append(",shard=").append(shard);
            sb.append(",primary?=").append(primary);
        }
        sb.append(",includeYesDecisions?=").append(includeYesDecisions);
        return sb.toString();
    }

    public static ClusterAllocationExplainRequest parse(XContentParser parser) throws IOException {
        ClusterAllocationExplainRequest req = PARSER.parse(parser, new ClusterAllocationExplainRequest(), () -> ParseFieldMatcher.STRICT);
        Exception e = req.validate();
        if (e != null) {
            throw new ElasticsearchParseException("'index', 'shard', and 'primary' must be specified in allocation explain request", e);
        }
        return req;
    }

    @Override
    public void readFrom(StreamInput in) throws IOException {
        super.readFrom(in);
        this.index = in.readOptionalString();
        this.shard = in.readOptionalVInt();
        this.primary = in.readOptionalBoolean();
        this.includeYesDecisions = in.readBoolean();
    }

    @Override
    public void writeTo(StreamOutput out) throws IOException {
        super.writeTo(out);
        out.writeOptionalString(index);
        out.writeOptionalVInt(shard);
        out.writeOptionalBoolean(primary);
        out.writeBoolean(includeYesDecisions);
    }
}
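
The ObjectParser above accepts a flat body with index, shard and primary. A minimal sketch of the accepted body and the equivalent programmatic construction, using only the setters shown in the diff (the index name is illustrative):

  // Body accepted by ClusterAllocationExplainRequest.parse():
  //   { "index": "my-index", "shard": 0, "primary": true }
  ClusterAllocationExplainRequest request = new ClusterAllocationExplainRequest()
          .setIndex("my-index")
          .setShard(0)
          .setPrimary(true);
  request.includeYesDecisions(true); // also report "YES" decisions, not just NO/THROTTLE
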
@ -0,0 +1,66 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.elasticsearch.action.admin.cluster.allocation;

import org.elasticsearch.action.support.IndicesOptions;
import org.elasticsearch.action.support.master.MasterNodeOperationRequestBuilder;
import org.elasticsearch.client.ElasticsearchClient;

/**
 * Builder for requests to explain the allocation of a shard in the cluster
 */
public class ClusterAllocationExplainRequestBuilder
        extends MasterNodeOperationRequestBuilder<ClusterAllocationExplainRequest,
                                                  ClusterAllocationExplainResponse,
                                                  ClusterAllocationExplainRequestBuilder> {

    public ClusterAllocationExplainRequestBuilder(ElasticsearchClient client, ClusterAllocationExplainAction action) {
        super(client, action, new ClusterAllocationExplainRequest());
    }

    /** The index name to use when finding the shard to explain */
    public ClusterAllocationExplainRequestBuilder setIndex(String index) {
        request.setIndex(index);
        return this;
    }

    /** The shard number to use when finding the shard to explain */
    public ClusterAllocationExplainRequestBuilder setShard(int shard) {
        request.setShard(shard);
        return this;
    }

    /** Whether the primary or replica should be explained */
    public ClusterAllocationExplainRequestBuilder setPrimary(boolean primary) {
        request.setPrimary(primary);
        return this;
    }

    /**
     * Signal that the first unassigned shard should be used
     */
    public ClusterAllocationExplainRequestBuilder useAnyUnassignedShard() {
        request.setIndex(null);
        request.setShard(null);
        request.setPrimary(null);
        return this;
    }

}
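
A hedged sketch of driving the builder above from a client. The client variable is assumed to be an ElasticsearchClient, the index name is illustrative, and the blocking get() is assumed to come from the request-builder base class:

  ClusterAllocationExplainResponse response =
          new ClusterAllocationExplainRequestBuilder(client, ClusterAllocationExplainAction.INSTANCE)
                  .setIndex("my-index")
                  .setShard(0)
                  .setPrimary(true)
                  .get();
  ClusterAllocationExplanation explanation = response.getExplanation();
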
@ -17,48 +17,45 @@
 * under the License.
 */

package org.elasticsearch.action.support;
package org.elasticsearch.action.admin.cluster.allocation;

import org.elasticsearch.action.ActionResponse;
import org.elasticsearch.cluster.ClusterName;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.tasks.Task;
import org.elasticsearch.tasks.TaskId;
import org.elasticsearch.transport.TransportRequest;

import java.io.IOException;

/**
 * Base class for requests that can have associated child tasks
 * Explanation response for a shard in the cluster
 */
public class ChildTaskRequest extends TransportRequest {
public class ClusterAllocationExplainResponse extends ActionResponse {

    private TaskId parentTaskId = TaskId.EMPTY_TASK_ID;
    private ClusterAllocationExplanation cae;

    protected ChildTaskRequest() {
    public ClusterAllocationExplainResponse() {
    }

    public void setParentTask(String parentTaskNode, long parentTaskId) {
        this.parentTaskId = new TaskId(parentTaskNode, parentTaskId);
    public ClusterAllocationExplainResponse(ClusterAllocationExplanation cae) {
        this.cae = cae;
    }

    /**
     * Return the explanation for shard allocation in the cluster
     */
    public ClusterAllocationExplanation getExplanation() {
        return this.cae;
    }

    @Override
    public void readFrom(StreamInput in) throws IOException {
        super.readFrom(in);
        parentTaskId = new TaskId(in);
        this.cae = new ClusterAllocationExplanation(in);
    }

    @Override
    public void writeTo(StreamOutput out) throws IOException {
        super.writeTo(out);
        parentTaskId.writeTo(out);
    }

    @Override
    public final Task createTask(long id, String type, String action) {
        return createTask(id, type, action, parentTaskId);
    }

    public Task createTask(long id, String type, String action, TaskId parentTaskId) {
        return new Task(id, type, action, getDescription(), parentTaskId);
        cae.writeTo(out);
    }
}
@ -0,0 +1,259 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.elasticsearch.action.admin.cluster.allocation;

import org.elasticsearch.cluster.node.DiscoveryNode;
import org.elasticsearch.cluster.routing.UnassignedInfo;
import org.elasticsearch.common.Nullable;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.io.stream.Writeable;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.index.shard.ShardId;

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

/**
 * A {@code ClusterAllocationExplanation} is an explanation of why a shard may or may not be allocated to nodes. It also includes weights
 * for where the shard is likely to be assigned. It is an immutable class
 */
public final class ClusterAllocationExplanation implements ToXContent, Writeable {

    private final ShardId shard;
    private final boolean primary;
    private final String assignedNodeId;
    private final UnassignedInfo unassignedInfo;
    private final long remainingDelayMillis;
    private final Map<DiscoveryNode, NodeExplanation> nodeExplanations;

    public ClusterAllocationExplanation(ShardId shard, boolean primary, @Nullable String assignedNodeId, long remainingDelayMillis,
                                        @Nullable UnassignedInfo unassignedInfo, Map<DiscoveryNode, NodeExplanation> nodeExplanations) {
        this.shard = shard;
        this.primary = primary;
        this.assignedNodeId = assignedNodeId;
        this.unassignedInfo = unassignedInfo;
        this.remainingDelayMillis = remainingDelayMillis;
        this.nodeExplanations = nodeExplanations;
    }

    public ClusterAllocationExplanation(StreamInput in) throws IOException {
        this.shard = ShardId.readShardId(in);
        this.primary = in.readBoolean();
        this.assignedNodeId = in.readOptionalString();
        this.unassignedInfo = in.readOptionalWriteable(UnassignedInfo::new);
        this.remainingDelayMillis = in.readVLong();

        int mapSize = in.readVInt();
        Map<DiscoveryNode, NodeExplanation> nodeToExplanation = new HashMap<>(mapSize);
        for (int i = 0; i < mapSize; i++) {
            NodeExplanation nodeExplanation = new NodeExplanation(in);
            nodeToExplanation.put(nodeExplanation.getNode(), nodeExplanation);
        }
        this.nodeExplanations = nodeToExplanation;
    }

    @Override
    public void writeTo(StreamOutput out) throws IOException {
        this.getShard().writeTo(out);
        out.writeBoolean(this.isPrimary());
        out.writeOptionalString(this.getAssignedNodeId());
        out.writeOptionalWriteable(this.getUnassignedInfo());
        out.writeVLong(remainingDelayMillis);

        out.writeVInt(this.nodeExplanations.size());
        for (NodeExplanation explanation : this.nodeExplanations.values()) {
            explanation.writeTo(out);
        }
    }

    /** Return the shard that the explanation is about */
    public ShardId getShard() {
        return this.shard;
    }

    /** Return true if the explained shard is primary, false otherwise */
    public boolean isPrimary() {
        return this.primary;
    }

    /** Return true if the shard is assigned to a node */
    public boolean isAssigned() {
        return this.assignedNodeId != null;
    }

    /** Return the assigned node id or null if not assigned */
    @Nullable
    public String getAssignedNodeId() {
        return this.assignedNodeId;
    }

    /** Return the unassigned info for the shard or null if the shard is assigned */
    @Nullable
    public UnassignedInfo getUnassignedInfo() {
        return this.unassignedInfo;
    }

    /** Return the remaining allocation delay for this shard in milliseconds */
    public long getRemainingDelayMillis() {
        return this.remainingDelayMillis;
    }

    /** Return a map of node to the explanation for that node */
    public Map<DiscoveryNode, NodeExplanation> getNodeExplanations() {
        return this.nodeExplanations;
    }

    public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
        builder.startObject(); {
            builder.startObject("shard"); {
                builder.field("index", shard.getIndexName());
                builder.field("index_uuid", shard.getIndex().getUUID());
                builder.field("id", shard.getId());
                builder.field("primary", primary);
            }
            builder.endObject(); // end shard
            builder.field("assigned", this.assignedNodeId != null);
            // If assigned, show the node id of the node it's assigned to
            if (assignedNodeId != null) {
                builder.field("assigned_node_id", this.assignedNodeId);
            }
            // If we have unassigned info, show that
            if (unassignedInfo != null) {
                unassignedInfo.toXContent(builder, params);
                long delay = unassignedInfo.getLastComputedLeftDelayNanos();
                builder.timeValueField("allocation_delay_in_millis", "allocation_delay", TimeValue.timeValueNanos(delay));
                builder.timeValueField("remaining_delay_in_millis", "remaining_delay", TimeValue.timeValueMillis(remainingDelayMillis));
            }
            builder.startObject("nodes");
            for (NodeExplanation explanation : nodeExplanations.values()) {
                explanation.toXContent(builder, params);
            }
            builder.endObject(); // end nodes
        }
        builder.endObject(); // end wrapping object
        return builder;
    }

    /** An Enum representing the final decision for a shard allocation on a node */
    public enum FinalDecision {
        // Yes, the shard can be assigned
        YES((byte) 0),
        // No, the shard cannot be assigned
        NO((byte) 1),
        // The shard is already assigned to this node
        ALREADY_ASSIGNED((byte) 2);

        private final byte id;

        FinalDecision (byte id) {
            this.id = id;
        }

        private static FinalDecision fromId(byte id) {
            switch (id) {
                case 0: return YES;
                case 1: return NO;
                case 2: return ALREADY_ASSIGNED;
                default:
                    throw new IllegalArgumentException("unknown id for final decision: [" + id + "]");
            }
        }

        @Override
        public String toString() {
            switch (id) {
                case 0: return "YES";
                case 1: return "NO";
                case 2: return "ALREADY_ASSIGNED";
                default:
                    throw new IllegalArgumentException("unknown id for final decision: [" + id + "]");
            }
        }

        static FinalDecision readFrom(StreamInput in) throws IOException {
            return fromId(in.readByte());
        }

        void writeTo(StreamOutput out) throws IOException {
            out.writeByte(id);
        }
    }

    /** An Enum representing the state of the shard store's copy of the data on a node */
    public enum StoreCopy {
        // No data for this shard is on the node
        NONE((byte) 0),
        // A copy of the data is available on this node
        AVAILABLE((byte) 1),
        // The copy of the data on the node is corrupt
        CORRUPT((byte) 2),
        // There was an error reading this node's copy of the data
        IO_ERROR((byte) 3),
        // The copy of the data on the node is stale
        STALE((byte) 4),
        // It's unknown what the copy of the data is
        UNKNOWN((byte) 5);

        private final byte id;

        StoreCopy (byte id) {
            this.id = id;
        }

        private static StoreCopy fromId(byte id) {
            switch (id) {
                case 0: return NONE;
                case 1: return AVAILABLE;
                case 2: return CORRUPT;
                case 3: return IO_ERROR;
                case 4: return STALE;
                case 5: return UNKNOWN;
                default:
                    throw new IllegalArgumentException("unknown id for store copy: [" + id + "]");
            }
        }

        @Override
        public String toString() {
            switch (id) {
                case 0: return "NONE";
                case 1: return "AVAILABLE";
                case 2: return "CORRUPT";
                case 3: return "IO_ERROR";
                case 4: return "STALE";
                case 5: return "UNKNOWN";
                default:
                    throw new IllegalArgumentException("unknown id for store copy: [" + id + "]");
            }
        }

        static StoreCopy readFrom(StreamInput in) throws IOException {
            return fromId(in.readByte());
        }

        void writeTo(StreamOutput out) throws IOException {
            out.writeByte(id);
        }
    }
}
@ -0,0 +1,145 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.elasticsearch.action.admin.cluster.allocation;

import org.elasticsearch.ExceptionsHelper;
import org.elasticsearch.action.admin.indices.shards.IndicesShardStoresResponse;
import org.elasticsearch.cluster.node.DiscoveryNode;
import org.elasticsearch.cluster.routing.allocation.decider.Decision;
import org.elasticsearch.common.Nullable;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.io.stream.Writeable;
import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;

import java.io.IOException;
import java.util.Map;
/** The cluster allocation explanation for a single node */
public class NodeExplanation implements Writeable, ToXContent {
    private final DiscoveryNode node;
    private final Decision nodeDecision;
    private final Float nodeWeight;
    private final IndicesShardStoresResponse.StoreStatus storeStatus;
    private final ClusterAllocationExplanation.FinalDecision finalDecision;
    private final ClusterAllocationExplanation.StoreCopy storeCopy;
    private final String finalExplanation;

    public NodeExplanation(final DiscoveryNode node, final Decision nodeDecision, final Float nodeWeight,
                           final @Nullable IndicesShardStoresResponse.StoreStatus storeStatus,
                           final ClusterAllocationExplanation.FinalDecision finalDecision,
                           final String finalExplanation,
                           final ClusterAllocationExplanation.StoreCopy storeCopy) {
        this.node = node;
        this.nodeDecision = nodeDecision;
        this.nodeWeight = nodeWeight;
        this.storeStatus = storeStatus;
        this.finalDecision = finalDecision;
        this.finalExplanation = finalExplanation;
        this.storeCopy = storeCopy;
    }

    public NodeExplanation(StreamInput in) throws IOException {
        this.node = new DiscoveryNode(in);
        this.nodeDecision = Decision.readFrom(in);
        this.nodeWeight = in.readFloat();
        if (in.readBoolean()) {
            this.storeStatus = IndicesShardStoresResponse.StoreStatus.readStoreStatus(in);
        } else {
            this.storeStatus = null;
        }
        this.finalDecision = ClusterAllocationExplanation.FinalDecision.readFrom(in);
        this.finalExplanation = in.readString();
        this.storeCopy = ClusterAllocationExplanation.StoreCopy.readFrom(in);
    }

    @Override
    public void writeTo(StreamOutput out) throws IOException {
        node.writeTo(out);
        Decision.writeTo(nodeDecision, out);
        out.writeFloat(nodeWeight);
        if (storeStatus == null) {
            out.writeBoolean(false);
        } else {
            out.writeBoolean(true);
            storeStatus.writeTo(out);
        }
        finalDecision.writeTo(out);
        out.writeString(finalExplanation);
        storeCopy.writeTo(out);
    }

    public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
        builder.startObject(node.getId()); {
            builder.field("node_name", node.getName());
            builder.startObject("node_attributes"); {
                for (Map.Entry<String, String> attrEntry : node.getAttributes().entrySet()) {
                    builder.field(attrEntry.getKey(), attrEntry.getValue());
                }
            }
            builder.endObject(); // end attributes
            builder.startObject("store"); {
                builder.field("shard_copy", storeCopy.toString());
                if (storeStatus != null) {
                    final Throwable storeErr = storeStatus.getStoreException();
                    if (storeErr != null) {
                        builder.field("store_exception", ExceptionsHelper.detailedMessage(storeErr));
                    }
                }
            }
            builder.endObject(); // end store
            builder.field("final_decision", finalDecision.toString());
            builder.field("final_explanation", finalExplanation.toString());
            builder.field("weight", nodeWeight);
            nodeDecision.toXContent(builder, params);
        }
        builder.endObject(); // end node <uuid>
        return builder;
    }

    public DiscoveryNode getNode() {
        return this.node;
    }

    public Decision getDecision() {
        return this.nodeDecision;
    }

    public Float getWeight() {
        return this.nodeWeight;
    }

    @Nullable
    public IndicesShardStoresResponse.StoreStatus getStoreStatus() {
        return this.storeStatus;
    }

    public ClusterAllocationExplanation.FinalDecision getFinalDecision() {
        return this.finalDecision;
    }

    public String getFinalExplanation() {
        return this.finalExplanation;
    }

    public ClusterAllocationExplanation.StoreCopy getStoreCopy() {
        return this.storeCopy;
    }
}
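
The toXContent above emits one object per node, keyed by the node id. A rough sketch of the resulting shape (field values are illustrative, and the trailing decision fields come from Decision.toXContent):

  // "node-uuid": {
  //   "node_name": "node-1",
  //   "node_attributes": { ... },
  //   "store": { "shard_copy": "AVAILABLE" },
  //   "final_decision": "YES",
  //   "final_explanation": "...",
  //   "weight": 1.5,
  //   ... decision output ...
  // }
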
@ -0,0 +1,326 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

package org.elasticsearch.action.admin.cluster.allocation;

import com.carrotsearch.hppc.cursors.ObjectObjectCursor;
import org.apache.lucene.index.CorruptIndexException;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.ExceptionsHelper;
import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.admin.indices.shards.IndicesShardStoresRequest;
import org.elasticsearch.action.admin.indices.shards.IndicesShardStoresResponse;
import org.elasticsearch.action.admin.indices.shards.TransportIndicesShardStoresAction;
import org.elasticsearch.action.support.ActionFilters;
import org.elasticsearch.action.support.master.TransportMasterNodeAction;
import org.elasticsearch.cluster.ClusterInfoService;
import org.elasticsearch.cluster.ClusterName;
import org.elasticsearch.cluster.ClusterState;
import org.elasticsearch.cluster.block.ClusterBlockException;
import org.elasticsearch.cluster.block.ClusterBlockLevel;
import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver;
import org.elasticsearch.cluster.metadata.MetaData;
import org.elasticsearch.cluster.metadata.MetaData.Custom;
import org.elasticsearch.cluster.node.DiscoveryNode;
import org.elasticsearch.cluster.routing.RoutingNode;
import org.elasticsearch.cluster.routing.RoutingNodes;
import org.elasticsearch.cluster.routing.RoutingNodes.RoutingNodesIterator;
import org.elasticsearch.cluster.routing.RoutingTable;
import org.elasticsearch.cluster.routing.ShardRouting;
import org.elasticsearch.cluster.routing.UnassignedInfo;
import org.elasticsearch.cluster.routing.allocation.AllocationService;
import org.elasticsearch.cluster.routing.allocation.RoutingAllocation;
import org.elasticsearch.cluster.routing.allocation.RoutingExplanations;
import org.elasticsearch.cluster.routing.allocation.allocator.ShardsAllocator;
import org.elasticsearch.cluster.routing.allocation.decider.AllocationDeciders;
import org.elasticsearch.cluster.routing.allocation.decider.Decision;
import org.elasticsearch.cluster.service.ClusterService;
import org.elasticsearch.common.collect.ImmutableOpenIntMap;
import org.elasticsearch.common.inject.Inject;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.threadpool.ThreadPool;
import org.elasticsearch.transport.TransportService;

import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Set;

/**
* The {@code TransportClusterAllocationExplainAction} is responsible for actually executing the explanation of a shard's allocation on the
* master node in the cluster.
*/
public class TransportClusterAllocationExplainAction
extends TransportMasterNodeAction<ClusterAllocationExplainRequest, ClusterAllocationExplainResponse> {

private final AllocationService allocationService;
private final ClusterInfoService clusterInfoService;
private final AllocationDeciders allocationDeciders;
private final ShardsAllocator shardAllocator;
private final TransportIndicesShardStoresAction shardStoresAction;

@Inject
public TransportClusterAllocationExplainAction(Settings settings, TransportService transportService, ClusterService clusterService,
ThreadPool threadPool, ActionFilters actionFilters,
IndexNameExpressionResolver indexNameExpressionResolver,
AllocationService allocationService, ClusterInfoService clusterInfoService,
AllocationDeciders allocationDeciders, ShardsAllocator shardAllocator,
TransportIndicesShardStoresAction shardStoresAction) {
super(settings, ClusterAllocationExplainAction.NAME, transportService, clusterService, threadPool, actionFilters,
indexNameExpressionResolver, ClusterAllocationExplainRequest::new);
this.allocationService = allocationService;
this.clusterInfoService = clusterInfoService;
this.allocationDeciders = allocationDeciders;
this.shardAllocator = shardAllocator;
this.shardStoresAction = shardStoresAction;
}

@Override
protected String executor() {
return ThreadPool.Names.MANAGEMENT;
}

@Override
protected ClusterBlockException checkBlock(ClusterAllocationExplainRequest request, ClusterState state) {
return state.blocks().globalBlockedException(ClusterBlockLevel.METADATA_READ);
}

@Override
protected ClusterAllocationExplainResponse newResponse() {
return new ClusterAllocationExplainResponse();
}

/**
* Return the decisions for the given {@code ShardRouting} on the given {@code RoutingNode}. If {@code includeYesDecisions} is not true,
* only non-YES (NO and THROTTLE) decisions are returned.
*/
public static Decision tryShardOnNode(ShardRouting shard, RoutingNode node, RoutingAllocation allocation, boolean includeYesDecisions) {
Decision d = allocation.deciders().canAllocate(shard, node, allocation);
if (includeYesDecisions) {
return d;
} else {
Decision.Multi nonYesDecisions = new Decision.Multi();
List<Decision> decisions = d.getDecisions();
for (Decision decision : decisions) {
if (decision.type() != Decision.Type.YES) {
nonYesDecisions.add(decision);
}
}
return nonYesDecisions;
}
}

/**
* Construct a {@code NodeExplanation} object for the given shard given all the metadata. This also attempts to construct the human
* readable FinalDecision and final explanation as part of the explanation.
*/
public static NodeExplanation calculateNodeExplanation(ShardRouting shard,
IndexMetaData indexMetaData,
DiscoveryNode node,
Decision nodeDecision,
Float nodeWeight,
IndicesShardStoresResponse.StoreStatus storeStatus,
String assignedNodeId,
Set<String> activeAllocationIds) {
final ClusterAllocationExplanation.FinalDecision finalDecision;
final ClusterAllocationExplanation.StoreCopy storeCopy;
final String finalExplanation;

if (storeStatus == null) {
// No copies of the data
storeCopy = ClusterAllocationExplanation.StoreCopy.NONE;
} else {
final Throwable storeErr = storeStatus.getStoreException();
if (storeErr != null) {
if (ExceptionsHelper.unwrapCause(storeErr) instanceof CorruptIndexException) {
storeCopy = ClusterAllocationExplanation.StoreCopy.CORRUPT;
} else {
storeCopy = ClusterAllocationExplanation.StoreCopy.IO_ERROR;
}
} else if (activeAllocationIds.isEmpty()) {
// The ids are only empty if dealing with a legacy index
// TODO: fetch the shard state versions and display here?
storeCopy = ClusterAllocationExplanation.StoreCopy.UNKNOWN;
} else if (activeAllocationIds.contains(storeStatus.getAllocationId())) {
storeCopy = ClusterAllocationExplanation.StoreCopy.AVAILABLE;
} else {
// Otherwise, this is a stale copy of the data (allocation ids don't match)
storeCopy = ClusterAllocationExplanation.StoreCopy.STALE;
}
}

if (node.getId().equals(assignedNodeId)) {
finalDecision = ClusterAllocationExplanation.FinalDecision.ALREADY_ASSIGNED;
finalExplanation = "the shard is already assigned to this node";
} else if (shard.primary() && shard.unassigned() && shard.allocatedPostIndexCreate(indexMetaData) &&
storeCopy == ClusterAllocationExplanation.StoreCopy.STALE) {
finalExplanation = "the copy of the shard is stale, allocation ids do not match";
finalDecision = ClusterAllocationExplanation.FinalDecision.NO;
} else if (shard.primary() && shard.unassigned() && shard.allocatedPostIndexCreate(indexMetaData) &&
storeCopy == ClusterAllocationExplanation.StoreCopy.NONE) {
finalExplanation = "there is no copy of the shard available";
finalDecision = ClusterAllocationExplanation.FinalDecision.NO;
} else if (shard.primary() && shard.unassigned() && storeCopy == ClusterAllocationExplanation.StoreCopy.CORRUPT) {
finalExplanation = "the copy of the shard is corrupt";
finalDecision = ClusterAllocationExplanation.FinalDecision.NO;
} else if (shard.primary() && shard.unassigned() && storeCopy == ClusterAllocationExplanation.StoreCopy.IO_ERROR) {
finalExplanation = "the copy of the shard cannot be read";
finalDecision = ClusterAllocationExplanation.FinalDecision.NO;
} else {
if (nodeDecision.type() == Decision.Type.NO) {
finalDecision = ClusterAllocationExplanation.FinalDecision.NO;
finalExplanation = "the shard cannot be assigned because one or more allocation decider returns a 'NO' decision";
} else {
finalDecision = ClusterAllocationExplanation.FinalDecision.YES;
if (storeCopy == ClusterAllocationExplanation.StoreCopy.AVAILABLE) {
finalExplanation = "the shard can be assigned and the node contains a valid copy of the shard data";
} else {
finalExplanation = "the shard can be assigned";
}
}
}
return new NodeExplanation(node, nodeDecision, nodeWeight, storeStatus, finalDecision, finalExplanation, storeCopy);
}


/**
* For the given {@code ShardRouting}, return the explanation of the allocation for that shard on all nodes. If {@code
* includeYesDecisions} is true, returns all decisions, otherwise returns only 'NO' and 'THROTTLE' decisions.
*/
public static ClusterAllocationExplanation explainShard(ShardRouting shard, RoutingAllocation allocation, RoutingNodes routingNodes,
boolean includeYesDecisions, ShardsAllocator shardAllocator,
List<IndicesShardStoresResponse.StoreStatus> shardStores) {
// don't short circuit deciders, we want a full explanation
allocation.debugDecision(true);
// get the existing unassigned info if available
UnassignedInfo ui = shard.unassignedInfo();

RoutingNodesIterator iter = routingNodes.nodes();
Map<DiscoveryNode, Decision> nodeToDecision = new HashMap<>();
while (iter.hasNext()) {
RoutingNode node = iter.next();
DiscoveryNode discoNode = node.node();
if (discoNode.isDataNode()) {
Decision d = tryShardOnNode(shard, node, allocation, includeYesDecisions);
nodeToDecision.put(discoNode, d);
}
}
long remainingDelayMillis = 0;
final MetaData metadata = allocation.metaData();
final IndexMetaData indexMetaData = metadata.index(shard.index());
if (ui != null) {
final Settings indexSettings = indexMetaData.getSettings();
long remainingDelayNanos = ui.getRemainingDelay(System.nanoTime(), metadata.settings(), indexSettings);
remainingDelayMillis = TimeValue.timeValueNanos(remainingDelayNanos).millis();
}

// Calculate weights for each of the nodes
Map<DiscoveryNode, Float> weights = shardAllocator.weighShard(allocation, shard);

Map<DiscoveryNode, IndicesShardStoresResponse.StoreStatus> nodeToStatus = new HashMap<>(shardStores.size());
for (IndicesShardStoresResponse.StoreStatus status : shardStores) {
nodeToStatus.put(status.getNode(), status);
}

Map<DiscoveryNode, NodeExplanation> explanations = new HashMap<>(shardStores.size());
for (Map.Entry<DiscoveryNode, Decision> entry : nodeToDecision.entrySet()) {
DiscoveryNode node = entry.getKey();
Decision decision = entry.getValue();
Float weight = weights.get(node);
IndicesShardStoresResponse.StoreStatus storeStatus = nodeToStatus.get(node);
NodeExplanation nodeExplanation = calculateNodeExplanation(shard, indexMetaData, node, decision, weight,
storeStatus, shard.currentNodeId(), indexMetaData.activeAllocationIds(shard.getId()));
explanations.put(node, nodeExplanation);
}
return new ClusterAllocationExplanation(shard.shardId(), shard.primary(),
shard.currentNodeId(), remainingDelayMillis, ui, explanations);
}

@Override
protected void masterOperation(final ClusterAllocationExplainRequest request, final ClusterState state,
final ActionListener<ClusterAllocationExplainResponse> listener) {
final RoutingNodes routingNodes = state.getRoutingNodes();
final RoutingAllocation allocation = new RoutingAllocation(allocationDeciders, routingNodes, state.nodes(),
clusterInfoService.getClusterInfo(), System.nanoTime());

ShardRouting foundShard = null;
if (request.useAnyUnassignedShard()) {
// If we can use any shard, just pick the first unassigned one (if there are any)
RoutingNodes.UnassignedShards.UnassignedIterator ui = routingNodes.unassigned().iterator();
if (ui.hasNext()) {
foundShard = ui.next();
}
} else {
String index = request.getIndex();
int shard = request.getShard();
if (request.isPrimary()) {
// If we're looking for the primary shard, there's only one copy, so pick it directly
foundShard = allocation.routingTable().shardRoutingTable(index, shard).primaryShard();
} else {
// If looking for a replica, go through all the replica shards
List<ShardRouting> replicaShardRoutings = allocation.routingTable().shardRoutingTable(index, shard).replicaShards();
if (replicaShardRoutings.size() > 0) {
// Pick the first replica at the very least
foundShard = replicaShardRoutings.get(0);
// In case there are multiple replicas where some are assigned and some aren't,
// try to find one that is unassigned at least
for (ShardRouting replica : replicaShardRoutings) {
if (replica.unassigned()) {
foundShard = replica;
break;
}
}
}
}
}

if (foundShard == null) {
listener.onFailure(new ElasticsearchException("unable to find any shards to explain [{}] in the routing table", request));
return;
}
final ShardRouting shardRouting = foundShard;
logger.debug("explaining the allocation for [{}], found shard [{}]", request, shardRouting);

getShardStores(shardRouting, new ActionListener<IndicesShardStoresResponse>() {
@Override
public void onResponse(IndicesShardStoresResponse shardStoreResponse) {
ImmutableOpenIntMap<List<IndicesShardStoresResponse.StoreStatus>> shardStatuses =
shardStoreResponse.getStoreStatuses().get(shardRouting.getIndexName());
List<IndicesShardStoresResponse.StoreStatus> shardStoreStatus = shardStatuses.get(shardRouting.id());
ClusterAllocationExplanation cae = explainShard(shardRouting, allocation, routingNodes,
request.includeYesDecisions(), shardAllocator, shardStoreStatus);
listener.onResponse(new ClusterAllocationExplainResponse(cae));
}

@Override
public void onFailure(Throwable e) {
listener.onFailure(e);
}
});
}

private void getShardStores(ShardRouting shard, final ActionListener<IndicesShardStoresResponse> listener) {
IndicesShardStoresRequest request = new IndicesShardStoresRequest(shard.getIndexName());
request.shardStatuses("all");
shardStoresAction.execute(request, listener);
}
}
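Not part of the commit: a minimal standalone sketch of the store-copy classification that calculateNodeExplanation above performs, using hypothetical stand-in types (StoreCopyKind, CorruptStoreException) in place of the real Elasticsearch and Lucene classes.

import java.util.Collections;
import java.util.Set;

/** Hypothetical stand-in for ClusterAllocationExplanation.StoreCopy, for illustration only. */
enum StoreCopyKind { NONE, CORRUPT, IO_ERROR, UNKNOWN, AVAILABLE, STALE }

public class StoreCopyClassificationSketch {

    /** Hypothetical marker standing in for Lucene's CorruptIndexException. */
    static class CorruptStoreException extends Exception {}

    // Same ordering as calculateNodeExplanation: no store data, then exception kind,
    // then legacy index (empty allocation-id set), then in-sync vs. stale allocation id.
    static StoreCopyKind classify(boolean hasStore, Exception storeException,
                                  String allocationId, Set<String> activeAllocationIds) {
        if (!hasStore) {
            return StoreCopyKind.NONE;          // no copy of the shard data on this node
        }
        if (storeException != null) {
            return storeException instanceof CorruptStoreException
                    ? StoreCopyKind.CORRUPT
                    : StoreCopyKind.IO_ERROR;
        }
        if (activeAllocationIds.isEmpty()) {
            return StoreCopyKind.UNKNOWN;       // legacy index, no allocation ids to compare
        }
        return activeAllocationIds.contains(allocationId)
                ? StoreCopyKind.AVAILABLE
                : StoreCopyKind.STALE;
    }

    public static void main(String[] args) {
        System.out.println(classify(true, null, "abc", Collections.singleton("abc"))); // AVAILABLE
        System.out.println(classify(true, null, "xyz", Collections.singleton("abc"))); // STALE
    }
}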
@ -19,7 +19,6 @@

package org.elasticsearch.action.admin.cluster.health;

import org.elasticsearch.Version;
import org.elasticsearch.action.ActionResponse;
import org.elasticsearch.cluster.ClusterState;
import org.elasticsearch.cluster.health.ClusterHealthStatus;
@ -30,12 +29,10 @@ import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.common.xcontent.StatusToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentBuilderString;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.rest.RestStatus;

import java.io.IOException;
import java.util.List;
import java.util.Locale;
import java.util.Map;

@ -83,14 +80,6 @@ public class ClusterHealthResponse extends ActionResponse implements StatusToXCo
return clusterStateHealth;
}

/**
* The validation failures on the cluster level (without index validation failures).
*/
public List<String> getValidationFailures() {
return clusterStateHealth.getValidationFailures();
}


public int getActiveShards() {
return clusterStateHealth.getActiveShards();
}
@ -234,25 +223,24 @@ public class ClusterHealthResponse extends ActionResponse implements StatusToXCo
}

static final class Fields {
static final XContentBuilderString CLUSTER_NAME = new XContentBuilderString("cluster_name");
static final XContentBuilderString STATUS = new XContentBuilderString("status");
static final XContentBuilderString TIMED_OUT = new XContentBuilderString("timed_out");
static final XContentBuilderString NUMBER_OF_NODES = new XContentBuilderString("number_of_nodes");
static final XContentBuilderString NUMBER_OF_DATA_NODES = new XContentBuilderString("number_of_data_nodes");
static final XContentBuilderString NUMBER_OF_PENDING_TASKS = new XContentBuilderString("number_of_pending_tasks");
static final XContentBuilderString NUMBER_OF_IN_FLIGHT_FETCH = new XContentBuilderString("number_of_in_flight_fetch");
static final XContentBuilderString DELAYED_UNASSIGNED_SHARDS = new XContentBuilderString("delayed_unassigned_shards");
static final XContentBuilderString TASK_MAX_WAIT_TIME_IN_QUEUE = new XContentBuilderString("task_max_waiting_in_queue");
static final XContentBuilderString TASK_MAX_WAIT_TIME_IN_QUEUE_IN_MILLIS = new XContentBuilderString("task_max_waiting_in_queue_millis");
static final XContentBuilderString ACTIVE_SHARDS_PERCENT_AS_NUMBER = new XContentBuilderString("active_shards_percent_as_number");
static final XContentBuilderString ACTIVE_SHARDS_PERCENT = new XContentBuilderString("active_shards_percent");
static final XContentBuilderString ACTIVE_PRIMARY_SHARDS = new XContentBuilderString("active_primary_shards");
static final XContentBuilderString ACTIVE_SHARDS = new XContentBuilderString("active_shards");
static final XContentBuilderString RELOCATING_SHARDS = new XContentBuilderString("relocating_shards");
static final XContentBuilderString INITIALIZING_SHARDS = new XContentBuilderString("initializing_shards");
static final XContentBuilderString UNASSIGNED_SHARDS = new XContentBuilderString("unassigned_shards");
static final XContentBuilderString VALIDATION_FAILURES = new XContentBuilderString("validation_failures");
static final XContentBuilderString INDICES = new XContentBuilderString("indices");
static final String CLUSTER_NAME = "cluster_name";
static final String STATUS = "status";
static final String TIMED_OUT = "timed_out";
static final String NUMBER_OF_NODES = "number_of_nodes";
static final String NUMBER_OF_DATA_NODES = "number_of_data_nodes";
static final String NUMBER_OF_PENDING_TASKS = "number_of_pending_tasks";
static final String NUMBER_OF_IN_FLIGHT_FETCH = "number_of_in_flight_fetch";
static final String DELAYED_UNASSIGNED_SHARDS = "delayed_unassigned_shards";
static final String TASK_MAX_WAIT_TIME_IN_QUEUE = "task_max_waiting_in_queue";
static final String TASK_MAX_WAIT_TIME_IN_QUEUE_IN_MILLIS = "task_max_waiting_in_queue_millis";
static final String ACTIVE_SHARDS_PERCENT_AS_NUMBER = "active_shards_percent_as_number";
static final String ACTIVE_SHARDS_PERCENT = "active_shards_percent";
static final String ACTIVE_PRIMARY_SHARDS = "active_primary_shards";
static final String ACTIVE_SHARDS = "active_shards";
static final String RELOCATING_SHARDS = "relocating_shards";
static final String INITIALIZING_SHARDS = "initializing_shards";
static final String UNASSIGNED_SHARDS = "unassigned_shards";
static final String INDICES = "indices";
}

@Override
@ -276,36 +264,10 @@ public class ClusterHealthResponse extends ActionResponse implements StatusToXCo
String level = params.param("level", "cluster");
boolean outputIndices = "indices".equals(level) || "shards".equals(level);


if (!getValidationFailures().isEmpty()) {
builder.startArray(Fields.VALIDATION_FAILURES);
for (String validationFailure : getValidationFailures()) {
builder.value(validationFailure);
}
// if we don't print index level information, still print the index validation failures
// so we know why the status is red
if (!outputIndices) {
for (ClusterIndexHealth indexHealth : clusterStateHealth.getIndices().values()) {
builder.startObject(indexHealth.getIndex());

if (!indexHealth.getValidationFailures().isEmpty()) {
builder.startArray(Fields.VALIDATION_FAILURES);
for (String validationFailure : indexHealth.getValidationFailures()) {
builder.value(validationFailure);
}
builder.endArray();
}

builder.endObject();
}
}
builder.endArray();
}

if (outputIndices) {
builder.startObject(Fields.INDICES);
for (ClusterIndexHealth indexHealth : clusterStateHealth.getIndices().values()) {
builder.startObject(indexHealth.getIndex(), XContentBuilder.FieldCaseConversion.NONE);
builder.startObject(indexHealth.getIndex());
indexHealth.toXContent(builder, params);
builder.endObject();
}
@ -54,7 +54,8 @@ public class TransportClusterHealthAction extends TransportMasterNodeReadAction<
public TransportClusterHealthAction(Settings settings, TransportService transportService, ClusterService clusterService,
ThreadPool threadPool, ClusterName clusterName, ActionFilters actionFilters,
IndexNameExpressionResolver indexNameExpressionResolver, GatewayAllocator gatewayAllocator) {
super(settings, ClusterHealthAction.NAME, transportService, clusterService, threadPool, actionFilters, indexNameExpressionResolver, ClusterHealthRequest::new);
super(settings, ClusterHealthAction.NAME, false, transportService, clusterService, threadPool, actionFilters,
indexNameExpressionResolver, ClusterHealthRequest::new);
this.clusterName = clusterName;
this.gatewayAllocator = gatewayAllocator;
}
@ -143,7 +144,7 @@ public class TransportClusterHealthAction extends TransportMasterNodeReadAction<
assert waitFor >= 0;
final ClusterStateObserver observer = new ClusterStateObserver(clusterService, logger, threadPool.getThreadContext());
final ClusterState state = observer.observedState();
if (waitFor == 0 || request.timeout().millis() == 0) {
if (request.timeout().millis() == 0) {
listener.onResponse(getResponse(request, state, waitFor, request.timeout().millis() == 0));
return;
}
@ -19,12 +19,14 @@

package org.elasticsearch.action.admin.cluster.node.hotthreads;

import org.elasticsearch.action.FailedNodeException;
import org.elasticsearch.action.support.nodes.BaseNodesResponse;
import org.elasticsearch.cluster.ClusterName;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;

import java.io.IOException;
import java.util.List;

/**
*/
@ -33,26 +35,18 @@ public class NodesHotThreadsResponse extends BaseNodesResponse<NodeHotThreads> {
NodesHotThreadsResponse() {
}

public NodesHotThreadsResponse(ClusterName clusterName, NodeHotThreads[] nodes) {
super(clusterName, nodes);
public NodesHotThreadsResponse(ClusterName clusterName, List<NodeHotThreads> nodes, List<FailedNodeException> failures) {
super(clusterName, nodes, failures);
}

@Override
public void readFrom(StreamInput in) throws IOException {
super.readFrom(in);
nodes = new NodeHotThreads[in.readVInt()];
for (int i = 0; i < nodes.length; i++) {
nodes[i] = NodeHotThreads.readNodeHotThreads(in);
}
protected List<NodeHotThreads> readNodesFrom(StreamInput in) throws IOException {
return in.readList(NodeHotThreads::readNodeHotThreads);
}

@Override
public void writeTo(StreamOutput out) throws IOException {
super.writeTo(out);
out.writeVInt(nodes.length);
for (NodeHotThreads node : nodes) {
node.writeTo(out);
}
protected void writeNodesTo(StreamOutput out, List<NodeHotThreads> nodes) throws IOException {
out.writeStreamableList(nodes);
}

}
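Not part of the commit: a minimal standalone sketch of the hook pattern the nodes-level responses are being migrated to in this change, where the base class owns the node list and each concrete response only supplies the list read/write hook. The types are hypothetical and java.io.DataOutput stands in for StreamOutput.

import java.io.ByteArrayOutputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical base: owns the node list; subclasses only describe element serialization.
abstract class BaseNodesResponseSketch<N> {
    final List<N> nodes;
    BaseNodesResponseSketch(List<N> nodes) { this.nodes = new ArrayList<>(nodes); }

    /** Hook: how the node list is written; analogous to writeNodesTo in the diff above. */
    protected abstract void writeNodesTo(DataOutput out, List<N> nodes) throws IOException;

    final void writeTo(DataOutput out) throws IOException {
        // the real BaseNodesResponse also serializes the cluster name and node-level failures
        writeNodesTo(out, nodes);
    }
}

class NodeNamesResponseSketch extends BaseNodesResponseSketch<String> {
    NodeNamesResponseSketch(List<String> nodes) { super(nodes); }

    @Override
    protected void writeNodesTo(DataOutput out, List<String> nodes) throws IOException {
        out.writeInt(nodes.size());        // length prefix, then one entry per node
        for (String node : nodes) {
            out.writeUTF(node);
        }
    }
}

public class NodesResponseRefactorSketch {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        new NodeNamesResponseSketch(Arrays.asList("node-1", "node-2")).writeTo(new DataOutputStream(bytes));
        System.out.println(bytes.size() + " bytes written");
    }
}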
@ -20,6 +20,7 @@
package org.elasticsearch.action.admin.cluster.node.hotthreads;

import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.FailedNodeException;
import org.elasticsearch.action.support.ActionFilters;
import org.elasticsearch.action.support.nodes.BaseNodeRequest;
import org.elasticsearch.action.support.nodes.TransportNodesAction;
@ -35,33 +36,28 @@ import org.elasticsearch.threadpool.ThreadPool;
import org.elasticsearch.transport.TransportService;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicReferenceArray;

/**
*
*/
public class TransportNodesHotThreadsAction extends TransportNodesAction<NodesHotThreadsRequest, NodesHotThreadsResponse, TransportNodesHotThreadsAction.NodeRequest, NodeHotThreads> {
public class TransportNodesHotThreadsAction extends TransportNodesAction<NodesHotThreadsRequest,
NodesHotThreadsResponse,
TransportNodesHotThreadsAction.NodeRequest,
NodeHotThreads> {

@Inject
public TransportNodesHotThreadsAction(Settings settings, ClusterName clusterName, ThreadPool threadPool,
ClusterService clusterService, TransportService transportService,
ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) {
super(settings, NodesHotThreadsAction.NAME, clusterName, threadPool, clusterService, transportService, actionFilters,
indexNameExpressionResolver, NodesHotThreadsRequest::new, NodeRequest::new, ThreadPool.Names.GENERIC);
indexNameExpressionResolver, NodesHotThreadsRequest::new, NodeRequest::new, ThreadPool.Names.GENERIC, NodeHotThreads.class);
}

@Override
protected NodesHotThreadsResponse newResponse(NodesHotThreadsRequest request, AtomicReferenceArray responses) {
final List<NodeHotThreads> nodes = new ArrayList<>();
for (int i = 0; i < responses.length(); i++) {
Object resp = responses.get(i);
if (resp instanceof NodeHotThreads) {
nodes.add((NodeHotThreads) resp);
}
}
return new NodesHotThreadsResponse(clusterName, nodes.toArray(new NodeHotThreads[nodes.size()]));
protected NodesHotThreadsResponse newResponse(NodesHotThreadsRequest request,
List<NodeHotThreads> responses, List<FailedNodeException> failures) {
return new NodesHotThreadsResponse(clusterName, responses, failures);
}

@Override
@ -19,9 +19,10 @@

package org.elasticsearch.action.admin.cluster.node.info;

import com.carrotsearch.hppc.cursors.ObjectObjectCursor;
import org.elasticsearch.action.FailedNodeException;
import org.elasticsearch.action.support.nodes.BaseNodesResponse;
import org.elasticsearch.cluster.ClusterName;
import org.elasticsearch.cluster.node.DiscoveryNode;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.settings.Settings;
@ -30,6 +31,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;

import java.io.IOException;
import java.util.List;
import java.util.Map;

/**
@ -40,59 +42,54 @@ public class NodesInfoResponse extends BaseNodesResponse<NodeInfo> implements To
public NodesInfoResponse() {
}

public NodesInfoResponse(ClusterName clusterName, NodeInfo[] nodes) {
super(clusterName, nodes);
public NodesInfoResponse(ClusterName clusterName, List<NodeInfo> nodes, List<FailedNodeException> failures) {
super(clusterName, nodes, failures);
}

@Override
public void readFrom(StreamInput in) throws IOException {
super.readFrom(in);
nodes = new NodeInfo[in.readVInt()];
for (int i = 0; i < nodes.length; i++) {
nodes[i] = NodeInfo.readNodeInfo(in);
}
protected List<NodeInfo> readNodesFrom(StreamInput in) throws IOException {
return in.readList(NodeInfo::readNodeInfo);
}

@Override
public void writeTo(StreamOutput out) throws IOException {
super.writeTo(out);
out.writeVInt(nodes.length);
for (NodeInfo node : nodes) {
node.writeTo(out);
}
protected void writeNodesTo(StreamOutput out, List<NodeInfo> nodes) throws IOException {
out.writeStreamableList(nodes);
}

@Override
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
builder.field("cluster_name", getClusterName().value(), XContentBuilder.FieldCaseConversion.NONE);

builder.startObject("nodes");
for (NodeInfo nodeInfo : this) {
builder.startObject(nodeInfo.getNode().id(), XContentBuilder.FieldCaseConversion.NONE);
for (NodeInfo nodeInfo : getNodes()) {
builder.startObject(nodeInfo.getNode().getId());

builder.field("name", nodeInfo.getNode().name(), XContentBuilder.FieldCaseConversion.NONE);
builder.field("transport_address", nodeInfo.getNode().address().toString());
builder.field("host", nodeInfo.getNode().getHostName(), XContentBuilder.FieldCaseConversion.NONE);
builder.field("ip", nodeInfo.getNode().getHostAddress(), XContentBuilder.FieldCaseConversion.NONE);
builder.field("name", nodeInfo.getNode().getName());
builder.field("transport_address", nodeInfo.getNode().getAddress().toString());
builder.field("host", nodeInfo.getNode().getHostName());
builder.field("ip", nodeInfo.getNode().getHostAddress());

builder.field("version", nodeInfo.getVersion());
builder.field("build_hash", nodeInfo.getBuild().shortHash());

if (nodeInfo.getServiceAttributes() != null) {
for (Map.Entry<String, String> nodeAttribute : nodeInfo.getServiceAttributes().entrySet()) {
builder.field(nodeAttribute.getKey(), nodeAttribute.getValue(), XContentBuilder.FieldCaseConversion.NONE);
builder.field(nodeAttribute.getKey(), nodeAttribute.getValue());
}
}

if (!nodeInfo.getNode().attributes().isEmpty()) {
builder.startArray("roles");
for (DiscoveryNode.Role role : nodeInfo.getNode().getRoles()) {
builder.value(role.getRoleName());
}
builder.endArray();

if (!nodeInfo.getNode().getAttributes().isEmpty()) {
builder.startObject("attributes");
for (ObjectObjectCursor<String, String> attr : nodeInfo.getNode().attributes()) {
builder.field(attr.key, attr.value, XContentBuilder.FieldCaseConversion.NONE);
for (Map.Entry<String, String> entry : nodeInfo.getNode().getAttributes().entrySet()) {
builder.field(entry.getKey(), entry.getValue());
}
builder.endObject();
}


if (nodeInfo.getSettings() != null) {
builder.startObject("settings");
Settings settings = nodeInfo.getSettings();
@ -19,6 +19,7 @@

package org.elasticsearch.action.admin.cluster.node.info;

import org.elasticsearch.action.FailedNodeException;
import org.elasticsearch.action.support.ActionFilters;
import org.elasticsearch.action.support.nodes.BaseNodeRequest;
import org.elasticsearch.action.support.nodes.TransportNodesAction;
@ -34,36 +35,32 @@ import org.elasticsearch.threadpool.ThreadPool;
import org.elasticsearch.transport.TransportService;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicReferenceArray;

/**
*
*/
public class TransportNodesInfoAction extends TransportNodesAction<NodesInfoRequest, NodesInfoResponse, TransportNodesInfoAction.NodeInfoRequest, NodeInfo> {
public class TransportNodesInfoAction extends TransportNodesAction<NodesInfoRequest,
NodesInfoResponse,
TransportNodesInfoAction.NodeInfoRequest,
NodeInfo> {

private final NodeService nodeService;

@Inject
public TransportNodesInfoAction(Settings settings, ClusterName clusterName, ThreadPool threadPool,
ClusterService clusterService, TransportService transportService,
NodeService nodeService, ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) {
NodeService nodeService, ActionFilters actionFilters,
IndexNameExpressionResolver indexNameExpressionResolver) {
super(settings, NodesInfoAction.NAME, clusterName, threadPool, clusterService, transportService, actionFilters,
indexNameExpressionResolver, NodesInfoRequest::new, NodeInfoRequest::new, ThreadPool.Names.MANAGEMENT);
indexNameExpressionResolver, NodesInfoRequest::new, NodeInfoRequest::new, ThreadPool.Names.MANAGEMENT, NodeInfo.class);
this.nodeService = nodeService;
}

@Override
protected NodesInfoResponse newResponse(NodesInfoRequest nodesInfoRequest, AtomicReferenceArray responses) {
final List<NodeInfo> nodesInfos = new ArrayList<>();
for (int i = 0; i < responses.length(); i++) {
Object resp = responses.get(i);
if (resp instanceof NodeInfo) {
nodesInfos.add((NodeInfo) resp);
}
}
return new NodesInfoResponse(clusterName, nodesInfos.toArray(new NodeInfo[nodesInfos.size()]));
protected NodesInfoResponse newResponse(NodesInfoRequest nodesInfoRequest,
List<NodeInfo> responses, List<FailedNodeException> failures) {
return new NodesInfoResponse(clusterName, responses, failures);
}

@Override
@ -48,18 +48,14 @@ public final class LivenessResponse extends ActionResponse {
public void readFrom(StreamInput in) throws IOException {
super.readFrom(in);
clusterName = ClusterName.readClusterName(in);
if (in.readBoolean()) {
node = DiscoveryNode.readNode(in);
} else {
node = null;
}
node = in.readOptionalWriteable(DiscoveryNode::new);
}

@Override
public void writeTo(StreamOutput out) throws IOException {
super.writeTo(out);
clusterName.writeTo(out);
out.writeOptionalStreamable(node);
out.writeOptionalWriteable(node);
}

public ClusterName getClusterName() {
@ -19,7 +19,6 @@

package org.elasticsearch.action.admin.cluster.node.stats;

import com.carrotsearch.hppc.cursors.ObjectObjectCursor;
import org.elasticsearch.action.support.nodes.BaseNodeResponse;
import org.elasticsearch.cluster.node.DiscoveryNode;
import org.elasticsearch.common.Nullable;
@ -41,6 +40,7 @@ import org.elasticsearch.threadpool.ThreadPoolStats;
import org.elasticsearch.transport.TransportStats;

import java.io.IOException;
import java.util.Map;

/**
* Node statistics (dynamic, changes depending on when created).
@ -224,7 +224,7 @@ public class NodeStats extends BaseNodeResponse implements ToXContent {
threadPool = ThreadPoolStats.readThreadPoolStats(in);
}
if (in.readBoolean()) {
fs = FsInfo.readFsInfo(in);
fs = new FsInfo(in);
}
if (in.readBoolean()) {
transport = TransportStats.readTransportStats(in);
@ -299,15 +299,21 @@ public class NodeStats extends BaseNodeResponse implements ToXContent {
@Override
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
if (!params.param("node_info_format", "default").equals("none")) {
builder.field("name", getNode().name(), XContentBuilder.FieldCaseConversion.NONE);
builder.field("transport_address", getNode().address().toString(), XContentBuilder.FieldCaseConversion.NONE);
builder.field("host", getNode().getHostName(), XContentBuilder.FieldCaseConversion.NONE);
builder.field("ip", getNode().getAddress(), XContentBuilder.FieldCaseConversion.NONE);
builder.field("name", getNode().getName());
builder.field("transport_address", getNode().getAddress().toString());
builder.field("host", getNode().getHostName());
builder.field("ip", getNode().getAddress());

if (!getNode().attributes().isEmpty()) {
builder.startArray("roles");
for (DiscoveryNode.Role role : getNode().getRoles()) {
builder.value(role.getRoleName());
}
builder.endArray();

if (!getNode().getAttributes().isEmpty()) {
builder.startObject("attributes");
for (ObjectObjectCursor<String, String> attr : getNode().attributes()) {
builder.field(attr.key, attr.value, XContentBuilder.FieldCaseConversion.NONE);
for (Map.Entry<String, String> attrEntry : getNode().getAttributes().entrySet()) {
builder.field(attrEntry.getKey(), attrEntry.getValue());
}
builder.endObject();
}
@ -19,6 +19,7 @@

package org.elasticsearch.action.admin.cluster.node.stats;

import org.elasticsearch.action.FailedNodeException;
import org.elasticsearch.action.support.nodes.BaseNodesResponse;
import org.elasticsearch.cluster.ClusterName;
import org.elasticsearch.common.io.stream.StreamInput;
@ -28,6 +29,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;

import java.io.IOException;
import java.util.List;

/**
*
@ -37,35 +39,25 @@ public class NodesStatsResponse extends BaseNodesResponse<NodeStats> implements
NodesStatsResponse() {
}

public NodesStatsResponse(ClusterName clusterName, NodeStats[] nodes) {
super(clusterName, nodes);
public NodesStatsResponse(ClusterName clusterName, List<NodeStats> nodes, List<FailedNodeException> failures) {
super(clusterName, nodes, failures);
}

@Override
public void readFrom(StreamInput in) throws IOException {
super.readFrom(in);
nodes = new NodeStats[in.readVInt()];
for (int i = 0; i < nodes.length; i++) {
nodes[i] = NodeStats.readNodeStats(in);
}
protected List<NodeStats> readNodesFrom(StreamInput in) throws IOException {
return in.readList(NodeStats::readNodeStats);
}

@Override
public void writeTo(StreamOutput out) throws IOException {
super.writeTo(out);
out.writeVInt(nodes.length);
for (NodeStats node : nodes) {
node.writeTo(out);
}
protected void writeNodesTo(StreamOutput out, List<NodeStats> nodes) throws IOException {
out.writeStreamableList(nodes);
}

@Override
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
builder.field("cluster_name", getClusterName().value());

builder.startObject("nodes");
for (NodeStats nodeStats : this) {
builder.startObject(nodeStats.getNode().id(), XContentBuilder.FieldCaseConversion.NONE);
for (NodeStats nodeStats : getNodes()) {
builder.startObject(nodeStats.getNode().getId());
builder.field("timestamp", nodeStats.getTimestamp());
nodeStats.toXContent(builder, params);

@ -88,4 +80,4 @@ public class NodesStatsResponse extends BaseNodesResponse<NodeStats> implements
return "{ \"error\" : \"" + e.getMessage() + "\"}";
}
}
}
}
@ -19,6 +19,7 @@

package org.elasticsearch.action.admin.cluster.node.stats;

import org.elasticsearch.action.FailedNodeException;
import org.elasticsearch.action.support.ActionFilters;
import org.elasticsearch.action.support.nodes.BaseNodeRequest;
import org.elasticsearch.action.support.nodes.TransportNodesAction;
@ -34,36 +35,31 @@ import org.elasticsearch.threadpool.ThreadPool;
import org.elasticsearch.transport.TransportService;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicReferenceArray;

/**
*
*/
public class TransportNodesStatsAction extends TransportNodesAction<NodesStatsRequest, NodesStatsResponse, TransportNodesStatsAction.NodeStatsRequest, NodeStats> {
public class TransportNodesStatsAction extends TransportNodesAction<NodesStatsRequest,
NodesStatsResponse,
TransportNodesStatsAction.NodeStatsRequest,
NodeStats> {

private final NodeService nodeService;

@Inject
public TransportNodesStatsAction(Settings settings, ClusterName clusterName, ThreadPool threadPool,
ClusterService clusterService, TransportService transportService,
NodeService nodeService, ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) {
super(settings, NodesStatsAction.NAME, clusterName, threadPool, clusterService, transportService, actionFilters, indexNameExpressionResolver,
NodesStatsRequest::new, NodeStatsRequest::new, ThreadPool.Names.MANAGEMENT);
NodeService nodeService, ActionFilters actionFilters,
IndexNameExpressionResolver indexNameExpressionResolver) {
super(settings, NodesStatsAction.NAME, clusterName, threadPool, clusterService, transportService, actionFilters,
indexNameExpressionResolver, NodesStatsRequest::new, NodeStatsRequest::new, ThreadPool.Names.MANAGEMENT, NodeStats.class);
this.nodeService = nodeService;
}

@Override
protected NodesStatsResponse newResponse(NodesStatsRequest nodesInfoRequest, AtomicReferenceArray responses) {
final List<NodeStats> nodeStats = new ArrayList<>();
for (int i = 0; i < responses.length(); i++) {
Object resp = responses.get(i);
if (resp instanceof NodeStats) {
nodeStats.add((NodeStats) resp);
}
}
return new NodesStatsResponse(clusterName, nodeStats.toArray(new NodeStats[nodeStats.size()]));
protected NodesStatsResponse newResponse(NodesStatsRequest request, List<NodeStats> responses, List<FailedNodeException> failures) {
return new NodesStatsResponse(clusterName, responses, failures);
}

@Override
@ -56,7 +56,7 @@ import java.util.function.Consumer;
* Transport action that can be used to cancel currently running cancellable tasks.
* <p>
* For a task to be cancellable it has to return an instance of
* {@link CancellableTask} from {@link TransportRequest#createTask(long, String, String)}
* {@link CancellableTask} from {@link TransportRequest#createTask(long, String, String, TaskId)}
*/
public class TransportCancelTasksAction extends TransportTasksAction<CancellableTask, CancelTasksRequest, CancelTasksResponse, TaskInfo> {

@ -251,7 +251,7 @@ public class TransportCancelTasksAction extends TransportTasksAction<Cancellable
@Override
public void readFrom(StreamInput in) throws IOException {
super.readFrom(in);
parentTaskId = new TaskId(in);
parentTaskId = TaskId.readFromStream(in);
ban = in.readBoolean();
if (ban) {
reason = in.readString();
@ -19,17 +19,16 @@

package org.elasticsearch.action.admin.cluster.node.tasks.list;

import com.carrotsearch.hppc.cursors.ObjectObjectCursor;

import org.elasticsearch.action.FailedNodeException;
import org.elasticsearch.action.TaskOperationFailure;
import org.elasticsearch.action.support.tasks.BaseTasksResponse;
import org.elasticsearch.cluster.node.DiscoveryNode;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.tasks.TaskId;

import java.io.IOException;
import java.util.ArrayList;
@ -39,6 +38,7 @@ import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

/**
* Returns the list of tasks currently running on the nodes
@ -49,10 +49,13 @@ public class ListTasksResponse extends BaseTasksResponse implements ToXContent {

private Map<DiscoveryNode, List<TaskInfo>> nodes;

private List<TaskGroup> groups;

public ListTasksResponse() {
}

public ListTasksResponse(List<TaskInfo> tasks, List<TaskOperationFailure> taskFailures, List<? extends FailedNodeException> nodeFailures) {
public ListTasksResponse(List<TaskInfo> tasks, List<TaskOperationFailure> taskFailures,
List<? extends FailedNodeException> nodeFailures) {
super(taskFailures, nodeFailures);
this.tasks = tasks == null ? Collections.emptyList() : Collections.unmodifiableList(new ArrayList<>(tasks));
}
@ -96,6 +99,41 @@ public class ListTasksResponse extends BaseTasksResponse implements ToXContent {
return nodeTasks;
}

public List<TaskGroup> getTaskGroups() {
if (groups == null) {
buildTaskGroups();
}
return groups;
}

private void buildTaskGroups() {
Map<TaskId, TaskGroup.Builder> taskGroups = new HashMap<>();
List<TaskGroup.Builder> topLevelTasks = new ArrayList<>();
// First populate all tasks
for (TaskInfo taskInfo : this.tasks) {
taskGroups.put(taskInfo.getTaskId(), TaskGroup.builder(taskInfo));
}

// Now go through all task group builders and add children to their parents
for (TaskGroup.Builder taskGroup : taskGroups.values()) {
TaskId parentTaskId = taskGroup.getTaskInfo().getParentTaskId();
if (parentTaskId.isSet()) {
TaskGroup.Builder parentTask = taskGroups.get(parentTaskId);
if (parentTask != null) {
// we found parent in the list of tasks - add it to the parent list
parentTask.addGroup(taskGroup);
} else {
// we got a zombie or the parent was filtered out - add it to the top task list
topLevelTasks.add(taskGroup);
}
} else {
// top level task - add it to the top task list
topLevelTasks.add(taskGroup);
}
}
this.groups = Collections.unmodifiableList(topLevelTasks.stream().map(TaskGroup.Builder::build).collect(Collectors.toList()));
}

public List<TaskInfo> getTasks() {
return tasks;
}
@ -121,46 +159,53 @@ public class ListTasksResponse extends BaseTasksResponse implements ToXContent {
}
builder.endArray();
}
String groupBy = params.param("group_by", "nodes");
if ("nodes".equals(groupBy)) {
builder.startObject("nodes");
for (Map.Entry<DiscoveryNode, List<TaskInfo>> entry : getPerNodeTasks().entrySet()) {
DiscoveryNode node = entry.getKey();
builder.startObject(node.getId());
builder.field("name", node.getName());
builder.field("transport_address", node.getAddress().toString());
builder.field("host", node.getHostName());
builder.field("ip", node.getAddress());

builder.startObject("nodes");
for (Map.Entry<DiscoveryNode, List<TaskInfo>> entry : getPerNodeTasks().entrySet()) {
DiscoveryNode node = entry.getKey();
builder.startObject(node.getId(), XContentBuilder.FieldCaseConversion.NONE);
builder.field("name", node.name());
builder.field("transport_address", node.address().toString());
builder.field("host", node.getHostName());
builder.field("ip", node.getAddress());
builder.startArray("roles");
for (DiscoveryNode.Role role : node.getRoles()) {
builder.value(role.getRoleName());
}
builder.endArray();

if (!node.attributes().isEmpty()) {
builder.startObject("attributes");
for (ObjectObjectCursor<String, String> attr : node.attributes()) {
builder.field(attr.key, attr.value, XContentBuilder.FieldCaseConversion.NONE);
if (!node.getAttributes().isEmpty()) {
builder.startObject("attributes");
for (Map.Entry<String, String> attrEntry : node.getAttributes().entrySet()) {
builder.field(attrEntry.getKey(), attrEntry.getValue());
}
builder.endObject();
}
builder.startObject("tasks");
for(TaskInfo task : entry.getValue()) {
builder.startObject(task.getTaskId().toString());
task.toXContent(builder, params);
builder.endObject();
}
builder.endObject();
builder.endObject();
}
} else if ("parents".equals(groupBy)) {
builder.startObject("tasks");
for(TaskInfo task : entry.getValue()) {
builder.startObject(task.getTaskId().toString(), XContentBuilder.FieldCaseConversion.NONE);
task.toXContent(builder, params);
for (TaskGroup group : getTaskGroups()) {
builder.startObject(group.getTaskInfo().getTaskId().toString());
group.toXContent(builder, params);
builder.endObject();
}
builder.endObject();
builder.endObject();
}
builder.endObject();
return builder;
}

@Override
public String toString() {
try {
XContentBuilder builder = XContentFactory.jsonBuilder().prettyPrint();
builder.startObject();
toXContent(builder, EMPTY_PARAMS);
builder.endObject();
return builder.string();
} catch (IOException e) {
return "{ \"error\" : \"" + e.getMessage() + "\"}";
}
return Strings.toString(this);
}
}
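Not part of the commit: a minimal standalone sketch of the two-pass parent/child grouping that buildTaskGroups above performs, using a hypothetical FlatTask record with plain string ids instead of TaskInfo and TaskId.

import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical flat task record: an id plus an optional parent id (null for top-level tasks).
final class FlatTask {
    final String id;
    final String parentId;
    final List<FlatTask> children = new ArrayList<>();
    FlatTask(String id, String parentId) { this.id = id; this.parentId = parentId; }
}

public class TaskGroupingSketch {

    // Same two passes as buildTaskGroups: index every task by id, then attach each task to its
    // parent if the parent is present, otherwise treat it as a top-level (or orphaned) task.
    static List<FlatTask> group(List<FlatTask> tasks) {
        Map<String, FlatTask> byId = new HashMap<>();
        for (FlatTask task : tasks) {
            byId.put(task.id, task);
        }
        List<FlatTask> topLevel = new ArrayList<>();
        for (FlatTask task : tasks) {
            FlatTask parent = task.parentId == null ? null : byId.get(task.parentId);
            if (parent != null) {
                parent.children.add(task);
            } else {
                topLevel.add(task);   // orphan whose parent was filtered out, or genuinely top-level
            }
        }
        return topLevel;
    }

    public static void main(String[] args) {
        List<FlatTask> tasks = Arrays.asList(
                new FlatTask("a", null), new FlatTask("b", "a"), new FlatTask("c", "missing"));
        System.out.println(group(tasks).size());   // 2: "a" (with child "b") and the orphan "c"
    }
}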
@ -0,0 +1,94 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

package org.elasticsearch.action.admin.cluster.node.tasks.list;

import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;

import java.io.IOException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;

/**
* Information about a currently running task and all its subtasks.
*/
public class TaskGroup implements ToXContent {

private final TaskInfo task;

private final List<TaskGroup> childTasks;


public TaskGroup(TaskInfo task, List<TaskGroup> childTasks) {
this.task = task;
this.childTasks = Collections.unmodifiableList(new ArrayList<>(childTasks));
}

public static Builder builder(TaskInfo taskInfo) {
return new Builder(taskInfo);
}

public static class Builder {
private TaskInfo taskInfo;
private List<Builder> childTasks;

private Builder(TaskInfo taskInfo) {
this.taskInfo = taskInfo;
childTasks = new ArrayList<>();
}

public void addGroup(Builder builder) {
childTasks.add(builder);
}

public TaskInfo getTaskInfo() {
return taskInfo;
}

public TaskGroup build() {
return new TaskGroup(taskInfo, childTasks.stream().map(Builder::build).collect(Collectors.toList()));
}
}

public TaskInfo getTaskInfo() {
return task;
}

public List<TaskGroup> getChildTasks() {
return childTasks;
}

@Override
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
task.toXContent(builder, params);
if (childTasks.isEmpty() == false) {
builder.startArray("children");
for (TaskGroup taskGroup : childTasks) {
builder.startObject();
taskGroup.toXContent(builder, params);
builder.endObject();
}
builder.endArray();
}
return builder;
}
}
@ -39,7 +39,7 @@ import java.util.concurrent.TimeUnit;
* and use in APIs. Instead, immutable and streamable TaskInfo objects are used to represent
* snapshot information about currently running tasks.
*/
public class TaskInfo implements Writeable<TaskInfo>, ToXContent {
public class TaskInfo implements Writeable, ToXContent {

private final DiscoveryNode node;

@ -57,10 +57,12 @@ public class TaskInfo implements Writeable<TaskInfo>, ToXContent {

private final Task.Status status;

private final boolean cancellable;

private final TaskId parentTaskId;

public TaskInfo(DiscoveryNode node, long id, String type, String action, String description, Task.Status status, long startTime,
long runningTimeNanos, TaskId parentTaskId) {
long runningTimeNanos, boolean cancellable, TaskId parentTaskId) {
this.node = node;
this.taskId = new TaskId(node.getId(), id);
this.type = type;
@ -69,23 +71,38 @@ public class TaskInfo implements Writeable<TaskInfo>, ToXContent {
this.status = status;
this.startTime = startTime;
this.runningTimeNanos = runningTimeNanos;
this.cancellable = cancellable;
this.parentTaskId = parentTaskId;
}

/**
* Read from a stream.
*/
public TaskInfo(StreamInput in) throws IOException {
node = DiscoveryNode.readNode(in);
node = new DiscoveryNode(in);
taskId = new TaskId(node.getId(), in.readLong());
type = in.readString();
action = in.readString();
description = in.readOptionalString();
if (in.readBoolean()) {
status = in.readTaskStatus();
} else {
status = null;
}
status = in.readOptionalNamedWriteable(Task.Status.class);
startTime = in.readLong();
runningTimeNanos = in.readLong();
parentTaskId = new TaskId(in);
cancellable = in.readBoolean();
parentTaskId = TaskId.readFromStream(in);
}

@Override
public void writeTo(StreamOutput out) throws IOException {
node.writeTo(out);
out.writeLong(taskId.getId());
out.writeString(type);
out.writeString(action);
out.writeOptionalString(description);
out.writeOptionalNamedWriteable(status);
out.writeLong(startTime);
out.writeLong(runningTimeNanos);
out.writeBoolean(cancellable);
parentTaskId.writeTo(out);
}

public TaskId getTaskId() {
@ -134,6 +151,13 @@ public class TaskInfo implements Writeable<TaskInfo>, ToXContent {
return runningTimeNanos;
}

/**
* Returns true if the task supports cancellation
*/
public boolean isCancellable() {
return cancellable;
}

/**
* Returns the parent task id
*/
@ -141,29 +165,6 @@ public class TaskInfo implements Writeable<TaskInfo>, ToXContent {
return parentTaskId;
}

@Override
public TaskInfo readFrom(StreamInput in) throws IOException {
return new TaskInfo(in);
}

@Override
public void writeTo(StreamOutput out) throws IOException {
node.writeTo(out);
out.writeLong(taskId.getId());
out.writeString(type);
out.writeString(action);
out.writeOptionalString(description);
if (status != null) {
out.writeBoolean(true);
out.writeTaskStatus(status);
} else {
out.writeBoolean(false);
}
out.writeLong(startTime);
out.writeLong(runningTimeNanos);
parentTaskId.writeTo(out);
}

@Override
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
builder.field("node", node.getId());
@ -178,6 +179,7 @@ public class TaskInfo implements Writeable<TaskInfo>, ToXContent {
}
builder.dateValueField("start_time_in_millis", "start_time", startTime);
builder.timeValueField("running_time_in_nanos", "running_time", runningTimeNanos, TimeUnit.NANOSECONDS);
builder.field("cancellable", cancellable);
if (parentTaskId.isSet()) {
builder.field("parent_task_id", parentTaskId.toString());
}
|
@ -51,12 +51,15 @@ public class TransportListTasksAction extends TransportTasksAction<Task, ListTas
|
||||
private static final TimeValue DEFAULT_WAIT_FOR_COMPLETION_TIMEOUT = timeValueSeconds(30);
|
||||
|
||||
@Inject
|
||||
public TransportListTasksAction(Settings settings, ClusterName clusterName, ThreadPool threadPool, ClusterService clusterService, TransportService transportService, ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) {
|
||||
super(settings, ListTasksAction.NAME, clusterName, threadPool, clusterService, transportService, actionFilters, indexNameExpressionResolver, ListTasksRequest::new, ListTasksResponse::new, ThreadPool.Names.MANAGEMENT);
|
||||
public TransportListTasksAction(Settings settings, ClusterName clusterName, ThreadPool threadPool, ClusterService clusterService,
|
||||
TransportService transportService, ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) {
|
||||
super(settings, ListTasksAction.NAME, clusterName, threadPool, clusterService, transportService, actionFilters,
|
||||
indexNameExpressionResolver, ListTasksRequest::new, ListTasksResponse::new, ThreadPool.Names.MANAGEMENT);
|
||||
}
|
||||
|
||||
@Override
|
||||
protected ListTasksResponse newResponse(ListTasksRequest request, List<TaskInfo> tasks, List<TaskOperationFailure> taskOperationFailures, List<FailedNodeException> failedNodeExceptions) {
|
||||
protected ListTasksResponse newResponse(ListTasksRequest request, List<TaskInfo> tasks,
|
||||
List<TaskOperationFailure> taskOperationFailures, List<FailedNodeException> failedNodeExceptions) {
|
||||
return new ListTasksResponse(tasks, taskOperationFailures, failedNodeExceptions);
|
||||
}
|
||||
|
||||
|
@ -148,7 +148,7 @@ public class PutRepositoryRequest extends AcknowledgedRequest<PutRepositoryReque
|
||||
* @return this request
|
||||
*/
|
||||
public PutRepositoryRequest settings(String source) {
|
||||
this.settings = Settings.settingsBuilder().loadFromSource(source).build();
|
||||
this.settings = Settings.builder().loadFromSource(source).build();
|
||||
return this;
|
||||
}
|
||||
|
||||
|
@ -26,7 +26,6 @@ import org.elasticsearch.common.io.stream.StreamInput;
|
||||
import org.elasticsearch.common.io.stream.StreamOutput;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
import org.elasticsearch.common.xcontent.XContentHelper;
|
||||
|
||||
import java.io.IOException;
|
||||
@ -55,7 +54,7 @@ public class VerifyRepositoryResponse extends ActionResponse implements ToXConte
|
||||
clusterName = ClusterName.readClusterName(in);
|
||||
nodes = new DiscoveryNode[in.readVInt()];
|
||||
for (int i=0; i<nodes.length; i++){
|
||||
nodes[i] = DiscoveryNode.readNode(in);
|
||||
nodes[i] = new DiscoveryNode(in);
|
||||
}
|
||||
}
|
||||
|
||||
@ -78,16 +77,16 @@ public class VerifyRepositoryResponse extends ActionResponse implements ToXConte
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString NODES = new XContentBuilderString("nodes");
|
||||
static final XContentBuilderString NAME = new XContentBuilderString("name");
|
||||
static final String NODES = "nodes";
|
||||
static final String NAME = "name";
|
||||
}
|
||||
|
||||
@Override
|
||||
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
builder.startObject(Fields.NODES);
|
||||
for (DiscoveryNode node : nodes) {
|
||||
builder.startObject(node.id(), XContentBuilder.FieldCaseConversion.NONE);
|
||||
builder.field(Fields.NAME, node.name(), XContentBuilder.FieldCaseConversion.NONE);
|
||||
builder.startObject(node.getId());
|
||||
builder.field(Fields.NAME, node.getName());
|
||||
builder.endObject();
|
||||
}
|
||||
builder.endObject();
|
||||
|
@ -23,7 +23,9 @@ import org.elasticsearch.ElasticsearchParseException;
|
||||
import org.elasticsearch.action.ActionRequestValidationException;
|
||||
import org.elasticsearch.action.support.master.AcknowledgedRequest;
|
||||
import org.elasticsearch.cluster.routing.allocation.command.AllocationCommand;
|
||||
import org.elasticsearch.cluster.routing.allocation.command.AllocationCommandRegistry;
|
||||
import org.elasticsearch.cluster.routing.allocation.command.AllocationCommands;
|
||||
import org.elasticsearch.common.ParseFieldMatcher;
|
||||
import org.elasticsearch.common.bytes.BytesReference;
|
||||
import org.elasticsearch.common.io.stream.StreamInput;
|
||||
import org.elasticsearch.common.io.stream.StreamOutput;
|
||||
@ -36,7 +38,6 @@ import java.io.IOException;
|
||||
* Request to submit cluster reroute allocation commands
|
||||
*/
|
||||
public class ClusterRerouteRequest extends AcknowledgedRequest<ClusterRerouteRequest> {
|
||||
|
||||
AllocationCommands commands = new AllocationCommands();
|
||||
boolean dryRun;
|
||||
boolean explain;
|
||||
@ -87,10 +88,19 @@ public class ClusterRerouteRequest extends AcknowledgedRequest<ClusterRerouteReq
|
||||
return this.explain;
|
||||
}
|
||||
|
||||
/**
|
||||
* Set the allocation commands to execute.
|
||||
*/
|
||||
public ClusterRerouteRequest commands(AllocationCommand... commands) {
|
||||
this.commands = new AllocationCommands(commands);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets the source for the request.
|
||||
*/
|
||||
public ClusterRerouteRequest source(BytesReference source) throws Exception {
|
||||
public ClusterRerouteRequest source(BytesReference source, AllocationCommandRegistry registry, ParseFieldMatcher parseFieldMatcher)
|
||||
throws Exception {
|
||||
try (XContentParser parser = XContentHelper.createParser(source)) {
|
||||
XContentParser.Token token;
|
||||
String currentFieldName = null;
|
||||
@ -99,7 +109,7 @@ public class ClusterRerouteRequest extends AcknowledgedRequest<ClusterRerouteReq
|
||||
currentFieldName = parser.currentName();
|
||||
} else if (token == XContentParser.Token.START_ARRAY) {
|
||||
if ("commands".equals(currentFieldName)) {
|
||||
this.commands = AllocationCommands.fromXContent(parser);
|
||||
this.commands = AllocationCommands.fromXContent(parser, parseFieldMatcher, registry);
|
||||
} else {
|
||||
throw new ElasticsearchParseException("failed to parse reroute request, got start array with wrong field name [{}]", currentFieldName);
|
||||
}
|
||||
|
@ -61,10 +61,10 @@ public class ClusterRerouteRequestBuilder extends AcknowledgedRequestBuilder<Clu
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets the source for the request
|
||||
* Sets the commands for the request to execute.
|
||||
*/
|
||||
public ClusterRerouteRequestBuilder setSource(BytesReference source) throws Exception {
|
||||
request.source(source);
|
||||
public ClusterRerouteRequestBuilder setCommands(AllocationCommand... commands) throws Exception {
|
||||
request.commands(commands);
|
||||
return this;
|
||||
}
|
||||
}
|
||||
|
@ -85,7 +85,7 @@ public class ClusterUpdateSettingsRequest extends AcknowledgedRequest<ClusterUpd
|
||||
* Sets the source containing the transient settings to be updated. They will not survive a full cluster restart
|
||||
*/
|
||||
public ClusterUpdateSettingsRequest transientSettings(String source) {
|
||||
this.transientSettings = Settings.settingsBuilder().loadFromSource(source).build();
|
||||
this.transientSettings = Settings.builder().loadFromSource(source).build();
|
||||
return this;
|
||||
}
|
||||
|
||||
@ -124,7 +124,7 @@ public class ClusterUpdateSettingsRequest extends AcknowledgedRequest<ClusterUpd
|
||||
* Sets the source containing the persistent settings to be updated. They will get applied cross restarts
|
||||
*/
|
||||
public ClusterUpdateSettingsRequest persistentSettings(String source) {
|
||||
this.persistentSettings = Settings.settingsBuilder().loadFromSource(source).build();
|
||||
this.persistentSettings = Settings.builder().loadFromSource(source).build();
|
||||
return this;
|
||||
}
|
||||
|
||||
|
@ -32,8 +32,8 @@ import static org.elasticsearch.cluster.ClusterState.builder;
|
||||
* due to the update.
|
||||
*/
|
||||
final class SettingsUpdater {
|
||||
final Settings.Builder transientUpdates = Settings.settingsBuilder();
|
||||
final Settings.Builder persistentUpdates = Settings.settingsBuilder();
|
||||
final Settings.Builder transientUpdates = Settings.builder();
|
||||
final Settings.Builder persistentUpdates = Settings.builder();
|
||||
private final ClusterSettings clusterSettings;
|
||||
|
||||
SettingsUpdater(ClusterSettings clusterSettings) {
|
||||
@ -50,11 +50,11 @@ final class SettingsUpdater {
|
||||
|
||||
synchronized ClusterState updateSettings(final ClusterState currentState, Settings transientToApply, Settings persistentToApply) {
|
||||
boolean changed = false;
|
||||
Settings.Builder transientSettings = Settings.settingsBuilder();
|
||||
Settings.Builder transientSettings = Settings.builder();
|
||||
transientSettings.put(currentState.metaData().transientSettings());
|
||||
changed |= clusterSettings.updateDynamicSettings(transientToApply, transientSettings, transientUpdates, "transient");
|
||||
|
||||
Settings.Builder persistentSettings = Settings.settingsBuilder();
|
||||
Settings.Builder persistentSettings = Settings.builder();
|
||||
persistentSettings.put(currentState.metaData().persistentSettings());
|
||||
changed |= clusterSettings.updateDynamicSettings(persistentToApply, persistentSettings, persistentUpdates, "persistent");
|
||||
|
||||
|
@ -114,7 +114,7 @@ public class TransportClusterUpdateSettingsAction extends TransportMasterNodeAct
|
||||
// We're about to send a second update task, so we need to check if we're still the elected master
|
||||
// For example the minimum_master_node could have been breached and we're no longer elected master,
|
||||
// so we should *not* execute the reroute.
|
||||
if (!clusterService.state().nodes().localNodeMaster()) {
|
||||
if (!clusterService.state().nodes().isLocalNodeElectedMaster()) {
|
||||
logger.debug("Skipping reroute after cluster update settings, because node is no longer master");
|
||||
listener.onResponse(new ClusterUpdateSettingsResponse(updateSettingsAcked, updater.getTransientUpdates(), updater.getPersistentUpdate()));
|
||||
return;
|
||||
|
@ -26,6 +26,7 @@ import org.elasticsearch.common.io.stream.Streamable;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.index.Index;
|
||||
import org.elasticsearch.index.shard.ShardId;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
@ -33,16 +34,14 @@ import java.io.IOException;
|
||||
*/
|
||||
public class ClusterSearchShardsGroup implements Streamable, ToXContent {
|
||||
|
||||
private Index index;
|
||||
private int shardId;
|
||||
private ShardId shardId;
|
||||
ShardRouting[] shards;
|
||||
|
||||
ClusterSearchShardsGroup() {
|
||||
|
||||
}
|
||||
|
||||
public ClusterSearchShardsGroup(Index index, int shardId, ShardRouting[] shards) {
|
||||
this.index = index;
|
||||
public ClusterSearchShardsGroup(ShardId shardId, ShardRouting[] shards) {
|
||||
this.shardId = shardId;
|
||||
this.shards = shards;
|
||||
}
|
||||
@ -54,11 +53,11 @@ public class ClusterSearchShardsGroup implements Streamable, ToXContent {
|
||||
}
|
||||
|
||||
public String getIndex() {
|
||||
return index.getName();
|
||||
return shardId.getIndexName();
|
||||
}
|
||||
|
||||
public int getShardId() {
|
||||
return shardId;
|
||||
return shardId.id();
|
||||
}
|
||||
|
||||
public ShardRouting[] getShards() {
|
||||
@ -67,18 +66,16 @@ public class ClusterSearchShardsGroup implements Streamable, ToXContent {
|
||||
|
||||
@Override
|
||||
public void readFrom(StreamInput in) throws IOException {
|
||||
index = new Index(in);
|
||||
shardId = in.readVInt();
|
||||
shardId = ShardId.readShardId(in);
|
||||
shards = new ShardRouting[in.readVInt()];
|
||||
for (int i = 0; i < shards.length; i++) {
|
||||
shards[i] = ShardRouting.readShardRoutingEntry(in, index, shardId);
|
||||
shards[i] = new ShardRouting(shardId, in);
|
||||
}
|
||||
}
|
||||
|
||||
@Override
|
||||
public void writeTo(StreamOutput out) throws IOException {
|
||||
index.writeTo(out);
|
||||
out.writeVInt(shardId);
|
||||
shardId.writeTo(out);
|
||||
out.writeVInt(shards.length);
|
||||
for (ShardRouting shardRouting : shards) {
|
||||
shardRouting.writeToThin(out);
|
||||
|
@ -61,7 +61,7 @@ public class ClusterSearchShardsResponse extends ActionResponse implements ToXCo
|
||||
}
|
||||
nodes = new DiscoveryNode[in.readVInt()];
|
||||
for (int i = 0; i < nodes.length; i++) {
|
||||
nodes[i] = DiscoveryNode.readNode(in);
|
||||
nodes[i] = new DiscoveryNode(in);
|
||||
}
|
||||
|
||||
}
|
||||
|
@ -34,6 +34,7 @@ import org.elasticsearch.cluster.service.ClusterService;
|
||||
import org.elasticsearch.common.inject.Inject;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.index.Index;
|
||||
import org.elasticsearch.index.shard.ShardId;
|
||||
import org.elasticsearch.threadpool.ThreadPool;
|
||||
import org.elasticsearch.transport.TransportService;
|
||||
|
||||
@ -78,8 +79,7 @@ public class TransportClusterSearchShardsAction extends TransportMasterNodeReadA
|
||||
ClusterSearchShardsGroup[] groupResponses = new ClusterSearchShardsGroup[groupShardsIterator.size()];
|
||||
int currentGroup = 0;
|
||||
for (ShardIterator shardIt : groupShardsIterator) {
|
||||
Index index = shardIt.shardId().getIndex();
|
||||
int shardId = shardIt.shardId().getId();
|
||||
ShardId shardId = shardIt.shardId();
|
||||
ShardRouting[] shardRoutings = new ShardRouting[shardIt.size()];
|
||||
int currentShard = 0;
|
||||
shardIt.reset();
|
||||
@ -87,7 +87,7 @@ public class TransportClusterSearchShardsAction extends TransportMasterNodeReadA
|
||||
shardRoutings[currentShard++] = shard;
|
||||
nodeIds.add(shard.currentNodeId());
|
||||
}
|
||||
groupResponses[currentGroup++] = new ClusterSearchShardsGroup(index, shardId, shardRoutings);
|
||||
groupResponses[currentGroup++] = new ClusterSearchShardsGroup(shardId, shardRoutings);
|
||||
}
|
||||
DiscoveryNode[] nodes = new DiscoveryNode[nodeIds.size()];
|
||||
int currentNode = 0;
|
||||
|
@ -299,7 +299,7 @@ public class CreateSnapshotRequest extends MasterNodeRequest<CreateSnapshotReque
|
||||
* @return this request
|
||||
*/
|
||||
public CreateSnapshotRequest settings(String source) {
|
||||
this.settings = Settings.settingsBuilder().loadFromSource(source).build();
|
||||
this.settings = Settings.builder().loadFromSource(source).build();
|
||||
return this;
|
||||
}
|
||||
|
||||
|
@ -25,7 +25,6 @@ import org.elasticsearch.common.io.stream.StreamInput;
|
||||
import org.elasticsearch.common.io.stream.StreamOutput;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
import org.elasticsearch.rest.RestStatus;
|
||||
import org.elasticsearch.snapshots.SnapshotInfo;
|
||||
|
||||
@ -58,13 +57,13 @@ public class CreateSnapshotResponse extends ActionResponse implements ToXContent
|
||||
@Override
|
||||
public void readFrom(StreamInput in) throws IOException {
|
||||
super.readFrom(in);
|
||||
snapshotInfo = SnapshotInfo.readOptionalSnapshotInfo(in);
|
||||
snapshotInfo = in.readOptionalWriteable(SnapshotInfo::new);
|
||||
}
|
||||
|
||||
@Override
|
||||
public void writeTo(StreamOutput out) throws IOException {
|
||||
super.writeTo(out);
|
||||
out.writeOptionalStreamable(snapshotInfo);
|
||||
out.writeOptionalWriteable(snapshotInfo);
|
||||
}
|
||||
|
||||
/**
|
||||
@ -83,15 +82,15 @@ public class CreateSnapshotResponse extends ActionResponse implements ToXContent
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString SNAPSHOT = new XContentBuilderString("snapshot");
|
||||
static final XContentBuilderString ACCEPTED = new XContentBuilderString("accepted");
|
||||
static final String SNAPSHOT = "snapshot";
|
||||
static final String ACCEPTED = "accepted";
|
||||
}
|
||||
|
||||
@Override
|
||||
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
if (snapshotInfo != null) {
|
||||
builder.field(Fields.SNAPSHOT);
|
||||
snapshotInfo.toXContent(builder, params);
|
||||
snapshotInfo.toExternalXContent(builder, params);
|
||||
} else {
|
||||
builder.field(Fields.ACCEPTED, true);
|
||||
}
|
||||
|
@ -24,7 +24,6 @@ import org.elasticsearch.common.io.stream.StreamInput;
|
||||
import org.elasticsearch.common.io.stream.StreamOutput;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
import org.elasticsearch.snapshots.SnapshotInfo;
|
||||
|
||||
import java.io.IOException;
|
||||
@ -61,7 +60,7 @@ public class GetSnapshotsResponse extends ActionResponse implements ToXContent {
|
||||
int size = in.readVInt();
|
||||
List<SnapshotInfo> builder = new ArrayList<>();
|
||||
for (int i = 0; i < size; i++) {
|
||||
builder.add(SnapshotInfo.readSnapshotInfo(in));
|
||||
builder.add(new SnapshotInfo(in));
|
||||
}
|
||||
snapshots = Collections.unmodifiableList(builder);
|
||||
}
|
||||
@ -76,14 +75,14 @@ public class GetSnapshotsResponse extends ActionResponse implements ToXContent {
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString SNAPSHOTS = new XContentBuilderString("snapshots");
|
||||
static final String SNAPSHOTS = "snapshots";
|
||||
}
|
||||
|
||||
@Override
|
||||
public XContentBuilder toXContent(XContentBuilder builder, ToXContent.Params params) throws IOException {
|
||||
builder.startArray(Fields.SNAPSHOTS);
|
||||
for (SnapshotInfo snapshotInfo : snapshots) {
|
||||
snapshotInfo.toXContent(builder, params);
|
||||
snapshotInfo.toExternalXContent(builder, params);
|
||||
}
|
||||
builder.endArray();
|
||||
return builder;
|
||||
|
@ -31,7 +31,6 @@ import org.elasticsearch.cluster.service.ClusterService;
|
||||
import org.elasticsearch.common.inject.Inject;
|
||||
import org.elasticsearch.common.regex.Regex;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.snapshots.Snapshot;
|
||||
import org.elasticsearch.snapshots.SnapshotInfo;
|
||||
import org.elasticsearch.snapshots.SnapshotsService;
|
||||
import org.elasticsearch.threadpool.ThreadPool;
|
||||
@ -77,18 +76,12 @@ public class TransportGetSnapshotsAction extends TransportMasterNodeAction<GetSn
|
||||
try {
|
||||
List<SnapshotInfo> snapshotInfoBuilder = new ArrayList<>();
|
||||
if (isAllSnapshots(request.snapshots())) {
|
||||
List<Snapshot> snapshots = snapshotsService.snapshots(request.repository(), request.ignoreUnavailable());
|
||||
for (Snapshot snapshot : snapshots) {
|
||||
snapshotInfoBuilder.add(new SnapshotInfo(snapshot));
|
||||
}
|
||||
snapshotInfoBuilder.addAll(snapshotsService.snapshots(request.repository(), request.ignoreUnavailable()));
|
||||
} else if (isCurrentSnapshots(request.snapshots())) {
|
||||
List<Snapshot> snapshots = snapshotsService.currentSnapshots(request.repository());
|
||||
for (Snapshot snapshot : snapshots) {
|
||||
snapshotInfoBuilder.add(new SnapshotInfo(snapshot));
|
||||
}
|
||||
snapshotInfoBuilder.addAll(snapshotsService.currentSnapshots(request.repository()));
|
||||
} else {
|
||||
Set<String> snapshotsToGet = new LinkedHashSet<>(); // to keep insertion order
|
||||
List<Snapshot> snapshots = null;
|
||||
List<SnapshotInfo> snapshots = null;
|
||||
for (String snapshotOrPattern : request.snapshots()) {
|
||||
if (Regex.isSimpleMatchPattern(snapshotOrPattern) == false) {
|
||||
snapshotsToGet.add(snapshotOrPattern);
|
||||
@ -96,7 +89,7 @@ public class TransportGetSnapshotsAction extends TransportMasterNodeAction<GetSn
|
||||
if (snapshots == null) { // lazily load snapshots
|
||||
snapshots = snapshotsService.snapshots(request.repository(), request.ignoreUnavailable());
|
||||
}
|
||||
for (Snapshot snapshot : snapshots) {
|
||||
for (SnapshotInfo snapshot : snapshots) {
|
||||
if (Regex.simpleMatch(snapshotOrPattern, snapshot.name())) {
|
||||
snapshotsToGet.add(snapshot.name());
|
||||
}
|
||||
@ -105,7 +98,7 @@ public class TransportGetSnapshotsAction extends TransportMasterNodeAction<GetSn
|
||||
}
|
||||
for (String snapshot : snapshotsToGet) {
|
||||
SnapshotId snapshotId = new SnapshotId(request.repository(), snapshot);
|
||||
snapshotInfoBuilder.add(new SnapshotInfo(snapshotsService.snapshot(snapshotId)));
|
||||
snapshotInfoBuilder.add(snapshotsService.snapshot(snapshotId));
|
||||
}
|
||||
}
|
||||
listener.onResponse(new GetSnapshotsResponse(Collections.unmodifiableList(snapshotInfoBuilder)));
|
||||
|
@ -324,7 +324,7 @@ public class RestoreSnapshotRequest extends MasterNodeRequest<RestoreSnapshotReq
|
||||
* @return this request
|
||||
*/
|
||||
public RestoreSnapshotRequest settings(String source) {
|
||||
this.settings = Settings.settingsBuilder().loadFromSource(source).build();
|
||||
this.settings = Settings.builder().loadFromSource(source).build();
|
||||
return this;
|
||||
}
|
||||
|
||||
@ -441,7 +441,7 @@ public class RestoreSnapshotRequest extends MasterNodeRequest<RestoreSnapshotReq
|
||||
* Sets settings that should be added/changed in all restored indices
|
||||
*/
|
||||
public RestoreSnapshotRequest indexSettings(String source) {
|
||||
this.indexSettings = Settings.settingsBuilder().loadFromSource(source).build();
|
||||
this.indexSettings = Settings.builder().loadFromSource(source).build();
|
||||
return this;
|
||||
}
|
||||
|
||||
|
@ -25,7 +25,6 @@ import org.elasticsearch.common.io.stream.StreamInput;
|
||||
import org.elasticsearch.common.io.stream.StreamOutput;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
import org.elasticsearch.rest.RestStatus;
|
||||
import org.elasticsearch.snapshots.RestoreInfo;
|
||||
|
||||
@ -75,8 +74,8 @@ public class RestoreSnapshotResponse extends ActionResponse implements ToXConten
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString SNAPSHOT = new XContentBuilderString("snapshot");
|
||||
static final XContentBuilderString ACCEPTED = new XContentBuilderString("accepted");
|
||||
static final String SNAPSHOT = "snapshot";
|
||||
static final String ACCEPTED = "accepted";
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -24,7 +24,6 @@ import org.elasticsearch.common.io.stream.StreamInput;
|
||||
import org.elasticsearch.common.io.stream.StreamOutput;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
import org.elasticsearch.index.shard.ShardId;
|
||||
import org.elasticsearch.index.snapshots.IndexShardSnapshotStatus;
|
||||
|
||||
@ -135,9 +134,9 @@ public class SnapshotIndexShardStatus extends BroadcastShardResponse implements
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString STAGE = new XContentBuilderString("stage");
|
||||
static final XContentBuilderString REASON = new XContentBuilderString("reason");
|
||||
static final XContentBuilderString NODE = new XContentBuilderString("node");
|
||||
static final String STAGE = "stage";
|
||||
static final String REASON = "reason";
|
||||
static final String NODE = "node";
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -21,7 +21,6 @@ package org.elasticsearch.action.admin.cluster.snapshots.status;
|
||||
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.Collection;
|
||||
@ -91,12 +90,12 @@ public class SnapshotIndexStatus implements Iterable<SnapshotIndexShardStatus>,
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString SHARDS = new XContentBuilderString("shards");
|
||||
static final String SHARDS = "shards";
|
||||
}
|
||||
|
||||
@Override
|
||||
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
builder.startObject(getIndex(), XContentBuilder.FieldCaseConversion.NONE);
|
||||
builder.startObject(getIndex());
|
||||
shardsStats.toXContent(builder, params);
|
||||
stats.toXContent(builder, params);
|
||||
builder.startObject(Fields.SHARDS);
|
||||
|
@ -21,7 +21,6 @@ package org.elasticsearch.action.admin.cluster.snapshots.status;
|
||||
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.Collection;
|
||||
@ -106,13 +105,13 @@ public class SnapshotShardsStats implements ToXContent {
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString SHARDS_STATS = new XContentBuilderString("shards_stats");
|
||||
static final XContentBuilderString INITIALIZING = new XContentBuilderString("initializing");
|
||||
static final XContentBuilderString STARTED = new XContentBuilderString("started");
|
||||
static final XContentBuilderString FINALIZING = new XContentBuilderString("finalizing");
|
||||
static final XContentBuilderString DONE = new XContentBuilderString("done");
|
||||
static final XContentBuilderString FAILED = new XContentBuilderString("failed");
|
||||
static final XContentBuilderString TOTAL = new XContentBuilderString("total");
|
||||
static final String SHARDS_STATS = "shards_stats";
|
||||
static final String INITIALIZING = "initializing";
|
||||
static final String STARTED = "started";
|
||||
static final String FINALIZING = "finalizing";
|
||||
static final String DONE = "done";
|
||||
static final String FAILED = "failed";
|
||||
static final String TOTAL = "total";
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -24,7 +24,6 @@ import org.elasticsearch.common.io.stream.StreamOutput;
|
||||
import org.elasticsearch.common.io.stream.Streamable;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
import org.elasticsearch.index.snapshots.IndexShardSnapshotStatus;
|
||||
|
||||
import java.io.IOException;
|
||||
@ -130,16 +129,16 @@ public class SnapshotStats implements Streamable, ToXContent {
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString STATS = new XContentBuilderString("stats");
|
||||
static final XContentBuilderString NUMBER_OF_FILES = new XContentBuilderString("number_of_files");
|
||||
static final XContentBuilderString PROCESSED_FILES = new XContentBuilderString("processed_files");
|
||||
static final XContentBuilderString TOTAL_SIZE_IN_BYTES = new XContentBuilderString("total_size_in_bytes");
|
||||
static final XContentBuilderString TOTAL_SIZE = new XContentBuilderString("total_size");
|
||||
static final XContentBuilderString PROCESSED_SIZE_IN_BYTES = new XContentBuilderString("processed_size_in_bytes");
|
||||
static final XContentBuilderString PROCESSED_SIZE = new XContentBuilderString("processed_size");
|
||||
static final XContentBuilderString START_TIME_IN_MILLIS = new XContentBuilderString("start_time_in_millis");
|
||||
static final XContentBuilderString TIME_IN_MILLIS = new XContentBuilderString("time_in_millis");
|
||||
static final XContentBuilderString TIME = new XContentBuilderString("time");
|
||||
static final String STATS = "stats";
|
||||
static final String NUMBER_OF_FILES = "number_of_files";
|
||||
static final String PROCESSED_FILES = "processed_files";
|
||||
static final String TOTAL_SIZE_IN_BYTES = "total_size_in_bytes";
|
||||
static final String TOTAL_SIZE = "total_size";
|
||||
static final String PROCESSED_SIZE_IN_BYTES = "processed_size_in_bytes";
|
||||
static final String PROCESSED_SIZE = "processed_size";
|
||||
static final String START_TIME_IN_MILLIS = "start_time_in_millis";
|
||||
static final String TIME_IN_MILLIS = "time_in_millis";
|
||||
static final String TIME = "time";
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -26,7 +26,6 @@ import org.elasticsearch.common.io.stream.StreamOutput;
|
||||
import org.elasticsearch.common.io.stream.Streamable;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
import org.elasticsearch.common.xcontent.XContentFactory;
|
||||
|
||||
import java.io.IOException;
|
||||
@ -180,10 +179,10 @@ public class SnapshotStatus implements ToXContent, Streamable {
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString SNAPSHOT = new XContentBuilderString("snapshot");
|
||||
static final XContentBuilderString REPOSITORY = new XContentBuilderString("repository");
|
||||
static final XContentBuilderString STATE = new XContentBuilderString("state");
|
||||
static final XContentBuilderString INDICES = new XContentBuilderString("indices");
|
||||
static final String SNAPSHOT = "snapshot";
|
||||
static final String REPOSITORY = "repository";
|
||||
static final String STATE = "state";
|
||||
static final String INDICES = "indices";
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -24,7 +24,6 @@ import org.elasticsearch.common.io.stream.StreamInput;
|
||||
import org.elasticsearch.common.io.stream.StreamOutput;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.ArrayList;
|
||||
@ -75,7 +74,7 @@ public class SnapshotsStatusResponse extends ActionResponse implements ToXConten
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString SNAPSHOTS = new XContentBuilderString("snapshots");
|
||||
static final String SNAPSHOTS = "snapshots";
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -43,18 +43,20 @@ import org.elasticsearch.threadpool.ThreadPool;
|
||||
import org.elasticsearch.transport.TransportService;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.HashMap;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
import java.util.concurrent.atomic.AtomicReferenceArray;
|
||||
|
||||
import static java.util.Collections.unmodifiableMap;
|
||||
|
||||
/**
|
||||
* Transport client that collects snapshot shard statuses from data nodes
|
||||
*/
|
||||
public class TransportNodesSnapshotsStatus extends TransportNodesAction<TransportNodesSnapshotsStatus.Request, TransportNodesSnapshotsStatus.NodesSnapshotStatus, TransportNodesSnapshotsStatus.NodeRequest, TransportNodesSnapshotsStatus.NodeSnapshotStatus> {
|
||||
public class TransportNodesSnapshotsStatus extends TransportNodesAction<TransportNodesSnapshotsStatus.Request,
|
||||
TransportNodesSnapshotsStatus.NodesSnapshotStatus,
|
||||
TransportNodesSnapshotsStatus.NodeRequest,
|
||||
TransportNodesSnapshotsStatus.NodeSnapshotStatus> {
|
||||
|
||||
public static final String ACTION_NAME = SnapshotsStatusAction.NAME + "[nodes]";
|
||||
|
||||
@ -66,7 +68,7 @@ public class TransportNodesSnapshotsStatus extends TransportNodesAction<Transpor
|
||||
SnapshotShardsService snapshotShardsService, ActionFilters actionFilters,
|
||||
IndexNameExpressionResolver indexNameExpressionResolver) {
|
||||
super(settings, ACTION_NAME, clusterName, threadPool, clusterService, transportService, actionFilters, indexNameExpressionResolver,
|
||||
Request::new, NodeRequest::new, ThreadPool.Names.GENERIC);
|
||||
Request::new, NodeRequest::new, ThreadPool.Names.GENERIC, NodeSnapshotStatus.class);
|
||||
this.snapshotShardsService = snapshotShardsService;
|
||||
}
|
||||
|
||||
@ -86,28 +88,15 @@ public class TransportNodesSnapshotsStatus extends TransportNodesAction<Transpor
|
||||
}
|
||||
|
||||
@Override
|
||||
protected NodesSnapshotStatus newResponse(Request request, AtomicReferenceArray responses) {
|
||||
final List<NodeSnapshotStatus> nodesList = new ArrayList<>();
|
||||
final List<FailedNodeException> failures = new ArrayList<>();
|
||||
for (int i = 0; i < responses.length(); i++) {
|
||||
Object resp = responses.get(i);
|
||||
if (resp instanceof NodeSnapshotStatus) { // will also filter out null response for unallocated ones
|
||||
nodesList.add((NodeSnapshotStatus) resp);
|
||||
} else if (resp instanceof FailedNodeException) {
|
||||
failures.add((FailedNodeException) resp);
|
||||
} else {
|
||||
logger.warn("unknown response type [{}], expected NodeSnapshotStatus or FailedNodeException", resp);
|
||||
}
|
||||
}
|
||||
return new NodesSnapshotStatus(clusterName, nodesList.toArray(new NodeSnapshotStatus[nodesList.size()]),
|
||||
failures.toArray(new FailedNodeException[failures.size()]));
|
||||
protected NodesSnapshotStatus newResponse(Request request, List<NodeSnapshotStatus> responses, List<FailedNodeException> failures) {
|
||||
return new NodesSnapshotStatus(clusterName, responses, failures);
|
||||
}
|
||||
|
||||
@Override
|
||||
protected NodeSnapshotStatus nodeOperation(NodeRequest request) {
|
||||
Map<SnapshotId, Map<ShardId, SnapshotIndexShardStatus>> snapshotMapBuilder = new HashMap<>();
|
||||
try {
|
||||
String nodeId = clusterService.localNode().id();
|
||||
String nodeId = clusterService.localNode().getId();
|
||||
for (SnapshotId snapshotId : request.snapshotIds) {
|
||||
Map<ShardId, IndexShardSnapshotStatus> shardsStatus = snapshotShardsService.currentSnapshotShards(snapshotId);
|
||||
if (shardsStatus == null) {
|
||||
@ -169,75 +158,47 @@ public class TransportNodesSnapshotsStatus extends TransportNodesAction<Transpor
|
||||
|
||||
public static class NodesSnapshotStatus extends BaseNodesResponse<NodeSnapshotStatus> {
|
||||
|
||||
private FailedNodeException[] failures;
|
||||
|
||||
NodesSnapshotStatus() {
|
||||
}
|
||||
|
||||
public NodesSnapshotStatus(ClusterName clusterName, NodeSnapshotStatus[] nodes, FailedNodeException[] failures) {
|
||||
super(clusterName, nodes);
|
||||
this.failures = failures;
|
||||
public NodesSnapshotStatus(ClusterName clusterName, List<NodeSnapshotStatus> nodes, List<FailedNodeException> failures) {
|
||||
super(clusterName, nodes, failures);
|
||||
}
|
||||
|
||||
@Override
|
||||
public FailedNodeException[] failures() {
|
||||
return failures;
|
||||
protected List<NodeSnapshotStatus> readNodesFrom(StreamInput in) throws IOException {
|
||||
return in.readStreamableList(NodeSnapshotStatus::new);
|
||||
}
|
||||
|
||||
@Override
|
||||
public void readFrom(StreamInput in) throws IOException {
|
||||
super.readFrom(in);
|
||||
nodes = new NodeSnapshotStatus[in.readVInt()];
|
||||
for (int i = 0; i < nodes.length; i++) {
|
||||
nodes[i] = new NodeSnapshotStatus();
|
||||
nodes[i].readFrom(in);
|
||||
}
|
||||
}
|
||||
|
||||
@Override
|
||||
public void writeTo(StreamOutput out) throws IOException {
|
||||
super.writeTo(out);
|
||||
out.writeVInt(nodes.length);
|
||||
for (NodeSnapshotStatus response : nodes) {
|
||||
response.writeTo(out);
|
||||
}
|
||||
protected void writeNodesTo(StreamOutput out, List<NodeSnapshotStatus> nodes) throws IOException {
|
||||
out.writeStreamableList(nodes);
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
public static class NodeRequest extends BaseNodeRequest {
|
||||
|
||||
private SnapshotId[] snapshotIds;
|
||||
private List<SnapshotId> snapshotIds;
|
||||
|
||||
public NodeRequest() {
|
||||
}
|
||||
|
||||
NodeRequest(String nodeId, TransportNodesSnapshotsStatus.Request request) {
|
||||
super(nodeId);
|
||||
snapshotIds = request.snapshotIds;
|
||||
snapshotIds = Arrays.asList(request.snapshotIds);
|
||||
}
|
||||
|
||||
@Override
|
||||
public void readFrom(StreamInput in) throws IOException {
|
||||
super.readFrom(in);
|
||||
int n = in.readVInt();
|
||||
snapshotIds = new SnapshotId[n];
|
||||
for (int i = 0; i < n; i++) {
|
||||
snapshotIds[i] = SnapshotId.readSnapshotId(in);
|
||||
}
|
||||
snapshotIds = in.readList(SnapshotId::readSnapshotId);
|
||||
}
|
||||
|
||||
@Override
|
||||
public void writeTo(StreamOutput out) throws IOException {
|
||||
super.writeTo(out);
|
||||
if (snapshotIds != null) {
|
||||
out.writeVInt(snapshotIds.length);
|
||||
for (int i = 0; i < snapshotIds.length; i++) {
|
||||
snapshotIds[i].writeTo(out);
|
||||
}
|
||||
} else {
|
||||
out.writeVInt(0);
|
||||
}
|
||||
out.writeStreamableList(snapshotIds);
|
||||
}
|
||||
}
|
||||
|
||||
|
@ -36,7 +36,7 @@ import org.elasticsearch.common.inject.Inject;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.index.shard.ShardId;
|
||||
import org.elasticsearch.index.snapshots.IndexShardSnapshotStatus;
|
||||
import org.elasticsearch.snapshots.Snapshot;
|
||||
import org.elasticsearch.snapshots.SnapshotInfo;
|
||||
import org.elasticsearch.snapshots.SnapshotsService;
|
||||
import org.elasticsearch.threadpool.ThreadPool;
|
||||
import org.elasticsearch.transport.TransportService;
|
||||
@ -202,7 +202,7 @@ public class TransportSnapshotsStatusAction extends TransportMasterNodeAction<Sn
|
||||
// This is a snapshot the is currently running - skipping
|
||||
continue;
|
||||
}
|
||||
Snapshot snapshot = snapshotsService.snapshot(snapshotId);
|
||||
SnapshotInfo snapshot = snapshotsService.snapshot(snapshotId);
|
||||
List<SnapshotIndexShardStatus> shardStatusBuilder = new ArrayList<>();
|
||||
if (snapshot.state().completed()) {
|
||||
Map<ShardId, IndexShardSnapshotStatus> shardStatues = snapshotsService.snapshotShards(snapshotId);
|
||||
|
@ -47,7 +47,7 @@ public class TransportClusterStateAction extends TransportMasterNodeReadAction<C
|
||||
@Inject
|
||||
public TransportClusterStateAction(Settings settings, TransportService transportService, ClusterService clusterService, ThreadPool threadPool,
|
||||
ClusterName clusterName, ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) {
|
||||
super(settings, ClusterStateAction.NAME, transportService, clusterService, threadPool, actionFilters, indexNameExpressionResolver, ClusterStateRequest::new);
|
||||
super(settings, ClusterStateAction.NAME, false, transportService, clusterService, threadPool, actionFilters, indexNameExpressionResolver, ClusterStateRequest::new);
|
||||
this.clusterName = clusterName;
|
||||
}
|
||||
|
||||
|
@ -22,12 +22,8 @@ package org.elasticsearch.action.admin.cluster.stats;
|
||||
import com.carrotsearch.hppc.ObjectObjectHashMap;
|
||||
import com.carrotsearch.hppc.cursors.ObjectObjectCursor;
|
||||
import org.elasticsearch.action.admin.indices.stats.CommonStats;
|
||||
import org.elasticsearch.common.io.stream.StreamInput;
|
||||
import org.elasticsearch.common.io.stream.StreamOutput;
|
||||
import org.elasticsearch.common.io.stream.Streamable;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
import org.elasticsearch.index.cache.query.QueryCacheStats;
|
||||
import org.elasticsearch.index.engine.SegmentsStats;
|
||||
import org.elasticsearch.index.fielddata.FieldDataStats;
|
||||
@ -37,8 +33,9 @@ import org.elasticsearch.index.store.StoreStats;
|
||||
import org.elasticsearch.search.suggest.completion.CompletionStats;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.List;
|
||||
|
||||
public class ClusterStatsIndices implements ToXContent, Streamable {
|
||||
public class ClusterStatsIndices implements ToXContent {
|
||||
|
||||
private int indexCount;
|
||||
private ShardStats shards;
|
||||
@ -50,10 +47,7 @@ public class ClusterStatsIndices implements ToXContent, Streamable {
|
||||
private SegmentsStats segments;
|
||||
private PercolatorQueryCacheStats percolatorCache;
|
||||
|
||||
private ClusterStatsIndices() {
|
||||
}
|
||||
|
||||
public ClusterStatsIndices(ClusterStatsNodeResponse[] nodeResponses) {
|
||||
public ClusterStatsIndices(List<ClusterStatsNodeResponse> nodeResponses) {
|
||||
ObjectObjectHashMap<String, ShardStats> countsPerIndex = new ObjectObjectHashMap<>();
|
||||
|
||||
this.docs = new DocsStats();
|
||||
@ -132,40 +126,8 @@ public class ClusterStatsIndices implements ToXContent, Streamable {
|
||||
return percolatorCache;
|
||||
}
|
||||
|
||||
@Override
|
||||
public void readFrom(StreamInput in) throws IOException {
|
||||
indexCount = in.readVInt();
|
||||
shards = ShardStats.readShardStats(in);
|
||||
docs = DocsStats.readDocStats(in);
|
||||
store = StoreStats.readStoreStats(in);
|
||||
fieldData = FieldDataStats.readFieldDataStats(in);
|
||||
queryCache = QueryCacheStats.readQueryCacheStats(in);
|
||||
completion = CompletionStats.readCompletionStats(in);
|
||||
segments = SegmentsStats.readSegmentsStats(in);
|
||||
percolatorCache = PercolatorQueryCacheStats.readPercolateStats(in);
|
||||
}
|
||||
|
||||
@Override
|
||||
public void writeTo(StreamOutput out) throws IOException {
|
||||
out.writeVInt(indexCount);
|
||||
shards.writeTo(out);
|
||||
docs.writeTo(out);
|
||||
store.writeTo(out);
|
||||
fieldData.writeTo(out);
|
||||
queryCache.writeTo(out);
|
||||
completion.writeTo(out);
|
||||
segments.writeTo(out);
|
||||
percolatorCache.writeTo(out);
|
||||
}
|
||||
|
||||
public static ClusterStatsIndices readIndicesStats(StreamInput in) throws IOException {
|
||||
ClusterStatsIndices indicesStats = new ClusterStatsIndices();
|
||||
indicesStats.readFrom(in);
|
||||
return indicesStats;
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString COUNT = new XContentBuilderString("count");
|
||||
static final String COUNT = "count";
|
||||
}
|
||||
|
||||
@Override
|
||||
@ -182,7 +144,7 @@ public class ClusterStatsIndices implements ToXContent, Streamable {
|
||||
return builder;
|
||||
}
|
||||
|
||||
public static class ShardStats implements ToXContent, Streamable {
|
||||
public static class ShardStats implements ToXContent {
|
||||
|
||||
int indices;
|
||||
int total;
|
||||
@ -327,52 +289,18 @@ public class ClusterStatsIndices implements ToXContent, Streamable {
|
||||
}
|
||||
}
|
||||
|
||||
public static ShardStats readShardStats(StreamInput in) throws IOException {
|
||||
ShardStats c = new ShardStats();
|
||||
c.readFrom(in);
|
||||
return c;
|
||||
}
|
||||
|
||||
@Override
|
||||
public void readFrom(StreamInput in) throws IOException {
|
||||
indices = in.readVInt();
|
||||
total = in.readVInt();
|
||||
primaries = in.readVInt();
|
||||
minIndexShards = in.readVInt();
|
||||
maxIndexShards = in.readVInt();
|
||||
minIndexPrimaryShards = in.readVInt();
|
||||
maxIndexPrimaryShards = in.readVInt();
|
||||
minIndexReplication = in.readDouble();
|
||||
totalIndexReplication = in.readDouble();
|
||||
maxIndexReplication = in.readDouble();
|
||||
}
|
||||
|
||||
@Override
|
||||
public void writeTo(StreamOutput out) throws IOException {
|
||||
out.writeVInt(indices);
|
||||
out.writeVInt(total);
|
||||
out.writeVInt(primaries);
|
||||
out.writeVInt(minIndexShards);
|
||||
out.writeVInt(maxIndexShards);
|
||||
out.writeVInt(minIndexPrimaryShards);
|
||||
out.writeVInt(maxIndexPrimaryShards);
|
||||
out.writeDouble(minIndexReplication);
|
||||
out.writeDouble(totalIndexReplication);
|
||||
out.writeDouble(maxIndexReplication);
|
||||
}
|
||||
|
||||
static final class Fields {
|
||||
static final XContentBuilderString SHARDS = new XContentBuilderString("shards");
|
||||
static final XContentBuilderString TOTAL = new XContentBuilderString("total");
|
||||
static final XContentBuilderString PRIMARIES = new XContentBuilderString("primaries");
|
||||
static final XContentBuilderString REPLICATION = new XContentBuilderString("replication");
|
||||
static final XContentBuilderString MIN = new XContentBuilderString("min");
|
||||
static final XContentBuilderString MAX = new XContentBuilderString("max");
|
||||
static final XContentBuilderString AVG = new XContentBuilderString("avg");
|
||||
static final XContentBuilderString INDEX = new XContentBuilderString("index");
|
||||
static final String SHARDS = "shards";
|
||||
static final String TOTAL = "total";
|
||||
static final String PRIMARIES = "primaries";
|
||||
static final String REPLICATION = "replication";
|
||||
static final String MIN = "min";
|
||||
static final String MAX = "max";
|
||||
static final String AVG = "avg";
|
||||
static final String INDEX = "index";
|
||||
}
|
||||
|
||||
private void addIntMinMax(XContentBuilderString field, int min, int max, double avg, XContentBuilder builder) throws IOException {
|
||||
private void addIntMinMax(String field, int min, int max, double avg, XContentBuilder builder) throws IOException {
|
||||
builder.startObject(field);
|
||||
builder.field(Fields.MIN, min);
|
||||
builder.field(Fields.MAX, max);
|
||||
@ -380,7 +308,7 @@ public class ClusterStatsIndices implements ToXContent, Streamable {
|
||||
builder.endObject();
|
||||
}
|
||||
|
||||
private void addDoubleMinMax(XContentBuilderString field, double min, double max, double avg, XContentBuilder builder) throws IOException {
|
||||
private void addDoubleMinMax(String field, double min, double max, double avg, XContentBuilder builder) throws IOException {
|
||||
builder.startObject(field);
|
||||
builder.field(Fields.MIN, min);
|
||||
builder.field(Fields.MAX, max);
|
||||
|
@ -21,60 +21,53 @@ package org.elasticsearch.action.admin.cluster.stats;
|
||||
|
||||
import com.carrotsearch.hppc.ObjectIntHashMap;
|
||||
import com.carrotsearch.hppc.cursors.ObjectIntCursor;
|
||||
|
||||
import org.elasticsearch.Version;
|
||||
import org.elasticsearch.action.admin.cluster.node.info.NodeInfo;
|
||||
import org.elasticsearch.action.admin.cluster.node.stats.NodeStats;
|
||||
import org.elasticsearch.cluster.node.DiscoveryNode;
|
||||
import org.elasticsearch.common.io.stream.StreamInput;
|
||||
import org.elasticsearch.common.io.stream.StreamOutput;
|
||||
import org.elasticsearch.common.io.stream.Streamable;
|
||||
import org.elasticsearch.common.transport.InetSocketTransportAddress;
|
||||
import org.elasticsearch.common.transport.TransportAddress;
|
||||
import org.elasticsearch.common.unit.ByteSizeValue;
|
||||
import org.elasticsearch.common.unit.TimeValue;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilderString;
|
||||
import org.elasticsearch.monitor.fs.FsInfo;
|
||||
import org.elasticsearch.monitor.jvm.JvmInfo;
|
||||
import org.elasticsearch.plugins.PluginInfo;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.net.InetAddress;
|
||||
import java.util.ArrayList;
|
||||
import java.util.HashMap;
|
||||
import java.util.HashSet;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
import java.util.Set;
|
||||
|
||||
public class ClusterStatsNodes implements ToXContent, Streamable {
|
||||
public class ClusterStatsNodes implements ToXContent {
|
||||
|
||||
private Counts counts;
|
||||
private Set<Version> versions;
|
||||
private OsStats os;
|
||||
private ProcessStats process;
|
||||
private JvmStats jvm;
|
||||
private FsInfo.Path fs;
|
||||
private Set<PluginInfo> plugins;
|
||||
private final Counts counts;
|
||||
private final Set<Version> versions;
|
||||
private final OsStats os;
|
||||
private final ProcessStats process;
|
||||
private final JvmStats jvm;
|
||||
private final FsInfo.Path fs;
|
||||
private final Set<PluginInfo> plugins;
|
||||
|
||||
private ClusterStatsNodes() {
|
||||
}
|
||||
|
||||
public ClusterStatsNodes(ClusterStatsNodeResponse[] nodeResponses) {
|
||||
this.counts = new Counts();
|
||||
ClusterStatsNodes(List<ClusterStatsNodeResponse> nodeResponses) {
|
||||
this.versions = new HashSet<>();
|
||||
this.os = new OsStats();
|
||||
this.jvm = new JvmStats();
|
||||
this.fs = new FsInfo.Path();
|
||||
this.plugins = new HashSet<>();
|
||||
this.process = new ProcessStats();
|
||||
|
||||
Set<InetAddress> seenAddresses = new HashSet<>(nodeResponses.length);
|
||||
|
||||
Set<InetAddress> seenAddresses = new HashSet<>(nodeResponses.size());
|
||||
List<NodeInfo> nodeInfos = new ArrayList<>();
|
||||
List<NodeStats> nodeStats = new ArrayList<>();
|
||||
for (ClusterStatsNodeResponse nodeResponse : nodeResponses) {
|
||||
|
||||
counts.addNodeInfo(nodeResponse.nodeInfo());
|
||||
versions.add(nodeResponse.nodeInfo().getVersion());
|
||||
process.addNodeStats(nodeResponse.nodeStats());
|
||||
jvm.addNodeInfoStats(nodeResponse.nodeInfo(), nodeResponse.nodeStats());
|
||||
plugins.addAll(nodeResponse.nodeInfo().getPlugins().getPluginInfos());
|
||||
nodeInfos.add(nodeResponse.nodeInfo());
|
||||
nodeStats.add(nodeResponse.nodeStats());
|
||||
this.versions.add(nodeResponse.nodeInfo().getVersion());
|
||||
this.plugins.addAll(nodeResponse.nodeInfo().getPlugins().getPluginInfos());
|
||||
|
||||
// now do the stats that should be deduped by hardware (implemented by ip deduping)
|
||||
TransportAddress publishAddress = nodeResponse.nodeInfo().getTransport().address().publishAddress();
|
||||
@@ -82,19 +75,19 @@ public class ClusterStatsNodes implements ToXContent, Streamable {
if (publishAddress.uniqueAddressTypeId() == 1) {
inetAddress = ((InetSocketTransportAddress) publishAddress).address().getAddress();
}

if (!seenAddresses.add(inetAddress)) {
continue;
}

os.addNodeInfo(nodeResponse.nodeInfo());
if (nodeResponse.nodeStats().getFs() != null) {
fs.add(nodeResponse.nodeStats().getFs().total());
this.fs.add(nodeResponse.nodeStats().getFs().total());
}
}
this.counts = new Counts(nodeInfos);
this.os = new OsStats(nodeInfos);
this.process = new ProcessStats(nodeStats);
this.jvm = new JvmStats(nodeInfos, nodeStats);
}

public Counts getCounts() {
return this.counts;
}
@@ -123,58 +116,14 @@ public class ClusterStatsNodes implements ToXContent, Streamable {
return plugins;
}

@Override
public void readFrom(StreamInput in) throws IOException {
counts = Counts.readCounts(in);

int size = in.readVInt();
versions = new HashSet<>(size);
for (; size > 0; size--) {
versions.add(Version.readVersion(in));
}

os = OsStats.readOsStats(in);
process = ProcessStats.readStats(in);
jvm = JvmStats.readJvmStats(in);
fs = FsInfo.Path.readInfoFrom(in);

size = in.readVInt();
plugins = new HashSet<>(size);
for (; size > 0; size--) {
plugins.add(PluginInfo.readFromStream(in));
}
}

@Override
public void writeTo(StreamOutput out) throws IOException {
counts.writeTo(out);
out.writeVInt(versions.size());
for (Version v : versions) Version.writeVersion(v, out);
os.writeTo(out);
process.writeTo(out);
jvm.writeTo(out);
fs.writeTo(out);
out.writeVInt(plugins.size());
for (PluginInfo p : plugins) {
p.writeTo(out);
}
}

public static ClusterStatsNodes readNodeStats(StreamInput in) throws IOException {
ClusterStatsNodes nodeStats = new ClusterStatsNodes();
nodeStats.readFrom(in);
return nodeStats;
}

static final class Fields {
static final XContentBuilderString COUNT = new XContentBuilderString("count");
static final XContentBuilderString VERSIONS = new XContentBuilderString("versions");
static final XContentBuilderString OS = new XContentBuilderString("os");
static final XContentBuilderString PROCESS = new XContentBuilderString("process");
static final XContentBuilderString JVM = new XContentBuilderString("jvm");
static final XContentBuilderString FS = new XContentBuilderString("fs");
static final XContentBuilderString PLUGINS = new XContentBuilderString("plugins");
static final String COUNT = "count";
static final String VERSIONS = "versions";
static final String OS = "os";
static final String PROCESS = "process";
static final String JVM = "jvm";
static final String FS = "fs";
static final String PLUGINS = "plugins";
}

@Override
@@ -212,109 +161,79 @@ public class ClusterStatsNodes implements ToXContent, Streamable {
return builder;
}

public static class Counts implements Streamable, ToXContent {
int total;
int masterOnly;
int dataOnly;
int masterData;
int client;
public static class Counts implements ToXContent {
static final String COORDINATING_ONLY = "coordinating_only";

public void addNodeInfo(NodeInfo nodeInfo) {
total++;
DiscoveryNode node = nodeInfo.getNode();
if (node.masterNode()) {
if (node.dataNode()) {
masterData++;
} else {
masterOnly++;
}
} else if (node.dataNode()) {
dataOnly++;
} else if (node.clientNode()) {
client++;
private final int total;
private final Map<String, Integer> roles;

private Counts(List<NodeInfo> nodeInfos) {
this.roles = new HashMap<>();
for (DiscoveryNode.Role role : DiscoveryNode.Role.values()) {
this.roles.put(role.getRoleName(), 0);
}
this.roles.put(COORDINATING_ONLY, 0);

int total = 0;
for (NodeInfo nodeInfo : nodeInfos) {
total++;
if (nodeInfo.getNode().getRoles().isEmpty()) {
Integer count = roles.get(COORDINATING_ONLY);
roles.put(COORDINATING_ONLY, ++count);
} else {
for (DiscoveryNode.Role role : nodeInfo.getNode().getRoles()) {
Integer count = roles.get(role.getRoleName());
roles.put(role.getRoleName(), ++count);
}
}
}
this.total = total;
}

public int getTotal() {
return total;
}

public int getMasterOnly() {
return masterOnly;
}

public int getDataOnly() {
return dataOnly;
}

public int getMasterData() {
return masterData;
}

public int getClient() {
return client;
}

public static Counts readCounts(StreamInput in) throws IOException {
Counts c = new Counts();
c.readFrom(in);
return c;
}

@Override
public void readFrom(StreamInput in) throws IOException {
total = in.readVInt();
masterOnly = in.readVInt();
dataOnly = in.readVInt();
masterData = in.readVInt();
client = in.readVInt();
}

@Override
public void writeTo(StreamOutput out) throws IOException {
out.writeVInt(total);
out.writeVInt(masterOnly);
out.writeVInt(dataOnly);
out.writeVInt(masterData);
out.writeVInt(client);
public Map<String, Integer> getRoles() {
return roles;
}

static final class Fields {
static final XContentBuilderString TOTAL = new XContentBuilderString("total");
static final XContentBuilderString MASTER_ONLY = new XContentBuilderString("master_only");
static final XContentBuilderString DATA_ONLY = new XContentBuilderString("data_only");
static final XContentBuilderString MASTER_DATA = new XContentBuilderString("master_data");
static final XContentBuilderString CLIENT = new XContentBuilderString("client");
static final String TOTAL = "total";
}

@Override
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
builder.field(Fields.TOTAL, total);
builder.field(Fields.MASTER_ONLY, masterOnly);
builder.field(Fields.DATA_ONLY, dataOnly);
builder.field(Fields.MASTER_DATA, masterData);
builder.field(Fields.CLIENT, client);
for (Map.Entry<String, Integer> entry : roles.entrySet()) {
builder.field(entry.getKey(), entry.getValue());
}
return builder;
}
}

public static class OsStats implements ToXContent, Streamable {

int availableProcessors;
int allocatedProcessors;
public static class OsStats implements ToXContent {
final int availableProcessors;
final int allocatedProcessors;
final ObjectIntHashMap<String> names;

public OsStats() {
names = new ObjectIntHashMap<>();
}
/**
* Build the stats from information about each node.
*/
private OsStats(List<NodeInfo> nodeInfos) {
this.names = new ObjectIntHashMap<>();
int availableProcessors = 0;
int allocatedProcessors = 0;
for (NodeInfo nodeInfo : nodeInfos) {
availableProcessors += nodeInfo.getOs().getAvailableProcessors();
allocatedProcessors += nodeInfo.getOs().getAllocatedProcessors();

public void addNodeInfo(NodeInfo nodeInfo) {
availableProcessors += nodeInfo.getOs().getAvailableProcessors();
allocatedProcessors += nodeInfo.getOs().getAllocatedProcessors();

if (nodeInfo.getOs().getName() != null) {
names.addTo(nodeInfo.getOs().getName(), 1);
if (nodeInfo.getOs().getName() != null) {
names.addTo(nodeInfo.getOs().getName(), 1);
}
}
this.availableProcessors = availableProcessors;
this.allocatedProcessors = allocatedProcessors;
}

public int getAvailableProcessors() {
@@ -325,40 +244,12 @@ public class ClusterStatsNodes implements ToXContent, Streamable {
return allocatedProcessors;
}

@Override
public void readFrom(StreamInput in) throws IOException {
availableProcessors = in.readVInt();
allocatedProcessors = in.readVInt();
int size = in.readVInt();
names.clear();
for (int i = 0; i < size; i++) {
names.addTo(in.readString(), in.readVInt());
}
}

@Override
public void writeTo(StreamOutput out) throws IOException {
out.writeVInt(availableProcessors);
out.writeVInt(allocatedProcessors);
out.writeVInt(names.size());
for (ObjectIntCursor<String> name : names) {
out.writeString(name.key);
out.writeVInt(name.value);
}
}

public static OsStats readOsStats(StreamInput in) throws IOException {
OsStats os = new OsStats();
os.readFrom(in);
return os;
}

static final class Fields {
static final XContentBuilderString AVAILABLE_PROCESSORS = new XContentBuilderString("available_processors");
static final XContentBuilderString ALLOCATED_PROCESSORS = new XContentBuilderString("allocated_processors");
static final XContentBuilderString NAME = new XContentBuilderString("name");
static final XContentBuilderString NAMES = new XContentBuilderString("names");
static final XContentBuilderString COUNT = new XContentBuilderString("count");
static final String AVAILABLE_PROCESSORS = "available_processors";
static final String ALLOCATED_PROCESSORS = "allocated_processors";
static final String NAME = "name";
static final String NAMES = "names";
static final String COUNT = "count";
}

@Override
@@ -373,35 +264,49 @@ public class ClusterStatsNodes implements ToXContent, Streamable {
builder.endObject();
}
builder.endArray();

return builder;
}
}

public static class ProcessStats implements ToXContent, Streamable {
public static class ProcessStats implements ToXContent {

int count;
int cpuPercent;
long totalOpenFileDescriptors;
long minOpenFileDescriptors = Long.MAX_VALUE;
long maxOpenFileDescriptors = Long.MIN_VALUE;
final int count;
final int cpuPercent;
final long totalOpenFileDescriptors;
final long minOpenFileDescriptors;
final long maxOpenFileDescriptors;

public void addNodeStats(NodeStats nodeStats) {
if (nodeStats.getProcess() == null) {
return;
/**
* Build from looking at a list of node statistics.
*/
private ProcessStats(List<NodeStats> nodeStatsList) {
int count = 0;
int cpuPercent = 0;
long totalOpenFileDescriptors = 0;
long minOpenFileDescriptors = Long.MAX_VALUE;
long maxOpenFileDescriptors = Long.MIN_VALUE;
for (NodeStats nodeStats : nodeStatsList) {
if (nodeStats.getProcess() == null) {
continue;
}
count++;
if (nodeStats.getProcess().getCpu() != null) {
cpuPercent += nodeStats.getProcess().getCpu().getPercent();
}
long fd = nodeStats.getProcess().getOpenFileDescriptors();
if (fd > 0) {
// fd can be -1 if not supported on platform
totalOpenFileDescriptors += fd;
}
// we still do min max calc on -1, so we'll have an indication of it not being supported on one of the nodes.
minOpenFileDescriptors = Math.min(minOpenFileDescriptors, fd);
maxOpenFileDescriptors = Math.max(maxOpenFileDescriptors, fd);
}
count++;
if (nodeStats.getProcess().getCpu() != null) {
cpuPercent += nodeStats.getProcess().getCpu().getPercent();
}
long fd = nodeStats.getProcess().getOpenFileDescriptors();
if (fd > 0) {
// fd can be -1 if not supported on platform
totalOpenFileDescriptors += fd;
}
// we still do min max calc on -1, so we'll have an indication of it not being supported on one of the nodes.
minOpenFileDescriptors = Math.min(minOpenFileDescriptors, fd);
maxOpenFileDescriptors = Math.max(maxOpenFileDescriptors, fd);
this.count = count;
this.cpuPercent = cpuPercent;
this.totalOpenFileDescriptors = totalOpenFileDescriptors;
this.minOpenFileDescriptors = minOpenFileDescriptors;
this.maxOpenFileDescriptors = maxOpenFileDescriptors;
}

/**
@@ -432,37 +337,13 @@ public class ClusterStatsNodes implements ToXContent, Streamable {
return minOpenFileDescriptors;
}

@Override
public void readFrom(StreamInput in) throws IOException {
count = in.readVInt();
cpuPercent = in.readVInt();
totalOpenFileDescriptors = in.readVLong();
minOpenFileDescriptors = in.readLong();
maxOpenFileDescriptors = in.readLong();
}

@Override
public void writeTo(StreamOutput out) throws IOException {
out.writeVInt(count);
out.writeVInt(cpuPercent);
out.writeVLong(totalOpenFileDescriptors);
out.writeLong(minOpenFileDescriptors);
out.writeLong(maxOpenFileDescriptors);
}

public static ProcessStats readStats(StreamInput in) throws IOException {
ProcessStats cpu = new ProcessStats();
cpu.readFrom(in);
return cpu;
}

static final class Fields {
static final XContentBuilderString CPU = new XContentBuilderString("cpu");
static final XContentBuilderString PERCENT = new XContentBuilderString("percent");
static final XContentBuilderString OPEN_FILE_DESCRIPTORS = new XContentBuilderString("open_file_descriptors");
static final XContentBuilderString MIN = new XContentBuilderString("min");
static final XContentBuilderString MAX = new XContentBuilderString("max");
static final XContentBuilderString AVG = new XContentBuilderString("avg");
static final String CPU = "cpu";
static final String PERCENT = "percent";
static final String OPEN_FILE_DESCRIPTORS = "open_file_descriptors";
static final String MIN = "min";
static final String MAX = "max";
static final String AVG = "avg";
}

@Override
@@ -479,20 +360,45 @@ public class ClusterStatsNodes implements ToXContent, Streamable {
}
}

public static class JvmStats implements Streamable, ToXContent {
public static class JvmStats implements ToXContent {

ObjectIntHashMap<JvmVersion> versions;
long threads;
long maxUptime;
long heapUsed;
long heapMax;
private final ObjectIntHashMap<JvmVersion> versions;
private final long threads;
private final long maxUptime;
private final long heapUsed;
private final long heapMax;

JvmStats() {
versions = new ObjectIntHashMap<>();
threads = 0;
maxUptime = 0;
heapMax = 0;
heapUsed = 0;
/**
* Build from lists of information about each node.
*/
private JvmStats(List<NodeInfo> nodeInfos, List<NodeStats> nodeStatsList) {
this.versions = new ObjectIntHashMap<>();
long threads = 0;
long maxUptime = 0;
long heapMax = 0;
long heapUsed = 0;
for (NodeInfo nodeInfo : nodeInfos) {
versions.addTo(new JvmVersion(nodeInfo.getJvm()), 1);
}

for (NodeStats nodeStats : nodeStatsList) {
org.elasticsearch.monitor.jvm.JvmStats js = nodeStats.getJvm();
if (js == null) {
continue;
}
if (js.getThreads() != null) {
threads += js.getThreads().getCount();
}
maxUptime = Math.max(maxUptime, js.getUptime().millis());
if (js.getMem() != null) {
heapUsed += js.getMem().getHeapUsed().bytes();
heapMax += js.getMem().getHeapMax().bytes();
}
}
this.threads = threads;
this.maxUptime = maxUptime;
this.heapUsed = heapUsed;
this.heapMax = heapMax;
}

public ObjectIntHashMap<JvmVersion> getVersions() {
@@ -527,70 +433,21 @@ public class ClusterStatsNodes implements ToXContent, Streamable {
return new ByteSizeValue(heapMax);
}

public void addNodeInfoStats(NodeInfo nodeInfo, NodeStats nodeStats) {
versions.addTo(new JvmVersion(nodeInfo.getJvm()), 1);
org.elasticsearch.monitor.jvm.JvmStats js = nodeStats.getJvm();
if (js == null) {
return;
}
if (js.getThreads() != null) {
threads += js.getThreads().getCount();
}
maxUptime = Math.max(maxUptime, js.getUptime().millis());
if (js.getMem() != null) {
heapUsed += js.getMem().getHeapUsed().bytes();
heapMax += js.getMem().getHeapMax().bytes();
}
}

@Override
public void readFrom(StreamInput in) throws IOException {
int size = in.readVInt();
versions = new ObjectIntHashMap<>(size);
for (; size > 0; size--) {
versions.addTo(JvmVersion.readJvmVersion(in), in.readVInt());
}
threads = in.readVLong();
maxUptime = in.readVLong();
heapUsed = in.readVLong();
heapMax = in.readVLong();
}

@Override
public void writeTo(StreamOutput out) throws IOException {
out.writeVInt(versions.size());
for (ObjectIntCursor<JvmVersion> v : versions) {
v.key.writeTo(out);
out.writeVInt(v.value);
}

out.writeVLong(threads);
out.writeVLong(maxUptime);
out.writeVLong(heapUsed);
out.writeVLong(heapMax);
}

public static JvmStats readJvmStats(StreamInput in) throws IOException {
JvmStats jvmStats = new JvmStats();
jvmStats.readFrom(in);
return jvmStats;
}

static final class Fields {
static final XContentBuilderString VERSIONS = new XContentBuilderString("versions");
static final XContentBuilderString VERSION = new XContentBuilderString("version");
static final XContentBuilderString VM_NAME = new XContentBuilderString("vm_name");
static final XContentBuilderString VM_VERSION = new XContentBuilderString("vm_version");
static final XContentBuilderString VM_VENDOR = new XContentBuilderString("vm_vendor");
static final XContentBuilderString COUNT = new XContentBuilderString("count");
static final XContentBuilderString THREADS = new XContentBuilderString("threads");
static final XContentBuilderString MAX_UPTIME = new XContentBuilderString("max_uptime");
static final XContentBuilderString MAX_UPTIME_IN_MILLIS = new XContentBuilderString("max_uptime_in_millis");
static final XContentBuilderString MEM = new XContentBuilderString("mem");
static final XContentBuilderString HEAP_USED = new XContentBuilderString("heap_used");
static final XContentBuilderString HEAP_USED_IN_BYTES = new XContentBuilderString("heap_used_in_bytes");
static final XContentBuilderString HEAP_MAX = new XContentBuilderString("heap_max");
static final XContentBuilderString HEAP_MAX_IN_BYTES = new XContentBuilderString("heap_max_in_bytes");
static final String VERSIONS = "versions";
static final String VERSION = "version";
static final String VM_NAME = "vm_name";
static final String VM_VERSION = "vm_version";
static final String VM_VENDOR = "vm_vendor";
static final String COUNT = "count";
static final String THREADS = "threads";
static final String MAX_UPTIME = "max_uptime";
static final String MAX_UPTIME_IN_MILLIS = "max_uptime_in_millis";
static final String MEM = "mem";
static final String HEAP_USED = "heap_used";
static final String HEAP_USED_IN_BYTES = "heap_used_in_bytes";
static final String HEAP_MAX = "heap_max";
static final String HEAP_MAX_IN_BYTES = "heap_max_in_bytes";
}

@Override
@@ -617,7 +474,7 @@ public class ClusterStatsNodes implements ToXContent, Streamable {
}
}

public static class JvmVersion implements Streamable {
public static class JvmVersion {
String version;
String vmName;
String vmVersion;
@@ -630,9 +487,6 @@ public class ClusterStatsNodes implements ToXContent, Streamable {
vmVendor = jvmInfo.getVmVendor();
}

JvmVersion() {
}

@Override
public boolean equals(Object o) {
if (this == o) {
@@ -651,29 +505,5 @@ public class ClusterStatsNodes implements ToXContent, Streamable {
public int hashCode() {
return vmVersion.hashCode();
}

public static JvmVersion readJvmVersion(StreamInput in) throws IOException {
JvmVersion jvm = new JvmVersion();
jvm.readFrom(in);
return jvm;
}

@Override
public void readFrom(StreamInput in) throws IOException {
version = in.readString();
vmName = in.readString();
vmVersion = in.readString();
vmVendor = in.readString();
}

@Override
public void writeTo(StreamOutput out) throws IOException {
out.writeString(version);
out.writeString(vmName);
out.writeString(vmVersion);
out.writeString(vmVendor);
}
}

}
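The Counts hunks above replace mutable per-node accumulation (addNodeInfo plus the Streamable readFrom/writeTo pair) with a constructor that derives an immutable total and a per-role map from the node list in one pass. A minimal sketch of that aggregation pattern follows; SimpleRoleCounts and rolesPerNode are hypothetical names, not the Elasticsearch types.

import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for the role-counting pattern the diff moves to:
// totals are computed once in the constructor and then exposed read-only.
final class SimpleRoleCounts {
    private final int total;
    private final Map<String, Integer> roles;

    SimpleRoleCounts(List<List<String>> rolesPerNode) {
        Map<String, Integer> counts = new HashMap<>();
        int total = 0;
        for (List<String> nodeRoles : rolesPerNode) {
            total++;
            if (nodeRoles.isEmpty()) {
                // a node with no roles is counted once as coordinating-only
                counts.merge("coordinating_only", 1, Integer::sum);
            } else {
                // a node with several roles increments each of its role buckets
                for (String role : nodeRoles) {
                    counts.merge(role, 1, Integer::sum);
                }
            }
        }
        this.total = total;
        this.roles = Collections.unmodifiableMap(counts);
    }

    int getTotal() { return total; }
    Map<String, Integer> getRoles() { return roles; }
}

Building the counts this way keeps the object immutable after construction, which is what allows the class to drop its no-arg constructor and its wire serialization in the hunks above.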
@@ -19,6 +19,7 @@

package org.elasticsearch.action.admin.cluster.stats;

import org.elasticsearch.action.FailedNodeException;
import org.elasticsearch.action.support.nodes.BaseNodesResponse;
import org.elasticsearch.cluster.ClusterName;
import org.elasticsearch.cluster.health.ClusterHealthStatus;
@@ -26,13 +27,11 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentBuilderString;
import org.elasticsearch.common.xcontent.XContentFactory;

import java.io.IOException;
import java.util.Iterator;
import java.util.List;
import java.util.Locale;
import java.util.Map;

/**
*
@@ -49,8 +48,9 @@ public class ClusterStatsResponse extends BaseNodesResponse<ClusterStatsNodeResp
ClusterStatsResponse() {
}

public ClusterStatsResponse(long timestamp, ClusterName clusterName, String clusterUUID, ClusterStatsNodeResponse[] nodes) {
super(clusterName, null);
public ClusterStatsResponse(long timestamp, ClusterName clusterName, String clusterUUID,
List<ClusterStatsNodeResponse> nodes, List<FailedNodeException> failures) {
super(clusterName, nodes, failures);
this.timestamp = timestamp;
this.clusterUUID = clusterUUID;
nodesStats = new ClusterStatsNodes(nodes);
@@ -80,77 +80,53 @@ public class ClusterStatsResponse extends BaseNodesResponse<ClusterStatsNodeResp
return indicesStats;
}

@Override
public ClusterStatsNodeResponse[] getNodes() {
throw new UnsupportedOperationException();
}

@Override
public Map<String, ClusterStatsNodeResponse> getNodesMap() {
throw new UnsupportedOperationException();
}

@Override
public ClusterStatsNodeResponse getAt(int position) {
throw new UnsupportedOperationException();
}

@Override
public Iterator<ClusterStatsNodeResponse> iterator() {
throw new UnsupportedOperationException();
}

@Override
public void readFrom(StreamInput in) throws IOException {
super.readFrom(in);
timestamp = in.readVLong();
status = null;
if (in.readBoolean()) {
// it may be that the master switched on us while doing the operation. In this case the status may be null.
status = ClusterHealthStatus.fromValue(in.readByte());
}
clusterUUID = in.readString();
nodesStats = ClusterStatsNodes.readNodeStats(in);
indicesStats = ClusterStatsIndices.readIndicesStats(in);
// it may be that the master switched on us while doing the operation. In this case the status may be null.
status = in.readOptionalWriteable(ClusterHealthStatus::readFrom);
}

@Override
public void writeTo(StreamOutput out) throws IOException {
super.writeTo(out);
out.writeVLong(timestamp);
if (status == null) {
out.writeBoolean(false);
} else {
out.writeBoolean(true);
out.writeByte(status.value());
}
out.writeString(clusterUUID);
nodesStats.writeTo(out);
indicesStats.writeTo(out);
out.writeOptionalWriteable(status);
}

static final class Fields {
static final XContentBuilderString NODES = new XContentBuilderString("nodes");
static final XContentBuilderString INDICES = new XContentBuilderString("indices");
static final XContentBuilderString UUID = new XContentBuilderString("uuid");
static final XContentBuilderString CLUSTER_NAME = new XContentBuilderString("cluster_name");
static final XContentBuilderString STATUS = new XContentBuilderString("status");
@Override
protected List<ClusterStatsNodeResponse> readNodesFrom(StreamInput in) throws IOException {
List<ClusterStatsNodeResponse> nodes = in.readList(ClusterStatsNodeResponse::readNodeResponse);

// built from nodes rather than from the stream directly
nodesStats = new ClusterStatsNodes(nodes);
indicesStats = new ClusterStatsIndices(nodes);

return nodes;
}

@Override
protected void writeNodesTo(StreamOutput out, List<ClusterStatsNodeResponse> nodes) throws IOException {
// nodeStats and indicesStats are rebuilt from nodes
out.writeStreamableList(nodes);
}

@Override
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
builder.field("timestamp", getTimestamp());
builder.field(Fields.CLUSTER_NAME, getClusterName().value());
if (params.paramAsBoolean("output_uuid", false)) {
builder.field(Fields.UUID, clusterUUID);
builder.field("uuid", clusterUUID);
}
if (status != null) {
builder.field(Fields.STATUS, status.name().toLowerCase(Locale.ROOT));
builder.field("status", status.name().toLowerCase(Locale.ROOT));
}
builder.startObject(Fields.INDICES);
builder.startObject("indices");
indicesStats.toXContent(builder, params);
builder.endObject();
builder.startObject(Fields.NODES);
builder.startObject("nodes");
nodesStats.toXContent(builder, params);
builder.endObject();
return builder;
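The ClusterStatsResponse hunks above stop writing nodesStats and indicesStats to the stream; readNodesFrom reads only the per-node responses and rebuilds both aggregate views from that list. A minimal sketch of that wire pattern follows; NodePayload, AggregateView, and WireCodec are hypothetical stand-ins, not the Elasticsearch classes.

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Per-node payload: the only data that crosses the wire in this sketch.
final class NodePayload {
    final long openFileDescriptors;   // -1 when the platform does not report it
    NodePayload(long openFileDescriptors) { this.openFileDescriptors = openFileDescriptors; }
}

// Aggregate view: derived from the node list, never serialized itself.
final class AggregateView {
    final long totalOpenFileDescriptors;
    AggregateView(List<NodePayload> nodes) {
        long total = 0;
        for (NodePayload n : nodes) {
            if (n.openFileDescriptors > 0) {
                total += n.openFileDescriptors;
            }
        }
        this.totalOpenFileDescriptors = total;
    }
}

final class WireCodec {
    // Aggregates are intentionally not written; the node list is the whole payload.
    static void writeNodes(DataOutputStream out, List<NodePayload> nodes) throws IOException {
        out.writeInt(nodes.size());
        for (NodePayload n : nodes) {
            out.writeLong(n.openFileDescriptors);
        }
    }

    // The aggregate view is rebuilt from the nodes rather than read from the stream.
    static AggregateView readNodes(DataInputStream in) throws IOException {
        int size = in.readInt();
        List<NodePayload> nodes = new ArrayList<>(size);
        for (int i = 0; i < size; i++) {
            nodes.add(new NodePayload(in.readLong()));
        }
        return new AggregateView(nodes);
    }
}

Treating the aggregates as derived data keeps the wire format smaller and guarantees the receiver's nodesStats and indicesStats are consistent with the node responses it actually received.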
@@ -19,6 +19,7 @@

package org.elasticsearch.action.admin.cluster.stats;

import org.elasticsearch.action.FailedNodeException;
import org.elasticsearch.action.admin.cluster.node.info.NodeInfo;
import org.elasticsearch.action.admin.cluster.node.stats.NodeStats;
import org.elasticsearch.action.admin.indices.stats.CommonStats;
@@ -46,7 +47,6 @@ import org.elasticsearch.transport.TransportService;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicReferenceArray;

/**
*
@@ -68,22 +68,17 @@ public class TransportClusterStatsAction extends TransportNodesAction<ClusterSta
NodeService nodeService, IndicesService indicesService,
ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) {
super(settings, ClusterStatsAction.NAME, clusterName, threadPool, clusterService, transportService, actionFilters,
indexNameExpressionResolver, ClusterStatsRequest::new, ClusterStatsNodeRequest::new, ThreadPool.Names.MANAGEMENT);
indexNameExpressionResolver, ClusterStatsRequest::new, ClusterStatsNodeRequest::new, ThreadPool.Names.MANAGEMENT,
ClusterStatsNodeResponse.class);
this.nodeService = nodeService;
this.indicesService = indicesService;
}

@Override
protected ClusterStatsResponse newResponse(ClusterStatsRequest clusterStatsRequest, AtomicReferenceArray responses) {
final List<ClusterStatsNodeResponse> nodeStats = new ArrayList<>(responses.length());
for (int i = 0; i < responses.length(); i++) {
Object resp = responses.get(i);
if (resp instanceof ClusterStatsNodeResponse) {
nodeStats.add((ClusterStatsNodeResponse) resp);
}
}
return new ClusterStatsResponse(System.currentTimeMillis(), clusterName,
clusterService.state().metaData().clusterUUID(), nodeStats.toArray(new ClusterStatsNodeResponse[nodeStats.size()]));
protected ClusterStatsResponse newResponse(ClusterStatsRequest request,
List<ClusterStatsNodeResponse> responses, List<FailedNodeException> failures) {
return new ClusterStatsResponse(System.currentTimeMillis(), clusterName, clusterService.state().metaData().clusterUUID(),
responses, failures);
}

@Override
@@ -114,7 +109,7 @@ public class TransportClusterStatsAction extends TransportNodesAction<ClusterSta
}

ClusterHealthStatus clusterStatus = null;
if (clusterService.state().nodes().localNodeMaster()) {
if (clusterService.state().nodes().isLocalNodeElectedMaster()) {
clusterStatus = new ClusterStateHealth(clusterService.state()).getStatus();
}