Added a new line linter (#2875)

* Added linter to add new line

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>

* Fixed new lines

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>

* Ignore empty files

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>

* Updated DEVELOPER GUIDE

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>

* Renamed workflow file

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>

* Fixed failing tests

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>
Author: Owais Kazi
Date: 2022-04-13 11:14:18 -07:00 (committed by GitHub)
Parent: 08e4a35839
Commit: 3c5d997a76
154 changed files with 115 additions and 167 deletions

.github/workflows/code-hygiene.yml (new file, 14 lines)

@@ -0,0 +1,14 @@
name: Code Hygiene
on: [push, pull_request]
jobs:
  linelint:
    runs-on: ubuntu-latest
    name: Check if all files end in newline
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Linelint
        uses: fernandrone/linelint@0.0.4

.linelint.yml (new file, 49 lines)

@@ -0,0 +1,49 @@
# 'true' will fix files
autofix: true
ignore:
- .git/
- .gradle/
- .idea/
- '*.sha1'
- '*.txt'
- '.github/CODEOWNERS'
- 'buildSrc/src/testKit/opensearch.build/LICENSE'
- 'buildSrc/src/testKit/opensearch.build/NOTICE'
- 'server/licenses/apache-log4j-extras-DEPENDENCIES'
# Empty files
- 'doc-tools/missing-doclet/bin/main/org/opensearch/missingdoclet/MissingDoclet.class'
- 'buildSrc/src/integTest/resources/org/opensearch/gradle/internal/fake_git/remote/build.gradle'
- 'buildSrc/src/integTest/resources/org/opensearch/gradle/internal/fake_git/remote/distribution/archives/oss-darwin-tar/build.gradle'
- 'buildSrc/src/integTest/resources/org/opensearch/gradle/internal/fake_git/remote/distribution/bwc/bugfix/build.gradle'
- 'buildSrc/src/integTest/resources/org/opensearch/gradle/internal/fake_git/remote/distribution/bwc/minor/build.gradle'
- 'buildSrc/src/main/resources/buildSrc.marker'
- 'buildSrc/src/testKit/opensearch-build-resources/settings.gradle'
- 'buildSrc/src/testKit/opensearch.build/settings.gradle'
- 'buildSrc/src/testKit/reaper/settings.gradle'
- 'buildSrc/src/testKit/symbolic-link-preserving-tar/settings.gradle'
- 'buildSrc/src/testKit/testingConventions/empty_test_task/.gitignore'
- 'client/rest-high-level/src/main/resources/META-INF/services/org.opensearch.plugins.spi.NamedXContentProvider'
- 'distribution/bwc/bugfix/build.gradle'
- 'distribution/bwc/maintenance/build.gradle'
- 'distribution/bwc/minor/build.gradle'
- 'distribution/bwc/staged/build.gradle'
- 'libs/ssl-config/src/test/resources/certs/pem-utils/empty.pem'
- 'qa/evil-tests/src/test/resources/org/opensearch/common/logging/does_not_exist/nothing_to_see_here'
- 'qa/os/centos-6/build.gradle'
- 'qa/os/debian-8/build.gradle'
- 'qa/os/oel-6/build.gradle'
- 'qa/os/oel-7/build.gradle'
- 'qa/os/sles-12/build.gradle'
# Test requires no new line for these files
- 'server/src/test/resources/org/opensearch/action/bulk/simple-bulk11.json'
- 'server/src/test/resources/org/opensearch/action/search/simple-msearch5.json'
rules:
  # checks if file ends in a newline character
  end-of-file:
    # set to true to enable this rule
    enable: true
    # if true also checks if file ends in a single newline character
    single-new-line: true
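The two end-of-file sub-rules configured above can be sketched in plain shell. This is a hypothetical illustration of what the rule checks, not the linelint implementation; `check_single_newline` is a made-up helper name:

```shell
# Report files that do not end in exactly one newline character.
# Empty files are skipped, matching the "Empty files" ignore section above.
check_single_newline() {
  f="$1"
  [ -s "$f" ] || return 0                                                 # ignore empty files
  [ "$(tail -c 1 "$f" | wc -l)" -eq 1 ] || { echo "missing newline: $f"; return 1; }
  [ "$(tail -c 2 "$f" | wc -l)" -lt 2 ] || { echo "extra newlines: $f"; return 1; }
}

printf 'ok\n'    > good.txt        # ends in a single newline: passes
printf 'bad'     > no_newline.txt  # no trailing newline: fails end-of-file
printf 'bad\n\n' > extra.txt       # two trailing newlines: fails single-new-line

check_single_newline good.txt && echo "good.txt passes"
check_single_newline no_newline.txt || true
check_single_newline extra.txt      || true
```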

DEVELOPER_GUIDE.md

@@ -48,6 +48,7 @@
- [Distributed Framework](#distributed-framework)
- [Submitting Changes](#submitting-changes)
- [Backports](#backports)
- [LineLint](#linelint)

# Developer Guide

@@ -472,3 +473,18 @@ See [CONTRIBUTING](CONTRIBUTING.md).

## Backports

The GitHub workflow in [`backport.yml`](.github/workflows/backport.yml) creates backport PRs automatically when the original PR, carrying an appropriate `backport <backport-branch-name>` label, is merged to main with the backport workflow having run successfully on the PR. For example, if a PR on main needs to be backported to the `1.x` branch, add the label `backport 1.x` to the PR and make sure the backport workflow runs on it along with the other checks. Once the PR is merged to main, the workflow will create a backport PR to the `1.x` branch.

## LineLint

A linter in [`code-hygiene.yml`](.github/workflows/code-hygiene.yml) that validates simple newline and whitespace rules in all sorts of files. It can:

- Recursively check a directory tree for files that do not end in a newline
- Automatically fix these files by adding a newline or trimming extra newlines

Rules are defined in `.linelint.yml`.

Executing the binary will automatically search the local directory tree for linting errors:

    linelint .

Pass a list of files or directories to limit your search:

    linelint README.md LICENSE
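Since `.linelint.yml` sets `autofix: true`, failing files are repaired rather than just reported. A minimal sketch of that repair for the end-of-file rule, using a hypothetical `fix_newline` helper rather than the actual linelint code:

```shell
# Append a trailing newline to a file only if it is non-empty and missing one.
# Idempotent: running it twice leaves the file unchanged the second time.
fix_newline() {
  if [ -s "$1" ] && [ "$(tail -c 1 "$1" | wc -l)" -eq 0 ]; then
    printf '\n' >> "$1"
  fi
}

printf 'no trailing newline' > sample.txt
fix_newline sample.txt   # adds the missing newline
fix_newline sample.txt   # second run is a no-op
```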

(The remaining changed files all receive the same mechanical fix: trailing blank lines are trimmed so that each file ends in a single newline. Affected files include build.gradle scripts, YAML REST test fixtures, grok pattern files, launch scripts, and release notes; the individual per-file hunks are omitted here.)