Added a new line linter (#2875)

* Added linter to add new line

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>

* Fixed new lines

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>

* Ignore empty files

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>

* Updated DEVELOPER GUIDE

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>

* Renamed workflow file

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>

* Fixed failing tests

Signed-off-by: Owais Kazi <owaiskazi19@gmail.com>
Owais Kazi, 2022-04-13 11:14:18 -07:00, committed by GitHub
parent 08e4a35839
commit 3c5d997a76
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
154 changed files with 115 additions and 167 deletions

@@ -12,4 +12,4 @@ ol-7.7
sles-12.3 # older version used in Vagrant image
sles-12.5
sles-15.1
sles-15.2

.github/workflows/code-hygiene.yml (new file, 14 lines)

@@ -0,0 +1,14 @@
name: Code Hygiene
on: [push, pull_request]
jobs:
linelint:
runs-on: ubuntu-latest
name: Check if all files end in newline
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Linelint
uses: fernandrone/linelint@0.0.4
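The workflow above delegates the actual check to the linelint action. As a rough illustration of what an end-of-file rule does (a hedged sketch, not linelint's actual implementation; the function name is invented), the check can be expressed in a few lines of Python:

```python
# Sketch of an end-of-file newline check (illustrative only; linelint's
# real implementation lives in github.com/fernandrone/linelint).
def ends_in_single_newline(data: bytes) -> bool:
    """True if the file content ends in exactly one newline character."""
    if not data:
        # Empty files pass, mirroring this PR's "Ignore empty files" fix.
        return True
    return data.endswith(b"\n") and not data.endswith(b"\n\n")

print(ends_in_single_newline(b"name: Code Hygiene\n"))    # → True
print(ends_in_single_newline(b"name: Code Hygiene"))      # → False (no newline)
print(ends_in_single_newline(b"name: Code Hygiene\n\n"))  # → False (extra newline)
```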

@@ -12,4 +12,4 @@ jobs:
- name: Delete merged branch
uses: SvanBoxel/delete-merged-branch@main
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

@@ -16,4 +16,4 @@ jobs:
args: --accept=200,403,429 --exclude-mail **/*.html **/*.md **/*.txt **/*.json --exclude-file .lychee.excludes
fail: true
env:
GITHUB_TOKEN: ${{secrets.GITHUB_TOKEN}}

@@ -7,4 +7,4 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: gradle/wrapper-validation-action@v1

.linelint.yml (new file, 49 lines)

@@ -0,0 +1,49 @@
# 'true' will fix files
autofix: true
ignore:
- .git/
- .gradle/
- .idea/
- '*.sha1'
- '*.txt'
- '.github/CODEOWNERS'
- 'buildSrc/src/testKit/opensearch.build/LICENSE'
- 'buildSrc/src/testKit/opensearch.build/NOTICE'
- 'server/licenses/apache-log4j-extras-DEPENDENCIES'
# Empty files
- 'doc-tools/missing-doclet/bin/main/org/opensearch/missingdoclet/MissingDoclet.class'
- 'buildSrc/src/integTest/resources/org/opensearch/gradle/internal/fake_git/remote/build.gradle'
- 'buildSrc/src/integTest/resources/org/opensearch/gradle/internal/fake_git/remote/distribution/archives/oss-darwin-tar/build.gradle'
- 'buildSrc/src/integTest/resources/org/opensearch/gradle/internal/fake_git/remote/distribution/bwc/bugfix/build.gradle'
- 'buildSrc/src/integTest/resources/org/opensearch/gradle/internal/fake_git/remote/distribution/bwc/minor/build.gradle'
- 'buildSrc/src/main/resources/buildSrc.marker'
- 'buildSrc/src/testKit/opensearch-build-resources/settings.gradle'
- 'buildSrc/src/testKit/opensearch.build/settings.gradle'
- 'buildSrc/src/testKit/reaper/settings.gradle'
- 'buildSrc/src/testKit/symbolic-link-preserving-tar/settings.gradle'
- 'buildSrc/src/testKit/testingConventions/empty_test_task/.gitignore'
- 'client/rest-high-level/src/main/resources/META-INF/services/org.opensearch.plugins.spi.NamedXContentProvider'
- 'distribution/bwc/bugfix/build.gradle'
- 'distribution/bwc/maintenance/build.gradle'
- 'distribution/bwc/minor/build.gradle'
- 'distribution/bwc/staged/build.gradle'
- 'libs/ssl-config/src/test/resources/certs/pem-utils/empty.pem'
- 'qa/evil-tests/src/test/resources/org/opensearch/common/logging/does_not_exist/nothing_to_see_here'
- 'qa/os/centos-6/build.gradle'
- 'qa/os/debian-8/build.gradle'
- 'qa/os/oel-6/build.gradle'
- 'qa/os/oel-7/build.gradle'
- 'qa/os/sles-12/build.gradle'
# Test requires no new line for these files
- 'server/src/test/resources/org/opensearch/action/bulk/simple-bulk11.json'
- 'server/src/test/resources/org/opensearch/action/search/simple-msearch5.json'
rules:
# checks if file ends in a newline character
end-of-file:
# set to true to enable this rule
enable: true
# if true also checks if file ends in a single newline character
single-new-line: true
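Since `autofix: true` is set, files that violate the end-of-file rule are rewritten rather than merely reported. A minimal sketch of that fix, assuming the rule semantics described in the config above (this is not linelint's code, and the function name is invented):

```python
# Hedged sketch of the autofix behavior: append a newline if missing,
# or trim surplus trailing newlines down to exactly one.
def fix_end_of_file(data: bytes) -> bytes:
    if not data:
        # Empty files are left untouched (see the "Empty files" ignore list).
        return data
    return data.rstrip(b"\n") + b"\n"

print(fix_end_of_file(b"include 'sample_jars'"))      # newline appended
print(fix_end_of_file(b"include 'sample_jars'\n\n"))  # trimmed to one newline
```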

@@ -48,6 +48,7 @@
- [Distributed Framework](#distributed-framework)
- [Submitting Changes](#submitting-changes)
- [Backports](#backports)
- [LineLint](#linelint)
# Developer Guide
@@ -472,3 +473,18 @@ See [CONTRIBUTING](CONTRIBUTING.md).
## Backports
The Github workflow in [`backport.yml`](.github/workflows/backport.yml) creates backport PRs automatically when the original PR with an appropriate label `backport <backport-branch-name>` is merged to main with the backport workflow run successfully on the PR. For example, if a PR on main needs to be backported to `1.x` branch, add a label `backport 1.x` to the PR and make sure the backport workflow runs on the PR along with other checks. Once this PR is merged to main, the workflow will create a backport PR to the `1.x` branch.
## LineLint
A linter in [`code-hygiene.yml`](.github/workflows/code-hygiene.yml) validates simple newline and whitespace rules in all sorts of files. It can:
- Recursively check a directory tree for files that do not end in a newline
- Automatically fix these files by adding a newline or trimming extra newlines.
Rules are defined in `.linelint.yml`.
Executing the binary will automatically search the local directory tree for linting errors.

    linelint .

Pass a list of files or directories to limit your search.

    linelint README.md LICENSE

@@ -45,4 +45,4 @@ Copyright OpenSearch Contributors. See [NOTICE](NOTICE.txt) for details.
OpenSearch is a registered trademark of Amazon Web Services.
OpenSearch includes certain Apache-licensed Elasticsearch code from Elasticsearch B.V. and other source code. Elasticsearch B.V. is not the source of that other source code. ELASTICSEARCH is a registered trademark of Elasticsearch B.V.

@@ -1,3 +1,3 @@
## Releasing
This project follows [OpenSearch project branching, labelling, and releasing](https://github.com/opensearch-project/.github/blob/main/RELEASING.md).

@@ -1,3 +1,3 @@
## Reporting a Vulnerability
If you discover a potential security issue in this project we ask that you notify AWS/Amazon Security via our [vulnerability reporting page](http://aws.amazon.com/security/vulnerability-reporting/) or directly via email to aws-security@amazon.com. Please do **not** create a public GitHub issue.

@@ -3,4 +3,4 @@ encoding//src/main/java=UTF-8
encoding//src/main/resources=UTF-8
encoding//src/test/java=UTF-8
encoding//src/test/resources=UTF-8
encoding/<project>=UTF-8

@@ -1 +1 @@
7.4.1

@@ -88,6 +88,3 @@ project(':valid_setup_with_base') {
}
}
}

@@ -16,4 +16,4 @@ include 'all_classes_in_tasks'
include 'not_implementing_base'
include 'valid_setup_no_base'
include 'valid_setup_with_base'
include 'tests_in_main'

@@ -9,4 +9,4 @@
* GitHub history for details.
*/
include 'sample_jars'

@@ -136,4 +136,3 @@ if __name__ == "__main__":
print('WARNING: no documentation references updates for release %s' % (release_version))
print('*** Done.')

@@ -28,4 +28,3 @@ done
# Return non-zero error code if any commits were missing signoff
exit $missingSignoff

@@ -7,4 +7,3 @@
- is_true: version
- is_true: version.number
- match: { version.build_type: "docker" }

@@ -123,4 +123,3 @@
- match:
$body: |
/^(\S{5,}\n)+$/

@@ -47,4 +47,3 @@ for VAR_NAME_FILE in OPENSEARCH_PASSWORD_FILE KEYSTORE_PASSWORD_FILE ; do
unset "$VAR_NAME_FILE"
fi
done

@@ -74,4 +74,3 @@ if defined JAVA_OPTS (
rem check the Java version
%JAVA% -cp "%OPENSEARCH_CLASSPATH%" "org.opensearch.tools.java_version_checker.JavaVersionChecker" || exit /b 1

@@ -8,4 +8,3 @@ version '1.0.0-SNAPSHOT'
repositories {
mavenCentral()
}

@@ -40,4 +40,3 @@ dependencies {
tasks.named('forbiddenApisMain').configure {
replaceSignatureFiles 'jdk-signatures'
}

@@ -360,4 +360,4 @@
"append": ""
}
]

@@ -42,4 +42,3 @@ tasks.named('forbiddenApisMain').configure {
// TODO: Need to decide how we want to handle for forbidden signatures with the changes to core
replaceSignatureFiles 'jdk-signatures'
}

@@ -10,4 +10,3 @@ EXIM_PROTOCOL (P=%{NOTSPACE:protocol})
EXIM_MSG_SIZE (S=%{NUMBER:exim_msg_size})
EXIM_HEADER_ID (id=%{NOTSPACE:exim_header_id})
EXIM_SUBJECT (T=%{QS:exim_subject})

@@ -6,4 +6,3 @@ RT_FLOW1 %{RT_FLOW_EVENT:event}: %{GREEDYDATA:close-reason}: %{IP:src-ip}/%{INT:
RT_FLOW2 %{RT_FLOW_EVENT:event}: session created %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{IP:nat-src-ip}/%{INT:nat-src-port}->%{IP:nat-dst-ip}/%{INT:nat-dst-port} %{DATA:src-nat-rule-name} %{DATA:dst-nat-rule-name} %{INT:protocol-id} %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} %{INT:session-id} .*
RT_FLOW3 %{RT_FLOW_EVENT:event}: session denied %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{INT:protocol-id}\(\d\) %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} .*

@@ -1,3 +1,2 @@
# Default postgresql pg_log format pattern
POSTGRESQL %{DATESTAMP:timestamp} %{TZ} %{DATA:user_id} %{GREEDYDATA:connection_id} %{POSINT:pid}

@@ -61,4 +61,3 @@ tasks.test {
jvmArgs += ["--add-opens", "java.base/java.security.cert=ALL-UNNAMED"]
}
}

@@ -68,4 +68,3 @@
- match: { tokens.1.token: "f" }
- match: { tokens.2.token: "g" }
- match: { tokens.3.token: "h" }

@@ -119,4 +119,3 @@
- match: { indices.analysis.built_in_analyzers.2.name: spanish }
- match: { indices.analysis.built_in_analyzers.2.count: 2 }
- match: { indices.analysis.built_in_analyzers.2.index_count: 2 }

@@ -76,4 +76,3 @@
- match: { tokens.5.token: dude }
- match: { tokens.5.position: 4 }
- match: { tokens.5.positionLength: null }

@@ -229,4 +229,3 @@ setup:
query: bar baz
analyzer: lower_graph_syns
- match: { hits.total: 1 }

@@ -56,4 +56,3 @@ setup:
use_field: text_en
max_gaps: 1
- match: { hits.total.value: 1 }

@@ -91,4 +91,3 @@ teardown:
get:
index: test
id: 3

@@ -43,4 +43,3 @@ restResources {
testClusters.all {
extraConfigFile 'ingest-user-agent/test-regexes.yml', file('src/test/test-regexes.yml')
}

@@ -1,3 +1,3 @@
user_agent_parsers:
- regex: '.*'
family_replacement: 'Test'

@@ -52,4 +52,3 @@ tasks.named("dependencyLicenses").configure {
mapping from: /lucene-.*/, to: 'lucene'
mapping from: /asm-.*/, to: 'asm'
}

@@ -124,4 +124,4 @@ ID: [_a-zA-Z] [_a-zA-Z0-9]*;
mode AFTER_DOT;
DOTINTEGER: ( '0' | [1-9] [0-9]* ) -> mode(DEFAULT_MODE);
DOTID: [_a-zA-Z] [_a-zA-Z0-9]* -> mode(DEFAULT_MODE);

@@ -139,4 +139,3 @@ setup:
- is_false: aggregations.placeholder.buckets.0.str_terms.buckets.1.key_as_string
- match: { aggregations.placeholder.buckets.0.str_terms.buckets.1.doc_count: 1 }
- match: { aggregations.placeholder.buckets.0.the_bucket_script.value: 2.0 }

@@ -41,4 +41,3 @@ dependencies {
testClusters.all {
module ':modules:reindex'
}

@@ -75,4 +75,3 @@ testClusters.all {
"http://snapshot.test*,http://${urlFixture.addressAndPort}"
}, PropertyNormalization.IGNORE_VALUE
}

@@ -32,4 +32,3 @@ opensearchplugin {
description 'Integrates OpenSearch with systemd'
classname 'org.opensearch.systemd.SystemdPlugin'
}

@@ -18,4 +18,4 @@
# Apply rule status {200}=RBBI.WORD_LETTER, which is mapped
# to <ALPHANUM> token type by DefaultICUTokenizerConfig.
.+ {200};

@@ -46,4 +46,3 @@ restResources {
tasks.named("dependencyLicenses").configure {
mapping from: /lucene-.*/, to: 'lucene'
}

@@ -31,4 +31,3 @@
- match: { tokens.1.token: joe }
- match: { tokens.2.token: BLKS }
- match: { tokens.3.token: bloggs }

@@ -28,4 +28,3 @@
- length: { tokens: 1 }
- match: { tokens.0.token: SPRKLF }

@@ -30,4 +30,3 @@
- length: { tokens: 1 }
- match: { tokens.0.token: Svarts }

@@ -27,4 +27,3 @@
- length: { tokens: 1 }
- match: { tokens.0.token: "645740" }

@@ -47,4 +47,3 @@ restResources {
tasks.named("dependencyLicenses").configure {
mapping from: /lucene-.*/, to: 'lucene'
}

@@ -36,4 +36,3 @@ configure(project('painless-whitelist')) {
}
}
}

@@ -42,4 +42,3 @@ testClusters.all {
// Adds a setting in the OpenSearch keystore before running the integration tests
keystore 'custom.secured', 'password'
}

@@ -2,4 +2,4 @@
custom:
simple: foo
list: [0, 1, 1, 2, 3, 5, 8, 13, 21]
filtered: secret

@@ -56,4 +56,3 @@ javaRestTest {
dependsOn exampleFixture
nonInputProperties.systemProperty 'external.address', "${-> exampleFixture.addressAndPort}"
}

@@ -39,4 +39,3 @@ opensearchplugin {
}
test.enabled = false

@@ -2,4 +2,3 @@
= AsciiDoc test
Here is a test of the asciidoc format.

@@ -12,4 +12,3 @@
- contains: { 'nodes.$master.plugins': { name: ingest-attachment } }
- contains: { 'nodes.$master.ingest.processors': { type: attachment } }

@@ -142,4 +142,3 @@
request_cache: false
body: { "query" : {"match_phrase" : { "my_field" : {"query": "~MARK0", "analyzer": "whitespace"} } }, "highlight" : { "type" : "annotated", "fields" : { "my_field" : {} } } }
- match: {_shards.failed: 0}

@@ -45,4 +45,3 @@
- do:
snapshot.delete_repository:
repository: test_snapshot_repository

@@ -47,4 +47,3 @@
- do:
snapshot.delete_repository:
repository: test_snapshot_repository

@@ -183,4 +183,3 @@ thirdPartyAudit {
'io.netty.handler.ssl.util.OpenJdkSelfSignedCertGenerator'
)
}

@@ -35,4 +35,3 @@ testClusters.javaRestTest {
}
test.enabled = false

@@ -1 +1 @@
tool help

@@ -220,4 +220,3 @@
# When all shards are skipped current logic returns 1 to produce a valid search result
- match: { _shards.skipped : 1}
- match: { _shards.failed: 0 }

@@ -204,4 +204,3 @@
tasks.get:
wait_for_completion: true
task_id: $task

@@ -111,5 +111,3 @@
gte: "2019-02-01T00:00+01:00"
lte: "2019-02-01T00:00+01:00"
- match: { hits.total: 1 }

@@ -133,5 +133,3 @@
wait_for_completion: true
task_id: $task_id
- match: { task.headers.X-Opaque-Id: "Reindexing Again" }

@@ -38,4 +38,3 @@
time_frame:
gte: "2019-02-01T00:00+01:00"
lte: "2019-02-01T00:00+01:00"

@@ -112,4 +112,3 @@
_id: test_id2
pipeline: my_pipeline_1
- f1: v2

@@ -102,4 +102,3 @@
- match: { error.processor_type: "script" }
- match: { error.type: "script_exception" }
- match: { error.reason: "compile error" }

@@ -34,4 +34,3 @@
id: 1
pipeline: "my_timely_pipeline"
body: {}

@@ -1 +1 @@
ctx.bytes_total = ctx.bytes_in + ctx.bytes_out

@@ -49,4 +49,3 @@ public class SmokeTestPluginsClientYamlTestSuiteIT extends OpenSearchClientYamlS
return OpenSearchClientYamlSuiteTestCase.createParameters();
}
}

@@ -411,5 +411,3 @@
Signed-off-by: Abbas Hussain <abbas_10690@yahoo.com>

@@ -386,5 +386,3 @@
Signed-off-by: Sooraj Sinha <soosinha@amazon.com>

@@ -458,4 +458,3 @@
Signed-off-by: Nicholas Walter Knize <nknize@apache.org>

@@ -72,5 +72,3 @@
Signed-off-by: dblock <dblock@amazon.com>

@@ -1295,5 +1295,3 @@
[Nick Knize](mailto:nknize@apache.org) - Thu, 4 Nov 2021 14:46:57 -0500
Signed-off-by: Nicholas Walter Knize <nknize@apache.org>

@@ -223,4 +223,3 @@
- match: { items.0.index.status: 400 }
- match: { items.0.index.error.type: illegal_argument_exception }
- match: { items.0.index.error.reason: "no write index is defined for alias [test_index]. The write index may be explicitly disabled using is_write_index=false or the alias points to multiple indices without one being designated as a write index" }

@@ -14,4 +14,3 @@
index: test_index
- match: {count: 2}

@@ -14,4 +14,3 @@
index: test_index
- match: {count: 2}

@@ -68,4 +68,3 @@
- match: { items.0.update.get._source.foo: garply }
- is_false: items.0.update.get._source.bar

@@ -131,4 +131,3 @@
- match:
$body: |
/^(\S{5,}\n)+$/

@@ -11,4 +11,3 @@
local: true
- is_true: tasks

@@ -3,4 +3,3 @@
- do:
cluster.remote_info: {}
- is_true: ''

@@ -104,4 +104,3 @@ teardown:
cluster.post_voting_config_exclusions:
node_ids: nodeId
node_names: nodeName

@@ -39,4 +39,3 @@
get:
index: test_1
id: 1

@@ -30,4 +30,3 @@
index: test_1
id: 1
routing: 5

@@ -14,4 +14,3 @@
- match: { _index: test_1 }
- match: { _id: '1' }
- match: { _source: { foo: "bar" } }

@@ -48,5 +48,3 @@
- match: { fields.foo: [bar] }
- match: { fields.count: [1] }
- match: { _source.foo: bar }

@@ -41,4 +41,3 @@
get:
index: test_1
id: 1

@@ -80,4 +80,3 @@
id: 1
version: 1
version_type: external_gte

@@ -40,4 +40,3 @@
get:
index: test_1
id: 1

@@ -221,4 +221,3 @@ setup:
catch: param
indices.delete_alias:
index: "test_index1"

(Some files were not shown because too many files have changed in this diff.)