Test docs for plugins

We weren't testing the plugin docs before because we weren't starting
the plugins in the docs test cluster. Now we are.

The hardest part of this was handling the files the tests expect
to be on the filesystem. extraConfigFiles was broken.
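
Roughly, the docs build now copies those fixture files into the test
cluster's config directory and installs every non-repository plugin into
the docs integTest cluster. A condensed sketch of the build.gradle change
below (the `extraConfigFile` and `plugin` calls are the cluster DSL methods
that appear in the diff):

integTest {
  cluster {
    // copy a fixture from the docs project into the cluster's config dir
    extraConfigFile 'scripts/my_script.js', "src/test/cluster/config/scripts/my_script.js"
  }
}
project.rootProject.subprojects.findAll { it.parent.path == ':plugins' }.each { subproj ->
  if (subproj.path.startsWith(':plugins:repository-')) {
    return // repositories can't be exercised by the docs tests, so skip installing them
  }
  integTest {
    cluster {
      // install the plugin using a non-decorated project object looked up by path
      plugin subproj.name, project(subproj.path)
    }
  }
}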
Nik Everett 2016-05-13 16:15:51 -04:00
parent 399d023715
commit 9c85569883
15 changed files with 130 additions and 83 deletions

View File

@ -22,6 +22,29 @@ apply plugin: 'elasticsearch.docs-test'
integTest {
cluster {
setting 'script.inline', 'true'
setting 'script.stored', 'true'
Closure configFile = {
extraConfigFile it, "src/test/cluster/config/$it"
}
configFile 'scripts/my_script.js'
configFile 'scripts/my_script.py'
configFile 'userdict_ja.txt'
configFile 'KeywordTokenizer.rbbi'
}
}
// Build the cluster with all plugins
project.rootProject.subprojects.findAll { it.parent.path == ':plugins' }.each { subproj ->
/* Skip repositories. We just aren't going to be able to test them so it
* doesn't make sense to waste time installing them. */
if (subproj.path.startsWith(':plugins:repository-')) {
return
}
integTest {
cluster {
// We need a non-decorated project object, so we look up the project by path
plugin subproj.name, project(subproj.path)
}
}
}
@ -30,8 +53,6 @@ buildRestTests.docs = fileTree(projectDir) {
exclude 'build.gradle'
// That is where the snippets go, not where they come from!
exclude 'build'
// Remove plugins because they aren't installed during this test. Yet?
exclude 'plugins'
// This file simply doesn't pass yet. We should figure out how to fix it.
exclude 'reference/modules/snapshots.asciidoc'
}

View File

@ -161,6 +161,8 @@ PUT icu_sample
}
}
GET _cluster/health?wait_for_status=yellow
POST icu_sample/_analyze?analyzer=my_analyzer&text=Elasticsearch. Wow!
--------------------------------------------------
// CONSOLE
@ -169,7 +171,6 @@ The above `analyze` request returns the following:
[source,js]
--------------------------------------------------
# Result
{
"tokens": [
{
@ -182,7 +183,7 @@ The above `analyze` request returns the following:
]
}
--------------------------------------------------
// TESTRESPONSE
[[analysis-icu-normalization]]
==== ICU Normalization Token Filter
@ -253,7 +254,7 @@ PUT icu_sample
"analysis": {
"analyzer": {
"folded": {
"tokenizer": "icu",
"tokenizer": "icu_tokenizer",
"filter": [
"icu_folding"
]
@ -359,6 +360,8 @@ PUT /my_index
}
}
GET _cluster/health?wait_for_status=yellow
GET _search <3>
{
"query": {
@ -478,6 +481,8 @@ PUT icu_sample
}
}
GET _cluster/health?wait_for_status=yellow
GET icu_sample/_analyze?analyzer=latin
{
"text": "你好" <2>

View File

@ -172,6 +172,8 @@ PUT kuromoji_sample
}
}
GET _cluster/health?wait_for_status=yellow
POST kuromoji_sample/_analyze?analyzer=my_analyzer&text=東京スカイツリー
--------------------------------------------------
// CONSOLE
@ -224,6 +226,8 @@ PUT kuromoji_sample
}
}
GET _cluster/health?wait_for_status=yellow
POST kuromoji_sample/_analyze?analyzer=my_analyzer&text=飲み
--------------------------------------------------
// CONSOLE
@ -282,6 +286,8 @@ PUT kuromoji_sample
}
}
GET _cluster/health?wait_for_status=yellow
POST kuromoji_sample/_analyze?analyzer=my_analyzer&text=寿司がおいしいね
--------------------------------------------------
@ -354,6 +360,8 @@ PUT kuromoji_sample
}
}
GET _cluster/health?wait_for_status=yellow
POST kuromoji_sample/_analyze?analyzer=katakana_analyzer&text=寿司 <1>
POST kuromoji_sample/_analyze?analyzer=romaji_analyzer&text=寿司 <2>
@ -405,6 +413,8 @@ PUT kuromoji_sample
}
}
GET _cluster/health?wait_for_status=yellow
POST kuromoji_sample/_analyze?analyzer=my_analyzer&text=コピー <1>
POST kuromoji_sample/_analyze?analyzer=my_analyzer&text=サーバー <2>
@ -454,7 +464,9 @@ PUT kuromoji_sample
}
}
POST kuromoji_sample/_analyze?analyzer=my_analyzer&text=ストップは消える
GET _cluster/health?wait_for_status=yellow
POST kuromoji_sample/_analyze?analyzer=analyzer_with_ja_stop&text=ストップは消える
--------------------------------------------------
// CONSOLE
@ -500,6 +512,8 @@ PUT kuromoji_sample
}
}
GET _cluster/health?wait_for_status=yellow
POST kuromoji_sample/_analyze?analyzer=my_analyzer&text=一〇〇〇
--------------------------------------------------
@ -518,4 +532,3 @@ POST kuromoji_sample/_analyze?analyzer=my_analyzer&text=一〇〇〇
} ]
}
--------------------------------------------------

View File

@ -79,6 +79,8 @@ PUT phonetic_sample
}
}
GET _cluster/health?wait_for_status=yellow
POST phonetic_sample/_analyze?analyzer=my_analyzer&text=Joe Bloggs <1>
--------------------------------------------------
// CONSOLE
@ -116,5 +118,3 @@ supported:
be guessed. Accepts: `any`, `common`, `cyrillic`, `english`, `french`,
`german`, `hebrew`, `hungarian`, `polish`, `romanian`, `russian`,
`spanish`.

View File

@ -53,8 +53,6 @@ you can use JavaScript as follows:
[source,js]
----
DELETE test
PUT test/doc/1
{
"num": 1.0
@ -96,8 +94,6 @@ you can use JavaScript as follows:
[source,js]
----
DELETE test
PUT test/doc/1
{
"num": 1.0
@ -129,7 +125,6 @@ GET test/_search
}
}
}
----
// CONSOLE
@ -145,7 +140,7 @@ You can save your scripts to a file in the `config/scripts/` directory on
every node. The `.javascript` file suffix identifies the script as containing
JavaScript:
First, save this file as `config/scripts/my_script.javascript` on every node
First, save this file as `config/scripts/my_script.js` on every node
in the cluster:
[source,js]
@ -157,8 +152,6 @@ then use the script as follows:
[source,js]
----
DELETE test
PUT test/doc/1
{
"num": 1.0
@ -185,9 +178,7 @@ GET test/_search
}
}
}
----
// CONSOLE
<1> The function score query retrieves the script with filename `my_script.javascript`.

View File

@ -52,8 +52,6 @@ you can use Python as follows:
[source,js]
----
DELETE test
PUT test/doc/1
{
"num": 1.0
@ -95,8 +93,6 @@ you can use Python as follows:
[source,js]
----
DELETE test
PUT test/doc/1
{
"num": 1.0
@ -156,8 +152,6 @@ then use the script as follows:
[source,js]
----
DELETE test
PUT test/doc/1
{
"num": 1.0
@ -184,9 +178,7 @@ GET test/_search
}
}
}
----
// CONSOLE
<1> The function score query retrieves the script with filename `my_script.py`.

View File

@ -57,12 +57,13 @@ Index a new document populated with a `base64`-encoded attachment:
[source,js]
--------------------------
POST /trying-out-mapper-attachments/person/1
POST /trying-out-mapper-attachments/person/1?refresh
{
"cv": "e1xydGYxXGFuc2kNCkxvcmVtIGlwc3VtIGRvbG9yIHNpdCBhbWV0DQpccGFyIH0="
}
--------------------------
// CONSOLE
// TEST[continued]
Search for the document using words in the attachment:
@ -76,8 +77,35 @@ POST /trying-out-mapper-attachments/person/_search
}}}
--------------------------
// CONSOLE
// TEST[continued]
If you get a hit for your indexed document, the plugin should be installed and working.
If you get a hit for your indexed document, the plugin should be installed and working. It'll look like:
[source,js]
--------------------------
{
"timed_out": false,
"took": 53,
"hits": {
"total": 1,
"max_score": 0.3125,
"hits": [
{
"_score": 0.3125,
"_index": "trying-out-mapper-attachments",
"_type": "person",
"_id": "1",
"_source": {
"cv": "e1xydGYxXGFuc2kNCkxvcmVtIGlwc3VtIGRvbG9yIHNpdCBhbWV0DQpccGFyIH0="
}
}
]
},
"_shards": ...
}
--------------------------
// TESTRESPONSE[s/"took": 53/"took": "$body.took"/]
// TESTRESPONSE[s/"_shards": \.\.\./"_shards": "$body._shards"/]
[[mapper-attachments-usage]]
==== Usage
@ -87,13 +115,14 @@ Using the attachment type is simple, in your mapping JSON, simply set a certain
[source,js]
--------------------------
PUT /test
PUT /test/person/_mapping
{
"mappings": {
"person" : {
"properties" : {
"my_attachment" : { "type" : "attachment" }
}
"properties" : {
"my_attachment" : { "type" : "attachment" }
}
}
}
}
--------------------------
// CONSOLE
@ -146,25 +175,40 @@ in the mappings. For example:
[source,js]
--------------------------
PUT /test/person/_mapping
PUT /test
{
"person" : {
"properties" : {
"file" : {
"type" : "attachment",
"fields" : {
"content" : {"index" : "no"},
"title" : {"store" : "yes"},
"date" : {"store" : "yes"},
"author" : {"analyzer" : "myAnalyzer"},
"keywords" : {"store" : "yes"},
"content_type" : {"store" : "yes"},
"content_length" : {"store" : "yes"},
"language" : {"store" : "yes"}
}
}
"settings": {
"index": {
"analysis": {
"analyzer": {
"my_analyzer": {
"type": "custom",
"tokenizer": "standard",
"filter": ["standard"]
}
}
}
}
},
"mappings": {
"person" : {
"properties" : {
"file" : {
"type" : "attachment",
"fields" : {
"content" : {"index" : "no"},
"title" : {"store" : "yes"},
"date" : {"store" : "yes"},
"author" : {"analyzer" : "my_analyzer"},
"keywords" : {"store" : "yes"},
"content_type" : {"store" : "yes"},
"content_length" : {"store" : "yes"},
"language" : {"store" : "yes"}
}
}
}
}
}
}
--------------------------
// CONSOLE
@ -179,7 +223,6 @@ If you need to query on metadata fields, use the attachment field name dot the m
[source,js]
--------------------------
DELETE /test
PUT /test
PUT /test/person/_mapping
{
@ -300,7 +343,6 @@ If you want to highlight your attachment content, you will need to set `"store":
[source,js]
--------------------------
DELETE /test
PUT /test
PUT /test/person/_mapping
{

View File

@ -52,8 +52,7 @@ PUT my_index
--------------------------
// CONSOLE
The value of the `_size` field is accessible in queries, aggregations, scripts,
and when sorting:
The value of the `_size` field is accessible in queries:
[source,js]
--------------------------
@ -76,33 +75,10 @@ GET my_index/_search
"gt": 10
}
}
},
"aggs": {
"Sizes": {
"terms": {
"field": "_size", <2>
"size": 10
}
}
},
"sort": [
{
"_size": { <3>
"order": "desc"
}
}
],
"script_fields": {
"Size": {
"script": "doc['_size']" <4>
}
}
}
--------------------------
// CONSOLE
// TEST[continued]
<1> Querying on the `_size` field
<2> Aggregating on the `_size` field
<3> Sorting on the `_size` field
<4> Accessing the `_size` field in scripts (inline scripts must be modules-security-scripting.html#enable-dynamic-scripting[enabled] for this example to work)

View File

@ -168,6 +168,7 @@ PUT _snapshot/my_backup4
}
----
// CONSOLE
// TEST[skip:we don't have azure setup while testing this]
Example using Java:
@ -207,4 +208,3 @@ be a valid DNS name, conforming to the following naming rules:
permitted in container names.
* All letters in a container name must be lowercase.
* Container names must be from 3 through 63 characters long.

View File

@ -82,7 +82,7 @@ https://console.cloud.google.com/compute/[Compute Engine console].
To indicate that a repository should use the built-in authentication,
the repository `service_account` setting must be set to `_default_`:
[source,json]
[source,js]
----
PUT _snapshot/my_gcs_repository_on_compute_engine
{
@ -94,6 +94,7 @@ PUT _snapshot/my_gcs_repository_on_compute_engine
}
----
// CONSOLE
// TEST[skip:we don't have gcs setup while testing this]
NOTE: The Compute Engine VM must be allowed to use the Storage service. This can be done only at VM
creation time, when "Storage" access can be configured to "Read/Write" permission. Check your
@ -115,7 +116,7 @@ To create a service account file:
A service account file looks like this:
[source,json]
[source,js]
----
{
"type": "service_account",
@ -136,7 +137,7 @@ every node of the cluster.
To indicate that a repository should use a service account file:
[source,json]
[source,js]
----
PUT _snapshot/my_gcs_repository
{
@ -148,7 +149,7 @@ PUT _snapshot/my_gcs_repository
}
----
// CONSOLE
// TEST[skip:we don't have gcs setup while testing this]
[[repository-gcs-bucket-permission]]
===== Set Bucket Permission
@ -168,7 +169,7 @@ The service account used to access the bucket must have the "Writer" access to t
Once everything is installed and every node is started, you can create a new repository that
uses Google Cloud Storage to store snapshots:
[source,json]
[source,js]
----
PUT _snapshot/my_gcs_repository
{
@ -180,6 +181,7 @@ PUT _snapshot/my_gcs_repository
}
----
// CONSOLE
// TEST[skip:we don't have gcs setup while testing this]
The following settings are supported:

View File

@ -149,6 +149,7 @@ PUT _snapshot/my_s3_repository
}
----
// CONSOLE
// TEST[skip:we don't have s3 set up while testing this]
The following settings are supported:

View File

@ -0,0 +1 @@
.+ {200};

View File

@ -0,0 +1 @@
doc["num"].value * factor

View File

@ -0,0 +1 @@
doc["num"].value * factor

View File

@ -0,0 +1 @@
東京スカイツリー,東京 スカイツリー,トウキョウ スカイツリー,カスタム名詞