Rest: Add json in request body to scroll, clear scroll, and analyze API
Change analyze.asciidoc and scroll.asciidoc. Add JSON body support to the Analyze, Scroll, and Clear Scroll APIs. Add rest-api-spec tests. Closes #5866
This commit is contained in:
parent
18ede79ed5
commit
0955c127c0
@@ -9,26 +9,47 @@ analyzers:

[source,js]
--------------------------------------------------
curl -XGET 'localhost:9200/_analyze?analyzer=standard' -d 'this is a test'
curl -XGET 'localhost:9200/_analyze' -d '
{
  "analyzer" : "standard",
  "text" : "this is a test"
}'
--------------------------------------------------

coming[2.0.0, body based parameters were added in 2.0.0]

Or by building a custom transient analyzer out of tokenizers,
token filters and char filters. Token filters can use the shorter 'filters'
parameter name:

[source,js]
--------------------------------------------------
curl -XGET 'localhost:9200/_analyze?tokenizer=keyword&filters=lowercase' -d 'this is a test'

curl -XGET 'localhost:9200/_analyze?tokenizer=keyword&token_filters=lowercase&char_filters=html_strip' -d 'this is a <b>test</b>'
curl -XGET 'localhost:9200/_analyze' -d '
{
  "tokenizer" : "keyword",
  "filters" : ["lowercase"],
  "text" : "this is a test"
}'

curl -XGET 'localhost:9200/_analyze' -d '
{
  "tokenizer" : "keyword",
  "token_filters" : ["lowercase"],
  "char_filters" : ["html_strip"],
  "text" : "this is a <b>test</b>"
}'
--------------------------------------------------

coming[2.0.0, body based parameters were added in 2.0.0]

It can also run against a specific index:

[source,js]
--------------------------------------------------
curl -XGET 'localhost:9200/test/_analyze?text=this+is+a+test'
curl -XGET 'localhost:9200/test/_analyze' -d '
{
  "text" : "this is a test"
}'
--------------------------------------------------

The above will run an analysis on the "this is a test" text, using the
@@ -37,18 +58,42 @@ can also be provided to use a different analyzer:

[source,js]
--------------------------------------------------
curl -XGET 'localhost:9200/test/_analyze?analyzer=whitespace' -d 'this is a test'
curl -XGET 'localhost:9200/test/_analyze' -d '
{
  "analyzer" : "whitespace",
  "text" : "this is a test"
}'
--------------------------------------------------

coming[2.0.0, body based parameters were added in 2.0.0]

Also, the analyzer can be derived based on a field mapping, for example:

[source,js]
--------------------------------------------------
curl -XGET 'localhost:9200/test/_analyze?field=obj1.field1' -d 'this is a test'
curl -XGET 'localhost:9200/test/_analyze' -d '
{
  "field" : "obj1.field1",
  "text" : "this is a test"
}'
--------------------------------------------------

coming[2.0.0, body based parameters were added in 2.0.0]

Will cause the analysis to happen based on the analyzer configured in the
mapping for `obj1.field1` (and if not, the default index analyzer).

Also, the text can be provided as part of the request body instead of as a
parameter.
All parameters can also be supplied as request parameters. For example:

[source,js]
--------------------------------------------------
curl -XGET 'localhost:9200/_analyze?tokenizer=keyword&filters=lowercase&text=this+is+a+test'
--------------------------------------------------

For backwards compatibility, we also accept the text parameter as the body of the request,
provided it doesn't start with `{`:

[source,js]
--------------------------------------------------
curl -XGET 'localhost:9200/_analyze?tokenizer=keyword&token_filters=lowercase&char_filters=html_strip' -d 'this is a <b>test</b>'
--------------------------------------------------
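The body-handling rule described above can be sketched as follows: a body that starts with `{` is parsed as JSON parameters (which, per the rest tests in this commit, win over the query string), while any other body is treated as the raw text to analyze. `effective_analyze_params` is a hypothetical helper for illustration, not part of Elasticsearch:

```python
import json

def effective_analyze_params(query_params: dict, body: str) -> dict:
    """Merge _analyze inputs: a JSON-looking body carries parameters
    that override the query string; any other body is the raw text."""
    params = dict(query_params)
    stripped = body.strip()
    if stripped.startswith("{"):
        # Structured body: its fields win over query-string parameters.
        params.update(json.loads(stripped))
    elif stripped and "text" not in params:
        # Plain-text body kept for backwards compatibility.
        params["text"] = stripped
    return params
```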
@@ -55,20 +55,35 @@ results.

[source,js]
--------------------------------------------------
curl -XGET <1> 'localhost:9200/_search/scroll?scroll=1m' <2> <3> \
-d 'c2Nhbjs2OzM0NDg1ODpzRlBLc0FXNlNyNm5JWUc1' <4>
curl -XGET <1> 'localhost:9200/_search/scroll' <2> -d'
{
  "scroll" : "1m", <3>
  "scroll_id" : "c2Nhbjs2OzM0NDg1ODpzRlBLc0FXNlNyNm5JWUc1" <4>
}
'
--------------------------------------------------

coming[2.0.0, body based parameters were added in 2.0.0]

<1> `GET` or `POST` can be used.
<2> The URL should not include the `index` or `type` name -- these
are specified in the original `search` request instead.
<3> The `scroll` parameter tells Elasticsearch to keep the search context open
for another `1m`.
<4> The `scroll_id` can be passed in the request body or in the
query string as `?scroll_id=....`
<4> The `scroll_id` parameter

Each call to the `scroll` API returns the next batch of results until there
are no more results left to return, i.e. the `hits` array is empty.

For backwards compatibility, `scroll_id` and `scroll` can be passed in the query string,
and the `scroll_id` can be passed in the request body:

[source,js]
--------------------------------------------------
curl -XGET <1> 'localhost:9200/_search/scroll?scroll=1m' <2> <3> \
-d 'c2Nhbjs2OzM0NDg1ODpzRlBLc0FXNlNyNm5JWUc1' <4>
--------------------------------------------------

IMPORTANT: The initial search request and each subsequent scroll request
returns a new `scroll_id` -- only the most recent `scroll_id` should be
used.
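The consumption pattern described above (repeat the scroll call with the most recent `scroll_id` until the `hits` array comes back empty) can be sketched like this; `fetch_scroll_page` is a stand-in for the HTTP call to `_search/scroll`, not a real client method:

```python
def scroll_all(first_page: dict, fetch_scroll_page) -> list:
    """Drain a scrolled search: keep requesting pages, always passing
    the most recent scroll_id, until a page has no hits."""
    hits = list(first_page["hits"]["hits"])
    scroll_id = first_page["_scroll_id"]
    while True:
        page = fetch_scroll_page({"scroll": "1m", "scroll_id": scroll_id})
        if not page["hits"]["hits"]:
            return hits
        hits.extend(page["hits"]["hits"])
        # Each response returns a new scroll_id; only the latest is valid.
        scroll_id = page["_scroll_id"]
```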
@@ -168,19 +183,26 @@ clear a search context manually with the `clear-scroll` API:

[source,js]
---------------------------------------
curl -XDELETE localhost:9200/_search/scroll \
-d 'c2Nhbjs2OzM0NDg1ODpzRlBLc0FXNlNyNm5JWUc1' <1>
curl -XDELETE localhost:9200/_search/scroll -d '
{
  "scroll_id" : ["c2Nhbjs2OzM0NDg1ODpzRlBLc0FXNlNyNm5JWUc1"]
}'
---------------------------------------
<1> The `scroll_id` can be passed in the request body or in the query string.

Multiple scroll IDs can be passed as comma separated values:
coming[2.0.0, body based parameters were added in 2.0.0]

Multiple scroll IDs can be passed as an array:

[source,js]
---------------------------------------
curl -XDELETE localhost:9200/_search/scroll \
-d 'c2Nhbjs2OzM0NDg1ODpzRlBLc0FXNlNyNm5JWUc1,aGVuRmV0Y2g7NTsxOnkxaDZ' <1>
curl -XDELETE localhost:9200/_search/scroll -d '
{
  "scroll_id" : ["c2Nhbjs2OzM0NDg1ODpzRlBLc0FXNlNyNm5JWUc1", "aGVuRmV0Y2g7NTsxOnkxaDZ"]
}'
---------------------------------------

coming[2.0.0, body based parameters were added in 2.0.0]

All search contexts can be cleared with the `_all` parameter:

[source,js]
@@ -188,3 +210,12 @@ All search contexts can be cleared with the `_all` parameter:

curl -XDELETE localhost:9200/_search/scroll/_all
---------------------------------------

The `scroll_id` can also be passed as a query string parameter or in the request body.
Multiple scroll IDs can be passed as comma separated values:

[source,js]
---------------------------------------
curl -XDELETE localhost:9200/_search/scroll \
-d 'c2Nhbjs2OzM0NDg1ODpzRlBLc0FXNlNyNm5JWUc1,aGVuRmV0Y2g7NTsxOnkxaDZ'
---------------------------------------
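The two forms above map onto each other mechanically: comma separated ids in the query string, an array under `scroll_id` in the JSON body. A hypothetical converter (not part of any client) makes the relationship concrete:

```python
def clear_scroll_body(comma_separated_ids: str) -> dict:
    """Convert the query-string form of scroll ids into the
    JSON-body form, which expects an array under scroll_id."""
    ids = [s for s in comma_separated_ids.split(",") if s]
    return {"scroll_id": ids}
```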
@@ -48,3 +48,18 @@ setup:

  - length: { tokens: 2 }
  - match: { tokens.0.token: Foo }
  - match: { tokens.1.token: Bar! }

---
"JSON in Body":
  - do:
      indices.analyze:
        body: { "text": "Foo Bar", "filters": ["lowercase"], "tokenizer": keyword }
  - length: { tokens: 1 }
  - match: { tokens.0.token: foo bar }

---
"Body params override query string":
  - do:
      indices.analyze:
        text: Foo Bar
        body: { "text": "Bar Foo", "filters": ["lowercase"], "tokenizer": keyword }
  - length: { tokens: 1 }
  - match: { tokens.0.token: bar foo }
@@ -112,8 +112,7 @@

  - do:
      scroll:
        scroll_id: $scroll_id
        scroll: 1m
        body: { "scroll_id": "$scroll_id", "scroll": "1m" }

  - match: { hits.total: 2 }
  - length: { hits.hits: 1 }

@@ -131,3 +130,63 @@

      clear_scroll:
        scroll_id: $scroll_id

---
"Body params override query string":
  - do:
      indices.create:
        index: test_scroll
  - do:
      index:
        index: test_scroll
        type: test
        id: 42
        body: { foo: 1 }

  - do:
      index:
        index: test_scroll
        type: test
        id: 43
        body: { foo: 2 }

  - do:
      indices.refresh: {}

  - do:
      search:
        index: test_scroll
        size: 1
        scroll: 1m
        sort: foo
        body:
          query:
            match_all: {}

  - set: { _scroll_id: scroll_id }
  - match: { hits.total: 2 }
  - length: { hits.hits: 1 }
  - match: { hits.hits.0._id: "42" }

  - do:
      index:
        index: test_scroll
        type: test
        id: 44
        body: { foo: 3 }

  - do:
      indices.refresh: {}

  - do:
      scroll:
        scroll_id: invalid_scroll_id
        body: { "scroll_id": "$scroll_id", "scroll": "1m" }

  - match: { hits.total: 2 }
  - length: { hits.hits: 1 }
  - match: { hits.hits.0._id: "43" }

  - do:
      clear_scroll:
        scroll_id: $scroll_id
@@ -37,3 +37,44 @@

      catch: missing
      clear_scroll:
        scroll_id: $scroll_id1

---
"Body params override query string":
  - do:
      indices.create:
        index: test_scroll
  - do:
      index:
        index: test_scroll
        type: test
        id: 42
        body: { foo: bar }

  - do:
      indices.refresh: {}

  - do:
      search:
        index: test_scroll
        search_type: scan
        scroll: 1m
        body:
          query:
            match_all: {}

  - set: { _scroll_id: scroll_id1 }

  - do:
      clear_scroll:
        scroll_id: "invalid_scroll_id"
        body: { "scroll_id": [ "$scroll_id1" ] }

  - do:
      catch: missing
      scroll:
        scroll_id: $scroll_id1

  - do:
      catch: missing
      clear_scroll:
        scroll_id: $scroll_id1
@@ -53,29 +53,23 @@ public class AnalyzeRequest extends SingleCustomOperationRequest<AnalyzeRequest>

    }

    /**
     * Constructs a new analyzer request for the provided text.
     * Constructs a new analyzer request for the provided index.
     *
     * @param text The text to analyze
     * @param index The index name
     */
    public AnalyzeRequest(String text) {
        this.text = text;
    }

    /**
     * Constructs a new analyzer request for the provided index and text.
     *
     * @param index The index name
     * @param text The text to analyze
     */
    public AnalyzeRequest(@Nullable String index, String text) {
    public AnalyzeRequest(String index) {
        this.index(index);
        this.text = text;
    }

    public String text() {
        return this.text;
    }

    public AnalyzeRequest text(String text) {
        this.text = text;
        return this;
    }

    public AnalyzeRequest analyzer(String analyzer) {
        this.analyzer = analyzer;
        return this;
@@ -32,7 +32,7 @@ public class AnalyzeRequestBuilder extends SingleCustomOperationRequestBuilder<A

    }

    public AnalyzeRequestBuilder(IndicesAdminClient indicesClient, String index, String text) {
        super(indicesClient, new AnalyzeRequest(index, text));
        super(indicesClient, new AnalyzeRequest(index).text(text));
    }

    /**
@@ -18,15 +18,24 @@

 */
package org.elasticsearch.rest.action.admin.indices.analyze;

import com.google.common.collect.Lists;
import org.elasticsearch.ElasticsearchIllegalArgumentException;
import org.elasticsearch.action.admin.indices.analyze.AnalyzeRequest;
import org.elasticsearch.action.admin.indices.analyze.AnalyzeResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.inject.Inject;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentHelper;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.rest.*;
import org.elasticsearch.rest.action.support.RestToXContentListener;

import java.io.IOException;
import java.util.List;

import static org.elasticsearch.rest.RestRequest.Method.GET;
import static org.elasticsearch.rest.RestRequest.Method.POST;

@@ -47,14 +56,8 @@ public class RestAnalyzeAction extends BaseRestHandler {

    @Override
    public void handleRequest(final RestRequest request, final RestChannel channel, final Client client) {
        String text = request.param("text");
        if (text == null && request.hasContent()) {
            text = request.content().toUtf8();
        }
        if (text == null) {
            throw new ElasticsearchIllegalArgumentException("text is missing");
        }

        AnalyzeRequest analyzeRequest = new AnalyzeRequest(request.param("index"), text);
        AnalyzeRequest analyzeRequest = new AnalyzeRequest(request.param("index"));
        analyzeRequest.text(text);
        analyzeRequest.listenerThreaded(false);
        analyzeRequest.preferLocal(request.paramAsBoolean("prefer_local", analyzeRequest.preferLocalShard()));
        analyzeRequest.analyzer(request.param("analyzer"));

@@ -62,6 +65,73 @@ public class RestAnalyzeAction extends BaseRestHandler {

        analyzeRequest.tokenizer(request.param("tokenizer"));
        analyzeRequest.tokenFilters(request.paramAsStringArray("token_filters", request.paramAsStringArray("filters", analyzeRequest.tokenFilters())));
        analyzeRequest.charFilters(request.paramAsStringArray("char_filters", analyzeRequest.charFilters()));

        if (request.hasContent()) {
            XContentType type = XContentFactory.xContentType(request.content());
            if (type == null) {
                if (text == null) {
                    text = request.content().toUtf8();
                    analyzeRequest.text(text);
                }
            } else {
                // NOTE: if a rest request with an xcontent body also has request parameters, the parameters do not override the xcontent values
                buildFromContent(request.content(), analyzeRequest);
            }
        }

        client.admin().indices().analyze(analyzeRequest, new RestToXContentListener<AnalyzeResponse>(channel));
    }

    public static void buildFromContent(BytesReference content, AnalyzeRequest analyzeRequest) throws ElasticsearchIllegalArgumentException {
        try (XContentParser parser = XContentHelper.createParser(content)) {
            if (parser.nextToken() != XContentParser.Token.START_OBJECT) {
                throw new ElasticsearchIllegalArgumentException("Malformed content, must start with an object");
            } else {
                XContentParser.Token token;
                String currentFieldName = null;
                while ((token = parser.nextToken()) != XContentParser.Token.END_OBJECT) {
                    if (token == XContentParser.Token.FIELD_NAME) {
                        currentFieldName = parser.currentName();
                    } else if ("prefer_local".equals(currentFieldName) && token == XContentParser.Token.VALUE_BOOLEAN) {
                        analyzeRequest.preferLocal(parser.booleanValue());
                    } else if ("text".equals(currentFieldName) && token == XContentParser.Token.VALUE_STRING) {
                        analyzeRequest.text(parser.text());
                    } else if ("analyzer".equals(currentFieldName) && token == XContentParser.Token.VALUE_STRING) {
                        analyzeRequest.analyzer(parser.text());
                    } else if ("field".equals(currentFieldName) && token == XContentParser.Token.VALUE_STRING) {
                        analyzeRequest.field(parser.text());
                    } else if ("tokenizer".equals(currentFieldName) && token == XContentParser.Token.VALUE_STRING) {
                        analyzeRequest.tokenizer(parser.text());
                    } else if (("token_filters".equals(currentFieldName) || "filters".equals(currentFieldName)) && token == XContentParser.Token.START_ARRAY) {
                        List<String> filters = Lists.newArrayList();
                        while ((token = parser.nextToken()) != XContentParser.Token.END_ARRAY) {
                            if (token.isValue() == false) {
                                throw new ElasticsearchIllegalArgumentException(currentFieldName + " array element should only contain token filter's name");
                            }
                            filters.add(parser.text());
                        }
                        analyzeRequest.tokenFilters(filters.toArray(new String[0]));
                    } else if ("char_filters".equals(currentFieldName) && token == XContentParser.Token.START_ARRAY) {
                        List<String> charFilters = Lists.newArrayList();
                        while ((token = parser.nextToken()) != XContentParser.Token.END_ARRAY) {
                            if (token.isValue() == false) {
                                throw new ElasticsearchIllegalArgumentException(currentFieldName + " array element should only contain char filter's name");
                            }
                            charFilters.add(parser.text());
                        }
                        analyzeRequest.charFilters(charFilters.toArray(new String[0]));
                    } else {
                        throw new ElasticsearchIllegalArgumentException("Unknown parameter [" + currentFieldName + "] in request body or parameter is of the wrong type[" + token + "] ");
                    }
                }
            }
        } catch (IOException e) {
            throw new ElasticsearchIllegalArgumentException("Failed to parse request body", e);
        }
    }

}
@@ -19,16 +19,23 @@

package org.elasticsearch.rest.action.search;

import org.elasticsearch.ElasticsearchIllegalArgumentException;
import org.elasticsearch.action.search.ClearScrollRequest;
import org.elasticsearch.action.search.ClearScrollResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.inject.Inject;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentHelper;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.rest.*;
import org.elasticsearch.rest.action.support.RestActions;
import org.elasticsearch.rest.action.support.RestStatusToXContentListener;

import java.io.IOException;
import java.util.Arrays;

import static org.elasticsearch.rest.RestRequest.Method.DELETE;

@@ -48,12 +55,20 @@ public class RestClearScrollAction extends BaseRestHandler {

    @Override
    public void handleRequest(final RestRequest request, final RestChannel channel, final Client client) {
        String scrollIds = request.param("scroll_id");
        if (scrollIds == null) {
            scrollIds = RestActions.getRestContent(request).toUtf8();
        }

        ClearScrollRequest clearRequest = new ClearScrollRequest();
        clearRequest.setScrollIds(Arrays.asList(splitScrollIds(scrollIds)));
        if (request.hasContent()) {
            XContentType type = XContentFactory.xContentType(request.content());
            if (type == null) {
                scrollIds = RestActions.getRestContent(request).toUtf8();
                clearRequest.setScrollIds(Arrays.asList(splitScrollIds(scrollIds)));
            } else {
                // NOTE: if a rest request with an xcontent body also has request parameters, the parameters do not override the xcontent values
                clearRequest.setScrollIds(null);
                buildFromContent(request.content(), clearRequest);
            }
        }

        client.clearScroll(clearRequest, new RestStatusToXContentListener<ClearScrollResponse>(channel));
    }

@@ -63,4 +78,32 @@ public class RestClearScrollAction extends BaseRestHandler {

        }
        return Strings.splitStringByCommaToArray(scrollIds);
    }

    public static void buildFromContent(BytesReference content, ClearScrollRequest clearScrollRequest) throws ElasticsearchIllegalArgumentException {
        try (XContentParser parser = XContentHelper.createParser(content)) {
            if (parser.nextToken() != XContentParser.Token.START_OBJECT) {
                throw new ElasticsearchIllegalArgumentException("Malformed content, must start with an object");
            } else {
                XContentParser.Token token;
                String currentFieldName = null;
                while ((token = parser.nextToken()) != XContentParser.Token.END_OBJECT) {
                    if (token == XContentParser.Token.FIELD_NAME) {
                        currentFieldName = parser.currentName();
                    } else if ("scroll_id".equals(currentFieldName) && token == XContentParser.Token.START_ARRAY) {
                        while ((token = parser.nextToken()) != XContentParser.Token.END_ARRAY) {
                            if (token.isValue() == false) {
                                throw new ElasticsearchIllegalArgumentException("scroll_id array element should only contain scroll_id");
                            }
                            clearScrollRequest.addScrollId(parser.text());
                        }
                    } else {
                        throw new ElasticsearchIllegalArgumentException("Unknown parameter [" + currentFieldName + "] in request body or parameter is of the wrong type[" + token + "] ");
                    }
                }
            }
        } catch (IOException e) {
            throw new ElasticsearchIllegalArgumentException("Failed to parse request body", e);
        }
    }

}
@@ -19,16 +19,25 @@

package org.elasticsearch.rest.action.search;

import org.elasticsearch.ElasticsearchIllegalArgumentException;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.action.search.SearchScrollRequest;
import org.elasticsearch.client.Client;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.inject.Inject;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentHelper;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.rest.*;
import org.elasticsearch.rest.action.support.RestActions;
import org.elasticsearch.rest.action.support.RestStatusToXContentListener;
import org.elasticsearch.search.Scroll;

import java.io.IOException;

import static org.elasticsearch.common.unit.TimeValue.parseTimeValue;
import static org.elasticsearch.rest.RestRequest.Method.GET;
import static org.elasticsearch.rest.RestRequest.Method.POST;

@@ -51,16 +60,51 @@ public class RestSearchScrollAction extends BaseRestHandler {

    @Override
    public void handleRequest(final RestRequest request, final RestChannel channel, final Client client) {
        String scrollId = request.param("scroll_id");
        if (scrollId == null) {
            scrollId = RestActions.getRestContent(request).toUtf8();
        }
        SearchScrollRequest searchScrollRequest = new SearchScrollRequest(scrollId);
        SearchScrollRequest searchScrollRequest = new SearchScrollRequest();
        searchScrollRequest.listenerThreaded(false);
        searchScrollRequest.scrollId(scrollId);
        String scroll = request.param("scroll");
        if (scroll != null) {
            searchScrollRequest.scroll(new Scroll(parseTimeValue(scroll, null)));
        }

        if (request.hasContent()) {
            XContentType type = XContentFactory.xContentType(request.content());
            if (type == null) {
                if (scrollId == null) {
                    scrollId = RestActions.getRestContent(request).toUtf8();
                    searchScrollRequest.scrollId(scrollId);
                }
            } else {
                // NOTE: if a rest request with an xcontent body also has request parameters, the parameters do not override the xcontent values
                buildFromContent(request.content(), searchScrollRequest);
            }
        }
        client.searchScroll(searchScrollRequest, new RestStatusToXContentListener<SearchResponse>(channel));
    }

    public static void buildFromContent(BytesReference content, SearchScrollRequest searchScrollRequest) throws ElasticsearchIllegalArgumentException {
        try (XContentParser parser = XContentHelper.createParser(content)) {
            if (parser.nextToken() != XContentParser.Token.START_OBJECT) {
                throw new ElasticsearchIllegalArgumentException("Malformed content, must start with an object");
            } else {
                XContentParser.Token token;
                String currentFieldName = null;
                while ((token = parser.nextToken()) != XContentParser.Token.END_OBJECT) {
                    if (token == XContentParser.Token.FIELD_NAME) {
                        currentFieldName = parser.currentName();
                    } else if ("scroll_id".equals(currentFieldName) && token == XContentParser.Token.VALUE_STRING) {
                        searchScrollRequest.scrollId(parser.text());
                    } else if ("scroll".equals(currentFieldName) && token == XContentParser.Token.VALUE_STRING) {
                        searchScrollRequest.scroll(new Scroll(TimeValue.parseTimeValue(parser.text(), null)));
                    } else {
                        throw new ElasticsearchIllegalArgumentException("Unknown parameter [" + currentFieldName + "] in request body or parameter is of the wrong type[" + token + "] ");
                    }
                }
            }
        } catch (IOException e) {
            throw new ElasticsearchIllegalArgumentException("Failed to parse request body", e);
        }
    }

}
@@ -180,7 +180,8 @@ public class IndicesRequestTests extends ElasticsearchIntegrationTest {

        String analyzeShardAction = AnalyzeAction.NAME + "[s]";
        interceptTransportActions(analyzeShardAction);

        AnalyzeRequest analyzeRequest = new AnalyzeRequest(randomIndexOrAlias(), "text");
        AnalyzeRequest analyzeRequest = new AnalyzeRequest(randomIndexOrAlias());
        analyzeRequest.text("text");
        internalCluster().clientNodeClient().admin().indices().analyze(analyzeRequest).actionGet();

        clearInterceptedActions();
@@ -21,15 +21,19 @@ package org.elasticsearch.indices.analyze;

import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.ElasticsearchIllegalArgumentException;
import org.elasticsearch.action.admin.indices.alias.Alias;
import org.elasticsearch.action.admin.indices.analyze.AnalyzeRequest;
import org.elasticsearch.action.admin.indices.analyze.AnalyzeRequestBuilder;
import org.elasticsearch.action.admin.indices.analyze.AnalyzeResponse;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.xcontent.*;
import org.elasticsearch.rest.action.admin.indices.analyze.RestAnalyzeAction;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;

import java.io.IOException;

import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*;
import static org.hamcrest.Matchers.*;

/**

@@ -191,4 +195,56 @@ public class AnalyzeActionTests extends ElasticsearchIntegrationTest {

    private static String indexOrAlias() {
        return randomBoolean() ? "test" : "alias";
    }

    @Test
    public void testParseXContentForAnalyzeRequest() throws Exception {
        BytesReference content = XContentFactory.jsonBuilder()
                .startObject()
                .field("text", "THIS IS A TEST")
                .field("tokenizer", "keyword")
                .array("filters", "lowercase")
                .endObject().bytes();

        AnalyzeRequest analyzeRequest = new AnalyzeRequest("for test");

        RestAnalyzeAction.buildFromContent(content, analyzeRequest);

        assertThat(analyzeRequest.text(), equalTo("THIS IS A TEST"));
        assertThat(analyzeRequest.tokenizer(), equalTo("keyword"));
        assertThat(analyzeRequest.tokenFilters(), equalTo(new String[]{"lowercase"}));
    }

    @Test
    public void testParseXContentForAnalyzeRequestWithInvalidJsonThrowsException() throws Exception {
        AnalyzeRequest analyzeRequest = new AnalyzeRequest("for test");
        BytesReference invalidContent = XContentFactory.jsonBuilder().startObject().value("invalid_json").endObject().bytes();

        try {
            RestAnalyzeAction.buildFromContent(invalidContent, analyzeRequest);
            fail("shouldn't get here");
        } catch (Exception e) {
            assertThat(e, instanceOf(ElasticsearchIllegalArgumentException.class));
            assertThat(e.getMessage(), equalTo("Failed to parse request body"));
        }
    }

    @Test
    public void testParseXContentForAnalyzeRequestWithUnknownParamThrowsException() throws Exception {
        AnalyzeRequest analyzeRequest = new AnalyzeRequest("for test");
        BytesReference invalidContent = XContentFactory.jsonBuilder()
                .startObject()
                .field("text", "THIS IS A TEST")
                .field("unknown", "keyword")
                .endObject().bytes();

        try {
            RestAnalyzeAction.buildFromContent(invalidContent, analyzeRequest);
            fail("shouldn't get here");
        } catch (Exception e) {
            assertThat(e, instanceOf(ElasticsearchIllegalArgumentException.class));
            assertThat(e.getMessage(), startsWith("Unknown parameter [unknown]"));
        }
    }

}
@ -20,17 +20,18 @@
|
|||
package org.elasticsearch.search.scroll;

import org.elasticsearch.ElasticsearchIllegalArgumentException;
import org.elasticsearch.action.search.ClearScrollResponse;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.action.search.SearchType;
import org.elasticsearch.action.search.*;
import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.common.Priority;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.common.util.concurrent.UncategorizedExecutionException;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.rest.RestStatus;
import org.elasticsearch.rest.action.search.RestClearScrollAction;
import org.elasticsearch.rest.action.search.RestSearchScrollAction;
import org.elasticsearch.search.SearchHit;
import org.elasticsearch.search.sort.FieldSortBuilder;
import org.elasticsearch.search.sort.SortOrder;

@@ -45,11 +46,7 @@ import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
import static org.elasticsearch.index.query.QueryBuilders.queryStringQuery;
import static org.elasticsearch.index.query.QueryBuilders.termQuery;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.greaterThan;
import static org.hamcrest.Matchers.instanceOf;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.notNullValue;
import static org.hamcrest.Matchers.*;

/**
 *
@@ -490,4 +487,94 @@ public class SearchScrollTests extends ElasticsearchIntegrationTest {
        assertHitCount(response, 1);
        assertThat(response.getHits().getHits().length, equalTo(0));
    }

    @Test
    public void testParseSearchScrollRequest() throws Exception {
        BytesReference content = XContentFactory.jsonBuilder()
                .startObject()
                .field("scroll_id", "SCROLL_ID")
                .field("scroll", "1m")
                .endObject().bytes();

        SearchScrollRequest searchScrollRequest = new SearchScrollRequest();
        RestSearchScrollAction.buildFromContent(content, searchScrollRequest);

        assertThat(searchScrollRequest.scrollId(), equalTo("SCROLL_ID"));
        assertThat(searchScrollRequest.scroll().keepAlive(), equalTo(TimeValue.parseTimeValue("1m", null)));
    }

    @Test
    public void testParseSearchScrollRequestWithInvalidJsonThrowsException() throws Exception {
        SearchScrollRequest searchScrollRequest = new SearchScrollRequest();
        BytesReference invalidContent = XContentFactory.jsonBuilder().startObject()
                .value("invalid_json").endObject().bytes();

        try {
            RestSearchScrollAction.buildFromContent(invalidContent, searchScrollRequest);
            fail("expected parseContent failure");
        } catch (Exception e) {
            assertThat(e, instanceOf(ElasticsearchIllegalArgumentException.class));
            assertThat(e.getMessage(), equalTo("Failed to parse request body"));
        }
    }

    @Test
    public void testParseSearchScrollRequestWithUnknownParamThrowsException() throws Exception {
        SearchScrollRequest searchScrollRequest = new SearchScrollRequest();
        BytesReference invalidContent = XContentFactory.jsonBuilder().startObject()
                .field("scroll_id", "value_2")
                .field("unknown", "keyword")
                .endObject().bytes();

        try {
            RestSearchScrollAction.buildFromContent(invalidContent, searchScrollRequest);
            fail("expected parseContent failure");
        } catch (Exception e) {
            assertThat(e, instanceOf(ElasticsearchIllegalArgumentException.class));
            assertThat(e.getMessage(), startsWith("Unknown parameter [unknown]"));
        }
    }

    @Test
    public void testParseClearScrollRequest() throws Exception {
        BytesReference content = XContentFactory.jsonBuilder().startObject()
                .array("scroll_id", "value_1", "value_2")
                .endObject().bytes();
        ClearScrollRequest clearScrollRequest = new ClearScrollRequest();
        RestClearScrollAction.buildFromContent(content, clearScrollRequest);
        assertThat(clearScrollRequest.scrollIds(), contains("value_1", "value_2"));
    }

    @Test
    public void testParseClearScrollRequestWithInvalidJsonThrowsException() throws Exception {
        BytesReference invalidContent = XContentFactory.jsonBuilder().startObject()
                .value("invalid_json").endObject().bytes();
        ClearScrollRequest clearScrollRequest = new ClearScrollRequest();

        try {
            RestClearScrollAction.buildFromContent(invalidContent, clearScrollRequest);
            fail("expected parseContent failure");
        } catch (Exception e) {
            assertThat(e, instanceOf(ElasticsearchIllegalArgumentException.class));
            assertThat(e.getMessage(), equalTo("Failed to parse request body"));
        }
    }

    @Test
    public void testParseClearScrollRequestWithUnknownParamThrowsException() throws Exception {
        BytesReference invalidContent = XContentFactory.jsonBuilder().startObject()
                .array("scroll_id", "value_1", "value_2")
                .field("unknown", "keyword")
                .endObject().bytes();
        ClearScrollRequest clearScrollRequest = new ClearScrollRequest();

        try {
            RestClearScrollAction.buildFromContent(invalidContent, clearScrollRequest);
            fail("expected parseContent failure");
        } catch (Exception e) {
            assertThat(e, instanceOf(ElasticsearchIllegalArgumentException.class));
            assertThat(e.getMessage(), startsWith("Unknown parameter [unknown]"));
        }
    }

}
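For reference, the JSON bodies these parsers accept can be sketched outside the test suite. This is a minimal illustration built from the field names the tests exercise (`scroll_id`, `scroll`); the id values are placeholders, not real scroll ids:

```python
import json

# Body-based scroll request, mirroring testParseSearchScrollRequest:
# a scroll_id plus a keep-alive duration.
scroll_body = {"scroll_id": "SCROLL_ID", "scroll": "1m"}

# Body-based clear-scroll request, mirroring testParseClearScrollRequest:
# scroll_id may be an array of ids to clear at once.
clear_body = {"scroll_id": ["value_1", "value_2"]}

print(json.dumps(scroll_body))
print(json.dumps(clear_body))
```

Any key outside this shape (e.g. `"unknown"`) is rejected by the parsers with `Unknown parameter [...]`, as the tests above assert.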