API: Add response filtering with filter_path parameter

This change adds a new "filter_path" parameter that can be used to filter and reduce the responses returned by the REST API of elasticsearch.

For example, returning only the `_shards.failed` count of an optimize request:
```
curl -XPOST 'localhost:9200/beer/_optimize?filter_path=_shards.failed'
{"_shards":{"failed":0}}
```

It supports multiple filters (separated by a comma):
```
curl -XGET 'localhost:9200/_mapping?pretty&filter_path=*.mappings.*.properties.name,*.mappings.*.properties.title'
```

It also supports the YAML response format (here inferred from the YAML request body). In this example it returns only the `_id` field of a newly indexed document:
```
curl -XPOST 'localhost:9200/library/book?filter_path=_id' -d $'---\nhello:\n  world: 1\n'
---
_id: "AU0j64-b-stVfkvus5-A"
```

It also supports wildcards. Here it returns only the host name of every node in the cluster:
```
curl -XGET 'http://localhost:9200/_nodes/stats?filter_path=nodes.*.host*'
{"nodes":{"lvJHed8uQQu4brS-SXKsNA":{"host":"portable"}}}
```

And "**" can be used to include sub-fields without knowing their exact path. Here it returns only the Lucene version of every segment:
```
curl 'http://localhost:9200/_segments?pretty&filter_path=indices.**.version'
{
  "indices" : {
    "beer" : {
      "shards" : {
        "0" : [ {
          "segments" : {
            "_0" : {
              "version" : "5.2.0"
            },
            "_1" : {
              "version" : "5.2.0"
            }
          }
        } ]
      }
    }
  }
}
```

Note that elasticsearch sometimes returns the raw value of a field directly, like the _source field. If you want to filter _source fields, consider combining the existing _source parameter (see the Get API for more details) with the filter_path parameter, like this:

```
curl -XGET 'localhost:9200/_search?pretty&filter_path=hits.hits._source&_source=title'
{
  "hits" : {
    "hits" : [ {
      "_source":{"title":"Book #2"}
    }, {
      "_source":{"title":"Book #1"}
    }, {
      "_source":{"title":"Book #3"}
    } ]
  }
}
```
Tanguy Leroux 2015-05-05 14:11:05 +02:00
parent 543f572d80
commit ce63590bd6
31 changed files with 1986 additions and 66 deletions

View File

@ -81,6 +81,113 @@ being consumed by a monitoring tool, rather than intended for human
consumption. The default for the `human` flag is
`false`.
[float]
=== Response Filtering
All REST APIs accept a `filter_path` parameter that can be used to reduce
the response returned by elasticsearch. This parameter takes a comma
separated list of filters expressed with the dot notation:
[source,sh]
--------------------------------------------------
curl -XGET 'localhost:9200/_search?pretty&filter_path=took,hits.hits._id,hits.hits._score'
{
"took" : 3,
"hits" : {
"hits" : [
{
"_id" : "3640",
"_score" : 1.0
},
{
"_id" : "3642",
"_score" : 1.0
}
]
}
}
--------------------------------------------------
It also supports the `*` wildcard character to match any field or part
of a field's name:
[source,sh]
--------------------------------------------------
curl -XGET 'localhost:9200/_nodes/stats?filter_path=nodes.*.ho*'
{
"nodes" : {
"lvJHed8uQQu4brS-SXKsNA" : {
"host" : "portable"
}
}
}
--------------------------------------------------
And the `**` wildcard can be used to include fields without knowing the
exact path of the field. For example, we can return the Lucene version
of every segment with this request:
[source,sh]
--------------------------------------------------
curl 'localhost:9200/_segments?pretty&filter_path=indices.**.version'
{
"indices" : {
"movies" : {
"shards" : {
"0" : [ {
"segments" : {
"_0" : {
"version" : "5.2.0"
}
}
} ],
"2" : [ {
"segments" : {
"_0" : {
"version" : "5.2.0"
}
}
} ]
}
},
"books" : {
"shards" : {
"0" : [ {
"segments" : {
"_0" : {
"version" : "5.2.0"
}
}
} ]
}
}
}
}
--------------------------------------------------
Note that elasticsearch sometimes returns directly the raw value of a field,
like the `_source` field. If you want to filter _source fields, you should
consider combining the already existing `_source` parameter (see
<<get-source-filtering,Get API>> for more details) with the `filter_path`
parameter like this:
[source,sh]
--------------------------------------------------
curl -XGET 'localhost:9200/_search?pretty&filter_path=hits.hits._source&_source=title'
{
"hits" : {
"hits" : [ {
"_source":{"title":"Book #2"}
}, {
"_source":{"title":"Book #1"}
}, {
"_source":{"title":"Book #3"}
} ]
}
}
--------------------------------------------------
[float]
=== Flat Settings

View File

@ -56,6 +56,10 @@
"options" : ["node", "indices", "shards"],
"default" : "node"
},
"filter_path": {
"type" : "list",
"description" : "A comma-separated list of fields to include in the returned response"
},
"types" : {
"type" : "list",
"description" : "A comma-separated list of document types for the `indexing` index metric"

View File

@ -72,6 +72,10 @@
"type" : "boolean",
"description" : "Specify whether query terms should be lowercased"
},
"filter_path": {
"type" : "list",
"description" : "A comma-separated list of fields to include in the returned response"
},
"preference": {
"type" : "string",
"description" : "Specify the node or shard the operation should be performed on (default: random)"

View File

@ -0,0 +1,154 @@
---
"Nodes Stats with response filtering":
- do:
cluster.state: {}
# Get master node id
- set: { master_node: master }
# Nodes Stats with no filtering
- do:
nodes.stats: {}
- is_true: cluster_name
- is_true: nodes
- is_true: nodes.$master.name
- is_true: nodes.$master.indices
- is_true: nodes.$master.indices.docs
- gte: { nodes.$master.indices.docs.count: 0 }
- is_true: nodes.$master.indices.segments
- gte: { nodes.$master.indices.segments.count: 0 }
- is_true: nodes.$master.jvm
- is_true: nodes.$master.jvm.threads
- gte: { nodes.$master.jvm.threads.count: 0 }
- is_true: nodes.$master.jvm.buffer_pools.direct
- gte: { nodes.$master.jvm.buffer_pools.direct.count: 0 }
- gte: { nodes.$master.jvm.buffer_pools.direct.used_in_bytes: 0 }
# Nodes Stats with only "cluster_name" field
- do:
nodes.stats:
filter_path: cluster_name
- is_true: cluster_name
- is_false: nodes
- is_false: nodes.$master.name
- is_false: nodes.$master.indices
- is_false: nodes.$master.jvm
# Nodes Stats with "nodes" field and sub-fields
- do:
nodes.stats:
filter_path: nodes.*
- is_false: cluster_name
- is_true: nodes
- is_true: nodes.$master.name
- is_true: nodes.$master.indices
- is_true: nodes.$master.indices.docs
- gte: { nodes.$master.indices.docs.count: 0 }
- is_true: nodes.$master.indices.segments
- gte: { nodes.$master.indices.segments.count: 0 }
- is_true: nodes.$master.jvm
- is_true: nodes.$master.jvm.threads
- gte: { nodes.$master.jvm.threads.count: 0 }
- is_true: nodes.$master.jvm.buffer_pools.direct
- gte: { nodes.$master.jvm.buffer_pools.direct.count: 0 }
- gte: { nodes.$master.jvm.buffer_pools.direct.used_in_bytes: 0 }
# Nodes Stats with "nodes.*.indices" field and sub-fields
- do:
nodes.stats:
filter_path: nodes.*.indices
- is_false: cluster_name
- is_true: nodes
- is_false: nodes.$master.name
- is_true: nodes.$master.indices
- is_true: nodes.$master.indices.docs
- gte: { nodes.$master.indices.docs.count: 0 }
- is_true: nodes.$master.indices.segments
- gte: { nodes.$master.indices.segments.count: 0 }
- is_false: nodes.$master.jvm
# Nodes Stats with "nodes.*.name" and "nodes.*.indices.docs.count" fields
- do:
nodes.stats:
filter_path: [ "nodes.*.name", "nodes.*.indices.docs.count" ]
- is_false: cluster_name
- is_true: nodes
- is_true: nodes.$master.name
- is_true: nodes.$master.indices
- is_true: nodes.$master.indices.docs
- gte: { nodes.$master.indices.docs.count: 0 }
- is_false: nodes.$master.indices.segments
- is_false: nodes.$master.jvm
# Nodes Stats with all "count" fields
- do:
nodes.stats:
filter_path: "nodes.**.count"
- is_false: cluster_name
- is_true: nodes
- is_false: nodes.$master.name
- is_true: nodes.$master.indices
- is_true: nodes.$master.indices.docs
- gte: { nodes.$master.indices.docs.count: 0 }
- is_true: nodes.$master.indices.segments
- gte: { nodes.$master.indices.segments.count: 0 }
- is_true: nodes.$master.jvm
- is_true: nodes.$master.jvm.threads
- gte: { nodes.$master.jvm.threads.count: 0 }
- is_true: nodes.$master.jvm.buffer_pools.direct
- gte: { nodes.$master.jvm.buffer_pools.direct.count: 0 }
- is_false: nodes.$master.jvm.buffer_pools.direct.used_in_bytes
# Nodes Stats with all "count" fields in sub-fields of "jvm" field
- do:
nodes.stats:
filter_path: "nodes.**.jvm.**.count"
- is_false: cluster_name
- is_true: nodes
- is_false: nodes.$master.name
- is_false: nodes.$master.indices
- is_false: nodes.$master.indices.docs.count
- is_false: nodes.$master.indices.segments.count
- is_true: nodes.$master.jvm
- is_true: nodes.$master.jvm.threads
- gte: { nodes.$master.jvm.threads.count: 0 }
- is_true: nodes.$master.jvm.buffer_pools.direct
- gte: { nodes.$master.jvm.buffer_pools.direct.count: 0 }
- is_false: nodes.$master.jvm.buffer_pools.direct.used_in_bytes
# Nodes Stats with "nodes.*.fs.data" fields
- do:
nodes.stats:
filter_path: "nodes.*.fs.data"
- is_false: cluster_name
- is_true: nodes
- is_false: nodes.$master.name
- is_false: nodes.$master.indices
- is_false: nodes.$master.jvm
- is_true: nodes.$master.fs.data
- is_true: nodes.$master.fs.data.0.path
- is_true: nodes.$master.fs.data.0.type
- is_true: nodes.$master.fs.data.0.total_in_bytes
# Nodes Stats with "nodes.*.fs.data.t*" fields
- do:
nodes.stats:
filter_path: "nodes.*.fs.data.t*"
- is_false: cluster_name
- is_true: nodes
- is_false: nodes.$master.name
- is_false: nodes.$master.indices
- is_false: nodes.$master.jvm
- is_true: nodes.$master.fs.data
- is_false: nodes.$master.fs.data.0.path
- is_true: nodes.$master.fs.data.0.type
- is_true: nodes.$master.fs.data.0.total_in_bytes

View File

@ -0,0 +1,87 @@
---
"Search with response filtering":
- do:
indices.create:
index: test
- do:
index:
index: test
type: test
id: 1
body: { foo: bar }
- do:
index:
index: test
type: test
id: 2
body: { foo: bar }
- do:
indices.refresh:
index: [test]
- do:
search:
index: test
filter_path: "*"
body: "{ query: { match_all: {} } }"
- is_true: took
- is_true: _shards.total
- is_true: hits.total
- is_true: hits.hits.0._index
- is_true: hits.hits.0._type
- is_true: hits.hits.0._id
- is_true: hits.hits.1._index
- is_true: hits.hits.1._type
- is_true: hits.hits.1._id
- do:
search:
index: test
filter_path: "took"
body: "{ query: { match_all: {} } }"
- is_true: took
- is_false: _shards.total
- is_false: hits.total
- is_false: hits.hits.0._index
- is_false: hits.hits.0._type
- is_false: hits.hits.0._id
- is_false: hits.hits.1._index
- is_false: hits.hits.1._type
- is_false: hits.hits.1._id
- do:
search:
index: test
filter_path: "_shards.*"
body: "{ query: { match_all: {} } }"
- is_false: took
- is_true: _shards.total
- is_false: hits.total
- is_false: hits.hits.0._index
- is_false: hits.hits.0._type
- is_false: hits.hits.0._id
- is_false: hits.hits.1._index
- is_false: hits.hits.1._type
- is_false: hits.hits.1._id
- do:
search:
index: test
filter_path: [ "hits.**._i*", "**.total" ]
body: "{ query: { match_all: {} } }"
- is_false: took
- is_true: _shards.total
- is_true: hits.total
- is_true: hits.hits.0._index
- is_false: hits.hits.0._type
- is_true: hits.hits.0._id
- is_true: hits.hits.1._index
- is_false: hits.hits.1._type
- is_true: hits.hits.1._id

View File

@ -40,6 +40,11 @@ public interface XContent {
*/
XContentGenerator createGenerator(OutputStream os) throws IOException;
/**
* Creates a new generator using the provided output stream and some filters.
*/
XContentGenerator createGenerator(OutputStream os, String[] filters) throws IOException;
/**
* Creates a new generator using the provided writer.
*/

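The new `createGenerator(OutputStream, String[])` overload is the entry point for filtered output. Below is a minimal sketch of how it might be called, assuming the Elasticsearch internal XContent APIs of this era; the `"title"` filter and the document fields are illustrative, not taken from the change:

```java
import org.elasticsearch.common.xcontent.XContent;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentGenerator;
import org.elasticsearch.common.xcontent.XContentType;

import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class FilteredGeneratorSketch {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        XContent xContent = XContentFactory.xContent(XContentType.JSON);

        // Only fields matching the "title" filter end up in the stream.
        XContentGenerator generator = xContent.createGenerator(out, new String[]{"title"});
        generator.writeStartObject();
        generator.writeFieldName("title");
        generator.writeString("My awesome book");
        generator.writeFieldName("pages");   // filtered out together with its value
        generator.writeNumber(456);
        generator.writeEndObject();
        generator.close();

        System.out.println(out.toString("UTF-8"));   // expected: {"title":"My awesome book"}
    }
}
```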
View File

@ -77,6 +77,10 @@ public final class XContentBuilder implements BytesStream, Releasable {
return new XContentBuilder(xContent, new BytesStreamOutput());
}
public static XContentBuilder builder(XContent xContent, String[] filters) throws IOException {
return new XContentBuilder(xContent, new BytesStreamOutput(), filters);
}
private XContentGenerator generator;
private final OutputStream bos;
@ -92,8 +96,17 @@ public final class XContentBuilder implements BytesStream, Releasable {
* to call {@link #close()} when the builder is done with.
*/
public XContentBuilder(XContent xContent, OutputStream bos) throws IOException {
this(xContent, bos, null);
}
/**
* Constructs a new builder using the provided xcontent, an OutputStream and some filters. The
* filters are used to filter fields that won't be written to the OutputStream. Make sure
* to call {@link #close()} when the builder is done with.
*/
public XContentBuilder(XContent xContent, OutputStream bos, String[] filters) throws IOException {
this.bos = bos;
this.generator = xContent.createGenerator(bos);
this.generator = xContent.createGenerator(bos, filters);
}
public XContentBuilder fieldCaseConversion(FieldCaseConversion fieldCaseConversion) {

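Since `XContentBuilder.builder(XContent, String[])` and the new constructor are the pieces most response code goes through, here is a hedged usage sketch built only on methods visible in this diff plus the long-standing fluent builder API; the filter and field values are illustrative:

```java
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.json.JsonXContent;

import java.io.IOException;

public class FilteredBuilderSketch {
    public static void main(String[] args) throws IOException {
        // Build a JSON document but keep only the "title" field.
        XContentBuilder builder = XContentBuilder.builder(JsonXContent.jsonXContent, new String[]{"title"});
        builder.startObject()
               .field("title", "My awesome book")
               .field("pages", 456)              // dropped by the filtering generator
               .endObject();

        builder.close();                          // flush the underlying generator
        System.out.println(builder.string());     // expected: {"title":"My awesome book"}
    }
}
```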
View File

@ -20,11 +20,15 @@
package org.elasticsearch.common.xcontent.cbor;
import com.fasterxml.jackson.core.JsonEncoding;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.dataformat.cbor.CBORFactory;
import org.elasticsearch.ElasticsearchParseException;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.io.FastStringReader;
import org.elasticsearch.common.util.CollectionUtils;
import org.elasticsearch.common.xcontent.*;
import org.elasticsearch.common.xcontent.json.BaseJsonGenerator;
import org.elasticsearch.common.xcontent.support.filtering.FilteringJsonGenerator;
import java.io.*;
@ -59,14 +63,27 @@ public class CborXContent implements XContent {
throw new ElasticsearchParseException("cbor does not support stream parsing...");
}
private XContentGenerator newXContentGenerator(JsonGenerator jsonGenerator) {
return new CborXContentGenerator(new BaseJsonGenerator(jsonGenerator));
}
@Override
public XContentGenerator createGenerator(OutputStream os) throws IOException {
return new CborXContentGenerator(cborFactory.createGenerator(os, JsonEncoding.UTF8));
return newXContentGenerator(cborFactory.createGenerator(os, JsonEncoding.UTF8));
}
@Override
public XContentGenerator createGenerator(OutputStream os, String[] filters) throws IOException {
if (CollectionUtils.isEmpty(filters)) {
return createGenerator(os);
}
FilteringJsonGenerator cborGenerator = new FilteringJsonGenerator(cborFactory.createGenerator(os, JsonEncoding.UTF8), filters);
return new CborXContentGenerator(cborGenerator);
}
@Override
public XContentGenerator createGenerator(Writer writer) throws IOException {
return new CborXContentGenerator(cborFactory.createGenerator(writer));
return newXContentGenerator(cborFactory.createGenerator(writer));
}
@Override

View File

@ -19,10 +19,10 @@
package org.elasticsearch.common.xcontent.cbor;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.dataformat.cbor.CBORParser;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.common.xcontent.json.BaseJsonGenerator;
import org.elasticsearch.common.xcontent.json.JsonXContentGenerator;
import java.io.IOException;
@ -34,7 +34,7 @@ import java.io.OutputStream;
*/
public class CborXContentGenerator extends JsonXContentGenerator {
public CborXContentGenerator(JsonGenerator generator) {
public CborXContentGenerator(BaseJsonGenerator generator) {
super(generator);
}

View File

@ -0,0 +1,80 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.common.xcontent.json;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.base.GeneratorBase;
import com.fasterxml.jackson.core.util.JsonGeneratorDelegate;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.io.Streams;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
public class BaseJsonGenerator extends JsonGeneratorDelegate {
protected final GeneratorBase base;
public BaseJsonGenerator(JsonGenerator generator, JsonGenerator base) {
super(generator, true);
if (base instanceof GeneratorBase) {
this.base = (GeneratorBase) base;
} else {
this.base = null;
}
}
public BaseJsonGenerator(JsonGenerator generator) {
this(generator, generator);
}
protected void writeStartRaw(String fieldName) throws IOException {
writeFieldName(fieldName);
writeRaw(':');
}
public void writeEndRaw() {
assert base != null : "JsonGenerator should be of instance GeneratorBase but was: " + delegate.getClass();
if (base != null) {
base.getOutputContext().writeValue();
}
}
protected void writeRawValue(byte[] content, OutputStream bos) throws IOException {
flush();
bos.write(content);
}
protected void writeRawValue(byte[] content, int offset, int length, OutputStream bos) throws IOException {
flush();
bos.write(content, offset, length);
}
protected void writeRawValue(InputStream content, OutputStream bos) throws IOException {
flush();
Streams.copy(content, bos);
}
protected void writeRawValue(BytesReference content, OutputStream bos) throws IOException {
flush();
content.writeTo(bos);
}
}

View File

@ -25,7 +25,9 @@ import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.JsonParser;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.io.FastStringReader;
import org.elasticsearch.common.util.CollectionUtils;
import org.elasticsearch.common.xcontent.*;
import org.elasticsearch.common.xcontent.support.filtering.FilteringJsonGenerator;
import java.io.*;
@ -63,14 +65,27 @@ public class JsonXContent implements XContent {
return '\n';
}
private XContentGenerator newXContentGenerator(JsonGenerator jsonGenerator) {
return new JsonXContentGenerator(new BaseJsonGenerator(jsonGenerator));
}
@Override
public XContentGenerator createGenerator(OutputStream os) throws IOException {
return new JsonXContentGenerator(jsonFactory.createGenerator(os, JsonEncoding.UTF8));
return newXContentGenerator(jsonFactory.createGenerator(os, JsonEncoding.UTF8));
}
@Override
public XContentGenerator createGenerator(OutputStream os, String[] filters) throws IOException {
if (CollectionUtils.isEmpty(filters)) {
return createGenerator(os);
}
FilteringJsonGenerator jsonGenerator = new FilteringJsonGenerator(jsonFactory.createGenerator(os, JsonEncoding.UTF8), filters);
return new JsonXContentGenerator(jsonGenerator);
}
@Override
public XContentGenerator createGenerator(Writer writer) throws IOException {
return new JsonXContentGenerator(jsonFactory.createGenerator(writer));
return newXContentGenerator(jsonFactory.createGenerator(writer));
}
@Override

View File

@ -19,11 +19,8 @@
package org.elasticsearch.common.xcontent.json;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.base.GeneratorBase;
import com.fasterxml.jackson.core.io.SerializedString;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.io.Streams;
import org.elasticsearch.common.xcontent.*;
import java.io.IOException;
@ -35,18 +32,11 @@ import java.io.OutputStream;
*/
public class JsonXContentGenerator implements XContentGenerator {
protected final JsonGenerator generator;
protected final BaseJsonGenerator generator;
private boolean writeLineFeedAtEnd;
private final GeneratorBase base;
public JsonXContentGenerator(JsonGenerator generator) {
public JsonXContentGenerator(BaseJsonGenerator generator) {
this.generator = generator;
if (generator instanceof GeneratorBase) {
base = (GeneratorBase) generator;
} else {
base = null;
}
}
@Override
@ -261,29 +251,23 @@ public class JsonXContentGenerator implements XContentGenerator {
@Override
public void writeRawField(String fieldName, byte[] content, OutputStream bos) throws IOException {
generator.writeFieldName(fieldName);
generator.writeRaw(':');
flush();
bos.write(content);
finishWriteRaw();
generator.writeStartRaw(fieldName);
generator.writeRawValue(content, bos);
generator.writeEndRaw();
}
@Override
public void writeRawField(String fieldName, byte[] content, int offset, int length, OutputStream bos) throws IOException {
generator.writeFieldName(fieldName);
generator.writeRaw(':');
flush();
bos.write(content, offset, length);
finishWriteRaw();
generator.writeStartRaw(fieldName);
generator.writeRawValue(content, offset, length, bos);
generator.writeEndRaw();
}
@Override
public void writeRawField(String fieldName, InputStream content, OutputStream bos) throws IOException {
generator.writeFieldName(fieldName);
generator.writeRaw(':');
flush();
Streams.copy(content, bos);
finishWriteRaw();
generator.writeStartRaw(fieldName);
generator.writeRawValue(content, bos);
generator.writeEndRaw();
}
@Override
@ -308,18 +292,9 @@ public class JsonXContentGenerator implements XContentGenerator {
}
protected void writeObjectRaw(String fieldName, BytesReference content, OutputStream bos) throws IOException {
generator.writeFieldName(fieldName);
generator.writeRaw(':');
flush();
content.writeTo(bos);
finishWriteRaw();
}
private void finishWriteRaw() {
assert base != null : "JsonGenerator should be of instance GeneratorBase but was: " + generator.getClass();
if (base != null) {
base.getOutputContext().writeValue();
}
generator.writeStartRaw(fieldName);
generator.writeRawValue(content, bos);
generator.writeEndRaw();
}
@Override

View File

@ -20,13 +20,15 @@
package org.elasticsearch.common.xcontent.smile;
import com.fasterxml.jackson.core.JsonEncoding;
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.dataformat.smile.SmileFactory;
import com.fasterxml.jackson.dataformat.smile.SmileGenerator;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.io.FastStringReader;
import org.elasticsearch.common.util.CollectionUtils;
import org.elasticsearch.common.xcontent.*;
import org.elasticsearch.common.xcontent.json.JsonXContentParser;
import org.elasticsearch.common.xcontent.json.BaseJsonGenerator;
import org.elasticsearch.common.xcontent.support.filtering.FilteringJsonGenerator;
import java.io.*;
@ -62,14 +64,27 @@ public class SmileXContent implements XContent {
return (byte) 0xFF;
}
private XContentGenerator newXContentGenerator(JsonGenerator jsonGenerator) {
return new SmileXContentGenerator(new BaseJsonGenerator(jsonGenerator));
}
@Override
public XContentGenerator createGenerator(OutputStream os) throws IOException {
return new SmileXContentGenerator(smileFactory.createGenerator(os, JsonEncoding.UTF8));
return newXContentGenerator(smileFactory.createGenerator(os, JsonEncoding.UTF8));
}
@Override
public XContentGenerator createGenerator(OutputStream os, String[] filters) throws IOException {
if (CollectionUtils.isEmpty(filters)) {
return createGenerator(os);
}
FilteringJsonGenerator smileGenerator = new FilteringJsonGenerator(smileFactory.createGenerator(os, JsonEncoding.UTF8), filters);
return new SmileXContentGenerator(smileGenerator);
}
@Override
public XContentGenerator createGenerator(Writer writer) throws IOException {
return new SmileXContentGenerator(smileFactory.createGenerator(writer));
return newXContentGenerator(smileFactory.createGenerator(writer));
}
@Override

View File

@ -19,10 +19,10 @@
package org.elasticsearch.common.xcontent.smile;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.dataformat.smile.SmileParser;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.common.xcontent.json.BaseJsonGenerator;
import org.elasticsearch.common.xcontent.json.JsonXContentGenerator;
import java.io.IOException;
@ -34,7 +34,7 @@ import java.io.OutputStream;
*/
public class SmileXContentGenerator extends JsonXContentGenerator {
public SmileXContentGenerator(JsonGenerator generator) {
public SmileXContentGenerator(BaseJsonGenerator generator) {
super(generator);
}

View File

@ -0,0 +1,225 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.common.xcontent.support.filtering;
import com.fasterxml.jackson.core.JsonGenerator;
import org.elasticsearch.common.regex.Regex;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
/**
* A FilterContext contains the description of a field about to be written by a JsonGenerator.
*/
public class FilterContext {
/**
* The field/property name to be written
*/
private String property;
/**
* List of XContentFilter matched by the current filtering context
*/
private List<String[]> matchings;
/**
* Flag to indicate if the field/property must be written
*/
private Boolean write = null;
/**
* Flag to indicate if the field/property match a filter
*/
private boolean match = false;
/**
* Points to the parent context
*/
private FilterContext parent;
/**
* Type of the field/property
*/
private Type type = Type.VALUE;
protected enum Type {
VALUE,
OBJECT,
ARRAY,
ARRAY_OF_OBJECT
}
public FilterContext(String property, FilterContext parent) {
this.property = property;
this.parent = parent;
}
public void reset(String property) {
this.property = property;
this.write = null;
if (matchings != null) {
matchings.clear();
}
this.match = false;
this.type = Type.VALUE;
}
public void reset(String property, FilterContext parent) {
reset(property);
this.parent = parent;
if (parent.isMatch()) {
match = true;
}
}
public FilterContext parent() {
return parent;
}
public List<String[]> matchings() {
return matchings;
}
public void addMatching(String[] matching) {
if (matchings == null) {
matchings = new ArrayList<>();
}
matchings.add(matching);
}
public boolean isRoot() {
return parent == null;
}
public boolean isArray() {
return Type.ARRAY.equals(type);
}
public void initArray() {
this.type = Type.ARRAY;
}
public boolean isObject() {
return Type.OBJECT.equals(type);
}
public void initObject() {
this.type = Type.OBJECT;
}
public boolean isArrayOfObject() {
return Type.ARRAY_OF_OBJECT.equals(type);
}
public void initArrayOfObject() {
this.type = Type.ARRAY_OF_OBJECT;
}
public boolean isMatch() {
return match;
}
/**
* This method contains the logic to check if a field/property must be included
* or not.
*/
public boolean include() {
if (write == null) {
if (parent != null) {
// the parent context matches the end of a filter list:
// by default we include all the sub properties so we
// don't need to check if the sub properties also match
if (parent.isMatch()) {
write = true;
match = true;
return write;
}
if (parent.matchings() != null) {
// Iterates over the filters matched by the parent context
// and checks if the current context also match
for (String[] matcher : parent.matchings()) {
if (matcher.length > 0) {
String field = matcher[0];
if ("**".equals(field)) {
addMatching(matcher);
}
if ((field != null) && (Regex.simpleMatch(field, property))) {
int remaining = matcher.length - 1;
// the current context matches the end of a filter list:
// it must be written and it is flagged as a direct match
if (remaining == 0) {
write = true;
match = true;
return write;
} else {
String[] submatching = new String[remaining];
System.arraycopy(matcher, 1, submatching, 0, remaining);
addMatching(submatching);
}
}
}
}
}
} else {
// Root object is always written
write = true;
}
if (write == null) {
write = false;
}
}
return write;
}
/**
* Ensure that the full path to the current field is written by the JsonGenerator
*
* @param generator
* @throws IOException
*/
public void writePath(JsonGenerator generator) throws IOException {
if (parent != null) {
parent.writePath(generator);
}
if ((write == null) || (!write)) {
write = true;
if (property == null) {
generator.writeStartObject();
} else {
generator.writeFieldName(property);
if (isArray()) {
generator.writeStartArray();
} else if (isObject() || isArrayOfObject()) {
generator.writeStartObject();
}
}
}
}
}

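The matching in include() is driven by Regex.simpleMatch over one path element at a time, with "**" carried along as a catch-all. A small standalone sketch of that element-level matching, using only the two helpers already referenced above; the filter string is illustrative:

```java
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.regex.Regex;

public class FilterMatchingSketch {
    public static void main(String[] args) {
        // FilteringJsonGenerator splits each filter on '.' into path elements...
        String[] matcher = Strings.delimitedListToStringArray("hits.hits._i*", ".");
        // -> ["hits", "hits", "_i*"]

        // ...and FilterContext.include() compares the current property name against
        // the head element with simple wildcard matching, one nesting level at a time.
        System.out.println(Regex.simpleMatch(matcher[2], "_id"));     // true
        System.out.println(Regex.simpleMatch(matcher[2], "_index"));  // true
        System.out.println(Regex.simpleMatch(matcher[2], "_score"));  // false

        // "**" is special-cased in include(): it stays in the matching list so it
        // can match fields at any depth below the current context.
    }
}
```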
View File

@ -0,0 +1,423 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.common.xcontent.support.filtering;
import com.fasterxml.jackson.core.Base64Variant;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.SerializableString;
import com.google.common.collect.ImmutableList;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.xcontent.json.BaseJsonGenerator;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.ArrayDeque;
import java.util.List;
import java.util.Queue;
/**
* A FilteringJsonGenerator uses antpath-like filters to include/exclude fields when writing XContent streams.
*
* When writing a XContent stream, this class instantiates (or reuses) a FilterContext instance for each
* field (or property) that must be generated. This filter context is used to check if the field/property must be
* written according to the current list of XContentFilter filters.
*/
public class FilteringJsonGenerator extends BaseJsonGenerator {
/**
* List of previous contexts
* (MAX_CONTEXTS contexts are kept around in order to be reused)
*/
private Queue<FilterContext> contexts = new ArrayDeque<>();
private static final int MAX_CONTEXTS = 10;
/**
* Current filter context
*/
private FilterContext context;
public FilteringJsonGenerator(JsonGenerator generator, String[] filters) {
super(generator);
ImmutableList.Builder<String[]> builder = ImmutableList.builder();
if (filters != null) {
for (String filter : filters) {
String[] matcher = Strings.delimitedListToStringArray(filter, ".");
if (matcher != null) {
builder.add(matcher);
}
}
}
// Creates a root context that matches all filtering rules
this.context = get(null, null, builder.build());
}
/**
* Get a new context instance (and reset it if needed)
*/
private FilterContext get(String property, FilterContext parent) {
FilterContext ctx = contexts.poll();
if (ctx == null) {
ctx = new FilterContext(property, parent);
} else {
ctx.reset(property, parent);
}
return ctx;
}
/**
* Get a new context instance (and reset it if needed)
*/
private FilterContext get(String property, FilterContext context, List<String[]> matchings) {
FilterContext ctx = get(property, context);
if (matchings != null) {
for (String[] matching : matchings) {
ctx.addMatching(matching);
}
}
return ctx;
}
/**
* Adds a context instance to the pool in order to reuse it if needed
*/
private void put(FilterContext ctx) {
if (contexts.size() <= MAX_CONTEXTS) {
contexts.offer(ctx);
}
}
@Override
public void writeStartArray() throws IOException {
context.initArray();
if (context.include()) {
super.writeStartArray();
}
}
@Override
public void writeStartArray(int size) throws IOException {
context.initArray();
if (context.include()) {
super.writeStartArray(size);
}
}
@Override
public void writeEndArray() throws IOException {
// Case of array of objects
if (context.isArrayOfObject()) {
// Release current context and go one level up
FilterContext parent = context.parent();
put(context);
context = parent;
}
if (context.include()) {
super.writeEndArray();
}
}
@Override
public void writeStartObject() throws IOException {
// Case of array of objects
if (context.isArray()) {
// Get a context for the anonymous object
context = get(null, context, context.matchings());
context.initArrayOfObject();
}
if (!context.isArrayOfObject()) {
context.initObject();
}
if (context.include()) {
super.writeStartObject();
}
context = get(null, context);
}
@Override
public void writeEndObject() throws IOException {
if (!context.isRoot()) {
// Release current context and go one level up
FilterContext parent = context.parent();
put(context);
context = parent;
}
if (context.include()) {
super.writeEndObject();
}
}
@Override
public void writeFieldName(String name) throws IOException {
context.reset(name);
if (context.include()) {
// Ensure that the full path to the field is written
context.writePath(delegate);
super.writeFieldName(name);
}
}
@Override
public void writeFieldName(SerializableString name) throws IOException {
context.reset(name.getValue());
if (context.include()) {
// Ensure that the full path to the field is written
context.writePath(delegate);
super.writeFieldName(name);
}
}
@Override
public void writeString(String text) throws IOException {
if (context.include()) {
super.writeString(text);
}
}
@Override
public void writeString(char[] text, int offset, int len) throws IOException {
if (context.include()) {
super.writeString(text, offset, len);
}
}
@Override
public void writeString(SerializableString text) throws IOException {
if (context.include()) {
super.writeString(text);
}
}
@Override
public void writeRawUTF8String(byte[] text, int offset, int length) throws IOException {
if (context.include()) {
super.writeRawUTF8String(text, offset, length);
}
}
@Override
public void writeUTF8String(byte[] text, int offset, int length) throws IOException {
if (context.include()) {
super.writeUTF8String(text, offset, length);
}
}
@Override
public void writeRaw(String text) throws IOException {
if (context.include()) {
super.writeRaw(text);
}
}
@Override
public void writeRaw(String text, int offset, int len) throws IOException {
if (context.include()) {
super.writeRaw(text, offset, len);
}
}
@Override
public void writeRaw(SerializableString raw) throws IOException {
if (context.include()) {
super.writeRaw(raw);
}
}
@Override
public void writeRaw(char[] text, int offset, int len) throws IOException {
if (context.include()) {
super.writeRaw(text, offset, len);
}
}
@Override
public void writeRaw(char c) throws IOException {
if (context.include()) {
super.writeRaw(c);
}
}
@Override
public void writeRawValue(String text) throws IOException {
if (context.include()) {
super.writeRawValue(text);
}
}
@Override
public void writeRawValue(String text, int offset, int len) throws IOException {
if (context.include()) {
super.writeRawValue(text, offset, len);
}
}
@Override
public void writeRawValue(char[] text, int offset, int len) throws IOException {
if (context.include()) {
super.writeRawValue(text, offset, len);
}
}
@Override
public void writeBinary(Base64Variant b64variant, byte[] data, int offset, int len) throws IOException {
if (context.include()) {
super.writeBinary(b64variant, data, offset, len);
}
}
@Override
public int writeBinary(Base64Variant b64variant, InputStream data, int dataLength) throws IOException {
if (context.include()) {
return super.writeBinary(b64variant, data, dataLength);
}
return 0;
}
@Override
public void writeNumber(short v) throws IOException {
if (context.include()) {
super.writeNumber(v);
}
}
@Override
public void writeNumber(int v) throws IOException {
if (context.include()) {
super.writeNumber(v);
}
}
@Override
public void writeNumber(long v) throws IOException {
if (context.include()) {
super.writeNumber(v);
}
}
@Override
public void writeNumber(BigInteger v) throws IOException {
if (context.include()) {
super.writeNumber(v);
}
}
@Override
public void writeNumber(double v) throws IOException {
if (context.include()) {
super.writeNumber(v);
}
}
@Override
public void writeNumber(float v) throws IOException {
if (context.include()) {
super.writeNumber(v);
}
}
@Override
public void writeNumber(BigDecimal v) throws IOException {
if (context.include()) {
super.writeNumber(v);
}
}
@Override
public void writeNumber(String encodedValue) throws IOException, UnsupportedOperationException {
if (context.include()) {
super.writeNumber(encodedValue);
}
}
@Override
public void writeBoolean(boolean state) throws IOException {
if (context.include()) {
super.writeBoolean(state);
}
}
@Override
public void writeNull() throws IOException {
if (context.include()) {
super.writeNull();
}
}
@Override
public void copyCurrentEvent(JsonParser jp) throws IOException {
if (context.include()) {
super.copyCurrentEvent(jp);
}
}
@Override
public void copyCurrentStructure(JsonParser jp) throws IOException {
if (context.include()) {
super.copyCurrentStructure(jp);
}
}
@Override
protected void writeRawValue(byte[] content, OutputStream bos) throws IOException {
if (context.include()) {
super.writeRawValue(content, bos);
}
}
@Override
protected void writeRawValue(byte[] content, int offset, int length, OutputStream bos) throws IOException {
if (context.include()) {
super.writeRawValue(content, offset, length, bos);
}
}
@Override
protected void writeRawValue(InputStream content, OutputStream bos) throws IOException {
if (context.include()) {
super.writeRawValue(content, bos);
}
}
@Override
protected void writeRawValue(BytesReference content, OutputStream bos) throws IOException {
if (context.include()) {
super.writeRawValue(content, bos);
}
}
@Override
public void close() throws IOException {
contexts.clear();
super.close();
}
}

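To make the write path above concrete, here is a hedged end-to-end sketch that wraps a plain Jackson generator directly, which is what the XContent implementations in this commit do internally; the "properties.*" filter and the fields are illustrative:

```java
import com.fasterxml.jackson.core.JsonEncoding;
import com.fasterxml.jackson.core.JsonFactory;
import org.elasticsearch.common.xcontent.support.filtering.FilteringJsonGenerator;

import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class FilteringJsonGeneratorSketch {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();

        // Keep only the fields nested under "properties".
        FilteringJsonGenerator generator = new FilteringJsonGenerator(
                new JsonFactory().createGenerator(out, JsonEncoding.UTF8), new String[]{"properties.*"});

        generator.writeStartObject();
        generator.writeFieldName("title");          // no filter matches -> name and value are dropped
        generator.writeString("My awesome book");
        generator.writeFieldName("properties");     // partial match -> written lazily via writePath()
        generator.writeStartObject();
        generator.writeFieldName("weight");
        generator.writeNumber(0.8);
        generator.writeEndObject();
        generator.writeEndObject();
        generator.close();

        System.out.println(out.toString("UTF-8"));  // expected: {"properties":{"weight":0.8}}
    }
}
```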
View File

@ -20,11 +20,15 @@
package org.elasticsearch.common.xcontent.yaml;
import com.fasterxml.jackson.core.JsonEncoding;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;
import org.elasticsearch.ElasticsearchParseException;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.io.FastStringReader;
import org.elasticsearch.common.util.CollectionUtils;
import org.elasticsearch.common.xcontent.*;
import org.elasticsearch.common.xcontent.json.BaseJsonGenerator;
import org.elasticsearch.common.xcontent.support.filtering.FilteringJsonGenerator;
import java.io.*;
@ -58,14 +62,27 @@ public class YamlXContent implements XContent {
throw new ElasticsearchParseException("yaml does not support stream parsing...");
}
private XContentGenerator newXContentGenerator(JsonGenerator jsonGenerator) {
return new YamlXContentGenerator(new BaseJsonGenerator(jsonGenerator));
}
@Override
public XContentGenerator createGenerator(OutputStream os) throws IOException {
return new YamlXContentGenerator(yamlFactory.createGenerator(os, JsonEncoding.UTF8));
return newXContentGenerator(yamlFactory.createGenerator(os, JsonEncoding.UTF8));
}
@Override
public XContentGenerator createGenerator(OutputStream os, String[] filters) throws IOException {
if (CollectionUtils.isEmpty(filters)) {
return createGenerator(os);
}
FilteringJsonGenerator yamlGenerator = new FilteringJsonGenerator(yamlFactory.createGenerator(os, JsonEncoding.UTF8), filters);
return new YamlXContentGenerator(yamlGenerator);
}
@Override
public XContentGenerator createGenerator(Writer writer) throws IOException {
return new YamlXContentGenerator(yamlFactory.createGenerator(writer));
return newXContentGenerator(yamlFactory.createGenerator(writer));
}
@Override

View File

@ -19,10 +19,10 @@
package org.elasticsearch.common.xcontent.yaml;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.dataformat.yaml.YAMLParser;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.common.xcontent.json.BaseJsonGenerator;
import org.elasticsearch.common.xcontent.json.JsonXContentGenerator;
import java.io.IOException;
@ -34,7 +34,7 @@ import java.io.OutputStream;
*/
public class YamlXContentGenerator extends JsonXContentGenerator {
public YamlXContentGenerator(JsonGenerator generator) {
public YamlXContentGenerator(BaseJsonGenerator generator) {
super(generator);
}

View File

@ -115,7 +115,7 @@ public class BytesRestResponse extends RestResponse {
}
private static XContentBuilder convert(RestChannel channel, RestStatus status, Throwable t) throws IOException {
XContentBuilder builder = channel.newBuilder().startObject();
XContentBuilder builder = channel.newErrorBuilder().startObject();
if (t == null) {
builder.field("error", "unknown");
} else if (channel.detailedErrorsEnabled()) {

View File

@ -44,10 +44,15 @@ public abstract class RestChannel {
}
public XContentBuilder newBuilder() throws IOException {
return newBuilder(request.hasContent() ? request.content() : null);
return newBuilder(request.hasContent() ? request.content() : null, request.hasParam("filter_path"));
}
public XContentBuilder newBuilder(@Nullable BytesReference autoDetectSource) throws IOException {
public XContentBuilder newErrorBuilder() throws IOException {
// Disable filtering when building error responses
return newBuilder(request.hasContent() ? request.content() : null, false);
}
public XContentBuilder newBuilder(@Nullable BytesReference autoDetectSource, boolean useFiltering) throws IOException {
XContentType contentType = XContentType.fromRestContentType(request.param("format", request.header("Content-Type")));
if (contentType == null) {
// try and guess it from the auto detect source
@ -59,7 +64,9 @@ public abstract class RestChannel {
// default to JSON
contentType = XContentType.JSON;
}
XContentBuilder builder = new XContentBuilder(XContentFactory.xContent(contentType), bytesOutput());
String[] filters = useFiltering ? request.paramAsStringArrayOrEmptyIfAll("filter_path") : null;
XContentBuilder builder = new XContentBuilder(XContentFactory.xContent(contentType), bytesOutput(), filters);
if (request.paramAsBoolean("pretty", false)) {
builder.prettyPrint().lfAtEnd();
}

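The practical consequence of this RestChannel change is that any handler which builds its response through channel.newBuilder() picks up filter_path support without further changes, while error paths opt out via newErrorBuilder(). A hedged sketch of a handler-side helper; the helper name and the "acknowledged" field are illustrative:

```java
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.rest.BytesRestResponse;
import org.elasticsearch.rest.RestChannel;
import org.elasticsearch.rest.RestStatus;

import java.io.IOException;

public class FilteredResponseSketch {

    // Hypothetical helper: newBuilder() reads the request's "filter_path" parameter itself,
    // so the response below is automatically reduced when the client asked for it.
    static void sendAcknowledged(RestChannel channel) throws IOException {
        XContentBuilder builder = channel.newBuilder();
        builder.startObject().field("acknowledged", true).endObject();
        channel.sendResponse(new BytesRestResponse(RestStatus.OK, builder));
    }
}
```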
View File

@ -187,7 +187,7 @@ public class RestController extends AbstractLifecycleComponent<RestController> {
// error_trace cannot be used when we disable detailed errors
if (channel.detailedErrorsEnabled() == false && request.paramAsBoolean("error_trace", false)) {
try {
XContentBuilder builder = channel.newBuilder();
XContentBuilder builder = channel.newErrorBuilder();
builder.startObject().field("error","error traces in responses are disabled.").endObject().string();
RestResponse response = new BytesRestResponse(BAD_REQUEST, builder);
response.addHeader("Content-Type", "application/json");

View File

@ -73,7 +73,7 @@ public class RestGetSourceAction extends BaseRestHandler {
client.get(getRequest, new RestResponseListener<GetResponse>(channel) {
@Override
public RestResponse buildResponse(GetResponse response) throws Exception {
XContentBuilder builder = channel.newBuilder(response.getSourceInternal());
XContentBuilder builder = channel.newBuilder(response.getSourceInternal(), false);
if (!response.isExists()) {
return new BytesRestResponse(NOT_FOUND, builder);
} else {

View File

@ -88,7 +88,7 @@ public class RestIndexAction extends BaseRestHandler {
indexRequest.opType(IndexRequest.OpType.fromString(sOpType));
} catch (IllegalArgumentException eia){
try {
XContentBuilder builder = channel.newBuilder();
XContentBuilder builder = channel.newErrorBuilder();
channel.sendResponse(new BytesRestResponse(BAD_REQUEST, builder.startObject().field("error", eia.getMessage()).endObject()));
} catch (IOException e1) {
logger.warn("Failed to send response", e1);

View File

@ -85,7 +85,7 @@ public class RestPutIndexedScriptAction extends BaseRestHandler {
putRequest.opType(IndexRequest.OpType.fromString(sOpType));
} catch (IllegalArgumentException eia){
try {
XContentBuilder builder = channel.newBuilder();
XContentBuilder builder = channel.newErrorBuilder();
channel.sendResponse(new BytesRestResponse(BAD_REQUEST, builder.startObject().field("error", eia.getMessage()).endObject()));
return;
} catch (IOException e1) {

View File

@ -0,0 +1,524 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.common.xcontent.support.filtering;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.xcontent.*;
import org.elasticsearch.test.ElasticsearchTestCase;
import org.junit.Test;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import static org.hamcrest.CoreMatchers.is;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.nullValue;
public abstract class AbstractFilteringJsonGeneratorTests extends ElasticsearchTestCase {
protected abstract XContentType getXContentType();
protected abstract void assertXContentBuilder(XContentBuilder expected, XContentBuilder builder);
protected void assertString(XContentBuilder expected, XContentBuilder builder) {
assertNotNull(builder);
assertNotNull(expected);
// Verify that the result is equal to the expected string
assertThat(builder.bytes().toUtf8(), is(expected.bytes().toUtf8()));
}
protected void assertBinary(XContentBuilder expected, XContentBuilder builder) {
assertNotNull(builder);
assertNotNull(expected);
try {
XContent xContent = XContentFactory.xContent(builder.contentType());
XContentParser jsonParser = xContent.createParser(expected.bytes());
XContentParser testParser = xContent.createParser(builder.bytes());
while (true) {
XContentParser.Token token1 = jsonParser.nextToken();
XContentParser.Token token2 = testParser.nextToken();
if (token1 == null) {
assertThat(token2, nullValue());
return;
}
assertThat(token1, equalTo(token2));
switch (token1) {
case FIELD_NAME:
assertThat(jsonParser.currentName(), equalTo(testParser.currentName()));
break;
case VALUE_STRING:
assertThat(jsonParser.text(), equalTo(testParser.text()));
break;
case VALUE_NUMBER:
assertThat(jsonParser.numberType(), equalTo(testParser.numberType()));
assertThat(jsonParser.numberValue(), equalTo(testParser.numberValue()));
break;
}
}
} catch (Exception e) {
fail("Fail to verify the result of the XContentBuilder: " + e.getMessage());
}
}
private XContentBuilder newXContentBuilder(String... filters) throws IOException {
return XContentBuilder.builder(getXContentType().xContent(), filters);
}
/**
* Build a sample using a given XContentBuilder
*/
private XContentBuilder sample(XContentBuilder builder) throws IOException {
assertNotNull(builder);
builder.startObject()
.field("title", "My awesome book")
.field("pages", 456)
.field("price", 27.99)
.field("timestamp", 1428582942867L)
.nullField("default")
.startArray("tags")
.value("elasticsearch")
.value("java")
.endArray()
.startArray("authors")
.startObject()
.field("name", "John Doe")
.field("lastname", "John")
.field("firstname", "Doe")
.endObject()
.startObject()
.field("name", "William Smith")
.field("lastname", "William")
.field("firstname", "Smith")
.endObject()
.endArray()
.startObject("properties")
.field("weight", 0.8d)
.startObject("language")
.startObject("en")
.field("lang", "English")
.field("available", true)
.startArray("distributors")
.startObject()
.field("name", "The Book Shop")
.startArray("addresses")
.startObject()
.field("name", "address #1")
.field("street", "Hampton St")
.field("city", "London")
.endObject()
.startObject()
.field("name", "address #2")
.field("street", "Queen St")
.field("city", "Stornoway")
.endObject()
.endArray()
.endObject()
.startObject()
.field("name", "Sussex Books House")
.endObject()
.endArray()
.endObject()
.startObject("fr")
.field("lang", "French")
.field("available", false)
.startArray("distributors")
.startObject()
.field("name", "La Maison du Livre")
.startArray("addresses")
.startObject()
.field("name", "address #1")
.field("street", "Rue Mouffetard")
.field("city", "Paris")
.endObject()
.endArray()
.endObject()
.startObject()
.field("name", "Thetra")
.endObject()
.endArray()
.endObject()
.endObject()
.endObject()
.endObject();
return builder;
}
/**
* Instantiates a new XContentBuilder with the given filters and builds a sample with it.
*/
private XContentBuilder sample(String... filters) throws IOException {
return sample(newXContentBuilder(filters));
}
@Test
public void testNoFiltering() throws Exception {
XContentBuilder expected = sample();
assertXContentBuilder(expected, sample());
assertXContentBuilder(expected, sample("*"));
assertXContentBuilder(expected, sample("**"));
}
@Test
public void testNoMatch() throws Exception {
XContentBuilder expected = newXContentBuilder().startObject().endObject();
assertXContentBuilder(expected, sample("xyz"));
}
@Test
public void testSimpleField() throws Exception {
XContentBuilder expected = newXContentBuilder().startObject()
.field("title", "My awesome book")
.endObject();
assertXContentBuilder(expected, sample("title"));
}
@Test
public void testSimpleFieldWithWildcard() throws Exception {
XContentBuilder expected = newXContentBuilder().startObject()
.field("price", 27.99)
.startObject("properties")
.field("weight", 0.8d)
.startObject("language")
.startObject("en")
.field("lang", "English")
.field("available", true)
.startArray("distributors")
.startObject()
.field("name", "The Book Shop")
.startArray("addresses")
.startObject()
.field("name", "address #1")
.field("street", "Hampton St")
.field("city", "London")
.endObject()
.startObject()
.field("name", "address #2")
.field("street", "Queen St")
.field("city", "Stornoway")
.endObject()
.endArray()
.endObject()
.startObject()
.field("name", "Sussex Books House")
.endObject()
.endArray()
.endObject()
.startObject("fr")
.field("lang", "French")
.field("available", false)
.startArray("distributors")
.startObject()
.field("name", "La Maison du Livre")
.startArray("addresses")
.startObject()
.field("name", "address #1")
.field("street", "Rue Mouffetard")
.field("city", "Paris")
.endObject()
.endArray()
.endObject()
.startObject()
.field("name", "Thetra")
.endObject()
.endArray()
.endObject()
.endObject()
.endObject()
.endObject();
assertXContentBuilder(expected, sample("pr*"));
}
@Test
public void testMultipleFields() throws Exception {
XContentBuilder expected = newXContentBuilder().startObject()
.field("title", "My awesome book")
.field("pages", 456)
.endObject();
assertXContentBuilder(expected, sample("title", "pages"));
}
@Test
public void testSimpleArray() throws Exception {
XContentBuilder expected = newXContentBuilder().startObject()
.startArray("tags")
.value("elasticsearch")
.value("java")
.endArray()
.endObject();
assertXContentBuilder(expected, sample("tags"));
}
@Test
public void testSimpleArrayOfObjects() throws Exception {
XContentBuilder expected = newXContentBuilder().startObject()
.startArray("authors")
.startObject()
.field("name", "John Doe")
.field("lastname", "John")
.field("firstname", "Doe")
.endObject()
.startObject()
.field("name", "William Smith")
.field("lastname", "William")
.field("firstname", "Smith")
.endObject()
.endArray()
.endObject();
assertXContentBuilder(expected, sample("authors"));
assertXContentBuilder(expected, sample("authors.*"));
assertXContentBuilder(expected, sample("authors.*name"));
}
@Test
public void testSimpleArrayOfObjectsProperty() throws Exception {
XContentBuilder expected = newXContentBuilder().startObject()
.startArray("authors")
.startObject()
.field("lastname", "John")
.endObject()
.startObject()
.field("lastname", "William")
.endObject()
.endArray()
.endObject();
assertXContentBuilder(expected, sample("authors.lastname"));
assertXContentBuilder(expected, sample("authors.l*"));
}
@Test
public void testRecurseField1() throws Exception {
XContentBuilder expected = newXContentBuilder().startObject()
.startArray("authors")
.startObject()
.field("name", "John Doe")
.endObject()
.startObject()
.field("name", "William Smith")
. endObject()
.endArray()
.startObject("properties")
.startObject("language")
.startObject("en")
.startArray("distributors")
.startObject()
.field("name", "The Book Shop")
.startArray("addresses")
.startObject()
.field("name", "address #1")
.endObject()
.startObject()
.field("name", "address #2")
.endObject()
.endArray()
.endObject()
.startObject()
.field("name", "Sussex Books House")
.endObject()
.endArray()
.endObject()
.startObject("fr")
.startArray("distributors")
.startObject()
.field("name", "La Maison du Livre")
.startArray("addresses")
.startObject()
.field("name", "address #1")
.endObject()
.endArray()
.endObject()
.startObject()
.field("name", "Thetra")
.endObject()
.endArray()
.endObject()
.endObject()
.endObject()
.endObject();
assertXContentBuilder(expected, sample("**.name"));
}
@Test
public void testRecurseField2() throws Exception {
XContentBuilder expected = newXContentBuilder().startObject()
.startObject("properties")
.startObject("language")
.startObject("en")
.startArray("distributors")
.startObject()
.field("name", "The Book Shop")
.startArray("addresses")
.startObject()
.field("name", "address #1")
.endObject()
.startObject()
.field("name", "address #2")
.endObject()
.endArray()
.endObject()
.startObject()
.field("name", "Sussex Books House")
.endObject()
.endArray()
.endObject()
.startObject("fr")
.startArray("distributors")
.startObject()
.field("name", "La Maison du Livre")
.startArray("addresses")
.startObject()
.field("name", "address #1")
.endObject()
.endArray()
.endObject()
.startObject()
.field("name", "Thetra")
.endObject()
.endArray()
.endObject()
.endObject()
.endObject()
.endObject();
assertXContentBuilder(expected, sample("properties.**.name"));
}
@Test
public void testRecurseField3() throws Exception {
XContentBuilder expected = newXContentBuilder().startObject()
.startObject("properties")
.startObject("language")
.startObject("en")
.startArray("distributors")
.startObject()
.field("name", "The Book Shop")
.startArray("addresses")
.startObject()
.field("name", "address #1")
.endObject()
.startObject()
.field("name", "address #2")
.endObject()
.endArray()
.endObject()
.startObject()
.field("name", "Sussex Books House")
.endObject()
.endArray()
.endObject()
.endObject()
.endObject()
.endObject();
assertXContentBuilder(expected, sample("properties.*.en.**.name"));
}
@Test
public void testRecurseField4() throws Exception {
XContentBuilder expected = newXContentBuilder().startObject()
.startObject("properties")
.startObject("language")
.startObject("en")
.startArray("distributors")
.startObject()
.field("name", "The Book Shop")
.endObject()
.startObject()
.field("name", "Sussex Books House")
.endObject()
.endArray()
.endObject()
.startObject("fr")
.startArray("distributors")
.startObject()
.field("name", "La Maison du Livre")
.endObject()
.startObject()
.field("name", "Thetra")
.endObject()
.endArray()
.endObject()
.endObject()
.endObject()
.endObject();
assertXContentBuilder(expected, sample("properties.**.distributors.name"));
}
@Test
public void testRawField() throws Exception {
XContentBuilder expectedRawField = newXContentBuilder().startObject().field("foo", 0).startObject("raw").field("content", "hello world!").endObject().endObject();
XContentBuilder expectedRawFieldFiltered = newXContentBuilder().startObject().field("foo", 0).endObject();
XContentBuilder expectedRawFieldNotFiltered = newXContentBuilder().startObject().startObject("raw").field("content", "hello world!").endObject().endObject();
BytesReference raw = newXContentBuilder().startObject().field("content", "hello world!").endObject().bytes();
// Test method: rawField(String fieldName, BytesReference content)
assertXContentBuilder(expectedRawField, newXContentBuilder().startObject().field("foo", 0).rawField("raw", raw).endObject());
assertXContentBuilder(expectedRawFieldFiltered, newXContentBuilder("f*").startObject().field("foo", 0).rawField("raw", raw).endObject());
assertXContentBuilder(expectedRawFieldNotFiltered, newXContentBuilder("r*").startObject().field("foo", 0).rawField("raw", raw).endObject());
// Test method: rawField(String fieldName, byte[] content)
assertXContentBuilder(expectedRawField, newXContentBuilder().startObject().field("foo", 0).rawField("raw", raw.toBytes()).endObject());
assertXContentBuilder(expectedRawFieldFiltered, newXContentBuilder("f*").startObject().field("foo", 0).rawField("raw", raw.toBytes()).endObject());
assertXContentBuilder(expectedRawFieldNotFiltered, newXContentBuilder("r*").startObject().field("foo", 0).rawField("raw", raw.toBytes()).endObject());
// Test method: rawField(String fieldName, InputStream content)
assertXContentBuilder(expectedRawField, newXContentBuilder().startObject().field("foo", 0).rawField("raw", new ByteArrayInputStream(raw.toBytes())).endObject());
assertXContentBuilder(expectedRawFieldFiltered, newXContentBuilder("f*").startObject().field("foo", 0).rawField("raw", new ByteArrayInputStream(raw.toBytes())).endObject());
assertXContentBuilder(expectedRawFieldNotFiltered, newXContentBuilder("r*").startObject().field("foo", 0).rawField("raw", new ByteArrayInputStream(raw.toBytes())).endObject());
}
@Test
public void testArrays() throws Exception {
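        // Arrays are kept or dropped as a whole when the filter targets the array field itself,
        // and pruned element by element when the filter reaches into the objects they contain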
// Test: Array of values (no filtering)
XContentBuilder expected = newXContentBuilder().startObject().startArray("tags").value("lorem").value("ipsum").value("dolor").endArray().endObject();
assertXContentBuilder(expected, newXContentBuilder("t*").startObject().startArray("tags").value("lorem").value("ipsum").value("dolor").endArray().endObject());
assertXContentBuilder(expected, newXContentBuilder("tags").startObject().startArray("tags").value("lorem").value("ipsum").value("dolor").endArray().endObject());
// Test: Array of values (with filtering)
assertXContentBuilder(newXContentBuilder().startObject().endObject(), newXContentBuilder("foo").startObject().startArray("tags").value("lorem").value("ipsum").value("dolor").endArray().endObject());
// Test: Array of objects (no filtering)
expected = newXContentBuilder().startObject().startArray("tags").startObject().field("lastname", "lorem").endObject().startObject().field("firstname", "ipsum").endObject().endArray().endObject();
assertXContentBuilder(expected, newXContentBuilder("t*").startObject().startArray("tags").startObject().field("lastname", "lorem").endObject().startObject().field("firstname", "ipsum").endObject().endArray().endObject());
assertXContentBuilder(expected, newXContentBuilder("tags").startObject().startArray("tags").startObject().field("lastname", "lorem").endObject().startObject().field("firstname", "ipsum").endObject().endArray().endObject());
// Test: Array of objects (with filtering)
assertXContentBuilder(newXContentBuilder().startObject().endObject(), newXContentBuilder("foo").startObject().startArray("tags").startObject().field("lastname", "lorem").endObject().startObject().field("firstname", "ipsum").endObject().endArray().endObject());
// Test: Array of objects (with partial filtering)
expected = newXContentBuilder().startObject().startArray("tags").startObject().field("firstname", "ipsum").endObject().endArray().endObject();
assertXContentBuilder(expected, newXContentBuilder("t*.firstname").startObject().startArray("tags").startObject().field("lastname", "lorem").endObject().startObject().field("firstname", "ipsum").endObject().endArray().endObject());
}
}

View File

@ -0,0 +1,36 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.common.xcontent.support.filtering;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentType;
public class CborFilteringGeneratorTests extends JsonFilteringGeneratorTests {
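    // Reuses the JSON filtering test cases with the CBOR codec; results are compared
    // as binary since CBOR is not a text format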
@Override
protected XContentType getXContentType() {
return XContentType.CBOR;
}
@Override
protected void assertXContentBuilder(XContentBuilder expected, XContentBuilder builder) {
assertBinary(expected, builder);
}
}

View File

@ -0,0 +1,99 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.common.xcontent.support.filtering;
import org.elasticsearch.common.io.stream.BytesStreamOutput;
import org.elasticsearch.common.xcontent.XContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.json.JsonXContent;
import java.io.IOException;
import java.util.Arrays;
import java.util.Locale;
/**
* Benchmark class to compare filtered and unfiltered XContent generators.
*/
public class FilteringJsonGeneratorBenchmark {
public static void main(String[] args) throws IOException {
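        // Warm up the JVM with a fixed workload, then sweep the number of fields,
        // the number of documents and the fraction of fields kept by the filters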
final XContent XCONTENT = JsonXContent.jsonXContent;
System.out.println("Executing " + FilteringJsonGeneratorBenchmark.class + "...");
System.out.println("Warming up...");
run(XCONTENT, 500_000, 100, 0.5);
System.out.println("Warmed up.");
System.out.println("nb documents | nb fields | nb fields written | % fields written | time (millis) | rate (docs/sec) | avg size");
for (int nbFields : Arrays.asList(10, 25, 50, 100, 250)) {
for (int nbDocs : Arrays.asList(100, 1000, 10_000, 100_000, 500_000)) {
for (double ratio : Arrays.asList(0.0, 1.0, 0.99, 0.95, 0.9, 0.75, 0.5, 0.25, 0.1, 0.05, 0.01)) {
run(XCONTENT, nbDocs, nbFields, ratio);
}
}
}
System.out.println("Done.");
}
private static void run(XContent xContent, long nbIterations, int nbFields, double ratio) throws IOException {
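        // Each iteration serializes one flat document of nbFields fields through a builder
        // configured with filters covering the first (nbFields * ratio) field names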
String[] fields = fields(nbFields);
String[] filters = fields((int) (nbFields * ratio));
long size = 0;
BytesStreamOutput os = new BytesStreamOutput();
long start = System.nanoTime();
for (int i = 0; i < nbIterations; i++) {
XContentBuilder builder = new XContentBuilder(xContent, os, filters);
builder.startObject();
for (String field : fields) {
builder.field(field, System.nanoTime());
}
builder.endObject();
size += builder.bytes().length();
os.reset();
}
double milliseconds = (System.nanoTime() - start) / 1_000_000d;
System.out.printf(Locale.ROOT, "%12d | %9d | %17d | %14.2f %% | %10.3f ms | %15.2f | %8.0f %n",
nbIterations, nbFields,
(int) (nbFields * ratio),
(ratio * 100d),
milliseconds,
((double) nbIterations) / (milliseconds / 1000d),
size / ((double) nbIterations));
}
/**
 * Returns an array of n field names starting at "field_0".
 * For example, fields(3) returns ["field_0", "field_1", "field_2"].
*/
private static String[] fields(int n) {
String[] fields = new String[n];
for (int i = 0; i < n; i++) {
fields[i] = "field_" + i;
}
return fields;
}
}

View File

@ -0,0 +1,36 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.common.xcontent.support.filtering;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentType;
public class JsonFilteringGeneratorTests extends AbstractFilteringJsonGeneratorTests {
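    // Runs the shared filtering test cases against the JSON generator and compares
    // the produced strings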
@Override
protected XContentType getXContentType() {
return XContentType.JSON;
}
@Override
protected void assertXContentBuilder(XContentBuilder expected, XContentBuilder builder) {
assertString(expected, builder);
}
}

View File

@ -0,0 +1,36 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.common.xcontent.support.filtering;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentType;
public class SmileFilteringGeneratorTests extends JsonFilteringGeneratorTests {
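    // Same test cases as JSON, but serialized with SMILE and compared as binary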
@Override
protected XContentType getXContentType() {
return XContentType.SMILE;
}
@Override
protected void assertXContentBuilder(XContentBuilder expected, XContentBuilder builder) {
assertBinary(expected, builder);
}
}

View File

@ -0,0 +1,36 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.common.xcontent.support.filtering;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentType;
public class YamlFilteringGeneratorTests extends AbstractFilteringJsonGeneratorTests {
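    // Runs the shared filtering test cases against the YAML generator; output is compared as text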
@Override
protected XContentType getXContentType() {
return XContentType.YAML;
}
@Override
protected void assertXContentBuilder(XContentBuilder expected, XContentBuilder builder) {
assertString(expected, builder);
}
}

View File

@ -168,8 +168,13 @@ public class RestFilterChainTests extends ElasticsearchTestCase {
}
@Override
public XContentBuilder newBuilder(@Nullable BytesReference autoDetectSource) throws IOException {
return super.newBuilder(autoDetectSource);
public XContentBuilder newErrorBuilder() throws IOException {
return super.newErrorBuilder();
}
@Override
public XContentBuilder newBuilder(@Nullable BytesReference autoDetectSource, boolean useFiltering) throws IOException {
return super.newBuilder(autoDetectSource, useFiltering);
}
@Override