Fix getBitmapIndex to consider the case where dim is null
Unit test for extractionFn with empty result and null_column
Unit test for TopN queries with extraction filter
Refactor extraction filter makeMatcher for realtime segments and clean up code in processing/src/test/java/io/druid/query/groupby/GroupByQueryRunnerTest.java
fix to make sure that empty strings are converted to null
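Druid treats an empty string and null as the same missing value, which is what both the getBitmapIndex guard and this conversion rely on. A minimal sketch of the normalization using Guava; the helper name is illustrative, not from the codebase:

```java
import com.google.common.base.Strings;

public class DimValueNormalization
{
  // "" and null are the same missing value; normalize before matching or
  // indexing. normalizeDimValue is a hypothetical helper for illustration.
  static String normalizeDimValue(String value)
  {
    return Strings.emptyToNull(value);
  }

  public static void main(String[] args)
  {
    System.out.println(normalizeDimValue(""));    // null
    System.out.println(normalizeDimValue(null));  // null
    System.out.println(normalizeDimValue("a"));   // a
  }
}
```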
This provides an alternative to using ComplexMetricSerde.getObjectStrategy()
and the serde methods from ObjectStrategy, as that usage pattern is deprecated.
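A hedged sketch of the shape such an alternative could take; the method names below are illustrative assumptions, not the signatures actually added by this change:

```java
import java.nio.ByteBuffer;

// Illustrative only: expose serialization directly on the ComplexMetricSerde
// subclass instead of routing callers through ObjectStrategy, whose usage
// pattern is deprecated. Method names here are assumptions.
public abstract class ComplexMetricSerdeSketch
{
  // Serialize a complex metric value to bytes.
  public abstract byte[] toBytes(Object value);

  // Deserialize a complex metric value from a buffer region.
  public abstract Object fromByteBuffer(ByteBuffer buffer, int numBytes);
}
```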
Issue - when storing results in the cache, we store the event map, which
maps aggregator names to values. If someone fires the same query after
renaming aggregators, the cache key stays the same, but the cached event
still maps metric values to the old names, which leads to wrong results.
Fix - modify the cache to store only the actual list of values rather than
the raw event.
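A minimal sketch of the fix's idea; prepareForCache/pullFromCache are hypothetical helpers, not the actual CacheStrategy API:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class CacheValueSketch
{
  // Before: caching the event map froze the old aggregator names into the
  // cached bytes. After: cache only the values, in aggregator order, and
  // rebuild the map with the names from the *current* query when pulling.
  static List<Object> prepareForCache(Map<String, Object> event, List<String> aggNames)
  {
    List<Object> values = new ArrayList<>();
    for (String name : aggNames) {
      values.add(event.get(name));
    }
    return values;
  }

  static Map<String, Object> pullFromCache(List<Object> values, List<String> currentAggNames)
  {
    Map<String, Object> event = new LinkedHashMap<>();
    for (int i = 0; i < currentAggNames.size(); i++) {
      event.put(currentAggNames.get(i), values.get(i));
    }
    return event;
  }

  public static void main(String[] args)
  {
    Map<String, Object> event = new LinkedHashMap<>();
    event.put("oldName", 42L);
    List<Object> cached = prepareForCache(event, Arrays.asList("oldName"));
    // Same query, aggregator renamed: the value maps onto the new name.
    System.out.println(pullFromCache(cached, Arrays.asList("newName"))); // {newName=42}
  }
}
```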
review comments + fix dimension renaming
review comment
* Fix resource leak in `io.druid.segment.IndexIO.DefaultIndexIOHandler#validateTwoSegments(java.io.File, java.io.File)`
* Un-deprecate `close()` in `QueryableIndex` and make it inherit `Closeable`
* Fix resource leaks in various unit tests
* Add `CloserRule` for closing out resources
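A sketch of what such a rule can look like, built on Guava's Closer and JUnit's ExternalResource (the actual CloserRule may differ):

```java
import com.google.common.io.Closer;
import java.io.Closeable;
import java.io.IOException;
import org.junit.rules.ExternalResource;

// Sketch of a closing rule: tests register Closeables as they open them,
// and the rule closes all of them after each test, even when the test fails.
public class CloserRuleSketch extends ExternalResource
{
  private final Closer closer = Closer.create();

  // Register a resource to be closed when the test finishes.
  public <T extends Closeable> T closeLater(T resource)
  {
    return closer.register(resource);
  }

  @Override
  protected void after()
  {
    try {
      closer.close();
    }
    catch (IOException e) {
      throw new RuntimeException(e);
    }
  }
}
```

A test would declare `@Rule public CloserRuleSketch closer = new CloserRuleSketch();` and wrap each opened resource in `closer.closeLater(...)`.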
It fails when we issue a SegmentMetadataQuery with {"bySegment" : true} set in the context, with this exception:
java.lang.ClassCastException: io.druid.query.Result cannot be cast to io.druid.query.metadata.metadata.SegmentAnalysis
at io.druid.query.metadata.SegmentMetadataQueryQueryToolChest$4.compare(SegmentMetadataQueryQueryToolChest.java:222) ~[druid-processing-0.7.3-SNAPSHOT.jar:0.7.3-SNAPSHOT]
at com.google.common.collect.NullsFirstOrdering.compare(NullsFirstOrdering.java:44) ~[guava-16.0.1.jar:?]
at com.metamx.common.guava.MergeIterator$1.compare(MergeIterator.java:46) ~[java-util-0.27.0.jar:?]
at com.metamx.common.guava.MergeIterator$1.compare(MergeIterator.java:42) ~[java-util-0.27.0.jar:?]
at java.util.PriorityQueue.siftUpUsingComparator(PriorityQueue.java:649) ~[?:1.7.0_80]
which caused the following regression: the same ClassCastException as in the stack trace above, when issuing a SegmentMetadataQuery with {"bySegment" : true} in the context.
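A minimal, self-contained reproduction of the mechanism using toy classes (not the real Druid types): a PriorityQueue whose comparator is typed for one element class fails at siftUpUsingComparator, as in the stack trace above, once a differently-typed wrapper is added:

```java
import java.util.Comparator;
import java.util.PriorityQueue;

public class BySegmentCastSketch
{
  static class SegmentAnalysis { String id = "seg"; } // stand-in for the analysis type
  static class Result { Object value; }               // stand-in for the bySegment wrapper

  public static void main(String[] args)
  {
    // The merge comparator is typed for SegmentAnalysis...
    Comparator<SegmentAnalysis> comparator = Comparator.comparing(s -> s.id);
    @SuppressWarnings({"unchecked", "rawtypes"})
    PriorityQueue<Object> queue = new PriorityQueue<>(11, (Comparator) comparator);
    queue.add(new SegmentAnalysis());
    // ...but with bySegment=true each element is a wrapped Result, so the
    // comparator's implicit cast fails during sift-up, mirroring the trace.
    queue.add(new Result()); // throws ClassCastException
  }
}
```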
* Inverted the function arguments used to compose postProcFn for GroupBy queries
with havingSpec + limitSpec.
* Replaced query.getLimitSpec() with null in GroupByQueryToolChest's
mergeGroupByResults
* Added a unit test to verify the functionality
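The subtlety is that Guava's Functions.compose(g, f) applies f first, so swapping the arguments silently reverses the post-processing order. The stand-in functions below are illustrative, not the real having/limit functions:

```java
import com.google.common.base.Function;
import com.google.common.base.Functions;

public class ComposeOrderSketch
{
  public static void main(String[] args)
  {
    // Functions.compose(g, f) yields g(f(x)): f runs FIRST.
    Function<Integer, Integer> applyHaving = x -> x - 1; // stand-in for the having filter
    Function<Integer, Integer> applyLimit = x -> x * 10; // stand-in for the limit step

    // having first, then limit -> limit(having(x))
    Function<Integer, Integer> postProcFn = Functions.compose(applyLimit, applyHaving);
    System.out.println(postProcFn.apply(5)); // 40

    // Inverted arguments swap the order -> having(limit(x))
    Function<Integer, Integer> inverted = Functions.compose(applyHaving, applyLimit);
    System.out.println(inverted.apply(5)); // 49
  }
}
```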
Handle the case where a complex aggregator did not implement inputSizeFn(), so
that the segment metadata query shows at least some information.
Also, instead of COMPLEX, return the type of the data stored.
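A hedged sketch of that fallback, with hypothetical stand-ins for the size hook (none of these names are from the codebase):

```java
public class ComplexColumnAnalysisSketch
{
  // Hypothetical stand-in for a complex aggregator's input-size hook.
  interface SizeFn
  {
    long apply(String columnName);
  }

  // If the complex aggregator never implemented a size function, still
  // report the stored type name with an unknown size rather than emitting
  // only the generic COMPLEX marker.
  static String describeColumn(String storedTypeName, SizeFn inputSizeFn, String columnName)
  {
    long size = (inputSizeFn != null) ? inputSizeFn.apply(columnName) : 0;
    return storedTypeName + " (size=" + size + ")";
  }

  public static void main(String[] args)
  {
    System.out.println(describeColumn("hyperUnique", null, "uniques")); // hyperUnique (size=0)
  }
}
```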
renamed all [Max/Min]*.java to [DoubleMax/DoubleMin]*.java and created [Max/Min]AggregatorFactory.java, which can be removed once we no longer need backward compatibility for the min/max aggregator types
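The backward-compatibility shim can be as thin as a deprecated subclass that delegates to the renamed factory; a sketch, assuming DoubleMaxAggregatorFactory keeps a (name, fieldName) constructor:

```java
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;

// The legacy "max" type keeps deserializing, but is just a thin subclass of
// the renamed factory, so it can be deleted once the old type name no longer
// needs to be supported. Constructor shape is an assumption.
@Deprecated
public class MaxAggregatorFactory extends DoubleMaxAggregatorFactory
{
  @JsonCreator
  public MaxAggregatorFactory(
      @JsonProperty("name") String name,
      @JsonProperty("fieldName") String fieldName
  )
  {
    super(name, fieldName);
  }
}
```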
fixes #1211
fix value matcher
more improvements
more fixes for partial null column
fix handling of dimension having only null values
review comment
IndexMaker speedups
* About 15% speedup
review comments
fix failing tests
fix compilation
fixes #1330
Avoid creating a Period instance, as creating a Period from Long.MAX_VALUE
throws an ArithmeticException.
After this change, the query metric will emit the duration in seconds instead
of minutes.
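A minimal illustration of the failure and the workaround (the metric emission itself lives in the query runners; this shows only the arithmetic):

```java
import java.util.concurrent.TimeUnit;
import org.joda.time.Period;

public class DurationMetricSketch
{
  public static void main(String[] args)
  {
    long durationMillis = Long.MAX_VALUE;

    // Safe: emit the duration in seconds without building a Period.
    System.out.println(TimeUnit.MILLISECONDS.toSeconds(durationMillis));

    // Unsafe: Joda-Time normalizes the millis into int-sized period fields
    // and overflows for values this large.
    new Period(durationMillis); // throws ArithmeticException
  }
}
```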
* Add some unit tests
* Add io.druid.segment.IndexMerger.reprocess for quick re-indexing of data
* Add dim-value validation to validation checker (instead of ONLY index #)
* General code refactoring to make things a little easier to read
on postAggregations which are removed in the forwarded query.
Add a unit test to replicate the issue.
Add a query that replicates this issue to the integration tests.
- compression for single-value dimensions using CompressedVSizeIntsIndexedSupplier
- makes dimension compression configurable via IndexSpec
- IndexSpec also enables configuring bitmap and metric compression
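A sketch of the knobs this exposes, expressed as the JSON fields an indexSpec would carry (field names follow the description above; the exact shape may differ):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class IndexSpecSketch
{
  public static void main(String[] args)
  {
    // bitmap index encoding, dimension compression, and metric compression
    // become independently configurable per segment.
    Map<String, Object> indexSpec = new LinkedHashMap<>();
    indexSpec.put("bitmap", Collections.singletonMap("type", "roaring"));
    indexSpec.put("dimensionCompression", "lz4");
    indexSpec.put("metricCompression", "lz4");
    System.out.println(indexSpec);
  }
}
```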