Compare commits

...

109 Commits

Author SHA1 Message Date
Mark Paluch
3879313782
Release version 4.0.9 (Neumann SR9).
See #1730
2021-04-14 11:04:47 +02:00
Mark Paluch
65842e6905
Prepare 4.0.9 (Neumann SR9).
See #1730
2021-04-14 11:04:09 +02:00
Mark Paluch
61d0aa0852
Updated changelog.
See #1730
2021-04-14 11:04:05 +02:00
Peter-Josef Meisch
b7ec0a9013
Fix reactive connection handling.
Original Pull Request #1766
Closes #1759

(cherry picked from commit 58bca88386d9de7ea3946f7691c63bf31ce4ece2)
2021-04-09 00:17:57 +02:00
Mark Paluch
3e20810452
Updated changelog.
See #1731
2021-03-31 18:30:49 +02:00
Mark Paluch
804e5bdc3b
Updated changelog.
See #1699
2021-03-31 17:26:09 +02:00
Mark Paluch
121cd8aa50
Updated changelog.
See #1709
2021-03-17 11:31:32 +01:00
Mark Paluch
187b0befc6
Updated changelog.
See #1702
2021-03-17 11:03:44 +01:00
Mark Paluch
5bd0eb3115
After release cleanups.
See #1697
2021-03-17 10:33:58 +01:00
Mark Paluch
cd4f0cd7e3
Prepare next development iteration.
See #1697
2021-03-17 10:33:56 +01:00
Mark Paluch
3a749bb963
Release version 4.0.8 (Neumann SR8).
See #1697
2021-03-17 10:21:34 +01:00
Mark Paluch
362d0543de
Prepare 4.0.8 (Neumann SR8).
See #1697
2021-03-17 10:21:09 +01:00
Mark Paluch
4c0e9629d4
Updated changelog.
See #1697
2021-03-17 10:21:07 +01:00
Peter-Josef Meisch
3ca345b216
DefaultReactiveElasticsearchClient handle 5xx error with empty body
Original Pull Request #1713
Closes #1712

(cherry picked from commit 6634d0075ace745e17d34d655e15d21abc0fb786)
Test adapted
2021-03-03 06:53:21 +01:00
Christoph Strobl
82d9cb4cc6 Updated changelog.
See #1701
2021-02-18 11:37:50 +01:00
Christoph Strobl
dd1f309810 Updated changelog.
See #1698
2021-02-18 11:18:33 +01:00
Christoph Strobl
f91964719a Updated changelog.
See #1643
2021-02-17 14:20:37 +01:00
Christoph Strobl
a02e190c49 Updated changelog.
See #1642
2021-02-17 13:49:24 +01:00
Christoph Strobl
64bf8139b2 After release cleanups.
See #1570
2021-02-17 11:32:50 +01:00
Christoph Strobl
a597a4fee2 Prepare next development iteration.
See #1570
2021-02-17 11:32:49 +01:00
Christoph Strobl
cfa3a1e762 Release version 4.0.7 (Neumann SR7).
See #1570
2021-02-17 11:07:42 +01:00
Christoph Strobl
401c689211 Prepare 4.0.7 (Neumann SR7).
See #1570
2021-02-17 11:07:10 +01:00
Christoph Strobl
24a4d150ef Updated changelog.
See #1570
2021-02-17 11:07:09 +01:00
Christoph Strobl
057455ec74 Updated changelog.
See #1569
2021-02-17 10:58:27 +01:00
Peter-Josef Meisch
40cff583f4
Allow CustomConversions for entities - adaption for 4.0.x. 2021-01-29 11:55:10 +01:00
Peter-Josef Meisch
eefd5a2187
Allow CustomConversions for entities.
Original PullRequest #1672
Closes #1667

(cherry picked from commit 0ac1b4af00b14cb9509986ab13db0eab44dba4ab)
2021-01-29 11:43:47 +01:00
Peter-Josef Meisch
fc1f65f87d
ReactiveElasticsearchOperations indexName is encoded twice.
Original Pull Request #1666
Closes #1665

(cherry picked from commit 4829b07e53fcbea4b391a6688fd70a580f5a62ab)
2021-01-25 22:24:27 +01:00
Peter-Josef Meisch
e80e32eb97
Fix source filter setup in multiget requests.
Original Pull Request #1664
Closes #1659

(cherry picked from commit 1a02c1e05ae9cfa81b9010dd6872d0c348466399)
2021-01-24 20:22:17 +01:00
Peter-Josef Meisch
ef2600f091
Documentation fix.
Original Pull Request #1663
Closes #1662

(cherry picked from commit 1aabb42355e07f9d6e65a2a0d02569b3a0f01a2d)
2021-01-23 20:12:29 +01:00
Christoph Strobl
ce67d8145d Updated changelog.
See #1571
2021-01-13 15:49:52 +01:00
Christoph Strobl
e7417f8b73
Updated changelog.
See #1572
2021-01-13 15:16:30 +01:00
Greg L. Turnquist
2521f0760e
DATAES-996 - Polishing. 2020-12-17 09:06:01 -06:00
Greg L. Turnquist
c19ac47009
DATAES-996 - Use Docker hub credentials for all CI jobs. 2020-12-17 08:43:31 -06:00
Mark Paluch
4603526e88
DATAES-973 - Updated changelog. 2020-12-09 16:47:48 +01:00
Mark Paluch
39652ff48f
DATAES-966 - Updated changelog. 2020-12-09 15:33:29 +01:00
Mark Paluch
f92153eb4e
DATAES-964 - After release cleanups. 2020-12-09 12:41:27 +01:00
Mark Paluch
84df3daccd
DATAES-964 - Prepare next development iteration. 2020-12-09 12:41:22 +01:00
Mark Paluch
8c378033a4
DATAES-964 - Release version 4.0.6 (Neumann SR6). 2020-12-09 11:16:22 +01:00
Mark Paluch
ed620a5574
DATAES-964 - Prepare 4.0.6 (Neumann SR6). 2020-12-09 11:15:56 +01:00
Mark Paluch
f61ed32bab
DATAES-964 - Updated changelog. 2020-12-09 11:15:53 +01:00
Mark Paluch
64fc98a8fa
DATAES-963 - Updated changelog. 2020-12-09 09:59:16 +01:00
Peter-Josef Meisch
242cf7706b
DATAES-991 - Wrong value for TermVector(with_positions_offets_payloads).
Original PR: #564

(cherry picked from commit 6a6ead5e1ec866812f7bf44af77e587851402ad1)
2020-12-04 08:51:16 +01:00
Mark Paluch
1e6271edf2
DATAES-965 - Updated changelog. 2020-11-11 12:34:45 +01:00
Peter-Josef Meisch
7168c34ee6
DATAES-969 - Use ResultProcessor in ElasticsearchPartQuery to build PartTree.
Original PR: #546

(cherry picked from commit d036693f0510748537c682a5ede99c23938b5250)
2020-11-07 18:47:20 +01:00
Mark Paluch
fd118e67e5
DATAES-968 - Enable Maven caching for Jenkins jobs. 2020-10-30 08:37:40 +01:00
Mark Paluch
232f192a44
DATAES-950 - Updated changelog. 2020-10-28 16:28:06 +01:00
Mark Paluch
49a5bf642b
DATAES-926 - After release cleanups. 2020-10-28 14:51:07 +01:00
Mark Paluch
71fa62262b
DATAES-926 - Prepare next development iteration. 2020-10-28 14:51:04 +01:00
Mark Paluch
d356b6c1b0
DATAES-926 - Release version 4.0.5 (Neumann SR5). 2020-10-28 14:34:43 +01:00
Mark Paluch
a45f721122
DATAES-926 - Prepare 4.0.5 (Neumann SR5). 2020-10-28 14:34:16 +01:00
Mark Paluch
2dc4b57f2d
DATAES-926 - Updated changelog. 2020-10-28 14:34:13 +01:00
Mark Paluch
01fd3d4121
DATAES-925 - Updated changelog. 2020-10-28 12:15:08 +01:00
Mark Paluch
b48d4aed54
DATAES-958 - Updated changelog. 2020-10-28 11:32:35 +01:00
Peter-Josef Meisch
3cbf9dd0cc
DATAES-953 - DateTimeException on converting Instant or Date to custom format.
Original PR: #538

(cherry picked from commit 9bc4bee86fa5727102b94bdb912cce4d21ed5938)
2020-10-15 23:16:27 +02:00
Christoph Strobl
1d1268075f DATAES-927 - Updated changelog. 2020-10-14 14:51:53 +02:00
Peter-Josef Meisch
e293da89f8
DATAES-937 - Repository queries with IN filters fail with empty input list.
Original PR: #525

(cherry picked from commit 7117e5d70d7b52aa689787b2201a3f26639bbcf8)
2020-09-24 22:37:14 +02:00
Peter-Josef Meisch
a88612df71
DATAES-936 - Take id property from the source when deserializing an entity.
Original PR: #523

(cherry picked from commit 8d4c30573297f0481e09de5ab5e419dec775c769)
2020-09-23 20:08:54 +02:00
Mark Paluch
084dd1db56
DATAES-904 - Updated changelog. 2020-09-16 14:12:13 +02:00
Mark Paluch
edd43b7e92
DATAES-905 - After release cleanups. 2020-09-16 12:15:46 +02:00
Mark Paluch
e3b780050d
DATAES-905 - Prepare next development iteration. 2020-09-16 12:15:43 +02:00
Mark Paluch
229fd977cd
DATAES-905 - Release version 4.0.4 (Neumann SR4). 2020-09-16 11:43:15 +02:00
Mark Paluch
c80c821c30
DATAES-905 - Prepare 4.0.4 (Neumann SR4). 2020-09-16 11:42:47 +02:00
Mark Paluch
b12431dfec
DATAES-905 - Updated changelog. 2020-09-16 11:42:44 +02:00
Mark Paluch
1120ce402b
DATAES-888 - Updated changelog. 2020-09-16 11:20:14 +02:00
Mark Paluch
e5593a07a1
DATAES-887 - Updated changelog. 2020-09-16 10:39:06 +02:00
Peter-Josef Meisch
fa0fdd8c82
DATAES-924 - Conversion of properties of collections of Temporal values fails.
Original PR: #519

(cherry picked from commit 0e7791a6875baf48217f9265837705b508d1fdc9)
2020-09-15 23:25:00 +02:00
Peter-Josef Meisch
8686650261 DATAES-912 - Derived Query with "In" Keyword does not work on Text field.
Original PR: #510

(cherry picked from commit 79fdc449b873b317cc6d9544285e870c11a4d240)
2020-08-24 07:32:33 +02:00
Mark Paluch
3a522cd432
DATAES-890 - After release cleanups. 2020-08-12 13:19:59 +02:00
Mark Paluch
0a1eec8f0b
DATAES-890 - Prepare next development iteration. 2020-08-12 13:19:56 +02:00
Mark Paluch
63a3daf20a
DATAES-890 - Release version 4.0.3 (Neumann SR3). 2020-08-12 13:07:48 +02:00
Mark Paluch
4d5638c6d7
DATAES-890 - Prepare 4.0.3 (Neumann SR3). 2020-08-12 13:07:22 +02:00
Mark Paluch
5180b2f8cd
DATAES-890 - Updated changelog. 2020-08-12 13:07:18 +02:00
Mark Paluch
8eaf09cfc4
DATAES-872 - Updated changelog. 2020-08-12 12:01:28 +02:00
Peter-Josef Meisch
383fe3132e DATAES-896 - Use mainField property of @MultiField annotation.
Original PR: #500

(cherry picked from commit fd23c10c163e1959362c078fd8fa4b812ce11c01)
2020-08-09 16:40:05 +02:00
Peter-Josef Meisch
96ce05794e DATAES-897 - Add documentation for Highlight annotation.
Original PR: #499

(cherry picked from commit fd77f62cc4d2452aee8cfce56e037e3daa18477e)
2020-08-08 20:06:40 +02:00
Peter-Josef Meisch
4f29f0d60c DATAES-891 - Returning a Stream from a Query annotated repository method crashes.
Original PR: #497

(cherry picked from commit f989cf873b0e2a5e60044ffa1af42b77b05e9012)
2020-07-29 13:07:41 +02:00
Mark Paluch
886503c41c
DATAES-862 - After release cleanups. 2020-07-22 10:37:09 +02:00
Mark Paluch
c429436f1c
DATAES-862 - Prepare next development iteration. 2020-07-22 10:37:06 +02:00
Mark Paluch
afa611ce09
DATAES-862 - Release version 4.0.2 (Neumann SR2). 2020-07-22 10:21:10 +02:00
Mark Paluch
dc9db5dcdc
DATAES-862 - Prepare 4.0.2 (Neumann SR2). 2020-07-22 10:20:45 +02:00
Mark Paluch
4ee592cd21
DATAES-862 - Updated changelog. 2020-07-22 10:20:41 +02:00
Mark Paluch
cd7b6f8420
DATAES-861 - Updated changelog. 2020-07-22 10:08:51 +02:00
Mark Paluch
237c0ead2e
DATAES-860 - Updated changelog. 2020-07-22 09:44:37 +02:00
Peter-Josef Meisch
6462305521 DATAES-883 - Fix log level on resource load error.
Original PR: #493

(cherry picked from commit 0f940b36d7a89257694ed85639f1a89c4eb2a35a)
2020-07-10 21:20:42 +02:00
Peter-Josef Meisch
0a2038505f DATAES-878 - Wrong value for TermVector.
Original PR: #492

(cherry picked from commit df4e6c449d4b5cf7a9196d88045f7b7af9060311)
2020-07-02 06:45:15 +02:00
Mark Paluch
8276023132
DATAES-824 - Updated changelog. 2020-06-25 12:00:26 +02:00
Peter-Josef Meisch
ae94120d91 DATAES-865 - Polishing.
(cherry picked from commit 92f16846abaf7266de1e9669aadd3bd24f5b64a1)
2020-06-16 18:59:16 +02:00
Been24
d2df9e7f4c DATAES-865 - Fix MappingElasticsearchConverter writing an Object property containing a Map.
Original PR: #482

(cherry picked from commit 1de1aeb2c7ec80580cb2b4b1d98b724277862463)
2020-06-16 18:59:03 +02:00
Peter-Josef Meisch
73fc8f65ee DATAES-863 - Improve server error response handling.
Original PR: #480

(cherry picked from commit 3c44a1c96996ff2af496500505a8194e22b3de02)
2020-06-11 19:16:11 +02:00
Mark Paluch
4d2e4ac22c
DATAES-823 - After release cleanups. 2020-06-10 14:29:30 +02:00
Mark Paluch
8d02946186
DATAES-823 - Prepare next development iteration. 2020-06-10 14:29:27 +02:00
Mark Paluch
3ac4e12e08
DATAES-823 - Release version 4.0.1 (Neumann SR1). 2020-06-10 14:02:28 +02:00
Mark Paluch
bb69482b7b
DATAES-823 - Prepare 4.0.1 (Neumann SR1). 2020-06-10 14:02:00 +02:00
Mark Paluch
20f3298f72
DATAES-823 - Updated changelog. 2020-06-10 14:01:56 +02:00
Mark Paluch
3178707172
DATAES-807 - Updated changelog. 2020-06-10 12:29:56 +02:00
Mark Paluch
b60da78c5b
DATAES-806 - Updated changelog. 2020-06-10 11:40:30 +02:00
Peter-Josef Meisch
8e765cf07c DATAES-857 - Registered simple types are not read from list.
Original PR: #478

(cherry picked from commit 407c8c6c17cf13dffcf0c577fe7ea47bd6f96200)
2020-06-09 16:31:14 +02:00
Peter-Josef Meisch
ff999959a8 DATAES-850 - Add warning and docs for missing TemporalAccessor configuration.
Original PR: #472

(cherry picked from commit 859b22db8e396dc533d479dcf49a590c07b8dc24)
2020-05-31 23:06:38 +02:00
Peter-Josef Meisch
333aba2c59 DATAES-845 - MappingElasticsearchConverter handles lists with null values.
Original PR: #470

(cherry picked from commit 852273eff5c06dbd9e1ef4bcd28d2736c482bdf9)
2020-05-29 19:12:24 +02:00
Mark Paluch
e3e646eb72
DATAES-844 - Improve TOC formatting for migration guides. 2020-05-26 16:23:12 +02:00
Peter-Josef Meisch
b918605efd
DATAES-839 - ReactiveElasticsearchTemplate should use RequestFactory.
Original PR: #466

cherrypicked from dc6734db4391f236aeb11600204db28fe570fb34
2020-05-21 12:32:30 +02:00
Peter-Josef Meisch
c9667755f2
DATAES-835 - Fix code sample in documentation for scroll API.
Original PR: #462
2020-05-20 08:43:03 +02:00
Peter-Josef Meisch
421333dadc DATAES-832 - findAllById repository method returns iterable with null elements for not found ids. 2020-05-18 18:05:30 +02:00
Peter-Josef Meisch
34e3dc735c DATAES-832 - findAllById repository method returns iterable with null elements for not found ids. 2020-05-17 20:01:47 +02:00
Peter-Josef Meisch
e7110c14ab DATAES-831 - SearchOperations.searchForStream does not use requested maxResults.
Original PR: #459

(cherry picked from commit 506f79a45aa93ad5787b25d807de5e5970bf0ea3)
2020-05-17 10:53:29 +02:00
Peter-Josef Meisch
1cee4057d9
DATAES-828 - Fields of type date need to have a format defined.
Original PR: #457
2020-05-14 20:30:30 +02:00
Peter-Josef Meisch
68ce0c2184 DATAES-826 - Repositories should not try to create an index when it already exists.
original PR: #456

(cherry picked from commit c7339dc248370e5e726b6a808c74bb5bd4dc1db1)
2020-05-14 18:06:51 +02:00
Mark Paluch
9adfa0b389
DATAES-808 - After release cleanups. 2020-05-12 12:50:35 +02:00
Mark Paluch
d28f643997
DATAES-808 - Prepare next development iteration. 2020-05-12 12:40:53 +02:00
68 changed files with 2467 additions and 611 deletions


@@ -9,7 +9,7 @@ image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-elasticsearch%2
Since this pipeline is purely Docker-based, it's easy to:
* Debug what went wrong on your local machine.
* Test out a a tweak to your `test.sh` script before sending it out.
* Test out a a tweak to your `verify.sh` script before sending it out.
* Experiment against a new image before submitting your pull request.
All of these use cases are great reasons to essentially run what the CI server does on your local machine.

Jenkinsfile (vendored; 129 lines changed)

@@ -3,7 +3,7 @@ pipeline {
triggers {
pollSCM 'H/10 * * * *'
upstream(upstreamProjects: "spring-data-commons/master", threshold: hudson.model.Result.SUCCESS)
upstream(upstreamProjects: "spring-data-commons/2.3.x", threshold: hudson.model.Result.SUCCESS)
}
options {
@@ -15,59 +15,83 @@ pipeline {
stage("test: baseline (jdk8)") {
when {
anyOf {
branch 'master'
branch '4.0.x'
not { triggeredBy 'UpstreamCause' }
}
}
agent {
docker {
image 'adoptopenjdk/openjdk8:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
DOCKER_HUB = credentials('hub.docker.com-springbuildmaster')
}
steps {
sh 'rm -rf ?'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Dsort -U -B'
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
docker.image('adoptopenjdk/openjdk8:latest').inside('-u root -v /var/run/docker.sock:/var/run/docker.sock -v /usr/bin/docker:/usr/bin/docker -v $HOME:/tmp/jenkins-home') {
sh "docker login --username ${DOCKER_HUB_USR} --password ${DOCKER_HUB_PSW}"
sh 'PROFILE=none ci/verify.sh'
sh "ci/clean.sh"
}
}
}
}
}
stage("Test other configurations") {
when {
anyOf {
branch 'master'
branch '4.0.x'
not { triggeredBy 'UpstreamCause' }
}
}
parallel {
stage("test: baseline (jdk11)") {
agent {
docker {
image 'adoptopenjdk/openjdk11:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
DOCKER_HUB = credentials('hub.docker.com-springbuildmaster')
}
steps {
sh 'rm -rf ?'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pjava11 clean dependency:list test -Dsort -U -B'
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
docker.image('adoptopenjdk/openjdk11:latest').inside('-u root -v /var/run/docker.sock:/var/run/docker.sock -v /usr/bin/docker:/usr/bin/docker -v $HOME:/tmp/jenkins-home') {
sh "docker login --username ${DOCKER_HUB_USR} --password ${DOCKER_HUB_PSW}"
sh 'PROFILE=java11 ci/verify.sh'
sh "ci/clean.sh"
}
}
}
}
}
stage("test: baseline (jdk12)") {
agent {
docker {
image 'adoptopenjdk/openjdk12:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
DOCKER_HUB = credentials('hub.docker.com-springbuildmaster')
}
steps {
sh 'rm -rf ?'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pjava11 clean dependency:list test -Dsort -U -B'
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
docker.image('adoptopenjdk/openjdk12:latest').inside('-u root -v /var/run/docker.sock:/var/run/docker.sock -v /usr/bin/docker:/usr/bin/docker -v $HOME:/tmp/jenkins-home') {
sh "docker login --username ${DOCKER_HUB_USR} --password ${DOCKER_HUB_PSW}"
sh 'PROFILE=java11 ci/verify.sh'
sh "ci/clean.sh"
}
}
}
}
}
}
@@ -76,16 +100,12 @@ pipeline {
stage('Release to artifactory') {
when {
anyOf {
branch 'master'
branch '4.0.x'
not { triggeredBy 'UpstreamCause' }
}
}
agent {
docker {
image 'adoptopenjdk/openjdk8:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
label 'data'
}
options { timeout(time: 20, unit: 'MINUTES') }
@@ -94,27 +114,28 @@ pipeline {
}
steps {
sh 'rm -rf ?'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.staging-repository=libs-snapshot-local " +
"-Dartifactory.build-name=spring-data-elasticsearch " +
"-Dartifactory.build-number=${BUILD_NUMBER} " +
'-Dmaven.test.skip=true clean deploy -U -B'
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
docker.image('adoptopenjdk/openjdk8:latest').inside('-v $HOME:/tmp/jenkins-home') {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory -Dmaven.repo.local=/tmp/jenkins-home/.m2/spring-data-elasticsearch-non-root ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.staging-repository=libs-snapshot-local " +
"-Dartifactory.build-name=spring-data-elasticsearch " +
"-Dartifactory.build-number=${BUILD_NUMBER} " +
'-Dmaven.test.skip=true clean deploy -U -B'
}
}
}
}
}
stage('Publish documentation') {
when {
branch 'master'
branch '4.0.x'
}
agent {
docker {
image 'adoptopenjdk/openjdk8:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
label 'data'
}
options { timeout(time: 20, unit: 'MINUTES') }
@@ -123,12 +144,18 @@ pipeline {
}
steps {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,distribute ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.distribution-repository=temp-private-local " +
'-Dmaven.test.skip=true clean deploy -U -B'
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
docker.image('adoptopenjdk/openjdk8:latest').inside('-v $HOME:/tmp/jenkins-home') {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,distribute -Dmaven.repo.local=/tmp/jenkins-home/.m2/spring-data-elasticsearch-non-root ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.distribution-repository=temp-private-local " +
'-Dmaven.test.skip=true clean deploy -U -B'
}
}
}
}
}
}

ci/clean.sh (new executable file; 6 lines)

@@ -0,0 +1,6 @@
#!/bin/bash -x
set -euo pipefail
MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" \
./mvnw clean -Dmaven.repo.local=/tmp/jenkins-home/.m2/spring-data-elasticsearch

ci/verify.sh (new executable file; 10 lines)

@@ -0,0 +1,10 @@
#!/bin/bash -x
set -euo pipefail
mkdir -p /tmp/jenkins-home/.m2/spring-data-elasticsearch
chown -R 1001:1001 .
MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" \
./mvnw \
-P${PROFILE} clean dependency:list verify -Dsort -U -B -Dmaven.repo.local=/tmp/jenkins-home/.m2/spring-data-elasticsearch

pom.xml (11 lines changed)

@@ -5,12 +5,12 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-elasticsearch</artifactId>
<version>4.0.0.RELEASE</version>
<version>4.0.9.RELEASE</version>
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.3.0.RELEASE</version>
<version>2.3.9.RELEASE</version>
</parent>
<name>Spring Data Elasticsearch</name>
@@ -21,7 +21,7 @@
<commonslang>2.6</commonslang>
<elasticsearch>7.6.2</elasticsearch>
<log4j>2.9.1</log4j>
<springdata.commons>2.3.0.RELEASE</springdata.commons>
<springdata.commons>2.3.9.RELEASE</springdata.commons>
<netty>4.1.39.Final</netty>
<java-module-name>spring.data.elasticsearch</java-module-name>
</properties>
@@ -383,6 +383,11 @@
<id>spring-plugins-release</id>
<url>https://repo.spring.io/plugins-release</url>
</pluginRepository>
<pluginRepository>
<id>bintray-plugins</id>
<name>bintray-plugins</name>
<url>https://jcenter.bintray.com</url>
</pluginRepository>
</pluginRepositories>
</project>


@@ -44,4 +44,5 @@ include::{spring-data-commons-docs}/repository-namespace-reference.adoc[]
include::{spring-data-commons-docs}/repository-populator-namespace-reference.adoc[]
include::{spring-data-commons-docs}/repository-query-keywords-reference.adoc[]
include::{spring-data-commons-docs}/repository-query-return-types-reference.adoc[]
include::reference/migration-guides.adoc[]
:leveloffset: -1


@@ -9,7 +9,6 @@ The Spring Data Elasticsearch project applies core Spring concepts to the develo
You will notice similarities to the Spring data solr and mongodb support in the Spring Framework.
include::reference/elasticsearch-new.adoc[leveloffset=+1]
include::reference/elasticsearch-migration-guide-3.2-4.0.adoc[leveloffset=+1]
[[preface.metadata]]
== Project Metadata


@@ -154,7 +154,7 @@ httpHeaders.add("some-header", "on every request") <1>
ClientConfiguration clientConfiguration = ClientConfiguration.builder()
.connectedTo("localhost:9200", "localhost:9291") <2>
.useSsl() <3>
.usingSsl() <3>
.withProxy("localhost:8888") <4>
.withPathPrefix("ela") <5>
.withConnectTimeout(Duration.ofSeconds(5)) <6>


@@ -1,16 +1,16 @@
[[elasticsearch-migration-guide-3.2-4.0]]
== Upgrading from 3.2.x to 4.0.x
= Upgrading from 3.2.x to 4.0.x
This section describes breaking changes from version 3.2.x to 4.0.x and how removed features can be replaced by new introduced features.
=== Removal of the used Jackson Mapper.
[[elasticsearch-migration-guide-3.2-4.0.jackson-removal]]
== Removal of the used Jackson Mapper
One of the changes in version 4.0.x is that Spring Data Elasticsearch does not use the Jackson Mapper anymore to map an entity to the JSON representation needed for Elasticsearch (see <<elasticsearch.mapping>>). In version 3.2.x the Jackson Mapper was the default that was used. It was possible to switch to the meta-model based converter (named `ElasticsearchEntityMapper`) by explicitly configuring it (<<elasticsearch.mapping.meta-model>>).
In version 4.0.x the meta-model based converter is the only one that is available and does not need to be configured explicitly. If you had a custom configuration to enable the meta-model converter by providing a bean like this:
[code,java]
[source,java]
----
@Bean
@Override
@@ -30,15 +30,15 @@ You now have to remove this bean, the `ElasticsearchEntityMapper` interface has
.Entity configuration
Some users had custom Jackson annotations on the entity class, for example in order to define a custom name for the mapped document in Elasticsearch or to configure date conversions. These are not taken into account anymore. The needed functionality is now provided with Spring Data Elasticsearch's `@Field` annotation. Please see <<elasticsearch.mapping.meta-model.annotations>> for detailed information.
=== Removal of implicit index name from query objects
[[elasticsearch-migration-guide-3.2-4.0.implicit-index-name]]
== Removal of implicit index name from query objects
In 3.2.x the different query classes like `IndexQuery` or `SearchQuery` had properties that were taking the index name or index names that they were operating upon. If these were not set, the passed in entity was inspected to retrieve the index name that was set in the `@Document` annotation. +
In 4.0.x the index name(s) must now be provided in an additional parameter of type `IndexCoordinates`. By separating this, it now is possible to use one query object against different indices.
So for example the following code:
[code,java]
[source,java]
----
IndexQuery indexQuery = new IndexQueryBuilder()
.withId(person.getId().toString())
@@ -50,7 +50,7 @@ String documentId = elasticsearchOperations.index(indexQuery);
must be changed to:
[code,java]
[source,java]
----
IndexCoordinates indexCoordinates = elasticsearchOperations.getIndexCoordinatesFor(person.getClass());
@@ -64,8 +64,8 @@ String documentId = elasticsearchOperations.index(indexQuery, indexCoordinates);
To make it easier to work with entities and use the index name that is contained in the entitie's `@Document` annotation, new methods have been added like `DocumentOperations.save(T entity)`;
=== The new Operations interfaces
[[elasticsearch-migration-guide-3.2-4.0.new-operations]]
== The new Operations interfaces
In version 3.2 there was the `ElasticsearchOperations` interface that defined all the methods for the `ElasticsearchTemplate` class. In version 4 the functions have been split into different interfaces, aligning these interfaces with the Elasticsearch API:
@@ -77,10 +77,10 @@ In version 3.2 there was the `ElasticsearchOperations` interface that defined al
NOTE: All the functions from the `ElasticsearchOperations` interface in version 3.2 that are now moved to the `IndexOperations` interface are still available, they are marked as deprecated and have default implementations that delegate to the new implementation:
[code,java]
[source,java]
----
/**
* Create an index for given indexName .
* Create an index for given indexName.
*
* @param indexName the name of the index
* @return {@literal true} if the index was created
@@ -92,17 +92,17 @@ default boolean createIndex(String indexName) {
}
----
[[elasticsearch-migration-guide-3.2-4.0.deprecations]]
== Deprecations
=== Deprecations
==== Methods and classes
=== Methods and classes
Many functions and classes have been deprecated. These functions still work, but the Javadocs show with what they should be replaced.
.Example from ElasticsearchOperations
[code,java]
[source,java]
----
/**
/*
* Retrieves an object from an index.
*
* @param query the query defining the id of the object to get
@@ -115,13 +115,14 @@ Many functions and classes have been deprecated. These functions still work, but
<T> T queryForObject(GetQuery query, Class<T> clazz);
----
==== Elasticsearch deprecations
=== Elasticsearch deprecations
Since version 7 the Elasticsearch `TransportClient` is deprecated, it will be removed with Elasticsearch version 8. Spring Data Elasticsearch deprecates the `ElasticsearchTemplate` class which uses the `TransportClient` in version 4.0.
Mapping types were removed from Elasticsearch 7, they still exist as deprecated values in the Spring Data `@Document` annotation and the `IndexCoordinates` class but they are not used anymore internally.
=== Removals
[[elasticsearch-migration-guide-3.2-4.0.removal]]
== Removals
* As already described, the `ElasticsearchEntityMapper` interface has been removed.
@@ -130,4 +131,3 @@ Mapping types were removed from Elasticsearch 7, they still exist as deprecated
* The method `org.springframework.data.elasticsearch.core.ElasticsearchOperations.query(SearchQuery query, ResultsExtractor<T> resultsExtractor);` and the `org.springframework.data.elasticsearch.core.ResultsExtractor` interface have been removed. These could be used to parse the result from Elasticsearch for cases in which the response mapping done with the Jackson based mapper was not enough. Since version 4.0, there are the new <<elasticsearch.operations.searchresulttypes>> to return the information from an Elasticsearch response, so there is no need to expose this low level functionality.
* The low level methods `startScroll`, `continueScroll` and `clearScroll` have been removed from the `ElasticsearchOperations` interface. For low level scroll API access, there now are `searchScrollStart`, `searchScrollContinue` and `searchScrollClear` methods on the `ElasticsearchRestTemplate` class.


@@ -35,8 +35,6 @@ IndexCoordinates index = IndexCoordinates.of("sample-index");
SearchQuery searchQuery = new NativeSearchQueryBuilder()
.withQuery(matchAllQuery())
.withIndices(INDEX_NAME)
.withTypes(TYPE_NAME)
.withFields("message")
.withPageable(PageRequest.of(0, 10))
.build();
@@ -62,8 +60,6 @@ IndexCoordinates index = IndexCoordinates.of("sample-index");
SearchQuery searchQuery = new NativeSearchQueryBuilder()
.withQuery(matchAllQuery())
.withIndices(INDEX_NAME)
.withTypes(TYPE_NAME)
.withFields("message")
.withPageable(PageRequest.of(0, 10))
.build();


@@ -43,11 +43,14 @@ The following annotations are available:
* `@Field`: Applied at the field level and defines properties of the field, most of the attributes map to the respective https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping.html[Elasticsearch Mapping] definitions (the following list is not complete, check the annotation Javadoc for a complete reference):
** `name`: The name of the field as it will be represented in the Elasticsearch document, if not set, the Java field name is used.
** `type`: the field type, can be one of _Text, Keyword, Long, Integer, Short, Byte, Double, Float, Half_Float, Scaled_Float, Date, Date_Nanos, Boolean, Binary, Integer_Range, Float_Range, Long_Range, Double_Range, Date_Range, Ip_Range, Object, Nested, Ip, TokenCount, Percolator, Flattened, Search_As_You_Type_. See https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-types.html[Elasticsearch Mapping Types]
** `format` and `pattern` custom definitions for the _Date_ type.
** `format` and `pattern` definitions for the _Date_ type. `format` must be defined for date types.
** `store`: Flag wether the original field value should be store in Elasticsearch, default value is _false_.
** `analyzer`, `searchAnalyzer`, `normalizer` for specifying custom custom analyzers and normalizer.
* `@GeoPoint`: marks a field as _geo_point_ datatype. Can be omitted if the field is an instance of the `GeoPoint` class.
NOTE: Properties that derive from `TemporalAccessor` must either have a `@Field` annotation of type `FieldType.Date` or a custom converter must be registerd for this type. +
If you are using a custom date format, you need to use _uuuu_ for the year instead of _yyyy_. This is due to a https://www.elastic.co/guide/en/elasticsearch/reference/current/migrate-to-java-time.html#java-time-migration-incompatible-date-formats[change in Elasticsearch 7].
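The `uuuu` versus `yyyy` distinction comes from `java.time`, which Elasticsearch 7 adopted: `uuuu` is the proleptic year, while `yyyy` is year-of-era and cannot be resolved in strict mode without an era field. A minimal JDK-only sketch (independent of Spring Data Elasticsearch and not part of this diff) illustrating the difference:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.time.format.ResolverStyle;

public class YearPatternCheck {
    public static void main(String[] args) {
        // 'uuuu' (proleptic year) parses fine under strict resolution.
        DateTimeFormatter uuuu = DateTimeFormatter.ofPattern("uuuu-MM-dd")
                .withResolverStyle(ResolverStyle.STRICT);
        LocalDate ok = LocalDate.parse("2020-06-10", uuuu);
        System.out.println(ok); // 2020-06-10

        // 'yyyy' (year-of-era) fails strict parsing because no era is supplied.
        DateTimeFormatter yyyy = DateTimeFormatter.ofPattern("yyyy-MM-dd")
                .withResolverStyle(ResolverStyle.STRICT);
        try {
            LocalDate.parse("2020-06-10", yyyy);
            System.out.println("unexpectedly parsed");
        } catch (DateTimeParseException e) {
            System.out.println("strict 'yyyy' parse rejected");
        }
    }
}
```

Elasticsearch applies strict resolution to custom formats, which is why patterns carried over from Joda-style `yyyy` break after the java-time migration.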
The mapping metadata infrastructure is defined in a separate spring-data-commons project that is technology agnostic.
[[elasticsearch.mapping.meta-model.rules]]


@@ -3,8 +3,58 @@
This chapter includes details of the Elasticsearch repository implementation.
.The sample `Book` entity
====
[source,java]
----
@Document(indexName="books")
class Book {
@Id
private String id;
@Field(type = FieldType.text)
private String name;
@Field(type = FieldType.text)
private String summary;
@Field(type = FieldType.Integer)
private Integer price;
// getter/setter ...
}
----
====
include::elasticsearch-repository-queries.adoc[leveloffset=+1]
include::reactive-elasticsearch-repositories.adoc[leveloffset=+1]
[[elasticsearch.repositories.annotations]]
== Annotations for repository methods
=== @Highlight
The `@Highlight` annotation on a repository method defines for which fields of the returned entity highlighting should be included. To search for some text in a `Book` 's name or summary and have the found data highlighted, the following repository method can be used:
====
[source,java]
----
interface BookRepository extends Repository<Book, String> {
@Highlight(fields = {
@HighlightField(name = "name"),
@HighlightField(name = "summary")
})
List<SearchHit<Book>> findByNameOrSummary(String text, String summary);
}
----
====
It is possible to define multiple fields to be highlighted, as shown above, and both the `@Highlight` and the `@HighlightField` annotations can be further customized with a `@HighlightParameters` annotation. Check the Javadocs for the possible configuration options.
In the search results the highlight data can be retrieved from the `SearchHit` class.
[[elasticsearch.annotation]]
== Annotation based configuration
@ -40,7 +90,8 @@ class ProductService {
}
----
<1> The `EnableElasticsearchRepositories` annotation activates the Repository support.
If no base package is configured, it will use the package of the configuration class it is placed on.
<2> Provide a Bean named `elasticsearchTemplate` of type `ElasticsearchOperations` by using one of the configurations shown in the <<elasticsearch.operations>> chapter.
<3> Let Spring inject the Repository bean into your class.
====
@ -145,5 +196,3 @@ Using the `Transport Client` or `Rest Client` element registers an instance of `
</beans>
----
====
include::reactive-elasticsearch-repositories.adoc[leveloffset=+1]


@ -48,7 +48,9 @@ A list of supported keywords for Elasticsearch is shown below.
|===
| Keyword
| Sample
| Elasticsearch Query String
| `And`
| `findByNameAndPrice`
| `{ "query" : {
"bool" : {
@ -201,7 +203,7 @@ A list of supported keywords for Elasticsearch is shown below.
}
}}`
| `In` (when annotated as FieldType.Keyword)
| `findByNameIn(Collection<String>names)`
| `{ "query" : {
"bool" : {
@ -215,7 +217,12 @@ A list of supported keywords for Elasticsearch is shown below.
}
}}`
| `In`
| `findByNameIn(Collection<String>names)`
| `{ "query": {"bool": {"must": [{"query_string":{"query": "\"?\" \"?\"", "fields": ["name"]}}]}}}`
| `NotIn` (when annotated as FieldType.Keyword)
| `findByNameNotIn(Collection<String>names)`
| `{ "query" : {
"bool" : {
@ -229,6 +236,10 @@ A list of supported keywords for Elasticsearch is shown below.
}
}}`
| `NotIn`
| `findByNameNotIn(Collection<String>names)`
| `{"query": {"bool": {"must": [{"query_string": {"query": "NOT(\"?\" \"?\")", "fields": ["name"]}}]}}}`
| `Near`
| `findByStoreNear`
| `Not Supported Yet !`


@ -0,0 +1,6 @@
[[elasticsearch.migration]]
= Appendix E: Migration Guides
:leveloffset: +1
include::elasticsearch-migration-guide-3.2-4.0.adoc[]
:leveloffset: -1


@ -0,0 +1,37 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.elasticsearch;
import org.springframework.dao.DataRetrievalFailureException;
import java.util.Map;
/**
* @author Peter-Josef Meisch
* @since 4.0.1 (ported back from master (4.1) branch)
*/
public class BulkFailureException extends DataRetrievalFailureException {
private final Map<String, String> failedDocuments;
public BulkFailureException(String msg, Map<String, String> failedDocuments) {
super(msg);
this.failedDocuments = failedDocuments;
}
public Map<String, String> getFailedDocuments() {
return failedDocuments;
}
}


@ -22,6 +22,7 @@ import org.springframework.dao.UncategorizedDataAccessException;
* @since 4.0
*/
public class UncategorizedElasticsearchException extends UncategorizedDataAccessException {
public UncategorizedElasticsearchException(String msg, Throwable cause) {
super(msg, cause);
}


@ -29,7 +29,7 @@ import java.lang.annotation.Target;
* @author Aleksei Arsenev
*/
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.ANNOTATION_TYPE)
public @interface InnerField {
String suffix();


@ -20,5 +20,5 @@ package org.springframework.data.elasticsearch.annotations;
* @since 4.0
*/
public enum TermVector {
none, no, yes, with_positions, with_offsets, with_positions_offsets, with_positions_payloads, with_positions_offsets_payloads
}


@ -134,7 +134,7 @@ import org.springframework.web.reactive.function.client.WebClient.RequestBodySpe
*/
public class DefaultReactiveElasticsearchClient implements ReactiveElasticsearchClient, Indices {
private final HostProvider<?> hostProvider;
private final RequestCreator requestCreator;
private Supplier<HttpHeaders> headersSupplier = () -> HttpHeaders.EMPTY;
@ -144,7 +144,7 @@ public class DefaultReactiveElasticsearchClient implements ReactiveElasticsearch
*
* @param hostProvider must not be {@literal null}.
*/
public DefaultReactiveElasticsearchClient(HostProvider<?> hostProvider) {
this(hostProvider, new DefaultRequestCreator());
}
@ -155,7 +155,7 @@ public class DefaultReactiveElasticsearchClient implements ReactiveElasticsearch
* @param hostProvider must not be {@literal null}.
* @param requestCreator must not be {@literal null}.
*/
public DefaultReactiveElasticsearchClient(HostProvider<?> hostProvider, RequestCreator requestCreator) {
Assert.notNull(hostProvider, "HostProvider must not be null");
Assert.notNull(requestCreator, "RequestCreator must not be null");
@ -639,8 +639,7 @@ public class DefaultReactiveElasticsearchClient implements ReactiveElasticsearch
.flatMap(callback::doWithClient) //
.onErrorResume(throwable -> {
if (isCausedByConnectionException(throwable)) {
return hostProvider.getActive(Verification.ACTIVE) //
.flatMap(callback::doWithClient);
}
@ -649,6 +648,27 @@ public class DefaultReactiveElasticsearchClient implements ReactiveElasticsearch
});
}
/**
* checks if the given throwable is a {@link ConnectException} or has one in its cause chain
*
* @param throwable the throwable to check
* @return true if throwable is caused by a {@link ConnectException}
*/
private boolean isCausedByConnectionException(Throwable throwable) {
Throwable t = throwable;
do {
if (t instanceof ConnectException) {
return true;
}
t = t.getCause();
} while (t != null);
return false;
}
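As a quick check of the cause-chain walk above, the same loop can be exercised with plain JDK exceptions; this standalone demo re-declares the helper so it runs without the surrounding class:

```java
import java.net.ConnectException;

public class CauseChainDemo {

    // standalone copy of the cause-chain walk shown above
    static boolean isCausedByConnectionException(Throwable throwable) {
        Throwable t = throwable;
        do {
            if (t instanceof ConnectException) {
                return true;
            }
            t = t.getCause();
        } while (t != null);
        return false;
    }

    public static void main(String[] args) {
        Throwable wrapped = new RuntimeException("request failed",
                new ConnectException("Connection refused"));
        System.out.println(isCausedByConnectionException(wrapped)); // true
        System.out.println(isCausedByConnectionException(new RuntimeException("boom"))); // false
    }
}
```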
@Override
public Mono<Status> status() {
@ -804,53 +824,85 @@ public class DefaultReactiveElasticsearchClient implements ReactiveElasticsearch
private <T> Publisher<? extends T> handleServerError(Request request, ClientResponse response) {
int statusCode = response.statusCode().value();
RestStatus status = RestStatus.fromCode(statusCode);
String mediaType = response.headers().contentType().map(MediaType::toString).orElse(XContentType.JSON.mediaType());
return response.body(BodyExtractors.toMono(byte[].class)) //
.switchIfEmpty(Mono.error(
new ElasticsearchStatusException(String.format("%s request to %s returned error code %s and no body.",
request.getMethod(), request.getEndpoint(), statusCode), status)))
.map(bytes -> new String(bytes, StandardCharsets.UTF_8)) //
.flatMap(content -> contentOrError(content, mediaType, status))
.flatMap(unused -> Mono
.error(new ElasticsearchStatusException(String.format("%s request to %s returned error code %s.",
request.getMethod(), request.getEndpoint(), statusCode), status)));
}
private <T> Publisher<? extends T> handleClientError(String logId, Request request, ClientResponse response,
Class<T> responseType) {
int statusCode = response.statusCode().value();
RestStatus status = RestStatus.fromCode(statusCode);
String mediaType = response.headers().contentType().map(MediaType::toString).orElse(XContentType.JSON.mediaType());
return response.body(BodyExtractors.toMono(byte[].class)) //
.map(bytes -> new String(bytes, StandardCharsets.UTF_8)) //
.flatMap(content -> contentOrError(content, mediaType, status)) //
.doOnNext(content -> ClientLogger.logResponse(logId, response.statusCode(), content)) //
.flatMap(content -> doDecode(response, responseType, content));
}
// region ElasticsearchException helper
/**
* checks if the given content body contains an {@link ElasticsearchException}; if so, it is returned in a Mono.error.
* Otherwise the content is returned in the Mono.
*
* @param content the content to analyze
* @param mediaType the returned media type
* @param status the response status
* @return a Mono with the content or a Mono.error
*/
private static Mono<String> contentOrError(String content, String mediaType, RestStatus status) {
ElasticsearchException exception = getElasticsearchException(content, mediaType, status);
if (exception != null) {
StringBuilder sb = new StringBuilder();
buildExceptionMessages(sb, exception);
return Mono.error(new ElasticsearchStatusException(sb.toString(), status, exception));
}
return Mono.just(content);
}
/**
* tries to parse an {@link ElasticsearchException} from the given body content
*
* @param content the content to analyze
* @param mediaType the type of the body content
* @param status the response status used when the content cannot be parsed
* @return an {@link ElasticsearchException} or {@literal null}.
*/
@Nullable
private static ElasticsearchException getElasticsearchException(String content, String mediaType, RestStatus status) {
try {
XContentParser parser = createParser(mediaType, content);
// we have a JSON object with an error and a status field
XContentParser.Token token = parser.nextToken(); // Skip START_OBJECT
do {
token = parser.nextToken();
if (parser.currentName().equals("error")) {
return ElasticsearchException.failureFromXContent(parser);
}
} while (token == XContentParser.Token.FIELD_NAME);
return null;
} catch (IOException e) {
return new ElasticsearchStatusException(content, status);
}
}
private static void buildExceptionMessages(StringBuilder sb, Throwable t) {


@ -1,5 +1,5 @@
/*
* Copyright 2018-2020 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@ -27,6 +27,7 @@ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.web.reactive.function.client.WebClient;
import org.springframework.web.reactive.function.client.WebClient.Builder;
import org.springframework.web.util.DefaultUriBuilderFactory;
/**
* Default {@link WebClientProvider} that uses cached {@link WebClient} instances per {@code hostAndPort}.
@ -156,7 +157,16 @@ class DefaultWebClientProvider implements WebClientProvider {
String baseUrl = String.format("%s://%s:%d%s", this.scheme, socketAddress.getHostString(), socketAddress.getPort(),
pathPrefix == null ? "" : '/' + pathPrefix);
DefaultUriBuilderFactory uriBuilderFactory = new DefaultUriBuilderFactory(baseUrl);
// the template will already be encoded by the RequestConverters methods
uriBuilderFactory.setEncodingMode(DefaultUriBuilderFactory.EncodingMode.VALUES_ONLY);
builder.uriBuilderFactory(uriBuilderFactory); //
WebClient webClient = builder //
.filter((request, next) -> next.exchange(request) //
.doOnError(errorListener)) //
.build(); //
return webClientConfigurer.apply(webClient);
}
}


@ -1,5 +1,5 @@
/*
* Copyright 2018-2020 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@ -34,9 +34,10 @@ import org.springframework.web.reactive.function.client.WebClient;
*
* @author Christoph Strobl
* @author Mark Paluch
* @author Peter-Josef Meisch
* @since 3.2
*/
public interface HostProvider<T extends HostProvider<T>> {
/**
* Create a new {@link HostProvider} best suited for the given {@link WebClientProvider} and number of hosts.
@ -46,7 +47,7 @@ public interface HostProvider {
* @param endpoints must not be {@literal null} nor empty.
* @return new instance of {@link HostProvider}.
*/
static HostProvider<?> provider(WebClientProvider clientProvider, Supplier<HttpHeaders> headersSupplier,
InetSocketAddress... endpoints) {
Assert.notNull(clientProvider, "WebClientProvider must not be null");


@ -1,5 +1,5 @@
/*
* Copyright 2018-2020 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@ -21,6 +21,7 @@ import reactor.core.publisher.Mono;
import reactor.util.function.Tuple2;
import java.net.InetSocketAddress;
import java.time.Duration;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
@ -30,6 +31,8 @@ import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.elasticsearch.client.ElasticsearchHost;
import org.springframework.data.elasticsearch.client.ElasticsearchHost.State;
import org.springframework.data.elasticsearch.client.NoReachableHostException;
@ -42,15 +45,19 @@ import org.springframework.web.reactive.function.client.WebClient;
*
* @author Christoph Strobl
* @author Mark Paluch
* @author Peter-Josef Meisch
* @since 3.2
*/
class MultiNodeHostProvider implements HostProvider<MultiNodeHostProvider> {
private final static Logger LOG = LoggerFactory.getLogger(MultiNodeHostProvider.class);
private final WebClientProvider clientProvider;
private final Supplier<HttpHeaders> headersSupplier;
private final Map<InetSocketAddress, ElasticsearchHost> hosts;
MultiNodeHostProvider(WebClientProvider clientProvider, Supplier<HttpHeaders> headersSupplier,
InetSocketAddress... endpoints) {
this.clientProvider = clientProvider;
this.headersSupplier = headersSupplier;
@ -58,6 +65,8 @@ class MultiNodeHostProvider implements HostProvider {
for (InetSocketAddress endpoint : endpoints) {
this.hosts.put(endpoint, new ElasticsearchHost(endpoint, State.UNKNOWN));
}
LOG.debug("initialized with " + hosts);
}
/*
@ -66,7 +75,7 @@ class MultiNodeHostProvider implements HostProvider {
*/
@Override
public Mono<ClusterInformation> clusterInfo() {
return checkNodes(null).map(this::updateNodeState).buffer(hosts.size())
.then(Mono.just(new ClusterInformation(new LinkedHashSet<>(this.hosts.values()))));
}
@ -86,14 +95,19 @@ class MultiNodeHostProvider implements HostProvider {
@Override
public Mono<InetSocketAddress> lookupActiveHost(Verification verification) {
LOG.trace("lookupActiveHost " + verification + " from " + hosts());
if (Verification.LAZY.equals(verification)) {
for (ElasticsearchHost entry : hosts()) {
if (entry.isOnline()) {
LOG.trace("lookupActiveHost returning " + entry);
return Mono.just(entry.getEndpoint());
}
}
LOG.trace("no online host found with LAZY");
}
LOG.trace("searching for active host");
return findActiveHostInKnownActives() //
.switchIfEmpty(findActiveHostInUnresolved()) //
.switchIfEmpty(findActiveHostInDead()) //
@ -105,20 +119,30 @@ class MultiNodeHostProvider implements HostProvider {
}
private Mono<InetSocketAddress> findActiveHostInKnownActives() {
return findActiveForState(State.ONLINE);
}
private Mono<InetSocketAddress> findActiveHostInUnresolved() {
return findActiveForState(State.UNKNOWN);
}
private Mono<InetSocketAddress> findActiveHostInDead() {
return findActiveForState(State.OFFLINE);
}
private Mono<InetSocketAddress> findActiveForState(State state) {
LOG.trace("findActiveForState state " + state + ", current hosts: " + hosts);
return checkNodes(state) //
.map(this::updateNodeState) //
.filter(ElasticsearchHost::isOnline) //
.map(elasticsearchHost -> {
LOG.trace("findActiveForState returning host " + elasticsearchHost);
return elasticsearchHost;
}).map(ElasticsearchHost::getEndpoint) //
.takeLast(1) //
.next();
}
private ElasticsearchHost updateNodeState(Tuple2<InetSocketAddress, ClientResponse> tuple2) {
@ -129,17 +153,19 @@ class MultiNodeHostProvider implements HostProvider {
return elasticsearchHost;
}
private Flux<Tuple2<InetSocketAddress, ClientResponse>> checkNodes(@Nullable State state) {
return Flux.fromIterable(hosts()) //
.filter(entry -> state == null || entry.getState().equals(state)) //
.map(ElasticsearchHost::getEndpoint) //
.concatMap(host -> {
Mono<ClientResponse> exchange = createWebClient(host) //
.head().uri("/") //
.headers(httpHeaders -> httpHeaders.addAll(headersSupplier.get())) //
.exchange() //
.timeout(Duration.ofSeconds(1)) //
.doOnError(throwable -> {
hosts.put(host, new ElasticsearchHost(host, State.OFFLINE));
clientProvider.getErrorListener().accept(throwable);
});


@ -1,5 +1,5 @@
/*
* Copyright 2018-2020 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@ -32,9 +32,10 @@ import org.springframework.web.reactive.function.client.WebClient;
*
* @author Christoph Strobl
* @author Mark Paluch
* @author Peter-Josef Meisch
* @since 3.2
*/
class SingleNodeHostProvider implements HostProvider<SingleNodeHostProvider> {
private final WebClientProvider clientProvider;
private final Supplier<HttpHeaders> headersSupplier;


@ -36,7 +36,7 @@ import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.elasticsearch.BulkFailureException;
import org.springframework.data.elasticsearch.core.convert.ElasticsearchConverter;
import org.springframework.data.elasticsearch.core.convert.MappingElasticsearchConverter;
import org.springframework.data.elasticsearch.core.document.Document;
@ -258,7 +258,11 @@ public abstract class AbstractElasticsearchTemplate implements ElasticsearchOper
long scrollTimeInMillis = TimeValue.timeValueMinutes(1).millis();
// noinspection ConstantConditions
int maxCount = query.isLimiting() ? query.getMaxResults() : 0;
return StreamQueries.streamResults( //
maxCount, //
searchScrollStart(scrollTimeInMillis, query, clazz, index), //
scrollId -> searchScrollContinue(scrollId, scrollTimeInMillis, clazz, index), //
this::searchScrollClear);
@ -401,7 +405,7 @@ public abstract class AbstractElasticsearchTemplate implements ElasticsearchOper
if (item.isFailed())
failedDocuments.put(item.getId(), item.getFailureMessage());
}
throw new BulkFailureException(
"Bulk operation has failures. Use ElasticsearchException.getFailedDocuments() for detailed messages ["
+ failedDocuments + ']',
failedDocuments);


@ -29,7 +29,9 @@ import org.apache.lucene.queryparser.flexible.core.util.StringUtils;
import org.apache.lucene.queryparser.flexible.standard.QueryParserUtil;
import org.elasticsearch.index.query.BoolQueryBuilder;
import org.elasticsearch.index.query.QueryBuilder;
import org.springframework.data.elasticsearch.annotations.FieldType;
import org.springframework.data.elasticsearch.core.query.Criteria;
import org.springframework.data.elasticsearch.core.query.Field;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
@ -118,18 +120,19 @@ class CriteriaQueryProcessor {
Iterator<Criteria.CriteriaEntry> it = chainedCriteria.getQueryCriteriaEntries().iterator();
boolean singeEntryCriteria = (chainedCriteria.getQueryCriteriaEntries().size() == 1);
String fieldName = chainedCriteria.getField().getName();
Field field = chainedCriteria.getField();
String fieldName = field.getName();
Assert.notNull(fieldName, "Unknown field");
QueryBuilder query = null;
if (singeEntryCriteria) {
Criteria.CriteriaEntry entry = it.next();
query = processCriteriaEntry(entry, fieldName);
query = processCriteriaEntry(entry, field);
} else {
query = boolQuery();
while (it.hasNext()) {
Criteria.CriteriaEntry entry = it.next();
((BoolQueryBuilder) query).must(processCriteriaEntry(entry, fieldName));
((BoolQueryBuilder) query).must(processCriteriaEntry(entry, field));
}
}
@ -138,7 +141,11 @@ class CriteriaQueryProcessor {
}
@Nullable
private QueryBuilder processCriteriaEntry(Criteria.CriteriaEntry entry, String fieldName) {
private QueryBuilder processCriteriaEntry(Criteria.CriteriaEntry entry, Field field) {
String fieldName = field.getName();
boolean isKeywordField = FieldType.Keyword == field.getFieldType();
OperationKey key = entry.getKey();
Object value = entry.getValue();
@ -191,10 +198,24 @@ class CriteriaQueryProcessor {
query = fuzzyQuery(fieldName, searchText);
break;
case IN:
query = boolQuery().must(termsQuery(fieldName, toStringList((Iterable<Object>) value)));
if (value instanceof Iterable) {
Iterable<?> iterable = (Iterable<?>) value;
if (isKeywordField) {
query = boolQuery().must(termsQuery(fieldName, toStringList(iterable)));
} else {
query = queryStringQuery(orQueryString(iterable)).field(fieldName);
}
}
break;
case NOT_IN:
query = boolQuery().mustNot(termsQuery(fieldName, toStringList((Iterable<Object>) value)));
if (value instanceof Iterable) {
Iterable<?> iterable = (Iterable<?>) value;
if (isKeywordField) {
query = boolQuery().mustNot(termsQuery(fieldName, toStringList(iterable)));
} else {
query = queryStringQuery("NOT(" + orQueryString(iterable) + ')').field(fieldName);
}
}
break;
}
return query;
@ -208,6 +229,25 @@ class CriteriaQueryProcessor {
return list;
}
private static String orQueryString(Iterable<?> iterable) {
StringBuilder sb = new StringBuilder();
for (Object item : iterable) {
if (item != null) {
if (sb.length() > 0) {
sb.append(' ');
}
sb.append('"');
sb.append(QueryParserUtil.escape(item.toString()));
sb.append('"');
}
}
return sb.toString();
}
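The quoting performed by `orQueryString` above (each value wrapped in double quotes and joined by spaces, so the query_string query treats them as alternatives) can be sketched with plain JDK strings. This is a simplified sketch only: the escaping step via Lucene's `QueryParserUtil.escape` is omitted here.

```java
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

public class OrQueryStringDemo {

    // simplified sketch: quotes each term; the real code also escapes via QueryParserUtil.escape
    static String orQueryString(List<String> terms) {
        return terms.stream()
                .filter(Objects::nonNull)
                .map(t -> '"' + t + '"')
                .collect(Collectors.joining(" "));
    }

    public static void main(String[] args) {
        System.out.println(orQueryString(List.of("Java", "Spring Data")));
        // "Java" "Spring Data"
    }
}
```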
private void addBoost(QueryBuilder query, float boost) {
if (Float.isNaN(boost)) {
return;


@ -40,6 +40,8 @@ import org.elasticsearch.index.reindex.DeleteByQueryRequest;
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.elasticsearch.search.fetch.subphase.FetchSourceContext;
import org.elasticsearch.search.suggest.SuggestBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.elasticsearch.core.convert.ElasticsearchConverter;
import org.springframework.data.elasticsearch.core.document.DocumentAdapters;
import org.springframework.data.elasticsearch.core.document.SearchDocumentResponse;
@ -88,6 +90,8 @@ import org.springframework.util.Assert;
*/
public class ElasticsearchRestTemplate extends AbstractElasticsearchTemplate {
private static final Logger LOGGER = LoggerFactory.getLogger(ElasticsearchRestTemplate.class);
private RestHighLevelClient client;
private ElasticsearchExceptionTranslator exceptionTranslator;
@ -206,7 +210,7 @@ public class ElasticsearchRestTemplate extends AbstractElasticsearchTemplate {
Assert.notNull(id, "id must not be null");
Assert.notNull(index, "index must not be null");
DeleteRequest request = requestFactory.deleteRequest(elasticsearchConverter.convertId(id), index);
return execute(client -> client.delete(request, RequestOptions.DEFAULT).getId());
}
@ -300,9 +304,13 @@ public class ElasticsearchRestTemplate extends AbstractElasticsearchTemplate {
@Override
public void searchScrollClear(List<String> scrollIds) {
try {
ClearScrollRequest request = new ClearScrollRequest();
request.scrollIds(scrollIds);
execute(client -> client.clearScroll(request, RequestOptions.DEFAULT));
} catch (Exception e) {
LOGGER.warn("Could not clear scroll: {}", e.getMessage());
}
}
@Override


@ -86,6 +86,7 @@ import org.springframework.util.Assert;
public class ElasticsearchTemplate extends AbstractElasticsearchTemplate {
private static final Logger QUERY_LOGGER = LoggerFactory
.getLogger("org.springframework.data.elasticsearch.core.QUERY");
private static final Logger LOGGER = LoggerFactory.getLogger(ElasticsearchTemplate.class);
private Client client;
@Nullable private String searchTimeout;
@ -322,7 +323,11 @@ public class ElasticsearchTemplate extends AbstractElasticsearchTemplate {
@Override
public void searchScrollClear(List<String> scrollIds) {
try {
client.prepareClearScroll().setScrollIds(scrollIds).execute().actionGet();
} catch (Exception e) {
LOGGER.warn("Could not clear scroll: {}", e.getMessage());
}
}
@Override


@ -15,15 +15,11 @@
*/
package org.springframework.data.elasticsearch.core;
import static org.elasticsearch.index.VersionType.*;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import reactor.util.function.Tuple2;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
@ -41,19 +37,10 @@ import org.elasticsearch.action.search.SearchRequest;
import org.elasticsearch.action.support.IndicesOptions;
import org.elasticsearch.action.support.WriteRequest;
import org.elasticsearch.action.support.WriteRequest.RefreshPolicy;
import org.elasticsearch.client.Requests;
import org.elasticsearch.client.core.CountRequest;
import org.elasticsearch.index.get.GetResult;
import org.elasticsearch.index.query.QueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.index.query.WrapperQueryBuilder;
import org.elasticsearch.index.reindex.BulkByScrollResponse;
import org.elasticsearch.index.reindex.DeleteByQueryRequest;
import org.elasticsearch.search.aggregations.Aggregation;
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.elasticsearch.search.sort.FieldSortBuilder;
import org.elasticsearch.search.sort.SortBuilders;
import org.elasticsearch.search.sort.SortOrder;
import org.reactivestreams.Publisher;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@ -61,9 +48,9 @@ import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.domain.Sort;
import org.springframework.data.elasticsearch.BulkFailureException;
import org.springframework.data.elasticsearch.NoSuchIndexException;
import org.springframework.data.elasticsearch.UncategorizedElasticsearchException;
import org.springframework.data.elasticsearch.client.reactive.ReactiveElasticsearchClient;
import org.springframework.data.elasticsearch.core.EntityOperations.AdaptibleEntity;
import org.springframework.data.elasticsearch.core.EntityOperations.Entity;
@ -82,10 +69,8 @@ import org.springframework.data.elasticsearch.core.mapping.SimpleElasticsearchMa
import org.springframework.data.elasticsearch.core.query.BulkOptions;
import org.springframework.data.elasticsearch.core.query.CriteriaQuery;
import org.springframework.data.elasticsearch.core.query.IndexQuery;
import org.springframework.data.elasticsearch.core.query.NativeSearchQuery;
import org.springframework.data.elasticsearch.core.query.Query;
import org.springframework.data.elasticsearch.core.query.SeqNoPrimaryTerm;
import org.springframework.data.elasticsearch.core.query.StringQuery;
import org.springframework.data.elasticsearch.core.query.UpdateQuery;
import org.springframework.data.elasticsearch.support.VersionInfo;
import org.springframework.data.mapping.callback.ReactiveEntityCallbacks;
@ -194,6 +179,7 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
T savedEntity = it.getT1();
IndexResponse indexResponse = it.getT2();
AdaptibleEntity<T> adaptableEntity = operations.forEntity(savedEntity, converter.getConversionService());
// noinspection ReactiveStreamsNullableInLambdaInTransform
return adaptableEntity.populateIdIfNecessary(indexResponse.getId());
}).flatMap(saved -> maybeCallAfterSave(saved, index));
}
@ -268,7 +254,8 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
protected Flux<BulkItemResponse> doBulkOperation(List<?> queries, BulkOptions bulkOptions, IndexCoordinates index) {
BulkRequest bulkRequest = prepareWriteRequest(requestFactory.bulkRequest(queries, bulkOptions, index));
return client.bulk(bulkRequest) //
.onErrorMap(
e -> new UncategorizedElasticsearchException("Error while bulk for request: " + bulkRequest.toString(), e)) //
.flatMap(this::checkForBulkOperationFailure) //
.flatMapMany(response -> Flux.fromArray(response.getItems()));
}
@@ -283,7 +270,7 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
failedDocuments.put(item.getId(), item.getFailureMessage());
}
}
ElasticsearchException exception = new ElasticsearchException(
BulkFailureException exception = new BulkFailureException(
"Bulk operation has failures. Use ElasticsearchException.getFailedDocuments() for detailed messages ["
+ failedDocuments + ']',
failedDocuments);
@@ -315,9 +302,8 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
return doExists(id, index);
}
private Mono<Boolean> doExists(String id, @Nullable IndexCoordinates index) {
return Mono.defer(() -> doExists(new GetRequest(index.getIndexName(), id)));
private Mono<Boolean> doExists(String id, IndexCoordinates index) {
return Mono.defer(() -> doExists(requestFactory.getRequest(id, index)));
}
/**
@@ -334,27 +320,30 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
private <T> Mono<Tuple2<T, IndexResponse>> doIndex(T entity, IndexCoordinates index) {
AdaptibleEntity<?> adaptibleEntity = operations.forEntity(entity, converter.getConversionService());
IndexRequest request = getIndexRequest(entity, adaptibleEntity, index);
IndexRequest request = requestFactory.indexRequest(getIndexQuery(entity), index);
request = prepareIndexRequest(entity, request);
return Mono.just(entity).zipWith(doIndex(request));
}
private IndexRequest getIndexRequest(Object value, AdaptibleEntity<?> entity, IndexCoordinates index) {
private IndexQuery getIndexQuery(Object value) {
AdaptibleEntity<?> entity = operations.forEntity(value, converter.getConversionService());
Object id = entity.getId();
IndexQuery query = new IndexQuery();
IndexRequest request = id != null ? new IndexRequest(index.getIndexName()).id(converter.convertId(id))
: new IndexRequest(index.getIndexName());
request.source(converter.mapObject(value).toJson(), Requests.INDEX_CONTENT_TYPE);
if (id != null) {
query.setId(id.toString());
}
query.setObject(value);
boolean usingSeqNo = false;
if (entity.hasSeqNoPrimaryTerm()) {
SeqNoPrimaryTerm seqNoPrimaryTerm = entity.getSeqNoPrimaryTerm();
if (seqNoPrimaryTerm != null) {
request.setIfSeqNo(seqNoPrimaryTerm.getSequenceNumber());
request.setIfPrimaryTerm(seqNoPrimaryTerm.getPrimaryTerm());
query.setSeqNo(seqNoPrimaryTerm.getSequenceNumber());
query.setPrimaryTerm(seqNoPrimaryTerm.getPrimaryTerm());
usingSeqNo = true;
}
}
@@ -364,32 +353,11 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
Number version = entity.getVersion();
if (version != null) {
request.version(version.longValue());
request.versionType(EXTERNAL);
}
}
return request;
}
private IndexQuery getIndexQuery(Object value) {
AdaptibleEntity<?> entity = operations.forEntity(value, converter.getConversionService());
Object id = entity.getId();
IndexQuery query = new IndexQuery();
if (id != null) {
query.setId(id.toString());
}
query.setObject(value);
if (entity.isVersionedEntity()) {
Number version = entity.getVersion();
if (version != null) {
query.setVersion(version.longValue());
}
}
return query;
}
@@ -410,9 +378,7 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
}
private Mono<GetResult> doGet(String id, ElasticsearchPersistentEntity<?> entity, IndexCoordinates index) {
return Mono.defer(() -> {
return doGet(new GetRequest(index.getIndexName(), id));
});
return Mono.defer(() -> doGet(requestFactory.getRequest(id, index)));
}
/**
@@ -465,8 +431,8 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
private Mono<String> doDeleteById(String id, IndexCoordinates index) {
return Mono.defer(() -> {
return doDelete(prepareDeleteRequest(new DeleteRequest(index.getIndexName(), id)));
DeleteRequest request = requestFactory.deleteRequest(id, index);
return doDelete(prepareDeleteRequest(request));
});
}
@@ -479,8 +445,7 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
Assert.notNull(query, "Query must not be null!");
return doDeleteBy(query, getPersistentEntityFor(entityType), index).map(BulkByScrollResponse::getDeleted)
.publishNext();
return doDeleteBy(query, entityType, index).map(BulkByScrollResponse::getDeleted).publishNext();
}
@Override
@@ -488,13 +453,10 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
return delete(query, entityType, getIndexCoordinatesFor(entityType));
}
private Flux<BulkByScrollResponse> doDeleteBy(Query query, ElasticsearchPersistentEntity<?> entity,
IndexCoordinates index) {
private Flux<BulkByScrollResponse> doDeleteBy(Query query, Class<?> entityType, IndexCoordinates index) {
return Flux.defer(() -> {
DeleteByQueryRequest request = new DeleteByQueryRequest(index.getIndexNames());
request.setQuery(mappedQuery(query, entity));
DeleteByQueryRequest request = requestFactory.deleteByQueryRequest(query, entityType, index);
return doDeleteBy(prepareDeleteByRequest(request));
});
}
@@ -552,8 +514,13 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
*/
protected DeleteByQueryRequest prepareDeleteByRequest(DeleteByQueryRequest request) {
if (refreshPolicy != null && !RefreshPolicy.NONE.equals(refreshPolicy)) {
request = request.setRefresh(true);
if (refreshPolicy != null) {
if (RefreshPolicy.NONE.equals(refreshPolicy)) {
request = request.setRefresh(false);
} else {
request = request.setRefresh(true);
}
}
if (indicesOptions != null) {
@@ -661,43 +628,6 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
});
}
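As a side note on the `prepareDeleteByRequest` hunk above: the refresh handling changed from "set refresh=true unless the policy is NONE or unset" to a three-way decision that also writes an explicit refresh=false when the policy is NONE. A minimal stand-alone sketch of that decision, with a hypothetical `RefreshPolicy` enum standing in for the Spring Data type and an `Optional` standing in for "leave the request at its default":

```java
import java.util.Optional;

/** Sketch of the refresh decision in prepareDeleteByRequest (types are stand-ins). */
public class RefreshDecision {

    enum RefreshPolicy { NONE, IMMEDIATE, WAIT_UNTIL }

    /**
     * Returns the value to pass to setRefresh(boolean), or Optional.empty()
     * when no policy is configured and the request should be left untouched.
     */
    static Optional<Boolean> refreshFlag(RefreshPolicy policy) {
        if (policy == null) {
            return Optional.empty();                 // no policy: keep the request default
        }
        // NONE now maps to an explicit refresh=false instead of "not set"
        return Optional.of(policy != RefreshPolicy.NONE);
    }
}
```

The practical difference is only visible when a request default differs from `false`; the new code makes the configured policy authoritative in all cases.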
private CountRequest buildCountRequest(Query query, ElasticsearchPersistentEntity<?> entity, IndexCoordinates index) {
CountRequest request = new CountRequest(index.getIndexNames());
SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
searchSourceBuilder.query(mappedQuery(query, entity));
searchSourceBuilder.trackScores(query.getTrackScores());
QueryBuilder postFilterQuery = mappedFilterQuery(query, entity);
if (postFilterQuery != null) {
searchSourceBuilder.postFilter(postFilterQuery);
}
if (query.getSourceFilter() != null) {
searchSourceBuilder.fetchSource(query.getSourceFilter().getIncludes(), query.getSourceFilter().getExcludes());
}
if (query instanceof NativeSearchQuery && ((NativeSearchQuery) query).getCollapseBuilder() != null) {
searchSourceBuilder.collapse(((NativeSearchQuery) query).getCollapseBuilder());
}
sort(query, entity).forEach(searchSourceBuilder::sort);
if (query.getMinScore() > 0) {
searchSourceBuilder.minScore(query.getMinScore());
}
if (query.getIndicesOptions() != null) {
request.indicesOptions(query.getIndicesOptions());
}
if (query.getPreference() != null) {
request.preference(query.getPreference());
}
request.source(searchSourceBuilder);
return request;
}
/**
* Customization hook on the actual execution result {@link Publisher}. <br />
*
@@ -762,61 +692,6 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
.map(DocumentAdapters::from).onErrorResume(NoSuchIndexException.class, it -> Mono.empty());
}
@Nullable
private QueryBuilder mappedFilterQuery(Query query, ElasticsearchPersistentEntity<?> entity) {
if (query instanceof NativeSearchQuery) {
return ((NativeSearchQuery) query).getFilter();
}
return null;
}
private QueryBuilder mappedQuery(Query query, ElasticsearchPersistentEntity<?> entity) {
QueryBuilder elasticsearchQuery = null;
if (query instanceof CriteriaQuery) {
converter.updateQuery((CriteriaQuery) query, entity.getType());
elasticsearchQuery = new CriteriaQueryProcessor().createQueryFromCriteria(((CriteriaQuery) query).getCriteria());
} else if (query instanceof StringQuery) {
elasticsearchQuery = new WrapperQueryBuilder(((StringQuery) query).getSource());
} else if (query instanceof NativeSearchQuery) {
elasticsearchQuery = ((NativeSearchQuery) query).getQuery();
} else {
throw new IllegalArgumentException(String.format("Unknown query type '%s'.", query.getClass()));
}
return elasticsearchQuery != null ? elasticsearchQuery : QueryBuilders.matchAllQuery();
}
private static List<FieldSortBuilder> sort(Query query, ElasticsearchPersistentEntity<?> entity) {
if (query.getSort() == null || query.getSort().isUnsorted()) {
return Collections.emptyList();
}
List<FieldSortBuilder> mappedSort = new ArrayList<>();
for (Sort.Order order : query.getSort()) {
ElasticsearchPersistentProperty property = entity.getPersistentProperty(order.getProperty());
String fieldName = property != null ? property.getFieldName() : order.getProperty();
FieldSortBuilder sort = SortBuilders.fieldSort(fieldName)
.order(order.getDirection().isDescending() ? SortOrder.DESC : SortOrder.ASC);
if (order.getNullHandling() == Sort.NullHandling.NULLS_FIRST) {
sort.missing("_first");
} else if (order.getNullHandling() == Sort.NullHandling.NULLS_LAST) {
sort.missing("_last");
}
mappedSort.add(sort);
}
return mappedSort;
}
/**
* Customization hook to modify a generated {@link SearchRequest} prior to its execution. Eg. by setting the
* {@link SearchRequest#indicesOptions(IndicesOptions) indices options} if applicable.
@@ -950,7 +825,6 @@ public class ReactiveElasticsearchTemplate implements ReactiveElasticsearchOpera
return Mono.just(entity);
}
// endregion
protected interface DocumentCallback<T> {


@@ -1,5 +1,5 @@
/*
* Copyright 2019-2020 the original author or authors.
* Copyright 2019-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,6 +29,7 @@ import org.elasticsearch.action.admin.indices.create.CreateIndexRequestBuilder;
import org.elasticsearch.action.admin.indices.mapping.put.PutMappingRequestBuilder;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkRequestBuilder;
import org.elasticsearch.action.delete.DeleteRequest;
import org.elasticsearch.action.get.GetRequest;
import org.elasticsearch.action.get.GetRequestBuilder;
import org.elasticsearch.action.get.MultiGetRequest;
@@ -58,6 +59,7 @@ import org.elasticsearch.script.Script;
import org.elasticsearch.script.ScriptType;
import org.elasticsearch.search.aggregations.AbstractAggregationBuilder;
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.elasticsearch.search.fetch.subphase.FetchSourceContext;
import org.elasticsearch.search.fetch.subphase.highlight.HighlightBuilder;
import org.elasticsearch.search.sort.FieldSortBuilder;
import org.elasticsearch.search.sort.GeoDistanceSortBuilder;
@@ -249,6 +251,10 @@ class RequestFactory {
return deleteByQueryRequest;
}
public DeleteRequest deleteRequest(String id, IndexCoordinates index) {
return new DeleteRequest(index.getIndexName(), id);
}
@Deprecated
public DeleteByQueryRequestBuilder deleteByQueryRequestBuilder(Client client, DeleteQuery deleteQuery,
IndexCoordinates index) {
@@ -344,6 +350,7 @@ class RequestFactory {
throw new ElasticsearchException(
"object or source is null, failed to index the document [id: " + query.getId() + ']');
}
if (query.getVersion() != null) {
indexRequest.version(query.getVersion());
VersionType versionType = retrieveVersionTypeFromPersistentEntity(query.getObject().getClass());
@@ -353,6 +360,7 @@ class RequestFactory {
if (query.getSeqNo() != null) {
indexRequest.setIfSeqNo(query.getSeqNo());
}
if (query.getPrimaryTerm() != null) {
indexRequest.setIfPrimaryTerm(query.getPrimaryTerm());
}
@@ -753,13 +761,20 @@ class RequestFactory {
searchQuery.addSourceFilter(new FetchSourceFilter(toArray(searchQuery.getFields()), null));
}
FetchSourceContext fetchSourceContext = getFetchSourceContext(searchQuery);
for (String id : searchQuery.getIds()) {
MultiGetRequest.Item item = new MultiGetRequest.Item(index.getIndexName(), id);
if (searchQuery.getRoute() != null) {
item = item.routing(searchQuery.getRoute());
}
items.add(item);
if (fetchSourceContext != null) {
item.fetchSourceContext(fetchSourceContext);
}
items.add(item);
}
return items;
}
@@ -1040,4 +1055,24 @@ class RequestFactory {
return values.toArray(valuesAsArray);
}
private FetchSourceContext getFetchSourceContext(Query searchQuery) {
FetchSourceContext fetchSourceContext = null;
SourceFilter sourceFilter = searchQuery.getSourceFilter();
if (!isEmpty(searchQuery.getFields())) {
if (sourceFilter == null) {
sourceFilter = new FetchSourceFilter(toArray(searchQuery.getFields()), null);
} else {
ArrayList<String> arrayList = new ArrayList<>();
Collections.addAll(arrayList, sourceFilter.getIncludes());
sourceFilter = new FetchSourceFilter(toArray(arrayList), null);
}
fetchSourceContext = new FetchSourceContext(true, sourceFilter.getIncludes(), sourceFilter.getExcludes());
} else if (sourceFilter != null) {
fetchSourceContext = new FetchSourceContext(true, sourceFilter.getIncludes(), sourceFilter.getExcludes());
}
return fetchSourceContext;
}
}
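The new `getFetchSourceContext` method above decides which `_source` fields a multi-get item fetches. A simplified stand-alone sketch of the precedence between explicitly requested fields and a configured source filter (plain lists stand in for `SourceFilter`/`FetchSourceContext`, which are not reproduced here; the exclude handling is omitted):

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of the include resolution in getFetchSourceContext (simplified). */
public class SourceFilterSketch {

    /**
     * Returns the include list to fetch, or null when the whole _source
     * should be returned. Requested fields take precedence over a filter.
     */
    static List<String> resolveIncludes(List<String> fields, List<String> filterIncludes) {
        if (fields != null && !fields.isEmpty()) {
            if (filterIncludes == null) {
                return new ArrayList<>(fields);   // build a filter from the requested fields
            }
            return new ArrayList<>(filterIncludes); // keep the filter's existing includes
        }
        return filterIncludes;                      // may be null: fetch the full _source
    }
}
```

Note also that in the `multiGetRequest` hunk the duplicated `items.add(item)` was consolidated so the item is added once, after the optional routing and fetch-source context are applied.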


@@ -38,8 +38,8 @@ public abstract class ResourceUtil {
/**
* Read a {@link ClassPathResource} into a {@link String}.
*
* @param url
* @return
* @param url url the file url
* @return the contents of the file or null if it could not be read
*/
@Nullable
public static String readFileFromClasspath(String url) {
@@ -48,7 +48,7 @@ public abstract class ResourceUtil {
try (InputStream is = classPathResource.getInputStream()) {
return StreamUtils.copyToString(is, Charset.defaultCharset());
} catch (Exception e) {
LOGGER.debug(String.format("Failed to load file from url: %s: %s", url, e.getMessage()));
LOGGER.warn(String.format("Failed to load file from url: %s: %s", url, e.getMessage()));
return null;
}
}


@@ -18,6 +18,7 @@ package org.springframework.data.elasticsearch.core;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Consumer;
import java.util.function.Function;
@@ -38,13 +39,15 @@ abstract class StreamQueries {
/**
* Stream query results using {@link SearchScrollHits}.
*
* @param maxCount the maximum number of entities to return, a value of 0 means that all available entities are
* returned
* @param searchHits the initial hits
* @param continueScrollFunction function to continue scrolling applies to the current scrollId.
* @param clearScrollConsumer consumer to clear the scroll context by accepting the scrollIds to clear.
* @param <T>
* @param <T> the entity type
* @return the {@link SearchHitsIterator}.
*/
static <T> SearchHitsIterator<T> streamResults(SearchScrollHits<T> searchHits,
static <T> SearchHitsIterator<T> streamResults(int maxCount, SearchScrollHits<T> searchHits,
Function<String, SearchScrollHits<T>> continueScrollFunction, Consumer<List<String>> clearScrollConsumer) {
Assert.notNull(searchHits, "searchHits must not be null.");
@@ -59,20 +62,14 @@ abstract class StreamQueries {
return new SearchHitsIterator<T>() {
// As we couldn't retrieve single result with scroll, store current hits.
private volatile Iterator<SearchHit<T>> scrollHits = searchHits.iterator();
private volatile boolean continueScroll = scrollHits.hasNext();
private volatile AtomicInteger currentCount = new AtomicInteger();
private volatile Iterator<SearchHit<T>> currentScrollHits = searchHits.iterator();
private volatile boolean continueScroll = currentScrollHits.hasNext();
private volatile ScrollState scrollState = new ScrollState(searchHits.getScrollId());
@Override
public void close() {
try {
clearScrollConsumer.accept(scrollState.getScrollIds());
} finally {
scrollHits = null;
scrollState = null;
}
clearScrollConsumer.accept(scrollState.getScrollIds());
}
@Override
@@ -99,24 +96,25 @@ abstract class StreamQueries {
@Override
public boolean hasNext() {
if (!continueScroll) {
if (!continueScroll || (maxCount > 0 && currentCount.get() >= maxCount)) {
return false;
}
if (!scrollHits.hasNext()) {
if (!currentScrollHits.hasNext()) {
SearchScrollHits<T> nextPage = continueScrollFunction.apply(scrollState.getScrollId());
scrollHits = nextPage.iterator();
currentScrollHits = nextPage.iterator();
scrollState.updateScrollId(nextPage.getScrollId());
continueScroll = scrollHits.hasNext();
continueScroll = currentScrollHits.hasNext();
}
return scrollHits.hasNext();
return currentScrollHits.hasNext();
}
@Override
public SearchHit<T> next() {
if (hasNext()) {
return scrollHits.next();
currentCount.incrementAndGet();
return currentScrollHits.next();
}
throw new NoSuchElementException();
}
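The `StreamQueries` change threads a `maxCount` through the scroll iterator and counts consumed hits in an `AtomicInteger`. A generic sketch of that limiting logic, with a `Supplier` standing in for `continueScrollFunction` and the scroll-id/`continueScroll` bookkeeping omitted:

```java
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

/** Sketch of the maxCount handling added to StreamQueries (no Elasticsearch types). */
public class LimitedScroll<T> implements Iterator<T> {

    private final int maxCount;                 // 0 means "no limit"
    private final AtomicInteger currentCount = new AtomicInteger();
    private final Supplier<List<T>> nextPage;   // stands in for continueScrollFunction
    private Iterator<T> page;

    public LimitedScroll(int maxCount, List<T> firstPage, Supplier<List<T>> nextPage) {
        this.maxCount = maxCount;
        this.page = firstPage.iterator();
        this.nextPage = nextPage;
    }

    @Override
    public boolean hasNext() {
        if (maxCount > 0 && currentCount.get() >= maxCount) {
            return false;                       // stop once the requested count is reached
        }
        if (!page.hasNext()) {
            page = nextPage.get().iterator();   // "scroll" to the next page
        }
        return page.hasNext();
    }

    @Override
    public T next() {
        if (hasNext()) {
            currentCount.incrementAndGet();     // only consumed hits count toward the limit
            return page.next();
        }
        throw new NoSuchElementException();
    }
}
```

As in the diff, the count is checked before fetching another page, so at most one page beyond the limit is retrieved.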


@@ -18,11 +18,13 @@ package org.springframework.data.elasticsearch.core.convert;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.time.Instant;
import java.time.ZonedDateTime;
import java.time.temporal.TemporalAccessor;
import java.util.Date;
import java.util.concurrent.ConcurrentHashMap;
import org.elasticsearch.common.time.DateFormatter;
import org.elasticsearch.common.time.DateFormatters;
import org.springframework.data.elasticsearch.annotations.DateFormat;
import org.springframework.util.Assert;
@@ -103,10 +105,10 @@ final public class ElasticsearchDateConverter {
* @return the new created object
*/
public <T extends TemporalAccessor> T parse(String input, Class<T> type) {
TemporalAccessor accessor = dateFormatter.parse(input);
ZonedDateTime zonedDateTime = DateFormatters.from(dateFormatter.parse(input));
try {
Method method = type.getMethod("from", TemporalAccessor.class);
Object o = method.invoke(null, accessor);
Object o = method.invoke(null, zonedDateTime);
return type.cast(o);
} catch (NoSuchMethodException e) {
throw new ConversionException("no 'from' factory method found in class " + type.getName());
@@ -122,6 +124,7 @@ final public class ElasticsearchDateConverter {
* @return the new created object
*/
public Date parse(String input) {
return new Date(Instant.from(dateFormatter.parse(input)).toEpochMilli());
ZonedDateTime zonedDateTime = DateFormatters.from(dateFormatter.parse(input));
return new Date(Instant.from(zonedDateTime).toEpochMilli());
}
}
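The converter change above parses into a fully resolved `ZonedDateTime` (via `DateFormatters.from`) before invoking the target type's static `from(TemporalAccessor)` factory, because a raw parse result may be missing fields the factory needs. A sketch of the same pattern using only `java.time` (the original uses Elasticsearch's `DateFormatter`/`DateFormatters`, not reproduced here):

```java
import java.lang.reflect.Method;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.time.temporal.TemporalAccessor;

/** Sketch: resolve to a ZonedDateTime first, then call the type's from(...) factory. */
public class FromFactoryParse {

    static <T extends TemporalAccessor> T parse(String input, DateTimeFormatter formatter, Class<T> type) {
        // A fully resolved ZonedDateTime is a safe argument for the from(...) factories,
        // unlike a partial TemporalAccessor that may lack e.g. zone or time fields.
        ZonedDateTime zoned = ZonedDateTime.parse(input, formatter);
        try {
            Method from = type.getMethod("from", TemporalAccessor.class);
            return type.cast(from.invoke(null, zoned));
        } catch (ReflectiveOperationException e) {
            throw new IllegalArgumentException(
                    "no usable from(TemporalAccessor) factory on " + type.getName(), e);
        }
    }
}
```

All the `java.time` types used with `@Field(type = Date)` (`Instant`, `LocalDate`, `LocalDateTime`, ...) expose such a static `from` factory, which is what makes the reflective dispatch work.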


@@ -15,18 +15,14 @@
*/
package org.springframework.data.elasticsearch.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.time.temporal.TemporalAccessor;
import java.util.*;
import java.util.Map.Entry;
import java.util.Optional;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.Collectors;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.context.ApplicationContext;
@@ -44,6 +40,7 @@ import org.springframework.data.elasticsearch.core.mapping.ElasticsearchPersiste
import org.springframework.data.elasticsearch.core.mapping.ElasticsearchPersistentProperty;
import org.springframework.data.elasticsearch.core.mapping.ElasticsearchPersistentPropertyConverter;
import org.springframework.data.elasticsearch.core.query.CriteriaQuery;
import org.springframework.data.elasticsearch.core.query.Field;
import org.springframework.data.elasticsearch.core.query.SeqNoPrimaryTerm;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.context.MappingContext;
@@ -77,6 +74,8 @@ import org.springframework.util.ObjectUtils;
public class MappingElasticsearchConverter
implements ElasticsearchConverter, ApplicationContextAware, InitializingBean {
private static final Logger LOGGER = LoggerFactory.getLogger(MappingElasticsearchConverter.class);
private final MappingContext<? extends ElasticsearchPersistentEntity<?>, ElasticsearchPersistentProperty> mappingContext;
private final GenericConversionService conversionService;
@@ -85,6 +84,8 @@ public class MappingElasticsearchConverter
private ElasticsearchTypeMapper typeMapper;
private ConcurrentHashMap<String, Integer> propertyWarnings = new ConcurrentHashMap<>();
public MappingElasticsearchConverter(
MappingContext<? extends ElasticsearchPersistentEntity<?>, ElasticsearchPersistentProperty> mappingContext) {
this(mappingContext, null);
@@ -267,11 +268,26 @@ public class MappingElasticsearchConverter
return null;
}
if (property.hasPropertyConverter() && String.class.isAssignableFrom(source.getClass())) {
source = property.getPropertyConverter().read((String) source);
Class<R> rawType = targetType.getType();
if (property.hasPropertyConverter()) {
source = propertyConverterRead(property, source);
} else if (TemporalAccessor.class.isAssignableFrom(property.getType())
&& !conversions.hasCustomReadTarget(source.getClass(), rawType)) {
// log at most 5 times
String propertyName = property.getOwner().getType().getSimpleName() + '.' + property.getName();
String key = propertyName + "-read";
int count = propertyWarnings.computeIfAbsent(key, k -> 0);
if (count < 5) {
LOGGER.warn(
"Type {} of property {} is a TemporalAccessor class but has neither a @Field annotation defining the date type nor a registered converter for reading!"
+ " It cannot be mapped from a complex object in Elasticsearch!",
property.getType().getSimpleName(), propertyName);
propertyWarnings.put(key, count + 1);
}
}
Class<R> rawType = targetType.getType();
if (conversions.hasCustomReadTarget(source.getClass(), rawType)) {
return rawType.cast(conversionService.convert(source, rawType));
} else if (source instanceof List) {
@@ -283,6 +299,32 @@ public class MappingElasticsearchConverter
return (R) readSimpleValue(source, targetType);
}
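Both the read path above and the write path further down cap their `TemporalAccessor` warnings at five per property, using a `ConcurrentHashMap` of counters keyed by property name plus a `-read`/`-write` suffix. The guard in isolation might look like this (class and method names are illustrative):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

/** Sketch of the "log at most 5 times per property" guard used in the converter. */
public class ThrottledWarnings {

    private static final int LIMIT = 5;
    private final ConcurrentMap<String, Integer> counts = new ConcurrentHashMap<>();

    /** Returns true while the warning for this key should still be emitted. */
    public boolean shouldWarn(String key) {
        int count = counts.computeIfAbsent(key, k -> 0);
        if (count < LIMIT) {
            counts.put(key, count + 1);   // mirrors the diff's computeIfAbsent + put sequence
            return true;
        }
        return false;
    }
}
```

The check-then-put sequence mirrors the diff; under heavy concurrency it can overshoot by a few warnings, and `counts.merge(key, 1, Integer::sum)` would increment atomically if that mattered.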
private Object propertyConverterRead(ElasticsearchPersistentProperty property, Object source) {
ElasticsearchPersistentPropertyConverter propertyConverter = Objects
.requireNonNull(property.getPropertyConverter());
if (source instanceof String[]) {
// convert to a List
source = Arrays.asList((String[]) source);
}
if (source instanceof List) {
source = ((List<?>) source).stream().map(it -> convertOnRead(propertyConverter, it)).collect(Collectors.toList());
} else if (source instanceof Set) {
source = ((Set<?>) source).stream().map(it -> convertOnRead(propertyConverter, it)).collect(Collectors.toSet());
} else {
source = convertOnRead(propertyConverter, source);
}
return source;
}
private Object convertOnRead(ElasticsearchPersistentPropertyConverter propertyConverter, Object source) {
if (String.class.isAssignableFrom(source.getClass())) {
source = propertyConverter.read((String) source);
}
return source;
}
@SuppressWarnings("unchecked")
@Nullable
private <R> R readCollectionValue(@Nullable List<?> source, ElasticsearchPersistentProperty property,
@@ -293,14 +335,17 @@ public class MappingElasticsearchConverter
}
Collection<Object> target = createCollectionForValue(targetType, source.size());
TypeInformation<?> componentType = targetType.getComponentType();
for (Object value : source) {
if (value == null) {
target.add(null);
} else if (componentType != null && !ClassTypeInformation.OBJECT.equals(componentType)
&& isSimpleType(componentType.getType())) {
target.add(readSimpleValue(value, componentType));
} else if (isSimpleType(value)) {
target.add(
readSimpleValue(value, targetType.getComponentType() != null ? targetType.getComponentType() : targetType));
target.add(readSimpleValue(value, componentType != null ? componentType : targetType));
} else {
if (value instanceof List) {
@@ -471,8 +516,21 @@ public class MappingElasticsearchConverter
}
if (property.hasPropertyConverter()) {
ElasticsearchPersistentPropertyConverter propertyConverter = property.getPropertyConverter();
value = propertyConverter.write(value);
value = propertyConverterWrite(property, value);
} else if (TemporalAccessor.class.isAssignableFrom(property.getActualType())
&& !conversions.hasCustomWriteTarget(value.getClass())) {
// log at most 5 times
String propertyName = entity.getType().getSimpleName() + '.' + property.getName();
String key = propertyName + "-write";
int count = propertyWarnings.computeIfAbsent(key, k -> 0);
if (count < 5) {
LOGGER.warn(
"Type {} of property {} is a TemporalAccessor class but has neither a @Field annotation defining the date type nor a registered converter for writing!"
+ " It will be mapped to a complex object in Elasticsearch!",
property.getType().getSimpleName(), propertyName);
propertyWarnings.put(key, count + 1);
}
}
if (!isSimpleType(value)) {
@@ -486,6 +544,20 @@ public class MappingElasticsearchConverter
}
}
private Object propertyConverterWrite(ElasticsearchPersistentProperty property, Object value) {
ElasticsearchPersistentPropertyConverter propertyConverter = Objects
.requireNonNull(property.getPropertyConverter());
if (value instanceof List) {
value = ((List<?>) value).stream().map(propertyConverter::write).collect(Collectors.toList());
} else if (value instanceof Set) {
value = ((Set<?>) value).stream().map(propertyConverter::write).collect(Collectors.toSet());
} else {
value = propertyConverter.write(value);
}
return value;
}
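`propertyConverterWrite` above (like its read counterpart) now unwraps `List` and `Set` values and converts element-wise instead of handing the whole collection to the converter. A type-agnostic sketch of that dispatch, assuming a plain `Function` as the converter:

```java
import java.util.List;
import java.util.Set;
import java.util.function.Function;
import java.util.stream.Collectors;

/** Sketch of propertyConverterWrite: apply a per-element converter to scalars,
 *  Lists and Sets alike (a Function stands in for the property converter). */
public class CollectionAwareWrite {

    static Object write(Object value, Function<Object, String> converter) {
        if (value instanceof List) {
            return ((List<?>) value).stream().map(converter).collect(Collectors.toList());
        }
        if (value instanceof Set) {
            return ((Set<?>) value).stream().map(converter).collect(Collectors.toSet());
        }
        return converter.apply(value);   // single value: convert directly
    }
}
```

Preserving the collection kind (List in, List out; Set in, Set out) is what lets the read path later restore the property to its declared collection type.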
protected void writeProperty(ElasticsearchPersistentProperty property, Object value, MapValueAccessor sink) {
Optional<Class<?>> customWriteTarget = conversions.getCustomWriteTarget(value.getClass());
@@ -556,7 +628,9 @@ public class MappingElasticsearchConverter
Map<Object, Object> target = new LinkedHashMap<>();
Streamable<Entry<String, Object>> mapSource = Streamable.of(value.entrySet());
if (!typeHint.getActualType().getType().equals(Object.class)
TypeInformation<?> actualType = typeHint.getActualType();
if (actualType != null && !actualType.getType().equals(Object.class)
&& isSimpleType(typeHint.getMapValueType().getType())) {
mapSource.forEach(it -> {
@@ -595,8 +669,14 @@ public class MappingElasticsearchConverter
: Streamable.of(ObjectUtils.toObjectArray(value));
List<Object> target = new ArrayList<>();
if (!typeHint.getActualType().getType().equals(Object.class) && isSimpleType(typeHint.getActualType().getType())) {
collectionSource.map(this::getWriteSimpleValue).forEach(target::add);
TypeInformation<?> actualType = typeHint.getActualType();
Class<?> type = actualType != null ? actualType.getType() : null;
if (type != null && !type.equals(Object.class) && isSimpleType(type)) {
// noinspection ReturnOfNull
collectionSource //
.map(element -> element != null ? getWriteSimpleValue(element) : null) //
.forEach(target::add);
} else {
collectionSource.map(it -> {
@@ -670,10 +750,6 @@ public class MappingElasticsearchConverter
/**
* Compute the type to use by checking the given entity against the store type;
*
* @param entity
* @param source
* @return
*/
private ElasticsearchPersistentEntity<?> computeClosestEntity(ElasticsearchPersistentEntity<?> entity,
Map<String, Object> source) {
@@ -709,11 +785,12 @@ public class MappingElasticsearchConverter
if (persistentEntity != null) {
criteriaQuery.getCriteria().getCriteriaChain().forEach(criteria -> {
String name = criteria.getField().getName();
Field field = criteria.getField();
String name = field.getName();
ElasticsearchPersistentProperty property = persistentEntity.getPersistentProperty(name);
if (property != null && property.getName().equals(name)) {
criteria.getField().setName(property.getFieldName());
field.setName(property.getFieldName());
if (property.hasPropertyConverter()) {
ElasticsearchPersistentPropertyConverter propertyConverter = property.getPropertyConverter();
@@ -729,6 +806,13 @@ public class MappingElasticsearchConverter
}
});
}
org.springframework.data.elasticsearch.annotations.Field fieldAnnotation = property.findAnnotation(org.springframework.data.elasticsearch.annotations.Field.class);
if (fieldAnnotation != null) {
field.setFieldType(fieldAnnotation.type());
}
}
});
}
@@ -746,13 +830,21 @@ public class MappingElasticsearchConverter
@Nullable
public Object get(ElasticsearchPersistentProperty property) {
String fieldName = property.getFieldName();
if (target instanceof Document) {
// nested objects may have properties like 'id' which are recognized as isIdProperty() but they are not
// Documents
Document document = (Document) target;
if (property.isIdProperty() && document.hasId()) {
return document.getId();
Object id = null;
// take the id property from the document source if available
if (!fieldName.contains(".")) {
id = target.get(fieldName);
}
return id != null ? id : document.getId();
}
if (property.isVersionProperty() && document.hasVersion()) {
@@ -765,8 +857,6 @@ public class MappingElasticsearchConverter
return ((SearchDocument) target).getScore();
}
String fieldName = property.getFieldName();
if (!fieldName.contains(".")) {
return target.get(fieldName);
}


@@ -17,6 +17,8 @@ package org.springframework.data.elasticsearch.core.geo;
import org.springframework.data.geo.Point;
import java.util.Objects;
/**
* geo-location used for #{@link org.springframework.data.elasticsearch.core.query.Criteria}.
*
@@ -60,6 +62,20 @@ public class GeoPoint {
return new Point(point.getLat(), point.getLon());
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
GeoPoint geoPoint = (GeoPoint) o;
return Double.compare(geoPoint.lat, lat) == 0 &&
Double.compare(geoPoint.lon, lon) == 0;
}
@Override
public int hashCode() {
return Objects.hash(lat, lon);
}
@Override
public String toString() {
return "GeoPoint{" +
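The `equals`/`hashCode` pair added to `GeoPoint` relies on `Double.compare` rather than `==`, so `NaN` coordinates compare equal to themselves and the contract with `hashCode` holds. A minimal value type mirroring that pattern (`LatLon` is a stand-in name, not a Spring Data class):

```java
import java.util.Objects;

/** Minimal value type mirroring the GeoPoint equals/hashCode added above. */
public class LatLon {
    private final double lat;
    private final double lon;

    public LatLon(double lat, double lon) { this.lat = lat; this.lon = lon; }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        LatLon other = (LatLon) o;
        // Double.compare treats NaN as equal to NaN and distinguishes 0.0 from -0.0,
        // matching Double.equals semantics (unlike the == operator)
        return Double.compare(other.lat, lat) == 0 && Double.compare(other.lon, lon) == 0;
    }

    @Override
    public int hashCode() { return Objects.hash(lat, lon); }
}
```

With both methods defined, such points behave correctly as keys in hash-based collections, which plain `Point`-like classes without `hashCode` do not.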


@@ -137,8 +137,8 @@ public class MappingBuilder {
}
}
private void mapEntity(XContentBuilder builder, @Nullable ElasticsearchPersistentEntity entity, boolean isRootObject,
String nestedObjectFieldName, boolean nestedOrObjectField, FieldType fieldType,
private void mapEntity(XContentBuilder builder, @Nullable ElasticsearchPersistentEntity<?> entity,
boolean isRootObject, String nestedObjectFieldName, boolean nestedOrObjectField, FieldType fieldType,
@Nullable Field parentFieldAnnotation, @Nullable DynamicMapping dynamicMapping) throws IOException {
boolean writeNestedProperties = !isRootObject && (isAnyPropertyAnnotatedWithField(entity) || nestedOrObjectField);
@@ -150,7 +150,7 @@ public class MappingBuilder {
if (nestedOrObjectField && FieldType.Nested == fieldType && parentFieldAnnotation != null
&& parentFieldAnnotation.includeInParent()) {
builder.field("include_in_parent", parentFieldAnnotation.includeInParent());
builder.field("include_in_parent", true);
}
}
@@ -366,7 +366,7 @@ public class MappingBuilder {
MappingParameters mappingParameters = MappingParameters.from(annotation);
if (!nestedOrObjectField && mappingParameters.isStore()) {
builder.field(FIELD_PARAM_STORE, mappingParameters.isStore());
builder.field(FIELD_PARAM_STORE, true);
}
mappingParameters.writeTypeAndParametersTo(builder);
}


@@ -37,6 +37,11 @@ public class SimpleElasticsearchMappingContext
private @Nullable ApplicationContext context;
@Override
protected boolean shouldCreatePersistentEntityFor(TypeInformation<?> type) {
return !ElasticsearchSimpleTypes.HOLDER.isSimpleType(type.getType());
}
@Override
protected <T> SimpleElasticsearchPersistentEntity<?> createPersistentEntity(TypeInformation<T> typeInformation) {
SimpleElasticsearchPersistentEntity<T> persistentEntity = new SimpleElasticsearchPersistentEntity<>(


@@ -23,6 +23,7 @@ import java.util.List;
import org.springframework.data.elasticsearch.annotations.DateFormat;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
import org.springframework.data.elasticsearch.annotations.MultiField;
import org.springframework.data.elasticsearch.annotations.Parent;
import org.springframework.data.elasticsearch.annotations.Score;
import org.springframework.data.elasticsearch.core.convert.ElasticsearchDateConverter;
@@ -83,6 +84,10 @@ public class SimpleElasticsearchPersistentProperty extends
throw new MappingException(String.format("Parent property %s must be of type String!", property.getName()));
}
if (isAnnotationPresent(Field.class) && isAnnotationPresent(MultiField.class)) {
throw new MappingException("@Field annotation must not be used on a @MultiField property.");
}
initDateConverter();
}
@@ -114,60 +119,76 @@
*/
private void initDateConverter() {
Field field = findAnnotation(Field.class);
boolean isTemporalAccessor = TemporalAccessor.class.isAssignableFrom(getType());
boolean isDate = Date.class.isAssignableFrom(getType());
Class<?> actualType = getActualType();
boolean isTemporalAccessor = TemporalAccessor.class.isAssignableFrom(actualType);
boolean isDate = Date.class.isAssignableFrom(actualType);
if (field != null && field.type() == FieldType.Date && (isTemporalAccessor || isDate)) {
if (field != null && (field.type() == FieldType.Date || field.type() == FieldType.Date_Nanos)
&& (isTemporalAccessor || isDate)) {
DateFormat dateFormat = field.format();
ElasticsearchDateConverter converter = null;
if (dateFormat == DateFormat.none) {
throw new MappingException(
String.format("Property %s is annotated with FieldType.%s but has no DateFormat defined",
getOwner().getType().getSimpleName() + "." + getName(), field.type().name()));
}
ElasticsearchDateConverter converter;
if (dateFormat == DateFormat.custom) {
String pattern = field.pattern();
if (StringUtils.hasLength(pattern)) {
converter = ElasticsearchDateConverter.of(pattern);
if (!StringUtils.hasLength(pattern)) {
throw new MappingException(
String.format("Property %s is annotated with FieldType.%s and a custom format but has no pattern defined",
getOwner().getType().getSimpleName() + "." + getName(), field.type().name()));
}
} else if (dateFormat != DateFormat.none) {
converter = ElasticsearchDateConverter.of(pattern);
} else {
converter = ElasticsearchDateConverter.of(dateFormat);
}
if (converter != null) {
ElasticsearchDateConverter dateConverter = converter;
propertyConverter = new ElasticsearchPersistentPropertyConverter() {
@Override
public String write(Object property) {
if (isTemporalAccessor) {
return dateConverter.format((TemporalAccessor) property);
} else { // must be Date
return dateConverter.format((Date) property);
}
}
propertyConverter = new ElasticsearchPersistentPropertyConverter() {
final ElasticsearchDateConverter dateConverter = converter;
@SuppressWarnings("unchecked")
@Override
public Object read(String s) {
if (isTemporalAccessor) {
return dateConverter.parse(s, (Class<? extends TemporalAccessor>) getType());
} else { // must be date
return dateConverter.parse(s);
}
@Override
public String write(Object property) {
if (isTemporalAccessor && TemporalAccessor.class.isAssignableFrom(property.getClass())) {
return dateConverter.format((TemporalAccessor) property);
} else if (isDate && Date.class.isAssignableFrom(property.getClass())) {
return dateConverter.format((Date) property);
} else {
return property.toString();
}
};
}
}
@SuppressWarnings("unchecked")
@Override
public Object read(String s) {
if (isTemporalAccessor) {
return dateConverter.parse(s, (Class<? extends TemporalAccessor>) actualType);
} else { // must be date
return dateConverter.parse(s);
}
}
};
}
}
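The custom-format branch above hands the pattern to `ElasticsearchDateConverter.of(pattern)`. A minimal self-contained sketch of what such a pattern-based converter does, using only `java.time` (the class and pattern here are illustrative, not the library's code):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Sketch of pattern-based date conversion for TemporalAccessor values:
// format on write, parse back on read. Illustrative only.
public class DatePatternSketch {

	public static void main(String[] args) {
		DateTimeFormatter formatter = DateTimeFormatter.ofPattern("uuuu-MM-dd HH:mm:ss");
		LocalDateTime value = LocalDateTime.of(2021, 4, 14, 11, 4, 47);

		// what would be written to the document source
		String written = formatter.format(value);
		System.out.println(written);

		// round-trip back on read
		LocalDateTime read = LocalDateTime.parse(written, formatter);
		System.out.println(read.equals(value));
	}
}
```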
@SuppressWarnings("ConstantConditions")
@Nullable
private String getAnnotatedFieldName() {
if (isAnnotationPresent(Field.class)) {
String name = null;
String name = findAnnotation(Field.class).name();
return StringUtils.hasText(name) ? name : null;
if (isAnnotationPresent(Field.class)) {
name = findAnnotation(Field.class).name();
} else if (isAnnotationPresent(MultiField.class)) {
name = findAnnotation(MultiField.class).mainField().name();
}
return null;
return StringUtils.hasText(name) ? name : null;
}
/*


@@ -393,11 +393,6 @@ public class Criteria {
}
private List<Object> toCollection(Object... values) {
if (values.length == 0 || (values.length > 1 && values[1] instanceof Collection)) {
throw new InvalidDataAccessApiUsageException(
"At least one element " + (values.length > 0 ? ("of argument of type " + values[1].getClass().getName()) : "")
+ " has to be present.");
}
return Arrays.asList(values);
}


@@ -15,6 +15,9 @@
*/
package org.springframework.data.elasticsearch.core.query;
import org.springframework.data.elasticsearch.annotations.FieldType;
import org.springframework.lang.Nullable;
/**
* Defines a Field that can be used within a Criteria.
*
@@ -27,4 +30,15 @@ public interface Field {
void setName(String name);
String getName();
/**
* @param fieldType sets the field's type
*/
void setFieldType(FieldType fieldType);
/**
* @return The annotated FieldType of the field
*/
@Nullable
FieldType getFieldType();
}


@@ -15,10 +15,13 @@
*/
package org.springframework.data.elasticsearch.core.query;
import org.springframework.data.elasticsearch.annotations.FieldType;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
* The most trivial implementation of a Field
* The most trivial implementation of a Field. The {@link #name} is updatable, so it may be changed during query
* preparation by the {@link org.springframework.data.elasticsearch.core.convert.MappingElasticsearchConverter}.
*
* @author Rizwan Idrees
* @author Mohsin Husen
@@ -27,27 +30,41 @@ import org.springframework.util.Assert;
public class SimpleField implements Field {
private String name;
@Nullable private FieldType fieldType;
public SimpleField(String name) {
Assert.notNull(name, "name must not be null");
Assert.hasText(name, "name must not be null");
this.name = name;
}
@Override
public void setName(String name) {
Assert.notNull(name, "name must not be null");
Assert.hasText(name, "name must not be null");
this.name = name;
}
@Override
public String getName() {
return this.name;
return name;
}
@Override
public void setFieldType(FieldType fieldType) {
this.fieldType = fieldType;
}
@Nullable
@Override
public FieldType getFieldType() {
return fieldType;
}
@Override
public String toString() {
return this.name;
return getName();
}
}
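The `Assert.notNull` to `Assert.hasText` change above tightens validation: an empty or blank name previously passed the null check. A plain-Java sketch of the difference (the `hasText` helper mimics Spring's `StringUtils.hasText`; it is not the library code):

```java
// Sketch of why hasText is stricter than notNull: "" and "  " are non-null
// but carry no text, so the new check rejects them. Illustrative helper only.
public class HasTextSketch {

	static boolean hasText(String s) {
		return s != null && !s.trim().isEmpty();
	}

	public static void main(String[] args) {
		String empty = "";
		System.out.println(empty != null);  // passes the old notNull check
		System.out.println(hasText(empty)); // fails the new hasText check
	}
}
```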


@@ -28,11 +28,12 @@ import org.springframework.data.repository.query.RepositoryQuery;
public abstract class AbstractElasticsearchRepositoryQuery implements RepositoryQuery {
protected static final int DEFAULT_STREAM_BATCH_SIZE = 500;
protected ElasticsearchQueryMethod queryMethod;
protected ElasticsearchOperations elasticsearchOperations;
public AbstractElasticsearchRepositoryQuery(ElasticsearchQueryMethod queryMethod,
ElasticsearchOperations elasticsearchOperations) {
ElasticsearchOperations elasticsearchOperations) {
this.queryMethod = queryMethod;
this.elasticsearchOperations = elasticsearchOperations;
}


@@ -43,22 +43,20 @@ import org.springframework.util.ClassUtils;
*/
public class ElasticsearchPartQuery extends AbstractElasticsearchRepositoryQuery {
private static final int DEFAULT_STREAM_BATCH_SIZE = 500;
private final PartTree tree;
private final ElasticsearchConverter elasticsearchConverter;
private final MappingContext<?, ElasticsearchPersistentProperty> mappingContext;
public ElasticsearchPartQuery(ElasticsearchQueryMethod method, ElasticsearchOperations elasticsearchOperations) {
super(method, elasticsearchOperations);
this.tree = new PartTree(method.getName(), method.getEntityInformation().getJavaType());
this.tree = new PartTree(queryMethod.getName(), queryMethod.getResultProcessor().getReturnedType().getDomainType());
this.elasticsearchConverter = elasticsearchOperations.getElasticsearchConverter();
this.mappingContext = elasticsearchConverter.getMappingContext();
}
@Override
public Object execute(Object[] parameters) {
Class<?> clazz = queryMethod.getEntityInformation().getJavaType();
Class<?> clazz = queryMethod.getResultProcessor().getReturnedType().getDomainType();
ParametersParameterAccessor accessor = new ParametersParameterAccessor(queryMethod.getParameters(), parameters);
CriteriaQuery query = createQuery(accessor);


@@ -19,6 +19,7 @@ import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.SearchHitSupport;
import org.springframework.data.elasticsearch.core.SearchHits;
@@ -26,6 +27,7 @@ import org.springframework.data.elasticsearch.core.convert.DateTimeConverters;
import org.springframework.data.elasticsearch.core.mapping.IndexCoordinates;
import org.springframework.data.elasticsearch.core.query.StringQuery;
import org.springframework.data.repository.query.ParametersParameterAccessor;
import org.springframework.data.util.StreamUtils;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.NumberUtils;
@@ -69,7 +71,7 @@ public class ElasticsearchStringQuery extends AbstractElasticsearchRepositoryQuery
@Override
public Object execute(Object[] parameters) {
Class<?> clazz = queryMethod.getEntityInformation().getJavaType();
Class<?> clazz = queryMethod.getResultProcessor().getReturnedType().getDomainType();
ParametersParameterAccessor accessor = new ParametersParameterAccessor(queryMethod.getParameters(), parameters);
StringQuery stringQuery = createQuery(accessor);
@@ -88,6 +90,13 @@ public class ElasticsearchStringQuery extends AbstractElasticsearchRepositoryQuery
stringQuery.setPageable(accessor.getPageable());
SearchHits<?> searchHits = elasticsearchOperations.search(stringQuery, clazz, index);
result = SearchHitSupport.page(searchHits, stringQuery.getPageable());
} else if (queryMethod.isStreamQuery()) {
if (accessor.getPageable().isUnpaged()) {
stringQuery.setPageable(PageRequest.of(0, DEFAULT_STREAM_BATCH_SIZE));
} else {
stringQuery.setPageable(accessor.getPageable());
}
result = StreamUtils.createStreamFromIterator(elasticsearchOperations.searchForStream(stringQuery, clazz, index));
} else if (queryMethod.isCollectionQuery()) {
if (accessor.getPageable().isPaged()) {
stringQuery.setPageable(accessor.getPageable());
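The new stream branch above falls back to a fixed batch size when the query is unpaged. A standalone sketch of that fallback (the method and the null-for-unpaged modelling are illustrative, not the library's API):

```java
// Sketch of the unpaged -> DEFAULT_STREAM_BATCH_SIZE fallback used when
// streaming results; a null page size stands in for Pageable.unpaged().
public class StreamBatchSketch {

	static final int DEFAULT_STREAM_BATCH_SIZE = 500;

	static int effectiveBatchSize(Integer requestedPageSize) {
		return requestedPageSize == null ? DEFAULT_STREAM_BATCH_SIZE : requestedPageSize;
	}

	public static void main(String[] args) {
		System.out.println(effectiveBatchSize(null)); // unpaged -> default
		System.out.println(effectiveBatchSize(100));  // explicit page size wins
	}
}
```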


@@ -22,6 +22,7 @@ import java.lang.reflect.Type;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Objects;
import java.util.Optional;
import java.util.stream.Collectors;
@@ -88,7 +89,7 @@ public abstract class AbstractElasticsearchRepository<T, ID> implements ElasticsearchRepository<T, ID>
this.entityClass = this.entityInformation.getJavaType();
this.indexOperations = operations.indexOps(this.entityClass);
try {
if (createIndexAndMapping()) {
if (createIndexAndMapping() && !indexOperations.exists()) {
createIndex();
putMapping();
}
@@ -153,9 +154,27 @@ public abstract class AbstractElasticsearchRepository<T, ID> implements ElasticsearchRepository<T, ID>
@Override
public Iterable<T> findAllById(Iterable<ID> ids) {
Assert.notNull(ids, "ids can't be null.");
NativeSearchQuery query = new NativeSearchQueryBuilder().withIds(stringIdsRepresentation(ids)).build();
return operations.multiGet(query, getEntityClass(), getIndexCoordinates());
List<T> result = new ArrayList<>();
List<String> stringIds = stringIdsRepresentation(ids);
if (stringIds.isEmpty()) {
return result;
}
NativeSearchQuery query = new NativeSearchQueryBuilder().withIds(stringIds).build();
List<T> multiGetEntities = operations.multiGet(query, getEntityClass(), getIndexCoordinates());
multiGetEntities.forEach(entity -> {
if (entity != null) {
result.add(entity);
}
});
return result;
}
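`findAllById` above now skips ids that were not found, since `multiGet` returns null for misses. The filtering step in isolation, as a plain-Java sketch (class and method names are illustrative):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of the null-filtering in findAllById: multiGet returns null for
// ids that do not exist, and those entries are dropped from the result.
public class MultiGetFilterSketch {

	static <T> List<T> dropMissing(List<T> multiGetEntities) {
		List<T> result = new ArrayList<>();
		for (T entity : multiGetEntities) {
			if (entity != null) {
				result.add(entity);
			}
		}
		return result;
	}

	public static void main(String[] args) {
		// the null simulates an id that was not found
		System.out.println(dropMissing(Arrays.asList("a", null, "c")));
	}
}
```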
@Override


@@ -1,6 +1,415 @@
Spring Data Elasticsearch Changelog
===================================
Changes in version 4.0.9.RELEASE (2021-04-14)
---------------------------------------------
* #1759 - health check with DefaultReactiveElasticsearchClient.
Changes in version 4.1.7 (2021-03-31)
-------------------------------------
Changes in version 4.2.0-RC1 (2021-03-31)
-----------------------------------------
* #1745 - Automatically close scroll context when returning streamed results.
* #1741 - Upgrade to Elasticsearch 7.12.
* #1738 - Readme lists artifacts with .RELEASE and .BUILD-SNAPSHOT suffixes.
* #1736 - Upgrade to OpenWebBeans 2.0.
* #1734 - Remove lombok.
* #1733 - Update CI to Java 16.
* #1727 - Allow multiple date formats for date fields.
* #1719 - Configure index settings with @Setting annotation.
Changes in version 4.2.0-M5 (2021-03-17)
----------------------------------------
* #1725 - Add support for SearchTemplate for reactive client.
* #1721 - IndexOps.getMapping raises exception if mapping contains "dynamic_templates".
* #1718 - Create index with mapping in one step.
* #1712 - Requests with ReactiveElasticsearchRepository methods don't fail if they can't connect to Elasticsearch.
* #1711 - Add the type hint _class attribute to the index mapping.
* #1704 - Add SearchFailure field in ByQueryResponse.
* #1700 - Add missing "Document ranking types".
* #1687 - Upgrade to Elasticsearch 7.11.
* #1686 - Add rescore functionality.
* #1678 - Errors are silent in multiGet.
* #1658 - ReactiveElasticsearchClient should use the same request parameters as non-reactive code.
* #1646 - Add function to list all indexes.
* #1514 - Add `matched_queries` field in SearchHit [DATAES-979].
Changes in version 4.1.6 (2021-03-17)
-------------------------------------
* #1712 - Requests with ReactiveElasticsearchRepository methods don't fail if they can't connect to Elasticsearch.
Changes in version 4.0.8.RELEASE (2021-03-17)
---------------------------------------------
* #1712 - Requests with ReactiveElasticsearchRepository methods don't fail if they can't connect to Elasticsearch.
Changes in version 4.2.0-M4 (2021-02-18)
----------------------------------------
Changes in version 4.1.5 (2021-02-18)
-------------------------------------
Changes in version 4.2.0-M3 (2021-02-17)
----------------------------------------
* #1689 - Missing anchor links in documentation.
* #1680 - After upgrade to 4.x can't read property id from _source named (different value from _id).
* #1679 - Errors are silent in delete by query in ReactiveElasticsearchTemplate.
* #1676 - Align MappingElasticsearchConverter with other Spring Data converters.
* #1675 - Consider Document as simple type.
* #1669 - Cleanup Deprecations from 4.0.
* #1668 - Writing a more complex CriteriaQuery.
* #1667 - Couldn't find PersistentEntity for type class com.example.demo.dto.Address.
* #1665 - ReactiveElasticsearchOperations indexName encoded twice.
* #1662 - Documentation fix.
* #1659 - Fix source filter setup in multiget requests.
* #1655 - GeoJson types can be lowercase in Elasticsearch.
* #1649 - Upgrade to Elasticsearch 7.10.2.
* #1647 - Use own implementation of date formatters.
* #1644 - Implement update by query.
* #1565 - Allow using FieldNamingStrategy for property to fieldname matching [DATAES-993].
* #1370 - Add enabled mapping parameter to FieldType configuration [DATAES-798].
* #1218 - Add routing parameter to ElasticsearchOperations [DATAES-644].
* #1156 - Add @CountQuery annotation [DATAES-584].
* #1143 - Support for search_after [DATAES-571].
* #803 - Don't update indexed object if it is no persistent entity [DATAES-229].
* #725 - Add query Explain Support [DATAES-149].
Changes in version 4.1.4 (2021-02-17)
-------------------------------------
* #1667 - Couldn't find PersistentEntity for type class com.example.demo.dto.Address.
* #1665 - ReactiveElasticsearchOperations indexName encoded twice.
* #1662 - Documentation fix.
* #1659 - Fix source filter setup in multiget requests.
* #1655 - GeoJson types can be lowercase in Elasticsearch.
Changes in version 4.0.7.RELEASE (2021-02-17)
---------------------------------------------
* DATAES-996 - Update CI jobs with Docker Login.
* #1667 - Couldn't find PersistentEntity for type class com.example.demo.dto.Address.
* #1665 - ReactiveElasticsearchOperations indexName encoded twice.
* #1662 - Documentation fix.
* #1659 - Fix source filter setup in multiget requests.
Changes in version 3.2.13.RELEASE (2021-02-17)
----------------------------------------------
* #1694 - Upgrade to Elasticsearch 6.8.14.
* #1662 - Documentation fix.
Changes in version 4.2.0-M2 (2021-01-13)
----------------------------------------
* DATAES-1003 - add timeout to search query.
* DATAES-996 - Update CI jobs with Docker Login.
* DATAES-982 - Improve refresh handling.
* DATAES-946 - Support 'wildcard' field type.
* #1640 - Add support for GetFieldMapping request in ReactiveElasticsearchClient.
* #1638 - Upgrade to Elasticsearch 7.10.1.
* #1634 - Update Testcontainers dependency.
* #1632 - Update copyright notice to 2021.
* #1629 - Update repository after GitHub issues migration.
* #1576 - Add version of Spring dependency to docs [DATAES-1004].
* #1056 - Repository initialization should throw an Exception when index cannot be created [DATAES-481].
Changes in version 4.1.3 (2021-01-13)
-------------------------------------
* DATAES-996 - Update CI jobs with Docker Login.
* #1634 - Update Testcontainers dependency.
Changes in version 4.1.2 (2020-12-09)
-------------------------------------
* DATAES-991 - Wrong value for TermVector(with_positions_offets_payloads).
* DATAES-990 - Index creation fails with Authentication object cannot be null on startup.
* DATAES-987 - IndexOperations getMapping fail when using index alias.
* DATAES-978 - Accept DateFormat.none for a date property to enable custom Converters.
* DATAES-977 - Fix versions in reference documentation for 4.1.
* DATAES-973 - Release 4.1.2 (2020.0.2).
* DATAES-972 - BeforeConvertCallback should be called before index query is built.
* DATAES-543 - Adjust configuration support classes so they do not require proxying.
Changes in version 4.2.0-M1 (2020-12-09)
----------------------------------------
* DATAES-995 - Code Cleanup after DATACMNS-1838.
* DATAES-994 - Add setup for mutation testing.
* DATAES-991 - Wrong value for TermVector(with_positions_offets_payloads).
* DATAES-990 - Index creation fails with Authentication object cannot be null on startup.
* DATAES-989 - Improve deprecation warning for id properties without annotation.
* DATAES-988 - Allow specifying max results in NativeSearchQueryBuilder.
* DATAES-987 - IndexOperations getMapping fail when using index alias.
* DATAES-986 - Fix Javadoc.
* DATAES-985 - Add builder method for track_total_hits to NativeSearchQueryBuilder.
* DATAES-983 - Test dependency hoverfly-java-junit5 leaks into compile scope.
* DATAES-978 - Accept DateFormat.none for a date property to enable custom Converters.
* DATAES-976 - Implement CrudRepository.delete(Iterable<ID> ids).
* DATAES-975 - Upgrade to Elasticsearch 7.10.
* DATAES-974 - remove usage of deprecated WebClient exchange() method.
* DATAES-972 - BeforeConvertCallback should be called before index query is built.
* DATAES-971 - Fix tests for using a proxy with reactive client.
* DATAES-970 - Take Testcontainers version from the Spring Data Build pom.
* DATAES-969 - Use ResultProcessor in ElasticsearchPartQuery to build PartTree.
* DATAES-968 - Enable Maven caching for Jenkins jobs.
* DATAES-966 - Release 4.2 M1 (2021.0.0).
* DATAES-882 - HLRC Configuration - add ability to set max connections for the underlying HttpClient.
* DATAES-588 - Add support for custom callbacks in High Level/Low Level REST Client builder.
* DATAES-543 - Adjust configuration support classes so they do not require proxying.
* DATAES-362 - Add support for composable meta annotations.
* DATAES-247 - Support OpType in IndexQuery.
Changes in version 4.0.6.RELEASE (2020-12-09)
---------------------------------------------
* DATAES-991 - Wrong value for TermVector(with_positions_offets_payloads).
* DATAES-969 - Use ResultProcessor in ElasticsearchPartQuery to build PartTree.
* DATAES-968 - Enable Maven caching for Jenkins jobs.
* DATAES-964 - Release 4.0.6 (Neumann SR6).
Changes in version 3.2.12.RELEASE (2020-12-09)
----------------------------------------------
* DATAES-969 - Use ResultProcessor in ElasticsearchPartQuery to build PartTree.
* DATAES-963 - Release 3.2.12 (Moore SR12).
Changes in version 4.1.1 (2020-11-11)
-------------------------------------
* DATAES-969 - Use ResultProcessor in ElasticsearchPartQuery to build PartTree.
* DATAES-968 - Enable Maven caching for Jenkins jobs.
* DATAES-965 - Release 4.1.1 (2020.0.1).
Changes in version 4.1.0 (2020-10-28)
-------------------------------------
* DATAES-962 - Deprecate Joda support.
* DATAES-960 - Upgrade to Elasticsearch 7.9.3.
* DATAES-956 - Prevent double converter registration.
* DATAES-953 - DateTimeException occurred "yyyy-MM-dd HH: mm: ss" string is converted to Date.
* DATAES-952 - Optimize SearchPage implementation.
* DATAES-951 - Revert DATAES-934.
* DATAES-950 - Release 4.1 GA (2020.0.0).
* DATAES-931 - Add query support for geo shape queries.
* DATAES-796 - Provide new method to return SearchHits in ReactiveElasticsearchClient.
Changes in version 4.0.5.RELEASE (2020-10-28)
---------------------------------------------
* DATAES-953 - DateTimeException occurred "yyyy-MM-dd HH: mm: ss" string is converted to Date.
* DATAES-937 - Repository queries with IN filters fail with empty input list.
* DATAES-936 - Take id property from the source when deserializing an entity.
* DATAES-926 - Release 4.0.5 (Neumann SR5).
Changes in version 3.2.11.RELEASE (2020-10-28)
----------------------------------------------
* DATAES-961 - Upgrade to Elasticsearch 6.8.13.
* DATAES-937 - Repository queries with IN filters fail with empty input list.
* DATAES-925 - Release 3.2.11 (Moore SR11).
Changes in version 3.1.21.RELEASE (2020-10-28)
----------------------------------------------
* DATAES-958 - Release 3.1.21 (Lovelace SR21).
Changes in version 4.1.0-RC2 (2020-10-14)
-----------------------------------------
* DATAES-949 - dependency cleanup.
* DATAES-947 - Adopt to API changes in Project Reactor.
* DATAES-945 - Compilation error on JDK11+.
* DATAES-944 - Simplify logging setup in test environment.
* DATAES-943 - Add missing mapping parameters.
* DATAES-940 - Update to Elasticsearch 7.9.2.
* DATAES-937 - Repository queries with IN filters fail with empty input list.
* DATAES-936 - Take id property from the source when deserializing an entity.
* DATAES-935 - Setup integration tests separate from unit tests.
* DATAES-934 - Add a Query taking method to ElasticsearchRepository.
* DATAES-933 - Fix typo in javaDoc.
* DATAES-932 - GeoPoint - Point conversion is wrong.
* DATAES-930 - Add support for geo_shape type entity properties.
* DATAES-929 - Support geo_shape field type.
* DATAES-927 - Release 4.1 RC2 (2020.0.0).
* DATAES-921 - Investigate WebClient.retrieve() instead of using WebClient.exchange().
Changes in version 4.1.0-RC1 (2020-09-16)
-----------------------------------------
* DATAES-924 - Conversion of properties of collections of Temporal values fails.
* DATAES-923 - Upgrade to Elasticsearch 7.9.1.
* DATAES-922 - Move off Sink.emitXXX methods.
* DATAES-920 - Add parameter to @Field annotation to store null values.
* DATAES-919 - Fix error messages in test output.
* DATAES-914 - Use Testcontainers.
* DATAES-913 - Minor optimization on collection-returning derived queries.
* DATAES-912 - Derived Query with "In" Keyword does not work on Text field.
* DATAES-911 - Add documentation for automatic index creation.
* DATAES-910 - Upgrade to Elasticsearch 7.9.0.
* DATAES-909 - Add singular update() methods to ReactiveDocumentOperations.
* DATAES-908 - Fill version on an indexed entity.
* DATAES-907 - Track Total Hits not working when set to false.
* DATAES-904 - Release 4.1 RC1 (2020.0.0).
* DATAES-902 - Update to Elasticsearch 7.8.1.
* DATAES-898 - Add join-type relevant parts to reactive calls.
* DATAES-895 - Criteria.OperationKey.NEAR is not used anywhere.
* DATAES-854 - Add support for rank_feature datatype.
* DATAES-706 - CriteriaQueryProcessor must handle nested Criteria definitions.
Changes in version 4.0.4.RELEASE (2020-09-16)
---------------------------------------------
* DATAES-924 - Conversion of properties of collections of Temporal values fails.
* DATAES-912 - Derived Query with "In" Keyword does not work on Text field.
* DATAES-905 - Release 4.0.4 (Neumann SR4).
Changes in version 3.2.10.RELEASE (2020-09-16)
----------------------------------------------
* DATAES-903 - Update to Elasticsearch 6.8.12.
* DATAES-892 - Fix ElasticsearchEntityMapper recursive descent when reading Map objects.
* DATAES-888 - Release 3.2.10 (Moore SR10).
Changes in version 3.1.20.RELEASE (2020-09-16)
----------------------------------------------
* DATAES-887 - Release 3.1.20 (Lovelace SR20).
Changes in version 4.0.3.RELEASE (2020-08-12)
---------------------------------------------
* DATAES-897 - Add documentation for Highlight annotation.
* DATAES-896 - Use mainField property of @MultiField annotation instead of additional @Field annotation.
* DATAES-891 - Returning a Stream from a Query annotated repository method crashes.
* DATAES-890 - Release 4.0.3 (Neumann SR3).
Changes in version 4.1.0-M2 (2020-08-12)
----------------------------------------
* DATAES-901 - Operations deleting an entity should use a routing deducted from the entity.
* DATAES-899 - Add documentation for join-type.
* DATAES-897 - Add documentation for Highlight annotation.
* DATAES-896 - Use mainField property of @MultiField annotation instead of additional @Field annotation.
* DATAES-894 - Adapt to changes in Reactor.
* DATAES-893 - Adopt to changed module layout of Reactor Netty.
* DATAES-891 - Returning a Stream from a Query annotated repository method crashes.
* DATAES-886 - Complete reactive auditing.
* DATAES-883 - Fix log level on resource load error.
* DATAES-878 - Wrong value for TermVector(woth_positions_offsets).
* DATAES-877 - Update test logging dependency.
* DATAES-876 - Add seqno and primary term to entity on initial save.
* DATAES-875 - MappingElasticsearchConverter.updateQuery not called at all places.
* DATAES-874 - Deprecate parent-id related methods and fields.
* DATAES-872 - Release 4.1 M2 (2020.0.0).
* DATAES-869 - Update to Elasticsearch 7.8.
* DATAES-864 - Rework alias management.
* DATAES-842 - Documentation fixes.
* DATAES-612 - Add support for index templates.
* DATAES-433 - Replace parent-child mappings to join field.
* DATAES-321 - Support time base rolling indices.
* DATAES-244 - Support alias renaming.
* DATAES-233 - Support for rolling index strategy.
* DATAES-207 - Allow fetching indices by alias.
* DATAES-192 - Define alias for document.
* DATAES-150 - mapping are not created when entity is saved in new dynamic name index (spel).
Changes in version 4.0.2.RELEASE (2020-07-22)
---------------------------------------------
* DATAES-883 - Fix log level on resource load error.
* DATAES-878 - Wrong value for TermVector(woth_positions_offsets).
* DATAES-865 - Fix MappingElasticsearchConverter writing an Object property containing a Map.
* DATAES-863 - Improve server error response handling.
* DATAES-862 - Release 4.0.2 (Neumann SR2).
Changes in version 3.2.9.RELEASE (2020-07-22)
---------------------------------------------
* DATAES-861 - Release 3.2.9 (Moore SR9).
Changes in version 3.1.19.RELEASE (2020-07-22)
----------------------------------------------
* DATAES-860 - Release 3.1.19 (Lovelace SR19).
Changes in version 4.1.0-M1 (2020-06-25)
----------------------------------------
* DATAES-870 - Workaround for reactor-netty error.
* DATAES-868 - Upgrade to Netty 4.1.50.Final.
* DATAES-867 - Adopt to changes in Reactor Netty 1.0.
* DATAES-866 - Implement suggest search in reactive client.
* DATAES-865 - Fix MappingElasticsearchConverter writing an Object property containing a Map.
* DATAES-863 - Improve server error response handling.
* DATAES-859 - Don't use randomNumeric() in tests.
* DATAES-858 - Use standard Spring code of conduct.
* DATAES-857 - Registered simple types are not read from list.
* DATAES-853 - Cleanup tests that do not delete test indices.
* DATAES-852 - Upgrade to Elasticsearch 7.7.1.
* DATAES-850 - Add warning and documentation for missing TemporalAccessor configuration.
* DATAES-848 - Add the name of the index to SearchHit.
* DATAES-847 - Add missing DateFormat values.
* DATAES-845 - MappingElasticsearchConverter crashes when writing lists containing null values.
* DATAES-844 - Improve TOC formatting for migration guides.
* DATAES-841 - Remove deprecated type mappings code.
* DATAES-840 - Consolidate index name SpEL resolution.
* DATAES-839 - ReactiveElasticsearchTemplate should use RequestFactory.
* DATAES-838 - Update to Elasticsearch 7.7.0.
* DATAES-836 - Fix typo in Javadocs.
* DATAES-835 - Fix code sample in documentation for scroll API.
* DATAES-832 - findAllById repository method returns iterable with null elements for not found ids.
* DATAES-831 - SearchOperations.searchForStream does not use requested maxResults.
* DATAES-829 - Deprecate AbstractElasticsearchRepository and cleanup SimpleElasticsearchRepository.
* DATAES-828 - Fields of type date need to have a format defined.
* DATAES-827 - Repositories should not try to create an index when it already exists.
* DATAES-826 - Add method to IndexOperations to write an index mapping from a entity class.
* DATAES-825 - Update readme to use latest spring.io docs.
* DATAES-824 - Release 4.1 M1 (2020.0.0).
* DATAES-678 - Introduce ReactiveIndexOperations.
* DATAES-263 - Inner Hits support.
Changes in version 4.0.1.RELEASE (2020-06-10)
---------------------------------------------
* DATAES-857 - Registered simple types are not read from list.
* DATAES-850 - Add warning and documentation for missing TemporalAccessor configuration.
* DATAES-845 - MappingElasticsearchConverter crashes when writing lists containing null values.
* DATAES-844 - Improve TOC formatting for migration guides.
* DATAES-839 - ReactiveElasticsearchTemplate should use RequestFactory.
* DATAES-835 - Fix code sample in documentation for scroll API.
* DATAES-832 - findAllById repository method returns iterable with null elements for not found ids.
* DATAES-831 - SearchOperations.searchForStream does not use requested maxResults.
* DATAES-828 - Fields of type date need to have a format defined.
* DATAES-827 - Repositories should not try to create an index when it already exists.
* DATAES-823 - Release 4.0.1 (Neumann SR1).
Changes in version 3.2.8.RELEASE (2020-06-10)
---------------------------------------------
* DATAES-851 - Upgrade to Elasticsearch 6.8.10.
* DATAES-837 - Update to Elasticsearch 6.8.9.
* DATAES-821 - Fix code for adding an alias.
* DATAES-811 - Remove Travis CI.
* DATAES-807 - Release 3.2.8 (Moore SR8).
* DATAES-776 - Adapt RestClients class to change in InetSocketAddress class in JDK14.
* DATAES-767 - Fix ReactiveElasticsearch handling of 4xx HTTP responses.
Changes in version 3.1.18.RELEASE (2020-06-10)
----------------------------------------------
* DATAES-811 - Remove Travis CI.
* DATAES-806 - Release 3.1.18 (Lovelace SR18).
Changes in version 4.0.0.RELEASE (2020-05-12)
---------------------------------------------
* DATAES-822 - ElasticsearchRestTemplate should not use `spring-web`.
@@ -1139,3 +1548,39 @@ Release Notes - Spring Data Elasticsearch - Version 1.0 M1 (2014-02-07)


@@ -1,4 +1,4 @@
Spring Data Elasticsearch 4.0 GA
Spring Data Elasticsearch 4.0.9 (Neumann SR9)
Copyright (c) [2013-2019] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").
@@ -15,3 +15,12 @@ conditions of the subcomponent's license, as noted in the LICENSE file.


@@ -1,5 +1,5 @@
/*
* Copyright 2020 the original author or authors.
* Copyright 2020-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,23 +19,31 @@ import static org.assertj.core.api.Assertions.*;
import static org.elasticsearch.search.internal.SearchContext.*;
import static org.mockito.Mockito.*;
import org.elasticsearch.search.fetch.subphase.FetchSourceContext;
import reactor.core.publisher.Mono;
import reactor.test.StepVerifier;
import java.net.URI;
import java.util.Optional;
import java.util.function.Function;
import org.elasticsearch.ElasticsearchStatusException;
import org.elasticsearch.action.get.GetRequest;
import org.elasticsearch.action.search.SearchRequest;
import org.elasticsearch.client.Request;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.junit.jupiter.api.BeforeEach;
import org.elasticsearch.search.fetch.subphase.FetchSourceContext;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.ArgumentCaptor;
import org.mockito.Mock;
import org.mockito.Spy;
import org.mockito.junit.jupiter.MockitoExtension;
import org.springframework.http.HttpStatus;
import org.springframework.web.reactive.function.client.ClientResponse;
import reactor.test.StepVerifier;
import org.springframework.web.reactive.function.client.WebClient;
import org.springframework.web.util.UriBuilder;
/**
* @author Peter-Josef Meisch
@ -46,29 +54,23 @@ class DefaultReactiveElasticsearchClientTest {
@Mock private HostProvider hostProvider;
@Mock private Function<SearchRequest, Request> searchRequestConverter;
@Spy private RequestCreator requestCreator;
private DefaultReactiveElasticsearchClient client;
@Mock private WebClient webClient;
@BeforeEach
void setUp() {
client = new DefaultReactiveElasticsearchClient(hostProvider, new RequestCreator() {
@Override
public Function<SearchRequest, Request> search() {
return searchRequestConverter;
}
}) {
@Test
void shouldSetAppropriateRequestParametersOnCount() {
when(requestCreator.search()).thenReturn(searchRequestConverter);
SearchRequest searchRequest = new SearchRequest("someindex") //
.source(new SearchSourceBuilder().query(QueryBuilders.matchAllQuery()));
ReactiveElasticsearchClient client = new DefaultReactiveElasticsearchClient(hostProvider, requestCreator) {
@Override
public Mono<ClientResponse> execute(ReactiveElasticsearchClientCallback callback) {
return Mono.empty();
}
};
}
@Test
void shouldSetAppropriateRequestParametersOnCount() {
SearchRequest searchRequest = new SearchRequest("someindex") //
.source(new SearchSourceBuilder().query(QueryBuilders.matchAllQuery()));
client.count(searchRequest).as(StepVerifier::create).verifyComplete();
@ -79,4 +81,31 @@ class DefaultReactiveElasticsearchClientTest {
assertThat(source.trackTotalHitsUpTo()).isEqualTo(TRACK_TOTAL_HITS_ACCURATE);
assertThat(source.fetchSource()).isEqualTo(FetchSourceContext.DO_NOT_FETCH_SOURCE);
}
@Test // #1712
@DisplayName("should throw ElasticsearchStatusException on server 5xx with empty body")
void shouldThrowElasticsearchStatusExceptionOnServer5xxWithEmptyBody() {
when(hostProvider.getActive(any())).thenReturn(Mono.just(webClient));
WebClient.RequestBodyUriSpec requestBodyUriSpec = mock(WebClient.RequestBodyUriSpec.class);
when(requestBodyUriSpec.uri((Function<UriBuilder, URI>) any())).thenReturn(requestBodyUriSpec);
when(requestBodyUriSpec.attribute(any(), any())).thenReturn(requestBodyUriSpec);
when(requestBodyUriSpec.headers(any())).thenReturn(requestBodyUriSpec);
when(webClient.method(any())).thenReturn(requestBodyUriSpec);
ClientResponse clientResponse = mock(ClientResponse.class);
when(clientResponse.statusCode()).thenReturn(HttpStatus.SERVICE_UNAVAILABLE);
ClientResponse.Headers headers = mock(ClientResponse.Headers.class);
when(headers.contentType()).thenReturn(Optional.empty());
when(clientResponse.headers()).thenReturn(headers);
when(clientResponse.body(any())).thenReturn(Mono.empty());
when(requestBodyUriSpec.exchange()).thenReturn(Mono.just(clientResponse));
ReactiveElasticsearchClient client = new DefaultReactiveElasticsearchClient(hostProvider, requestCreator);
client.get(new GetRequest("42")) //
.as(StepVerifier::create) //
.expectError(ElasticsearchStatusException.class) //
.verify(); //
}
}
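The test above pins down the #1712 fix: a 5xx response must surface as an `ElasticsearchStatusException` even when the body is empty and there is nothing to parse. The decision being verified can be sketched in isolation (a hypothetical helper using a stand-in exception type, not the actual client code):

```java
public class ServerErrorSketch {

    // Mirrors the behavior the test asserts: any 5xx status becomes an
    // error even when there is no response body to derive a message from.
    static RuntimeException toException(int statusCode, String body) {
        if (statusCode >= 500) {
            String reason = (body == null || body.isEmpty())
                    ? "Elasticsearch exception [status=" + statusCode + "]"
                    : body;
            // stand-in for ElasticsearchStatusException
            return new IllegalStateException(reason);
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(toException(503, "") != null);
    }
}
```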

View File

@ -1,5 +1,5 @@
/*
* Copyright 2018-2020 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@ -62,7 +62,7 @@ public class ReactiveElasticsearchClientUnitTests {
static final String HOST = ":9200";
MockDelegatingElasticsearchHostProvider<HostProvider> hostProvider;
MockDelegatingElasticsearchHostProvider<? extends HostProvider<?>> hostProvider;
ReactiveElasticsearchClient client;
@BeforeEach

View File

@ -186,7 +186,7 @@ public class ReactiveMockClientTestsUtils {
return delegate;
}
public MockDelegatingElasticsearchHostProvider<T> withActiveDefaultHost(String host) {
public MockDelegatingElasticsearchHostProvider<? extends HostProvider<?>> withActiveDefaultHost(String host) {
return new MockDelegatingElasticsearchHostProvider(HttpHeaders.EMPTY, clientProvider, errorCollector, delegate,
host);
}

View File

@ -1,5 +1,5 @@
/*
* Copyright 2018-2020 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@ -30,6 +30,7 @@ import org.springframework.data.elasticsearch.client.reactive.ReactiveMockClient
/**
* @author Christoph Strobl
* @author Peter-Josef Meisch
*/
public class SingleNodeHostProviderUnitTests {

View File

@ -197,14 +197,22 @@ class ElasticsearchPartQueryTests {
String query = getQueryBuilder(methodName, parameterClasses, parameters);
String expected = "{\"query\": {" + //
" \"bool\" : {" + //
" \"must\" : [" + //
" {\"bool\" : {\"must\" : [{\"terms\" : {\"name\" : [\"" + names.get(0) + "\", \"" + names.get(1)
+ "\"]}}]}}" + //
" ]" + //
" }" + //
"}}"; //
String expected = "{\n" + //
" \"query\": {\n" + //
" \"bool\": {\n" + //
" \"must\": [\n" + //
" {\n" + //
" \"query_string\": {\n" + //
" \"query\": \"\\\"Title\\\" \\\"Title2\\\"\",\n" + //
" \"fields\": [\n" + //
" \"name^1.0\"\n" + //
" ]\n" + //
" }\n" + //
" }\n" + //
" ]\n" + //
" }\n" + //
" }\n" + //
"}\n"; //
assertEquals(expected, query, false);
}
@ -220,14 +228,22 @@ class ElasticsearchPartQueryTests {
String query = getQueryBuilder(methodName, parameterClasses, parameters);
String expected = "{\"query\": {" + //
" \"bool\" : {" + //
" \"must\" : [" + //
" {\"bool\" : {\"must_not\" : [{\"terms\" : {\"name\" : [\"" + names.get(0) + "\", \"" + names.get(1)
+ "\"]}}]}}" + //
" ]" + //
" }" + //
"}}"; //
String expected = "{\n" + //
" \"query\": {\n" + //
" \"bool\": {\n" + //
" \"must\": [\n" + //
" {\n" + //
" \"query_string\": {\n" + //
" \"query\": \"NOT(\\\"Title\\\" \\\"Title2\\\")\",\n" + //
" \"fields\": [\n" + //
" \"name^1.0\"\n" + //
" ]\n" + //
" }\n" + //
" }\n" + //
" ]\n" + //
" }\n" + //
" }\n" + //
"}\n"; //
assertEquals(expected, query, false);
}

View File

@ -77,7 +77,7 @@ import org.springframework.data.elasticsearch.annotations.ScriptedField;
import org.springframework.data.elasticsearch.core.geo.GeoPoint;
import org.springframework.data.elasticsearch.core.mapping.IndexCoordinates;
import org.springframework.data.elasticsearch.core.query.*;
import org.springframework.data.util.CloseableIterator;
import org.springframework.data.util.StreamUtils;
import org.springframework.lang.Nullable;
/**
@ -1298,27 +1298,33 @@ public abstract class ElasticsearchTemplateTests {
assertThat(sampleEntities).hasSize(30);
}
@Test // DATAES-167
public void shouldReturnResultsWithStreamForGivenCriteriaQuery() {
@Test // DATAES-167, DATAES-831
public void shouldReturnAllResultsWithStreamForGivenCriteriaQuery() {
// given
List<IndexQuery> entities = createSampleEntitiesWithMessage("Test message", 30);
// when
operations.bulkIndex(entities, index);
operations.bulkIndex(createSampleEntitiesWithMessage("Test message", 30), index);
indexOperations.refresh();
// then
CriteriaQuery criteriaQuery = new CriteriaQuery(new Criteria());
criteriaQuery.setPageable(PageRequest.of(0, 10));
CloseableIterator<SearchHit<SampleEntity>> stream = operations.searchForStream(criteriaQuery, SampleEntity.class,
index);
List<SearchHit<SampleEntity>> sampleEntities = new ArrayList<>();
while (stream.hasNext()) {
sampleEntities.add(stream.next());
}
assertThat(sampleEntities).hasSize(30);
long count = StreamUtils
.createStreamFromIterator(operations.searchForStream(criteriaQuery, SampleEntity.class, index)).count();
assertThat(count).isEqualTo(30);
}
@Test // DATAES-831
void shouldLimitStreamResultToRequestedSize() {
operations.bulkIndex(createSampleEntitiesWithMessage("Test message", 30), index);
indexOperations.refresh();
CriteriaQuery criteriaQuery = new CriteriaQuery(new Criteria());
criteriaQuery.setMaxResults(10);
long count = StreamUtils
.createStreamFromIterator(operations.searchForStream(criteriaQuery, SampleEntity.class, index)).count();
assertThat(count).isEqualTo(10);
}
private static List<IndexQuery> createSampleEntitiesWithMessage(String message, int numberOfEntities) {
@ -3128,8 +3134,8 @@ public abstract class ElasticsearchTemplateTests {
operations.refresh(OptimisticEntity.class);
List<Query> queries = singletonList(queryForOne(saved.getId()));
List<SearchHits<OptimisticEntity>> retrievedHits = operations.multiSearch(queries,
OptimisticEntity.class, operations.getIndexCoordinatesFor(OptimisticEntity.class));
List<SearchHits<OptimisticEntity>> retrievedHits = operations.multiSearch(queries, OptimisticEntity.class,
operations.getIndexCoordinatesFor(OptimisticEntity.class));
OptimisticEntity retrieved = retrievedHits.get(0).getSearchHit(0).getContent();
assertThatSeqNoPrimaryTermIsFilled(retrieved);
@ -3162,8 +3168,7 @@ public abstract class ElasticsearchTemplateTests {
operations.save(forEdit1);
forEdit2.setMessage("It'll be great");
assertThatThrownBy(() -> operations.save(forEdit2))
.isInstanceOf(OptimisticLockingFailureException.class);
assertThatThrownBy(() -> operations.save(forEdit2)).isInstanceOf(OptimisticLockingFailureException.class);
}
@Test // DATAES-799
@ -3179,8 +3184,7 @@ public abstract class ElasticsearchTemplateTests {
operations.save(forEdit1);
forEdit2.setMessage("It'll be great");
assertThatThrownBy(() -> operations.save(forEdit2))
.isInstanceOf(OptimisticLockingFailureException.class);
assertThatThrownBy(() -> operations.save(forEdit2)).isInstanceOf(OptimisticLockingFailureException.class);
}
@Test // DATAES-799

View File

@ -34,6 +34,7 @@ import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.dao.DataAccessException;
import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.DateFormat;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.core.mapping.IndexCoordinates;
@ -146,7 +147,7 @@ public class LogEntityTests {
@Field(type = Ip) private String ip;
@Field(type = Date) private java.util.Date date;
@Field(type = Date, format = DateFormat.date_time) private java.util.Date date;
private LogEntity() {}

View File

@ -31,6 +31,8 @@ import reactor.test.StepVerifier;
import java.lang.Long;
import java.lang.Object;
import java.net.ConnectException;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
@ -49,7 +51,6 @@ import org.elasticsearch.search.sort.SortOrder;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.data.annotation.Id;
@ -524,7 +525,8 @@ public class ReactiveElasticsearchTemplateTests {
@Test // DATAES-567, DATAES-767
public void aggregateShouldErrorWhenIndexDoesNotExist() {
template.aggregate(new CriteriaQuery(Criteria.where("message").is("some message")), SampleEntity.class,
template
.aggregate(new CriteriaQuery(Criteria.where("message").is("some message")), SampleEntity.class,
IndexCoordinates.of("no-such-index")) //
.as(StepVerifier::create) //
.expectError(ElasticsearchStatusException.class);
@ -981,6 +983,28 @@ public class ReactiveElasticsearchTemplateTests {
// --> JUST some helpers
@Test // #1665
void shouldBeAbleToProcessDateMathIndexNames() {
String indexName = "foo-" + LocalDate.now().format(DateTimeFormatter.ofPattern("yyyy.MM"));
String dateMathIndexName = "<foo-{now/M{yyyy.MM}}>";
SampleEntity entity = randomEntity("foo");
template.save(entity, IndexCoordinates.of(dateMathIndexName)) //
.as(StepVerifier::create) //
.expectNext(entity) //
.verifyComplete(); //
template.get(entity.getId(), SampleEntity.class, IndexCoordinates.of(indexName)) //
.as(StepVerifier::create) //
.expectNext(entity) //
.verifyComplete(); //
}
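The test above saves through the date-math index name `<foo-{now/M{yyyy.MM}}>` and reads back from the concrete monthly index. The resolution it expects can be sketched client-side (a simplified helper covering only this pattern, not Elasticsearch's full date-math grammar, which is parsed server-side):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DateMathIndexName {

    // Resolves the one case used in the test: a fixed prefix plus the
    // current date rounded to the month, formatted as yyyy.MM.
    static String resolve(String prefix, LocalDate now) {
        return prefix + now.format(DateTimeFormatter.ofPattern("yyyy.MM"));
    }

    public static void main(String[] args) {
        System.out.println(resolve("foo-", LocalDate.of(2021, 4, 14)));
    }
}
```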
// endregion
// region Helper functions
private SampleEntity randomEntity(String message) {
return SampleEntity.builder() //

View File

@ -0,0 +1,219 @@
/*
* Copyright 2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.elasticsearch.core;
import static org.assertj.core.api.Assertions.*;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.Collections;
import java.util.List;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
import org.springframework.data.elasticsearch.core.mapping.IndexCoordinates;
import org.springframework.data.elasticsearch.core.query.NativeSearchQueryBuilder;
import org.springframework.data.elasticsearch.core.query.Query;
import org.springframework.data.elasticsearch.core.query.SourceFilter;
import org.springframework.data.elasticsearch.junit.jupiter.ElasticsearchRestTemplateConfiguration;
import org.springframework.data.elasticsearch.junit.jupiter.SpringIntegrationTest;
import org.springframework.test.context.ContextConfiguration;
/**
* @author Peter-Josef Meisch
*/
@SpringIntegrationTest
@ContextConfiguration(classes = { ElasticsearchRestTemplateConfiguration.class })
public class SourceFilterIntegrationTests {
private static final String INDEX = "sourcefilter-tests";
@Autowired private ElasticsearchOperations operations;
private IndexOperations indexOps;
@BeforeEach
void setUp() {
indexOps = operations.indexOps(Entity.class);
indexOps.create();
indexOps.putMapping(indexOps.createMapping());
operations.save(Entity.builder().id("42").field1("one").field2("two").field3("three").build());
indexOps.refresh();
}
@AfterEach
void tearDown() {
indexOps.delete();
}
@Test // #1659
@DisplayName("should only return requested fields on search")
void shouldOnlyReturnRequestedFieldsOnSearch() {
Query query = Query.findAll();
query.addFields("field2");
SearchHits<Entity> searchHits = operations.search(query, Entity.class);
assertThat(searchHits).hasSize(1);
Entity entity = searchHits.getSearchHit(0).getContent();
assertThat(entity.getField1()).isNull();
assertThat(entity.getField2()).isEqualTo("two");
assertThat(entity.getField3()).isNull();
}
@Test // #1659
@DisplayName("should only return requested fields on multiget")
void shouldOnlyReturnRequestedFieldsOnMultiGet() {
Query query = new NativeSearchQueryBuilder().withIds(Collections.singleton("42")).build();
query.addFields("field2");
List<Entity> entities = operations.multiGet(query, Entity.class, IndexCoordinates.of(INDEX));
assertThat(entities).hasSize(1);
Entity entity = entities.get(0);
assertThat(entity.getField1()).isNull();
assertThat(entity.getField2()).isEqualTo("two");
assertThat(entity.getField3()).isNull();
}
@Test // #1659
@DisplayName("should not return excluded fields from SourceFilter on search")
void shouldNotReturnExcludedFieldsFromSourceFilterOnSearch() {
Query query = Query.findAll();
query.addSourceFilter(new SourceFilter() {
@Override
public String[] getIncludes() {
return new String[] {};
}
@Override
public String[] getExcludes() {
return new String[] { "field2" };
}
});
SearchHits<Entity> entities = operations.search(query, Entity.class);
assertThat(entities).hasSize(1);
Entity entity = entities.getSearchHit(0).getContent();
assertThat(entity.getField1()).isNotNull();
assertThat(entity.getField2()).isNull();
assertThat(entity.getField3()).isNotNull();
}
@Test // #1659
@DisplayName("should not return excluded fields from SourceFilter on multiget")
void shouldNotReturnExcludedFieldsFromSourceFilterOnMultiGet() {
Query query = new NativeSearchQueryBuilder().withIds(Collections.singleton("42")).build();
query.addSourceFilter(new SourceFilter() {
@Override
public String[] getIncludes() {
return new String[] {};
}
@Override
public String[] getExcludes() {
return new String[] { "field2" };
}
});
List<Entity> entities = operations.multiGet(query, Entity.class, IndexCoordinates.of(INDEX));
assertThat(entities).hasSize(1);
Entity entity = entities.get(0);
assertThat(entity.getField1()).isNotNull();
assertThat(entity.getField2()).isNull();
assertThat(entity.getField3()).isNotNull();
}
@Test // #1659
@DisplayName("should only return included fields from SourceFilter on search")
void shouldOnlyReturnIncludedFieldsFromSourceFilterOnSearch() {
Query query = Query.findAll();
query.addSourceFilter(new SourceFilter() {
@Override
public String[] getIncludes() {
return new String[] { "field2" };
}
@Override
public String[] getExcludes() {
return new String[] {};
}
});
SearchHits<Entity> entities = operations.search(query, Entity.class);
assertThat(entities).hasSize(1);
Entity entity = entities.getSearchHit(0).getContent();
assertThat(entity.getField1()).isNull();
assertThat(entity.getField2()).isNotNull();
assertThat(entity.getField3()).isNull();
}
@Test // #1659
@DisplayName("should only return included fields from SourceFilter on multiget")
void shouldOnlyReturnIncludedFieldsFromSourceFilterOnMultiGet() {
Query query = new NativeSearchQueryBuilder().withIds(Collections.singleton("42")).build();
query.addSourceFilter(new SourceFilter() {
@Override
public String[] getIncludes() {
return new String[] { "field2" };
}
@Override
public String[] getExcludes() {
return new String[] {};
}
});
List<Entity> entities = operations.multiGet(query, Entity.class, IndexCoordinates.of(INDEX));
assertThat(entities).hasSize(1);
Entity entity = entities.get(0);
assertThat(entity.getField1()).isNull();
assertThat(entity.getField2()).isNotNull();
assertThat(entity.getField3()).isNull();
}
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
@Document(indexName = INDEX)
public static class Entity {
@Id private String id;
@Field(type = FieldType.Text) private String field1;
@Field(type = FieldType.Text) private String field2;
@Field(type = FieldType.Text) private String field3;
}
}
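The include/exclude semantics these tests verify can be illustrated without a cluster. A minimal sketch of `_source` filtering applied to a field map (a hypothetical helper, not the Spring Data or Elasticsearch implementation, and without wildcard support):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class SourceFilterSketch {

    // Applies filtering the way the tests expect: an empty include list
    // means "all fields"; excluded fields are then removed.
    static Map<String, Object> filter(Map<String, Object> source, List<String> includes, List<String> excludes) {
        Map<String, Object> result = new LinkedHashMap<>();
        source.forEach((field, value) -> {
            boolean included = includes.isEmpty() || includes.contains(field);
            if (included && !excludes.contains(field)) {
                result.put(field, value);
            }
        });
        return result;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = new LinkedHashMap<>();
        doc.put("field1", "one");
        doc.put("field2", "two");
        doc.put("field3", "three");
        // excluding field2 keeps field1 and field3
        System.out.println(filter(doc, List.of(), List.of("field2")));
        // including only field2 drops the rest
        System.out.println(filter(doc, List.of("field2"), List.of()));
    }
}
```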

View File

@ -0,0 +1,25 @@
/*
* Copyright 2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.elasticsearch.core;
import org.springframework.data.elasticsearch.junit.jupiter.ElasticsearchTemplateConfiguration;
import org.springframework.test.context.ContextConfiguration;
/**
* @author Peter-Josef Meisch
*/
@ContextConfiguration(classes = { ElasticsearchTemplateConfiguration.class })
public class SourceFilterIntegrationTransportTests extends SourceFilterIntegrationTests {}

View File

@ -25,6 +25,7 @@ import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;
import org.junit.jupiter.api.Test;
import org.springframework.data.util.StreamUtils;
/**
* @author Sascha Woo
@ -45,6 +46,7 @@ public class StreamQueriesTest {
// when
SearchHitsIterator<String> iterator = StreamQueries.streamResults( //
0, //
searchHits, //
scrollId -> newSearchScrollHits(Collections.emptyList(), scrollId), //
scrollIds -> clearScrollCalled.set(true));
@ -70,6 +72,7 @@ public class StreamQueriesTest {
// when
SearchHitsIterator<String> iterator = StreamQueries.streamResults( //
0, //
searchHits, //
scrollId -> newSearchScrollHits(Collections.emptyList(), scrollId), //
scrollId -> {});
@ -90,10 +93,12 @@ public class StreamQueriesTest {
Collections.singletonList(new SearchHit<String>(null, 0, null, null, "one")), "s-2");
SearchScrollHits<String> searchHits4 = newSearchScrollHits(Collections.emptyList(), "s-3");
Iterator<SearchScrollHits<String>> searchScrollHitsIterator = Arrays.asList(searchHits1, searchHits2, searchHits3,searchHits4).iterator();
Iterator<SearchScrollHits<String>> searchScrollHitsIterator = Arrays
.asList(searchHits1, searchHits2, searchHits3, searchHits4).iterator();
List<String> clearedScrollIds = new ArrayList<>();
SearchHitsIterator<String> iterator = StreamQueries.streamResults( //
0, //
searchScrollHitsIterator.next(), //
scrollId -> searchScrollHitsIterator.next(), //
scrollIds -> clearedScrollIds.addAll(scrollIds));
@ -106,6 +111,56 @@ public class StreamQueriesTest {
assertThat(clearedScrollIds).isEqualTo(Arrays.asList("s-1", "s-2", "s-3"));
}
@Test // DATAES-831
void shouldReturnAllForRequestedSizeOf0() {
SearchScrollHits<String> searchHits1 = newSearchScrollHits(
Collections.singletonList(new SearchHit<String>(null, 0, null, null, "one")), "s-1");
SearchScrollHits<String> searchHits2 = newSearchScrollHits(
Collections.singletonList(new SearchHit<String>(null, 0, null, null, "one")), "s-2");
SearchScrollHits<String> searchHits3 = newSearchScrollHits(
Collections.singletonList(new SearchHit<String>(null, 0, null, null, "one")), "s-2");
SearchScrollHits<String> searchHits4 = newSearchScrollHits(Collections.emptyList(), "s-3");
Iterator<SearchScrollHits<String>> searchScrollHitsIterator = Arrays
.asList(searchHits1, searchHits2, searchHits3, searchHits4).iterator();
SearchHitsIterator<String> iterator = StreamQueries.streamResults( //
0, //
searchScrollHitsIterator.next(), //
scrollId -> searchScrollHitsIterator.next(), //
scrollIds -> {});
long count = StreamUtils.createStreamFromIterator(iterator).count();
assertThat(count).isEqualTo(3);
}
@Test // DATAES-831
void shouldOnlyReturnRequestedCount() {
SearchScrollHits<String> searchHits1 = newSearchScrollHits(
Collections.singletonList(new SearchHit<String>(null, 0, null, null, "one")), "s-1");
SearchScrollHits<String> searchHits2 = newSearchScrollHits(
Collections.singletonList(new SearchHit<String>(null, 0, null, null, "one")), "s-2");
SearchScrollHits<String> searchHits3 = newSearchScrollHits(
Collections.singletonList(new SearchHit<String>(null, 0, null, null, "one")), "s-2");
SearchScrollHits<String> searchHits4 = newSearchScrollHits(Collections.emptyList(), "s-3");
Iterator<SearchScrollHits<String>> searchScrollHitsIterator = Arrays
.asList(searchHits1, searchHits2, searchHits3, searchHits4).iterator();
SearchHitsIterator<String> iterator = StreamQueries.streamResults( //
2, //
searchScrollHitsIterator.next(), //
scrollId -> searchScrollHitsIterator.next(), //
scrollIds -> {});
long count = StreamUtils.createStreamFromIterator(iterator).count();
assertThat(count).isEqualTo(2);
}
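The DATAES-831 contract exercised by these two tests — a requested size of 0 means unbounded, any positive value caps iteration — can be sketched with plain streams (a simplified model, not the `StreamQueries` scroll implementation):

```java
import java.util.List;
import java.util.stream.Stream;

public class StreamLimitSketch {

    // Models the contract the tests assert: maxResults == 0 streams every
    // hit the scroll produces, maxResults > 0 stops after that many.
    static long countHits(List<String> hits, int maxResults) {
        Stream<String> stream = hits.stream();
        if (maxResults > 0) {
            stream = stream.limit(maxResults);
        }
        return stream.count();
    }

    public static void main(String[] args) {
        List<String> hits = List.of("one", "one", "one");
        System.out.println(countHits(hits, 0)); // all three
        System.out.println(countHits(hits, 2)); // capped at two
    }
}
```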
private SearchScrollHits<String> newSearchScrollHits(List<SearchHit<String>> hits, String scrollId) {
return new SearchHitsImpl<String>(hits.size(), TotalHitsRelation.EQUAL_TO, 0, scrollId, hits, null);
}

View File

@ -2,6 +2,7 @@ package org.springframework.data.elasticsearch.core.convert;
import static org.assertj.core.api.Assertions.*;
import java.time.Instant;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.ZoneId;
@ -9,6 +10,7 @@ import java.time.ZonedDateTime;
import java.util.Date;
import java.util.GregorianCalendar;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.EnumSource;
@ -76,4 +78,36 @@ class ElasticsearchDateConverterTests {
assertThat(parsed).isEqualTo(legacyDate);
}
@Test // DATAES-953
@DisplayName("should write and read Date with custom format")
void shouldWriteAndReadDateWithCustomFormat() {
// only seconds as the format string does not store millis
long currentTimeSeconds = System.currentTimeMillis() / 1_000;
Date date = new Date(currentTimeSeconds * 1_000);
ElasticsearchDateConverter converter = ElasticsearchDateConverter.of("uuuu-MM-dd HH:mm:ss");
String formatted = converter.format(date);
Date parsed = converter.parse(formatted);
assertThat(parsed).isEqualTo(date);
}
@Test // DATAES-953
@DisplayName("should write and read Instant with custom format")
void shouldWriteAndReadInstantWithCustomFormat() {
// only seconds as the format string does not store millis
long currentTimeSeconds = System.currentTimeMillis() / 1_000;
Instant instant = Instant.ofEpochSecond(currentTimeSeconds);
ElasticsearchDateConverter converter = ElasticsearchDateConverter.of("uuuu-MM-dd HH:mm:ss");
String formatted = converter.format(instant);
Instant parsed = converter.parse(formatted, Instant.class);
assertThat(parsed).isEqualTo(instant);
}
}
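The round-trip these DATAES-953 tests assert can be reproduced with plain `java.time`. A sketch using the same second-precision pattern (assumes UTC; the actual converter handles zone resolution itself):

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class SecondPrecisionRoundTrip {

    static final DateTimeFormatter FORMAT = DateTimeFormatter.ofPattern("uuuu-MM-dd HH:mm:ss");

    // Formats an Instant with second precision; millis are dropped because
    // the pattern has no fraction-of-second field.
    static String format(Instant instant) {
        return LocalDateTime.ofInstant(instant, ZoneOffset.UTC).format(FORMAT);
    }

    static Instant parse(String value) {
        return LocalDateTime.parse(value, FORMAT).toInstant(ZoneOffset.UTC);
    }

    public static void main(String[] args) {
        // truncate to whole seconds first, as the tests do
        Instant instant = Instant.ofEpochSecond(Instant.now().getEpochSecond());
        System.out.println(parse(format(instant)).equals(instant));
    }
}
```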

View File

@ -26,6 +26,7 @@ import lombok.EqualsAndHashCode;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.RequiredArgsConstructor;
import lombok.Setter;
import java.time.LocalDate;
import java.util.ArrayList;
@ -39,6 +40,7 @@ import java.util.Map;
import org.json.JSONException;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
@ -616,7 +618,26 @@ public class MappingElasticsearchConverterUnitTests {
assertEquals(expected, json, false);
}
@Test
@Test // DATAES-924
@DisplayName("should write list of LocalDate")
void shouldWriteListOfLocalDate() throws JSONException {
LocalDatesEntity entity = new LocalDatesEntity();
entity.setId("4711");
entity.setDates(Arrays.asList(LocalDate.of(2020, 9, 15), LocalDate.of(2019, 5, 1)));
String expected = "{\n" + //
" \"id\": \"4711\",\n" + //
" \"dates\": [\"15.09.2020\", \"01.05.2019\"]\n" + //
"}\n"; //
Document document = Document.create();
mappingElasticsearchConverter.write(entity, document);
String json = document.toJson();
assertEquals(expected, json, false);
}
@Test // DATAES-716
void shouldReadLocalDate() {
Document document = Document.create();
document.put("id", "4711");
@ -632,6 +653,20 @@ public class MappingElasticsearchConverterUnitTests {
assertThat(person.getGender()).isEqualTo(Gender.MAN);
}
@Test // DATAES-924
@DisplayName("should read list of LocalDate")
void shouldReadListOfLocalDate() {
Document document = Document.create();
document.put("id", "4711");
document.put("dates", new String[] { "15.09.2020", "01.05.2019" });
LocalDatesEntity entity = mappingElasticsearchConverter.read(LocalDatesEntity.class, document);
assertThat(entity.getId()).isEqualTo("4711");
assertThat(entity.getDates()).hasSize(2).containsExactly(LocalDate.of(2020, 9, 15), LocalDate.of(2019, 5, 1));
}
@Test // DATAES-763
void writeEntityWithMapDataType() {
@ -718,6 +753,101 @@ public class MappingElasticsearchConverterUnitTests {
assertThat(entity.seqNoPrimaryTerm).isNull();
}
@Test // DATAES-845
void shouldWriteCollectionsWithNullValues() throws JSONException {
EntityWithListProperty entity = new EntityWithListProperty();
entity.setId("42");
entity.setValues(Arrays.asList(null, "two", null, "four"));
String expected = '{' + //
" \"id\": \"42\"," + //
" \"values\": [null, \"two\", null, \"four\"]" + //
'}';
Document document = Document.create();
mappingElasticsearchConverter.write(entity, document);
String json = document.toJson();
assertEquals(expected, json, false);
}
@Test // DATAES-857
void shouldWriteEntityWithListOfGeoPoints() throws JSONException {
GeoPointListEntity entity = new GeoPointListEntity();
entity.setId("42");
List<GeoPoint> locations = Arrays.asList(new GeoPoint(12.34, 23.45), new GeoPoint(34.56, 45.67));
entity.setLocations(locations);
String expected = "{\n" + //
" \"id\": \"42\",\n" + //
" \"locations\": [\n" + //
" {\n" + //
" \"lat\": 12.34,\n" + //
" \"lon\": 23.45\n" + //
" },\n" + //
" {\n" + //
" \"lat\": 34.56,\n" + //
" \"lon\": 45.67\n" + //
" }\n" + //
" ]\n" + //
"}"; //
Document document = Document.create();
mappingElasticsearchConverter.write(entity, document);
String json = document.toJson();
assertEquals(expected, json, false);
}
@Test // DATAES-857
void shouldReadEntityWithListOfGeoPoints() {
String json = "{\n" + //
" \"id\": \"42\",\n" + //
" \"locations\": [\n" + //
" {\n" + //
" \"lat\": 12.34,\n" + //
" \"lon\": 23.45\n" + //
" },\n" + //
" {\n" + //
" \"lat\": 34.56,\n" + //
" \"lon\": 45.67\n" + //
" }\n" + //
" ]\n" + //
"}"; //
Document document = Document.parse(json);
GeoPointListEntity entity = mappingElasticsearchConverter.read(GeoPointListEntity.class, document);
assertThat(entity.id).isEqualTo("42");
assertThat(entity.locations).containsExactly(new GeoPoint(12.34, 23.45), new GeoPoint(34.56, 45.67));
}
@Test // DATAES-865
void shouldWriteEntityWithMapAsObject() throws JSONException {
Map<String, Object> map = new LinkedHashMap<>();
map.put("foo", "bar");
EntityWithObject entity = new EntityWithObject();
entity.setId("42");
entity.setContent(map);
String expected = "{\n" + //
" \"id\": \"42\",\n" + //
" \"content\": {\n" + //
" \"foo\": \"bar\"\n" + //
" }\n" + //
"}\n"; //
Document document = Document.create();
mappingElasticsearchConverter.write(entity, document);
assertEquals(expected, document.toJson(), false);
}
private String pointTemplate(String name, Point point) {
return String.format(Locale.ENGLISH, "\"%s\":{\"lat\":%.1f,\"lon\":%.1f}", name, point.getX(), point.getY());
}
@ -755,6 +885,15 @@ public class MappingElasticsearchConverterUnitTests {
Map<String, Inventory> inventoryMap;
}
@Data
@Getter
@Setter
static class LocalDatesEntity {
@Id private String id;
@Field(name = "dates", type = FieldType.Date, format = DateFormat.custom,
pattern = "dd.MM.uuuu") private List<LocalDate> dates;
}
enum Gender {
MAN("1"), MACHINE("0");
@ -932,4 +1071,23 @@ public class MappingElasticsearchConverterUnitTests {
@Nullable private SeqNoPrimaryTerm seqNoPrimaryTerm;
}
@Data
static class EntityWithListProperty {
@Id private String id;
private List<String> values;
}
@Data
static class GeoPointListEntity {
@Id String id;
List<GeoPoint> locations;
}
@Data
static class EntityWithObject {
@Id private String id;
private Object content;
}
}

View File

@ -42,6 +42,7 @@ import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Set;
import org.assertj.core.data.Percentage;
@ -263,6 +264,7 @@ public class MappingBuilderTests extends MappingContextBaseTests {
}
@Test // DATAES-420
@SuppressWarnings({ "rawtypes", "unchecked" })
public void shouldUseBothAnalyzer() {
// given
@ -285,6 +287,7 @@ public class MappingBuilderTests extends MappingContextBaseTests {
}
@Test // DATAES-492
@SuppressWarnings("rawtypes")
public void shouldUseKeywordNormalizer() {
// given
@ -305,6 +308,7 @@ public class MappingBuilderTests extends MappingContextBaseTests {
}
@Test // DATAES-503
@SuppressWarnings("rawtypes")
public void shouldUseCopyTo() {
// given
@ -408,12 +412,29 @@ public class MappingBuilderTests extends MappingContextBaseTests {
assertEquals(expected, mapping, false);
}
@Test // DATAES-568
@Test // DATAES-568, DATAES-896
public void shouldUseFieldNameOnMultiField() throws JSONException {
// given
String expected = "{\"properties\":{" + "\"id-property\":{\"type\":\"keyword\",\"index\":true},"
+ "\"multifield-property\":{\"type\":\"text\",\"analyzer\":\"whitespace\",\"fields\":{\"prefix\":{\"type\":\"text\",\"analyzer\":\"stop\",\"search_analyzer\":\"standard\"}}}}}";
String expected = "{\n" + //
" \"properties\": {\n" + //
" \"id-property\": {\n" + //
" \"type\": \"keyword\",\n" + //
" \"index\": true\n" + //
" },\n" + //
" \"main-field\": {\n" + //
" \"type\": \"text\",\n" + //
" \"analyzer\": \"whitespace\",\n" + //
" \"fields\": {\n" + //
" \"suff-ix\": {\n" + //
" \"type\": \"text\",\n" + //
" \"analyzer\": \"stop\",\n" + //
" \"search_analyzer\": \"standard\"\n" + //
" }\n" + //
" }\n" + //
" }\n" + //
" }\n" + //
"}\n"; //
// when
String mapping = getMappingBuilder().buildPropertyMapping(FieldNameEntity.MultiFieldEntity.class);
@ -583,7 +604,16 @@ public class MappingBuilderTests extends MappingContextBaseTests {
assertThat(propertyMapping).doesNotContain("seqNoPrimaryTerm");
}
/**
@Test // DATAES-991
void shouldWriteCorrectTermVectorValues() {
IndexOperations indexOps = operations.indexOps(TermVectorFieldEntity.class);
indexOps.create();
indexOps.putMapping(indexOps.createMapping());
}
/**
* @author Xiao Yu
*/
@Setter
@ -659,9 +689,10 @@ public class MappingBuilderTests extends MappingContextBaseTests {
@Nullable @Id @Field("id-property") private String id;
@Nullable @Field("multifield-property") //
@MultiField(mainField = @Field(type = FieldType.Text, analyzer = "whitespace"), otherFields = {
@InnerField(suffix = "prefix", type = FieldType.Text, analyzer = "stop", searchAnalyzer = "standard") }) //
@Nullable //
@MultiField(mainField = @Field(name = "main-field", type = FieldType.Text, analyzer = "whitespace"),
otherFields = {
@InnerField(suffix = "suff-ix", type = FieldType.Text, analyzer = "stop", searchAnalyzer = "standard") }) //
private String description;
}
}
@ -705,6 +736,7 @@ public class MappingBuilderTests extends MappingContextBaseTests {
* @author Stuart Stevenson
* @author Mohsin Husen
*/
@Data
@Document(indexName = "test-index-simple-recursive-mapping-builder", replicas = 0, refreshInterval = "-1")
static class SimpleRecursiveEntity {
@ -804,7 +836,7 @@ public class MappingBuilderTests extends MappingContextBaseTests {
*/
static class SampleInheritedEntityBuilder {
private SampleInheritedEntity result;
private final SampleInheritedEntity result;
public SampleInheritedEntityBuilder(String id) {
result = new SampleInheritedEntity();
@ -827,7 +859,7 @@ public class MappingBuilderTests extends MappingContextBaseTests {
public IndexQuery buildIndex() {
IndexQuery indexQuery = new IndexQuery();
indexQuery.setId(result.getId());
indexQuery.setId(Objects.requireNonNull(result.getId()));
indexQuery.setObject(result);
return indexQuery;
}
@ -859,7 +891,7 @@ public class MappingBuilderTests extends MappingContextBaseTests {
@Nullable @Id private String id;
@Nullable @Field(type = FieldType.Date, index = false) private Date createdDate;
@Nullable @Field(type = FieldType.Date, format = DateFormat.date_time, index = false) private Date createdDate;
@Nullable
public String getId() {
@ -1051,7 +1083,7 @@ public class MappingBuilderTests extends MappingContextBaseTests {
@Document(indexName = "valueDoc")
static class ValueDoc {
@Field(type = Text) private ValueObject valueObject;
@Nullable @Field(type = Text) private ValueObject valueObject;
}
@Getter
@ -1070,4 +1102,20 @@ public class MappingBuilderTests extends MappingContextBaseTests {
@Field(type = Object) private SeqNoPrimaryTerm seqNoPrimaryTerm;
}
@Data
@Document(indexName = "termvectors-test")
static class TermVectorFieldEntity {
@Id private String id;
@Field(type = FieldType.Text, termVector = TermVector.no) private String no;
@Field(type = FieldType.Text, termVector = TermVector.yes) private String yes;
@Field(type = FieldType.Text, termVector = TermVector.with_positions) private String with_positions;
@Field(type = FieldType.Text, termVector = TermVector.with_offsets) private String with_offsets;
@Field(type = FieldType.Text, termVector = TermVector.with_positions_offsets) private String with_positions_offsets;
@Field(type = FieldType.Text,
termVector = TermVector.with_positions_payloads) private String with_positions_payloads;
@Field(type = FieldType.Text,
termVector = TermVector.with_positions_offsets_payloads) private String with_positions_offsets_payloads;
}
}
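The `TermVectorFieldEntity` above relies on the `TermVector` enum constants being emitted verbatim as the Elasticsearch `term_vector` mapping values, which is why the enum names use underscores. A minimal, Spring-free sketch of that naming convention (the nested enum here is a hypothetical stand-in for the annotation enum, and `mappingFor` is an illustrative helper, not the actual `MappingBuilder` API):

```java
public class TermVectorDemo {

    // stand-in for org.springframework.data.elasticsearch.annotations.TermVector
    enum TermVector {
        no, yes, with_positions, with_offsets, with_positions_offsets,
        with_positions_payloads, with_positions_offsets_payloads
    }

    // renders the mapping fragment for one text field; the enum name is used as-is
    static String mappingFor(TermVector tv) {
        return "{\"type\":\"text\",\"term_vector\":\"" + tv.name() + "\"}";
    }

    public static void main(String[] args) {
        for (TermVector tv : TermVector.values()) {
            System.out.println(mappingFor(tv));
        }
    }
}
```

Because `name()` is used directly, each enum constant must match an accepted `term_vector` value exactly.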


@ -8,6 +8,7 @@ import org.junit.jupiter.api.Test;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
import org.springframework.data.elasticsearch.annotations.InnerField;
import org.springframework.data.elasticsearch.annotations.MultiField;
import org.springframework.data.elasticsearch.annotations.Score;
import org.springframework.data.elasticsearch.core.mapping.ElasticsearchPersistentEntity;
import org.springframework.lang.Nullable;
@ -31,9 +32,10 @@ public class MappingParametersTest extends MappingContextBaseTests {
@Test // DATAES-621
public void shouldCreateParametersForInnerFieldAnnotation() {
Annotation annotation = entity.getRequiredPersistentProperty("innerField").findAnnotation(InnerField.class);
MappingParameters mappingParameters = MappingParameters.from(annotation);
MultiField multiField = entity.getRequiredPersistentProperty("mainField").findAnnotation(MultiField.class);
InnerField innerField = multiField.otherFields()[0];
MappingParameters mappingParameters = MappingParameters.from(innerField);
assertThat(mappingParameters).isNotNull();
}
@ -61,7 +63,8 @@ public class MappingParametersTest extends MappingContextBaseTests {
static class AnnotatedClass {
@Nullable @Field private String field;
@Nullable @InnerField(suffix = "test", type = FieldType.Text) private String innerField;
@Nullable @MultiField(mainField = @Field,
otherFields = { @InnerField(suffix = "test", type = FieldType.Text) }) private String mainField;
@Score private float score;
@Nullable @Field(type = FieldType.Text, docValues = false) private String docValuesText;
@Nullable @Field(type = FieldType.Nested, docValues = false) private String docValuesNested;


@ -41,10 +41,10 @@ public class SimpleElasticsearchDateMappingTests extends MappingContextBaseTests
private static final String EXPECTED_MAPPING = "{\"properties\":{\"message\":{\"store\":true,"
+ "\"type\":\"text\",\"index\":false,\"analyzer\":\"standard\"},\"customFormatDate\":{\"type\":\"date\",\"format\":\"dd.MM.uuuu hh:mm\"},"
+ "\"defaultFormatDate\":{\"type\":\"date\"},\"basicFormatDate\":{\""
+ "\"basicFormatDate\":{\""
+ "type\":\"date\",\"format\":\"basic_date\"}}}";
@Test // DATAES-568
@Test // DATAES-568, DATAES-828
public void testCorrectDateMappings() {
String mapping = getMappingBuilder().buildPropertyMapping(SampleDateMappingEntity.class);
@ -67,8 +67,6 @@ public class SimpleElasticsearchDateMappingTests extends MappingContextBaseTests
@Field(type = Date, format = DateFormat.custom,
pattern = "dd.MM.uuuu hh:mm") private LocalDateTime customFormatDate;
@Field(type = FieldType.Date) private LocalDateTime defaultFormatDate;
@Field(type = FieldType.Date, format = DateFormat.basic_date) private LocalDateTime basicFormatDate;
}
}


@ -0,0 +1,155 @@
/*
* Copyright 2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.elasticsearch.core.mapping;
import static org.assertj.core.api.Assertions.*;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;
import org.elasticsearch.common.geo.GeoPoint;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.IndexOperations;
import org.springframework.data.elasticsearch.core.SearchHits;
import org.springframework.data.elasticsearch.core.convert.ElasticsearchCustomConversions;
import org.springframework.data.elasticsearch.core.query.Query;
import org.springframework.data.elasticsearch.junit.jupiter.ElasticsearchRestTemplateConfiguration;
import org.springframework.data.elasticsearch.junit.jupiter.SpringIntegrationTest;
import org.springframework.data.elasticsearch.repository.config.EnableElasticsearchRepositories;
import org.springframework.test.context.ContextConfiguration;
/**
* Tests that a whole entity can be converted using custom conversions.
*
* @author Peter-Josef Meisch
*/
@SpringIntegrationTest
@ContextConfiguration(classes = { EntityCustomConversionIntegrationTests.Config.class })
public class EntityCustomConversionIntegrationTests {
@Configuration
@EnableElasticsearchRepositories(basePackages = { "org.springframework.data.elasticsearch.core.mapping" },
considerNestedRepositories = true)
static class Config extends ElasticsearchRestTemplateConfiguration {
@Override
public ElasticsearchCustomConversions elasticsearchCustomConversions() {
return new ElasticsearchCustomConversions(Arrays.asList(new EntityToMapConverter(), new MapToEntityConverter()));
}
}
@Autowired private ElasticsearchOperations operations;
@BeforeEach
void setUp() {
IndexOperations indexOps = operations.indexOps(Entity.class);
indexOps.create();
indexOps.putMapping(indexOps.createMapping());
}
@AfterEach
void tearDown() {
operations.indexOps(Entity.class).delete();
}
@Test // #1667
@DisplayName("should use CustomConversions on entity")
void shouldUseCustomConversionsOnEntity() {
Entity entity = Entity.builder() //
.value("hello") //
.location(new GeoPoint(42.7, 8.0)) //
.build();
org.springframework.data.elasticsearch.core.document.Document document = org.springframework.data.elasticsearch.core.document.Document
.create();
operations.getElasticsearchConverter().write(entity, document);
assertThat(document.getString("the_value")).isEqualTo("hello");
assertThat(document.getString("the_lon")).isEqualTo("8.0");
assertThat(document.getString("the_lat")).isEqualTo("42.7");
}
@Test // #1667
@DisplayName("should store and load entity from Elasticsearch")
void shouldStoreAndLoadEntityFromElasticsearch() {
Entity entity = Entity.builder() //
.value("hello") //
.location(new GeoPoint(42.7, 8.0)) //
.build();
Entity savedEntity = operations.save(entity);
operations.indexOps(Entity.class).refresh();
SearchHits<Entity> searchHits = operations.search(Query.findAll(), Entity.class);
assertThat(searchHits.getTotalHits()).isEqualTo(1);
Entity foundEntity = searchHits.getSearchHit(0).getContent();
assertThat(foundEntity).isEqualTo(entity);
}
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
@Document(indexName = "entity-with-custom-conversions")
static class Entity {
private String value;
private GeoPoint location;
}
@WritingConverter
static class EntityToMapConverter implements Converter<Entity, Map<String, Object>> {
@Override
public Map<String, Object> convert(Entity source) {
LinkedHashMap<String, Object> target = new LinkedHashMap<>();
target.put("the_value", source.getValue());
target.put("the_lat", String.valueOf(source.getLocation().getLat()));
target.put("the_lon", String.valueOf(source.getLocation().getLon()));
return target;
}
}
@ReadingConverter
static class MapToEntityConverter implements Converter<Map<String, Object>, Entity> {
@Override
public Entity convert(Map<String, Object> source) {
Entity entity = new Entity();
entity.setValue((String) source.get("the_value"));
entity.setLocation(new GeoPoint( //
Double.parseDouble((String) (source.get("the_lat"))), //
Double.parseDouble((String) (source.get("the_lon"))) //
));
return entity;
}
}
}
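The converter pair above can be exercised without Spring or Elasticsearch. A minimal round-trip sketch, assuming a hypothetical `Coord` record in place of `GeoPoint` and mirroring the test's field names (`the_value`, `the_lat`, `the_lon`, all written as strings):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class EntityMapRoundTrip {

    record Coord(double lat, double lon) {}

    record Entity(String value, Coord location) {}

    // mirrors EntityToMapConverter: flatten the entity into a string-valued map
    static Map<String, Object> toMap(Entity source) {
        Map<String, Object> target = new LinkedHashMap<>();
        target.put("the_value", source.value());
        target.put("the_lat", String.valueOf(source.location().lat()));
        target.put("the_lon", String.valueOf(source.location().lon()));
        return target;
    }

    // mirrors MapToEntityConverter: parse the strings back into an entity
    static Entity fromMap(Map<String, Object> source) {
        return new Entity((String) source.get("the_value"),
                new Coord(Double.parseDouble((String) source.get("the_lat")),
                        Double.parseDouble((String) source.get("the_lon"))));
    }

    public static void main(String[] args) {
        Entity entity = new Entity("hello", new Coord(42.7, 8.0));
        Entity back = fromMap(toMap(entity));
        System.out.println(entity.equals(back)); // round trip preserves the entity
    }
}
```

The round trip is lossless here because the lat/lon strings parse back to the same doubles; converters that truncate or reformat values would break the `isEqualTo(entity)` assertion in `shouldStoreAndLoadEntityFromElasticsearch`.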


@ -17,17 +17,23 @@ package org.springframework.data.elasticsearch.core.mapping;
import static org.assertj.core.api.Assertions.*;
import lombok.Data;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.util.Date;
import java.util.GregorianCalendar;
import java.util.List;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import org.springframework.data.elasticsearch.annotations.DateFormat;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
import org.springframework.data.elasticsearch.annotations.InnerField;
import org.springframework.data.elasticsearch.annotations.MultiField;
import org.springframework.data.elasticsearch.annotations.Score;
import org.springframework.data.elasticsearch.core.query.SeqNoPrimaryTerm;
import org.springframework.data.mapping.MappingException;
@ -74,7 +80,17 @@ public class SimpleElasticsearchPersistentPropertyUnitTests {
assertThat(persistentProperty.getFieldName()).isEqualTo("by-value");
}
@Test // DATAES-716, DATAES-792
@Test // DATAES-896
void shouldUseNameFromMultiFieldMainField() {
SimpleElasticsearchPersistentEntity<?> persistentEntity = context
.getRequiredPersistentEntity(MultiFieldProperty.class);
ElasticsearchPersistentProperty persistentProperty = persistentEntity.getPersistentProperty("mainfieldProperty");
assertThat(persistentProperty).isNotNull();
assertThat(persistentProperty.getFieldName()).isEqualTo("mainfield");
}
@Test // DATAES-716, DATAES-792, DATAES-924
void shouldSetPropertyConverters() {
SimpleElasticsearchPersistentEntity<?> persistentEntity = context.getRequiredPersistentEntity(DatesProperty.class);
@ -90,6 +106,9 @@ public class SimpleElasticsearchPersistentPropertyUnitTests {
assertThat(persistentProperty.hasPropertyConverter()).isTrue();
assertThat(persistentProperty.getPropertyConverter()).isNotNull();
persistentProperty = persistentEntity.getRequiredPersistentProperty("localDateList");
assertThat(persistentProperty.hasPropertyConverter()).isTrue();
assertThat(persistentProperty.getPropertyConverter()).isNotNull();
}
@Test // DATAES-716
@ -173,6 +192,28 @@ public class SimpleElasticsearchPersistentPropertyUnitTests {
assertThat(seqNoProperty.isReadable()).isFalse();
}
@Test // DATAES-828
void shouldRequireFormatForDateField() {
assertThatExceptionOfType(MappingException.class) //
.isThrownBy(() -> context.getRequiredPersistentEntity(DateFieldWithNoFormat.class)) //
.withMessageContaining("date");
}
@Test // DATAES-828
void shouldRequireFormatForDateNanosField() {
assertThatExceptionOfType(MappingException.class) //
.isThrownBy(() -> context.getRequiredPersistentEntity(DateNanosFieldWithNoFormat.class)) //
.withMessageContaining("date");
}
@Test // DATAES-924
@DisplayName("should require pattern for custom date format")
void shouldRequirePatternForCustomDateFormat() {
assertThatExceptionOfType(MappingException.class) //
.isThrownBy(() -> context.getRequiredPersistentEntity(DateFieldWithCustomFormatAndNoPattern.class)) //
.withMessageContaining("pattern");
}
static class InvalidScoreProperty {
@Nullable @Score String scoreProperty;
}
@ -185,14 +226,37 @@ public class SimpleElasticsearchPersistentPropertyUnitTests {
@Nullable @Field(value = "by-value") String fieldProperty;
}
static class MultiFieldProperty {
@Nullable @MultiField(mainField = @Field("mainfield"),
otherFields = { @InnerField(suffix = "suff", type = FieldType.Keyword) }) String mainfieldProperty;
}
static class DatesProperty {
@Nullable @Field(type = FieldType.Date, format = DateFormat.custom, pattern = "dd.MM.uuuu") LocalDate localDate;
@Nullable @Field(type = FieldType.Date, format = DateFormat.basic_date_time) LocalDateTime localDateTime;
@Nullable @Field(type = FieldType.Date, format = DateFormat.basic_date_time) Date legacyDate;
@Nullable @Field(type = FieldType.Date, format = DateFormat.custom,
pattern = "dd.MM.uuuu") List<LocalDate> localDateList;
}
@Data
static class SeqNoPrimaryTermProperty {
SeqNoPrimaryTerm seqNoPrimaryTerm;
String string;
}
@Data
static class DateFieldWithNoFormat {
@Field(type = FieldType.Date) LocalDateTime datetime;
}
@Data
static class DateFieldWithCustomFormatAndNoPattern {
@Field(type = FieldType.Date, format = DateFormat.custom, pattern = "") LocalDateTime datetime;
}
@Data
static class DateNanosFieldWithNoFormat {
@Field(type = FieldType.Date_Nanos) LocalDateTime datetime;
}
}
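The custom pattern `dd.MM.uuuu` used in `DatesProperty` and the `DateFieldWith…` entities is a plain `java.time` pattern (`uuuu` is the proleptic year, avoiding the era-dependent `yyyy`). A small sketch of what that pattern does at the `DateTimeFormatter` level, independent of the mapping-context validation being tested:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class CustomDatePattern {

    // same pattern string as in the @Field annotations above
    static final DateTimeFormatter FMT = DateTimeFormatter.ofPattern("dd.MM.uuuu");

    public static void main(String[] args) {
        // format a date the way the custom-format field would store it
        String text = FMT.format(LocalDate.of(2021, 3, 17));
        // and parse it back
        LocalDate parsed = LocalDate.parse("17.03.2021", FMT);
        System.out.println(text + " / " + parsed);
    }
}
```

An empty `pattern` leaves the formatter undefined, which is why `DateFieldWithCustomFormatAndNoPattern` is expected to fail with a `MappingException` mentioning "pattern".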


@ -15,9 +15,9 @@
*/
package org.springframework.data.elasticsearch.repositories.custommethod;
import static org.apache.commons.lang.RandomStringUtils.*;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.elasticsearch.annotations.FieldType.*;
import static org.springframework.data.elasticsearch.utils.IdGenerator.*;
import lombok.AllArgsConstructor;
import lombok.Builder;
@ -100,7 +100,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethod() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -119,7 +119,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodForNot() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("some");
@ -137,7 +137,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithQuery() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -157,7 +157,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithLessThan() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -165,7 +165,7 @@ public abstract class CustomMethodRepositoryBaseTests {
sampleEntity.setMessage("some message");
repository.save(sampleEntity);
String documentId2 = randomNumeric(5);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@ -185,7 +185,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithBefore() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -205,7 +205,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithAfter() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -225,7 +225,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithLike() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -245,7 +245,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodForStartingWith() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -265,7 +265,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodForEndingWith() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -285,7 +285,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodForContains() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -305,7 +305,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodForIn() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -313,7 +313,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// given
String documentId2 = randomNumeric(5);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@ -334,7 +334,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodForNotIn() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -342,7 +342,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// given
String documentId2 = randomNumeric(5);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@ -360,16 +360,16 @@ public abstract class CustomMethodRepositoryBaseTests {
}
@Test // DATAES-647
public void shouldHandleManyValuesQueryingIn() {
public void shouldHandleManyKeywordValuesQueryingIn() {
// given
String documentId1 = randomNumeric(32);
String documentId1 = nextIdAsString();
SampleEntity sampleEntity1 = new SampleEntity();
sampleEntity1.setId(documentId1);
sampleEntity1.setKeyword("foo");
repository.save(sampleEntity1);
String documentId2 = randomNumeric(32);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setKeyword("bar");
@ -378,8 +378,9 @@ public abstract class CustomMethodRepositoryBaseTests {
List<String> keywords = new ArrayList<>();
keywords.add("foo");
for (int i = 0; i < 1025; i++) {
keywords.add(randomNumeric(32));
// the limit for normal bool query clauses is 1024; for keyword fields a terms query is used instead
for (int i = 0; i < 1200; i++) {
keywords.add(nextIdAsString());
}
// when
@ -391,16 +392,16 @@ public abstract class CustomMethodRepositoryBaseTests {
}
@Test // DATAES-647
public void shouldHandleManyValuesQueryingNotIn() {
public void shouldHandleManyKeywordValuesQueryingNotIn() {
// given
String documentId1 = randomNumeric(32);
String documentId1 = nextIdAsString();
SampleEntity sampleEntity1 = new SampleEntity();
sampleEntity1.setId(documentId1);
sampleEntity1.setKeyword("foo");
repository.save(sampleEntity1);
String documentId2 = randomNumeric(32);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setKeyword("bar");
@ -409,8 +410,9 @@ public abstract class CustomMethodRepositoryBaseTests {
List<String> keywords = new ArrayList<>();
keywords.add("foo");
for (int i = 0; i < 1025; i++) {
keywords.add(randomNumeric(32));
// the limit for normal bool query clauses is 1024; for keyword fields a terms query is used instead
for (int i = 0; i < 1200; i++) {
keywords.add(nextIdAsString());
}
// when
@ -421,11 +423,51 @@ public abstract class CustomMethodRepositoryBaseTests {
assertThat(list.get(0).getId()).isEqualTo(documentId2);
}
@Test // DATAES-912
void shouldHandleTextFieldQueryingIn() {
String documentId1 = nextIdAsString();
SampleEntity sampleEntity1 = new SampleEntity();
sampleEntity1.setId(documentId1);
sampleEntity1.setMessage("foo");
repository.save(sampleEntity1);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setMessage("bar");
repository.save(sampleEntity2);
List<SampleEntity> list = repository.findByMessageIn(Arrays.asList("Foo", "Bar"));
assertThat(list).hasSize(2);
assertThat(list.stream().map(SampleEntity::getId)).containsExactlyInAnyOrder(documentId1, documentId2);
}
@Test // DATAES-912
void shouldHandleTextFieldQueryingNotIn() {
String documentId1 = nextIdAsString();
SampleEntity sampleEntity1 = new SampleEntity();
sampleEntity1.setId(documentId1);
sampleEntity1.setMessage("foo");
repository.save(sampleEntity1);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setMessage("bar");
repository.save(sampleEntity2);
List<SampleEntity> list = repository.findByMessageNotIn(Arrays.asList("Boo", "Bar"));
assertThat(list).hasSize(1);
assertThat(list.get(0).getId()).isEqualTo(documentId1);
}
@Test
public void shouldExecuteCustomMethodForTrue() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -434,7 +476,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// given
String documentId2 = randomNumeric(5);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@ -453,7 +495,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodForFalse() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -462,7 +504,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// given
String documentId2 = randomNumeric(5);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@ -482,7 +524,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodForOrderBy() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("abc");
@ -491,7 +533,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// document 2
String documentId2 = randomNumeric(5);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("xyz");
@ -500,7 +542,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity2);
// document 3
String documentId3 = randomNumeric(5);
String documentId3 = nextIdAsString();
SampleEntity sampleEntity3 = new SampleEntity();
sampleEntity3.setId(documentId3);
sampleEntity3.setType("def");
@ -520,7 +562,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithBooleanParameter() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -529,7 +571,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// given
String documentId2 = randomNumeric(5);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@ -549,7 +591,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldReturnPageableInUnwrappedPageResult() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@ -558,7 +600,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// given
String documentId2 = randomNumeric(5);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@ -601,21 +643,21 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldReturnPageableResultsWithGivenSortingOrder() {
// given
String documentId = random(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setMessage("abc");
sampleEntity.setVersion(System.currentTimeMillis());
repository.save(sampleEntity);
String documentId2 = randomNumeric(5);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setMessage("abd");
sampleEntity.setVersion(System.currentTimeMillis());
repository.save(sampleEntity2);
String documentId3 = randomNumeric(5);
String documentId3 = nextIdAsString();
SampleEntity sampleEntity3 = new SampleEntity();
sampleEntity3.setId(documentId3);
sampleEntity3.setMessage("abe");
@ -635,21 +677,21 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldReturnListForMessage() {
// given
String documentId = random(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setMessage("abc");
sampleEntity.setVersion(System.currentTimeMillis());
repository.save(sampleEntity);
String documentId2 = randomNumeric(5);
String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setMessage("abd");
sampleEntity.setVersion(System.currentTimeMillis());
repository.save(sampleEntity2);
String documentId3 = randomNumeric(5);
String documentId3 = nextIdAsString();
SampleEntity sampleEntity3 = new SampleEntity();
sampleEntity3.setId(documentId3);
sampleEntity3.setMessage("abe");
@ -667,7 +709,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithGeoPoint() {
// given
String documentId = randomNumeric(5);
String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -689,7 +731,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithGeoPointAndString() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -699,7 +741,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -722,7 +764,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithWithinGeoPoint() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -745,7 +787,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithWithinPoint() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -768,7 +810,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithNearBox() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -778,7 +820,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test2");
@@ -808,7 +850,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldExecuteCustomMethodWithNearPointAndDistance() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -846,7 +888,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethod() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -854,7 +896,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test2");
@@ -873,7 +915,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodForNot() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("some");
@@ -881,7 +923,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test");
@@ -900,7 +942,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodWithBooleanParameter() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -909,7 +951,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// given
- String documentId2 = randomNumeric(5);
+ String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@@ -928,7 +970,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodWithLessThan() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -936,7 +978,7 @@ public abstract class CustomMethodRepositoryBaseTests {
sampleEntity.setMessage("some message");
repository.save(sampleEntity);
- String documentId2 = randomNumeric(5);
+ String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@@ -955,7 +997,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodWithBefore() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -964,7 +1006,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test");
@@ -984,7 +1026,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodWithAfter() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -993,7 +1035,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test");
@@ -1013,7 +1055,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodWithLike() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1022,7 +1064,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test");
@@ -1042,7 +1084,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodForStartingWith() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1051,7 +1093,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test");
@@ -1071,7 +1113,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodForEndingWith() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1080,7 +1122,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test");
@@ -1100,7 +1142,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodForContains() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1109,7 +1151,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test");
@@ -1129,7 +1171,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodForIn() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1137,7 +1179,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// given
- String documentId2 = randomNumeric(5);
+ String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@@ -1157,7 +1199,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodForNotIn() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1165,7 +1207,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// given
- String documentId2 = randomNumeric(5);
+ String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@@ -1185,7 +1227,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodForTrue() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1194,7 +1236,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// given
- String documentId2 = randomNumeric(5);
+ String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@@ -1212,7 +1254,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodForFalse() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1221,7 +1263,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
// given
- String documentId2 = randomNumeric(5);
+ String documentId2 = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
sampleEntity2.setType("test");
@@ -1240,7 +1282,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodWithWithinGeoPoint() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1250,7 +1292,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test");
@@ -1271,7 +1313,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodWithWithinPoint() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1281,7 +1323,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test");
@@ -1302,7 +1344,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodWithNearBox() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1312,7 +1354,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test2");
@@ -1333,7 +1375,7 @@ public abstract class CustomMethodRepositoryBaseTests {
public void shouldCountCustomMethodWithNearPointAndDistance() {
// given
- String documentId = randomNumeric(5);
+ String documentId = nextIdAsString();
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setType("test");
@@ -1343,7 +1385,7 @@ public abstract class CustomMethodRepositoryBaseTests {
repository.save(sampleEntity);
- documentId = randomNumeric(5);
+ documentId = nextIdAsString();
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId);
sampleEntity2.setType("test");
@@ -1544,6 +1586,28 @@ public abstract class CustomMethodRepositoryBaseTests {
return entities;
}
@Test // DATAES-891
void shouldStreamEntitiesWithQueryAnnotatedMethod() {
List<SampleEntity> entities = createSampleEntities("abc", 20);
repository.saveAll(entities);
Stream<SampleEntity> stream = streamingRepository.streamEntitiesByType("abc");
long count = stream.peek(sampleEntity -> assertThat(sampleEntity).isInstanceOf(SampleEntity.class)).count();
assertThat(count).isEqualTo(20);
}
@Test // DATAES-891
void shouldStreamSearchHitsWithQueryAnnotatedMethod() {
List<SampleEntity> entities = createSampleEntities("abc", 20);
repository.saveAll(entities);
Stream<SearchHit<SampleEntity>> stream = streamingRepository.streamSearchHitsByType("abc");
long count = stream.peek(sampleEntity -> assertThat(sampleEntity).isInstanceOf(SearchHit.class)).count();
assertThat(count).isEqualTo(20);
}
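A side note on the `Stream.peek(...).count()` idiom used in the two tests above: since JDK 9, `count()` on a stream whose size is known from the source may skip the pipeline entirely, so actions inside `peek` are only guaranteed to run for unsized sources such as the repository streams used here. A minimal standalone sketch (plain JDK types, no Spring, names illustrative):

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

public class PeekCountDemo {
    public static void main(String[] args) {
        AtomicInteger seen = new AtomicInteger();

        // On a SIZED stream, count() may be answered from the source size
        // (JDK 9+), in which case the peek action is never invoked. Do not
        // rely on peek side effects for sized sources; seen may stay 0 here.
        long count = List.of("a", "b", "c").stream()
                .peek(e -> seen.incrementAndGet())
                .count();

        System.out.println(count);
    }
}
```

The count itself is always correct; only the side effect inside `peek` is at risk, which is why asserting on the elements (as the tests do) is safest for unsized repository streams.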
@Data
@NoArgsConstructor
@AllArgsConstructor
@@ -1600,6 +1664,10 @@ public abstract class CustomMethodRepositoryBaseTests {
List<SampleEntity> findByKeywordNotIn(List<String> keywords);
List<SampleEntity> findByMessageIn(List<String> keywords);
List<SampleEntity> findByMessageNotIn(List<String> keywords);
Page<SampleEntity> findByIdNotIn(List<String> ids, Pageable pageable);
Page<SampleEntity> findByAvailableTrue(Pageable pageable);
@@ -1687,5 +1755,12 @@ public abstract class CustomMethodRepositoryBaseTests {
Stream<SampleEntity> findByType(String type);
Stream<SampleEntity> findByType(String type, Pageable pageable);
@Query("{\"bool\": {\"must\": [{\"term\": {\"type\": \"?0\"}}]}}")
Stream<SampleEntity> streamEntitiesByType(String type);
@Query("{\"bool\": {\"must\": [{\"term\": {\"type\": \"?0\"}}]}}")
Stream<SearchHit<SampleEntity>> streamSearchHitsByType(String type);
}
}


@@ -23,12 +23,14 @@ import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
@@ -249,7 +251,7 @@ class QueryKeywordsTests {
products.forEach(product -> assertThat(product.name).isEqualTo("Sugar"));
}
- @Test
+ @Test // DATAES-239
void shouldSearchForNullValues() {
final List<Product> products = repository.findByName(null);
@@ -257,7 +259,7 @@ class QueryKeywordsTests {
assertThat(products.get(0).getId()).isEqualTo("6");
}
- @Test
+ @Test // DATAES-239
void shouldDeleteWithNullValues() {
repository.deleteByName(null);
@@ -265,6 +267,24 @@ class QueryKeywordsTests {
assertThat(count).isEqualTo(5);
}
@Test // DATAES-937
@DisplayName("should return empty list on findById with empty input list")
void shouldReturnEmptyListOnFindByIdWithEmptyInputList() {
Iterable<Product> products = repository.findAllById(new ArrayList<>());
assertThat(products).isEmpty();
}
@Test // DATAES-937
@DisplayName("should return empty list on derived method with empty input list")
void shouldReturnEmptyListOnDerivedMethodWithEmptyInputList() {
Iterable<Product> products = repository.findAllByNameIn(new ArrayList<>());
assertThat(products).isEmpty();
}
/**
* @author Mohsin Husen
* @author Artur Konczak
@@ -344,6 +364,8 @@ class QueryKeywordsTests {
List<Product> findTop2ByName(String name);
void deleteByName(@Nullable String name);
List<Product> findAllByNameIn(List<String> names);
}
}


@@ -25,12 +25,14 @@ import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.io.IOException;
import java.lang.Long;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
@@ -56,6 +58,7 @@ import org.springframework.data.elasticsearch.junit.jupiter.SpringIntegrationTes
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
import org.springframework.data.elasticsearch.repository.config.EnableElasticsearchRepositories;
import org.springframework.data.elasticsearch.utils.IndexInitializer;
import org.springframework.data.util.StreamUtils;
import org.springframework.test.context.ContextConfiguration;
/**
@@ -361,6 +364,14 @@ public class SimpleElasticsearchRepositoryTests {
@Test
public void shouldDeleteAll() {
// given
String documentId = randomNumeric(5);
SampleEntity sampleEntity = new SampleEntity();
sampleEntity.setId(documentId);
sampleEntity.setMessage("hello world.");
sampleEntity.setVersion(System.currentTimeMillis());
repository.save(sampleEntity);
// when
repository.deleteAll();
@@ -677,6 +688,32 @@ public class SimpleElasticsearchRepositoryTests {
assertThat(savedEntities).hasSize(0);
}
@Test // DATAES-832
void shouldNotReturnNullValuesInFindAllById() {
// given
String documentId1 = "id-one";
SampleEntity sampleEntity1 = new SampleEntity();
sampleEntity1.setId(documentId1);
repository.save(sampleEntity1);
String documentId2 = "id-two";
SampleEntity sampleEntity2 = new SampleEntity();
sampleEntity2.setId(documentId2);
repository.save(sampleEntity2);
String documentId3 = "id-three";
SampleEntity sampleEntity3 = new SampleEntity();
sampleEntity3.setId(documentId3);
repository.save(sampleEntity3);
Iterable<SampleEntity> allById = repository
.findAllById(Arrays.asList("id-one", "does-not-exist", "id-two", "where-am-i", "id-three"));
List<SampleEntity> results = StreamUtils.createStreamFromIterator(allById.iterator()).collect(Collectors.toList());
assertThat(results).hasSize(3);
assertThat(results.stream().map(SampleEntity::getId).collect(Collectors.toList()))
.containsExactlyInAnyOrder("id-one", "id-two", "id-three");
}
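The `StreamUtils.createStreamFromIterator` call above is a Spring convenience; the same Iterable-to-List conversion can be sketched with only the JDK (names here are illustrative stand-ins for the repository result):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.StreamSupport;

public class IterableToListDemo {
    public static void main(String[] args) {
        // findAllById(...) returns an Iterable; simulate its result here.
        Iterable<String> allById = Arrays.asList("id-one", "id-two", "id-three");

        // Roughly what StreamUtils.createStreamFromIterator(...) achieves:
        // wrap the Iterable's spliterator in a sequential Stream and collect.
        List<String> results = StreamSupport.stream(allById.spliterator(), false)
                .collect(Collectors.toList());

        System.out.println(results.size());
    }
}
```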
private static List<SampleEntity> createSampleEntitiesWithMessage(String message, int numberOfEntities) {
List<SampleEntity> sampleEntities = new ArrayList<>();


@@ -0,0 +1,41 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.elasticsearch.utils;
import java.util.concurrent.atomic.AtomicInteger;
/**
* Class to provide sequential IDs. Uses an integer; 2^31 - 1 values should be enough for the test runs.
*
* @author Peter-Josef Meisch
*/
public final class IdGenerator {
private static final AtomicInteger NEXT = new AtomicInteger();
private IdGenerator() {}
public static int nextIdAsInt() {
return NEXT.incrementAndGet();
}
public static double nextIdAsDouble() {
return NEXT.incrementAndGet();
}
public static String nextIdAsString() {
return "" + nextIdAsInt();
}
}
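The migration throughout this diff replaces `randomNumeric(5)` with calls to this class, because five random digits can collide across test methods while a shared counter cannot. A self-contained sketch of the behavior (the generator is re-declared here so the snippet compiles on its own; the real class lives in `org.springframework.data.elasticsearch.utils`):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class IdGeneratorDemo {
    // Mirrors IdGenerator above: one shared counter, incremented atomically,
    // so every caller in the same JVM gets a unique, sequential id.
    private static final AtomicInteger NEXT = new AtomicInteger();

    static String nextIdAsString() {
        return String.valueOf(NEXT.incrementAndGet());
    }

    public static void main(String[] args) {
        System.out.println(nextIdAsString()); // first id on a fresh JVM
        System.out.println(nextIdAsString()); // ids never repeat
    }
}
```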