Rel 6 2 1 mergeback (#4288)
* jm wrong bundle entry url (#4213)
* Bug test
* here you go
* Generate relative URIs for bundle entry.request.url, as specified
* Point jira issue in changelog
* Adjust tests to fixes
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
Co-authored-by: Tadgh <garygrantgraham@gmail.com>
* improved logging (#4217)
Co-authored-by: Ken Stevens <ken@smilecdr.com>
* Rel 6 1 3 mergeback (#4215)
* Bump for CVE (#3856)
* Bump for CVE
* Bump spring-data version
* Fix compile
* Cut over to spring bom
* Bump to RC1
* remove RC
* do not constrain reindex for common SP updates (#3876)
* only fast-track jobs with exactly one chunk (#3879)
* Fix illegalstateexception when an exception is thrown during stream response (#3882)
* Finish up changelog, minor refactor
* reset buffer only
* Hack for some replacements
* Failure handling
* wip
* Fixed the issue (#3845)
* Fixed the issue
* Changelog modification
* Changelog modification
* Implemented seventh character extended code and the corresponding dis… (#3709)
* Implemented seventh character extended code and the corresponding display
* Modifications
* Changes on previous test according to modifications made in ICD10-CM XML file
* Subscription sending delete events being skipped (#3888)
* fixed bug and added test
* refactor
* Update for CVE (#3895)
* updated pointcuts to work as intended (#3903)
* updated pointcuts to work as intended
* added changelog
* review fixes
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
* 3904 during $delete expunge job hibernate search indexed documents are left orphaned (#3905)
* Add test and implementation
* Add changelog
* 3899 code in limits (#3901)
* Add implementation, changelog, test
* Update hapi-fhir-jpaserver-test-utilities/src/test/java/ca/uhn/fhir/jpa/provider/r4/ResourceProviderR4Test.java
Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: Ken Stevens <khstevens@gmail.com>
* 3884 overlapping searchparameter undetected rel 6 1 (#3909)
* Applying all changes from previous dev branch to current one pointing to rel_6_1
* Fixing merge conflict related to Msg.code value.
* Fixing Msg.code value.
* Making checkstyle happy.
* Making sure that all tests are passing.
* Passing all tests after fixing Msg.code
* Passing all tests.
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
* 3745 - fixed NPE for bundle with duplicate conditional create resourc… (#3746)
* 3745 - fixed NPE for bundle with duplicate conditional create resources and a conditional delete
* created unit test for skip of delete operation while processing duplicating create entries
* moved unit test to FhirSystemDaoR4Test
* 3379 mdm fixes (#3906)
* added MdmLinkCreateSvcimplTest
* fixed creating mdm-link not setting the resource type correctly
* fixed a bug where ResourcePersistenceId was being duplicated instead of passed on
* Update hapi-fhir-jpaserver-mdm/src/test/java/ca/uhn/fhir/jpa/mdm/svc/MdmLinkCreateSvcImplTest.java
Change order of tests such that assertEquals takes expected value then actual value
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
* added changelog, also changed a setup function in test to beforeeach
Co-authored-by: Long Ma <long@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
* Fix to the issue (#3855)
* Fix to the issue
* Progress
* fixed the issue
* Addressing suggestions
* add response status code to MethodOutcome
* Addressing suggestions
Co-authored-by: Ken Stevens <ken@smilecdr.com>
* Fix for caching appearing broken in batch2 for bulkexport jobs (#3912)
* Respect caching in bulk export, fix bug with completed date on empty jobs
* add changelog
* Add impl
* Add breaking test
* Complete failing test
* more broken tests
* Fix more tests
* Fix paging bug
* Fix another brittle test
* 3915 do not collapse rules with filters (#3916)
* do not attempt to merge compartment permissions with filters
* changelog
* Rename to IT for concurrency problems
Co-authored-by: Tadgh <garygrantgraham@gmail.com>
* Version bump
* fix $mdm-submit output (#3917)
Co-authored-by: Ken Stevens <ken@smilecdr.com>
* Gl3407 bundle offset size (#3918)
* begin with failing test
* fixed
* change log
* rollback default count change and corresponding comments
Co-authored-by: Ken Stevens <ken@smilecdr.com>
* Offset interceptor now only works for external calls
* Initialize some beans (esp interceptors) later in the boot process so they don't slow down startup.
* do not reindex searchparam jobs on startup
* Fix oracle non-enterprise attempting online index add (#3925)
* 3922 delete expunge large dataset (#3923)
* lower batchsize of delete requests so that we do not get sql exceptions
* blah
* fix test
* updated tests to not fail
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
* add index
* Fix up column grab
* Revert offset mode change
* Revert fix for null/system request details checks for reindex purposes
* Fix bug and add test for SP Validating Interceptor (#3930)
* wip
* Fix uptests
* Fix index online test
* Fix SP validating interceptor logic
* Updating version to: 6.1.1 post release.
* fix compile error
* Deploy to sonatype (#3934)
* adding sonatype profile to checkstyle module
* adding sonatype profile to tinder module
* adding sonatype profile to base pom
* adding final deployToSonatype profile
* wip
* Revert version enum
* Updating version to: 6.1.1 post release.
* Add test, changelog, and implementation
* Add backport info
* Create failing test
* Implemented the fix, fixed existing unit tests
* added changelog
* added test case for no filter, exclude 1 patient
* wip
* Add backport info
* Add info of new version
* Updating version to: 6.1.2 post release.
* bump info and backport for 6.1.2
* Bump for hapi
* Implement bug fixes, add new tests (#4022)
* Implement bug fixes, add new tests
* tidy
* Tidy
* refactor for cleaning
* More tidying
* Lower logging
* Split into nested tests, rename, add todos
* Typo
* Code review
* add backport info
* Updating version to: 6.1.3 post release.
* Updating version to: 6.1.3 post release.
* removed duplicate mention of ver 6.1.3 in versionEnum
* backport pr 4101
* mdm message key (#4111)
* begin with failing test
* fixed 2 tests
* fix tests
* fix tests
* change log
Co-authored-by: Ken Stevens <ken@smilecdr.com>
* backport 6.1.3 docs changes
* fixed typo on doc backport message
* fix test breaking
* Updating version to: 6.1.4 post release.
* wip
Co-authored-by: JasonRoberts-smile <85363818+JasonRoberts-smile@users.noreply.github.com>
Co-authored-by: Qingyixia <106992634+Qingyixia@users.noreply.github.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
Co-authored-by: kateryna-mironova <107507153+kateryna-mironova@users.noreply.github.com>
Co-authored-by: longma1 <32119004+longma1@users.noreply.github.com>
Co-authored-by: Long Ma <long@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: markiantorno <markiantorno@gmail.com>
Co-authored-by: Steven Li <steven@smilecdr.com>
* pin okio-jvm for kotlin vuln (#4216)
* Fix UrlUtil.unescape() by not escaping "+" to " " if this is an "application/..." _outputFormat. (#4220)
* First commit: Failing unit test and a TODO with a vague idea of where the bug happens.
* Don't escape "+" in a URL GET parameter if it starts with "application".
* Remove unnecessary TODO.
* Add changelog.
* Code review feedback on naming. Also, make logic more robust by putting plus and should escape boolean && in parens.
* Ks 20221031 migration lock (#4224)
* started design
* complete with tests
* changelog
* cleanup
* typo
Co-authored-by: Ken Stevens <ken@smilecdr.com>
* 4207-getpagesoffset-set-to-total-number-of-resources-results-in-inconsistent-amount-of-entries-when-requests-are-sent-consecutively (#4209)
* Added test
* Added solution
* Changelog
* Changes made based on comments
* Fix bug with MDM submit
* fix
* Version bump
* 4234 consent in conjunction with versionedapiconverterinterceptor fails (#4236)
* Add constant for interceptor
* add test, changelog
* Allow Batch2 transition from ERRORED to COMPLETE (#4242)
* Allow Batch2 transition from ERRORED to COMPLETE
* Add changelog
* Test fix
Co-authored-by: James Agnew <james@jamess-mbp.lan>
* 3685 When bulk exporting, if no resource type param is provided, defa… (#4233)
* 3685 When bulk exporting, if no resource type param is provided, default to all registered types.
* Update test case.
* Cleaned up changelog.
* Added test case for multiple resource types.
* Added failing test case for not returning Binary resource.
* Refactor solution.
Co-authored-by: kylejule <kyle.jule@smilecdr.com>
* Add next version
* bulk export permanently reusing cached results (#4249)
* Add test, fix bug, add changelog
* minor refactor
* Fix broken test
* Smile 4892 DocumentReference Attachment url (#4237)
* failing test
* fix
* increase test Attachment url size to new max
* decrease limit to 500
* ci fix
Co-authored-by: nathaniel.doef <nathaniel.doef@smilecdr.com>
* Overlapping SearchParameter with the same code and base are not allowed (#4253)
* Overlapping SearchParameter with the same code and base are not allowed
* Fix existing tests according to changes
* Cleanup dead code and remove related tests
* Version Bump
* ignore misfires in quartz
* Allowing Failures On Index Drops (#4272)
* Allowing failure on index drops.
* Adding changeLog
* Modification to changelog following code review.
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
* Revert "ignore misfires in quartz"
This reverts commit 15c74a46bc.
* Ignore misfires in quartz (#4273)
* Reindex Behaviour Issues (#4261)
* fixmes for ND
* address FIXME comments
* fix tests
* increase max retries
* fix resource id chunking logic
* fix test
* add modular patient
* change log
* version bump
Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: nathaniel.doef <nathaniel.doef@smilecdr.com>
* Set official Version
* license
* Fix up numbers
* Fix up numbers
* Update numbers
* wip
* fix numbers
* Fix test
* Fix more tests
* TEMP FIX FOR BUILD
* wip
* Updating version to: 6.2.1 post release.
* Add a whack of logging
* wip
* add implementation
* wip and test
* wip
* last-second-fetch
* expose useful method
* remove 10000 limit
* Strip some logging
* Fix up logging
* Unpublicize method
* Fix version
* Make minor changes
* once again on 6.2.1
* re-add version enum
* add folder
* fix test
* DIsable busted test
* Disable more broken tests
* Only submit queued chunks
* Quiet log
* Fix up checkstyle
* log to console
* Remove double import
* Version bump
* licenses
* Fix up version
* fix build
* wip
* wip
* Fix up search expansion to use new behaviour of idhelper service
* Fix up MDM Search Expansion
Co-authored-by: jmarchionatto <60409882+jmarchionatto@users.noreply.github.com>
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: JasonRoberts-smile <85363818+JasonRoberts-smile@users.noreply.github.com>
Co-authored-by: Qingyixia <106992634+Qingyixia@users.noreply.github.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
Co-authored-by: kateryna-mironova <107507153+kateryna-mironova@users.noreply.github.com>
Co-authored-by: longma1 <32119004+longma1@users.noreply.github.com>
Co-authored-by: Long Ma <long@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: markiantorno <markiantorno@gmail.com>
Co-authored-by: Steven Li <steven@smilecdr.com>
Co-authored-by: Luke deGruchy <luke.degruchy@smilecdr.com>
Co-authored-by: karneet1212 <112980019+karneet1212@users.noreply.github.com>
Co-authored-by: James Agnew <jamesagnew@gmail.com>
Co-authored-by: James Agnew <james@jamess-mbp.lan>
Co-authored-by: KGJ-software <39975592+KGJ-software@users.noreply.github.com>
Co-authored-by: kylejule <kyle.jule@smilecdr.com>
Co-authored-by: Nathan Doef <n.doef@protonmail.com>
Co-authored-by: nathaniel.doef <nathaniel.doef@smilecdr.com>
Parent: 84d4097964
Commit: b1e3468dcd
@@ -4,7 +4,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -129,8 +129,11 @@
 						<goal>check</goal>
 					</goals>
 					<configuration>
-						<failsOnError>true</failsOnError>
 						<logViolationsToConsole>true</logViolationsToConsole>
+						<failsOnError>true</failsOnError>
+						<suppressionsLocation>${maven.multiModuleProjectDirectory}/src/checkstyle/checkstyle_suppressions.xml</suppressionsLocation>
+						<enableRulesSummary>true</enableRulesSummary>
+						<enableSeveritySummary>true</enableSeveritySummary>
 						<consoleOutput>true</consoleOutput>
 						<configLocation>${maven.multiModuleProjectDirectory}/src/checkstyle/checkstyle.xml</configLocation>
 					</configuration>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -145,6 +145,9 @@ public class Constants {
 	public static final String HEADER_SUFFIX_CT_UTF_8 = "; charset=UTF-8";
 	public static final String HEADERVALUE_CORS_ALLOW_METHODS_ALL = "GET, POST, PUT, DELETE, OPTIONS";
 	public static final String HEADER_REWRITE_HISTORY = "X-Rewrite-History";
+	public static final String HEADER_RETRY_ON_VERSION_CONFLICT = "X-Retry-On-Version-Conflict";
+	public static final String HEADER_MAX_RETRIES = "max-retries";
+	public static final String HEADER_RETRY = "retry";
 	public static final Map<Integer, String> HTTP_STATUS_NAMES;
 	public static final String LINK_FHIR_BASE = "fhir-base";
 	public static final String LINK_FIRST = "first";
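
Note on the three HEADER_RETRY* constants added above: they back the opt-in retry-on-version-conflict mechanism exercised by the concurrency-test hunks later in this diff. Going only by the constant values here and the header string the old test code assembled ("retry; max-retries=10"), a client opting in would send something like the following (illustrative header line, not taken verbatim from the diff):

	X-Retry-On-Version-Conflict: retry; max-retries=10

The later FhirResourceDaoR4ConcurrentWriteTest hunks replace hand-built headers of exactly that shape with the new ServletRequestDetails isRetry()/getMaxRetries() accessors via a setupRetryBehaviour() helper.
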
@@ -1,5 +1,25 @@
 package ca.uhn.fhir.system;
 
+/*-
+ * #%L
+ * HAPI FHIR - Core Library
+ * %%
+ * Copyright (C) 2014 - 2022 Smile CDR, Inc.
+ * %%
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ * #L%
+ */
+
 import org.apache.commons.lang3.time.DateUtils;
 
 public final class HapiSystemProperties {

@@ -106,6 +106,7 @@ public enum VersionEnum {
 	V6_1_3,
 	V6_1_4,
 	V6_2_0,
+	V6_2_1,
 	V6_3_0
 	;

@@ -3,14 +3,14 @@
 	<modelVersion>4.0.0</modelVersion>
 	<groupId>ca.uhn.hapi.fhir</groupId>
 	<artifactId>hapi-fhir-bom</artifactId>
-	<version>6.3.0-PRE4-SNAPSHOT</version>
+	<version>6.3.0-PRE5-SNAPSHOT</version>
 	<packaging>pom</packaging>
 	<name>HAPI FHIR BOM</name>
 
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -4,7 +4,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -48,7 +48,7 @@ public class ValidationSupportChainCreator {
 				chain.addValidationSupport(localFileValidationSupport);
 				chain.addValidationSupport(new SnapshotGeneratingValidationSupport(ctx));
 			} catch (IOException e) {
-				throw new RuntimeException(Msg.code(2141) + "Failed to load local profile.", e);
+				throw new RuntimeException(Msg.code(2207) + "Failed to load local profile.", e);
 			}
 		}
 		if (commandLine.hasOption("r")) {
@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir-cli</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../../hapi-deployable-pom</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -4,7 +4,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -4,7 +4,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -0,0 +1,6 @@
+---
+type: fix
+issue: 4240
+jira: SMILE-4892
+title: "Previously, when creating a `DocumentReference` with an `Attachment` containing a URL over 254 characters
+  an error was thrown. This has been corrected and now an `Attachment` URL can be up to 500 characters."
@@ -0,0 +1,4 @@
+---
+type: fix
+issue: 4250
+title: "Previously, SearchParameters with identical codes and bases could be created. This has been corrected. If a SearchParameter is submitted which is a duplicate, it will be rejected."
@@ -0,0 +1,8 @@
+---
+type: fix
+issue: 4267
+title: "Previously, if the `$reindex` operation failed with a `ResourceVersionConflictException` the related
+  batch job would fail. This has been corrected by adding 10 retry attempts for transactions that have
+  failed with a `ResourceVersionConflictException` during the `$reindex` operation. In addition, the `ResourceIdListStep`
+  was submitting one more resource than expected (i.e. 1001 records processed during a `$reindex` operation if only 1000
+  `Resources` were in the database). This has been corrected."
@@ -0,0 +1,4 @@
+---
+type: fix
+issue: 4271
+title: "Database migration steps were failing with Oracle 19C. This has been fixed by allowing the database engine to skip dropping non-existent indexes."
@@ -1,3 +1,3 @@
 ---
-release-date: "2022-11-18"
+release-date: "2022-11-17"
 codename: "Vishwa"
@@ -0,0 +1 @@
+This version fixes a bug with 6.2.0 and previous releases wherein batch jobs that created very large chunk counts could occasionally fail to submit a small proportion of chunks.
@@ -0,0 +1,3 @@
+---
+release-date: "2022-11-17"
+codename: "Vishwa"
@@ -11,7 +11,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -4,7 +4,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>
 	<modelVersion>4.0.0</modelVersion>

@@ -184,6 +184,7 @@ public abstract class BaseHapiScheduler implements IHapiScheduler {
 		ScheduleBuilder<? extends Trigger> schedule = SimpleScheduleBuilder
 			.simpleSchedule()
 			.withIntervalInMilliseconds(theIntervalMillis)
+			.withMisfireHandlingInstructionIgnoreMisfires()//We ignore misfires in cases of multiple JVMs each trying to fire.
 			.repeatForever();
 
 		Trigger trigger = TriggerBuilder.newTrigger()
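
A note on the one-line change above: a Quartz misfire is a trigger whose scheduled fire time passed without it firing, for example while another JVM in a cluster holds the trigger. Per the Quartz javadoc, withMisfireHandlingInstructionIgnoreMisfires() means the trigger is never evaluated by the misfire handler at all; the scheduler simply fires it as soon as it can and updates it as if it had fired on time. A minimal standalone sketch of the same pattern (the interval and identity values are illustrative, not from this diff):

	// Quartz simple trigger that opts out of misfire evaluation entirely
	ScheduleBuilder<? extends Trigger> schedule = SimpleScheduleBuilder
		.simpleSchedule()
		.withIntervalInMilliseconds(60_000)
		.withMisfireHandlingInstructionIgnoreMisfires() // skip misfire evaluation for this trigger
		.repeatForever();
	Trigger trigger = TriggerBuilder.newTrigger()
		.withIdentity("example-trigger")
		.withSchedule(schedule)
		.build();
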
@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -270,6 +270,14 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
 		myWorkChunkRepository.incrementWorkChunkErrorCount(theChunkId, theIncrementBy);
 	}
 
+	@Override
+	@Transactional(propagation = Propagation.REQUIRES_NEW)
+	public boolean canAdvanceInstanceToNextStep(String theInstanceId, String theCurrentStepId) {
+		List<StatusEnum> statusesForStep = myWorkChunkRepository.getDistinctStatusesForStep(theInstanceId, theCurrentStepId);
+		ourLog.debug("Checking whether gated job can advanced to next step. [instanceId={}, currentStepId={}, statusesForStep={}]", theInstanceId, theCurrentStepId, statusesForStep);
+		return statusesForStep.stream().noneMatch(StatusEnum::isIncomplete) && statusesForStep.stream().anyMatch(status -> status == StatusEnum.COMPLETED);
+	}
+
 	/**
 	 * Note: Not @Transactional because {@link #fetchChunks(String, boolean, int, int, Consumer)} starts a transaction
 	 */

@@ -289,6 +297,11 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
 		});
 	}
 
+	@Override
+	public List<String> fetchallchunkidsforstepWithStatus(String theInstanceId, String theStepId, StatusEnum theStatusEnum) {
+		return myTxTemplate.execute(tx -> myWorkChunkRepository.fetchAllChunkIdsForStepWithStatus(theInstanceId, theStepId, theStatusEnum));
+	}
+
 	private void fetchChunksForStep(String theInstanceId, String theStepId, int thePageSize, int thePageIndex, Consumer<WorkChunk> theConsumer) {
 		myTxTemplate.executeWithoutResult(tx -> {
 			List<Batch2WorkChunkEntity> chunks = myWorkChunkRepository.fetchChunksForStep(PageRequest.of(thePageIndex, thePageSize), theInstanceId, theStepId);
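
These two additions implement the gating behind the "Only submit queued chunks" commit in the list above: canAdvanceInstanceToNextStep() reports whether every chunk of the current step has reached a terminal status (with at least one COMPLETED), and fetchallchunkidsforstepWithStatus() narrows chunk retrieval by status. A hedged caller-side sketch of how the two might combine (the surrounding variables and the submitChunk() helper are hypothetical, not part of this diff):

	// Illustrative sketch only -- not the actual HAPI batch2 maintenance code
	if (jobPersistence.canAdvanceInstanceToNextStep(instanceId, currentStepId)) {
		// The current step is fully terminal, so hand only the still-QUEUED
		// chunks of the next gated step to the workers (avoids re-submitting).
		List<String> queuedChunkIds = jobPersistence.fetchallchunkidsforstepWithStatus(
			instanceId, nextStepId, StatusEnum.QUEUED);
		queuedChunkIds.forEach(this::submitChunk); // hypothetical submitter
	}
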
@@ -36,6 +36,9 @@ public interface IBatch2WorkChunkRepository extends JpaRepository<Batch2WorkChun
 	@Query("SELECT e FROM Batch2WorkChunkEntity e WHERE e.myInstanceId = :instanceId ORDER BY e.mySequence ASC")
 	List<Batch2WorkChunkEntity> fetchChunks(Pageable thePageRequest, @Param("instanceId") String theInstanceId);
 
+	@Query("SELECT DISTINCT e.myStatus from Batch2WorkChunkEntity e where e.myInstanceId = :instanceId AND e.myTargetStepId = :stepId")
+	List<StatusEnum> getDistinctStatusesForStep(@Param("instanceId") String theInstanceId, @Param("stepId") String theStepId);
+
 	@Query("SELECT e FROM Batch2WorkChunkEntity e WHERE e.myInstanceId = :instanceId AND e.myTargetStepId = :targetStepId ORDER BY e.mySequence ASC")
 	List<Batch2WorkChunkEntity> fetchChunksForStep(Pageable thePageRequest, @Param("instanceId") String theInstanceId, @Param("targetStepId") String theTargetStepId);

@@ -62,4 +65,10 @@ public interface IBatch2WorkChunkRepository extends JpaRepository<Batch2WorkChun
 	@Modifying
 	@Query("UPDATE Batch2WorkChunkEntity e SET e.myErrorCount = e.myErrorCount + :by WHERE e.myId = :id")
 	void incrementWorkChunkErrorCount(@Param("id") String theChunkId, @Param("by") int theIncrementBy);
 
+	@Query("SELECT e.myId from Batch2WorkChunkEntity e where e.myInstanceId = :instanceId AND e.myTargetStepId = :stepId")
+	List<String> fetchAllChunkIdsForStep(@Param("instanceId") String theInstanceId, @Param("stepId") String theStepId);
+
+	@Query("SELECT e.myId from Batch2WorkChunkEntity e where e.myInstanceId = :instanceId AND e.myTargetStepId = :stepId AND e.myStatus = :status")
+	List<String> fetchAllChunkIdsForStepWithStatus(@Param("instanceId")String theInstanceId, @Param("stepId")String theStepId, @Param("status")StatusEnum theStatus);
 }

@@ -117,6 +117,11 @@ public class HapiFhirJpaMigrationTasks extends BaseMigrationTasks<VersionEnum> {
 			.modifyColumn("20221017.1", "BLOB_SIZE")
 			.nullable()
 			.withType(ColumnTypeEnum.LONG);
+
+		version.onTable("HFJ_SPIDX_URI")
+			.modifyColumn("20221103.1", "SP_URI")
+			.nullable()
+			.withType(ColumnTypeEnum.STRING, 500);
 	}
 
 	private void init610() {

@@ -942,7 +947,7 @@
 		//ConceptMap add version for search
 		Builder.BuilderWithTableName trmConceptMap = version.onTable("TRM_CONCEPT_MAP");
 		trmConceptMap.addColumn("20200910.1", "VER").nullable().type(ColumnTypeEnum.STRING, 200);
-		trmConceptMap.dropIndex("20200910.2", "IDX_CONCEPT_MAP_URL");
+		trmConceptMap.dropIndex("20200910.2", "IDX_CONCEPT_MAP_URL").failureAllowed();
 		trmConceptMap.addIndex("20200910.3", "IDX_CONCEPT_MAP_URL").unique(true).withColumns("URL", "VER");
 
 		//Term CodeSystem Version and Term ValueSet Version

@@ -950,13 +955,13 @@
 		trmCodeSystemVer.addIndex("20200923.1", "IDX_CODESYSTEM_AND_VER").unique(true).withColumns("CODESYSTEM_PID", "CS_VERSION_ID");
 		Builder.BuilderWithTableName trmValueSet = version.onTable("TRM_VALUESET");
 		trmValueSet.addColumn("20200923.2", "VER").nullable().type(ColumnTypeEnum.STRING, 200);
-		trmValueSet.dropIndex("20200923.3", "IDX_VALUESET_URL");
+		trmValueSet.dropIndex("20200923.3", "IDX_VALUESET_URL").failureAllowed();
 		trmValueSet.addIndex("20200923.4", "IDX_VALUESET_URL").unique(true).withColumns("URL", "VER");
 
 		//Term ValueSet Component add system version
 		Builder.BuilderWithTableName trmValueSetComp = version.onTable("TRM_VALUESET_CONCEPT");
 		trmValueSetComp.addColumn("20201028.1", "SYSTEM_VER").nullable().type(ColumnTypeEnum.STRING, 200);
-		trmValueSetComp.dropIndex("20201028.2", "IDX_VS_CONCEPT_CS_CD");
+		trmValueSetComp.dropIndex("20201028.2", "IDX_VS_CONCEPT_CS_CD").failureAllowed();
 		trmValueSetComp.addIndex("20201028.3", "IDX_VS_CONCEPT_CS_CODE").unique(true).withColumns("VALUESET_PID", "SYSTEM_URL", "SYSTEM_VER", "CODEVAL").doNothing();
 	}

@@ -7,7 +7,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -93,6 +93,72 @@ public class MdmSearchExpandingInterceptorIT extends BaseMdmR4Test {
 		return createdResourceIds;
 	}
 
+	private List<String> updateAndLinkNewResources(int theResourceCount) throws InterruptedException {
+		boolean expansion = myDaoConfig.isAllowMdmExpansion();
+		myDaoConfig.setAllowMdmExpansion(false);
+		List<String> createdResourceIds = new ArrayList<>();
+
+		List<String> observationIds = new ArrayList<>();
+		for (int i = 0; i < theResourceCount; i++) {
+			// create patient
+			Patient patient = buildJanePatient();
+			patient.setId("jane-" + i);
+			MdmHelperR4.OutcomeAndLogMessageWrapper withLatch = myMdmHelper.updateWithLatch(addExternalEID(patient, "123"));
+			createdResourceIds.add(withLatch.getDaoMethodOutcome().getId().getIdPart());
+
+			// create observation with patient
+			Observation observation = createObservationWithSubject(createdResourceIds.get(i));
+			// we put the observation ids in a separate list so we can
+			// ensure our returned list is
+			// patient ids, followed by observation ids
+			observationIds.add(observation.getIdElement().getIdPart());
+		}
+
+		assertLinkCount(theResourceCount);
+
+		// add in our observationIds
+		createdResourceIds.addAll(observationIds);
+
+		myDaoConfig.setAllowMdmExpansion(expansion);
+		return createdResourceIds;
+	}
+
+
+	@Test
+	public void testReferenceExpansionWorksWithForcedIds() throws InterruptedException {
+		int resourceCount = 4;
+		List<String> ids = updateAndLinkNewResources(resourceCount);
+		String id = ids.get(0);
+
+		SearchParameterMap searchParameterMap = new SearchParameterMap();
+		searchParameterMap.setLoadSynchronous(true);
+		ReferenceOrListParam referenceOrListParam = new ReferenceOrListParam();
+		referenceOrListParam.addOr(new ReferenceParam("Patient/" + id).setMdmExpand(true));
+		searchParameterMap.add(Observation.SP_SUBJECT, referenceOrListParam);
+
+		//With MDM Expansion disabled, this should return 1 result.
+		myDaoConfig.setAllowMdmExpansion(false);
+		IBundleProvider search = myObservationDao.search(searchParameterMap);
+		assertThat(search.size(), is(equalTo(1)));
+
+		//Once MDM Expansion is allowed, this should now return 4 resourecs.
+		myDaoConfig.setAllowMdmExpansion(true);
+		search = myObservationDao.search(searchParameterMap);
+		assertThat(search.size(), is(equalTo(4)));
+		List<MdmLink> all = myMdmLinkDao.findAll();
+		Long goldenPid = all.get(0).getGoldenResourcePid();
+		IIdType goldenId = myIdHelperService.translatePidIdToForcedId(myFhirContext, "Patient", new ResourcePersistentId(goldenPid));
+		//Verify that expansion by the golden resource id resolves down to everything its links have.
+
+		SearchParameterMap goldenSpMap = new SearchParameterMap();
+		goldenSpMap.setLoadSynchronous(true);
+		ReferenceOrListParam goldenReferenceOrListParam = new ReferenceOrListParam();
+		goldenReferenceOrListParam.addOr(new ReferenceParam(goldenId).setMdmExpand(true));
+		goldenSpMap.add(Observation.SP_SUBJECT, goldenReferenceOrListParam);
+
+		search = myObservationDao.search(goldenSpMap);
+		assertThat(search.size(), is(equalTo(resourceCount)));
+	}
 	@Test
 	public void testReferenceExpansionWorks() throws InterruptedException {
 		int resourceCount = 4;
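
For orientation: setMdmExpand(true) in these tests is the Java-API form of MDM reference expansion. At the REST layer the same behaviour is, as I understand the feature, requested with the :mdm qualifier on a reference search parameter, and it is only honoured when myDaoConfig.setAllowMdmExpansion(true) is configured, e.g.

	GET [base]/Observation?subject:mdm=Patient/123

which expands the referenced patient to every resource linked to the same golden record before the search runs.
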
@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -59,9 +59,11 @@ import static org.apache.commons.lang3.StringUtils.defaultString;
 public class ResourceIndexedSearchParamUri extends BaseResourceIndexedSearchParam {
 
 	/*
-	 * Note that MYSQL chokes on unique indexes for lengths > 255 so be careful here
+	 * Be careful when modifying this value
+	 * MySQL chokes on indexes with combined column length greater than 3052 bytes (768 chars)
+	 * https://dev.mysql.com/doc/refman/8.0/en/innodb-limits.html
 	 */
-	public static final int MAX_LENGTH = 254;
+	public static final int MAX_LENGTH = 500;
 
 	private static final long serialVersionUID = 1L;
 	@Column(name = "SP_URI", nullable = true, length = MAX_LENGTH)
@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -37,6 +37,7 @@ import ca.uhn.fhir.rest.param.TokenParam;
 import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
 import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
 import ca.uhn.fhir.rest.server.exceptions.UnprocessableEntityException;
+import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
 import org.hl7.fhir.instance.model.api.IIdType;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -130,6 +130,7 @@ import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Disabled;
 import org.junit.jupiter.api.Test;
 import org.springframework.beans.factory.annotation.Autowired;
+import org.springframework.http.HttpStatus;
 import org.springframework.test.util.AopTestUtils;
 import org.springframework.transaction.TransactionStatus;
 import org.springframework.transaction.support.TransactionCallback;

@@ -4823,6 +4824,33 @@ public class ResourceProviderDstu3Test extends BaseResourceProviderDstu3Test {
 
 	}
 
+	@Test
+	public void testDocumentReferenceWith500CharAttachmentUrl() throws IOException {
+		final DocumentReference.ReferredDocumentStatus docStatus = DocumentReference.ReferredDocumentStatus.FINAL;
+		final String longUrl = StringUtils.repeat("a", 500);
+
+		DocumentReference submittedDocumentReference = new DocumentReference();
+		submittedDocumentReference.setDocStatus(docStatus);
+
+		Attachment attachment = new Attachment();
+		attachment.setUrl(longUrl);
+		submittedDocumentReference.getContentFirstRep().setAttachment(attachment);
+
+		String json = myFhirContext.newJsonParser().encodeResourceToString(submittedDocumentReference);
+		HttpPost post = new HttpPost(myServerBase + "/DocumentReference");
+		post.setEntity(new StringEntity(json, ContentType.create(Constants.CT_FHIR_JSON, "UTF-8")));
+
+		try (CloseableHttpResponse response = ourHttpClient.execute(post)) {
+			String resp = IOUtils.toString(response.getEntity().getContent(), StandardCharsets.UTF_8);
+			ourLog.info(resp);
+			assertEquals(HttpStatus.CREATED.value(), response.getStatusLine().getStatusCode());
+
+			DocumentReference createdDocumentReferenced = myFhirContext.newJsonParser().parseResource(DocumentReference.class, resp);
+			assertEquals(docStatus, createdDocumentReferenced.getDocStatus());
+			assertEquals(longUrl, createdDocumentReferenced.getContentFirstRep().getAttachment().getUrl());
+		}
+	}
+
 	private String toStr(Date theDate) {
 		return new InstantDt(theDate).getValueAsString();
 	}

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -305,6 +305,39 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 		assertEquals(1, chunks.size());
 		assertEquals(5, chunks.get(0).getErrorCount());
 	}
+	@Test
+	public void testGatedAdvancementByStatus() {
+		// Setup
+		JobInstance instance = createInstance();
+		String instanceId = mySvc.storeNewInstance(instance);
+		String chunkId = storeWorkChunk(DEF_CHUNK_ID, STEP_CHUNK_ID, instanceId, SEQUENCE_NUMBER, null);
+		mySvc.markWorkChunkAsCompletedAndClearData(chunkId, 0);
+
+		boolean canAdvance = mySvc.canAdvanceInstanceToNextStep(instanceId, STEP_CHUNK_ID);
+		assertTrue(canAdvance);
+
+		//Storing a new chunk with QUEUED should prevent advancement.
+		String newChunkId = storeWorkChunk(DEF_CHUNK_ID, STEP_CHUNK_ID, instanceId, SEQUENCE_NUMBER, null);
+
+		canAdvance = mySvc.canAdvanceInstanceToNextStep(instanceId, STEP_CHUNK_ID);
+		assertFalse(canAdvance);
+
+		//Toggle it to complete
+		mySvc.markWorkChunkAsCompletedAndClearData(newChunkId, 0);
+		canAdvance = mySvc.canAdvanceInstanceToNextStep(instanceId, STEP_CHUNK_ID);
+		assertTrue(canAdvance);
+
+		//Create a new chunk and set it in progress.
+		String newerChunkId= storeWorkChunk(DEF_CHUNK_ID, STEP_CHUNK_ID, instanceId, SEQUENCE_NUMBER, null);
+		mySvc.fetchWorkChunkSetStartTimeAndMarkInProgress(newerChunkId);
+		canAdvance = mySvc.canAdvanceInstanceToNextStep(instanceId, STEP_CHUNK_ID);
+		assertFalse(canAdvance);
+
+		//Toggle IN_PROGRESS to complete
+		mySvc.markWorkChunkAsCompletedAndClearData(newerChunkId, 0);
+		canAdvance = mySvc.canAdvanceInstanceToNextStep(instanceId, STEP_CHUNK_ID);
+		assertTrue(canAdvance);
+	}
 
 	@Test
 	public void testMarkChunkAsCompleted_Error() {
@@ -43,6 +43,7 @@ import org.springframework.beans.factory.annotation.Autowired;
 
 import java.io.IOException;
 import java.nio.charset.StandardCharsets;
+import java.util.Arrays;
 import java.util.Collections;
 import java.util.HashMap;
 import java.util.HashSet;

@@ -59,6 +60,7 @@ import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.containsString;
 import static org.hamcrest.Matchers.empty;
 import static org.hamcrest.Matchers.equalTo;
+import static org.hamcrest.Matchers.hasItem;
 import static org.hamcrest.Matchers.hasSize;
 import static org.hamcrest.Matchers.not;
 import static org.junit.jupiter.api.Assertions.assertEquals;

@@ -40,6 +40,11 @@ import org.mockito.junit.jupiter.MockitoExtension;
 import javax.persistence.EntityManager;
 import java.util.List;
 
+
+import java.util.List;
+
+import java.util.List;
+
 import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.junit.jupiter.api.Assertions.assertNotNull;
 import static org.junit.jupiter.api.Assertions.fail;

@ -332,8 +332,7 @@ public class FhirResourceDaoR4ConcurrentWriteTest extends BaseJpaR4Test {
|
||||||
@Test
|
@Test
|
||||||
public void testCreateWithClientAssignedId() {
|
public void testCreateWithClientAssignedId() {
|
||||||
myInterceptorRegistry.registerInterceptor(myRetryInterceptor);
|
myInterceptorRegistry.registerInterceptor(myRetryInterceptor);
|
||||||
String value = UserRequestRetryVersionConflictsInterceptor.RETRY + "; " + UserRequestRetryVersionConflictsInterceptor.MAX_RETRIES + "=10";
|
setupRetryBehaviour(mySrd);
|
||||||
when(mySrd.getHeaders(eq(UserRequestRetryVersionConflictsInterceptor.HEADER_NAME))).thenReturn(Collections.singletonList(value));
|
|
||||||
|
|
||||||
List<Future<?>> futures = new ArrayList<>();
|
List<Future<?>> futures = new ArrayList<>();
|
||||||
for (int i = 0; i < 5; i++) {
|
for (int i = 0; i < 5; i++) {
|
||||||
|
@ -502,8 +501,7 @@ public class FhirResourceDaoR4ConcurrentWriteTest extends BaseJpaR4Test {
|
||||||
@Test
|
@Test
|
||||||
public void testDelete() {
|
public void testDelete() {
|
||||||
myInterceptorRegistry.registerInterceptor(myRetryInterceptor);
|
myInterceptorRegistry.registerInterceptor(myRetryInterceptor);
|
||||||
String value = UserRequestRetryVersionConflictsInterceptor.RETRY + "; " + UserRequestRetryVersionConflictsInterceptor.MAX_RETRIES + "=100";
|
setupRetryBehaviour(mySrd);
|
||||||
when(mySrd.getHeaders(eq(UserRequestRetryVersionConflictsInterceptor.HEADER_NAME))).thenReturn(Collections.singletonList(value));
|
|
||||||
|
|
||||||
IIdType patientId = runInTransaction(() -> {
|
IIdType patientId = runInTransaction(() -> {
|
||||||
Patient p = new Patient();
|
Patient p = new Patient();
|
||||||
|
@@ -664,8 +662,7 @@ public class FhirResourceDaoR4ConcurrentWriteTest extends BaseJpaR4Test {
 @Test
 public void testPatch() {
 myInterceptorRegistry.registerInterceptor(myRetryInterceptor);
-String value = UserRequestRetryVersionConflictsInterceptor.RETRY + "; " + UserRequestRetryVersionConflictsInterceptor.MAX_RETRIES + "=10";
-when(mySrd.getHeaders(eq(UserRequestRetryVersionConflictsInterceptor.HEADER_NAME))).thenReturn(Collections.singletonList(value));
+setupRetryBehaviour(mySrd);

 Patient p = new Patient();
 p.addName().setFamily("FAMILY");
@@ -719,8 +716,7 @@ public class FhirResourceDaoR4ConcurrentWriteTest extends BaseJpaR4Test {
 myInterceptorRegistry.registerInterceptor(myRetryInterceptor);

 ServletRequestDetails srd = mock(ServletRequestDetails.class);
-String value = UserRequestRetryVersionConflictsInterceptor.RETRY + "; " + UserRequestRetryVersionConflictsInterceptor.MAX_RETRIES + "=10";
-when(srd.getHeaders(eq(UserRequestRetryVersionConflictsInterceptor.HEADER_NAME))).thenReturn(Collections.singletonList(value));
+setupRetryBehaviour(srd);
 when(srd.getUserData()).thenReturn(new HashMap<>());
 when(srd.getServer()).thenReturn(new RestfulServer(myFhirContext));
 when(srd.getInterceptorBroadcaster()).thenReturn(new InterceptorService());
@@ -772,8 +768,7 @@ public class FhirResourceDaoR4ConcurrentWriteTest extends BaseJpaR4Test {
 myDaoConfig.setDeleteEnabled(false);

 ServletRequestDetails srd = mock(ServletRequestDetails.class);
-String value = UserRequestRetryVersionConflictsInterceptor.RETRY + "; " + UserRequestRetryVersionConflictsInterceptor.MAX_RETRIES + "=10";
-when(srd.getHeaders(eq(UserRequestRetryVersionConflictsInterceptor.HEADER_NAME))).thenReturn(Collections.singletonList(value));
+setupRetryBehaviour(srd);
 when(srd.getUserData()).thenReturn(new HashMap<>());
 when(srd.getServer()).thenReturn(new RestfulServer(myFhirContext));
 when(srd.getInterceptorBroadcaster()).thenReturn(new InterceptorService());
@@ -824,4 +819,9 @@ public class FhirResourceDaoR4ConcurrentWriteTest extends BaseJpaR4Test {

 }

+private void setupRetryBehaviour(ServletRequestDetails theServletRequestDetails) {
+when(theServletRequestDetails.isRetry()).thenReturn(true);
+when(theServletRequestDetails.getMaxRetries()).thenReturn(10);
+}
+
 }
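Note: the concurrent-write tests above no longer hand-build the retry header string and stub getHeaders(); they stub the two new RequestDetails accessors instead. A minimal sketch of the equivalent inline stubbing, using Mockito as the tests already do:

ServletRequestDetails srd = mock(ServletRequestDetails.class);
when(srd.isRetry()).thenReturn(true);      // opt this request into version-conflict retries
when(srd.getMaxRetries()).thenReturn(10);  // cap retry attempts, as setupRetryBehaviour() does

This keeps the tests decoupled from the header wire format, which is now parsed once in ServletRequestDetails (see the setRetryFields() hunk further down).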
@@ -11,14 +11,20 @@ import ca.uhn.fhir.interceptor.api.Pointcut;
 import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse;
 import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
 import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
+import ca.uhn.fhir.jpa.test.PatientReindexTestHelper;
 import org.hl7.fhir.instance.model.api.IIdType;
 import org.hl7.fhir.r4.model.Observation;
 import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.Disabled;
 import org.junit.jupiter.api.Test;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.MethodSource;
 import org.springframework.beans.factory.annotation.Autowired;

 import javax.annotation.PostConstruct;
 import java.util.List;
+import java.util.stream.Stream;

 import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.hasSize;
@@ -30,10 +36,13 @@ public class ReindexJobTest extends BaseJpaR4Test {
 private IJobCoordinator myJobCoordinator;

 private ReindexTestHelper myReindexTestHelper;
+private PatientReindexTestHelper myPatientReindexTestHelper;

 @PostConstruct
 public void postConstruct() {
 myReindexTestHelper = new ReindexTestHelper(myFhirContext, myDaoRegistry, mySearchParamRegistry);
+boolean incrementVersionAfterReindex = false;
+myPatientReindexTestHelper = new PatientReindexTestHelper(myJobCoordinator, myBatch2JobHelper, myPatientDao, incrementVersionAfterReindex);
 }

 @AfterEach
@@ -167,4 +176,29 @@ public class ReindexJobTest extends BaseJpaR4Test {
 assertEquals("java.lang.Error: foo message", outcome.getErrorMessage());
 }

+private static Stream<Arguments> numResourcesParams(){
+return PatientReindexTestHelper.numResourcesParams();
+}
+
+@ParameterizedTest
+@MethodSource("numResourcesParams")
+@Disabled//TODO Nathan, this is failing intermittently in CI.
+public void testReindex(int theNumResources){
+myPatientReindexTestHelper.testReindex(theNumResources);
+}
+
+@ParameterizedTest
+@MethodSource("numResourcesParams")
+@Disabled//TODO Nathan, this is failing intermittently in CI.
+public void testSequentialReindexOperation(int theNumResources){
+myPatientReindexTestHelper.testSequentialReindexOperation(theNumResources);
+}
+
+@ParameterizedTest
+@MethodSource("numResourcesParams")
+@Disabled//TODO Nathan, this is failing intermittently in CI.
+public void testParallelReindexOperation(int theNumResources){
+myPatientReindexTestHelper.testParallelReindexOperation(theNumResources);
+}
+
 }
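Note on the JUnit 5 wiring above: @MethodSource resolves the named static factory and runs the test once per Arguments entry, so each (currently disabled) test covers the full resource-count matrix from PatientReindexTestHelper.numResourcesParams(). A self-contained sketch of the same pattern, with a hypothetical DemoTest that is not part of this change:

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;
import java.util.stream.Stream;

class DemoTest {
// One invocation per entry; 0 and 1001 probe the empty and multi-chunk edge cases.
private static Stream<Arguments> counts() {
return Stream.of(Arguments.of(0), Arguments.of(1001));
}

@ParameterizedTest
@MethodSource("counts")
void runsOncePerCount(int theNumResources) {
// exercise the reindex path with theNumResources resources
}
}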
@@ -165,6 +165,7 @@ import org.springframework.transaction.support.TransactionCallbackWithoutResult;
 import org.springframework.transaction.support.TransactionTemplate;

 import javax.annotation.Nonnull;
+import javax.sql.DataSource;
 import java.io.BufferedReader;
 import java.io.IOException;
 import java.io.InputStreamReader;
@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -144,6 +144,11 @@
 <artifactId>junit-jupiter-engine</artifactId>
 <scope>compile</scope>
 </dependency>
+<dependency>
+<groupId>org.junit.jupiter</groupId>
+<artifactId>junit-jupiter-params</artifactId>
+<scope>compile</scope>
+</dependency>
 <dependency>
 <groupId>org.hamcrest</groupId>
 <artifactId>hamcrest</artifactId>
@@ -0,0 +1,185 @@
+package ca.uhn.fhir.jpa.test;
+
+/*-
+* #%L
+* HAPI FHIR JPA Server Test Utilities
+* %%
+* Copyright (C) 2014 - 2022 Smile CDR, Inc.
+* %%
+* Licensed under the Apache License, Version 2.0 (the "License");
+* you may not use this file except in compliance with the License.
+* You may obtain a copy of the License at
+*
+* http://www.apache.org/licenses/LICENSE-2.0
+*
+* Unless required by applicable law or agreed to in writing, software
+* distributed under the License is distributed on an "AS IS" BASIS,
+* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+* See the License for the specific language governing permissions and
+* limitations under the License.
+* #L%
+*/
+
+import ca.uhn.fhir.batch2.api.IJobCoordinator;
+import ca.uhn.fhir.batch2.jobs.reindex.ReindexAppCtx;
+import ca.uhn.fhir.batch2.jobs.reindex.ReindexJobParameters;
+import ca.uhn.fhir.batch2.model.JobInstance;
+import ca.uhn.fhir.batch2.model.JobInstanceStartRequest;
+import ca.uhn.fhir.batch2.model.StatusEnum;
+import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
+import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse;
+import ca.uhn.fhir.jpa.partition.SystemRequestDetails;
+import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
+import ca.uhn.fhir.jpa.util.TestUtil;
+import ca.uhn.fhir.rest.api.server.RequestDetails;
+import org.hl7.fhir.instance.model.api.IBaseResource;
+import org.hl7.fhir.r4.model.Patient;
+import org.junit.jupiter.params.provider.Arguments;
+
+import java.util.List;
+import java.util.stream.Stream;
+
+import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.fail;
+
+public class PatientReindexTestHelper {
+
+private static final int VERSION_1 = 1;
+private static final int VERSION_2 = 2;
+private static final int VERSION_3 = 3;
+public static final int JOB_WAIT_TIME = 60;
+
+private final IJobCoordinator myJobCoordinator;
+private final Batch2JobHelper myBatch2JobHelper;
+private final IFhirResourceDao<Patient> myPatientDao;
+private final boolean myIncrementVersionOnReindex;
+
+public static Stream<Arguments> numResourcesParams(){
+return Stream.of(
+Arguments.of(0),
+Arguments.of(1),
+Arguments.of(499),
+Arguments.of(500),
+Arguments.of(750),
+Arguments.of(1000),
+Arguments.of(1001)
+);
+}
+
+public PatientReindexTestHelper(IJobCoordinator theJobCoordinator, Batch2JobHelper theBatch2JobHelper, IFhirResourceDao<Patient> thePatientDao, boolean theIncrementVersionOnReindex) {
+myJobCoordinator = theJobCoordinator;
+myBatch2JobHelper = theBatch2JobHelper;
+myPatientDao = thePatientDao;
+myIncrementVersionOnReindex = theIncrementVersionOnReindex;
+}
+
+public void testReindex(int theNumResources){
+createPatients(theNumResources);
+
+validatePersistedPatients(theNumResources, VERSION_1);
+
+// Reindex 1
+JobInstanceStartRequest reindexRequest1 = createPatientReindexRequest(theNumResources);
+Batch2JobStartResponse reindexResponse1 = myJobCoordinator.startInstance(reindexRequest1);
+JobInstance instance1 = myBatch2JobHelper.awaitJobHasStatus(reindexResponse1.getJobId(), JOB_WAIT_TIME, StatusEnum.COMPLETED);
+
+validateReindexJob(instance1, theNumResources);
+
+int expectedVersion = myIncrementVersionOnReindex ? VERSION_2 : VERSION_1;
+validatePersistedPatients(theNumResources, expectedVersion);
+}
+
+public void testSequentialReindexOperation(int theNumResources){
+createPatients(theNumResources);
+
+validatePersistedPatients(theNumResources, VERSION_1);
+
+// Reindex 1
+JobInstanceStartRequest reindexRequest1 = createPatientReindexRequest(theNumResources);
+Batch2JobStartResponse reindexResponse1 = myJobCoordinator.startInstance(reindexRequest1);
+JobInstance instance1 = myBatch2JobHelper.awaitJobHasStatus(reindexResponse1.getJobId(), JOB_WAIT_TIME, StatusEnum.COMPLETED);
+
+validateReindexJob(instance1, theNumResources);
+int expectedVersion = myIncrementVersionOnReindex ? VERSION_2 : VERSION_1;
+validatePersistedPatients(theNumResources, expectedVersion);
+
+// Reindex 2
+JobInstanceStartRequest reindexRequest2 = createPatientReindexRequest(theNumResources);
+Batch2JobStartResponse reindexResponse2 = myJobCoordinator.startInstance(reindexRequest2);
+JobInstance instance2 = myBatch2JobHelper.awaitJobHasStatus(reindexResponse2.getJobId(), JOB_WAIT_TIME, StatusEnum.COMPLETED);
+
+validateReindexJob(instance2, theNumResources);
+expectedVersion = myIncrementVersionOnReindex ? VERSION_3 : VERSION_1;
+validatePersistedPatients(theNumResources, expectedVersion);
+}
+
+public void testParallelReindexOperation(int theNumResources){
+createPatients(theNumResources);
+
+validatePersistedPatients(theNumResources, VERSION_1);
+
+// Reindex 1
+JobInstanceStartRequest reindexRequest1 = createPatientReindexRequest(theNumResources);
+Batch2JobStartResponse reindexResponse1 = myJobCoordinator.startInstance(reindexRequest1);
+
+// Reindex 2
+JobInstanceStartRequest reindexRequest2 = createPatientReindexRequest(theNumResources);
+Batch2JobStartResponse reindexResponse2 = myJobCoordinator.startInstance(reindexRequest2);
+
+// Wait for jobs to finish
+JobInstance instance1 = myBatch2JobHelper.awaitJobHasStatus(reindexResponse1.getJobId(), JOB_WAIT_TIME, StatusEnum.COMPLETED);
+JobInstance instance2 = myBatch2JobHelper.awaitJobHasStatus(reindexResponse2.getJobId(), JOB_WAIT_TIME, StatusEnum.COMPLETED);
+
+validateReindexJob(instance1, theNumResources);
+
+validateReindexJob(instance2, theNumResources);
+
+int expectedVersion = myIncrementVersionOnReindex ? VERSION_3 : VERSION_1;
+validatePersistedPatients(theNumResources, expectedVersion);
+}
+
+
+private void createPatients(int theNumPatients) {
+RequestDetails requestDetails = new SystemRequestDetails();
+for(int i = 0; i < theNumPatients; i++){
+Patient patient = new Patient();
+patient.getNameFirstRep().setFamily("Family-"+i).addGiven("Given-"+i);
+patient.getIdentifierFirstRep().setValue("Id-"+i);
+myPatientDao.create(patient, requestDetails);
+}
+TestUtil.sleepOneClick();
+}
+
+private void validatePersistedPatients(int theExpectedNumPatients, long theExpectedVersion) {
+RequestDetails requestDetails = new SystemRequestDetails();
+List<IBaseResource> resources = myPatientDao.search(SearchParameterMap.newSynchronous(), requestDetails).getAllResources();
+assertEquals(theExpectedNumPatients, resources.size());
+for(IBaseResource resource : resources){
+assertEquals(Patient.class, resource.getClass());
+Patient patient = (Patient) resource;
+Long actualVersion = patient.getIdElement().getVersionIdPartAsLong();
+if(theExpectedVersion != actualVersion){
+String failureMessage = String.format("Failure for Resource [%s] with index [%s]. Expected version: %s, Actual version: %s",
+patient.getId(), resources.indexOf(resource), theExpectedVersion, actualVersion);
+fail(failureMessage);
+}
+}
+}
+
+private JobInstanceStartRequest createPatientReindexRequest(int theBatchSize) {
+JobInstanceStartRequest startRequest = new JobInstanceStartRequest();
+startRequest.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX);
+
+ReindexJobParameters reindexJobParameters = new ReindexJobParameters();
+reindexJobParameters.setBatchSize(theBatchSize);
+reindexJobParameters.addUrl("Patient?");
+
+startRequest.setParameters(reindexJobParameters);
+return startRequest;
+}
+
+private void validateReindexJob(JobInstance theJobInstance, int theRecordsProcessed) {
+assertEquals(0, theJobInstance.getErrorCount());
+assertEquals(theRecordsProcessed, theJobInstance.getCombinedRecordsProcessed());
+}
+}
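A quick orientation to the helper added above: every scenario creates N patients at version 1, runs one or more reindex jobs, and asserts the version each patient should carry afterwards. The expectation reduces to one line of arithmetic, hinging on the flag the constructor receives:

// Expected version after a number of completed reindex passes:
// with version bumping enabled each pass increments every Patient, otherwise nothing moves.
int expectedVersion = myIncrementVersionOnReindex ? 1 + passes : 1; // passes = 1 -> VERSION_2, passes = 2 -> VERSION_3

The parameter list (0, 1, 499, 500, 750, 1000, 1001) looks chosen to straddle work-chunk boundaries; the exact chunk size is not visible in this diff.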
@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>

@@ -7,7 +7,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
@@ -25,7 +25,6 @@ import ca.uhn.fhir.interceptor.api.Interceptor;
 import ca.uhn.fhir.interceptor.api.Pointcut;
 import ca.uhn.fhir.jpa.api.config.DaoConfig;
 import ca.uhn.fhir.mdm.api.IMdmLinkExpandSvc;
-import ca.uhn.fhir.mdm.svc.MdmLinkExpandSvc;
 import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
 import ca.uhn.fhir.mdm.log.Logs;
 import ca.uhn.fhir.model.api.IQueryParameterType;
@@ -99,7 +98,8 @@ public class MdmSearchExpandingInterceptor {
 ourLog.debug("Parameter has been expanded to: {}", String.join(", ", expandedResourceIds));
 toRemove.add(refParam);
 expandedResourceIds.stream()
-.map(resourceId -> new ReferenceParam(refParam.getResourceType() + "/" + resourceId))
+.map(resourceId -> addResourceTypeIfNecessary(refParam.getResourceType(), resourceId))
+.map(ReferenceParam::new)
 .forEach(toAdd::add);
 }
 }
@@ -112,6 +112,14 @@ public class MdmSearchExpandingInterceptor {
 orList.addAll(toAdd);
 }

+private String addResourceTypeIfNecessary(String theResourceType, String theResourceId) {
+if (theResourceId.contains("/")) {
+return theResourceId;
+} else {
+return theResourceType + "/" + theResourceId;
+}
+}
+
 /**
 * Expands out the provided _id parameter into all the various
 * ids of linked resources.
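Note: addResourceTypeIfNecessary() makes the MDM _id expansion tolerant of already-qualified resource ids. Illustrative values (not from a test in this diff):

// addResourceTypeIfNecessary("Patient", "123")         -> "Patient/123"
// addResourceTypeIfNecessary("Patient", "Patient/123") -> "Patient/123", left as-is

The previous code prepended the resource type unconditionally, which could yield a doubled prefix such as "Patient/Patient/123" when an expanded id already carried its type.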
@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
@@ -77,6 +77,8 @@ public abstract class RequestDetails {
 private String myTransactionGuid;
 private String myFixedConditionalUrl;
 private boolean myRewriteHistory;
+private int myMaxRetries;
+private boolean myRetry;

 /**
 * Constructor
@@ -542,4 +544,21 @@ public abstract class RequestDetails {
 public void setRewriteHistory(boolean theRewriteHistory) {
 myRewriteHistory = theRewriteHistory;
 }
+
+public int getMaxRetries() {
+return myMaxRetries;
+}
+
+public void setMaxRetries(int theMaxRetries) {
+myMaxRetries = theMaxRetries;
+}
+
+public boolean isRetry() {
+return myRetry;
+}
+
+public void setRetry(boolean theRetry) {
+myRetry = theRetry;
+}
 }
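These accessors give server-internal callers a programmatic way to opt a request into version-conflict retries, without synthesizing an HTTP header. A minimal sketch, mirroring how the reindex step later in this diff uses them:

RequestDetails requestDetails = new SystemRequestDetails();
requestDetails.setRetry(true);     // replay the transaction on ResourceVersionConflictException
requestDetails.setMaxRetries(10);  // then give up and rethrow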
@@ -44,11 +44,14 @@ import java.util.ArrayList;
 import java.util.Collections;
 import java.util.Enumeration;
 import java.util.HashMap;
+import java.util.Iterator;
 import java.util.List;
 import java.util.Map;
+import java.util.StringTokenizer;
 import java.util.zip.GZIPInputStream;

 import static org.apache.commons.lang3.StringUtils.isNotBlank;
+import static org.apache.commons.lang3.StringUtils.trim;

 public class ServletRequestDetails extends RequestDetails {
@@ -186,9 +189,36 @@ public class ServletRequestDetails extends RequestDetails {
 if ("true".equals(myServletRequest.getHeader(Constants.HEADER_REWRITE_HISTORY))) {
 setRewriteHistory(true);
 }
+setRetryFields(myServletRequest);
 return this;
 }

+private void setRetryFields(HttpServletRequest theRequest){
+if (theRequest == null){
+return;
+}
+Enumeration<String> headers = theRequest.getHeaders(Constants.HEADER_RETRY_ON_VERSION_CONFLICT);
+if (headers != null) {
+Iterator<String> headerIterator = headers.asIterator();
+while(headerIterator.hasNext()){
+String headerValue = headerIterator.next();
+if (isNotBlank(headerValue)) {
+StringTokenizer tok = new StringTokenizer(headerValue, ";");
+while (tok.hasMoreTokens()) {
+String next = trim(tok.nextToken());
+if (next.equals(Constants.HEADER_RETRY)) {
+setRetry(true);
+} else if (next.startsWith(Constants.HEADER_MAX_RETRIES + "=")) {
+String val = trim(next.substring((Constants.HEADER_MAX_RETRIES + "=").length()));
+int maxRetries = Integer.parseInt(val);
+setMaxRetries(maxRetries);
+}
+}
+}
+}
+}
+}
+
 public void setServletResponse(HttpServletResponse myServletResponse) {
 this.myServletResponse = myServletResponse;
 }
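setRetryFields() lifts the retry directive out of the raw request headers at construction time, so downstream code only consults isRetry()/getMaxRetries(). Assuming the constants keep their documented values (Constants.HEADER_RETRY_ON_VERSION_CONFLICT being "X-Retry-On-Version-Conflict" with "retry" and "max-retries" tokens; the values are not shown in this diff), a client opts in with:

X-Retry-On-Version-Conflict: retry; max-retries=100

which the tokenizer loop above turns into setRetry(true) and setMaxRetries(100) on the request details.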
@@ -7,7 +7,7 @@
 <parent>
 <artifactId>hapi-fhir-serviceloaders</artifactId>
 <groupId>ca.uhn.hapi.fhir</groupId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>

@@ -7,7 +7,7 @@
 <parent>
 <artifactId>hapi-fhir-serviceloaders</artifactId>
 <groupId>ca.uhn.hapi.fhir</groupId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>

@@ -20,7 +20,7 @@
 <dependency>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir-caching-api</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 </dependency>
 <dependency>
 <groupId>com.github.ben-manes.caffeine</groupId>

@@ -7,7 +7,7 @@
 <parent>
 <artifactId>hapi-fhir-serviceloaders</artifactId>
 <groupId>ca.uhn.hapi.fhir</groupId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>

@@ -7,7 +7,7 @@
 <parent>
 <artifactId>hapi-fhir</artifactId>
 <groupId>ca.uhn.hapi.fhir</groupId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../../pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <artifactId>hapi-fhir</artifactId>
 <groupId>ca.uhn.hapi.fhir</groupId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir-spring-boot-samples</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 </parent>

 <artifactId>hapi-fhir-spring-boot-sample-client-apache</artifactId>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir-spring-boot-samples</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 </parent>

 <artifactId>hapi-fhir-spring-boot-sample-client-okhttp</artifactId>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir-spring-boot-samples</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 </parent>

 <artifactId>hapi-fhir-spring-boot-sample-server-jersey</artifactId>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir-spring-boot</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 </parent>

 <artifactId>hapi-fhir-spring-boot-samples</artifactId>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
 <modelVersion>4.0.0</modelVersion>
@@ -31,6 +31,7 @@ import ca.uhn.fhir.batch2.jobs.export.models.ResourceIdList;
 import ca.uhn.fhir.batch2.jobs.models.Id;
 import ca.uhn.fhir.i18n.Msg;
 import ca.uhn.fhir.jpa.api.config.DaoConfig;
+import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
 import ca.uhn.fhir.jpa.bulk.export.api.IBulkExportProcessor;
 import ca.uhn.fhir.jpa.bulk.export.model.ExportPIDIteratorParameters;
 import ca.uhn.fhir.rest.api.server.storage.ResourcePersistentId;
@@ -51,6 +51,8 @@ import java.util.concurrent.TimeUnit;

 public class ReindexStep implements IJobStepWorker<ReindexJobParameters, ResourceIdListWorkChunkJson, VoidModel> {

+public static final int REINDEX_MAX_RETRIES = 10;
+
 private static final Logger ourLog = LoggerFactory.getLogger(ReindexStep.class);
 @Autowired
 private HapiTransactionService myHapiTransactionService;
@@ -73,6 +75,8 @@ public class ReindexStep implements IJobStepWorker<ReindexJobParameters, Resourc
 @Nonnull
 public RunOutcome doReindex(ResourceIdListWorkChunkJson data, IJobDataSink<VoidModel> theDataSink, String theInstanceId, String theChunkId) {
 RequestDetails requestDetails = new SystemRequestDetails();
+requestDetails.setRetry(true);
+requestDetails.setMaxRetries(REINDEX_MAX_RETRIES);
 TransactionDetails transactionDetails = new TransactionDetails();
 myHapiTransactionService.execute(requestDetails, transactionDetails, new ReindexJob(data, requestDetails, transactionDetails, theDataSink, theInstanceId, theChunkId));
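Reindexing touches rows that live traffic may be updating concurrently, so transient version conflicts are expected; marking the system request retryable lets the transaction service replay the chunk instead of failing the whole job. A condensed sketch of the retry loop this enables (hypothetical, not the verbatim HapiTransactionService code):

int maxRetries = theRequestDetails.isRetry() ? theRequestDetails.getMaxRetries() : 0;
for (int attempt = 0; ; attempt++) {
try {
return doExecuteCallback(theCallback); // hypothetical helper name
} catch (ResourceVersionConflictException e) {
if (attempt >= maxRetries) {
throw e; // out of attempts, surface the conflict
}
// otherwise fall through and retry the transaction
}
}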
@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE4-SNAPSHOT</version>
+<version>6.3.0-PRE5-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
@@ -28,6 +28,8 @@ import ca.uhn.fhir.batch2.model.StatusEnum;
 import ca.uhn.fhir.batch2.model.WorkChunk;
 import ca.uhn.fhir.batch2.models.JobInstanceFetchRequest;
 import org.springframework.data.domain.Page;
+import org.springframework.transaction.annotation.Propagation;
+import org.springframework.transaction.annotation.Transactional;

 import java.util.Iterator;
 import java.util.List;
@@ -163,6 +165,9 @@
 */
 void incrementWorkChunkErrorCount(String theChunkId, int theIncrementBy);

+@Transactional(propagation = Propagation.REQUIRES_NEW)
+boolean canAdvanceInstanceToNextStep(String theInstanceId, String theCurrentStepId);
+
 /**
 * Fetches all chunks for a given instance, without loading the data
 *
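canAdvanceInstanceToNextStep() moves the all-chunks-complete check into the database, and REQUIRES_NEW guarantees the query runs in a fresh transaction that sees committed chunk states rather than the maintenance pass's possibly stale in-memory snapshot. A sketch of what an implementation is expected to do (hypothetical; the JPA implementation is not part of this hunk, and fetchDistinctStatusesForStep is an assumed helper):

@Override
@Transactional(propagation = Propagation.REQUIRES_NEW)
public boolean canAdvanceInstanceToNextStep(String theInstanceId, String theCurrentStepId) {
// distinct statuses of the current step's chunks, read fresh from the database
Set<StatusEnum> statuses = fetchDistinctStatusesForStep(theInstanceId, theCurrentStepId);
return statuses.stream().noneMatch(StatusEnum::isIncomplete);
}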
@@ -172,12 +177,14 @@
 */
 List<WorkChunk> fetchWorkChunksWithoutData(String theInstanceId, int thePageSize, int thePageIndex);

+
+
 /**
 * Fetch all chunks for a given instance.
 * @param theInstanceId - instance id
 * @param theWithData - whether or not to include the data
 * @return - an iterator for fetching work chunks
 */
 Iterator<WorkChunk> fetchAllWorkChunksIterator(String theInstanceId, boolean theWithData);

 /**

@@ -226,4 +233,6 @@
 * @param theInstanceId The instance ID
 */
 JobOperationResultJson cancelInstance(String theInstanceId);
+
+List<String> fetchallchunkidsforstepWithStatus(String theInstanceId, String theStepId, StatusEnum theStatusEnum);
 }
@@ -126,6 +126,11 @@ public class SynchronizedJobPersistenceWrapper implements IJobPersistence {
 myWrap.incrementWorkChunkErrorCount(theChunkId, theIncrementBy);
 }

+@Override
+public boolean canAdvanceInstanceToNextStep(String theInstanceId, String theCurrentStepId) {
+return myWrap.canAdvanceInstanceToNextStep(theInstanceId, theCurrentStepId);
+}
+
 @Override
 public synchronized List<WorkChunk> fetchWorkChunksWithoutData(String theInstanceId, int thePageSize, int thePageIndex) {
 return myWrap.fetchWorkChunksWithoutData(theInstanceId, thePageSize, thePageIndex);

@@ -141,7 +146,6 @@ public class SynchronizedJobPersistenceWrapper implements IJobPersistence {
 return myWrap.fetchAllWorkChunksForStepIterator(theInstanceId, theStepId);
 }

-
 @Override
 public synchronized boolean updateInstance(JobInstance theInstance) {
 return myWrap.updateInstance(theInstance);

@@ -166,4 +170,9 @@ public class SynchronizedJobPersistenceWrapper implements IJobPersistence {
 public JobOperationResultJson cancelInstance(String theInstanceId) {
 return myWrap.cancelInstance(theInstanceId);
 }
+
+@Override
+public List<String> fetchallchunkidsforstepWithStatus(String theInstanceId, String theStepId, StatusEnum theStatusEnum) {
+return myWrap.fetchallchunkidsforstepWithStatus(theInstanceId, theStepId, theStatusEnum);
+}
 }
@@ -81,6 +81,7 @@ class WorkChannelMessageHandler implements MessageHandler {
 return;
 }
 WorkChunk workChunk = chunkOpt.get();
+ourLog.debug("Worker picked up chunk. [chunkId={}, stepId={}, startTime={}]", chunkId, workChunk.getTargetStepId(), workChunk.getStartTime());

 JobWorkCursor<?, ?, ?> cursor = buildCursorFromNotification(workNotification);
@@ -100,12 +100,12 @@ public class ResourceIdListStep<PT extends PartitionedJobParameters, IT extends
 previousLastTime = nextChunk.getLastDate().getTime();
 nextStart = nextChunk.getLastDate();

-while (idBuffer.size() >= MAX_BATCH_OF_IDS) {
+while (idBuffer.size() > MAX_BATCH_OF_IDS) {
 List<TypedPidJson> submissionIds = new ArrayList<>();
 for (Iterator<TypedPidJson> iter = idBuffer.iterator(); iter.hasNext(); ) {
 submissionIds.add(iter.next());
 iter.remove();
-if (submissionIds.size() >= MAX_BATCH_OF_IDS) {
+if (submissionIds.size() == MAX_BATCH_OF_IDS) {
 break;
 }
 }
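The bound changes fix an off-by-one in chunk submission. Worked example, taking MAX_BATCH_OF_IDS = 500 purely for illustration (the constant's value is not shown here):

// buffer holds exactly 500 ids
// old: 500 >= 500 -> drained inside the loop; the final flush after the loop
//      could then emit an empty work chunk (the flush code is outside this hunk)
// new: 500 >  500 is false -> one final flush of 500 ids, no empty chunk
// buffer holds 501 ids
// new: loop submits 500 (break fires at submissionIds.size() == 500), final flush submits 1

The inner test can tighten from >= to == because submissionIds grows one element per iteration, so equality is the exact stopping point.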
@@ -23,9 +23,11 @@ package ca.uhn.fhir.batch2.maintenance;

 import ca.uhn.fhir.batch2.model.StatusEnum;
 import ca.uhn.fhir.batch2.model.WorkChunk;
+import ca.uhn.fhir.jpa.batch.log.Logs;
 import com.google.common.collect.ArrayListMultimap;
 import com.google.common.collect.Multimap;
 import org.apache.commons.lang3.ArrayUtils;
+import org.slf4j.Logger;

 import javax.annotation.Nonnull;
 import java.util.Collection;

@@ -36,6 +38,7 @@ import java.util.stream.Collectors;

 import static java.util.Collections.emptyList;
 import static org.apache.commons.lang3.ObjectUtils.defaultIfNull;
+import static org.slf4j.LoggerFactory.getLogger;

 /**
 * While performing cleanup, the cleanup job loads all of the known
@@ -44,6 +47,7 @@ import static org.apache.commons.lang3.ObjectUtils.defaultIfNull;
 * needing to hit the database a second time.
 */
 public class JobChunkProgressAccumulator {
+private static final Logger ourLog = Logs.getBatchTroubleshootingLog();

 private final Set<String> myConsumedInstanceAndChunkIds = new HashSet<>();
 private final Multimap<String, ChunkStatusCountValue> myInstanceIdToChunkStatuses = ArrayListMultimap.create();
@@ -52,6 +56,10 @@ public class JobChunkProgressAccumulator {
 return getChunkIdsWithStatus(theInstanceId, theStepId, theStatuses).size();
 }

+int getTotalChunkCountForInstanceAndStep(String theInstanceId, String theStepId) {
+return myInstanceIdToChunkStatuses.get(theInstanceId).stream().filter(chunkCount -> chunkCount.myStepId.equals(theStepId)).collect(Collectors.toList()).size();
+}
+
 public List<String> getChunkIdsWithStatus(String theInstanceId, String theStepId, StatusEnum... theStatuses) {
 return getChunkStatuses(theInstanceId).stream()
 .filter(t -> t.myStepId.equals(theStepId))
@@ -73,6 +81,7 @@ public class JobChunkProgressAccumulator {
 // Note: If chunks are being written while we're executing, we may see the same chunk twice. This
 // check avoids adding it twice.
 if (myConsumedInstanceAndChunkIds.add(instanceId + " " + chunkId)) {
+ourLog.debug("Adding chunk to accumulator. [chunkId={}, instanceId={}, status={}]", chunkId, instanceId, theChunk.getStatus());
 myInstanceIdToChunkStatuses.put(instanceId, new ChunkStatusCountValue(chunkId, theChunk.getTargetStepId(), theChunk.getStatus()));
 }
 }
@@ -88,6 +97,4 @@ public class JobChunkProgressAccumulator {
 myStatus = theStatus;
 }
 }
-
-
 }
@@ -140,11 +140,9 @@ public class JobInstanceProcessor {

 String instanceId = myInstance.getInstanceId();
 String currentStepId = jobWorkCursor.getCurrentStepId();
-int incompleteChunks = myProgressAccumulator.countChunksWithStatus(instanceId, currentStepId, StatusEnum.getIncompleteStatuses());
-if (incompleteChunks == 0) {
+boolean shouldAdvance = myJobPersistence.canAdvanceInstanceToNextStep(instanceId, currentStepId);
+if (shouldAdvance) {
 String nextStepId = jobWorkCursor.nextStep.getStepId();

 ourLog.info("All processing is complete for gated execution of instance {} step {}. Proceeding to step {}", instanceId, currentStepId, nextStepId);

 if (jobWorkCursor.nextStep.isReductionStep()) {
@@ -154,18 +152,23 @@ public class JobInstanceProcessor {
 processChunksForNextSteps(instanceId, nextStepId);
 }
 } else {
-ourLog.debug("Not ready to advance gated execution of instance {} from step {} to {} because there are {} incomplete work chunks",
-instanceId, currentStepId, jobWorkCursor.nextStep.getStepId(), incompleteChunks);
+ourLog.debug("Not ready to advance gated execution of instance {} from step {} to {}.",
+instanceId, currentStepId, jobWorkCursor.nextStep.getStepId());
 }
 }

 private void processChunksForNextSteps(String instanceId, String nextStepId) {
-List<String> chunksForNextStep = myProgressAccumulator.getChunkIdsWithStatus(instanceId, nextStepId, StatusEnum.QUEUED);
-for (String nextChunkId : chunksForNextStep) {
+List<String> queuedChunksForNextStep = myProgressAccumulator.getChunkIdsWithStatus(instanceId, nextStepId, StatusEnum.QUEUED);
+int totalChunksForNextStep = myProgressAccumulator.getTotalChunkCountForInstanceAndStep(instanceId, nextStepId);
+if (totalChunksForNextStep != queuedChunksForNextStep.size()) {
+ourLog.debug("Total ProgressAccumulator QUEUED chunk count does not match QUEUED chunk size! [instanceId={}, stepId={}, totalChunks={}, queuedChunks={}]", instanceId, nextStepId, totalChunksForNextStep, queuedChunksForNextStep.size());
+}
+List<String> chunksToSubmit = myJobPersistence.fetchallchunkidsforstepWithStatus(instanceId, nextStepId, StatusEnum.QUEUED);
+for (String nextChunkId : chunksToSubmit) {
 JobWorkNotification workNotification = new JobWorkNotification(myInstance, nextStepId, nextChunkId);
 myBatchJobSender.sendWorkChannelMessage(workNotification);
 }
+ourLog.debug("Submitted a batch of chunks for processing. [chunkCount={}, instanceId={}, stepId={}]", chunksToSubmit.size(), instanceId, nextStepId);
 myInstance.setCurrentGatedStepId(nextStepId);
 myJobPersistence.updateInstance(myInstance);
 }
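Two related fixes land in this hunk: the advance decision now comes from persistence (canAdvanceInstanceToNextStep above), and the chunk ids handed to the work channel are re-fetched as QUEUED from the database instead of taken from the accumulator snapshot. The race this closes, as a timeline:

// t0: maintenance pass snapshots the accumulator, sees chunks A and B QUEUED
// t1: a worker commits new chunk C for the same step
// t2 (old): only A and B are submitted from the snapshot; C is stranded in QUEUED
// t2 (new): persistence is re-queried, so A, B and C are all submitted

The count-mismatch debug line exists precisely to make that divergence visible in the logs.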
@@ -21,6 +21,9 @@ package ca.uhn.fhir.batch2.model;
 */

 import ca.uhn.fhir.i18n.Msg;
+import ca.uhn.fhir.jpa.batch.log.Logs;
+import net.bytebuddy.dynamic.ClassFileLocator;
+import org.slf4j.Logger;

 import javax.annotation.Nonnull;
 import java.util.Collections;
@@ -61,6 +64,8 @@
 */
 CANCELLED(true, true);

+private static final Logger ourLog = Logs.getBatchTroubleshootingLog();
+
 private final boolean myIncomplete;
 private final boolean myEnded;
 private static StatusEnum[] ourIncompleteStatuses;
@@ -138,23 +143,37 @@
 if (theOrigStatus == theNewStatus) {
 return true;
 }
+Boolean canTransition;
 switch (theOrigStatus) {
 case QUEUED:
 // initial state can transition to anything
-return true;
+canTransition = true;
+break;
 case IN_PROGRESS:
-return theNewStatus != QUEUED;
+canTransition = theNewStatus != QUEUED;
+break;
 case ERRORED:
-return theNewStatus == FAILED || theNewStatus == COMPLETED || theNewStatus == CANCELLED;
+canTransition = theNewStatus == FAILED || theNewStatus == COMPLETED || theNewStatus == CANCELLED;
+break;
 case COMPLETED:
 case CANCELLED:
 case FAILED:
 // terminal state cannot transition
-return false;
+canTransition = false;
+break;
+default:
+canTransition = null;
+break;
 }

-throw new IllegalStateException(Msg.code(2131) + "Unknown batch state " + theOrigStatus);
+if (canTransition == null){
+throw new IllegalStateException(Msg.code(2131) + "Unknown batch state " + theOrigStatus);
+} else {
+if (!canTransition) {
+ourLog.trace("Tried to execute an illegal state transition. [origStatus={}, newStatus={}]", theOrigStatus, theNewStatus);
+}
+return canTransition;
+}
 }

 public boolean isIncomplete() {
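The switch refactor keeps the transition rules identical; the behavioral change is that an illegal transition is now traced before false is returned, and only a genuinely unknown enum constant still throws. Restating the legal moves:

// QUEUED                        -> any state
// IN_PROGRESS                   -> any state except back to QUEUED
// ERRORED                       -> FAILED, COMPLETED or CANCELLED
// COMPLETED, CANCELLED, FAILED  -> none (terminal)

So, for example, a check of ERRORED -> IN_PROGRESS returns false and leaves a trace entry instead of failing silently.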
@@ -29,6 +29,8 @@ import org.apache.commons.lang3.builder.ToStringBuilder;
 import org.slf4j.Logger;

 import java.util.Date;
+import java.util.HashMap;
+import java.util.Map;
 import java.util.concurrent.TimeUnit;

 class InstanceProgress {
@@ -36,6 +38,7 @@ class InstanceProgress {

 private int myRecordsProcessed = 0;
 private int myIncompleteChunkCount = 0;
+private int myQueuedCount = 0;
 private int myCompleteChunkCount = 0;
 private int myErroredChunkCount = 0;
 private int myFailedChunkCount = 0;
@@ -44,6 +47,7 @@ class InstanceProgress {
 private Long myLatestEndTime = null;
 private String myErrormessage = null;
 private StatusEnum myNewStatus = null;
+private Map<String, Map<StatusEnum, Integer>> myStepToStatusCountMap = new HashMap<>();

 public void addChunk(WorkChunk theChunk) {
 myErrorCountForAllStatuses += theChunk.getErrorCount();
@ -55,6 +59,10 @@ class InstanceProgress {
|
||||||
}
|
}
|
||||||
|
|
||||||
private void updateCompletionStatus(WorkChunk theChunk) {
|
private void updateCompletionStatus(WorkChunk theChunk) {
|
||||||
|
//Update the status map first.
|
||||||
|
Map<StatusEnum, Integer> statusToCountMap = myStepToStatusCountMap.getOrDefault(theChunk.getTargetStepId(), new HashMap<>());
|
||||||
|
statusToCountMap.put(theChunk.getStatus(), statusToCountMap.getOrDefault(theChunk.getStatus(), 0) + 1);
|
||||||
|
|
||||||
switch (theChunk.getStatus()) {
|
switch (theChunk.getStatus()) {
|
||||||
case QUEUED:
|
case QUEUED:
|
||||||
case IN_PROGRESS:
|
case IN_PROGRESS:
|
||||||
|
@ -166,14 +174,19 @@ class InstanceProgress {
|
||||||
|
|
||||||
@Override
|
@Override
|
||||||
public String toString() {
|
public String toString() {
|
||||||
return new ToStringBuilder(this)
|
ToStringBuilder builder = new ToStringBuilder(this)
|
||||||
.append("myIncompleteChunkCount", myIncompleteChunkCount)
|
.append("myIncompleteChunkCount", myIncompleteChunkCount)
|
||||||
.append("myCompleteChunkCount", myCompleteChunkCount)
|
.append("myCompleteChunkCount", myCompleteChunkCount)
|
||||||
.append("myErroredChunkCount", myErroredChunkCount)
|
.append("myErroredChunkCount", myErroredChunkCount)
|
||||||
.append("myFailedChunkCount", myFailedChunkCount)
|
.append("myFailedChunkCount", myFailedChunkCount)
|
||||||
.append("myErrormessage", myErrormessage)
|
.append("myErrormessage", myErrormessage)
|
||||||
.append("myRecordsProcessed", myRecordsProcessed)
|
.append("myRecordsProcessed", myRecordsProcessed);
|
||||||
.toString();
|
|
||||||
|
builder.append("myStepToStatusCountMap", myStepToStatusCountMap);
|
||||||
|
|
||||||
|
return builder.toString();
|
||||||
|
|
||||||
|
|
||||||
}
|
}
|
||||||
|
|
||||||
public StatusEnum getNewStatus() {
|
public StatusEnum getNewStatus() {
|
||||||
|
|
|
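One thing worth flagging in the updateCompletionStatus hunk: `getOrDefault` hands back a fresh inner map when the step id is not yet present, and that fresh map appears never to be put back into `myStepToStatusCountMap`, so the first count recorded for a step would be lost. The usual idiom stores the inner map on first use; a minimal sketch with illustrative names (this is not the HAPI code):

import java.util.HashMap;
import java.util.Map;

class StepStatusTally {
	private final Map<String, Map<String, Integer>> myStepToStatusCount = new HashMap<>();

	// computeIfAbsent inserts the inner map the first time a step is seen,
	// so the increment below is always visible from the outer map.
	void record(String theStepId, String theStatus) {
		myStepToStatusCount
			.computeIfAbsent(theStepId, k -> new HashMap<>())
			.merge(theStatus, 1, Integer::sum);
	}
}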
@@ -50,6 +50,7 @@ public class JobInstanceStatusUpdater {
 			return false;
 		}
 		theJobInstance.setStatus(theNewStatus);
+		ourLog.debug("Updating job instance {} of type {} from {} to {}", theJobInstance.getInstanceId(), theJobInstance.getJobDefinitionId(), origStatus, theNewStatus);
 		return updateInstance(theJobInstance);
 	}
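One detail the JobInstanceStatusUpdater hunk relies on: the debug line logs `origStatus` after `theJobInstance.setStatus(theNewStatus)` has already run, so `origStatus` must have been captured earlier in the method, outside this hunk; logging `theJobInstance.getStatus()` at this point would print the new status twice.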
@@ -43,6 +43,7 @@ import java.util.Optional;
 import java.util.concurrent.atomic.AtomicInteger;
 
 import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertNotEquals;
 import static org.junit.jupiter.api.Assertions.assertNull;
 import static org.junit.jupiter.api.Assertions.assertTrue;
 import static org.junit.jupiter.api.Assertions.fail;
@@ -6,6 +6,7 @@ import ca.uhn.fhir.batch2.jobs.chunk.PartitionedUrlChunkRangeJson;
 import ca.uhn.fhir.batch2.jobs.chunk.ResourceIdListWorkChunkJson;
 import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlListJobParameters;
 import ca.uhn.fhir.batch2.model.JobInstance;
+import ca.uhn.fhir.jpa.api.pid.EmptyResourcePidList;
 import ca.uhn.fhir.jpa.api.pid.HomogeneousResourcePidList;
 import ca.uhn.fhir.jpa.api.pid.IResourcePidList;
 import ca.uhn.fhir.jpa.api.svc.IBatch2DaoSvc;

@@ -74,7 +75,9 @@ public class LoadIdsStepTest {
 		when(myBatch2DaoSvc.fetchResourceIdsPage(eq(DATE_2), eq(DATE_END), eq(DEFAULT_PAGE_SIZE), isNull(), isNull()))
 			.thenReturn(createIdChunk(20000L, 40000L, DATE_3));
 		when(myBatch2DaoSvc.fetchResourceIdsPage(eq(DATE_3), eq(DATE_END), eq(DEFAULT_PAGE_SIZE), isNull(), isNull()))
-			.thenReturn(createIdChunk(40000L, 40040L, DATE_3));
+			.thenReturn(createIdChunk(40000L, 40040L, DATE_4));
+		when(myBatch2DaoSvc.fetchResourceIdsPage(eq(DATE_4), eq(DATE_END), eq(DEFAULT_PAGE_SIZE), isNull(), isNull()))
+			.thenReturn(new EmptyResourcePidList());
 
 		mySvc.run(details, mySink);
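The extra stubs above matter because the third page now reports DATE_4 as its high-water mark, and a fourth fetch starting at DATE_4 returns an EmptyResourcePidList; the empty page is what lets the step's fetch loop terminate. A hedged sketch of that drain-until-empty shape (the interface and method names below are assumptions for illustration, not the real LoadIdsStep or batch2 API):

import java.util.Date;
import java.util.function.Consumer;

// All names below are illustrative assumptions, not the actual batch2 API.
interface PidPage {
	boolean isEmpty();
	Date getLastDate(); // high-water mark, used as the next page's start
}

interface PidSource {
	PidPage fetchPage(Date theStart, Date theEnd);
}

final class IdLoader {
	static void drain(PidSource theSource, Date theStart, Date theEnd, Consumer<PidPage> theSink) {
		Date cursor = theStart;
		while (true) {
			PidPage page = theSource.fetchPage(cursor, theEnd);
			if (page.isEmpty()) {
				return; // the EmptyResourcePidList stub ends the run here
			}
			theSink.accept(page);
			cursor = page.getLastDate();
		}
	}
}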
@@ -38,6 +38,7 @@ import java.util.concurrent.CountDownLatch;
 import java.util.concurrent.ExecutionException;
 import java.util.concurrent.Executors;
 import java.util.concurrent.Future;
+import java.util.stream.Collectors;
 
 import static ca.uhn.fhir.batch2.coordinator.JobCoordinatorImplTest.createWorkChunkStep1;
 import static org.awaitility.Awaitility.await;

@@ -180,9 +181,12 @@ public class JobMaintenanceServiceImplTest extends BaseBatch2Test {
 			JobCoordinatorImplTest.createWorkChunkStep2().setStatus(StatusEnum.QUEUED).setId(CHUNK_ID),
 			JobCoordinatorImplTest.createWorkChunkStep2().setStatus(StatusEnum.QUEUED).setId(CHUNK_ID_2)
 		);
+		when (myJobPersistence.canAdvanceInstanceToNextStep(any(), any())).thenReturn(true);
 		myJobDefinitionRegistry.addJobDefinition(createJobDefinition(JobDefinition.Builder::gatedExecution));
 		when(myJobPersistence.fetchAllWorkChunksIterator(eq(INSTANCE_ID), eq(false)))
 			.thenReturn(chunks.iterator());
+		when(myJobPersistence.fetchallchunkidsforstepWithStatus(eq(INSTANCE_ID), eq(STEP_2), eq(StatusEnum.QUEUED)))
+			.thenReturn(chunks.stream().map(chunk -> chunk.getId()).collect(Collectors.toList()));
 		JobInstance instance1 = createInstance();
 		instance1.setCurrentGatedStepId(STEP_1);
 		when(myJobPersistence.fetchInstances(anyInt(), eq(0))).thenReturn(Lists.newArrayList(instance1));
@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>
 	<modelVersion>4.0.0</modelVersion>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>
 	<modelVersion>4.0.0</modelVersion>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>
@@ -25,6 +25,7 @@ import ca.uhn.fhir.interceptor.api.Interceptor;
 import ca.uhn.fhir.interceptor.api.Pointcut;
 import ca.uhn.fhir.jpa.api.model.ResourceVersionConflictResolutionStrategy;
 import ca.uhn.fhir.jpa.partition.SystemRequestDetails;
+import ca.uhn.fhir.rest.api.Constants;
 import ca.uhn.fhir.rest.api.server.RequestDetails;
 import org.apache.commons.lang3.Validate;

@@ -46,39 +47,26 @@ import static org.apache.commons.lang3.StringUtils.trim;
 @Interceptor
 public class UserRequestRetryVersionConflictsInterceptor {
 
+	/** Deprecated and moved to {@link ca.uhn.fhir.rest.api.Constants#HEADER_RETRY_ON_VERSION_CONFLICT} */
+	@Deprecated
 	public static final String HEADER_NAME = "X-Retry-On-Version-Conflict";
 
+	/** Deprecated and moved to {@link ca.uhn.fhir.rest.api.Constants#HEADER_MAX_RETRIES} */
+	@Deprecated
 	public static final String MAX_RETRIES = "max-retries";
 
+	/** Deprecated and moved to {@link ca.uhn.fhir.rest.api.Constants#HEADER_RETRY} */
+	@Deprecated
 	public static final String RETRY = "retry";
 
 	@Hook(value = Pointcut.STORAGE_VERSION_CONFLICT, order = 100)
 	public ResourceVersionConflictResolutionStrategy check(RequestDetails theRequestDetails) {
 		ResourceVersionConflictResolutionStrategy retVal = new ResourceVersionConflictResolutionStrategy();
-		if (theRequestDetails != null) {
-			List<String> headers = theRequestDetails.getHeaders(HEADER_NAME);
-			if (headers != null) {
-				for (String headerValue : headers) {
-					if (isNotBlank(headerValue)) {
-						StringTokenizer tok = new StringTokenizer(headerValue, ";");
-						while (tok.hasMoreTokens()) {
-							String next = trim(tok.nextToken());
-							if (next.equals(RETRY)) {
-								retVal.setRetry(true);
-							} else if (next.startsWith(MAX_RETRIES + "=")) {
-								String val = trim(next.substring((MAX_RETRIES + "=").length()));
-								int maxRetries = Integer.parseInt(val);
-								maxRetries = Math.min(100, maxRetries);
-								retVal.setMaxRetries(maxRetries);
-							}
-						}
-					}
-				}
-			}
+		boolean shouldSetRetries = theRequestDetails != null && theRequestDetails.isRetry();
+		if (shouldSetRetries) {
+			retVal.setRetry(true);
+			int maxRetries = Math.min(100, theRequestDetails.getMaxRetries());
+			retVal.setMaxRetries(maxRetries);
 		}
 
 		return retVal;

@@ -90,7 +78,7 @@ public class UserRequestRetryVersionConflictsInterceptor {
 	 */
 	public static void addRetryHeader(SystemRequestDetails theRequestDetails, int theMaxRetries) {
 		Validate.inclusiveBetween(1, Integer.MAX_VALUE, theMaxRetries, "Max retries must be > 0");
-		String value = RETRY + "; " + MAX_RETRIES + "=" + theMaxRetries;
-		theRequestDetails.addHeader(HEADER_NAME, value);
+		theRequestDetails.setRetry(true);
+		theRequestDetails.setMaxRetries(theMaxRetries);
 	}
 }
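The net effect of the two hunks above: the interceptor stops tokenizing the X-Retry-On-Version-Conflict header itself and instead trusts the pre-parsed RequestDetails.isRetry() / getMaxRetries() values (keeping the cap of 100 retries), while addRetryHeader sets those fields directly rather than formatting a header string. For reference, the deleted parsing extracted into a self-contained form; this is a reconstruction of the removed logic, not where that logic now lives:

import java.util.StringTokenizer;

// Reconstruction of the parsing deleted above. Input shape, per the old
// addRetryHeader() body: "retry; max-retries=<n>".
final class RetryHeaderParser {
	boolean myRetry;
	int myMaxRetries;

	void parse(String theHeaderValue) {
		StringTokenizer tok = new StringTokenizer(theHeaderValue, ";");
		while (tok.hasMoreTokens()) {
			String next = tok.nextToken().trim();
			if (next.equals("retry")) {
				myRetry = true;
			} else if (next.startsWith("max-retries=")) {
				// the cap of 100 retries is preserved in the new code path
				myMaxRetries = Math.min(100, Integer.parseInt(next.substring("max-retries=".length())));
			}
		}
	}
}

For example, parse("retry; max-retries=3") yields myRetry == true and myMaxRetries == 3.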
@@ -92,11 +92,7 @@ public class SearchParamValidatingInterceptor {
 
 		SearchParameterMap searchParameterMap = extractSearchParameterMap(runtimeSearchParam);
 		if (searchParameterMap != null) {
-			if (isUpliftSearchParam(theResource)) {
-				validateUpliftSp(theRequestDetails, runtimeSearchParam, searchParameterMap);
-			} else {
-				validateStandardSpOnCreate(theRequestDetails, searchParameterMap);
-			}
+			validateStandardSpOnCreate(theRequestDetails, searchParameterMap);
 		}
 	}

@@ -118,58 +114,10 @@ public class SearchParamValidatingInterceptor {
 
 		SearchParameterMap searchParameterMap = extractSearchParameterMap(runtimeSearchParam);
 		if (searchParameterMap != null) {
-			if (isUpliftSearchParam(theResource)) {
-				validateUpliftSp(theRequestDetails, runtimeSearchParam, searchParameterMap);
-			} else {
-				validateStandardSpOnUpdate(theRequestDetails, runtimeSearchParam, searchParameterMap);
-			}
+			validateStandardSpOnUpdate(theRequestDetails, runtimeSearchParam, searchParameterMap);
 		}
 	}
 
-	private void validateUpliftSp(RequestDetails theRequestDetails, @Nonnull RuntimeSearchParam theRuntimeSearchParam, SearchParameterMap theSearchParameterMap) {
-		Validate.notEmpty(getUpliftExtensions(), "You are attempting to validate an Uplift Search Parameter, but have not defined which URLs correspond to uplifted search parameter extensions.");
-
-		IBundleProvider bundleProvider = getDao().search(theSearchParameterMap, theRequestDetails);
-		List<IBaseResource> allResources = bundleProvider.getAllResources();
-		if(isNotEmpty(allResources)) {
-			Set<String> existingIds = allResources.stream().map(resource -> resource.getIdElement().getIdPart()).collect(Collectors.toSet());
-			if (isNewSearchParam(theRuntimeSearchParam, existingIds)) {
-				for (String upliftExtensionUrl: getUpliftExtensions()) {
-					boolean matchesExistingUplift = allResources.stream()
-						.map(sp -> mySearchParameterCanonicalizer.canonicalizeSearchParameter(sp))
-						.filter(sp -> !sp.getExtensions(upliftExtensionUrl).isEmpty())
-						.anyMatch(sp -> isDuplicateUpliftParameter(theRuntimeSearchParam, sp, upliftExtensionUrl));
-
-					if (matchesExistingUplift) {
-						throwDuplicateError();
-					}
-				}
-			}
-		}
-	}
-
-	private boolean isDuplicateUpliftParameter(RuntimeSearchParam theRuntimeSearchParam, RuntimeSearchParam theSp, String theUpliftUrl) {
-		String firstCode = getUpliftChildExtensionValueByUrl(theRuntimeSearchParam, "code", theUpliftUrl);
-		String secondCode = getUpliftChildExtensionValueByUrl(theSp, "code", theUpliftUrl);
-		String firstElementName = getUpliftChildExtensionValueByUrl(theRuntimeSearchParam, "element-name", theUpliftUrl);
-		String secondElementName = getUpliftChildExtensionValueByUrl(theSp, "element-name", theUpliftUrl);
-		return firstCode.equals(secondCode) && firstElementName.equals(secondElementName);
-	}
-
-	private String getUpliftChildExtensionValueByUrl(RuntimeSearchParam theSp, String theUrl, String theUpliftUrl) {
-		List<IBaseExtension<?, ?>> extensions = theSp.getExtensions(theUpliftUrl);
-		Validate.isTrue(extensions.size() == 1);
-		IBaseExtension<?, ?> topLevelExtension = extensions.get(0);
-		List<IBaseExtension> extension = (List<IBaseExtension>) topLevelExtension.getExtension();
-		String subExtensionValue = extension.stream().filter(ext -> ext.getUrl().equals(theUrl)).map(IBaseExtension::getValue)
-			.map(IPrimitiveType.class::cast)
-			.map(IPrimitiveType::getValueAsString)
-			.findFirst()
-			.orElseThrow(() -> new UnprocessableEntityException(Msg.code(2198), "Unable to process Uplift SP addition as the SearchParameter is malformed."));
-		return subExtensionValue;
-	}
-
 	private boolean isNewSearchParam(RuntimeSearchParam theSearchParam, Set<String> theExistingIds) {
 		return theExistingIds
 			.stream()

@@ -190,17 +138,6 @@ public class SearchParamValidatingInterceptor {
 		throw new UnprocessableEntityException(Msg.code(2125) + "Can't process submitted SearchParameter as it is overlapping an existing one.");
 	}
 
-	private boolean isUpliftSearchParam(IBaseResource theResource) {
-		if (theResource instanceof IBaseHasExtensions) {
-			IBaseHasExtensions resource = (IBaseHasExtensions) theResource;
-			return resource.getExtension()
-				.stream()
-				.anyMatch(ext -> getUpliftExtensions().contains(ext.getUrl()));
-		} else {
-			return false;
-		}
-	}
-
 	private boolean isNotSearchParameterResource(IBaseResource theResource){
 		return ! SEARCH_PARAM.equalsIgnoreCase(myFhirContext.getResourceType(theResource));
 	}
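With the uplift-specific branches and helpers removed, every submitted SearchParameter, uplifted or not, flows through the one standard overlap check, which rejects an overlapping submission with HTTP 422 and Msg.code(2125). A rough sketch of what such an overlap check does, with assumed names since the standard-validation internals are not shown in this diff:

import java.util.Map;

// Hedged sketch: reject a candidate whose base+code combination is already
// claimed by a different persisted SearchParameter. All names are illustrative.
final class OverlapCheck {
	static void validate(String theCandidateId, String theBaseAndCode, Map<String, String> theExistingById) {
		for (Map.Entry<String, String> entry : theExistingById.entrySet()) {
			boolean isSelf = entry.getKey().equals(theCandidateId); // updating the same SP is allowed
			if (!isSelf && entry.getValue().equals(theBaseAndCode)) {
				// mirrors the UnprocessableEntityException with Msg.code(2125) above
				throw new IllegalArgumentException("Can't process submitted SearchParameter as it is overlapping an existing one.");
			}
		}
	}
}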
@@ -146,54 +146,6 @@ public class SearchParameterValidatingInterceptorTest {
 
 	}
 
-	@Test
-	public void whenUpliftSearchParameter_thenMoreGranularComparisonSucceeds() {
-		when(myDaoRegistry.getResourceDao(eq(SearchParamValidatingInterceptor.SEARCH_PARAM))).thenReturn(myIFhirResourceDao);
-
-		setPersistedSearchParameters(asList(myExistingSearchParameter));
-
-		SearchParameter newSearchParam = buildSearchParameterWithUpliftExtension(ID2);
-
-		mySearchParamValidatingInterceptor.resourcePreUpdate(null, newSearchParam, myRequestDetails);
-	}
-
-	@Test
-	public void whenUpliftSearchParameter_thenMoreGranularComparisonFails() {
-		when(myDaoRegistry.getResourceDao(eq(SearchParamValidatingInterceptor.SEARCH_PARAM))).thenReturn(myIFhirResourceDao);
-		SearchParameter existingUpliftSp = buildSearchParameterWithUpliftExtension(ID1);
-		setPersistedSearchParameters(asList(existingUpliftSp));
-
-		SearchParameter newSearchParam = buildSearchParameterWithUpliftExtension(ID2);
-
-		try {
-			mySearchParamValidatingInterceptor.resourcePreUpdate(null, newSearchParam, myRequestDetails);
-			fail();
-		} catch (UnprocessableEntityException e) {
-			assertTrue(e.getMessage().contains("2125"));
-		}
-	}
-
-	@Nonnull
-	private SearchParameter buildSearchParameterWithUpliftExtension(String theID) {
-		SearchParameter newSearchParam = buildSearchParameterWithId(theID);
-
-		Extension topLevelExtension = new Extension();
-		topLevelExtension.setUrl(UPLIFT_URL);
-
-		Extension codeExtension = new Extension();
-		codeExtension.setUrl("code");
-		codeExtension.setValue(new CodeType("identifier"));
-
-		Extension elementExtension = new Extension();
-		elementExtension.setUrl("element-name");
-		elementExtension.setValue(new CodeType("patient-identifier"));
-
-		topLevelExtension.addExtension(codeExtension);
-		topLevelExtension.addExtension(elementExtension);
-		newSearchParam.addExtension(topLevelExtension);
-		return newSearchParam;
-	}
-
 	private void setPersistedSearchParameterIds(List<SearchParameter> theSearchParams) {
 		List<ResourcePersistentId> resourcePersistentIds = theSearchParams
 			.stream()

@@ -203,9 +155,6 @@ public class SearchParameterValidatingInterceptorTest {
 		when(myIFhirResourceDao.searchForIds(any(), any())).thenReturn(resourcePersistentIds);
 	}
 
-	private void setPersistedSearchParameters(List<SearchParameter> theSearchParams) {
-		when(myIFhirResourceDao.search(any(), any())).thenReturn(new SimpleBundleProvider(theSearchParams));
-	}
-
 	private SearchParameter buildSearchParameterWithId(String id) {
 		SearchParameter retVal = new SearchParameter();
@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -4,7 +4,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.3.0-PRE4-SNAPSHOT</version>
+		<version>6.3.0-PRE5-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>
Some files were not shown because too many files have changed in this diff.