Rel 6 2 mergeback (#4257)

* jm wrong bundle entry url (#4213)

* Bug test

* here you go

* Generate relative URIs for bundle entry.request.url, as specified

* Point to Jira issue in changelog

* Adjust tests to fixes

Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
Co-authored-by: Tadgh <garygrantgraham@gmail.com>

* improved logging (#4217)

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* Rel 6 1 3 mergeback (#4215)

* Bump for CVE (#3856)

* Bump for CVE

* Bump spring-data version

* Fix compile

* Cut over to spring bom

* Bump to RC1

* remove RC

* do not constrain reindex for common SP updates (#3876)

* only fast-track jobs with exactly one chunk (#3879)

* Fix illegalstateexception when an exception is thrown during stream response (#3882)

* Finish up changelog, minor refactor

* reset buffer only

* Hack for some replacements

* Failure handling

* wip

* Fixed the issue (#3845)

* Fixed the issue

* Changelog modification

* Changelog modification

* Implemented seventh character extended code and the corresponding dis… (#3709)

* Implemented seventh character extended code and the corresponding display

* Modifications

* Changes on previous test according to modifications made in ICD10-CM XML file

* Subscription sending delete events being skipped (#3888)

* fixed bug and added test

* refactor

* Update for CVE (#3895)

* updated pointcuts to work as intended (#3903)

* updated pointcuts to work as intended

* added changelog

* review fixes

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* 3904 during $delete expunge job hibernate search indexed documents are left orphaned (#3905)

* Add test and implementation

* Add changelog

* 3899 code in limits (#3901)

* Add implementation, changelog, test

* Update hapi-fhir-jpaserver-test-utilities/src/test/java/ca/uhn/fhir/jpa/provider/r4/ResourceProviderR4Test.java

Co-authored-by: Ken Stevens <khstevens@gmail.com>

Co-authored-by: Ken Stevens <khstevens@gmail.com>

* 3884 overlapping searchparameter undetected rel 6 1 (#3909)

* Applying all changes from previous dev branch to current one pointing to rel_6_1

* Fixing merge conflict related to Msg.code value.

* Fixing Msg.code value.

* Making checkstyle happy.

* Making sure that all tests are passing.

* Passing all tests after fixing Msg.code

* Passing all tests.

Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* 3745 - fixed NPE for bundle with duplicate conditional create resourc… (#3746)

* 3745 - fixed NPE for bundle with duplicate conditional create resources and a conditional delete

* created unit test for skipping the delete operation while processing duplicate create entries

* moved unit test to FhirSystemDaoR4Test

* 3379 mdm fixes (#3906)

* added MdmLinkCreateSvcImplTest

* fixed creating mdm-link not setting the resource type correctly

* fixed a bug where ResourcePersistenceId was being duplicated instead of passed on

* Update hapi-fhir-jpaserver-mdm/src/test/java/ca/uhn/fhir/jpa/mdm/svc/MdmLinkCreateSvcImplTest.java

Change argument order in tests so that assertEquals takes the expected value and then the actual value

Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>

* added changelog, also changed a setup function in test to @BeforeEach

Co-authored-by: Long Ma <long@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>

* Fix to the issue (#3855)

* Fix to the issue

* Progress

* fixed the issue

* Addressing suggestions

* add response status code to MethodOutcome

* Addressing suggestions

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* Fix for caching appearing broken in batch2 for bulk export jobs (#3912)

* Respect caching in bulk export, fix bug with completed date on empty jobs

* add changelog

* Add impl

* Add breaking test

* Complete failing test

* more broken tests

* Fix more tests

* Fix paging bug

* Fix another brittle test

* 3915 do not collapse rules with filters (#3916)

* do not attempt to merge compartment permissions with filters

* changelog

* Rename to IT for concurrency problems

Co-authored-by: Tadgh <garygrantgraham@gmail.com>

* Version bump

* fix $mdm-submit output (#3917)

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* Gl3407 bundle offset size (#3918)

* begin with failing test

* fixed

* change log

* rollback default count change and corresponding comments

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* Offset interceptor now only works for external calls

* Initialize some beans (esp interceptors) later in the boot process so they don't slow down startup.

* do not reindex searchparam jobs on startup

* Fix oracle non-enterprise attempting online index add (#3925)

* 3922 delete expunge large dataset (#3923)

* lower batch size of delete requests so that we do not get SQL exceptions

* blah

* fix test

* updated tests to not fail

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* add index

* Fix up column grab

* Revert offset mode change

* Revert fix for null/system request details checks for reindex purposes

* Fix bug and add test for SP Validating Interceptor (#3930)

* wip

* Fix up tests

* Fix index online test

* Fix SP validating interceptor logic

* Updating version to: 6.1.1 post release.

* fix compile error

* Deploy to sonatype (#3934)

* adding sonatype profile to checkstyle module

* adding sonatype profile to tinder module

* adding sonatype profile to base pom

* adding final deployToSonatype profile

* wip

* Revert version enum

* Updating version to: 6.1.1 post release.

* Add test, changelog, and implementation

* Add backport info

* Create failing test

* Implemented the fix, fixed existing unit tests

* added changelog

* added test case for no filter, exclude 1 patient

* wip

* Add backport info

* Add info of new version

* Updating version to: 6.1.2 post release.

* bump info and backport for 6.1.2

* Bump for hapi

* Implement bug fixes, add new tests (#4022)

* Implement bug fixes, add new tests

* tidy

* Tidy

* refactor for cleaning

* More tidying

* Lower logging

* Split into nested tests, rename, add todos

* Typo

* Code review

* add backport info

* Updating version to: 6.1.3 post release.

* Updating version to: 6.1.3 post release.

* removed duplicate mention of version 6.1.3 in VersionEnum

* backport pr 4101

* mdm message key (#4111)

* begin with failing test

* fixed 2 tests

* fix tests

* fix tests

* change log

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* backport 6.1.3 docs changes

* fixed typo on doc backport message

* fix test breaking

* Updating version to: 6.1.4 post release.

* wip

Co-authored-by: JasonRoberts-smile <85363818+JasonRoberts-smile@users.noreply.github.com>
Co-authored-by: Qingyixia <106992634+Qingyixia@users.noreply.github.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
Co-authored-by: kateryna-mironova <107507153+kateryna-mironova@users.noreply.github.com>
Co-authored-by: longma1 <32119004+longma1@users.noreply.github.com>
Co-authored-by: Long Ma <long@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: markiantorno <markiantorno@gmail.com>
Co-authored-by: Steven Li <steven@smilecdr.com>

* pin okio-jvm for kotlin vuln (#4216)

* Fix UrlUtil.unescape() by not escaping "+" to " " if this is an "application/..." _outputFormat. (#4220)

* First commit: failing unit test and a TODO with a vague idea of where the bug happens.

* Don't escape "+" in a URL GET parameter if it starts with "application".

* Remove unnecessary TODO.

* Add changelog.

* Code review feedback on naming. Also, make the logic more robust by putting the plus-and-should-escape boolean && in parens.

* Ks 20221031 migration lock (#4224)

* started design

* complete with tests

* changelog

* cleanup

* typo

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* 4207-getpagesoffset-set-to-total-number-of-resources-results-in-inconsistent-amount-of-entries-when-requests-are-sent-consecutively (#4209)

* Added test

* Added solution

* Changelog

* Changes made based on comments

* Fix bug with MDM submit

* fix

* Version bump

* 4234 consent in conjunction with versionedapiconverterinterceptor fails (#4236)

* Add constant for interceptor

* add test, changelog

* Allow Batch2 transition from ERRORED to COMPLETE (#4242)

* Allow Batch2 transition from ERRORED to COMPLETE

* Add changelog

* Test fix

Co-authored-by: James Agnew <james@jamess-mbp.lan>

* 3685 When bulk exporting, if no resource type param is provided, defa… (#4233)

* 3685 When bulk exporting, if no resource type param is provided, default to all registered types.

* Update test case.

* Cleaned up changelog.

* Added test case for multiple resource types.

* Added failing test case for not returning Binary resource.

* Refactor solution.

Co-authored-by: kylejule <kyle.jule@smilecdr.com>

* Add next version

* bulk export permanently reusing cached results (#4249)

* Add test, fix bug, add changelog

* minor refactor

* Fix

* Bump HAPI version

* Remove jetbrains

* Fix broken test

Co-authored-by: jmarchionatto <60409882+jmarchionatto@users.noreply.github.com>
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: JasonRoberts-smile <85363818+JasonRoberts-smile@users.noreply.github.com>
Co-authored-by: Qingyixia <106992634+Qingyixia@users.noreply.github.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
Co-authored-by: kateryna-mironova <107507153+kateryna-mironova@users.noreply.github.com>
Co-authored-by: longma1 <32119004+longma1@users.noreply.github.com>
Co-authored-by: Long Ma <long@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: markiantorno <markiantorno@gmail.com>
Co-authored-by: Steven Li <steven@smilecdr.com>
Co-authored-by: Luke deGruchy <luke.degruchy@smilecdr.com>
Co-authored-by: karneet1212 <112980019+karneet1212@users.noreply.github.com>
Co-authored-by: James Agnew <jamesagnew@gmail.com>
Co-authored-by: James Agnew <james@jamess-mbp.lan>
Co-authored-by: KGJ-software <39975592+KGJ-software@users.noreply.github.com>
Co-authored-by: kylejule <kyle.jule@smilecdr.com>
Tadgh authored on 2022-11-07 18:57:57 -05:00; committed by GitHub
parent 6c9fe710ee
commit d3367cfede
117 changed files with 703 additions and 154 deletions

@@ -4,7 +4,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -24,7 +24,6 @@ import ca.uhn.fhir.rest.client.api.IBasicClient;
 import ca.uhn.fhir.rest.client.api.IGenericClient;
 import ca.uhn.fhir.rest.client.api.IRestfulClient;
 import ca.uhn.fhir.rest.client.api.IRestfulClientFactory;
-import ca.uhn.fhir.tls.TlsAuthentication;
 import ca.uhn.fhir.util.FhirTerser;
 import ca.uhn.fhir.util.ReflectionUtil;
 import ca.uhn.fhir.util.VersionUtil;

@@ -488,9 +488,12 @@ public class UrlUtil {
 		if (theString == null) {
 			return null;
 		}
+		// If the user passes "_outputFormat" as a GET request parameter directly in the URL:
+		final boolean shouldEscapePlus = !theString.startsWith("application/");
+
 		for (int i = 0; i < theString.length(); i++) {
 			char nextChar = theString.charAt(i);
-			if (nextChar == '%' || nextChar == '+') {
+			if (nextChar == '%' || (nextChar == '+' && shouldEscapePlus)) {
 				try {
 					// Yes it would be nice to not use a string "UTF-8" but the equivalent
 					// method that takes Charset is JDK10+ only... sigh....
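Roughly, the new behaviour amounts to the following (a sketch only; the example strings are illustrative and not taken from the test suite):

// '+' is preserved when the value looks like a MIME type, e.g. an _outputFormat
// passed directly as a GET parameter, while percent-escapes still decode:
String outputFormat = UrlUtil.unescape("application/fhir+ndjson");
// -> "application/fhir+ndjson"  (the '+' is no longer turned into a space)

String ordinaryValue = UrlUtil.unescape("John%20Smith+Jr");
// -> "John Smith Jr"  (does not start with "application/", so '+' still decodes to a space)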

@@ -103,8 +103,10 @@ public enum VersionEnum {
 	V6_1_0,
 	V6_1_1,
 	V6_1_2,
+	V6_1_3,
+	V6_1_4,
 	V6_2_0,
-	V6_3_0,
+	V6_3_0
 	;

 	public static VersionEnum latestVersion() {

@@ -1,10 +1,12 @@
 package ca.uhn.fhir.util;

+import ca.uhn.fhir.rest.api.Constants;
 import org.apache.http.message.BasicNameValuePair;
 import org.junit.jupiter.api.Test;

 import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.containsInAnyOrder;
+import static org.junit.jupiter.api.Assertions.assertAll;
 import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.junit.jupiter.api.Assertions.assertFalse;
 import static org.junit.jupiter.api.Assertions.assertTrue;

@@ -45,6 +47,20 @@ public class UrlUtilTest {
 		assertEquals("A%2BB", UrlUtil.escapeUrlParam("A+B"));
 	}

+	@Test
+	public void testUnescape() {
+		assertAll(
+			() -> assertEquals(Constants.CT_JSON, UrlUtil.unescape(Constants.CT_JSON)),
+			() -> assertEquals(Constants.CT_NDJSON, UrlUtil.unescape(Constants.CT_NDJSON)),
+			() -> assertEquals(Constants.CT_XML, UrlUtil.unescape(Constants.CT_XML)),
+			() -> assertEquals(Constants.CT_XML_PATCH, UrlUtil.unescape(Constants.CT_XML_PATCH)),
+			() -> assertEquals(Constants.CT_APPLICATION_GZIP, UrlUtil.unescape(Constants.CT_APPLICATION_GZIP)),
+			() -> assertEquals(Constants.CT_RDF_TURTLE, UrlUtil.unescape(Constants.CT_RDF_TURTLE)),
+			() -> assertEquals(Constants.CT_FHIR_JSON, UrlUtil.unescape(Constants.CT_FHIR_JSON)),
+			() -> assertEquals(Constants.CT_FHIR_NDJSON, UrlUtil.unescape(Constants.CT_FHIR_NDJSON))
+		);
+	}
+
 	@Test
 	public void testIsValid() {
 		assertTrue(UrlUtil.isValid("http://foo"));

@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -3,14 +3,14 @@
 <modelVersion>4.0.0</modelVersion>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir-bom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <packaging>pom</packaging>
 <name>HAPI FHIR BOM</name>

 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>

@@ -4,7 +4,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir-cli</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>

@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../../hapi-deployable-pom</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>

@@ -4,7 +4,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -4,7 +4,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -2234,7 +2234,6 @@ public class GenericClient extends BaseClient implements IGenericClient {
 			return new TransactionExecutable<>(theResources);
 		}
 	}
-
 	private class UpdateInternal extends BaseSearch<IUpdateExecutable, IUpdateWithQueryTyped, MethodOutcome>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -23,6 +23,7 @@ package ca.uhn.hapi.converters.server;
 import ca.uhn.fhir.i18n.Msg;
 import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.context.FhirVersionEnum;
+import ca.uhn.fhir.interceptor.api.Interceptor;
 import ca.uhn.fhir.model.api.IResource;
 import ca.uhn.fhir.rest.api.Constants;
 import ca.uhn.fhir.rest.api.server.RequestDetails;

@@ -30,6 +31,7 @@ import ca.uhn.fhir.rest.api.server.ResponseDetails;
 import ca.uhn.fhir.rest.server.exceptions.AuthenticationException;
 import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
 import ca.uhn.fhir.rest.server.interceptor.InterceptorAdapter;
+import ca.uhn.fhir.rest.server.interceptor.auth.AuthorizationConstants;
 import org.hl7.fhir.converter.NullVersionConverterAdvisor10_30;
 import org.hl7.fhir.converter.NullVersionConverterAdvisor10_40;
 import org.hl7.fhir.convertors.factory.VersionConvertorFactory_10_30;

@@ -54,6 +56,8 @@ import static org.apache.commons.lang3.StringUtils.*;
  * Versioned API features.
  * </p>
  */
+@Interceptor(order = AuthorizationConstants.ORDER_CONVERTER_INTERCEPTOR)
 public class VersionedApiConverterInterceptor extends InterceptorAdapter {
 	private final FhirContext myCtxDstu2;
 	private final FhirContext myCtxDstu2Hl7Org;
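For context, this interceptor is registered on the server like any other HAPI interceptor; a minimal sketch, assuming a standard servlet-based RestfulServer:

// Register the converter so an Accept header such as
// "application/fhir+json; fhirVersion=3.0" is converted on the way out.
// The @Interceptor(order = ...) annotation above keeps it ordered
// predictably relative to authorization and consent interceptors.
RestfulServer server = new RestfulServer(FhirContext.forR4());
server.registerInterceptor(new VersionedApiConverterInterceptor());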

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -0,0 +1,3 @@
+---
+release-date: "2022-10-06"
+codename: "Unicorn"

@@ -1,5 +1,6 @@
 ---
 type: fix
 issue: 4111
+backport: 6.1.3
 title: "MDM messages were using the resource id as a message key when it should be using the EID as a partition hash key.
 This could lead to duplicate golden resources on systems using Kafka as a message broker."

@@ -0,0 +1,6 @@
+---
+type: fix
+issue: 4207
+title: "Previously to improve performance, if the total number of resources was less than the _getpageoffset,
+  the results would default to last resource offset. This is especially evident when requests are consecutive
+  resulting in one entry being displayed in some requests. This issue is now fixed."

@@ -0,0 +1,5 @@
+---
+type: fix
+issue: 4210
+jira: SMILE-4685
+title: "Generating Bundle with resources was setting `entry.request.url` as absolute url when it should be relative. This has been fixed"

@@ -0,0 +1,4 @@
+---
+type: fix
+issue: 4218
+title: "Performing a bulk export with an _outputParam value encoded in a GET request URL that contains a '+' (ex: 'application/fhir+ndjson') will result in a 400 because the '+' is replaced with a ' '. After this fix the '+' will remain in the parameter value."

@@ -0,0 +1,4 @@
+---
+type: add
+issue: 4224
+title: "Added new System Property called 'CLEAR_LOCK_TABLE_WITH_DESCRIPTION' that when set to the uuid of a lock record, will clear that lock record before attempting to insert a new one."

@@ -0,0 +1,4 @@
+---
+type: fix
+issue: 4232
+title: "Previously during Bulk Export, if no `_type` parameter was provided, an error would be thrown. This has been changed, and if the `_type` parameter is omitted, Bulk Export will default to all registered types."

@@ -0,0 +1,6 @@
+---
+type: fix
+issue: 4234
+jira: SMILE-4765
+title: "Fixed a bug which caused a failure when combining a Consent Interceptor with version conversion via the `Accept` header."

@@ -0,0 +1,4 @@
+---
+type: fix
+issue: 4247
+title: "Previously, Bulk Export jobs were always reused, even if completed. Now, jobs are only reused if an identical job is already running, and has not yet completed or failed."

@@ -1,3 +1,3 @@
 ---
 release-date: "2022-11-18"
-codename: "TBD"
+codename: "Vishwa"

@@ -1,3 +1,3 @@
 ---
-release-date: "TBD"
+release-date: "2022-02-18"
 codename: "TBD"

@@ -11,7 +11,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -4,7 +4,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
 <modelVersion>4.0.0</modelVersion>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -35,6 +35,7 @@ import ca.uhn.fhir.jpa.entity.Batch2JobInstanceEntity;
 import ca.uhn.fhir.jpa.entity.Batch2WorkChunkEntity;
 import ca.uhn.fhir.jpa.util.JobInstanceUtil;
 import ca.uhn.fhir.model.api.PagingIterator;
+import ca.uhn.fhir.narrative.BaseThymeleafNarrativeGenerator;
 import org.apache.commons.collections4.ListUtils;
 import org.apache.commons.lang3.Validate;
 import org.slf4j.Logger;

@@ -160,6 +160,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
 	public static final String BASE_RESOURCE_NAME = "resource";
 	private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(BaseHapiFhirResourceDao.class);
 	@Autowired
 	protected PlatformTransactionManager myPlatformTransactionManager;
 	@Autowired(required = false)

@@ -7,7 +7,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -37,6 +37,7 @@ import ca.uhn.fhir.rest.param.TokenParam;
 import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
 import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
 import ca.uhn.fhir.rest.server.exceptions.UnprocessableEntityException;
+import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
 import org.hl7.fhir.instance.model.api.IIdType;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;

@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -4,11 +4,13 @@ import ca.uhn.fhir.jpa.model.util.JpaConstants;
 import ca.uhn.fhir.rest.server.exceptions.NotImplementedOperationException;
 import org.hl7.fhir.dstu3.model.Bundle;
 import org.hl7.fhir.dstu3.model.CapabilityStatement;
+import org.hl7.fhir.dstu3.model.CarePlan;
 import org.hl7.fhir.dstu3.model.Parameters;
 import org.hl7.fhir.dstu3.model.PrimitiveType;
 import org.hl7.fhir.dstu3.model.StringType;
 import org.junit.jupiter.api.Test;

+import java.util.ArrayList;
 import java.util.List;
 import java.util.Optional;

@@ -56,7 +58,27 @@ public class ResourceProviderDstu3BundleTest extends BaseResourceProviderDstu3Te
 		assertTrue(searchInclude.stream().map(PrimitiveType::getValue).anyMatch(stringRevIncludes -> stringRevIncludes.equals("Patient:general-practitioner")));
 		assertEquals(searchInclude.size(), 4);
 		}
 	}

+	@Test
+	void testTransactionBundleEntryUri() {
+		CarePlan carePlan = new CarePlan();
+		carePlan.getText().setDivAsString("A CarePlan");
+		carePlan.setId("ACarePlan");
+		ourClient.create().resource(carePlan).execute();
+
+		// GET CarePlans from server
+		Bundle bundle = ourClient.search()
+			.byUrl(ourServerBase + "/CarePlan")
+			.returnBundle(Bundle.class).execute();
+
+		// Create and populate list of CarePlans
+		List<CarePlan> carePlans = new ArrayList<>();
+		bundle.getEntry().forEach(entry -> carePlans.add((CarePlan) entry.getResource()));
+
+		// Post CarePlans should not get: HAPI-2006: Unable to perform PUT, URL provided is invalid...
+		ourClient.transaction().withResources(carePlans).execute();
+	}
 }

@@ -5,7 +5,6 @@ import ca.uhn.fhir.jpa.api.config.DaoConfig;
 import ca.uhn.fhir.jpa.dao.data.ISearchDao;
 import ca.uhn.fhir.jpa.entity.Search;
 import ca.uhn.fhir.jpa.search.SearchCoordinatorSvcImpl;
-import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
 import ca.uhn.fhir.jpa.util.QueryParameterUtils;
 import ca.uhn.fhir.model.api.TemporalPrecisionEnum;
 import ca.uhn.fhir.model.primitive.InstantDt;

@@ -17,7 +16,6 @@ import ca.uhn.fhir.rest.api.Constants;
 import ca.uhn.fhir.rest.api.MethodOutcome;
 import ca.uhn.fhir.rest.api.SearchTotalModeEnum;
 import ca.uhn.fhir.rest.api.SummaryEnum;
-import ca.uhn.fhir.rest.api.server.IBundleProvider;
 import ca.uhn.fhir.rest.client.api.IClientInterceptor;
 import ca.uhn.fhir.rest.client.api.IGenericClient;
 import ca.uhn.fhir.rest.client.api.IHttpRequest;

@@ -30,7 +28,6 @@ import ca.uhn.fhir.rest.param.ParamPrefixEnum;
 import ca.uhn.fhir.rest.param.StringAndListParam;
 import ca.uhn.fhir.rest.param.StringOrListParam;
 import ca.uhn.fhir.rest.param.StringParam;
-import ca.uhn.fhir.rest.param.TokenParam;
 import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
 import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
 import ca.uhn.fhir.rest.server.exceptions.PreconditionFailedException;

@@ -107,7 +104,6 @@ import org.hl7.fhir.dstu3.model.Organization;
 import org.hl7.fhir.dstu3.model.Parameters;
 import org.hl7.fhir.dstu3.model.Patient;
 import org.hl7.fhir.dstu3.model.Period;
-import org.hl7.fhir.dstu3.model.Person;
 import org.hl7.fhir.dstu3.model.PlanDefinition;
 import org.hl7.fhir.dstu3.model.Practitioner;
 import org.hl7.fhir.dstu3.model.ProcedureRequest;

@@ -2354,7 +2350,7 @@ public class ResourceProviderDstu3Test extends BaseResourceProviderDstu3Test {
 		assertEquals(1, ((Patient) history.getEntry().get(0).getResource()).getName().size());
 		assertEquals(HTTPVerb.DELETE, history.getEntry().get(1).getRequest().getMethodElement().getValue());
-		assertEquals("http://localhost:" + ourPort + "/fhir/context/Patient/" + id.getIdPart() + "/_history/2", history.getEntry().get(1).getRequest().getUrl());
+		assertEquals("Patient/" + id.getIdPart() + "/_history/2", history.getEntry().get(1).getRequest().getUrl());
 		assertEquals(null, history.getEntry().get(1).getResource());
 		assertEquals(id.withVersion("1").getValue(), history.getEntry().get(2).getResource().getId());

@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.3.0-PRE2-SNAPSHOT</version>
+<version>6.3.0-PRE3-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -2,6 +2,7 @@ package ca.uhn.fhir.jpa.bulk;
 import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.jpa.api.config.DaoConfig;
+import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
 import ca.uhn.fhir.jpa.api.model.Batch2JobInfo;
 import ca.uhn.fhir.jpa.api.model.Batch2JobOperationResult;
 import ca.uhn.fhir.jpa.api.model.BulkExportJobResults;

@@ -57,6 +58,7 @@ import java.util.ArrayList;
 import java.util.Date;
 import java.util.HashMap;
 import java.util.List;
+import java.util.Set;
 import java.util.concurrent.TimeUnit;
 import java.util.stream.Stream;

@@ -67,12 +69,15 @@ import static org.hamcrest.Matchers.equalTo;
 import static org.hamcrest.Matchers.is;
 import static org.hamcrest.Matchers.notNullValue;
 import static org.hamcrest.Matchers.nullValue;
+import static org.junit.jupiter.api.Assertions.assertAll;
 import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.junit.jupiter.api.Assertions.assertNotNull;
 import static org.junit.jupiter.api.Assertions.assertTrue;
 import static org.mockito.ArgumentMatchers.any;
 import static org.mockito.ArgumentMatchers.anyString;
 import static org.mockito.Mockito.eq;
+import static org.mockito.Mockito.lenient;
+import static org.mockito.Mockito.mock;
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;

@@ -93,7 +98,7 @@ public class BulkDataExportProviderTest {
 	private IBatch2JobRunner myJobRunner;
 	private DaoConfig myDaoConfig;
+	private DaoRegistry myDaoRegistry;
 	private CloseableHttpClient myClient;
 	@InjectMocks

@@ -136,6 +141,9 @@ public class BulkDataExportProviderTest {
 	public void injectDaoConfig() {
 		myDaoConfig = new DaoConfig();
 		myProvider.setDaoConfig(myDaoConfig);
+		myDaoRegistry = mock(DaoRegistry.class);
+		lenient().when(myDaoRegistry.getRegisteredDaoTypes()).thenReturn(Set.of("Patient", "Observation", "Encounter"));
+		myProvider.setDaoRegistry(myDaoRegistry);
 	}

 	public void startWithFixedBaseUrl() {

@@ -863,6 +871,52 @@ public class BulkDataExportProviderTest {
 		}
 	}

+	@Test
+	public void testGetBulkExport_outputFormat_FhirNdJson_inHeader() throws IOException {
+		// when
+		when(myJobRunner.startNewJob(any()))
+			.thenReturn(createJobStartResponse());
+
+		// call
+		final HttpGet httpGet = new HttpGet(String.format("http://localhost:%s/%s", myPort, JpaConstants.OPERATION_EXPORT));
+		httpGet.addHeader("_outputFormat", Constants.CT_FHIR_NDJSON);
+		httpGet.addHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC);
+
+		try (CloseableHttpResponse response = myClient.execute(httpGet)) {
+			ourLog.info("Response: {}", response.toString());
+			assertEquals(202, response.getStatusLine().getStatusCode());
+			assertEquals("Accepted", response.getStatusLine().getReasonPhrase());
+			assertEquals(String.format("http://localhost:%s/$export-poll-status?_jobId=%s", myPort, A_JOB_ID), response.getFirstHeader(Constants.HEADER_CONTENT_LOCATION).getValue());
+			assertTrue(IOUtils.toString(response.getEntity().getContent(), Charsets.UTF_8).isEmpty());
+		}
+
+		final BulkExportParameters params = verifyJobStart();
+		assertEquals(Constants.CT_FHIR_NDJSON, params.getOutputFormat());
+	}
+
+	@Test
+	public void testGetBulkExport_outputFormat_FhirNdJson_inUrl() throws IOException {
+		// when
+		when(myJobRunner.startNewJob(any()))
+			.thenReturn(createJobStartResponse());
+
+		// call
+		final HttpGet httpGet = new HttpGet(String.format("http://localhost:%s/%s?_outputFormat=%s", myPort, JpaConstants.OPERATION_EXPORT, Constants.CT_FHIR_NDJSON));
+		httpGet.addHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC);
+
+		try (CloseableHttpResponse response = myClient.execute(httpGet)) {
+			assertAll(
+				() -> assertEquals(202, response.getStatusLine().getStatusCode()),
+				() -> assertEquals("Accepted", response.getStatusLine().getReasonPhrase()),
+				() -> assertEquals(String.format("http://localhost:%s/$export-poll-status?_jobId=%s", myPort, A_JOB_ID), response.getFirstHeader(Constants.HEADER_CONTENT_LOCATION).getValue()),
+				() -> assertTrue(IOUtils.toString(response.getEntity().getContent(), Charsets.UTF_8).isEmpty())
+			);
+		}
+
+		final BulkExportParameters params = verifyJobStart();
+		assertEquals(Constants.CT_FHIR_NDJSON, params.getOutputFormat());
+	}
+
 	private void callExportAndAssertJobId(Parameters input, String theExpectedJobId) throws IOException {
 		HttpPost post;
 		post = new HttpPost("http://localhost:" + myPort + "/" + JpaConstants.OPERATION_EXPORT);

@@ -25,12 +25,14 @@ import org.hl7.fhir.instance.model.api.IBaseResource;
 import org.hl7.fhir.r4.model.Binary;
 import org.hl7.fhir.r4.model.Bundle;
 import org.hl7.fhir.r4.model.Coverage;
+import org.hl7.fhir.r4.model.Encounter;
 import org.hl7.fhir.r4.model.Enumerations;
 import org.hl7.fhir.r4.model.Group;
 import org.hl7.fhir.r4.model.IdType;
 import org.hl7.fhir.r4.model.Observation;
 import org.hl7.fhir.r4.model.Patient;
 import org.hl7.fhir.r4.model.Reference;
+import org.jetbrains.annotations.NotNull;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Nested;

@@ -41,6 +43,7 @@ import org.springframework.beans.factory.annotation.Autowired;
 import java.io.IOException;
 import java.nio.charset.StandardCharsets;
+import java.util.Arrays;
 import java.util.Collections;
 import java.util.HashMap;
 import java.util.HashSet;

@@ -48,6 +51,7 @@ import java.util.List;
 import java.util.Map;
 import java.util.Optional;
 import java.util.Set;
+import java.util.stream.Collectors;

 import static org.awaitility.Awaitility.await;
 import static org.hamcrest.CoreMatchers.is;

@@ -56,6 +60,7 @@ import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.containsString;
 import static org.hamcrest.Matchers.empty;
 import static org.hamcrest.Matchers.equalTo;
+import static org.hamcrest.Matchers.hasItem;
 import static org.hamcrest.Matchers.hasSize;
 import static org.hamcrest.Matchers.not;
 import static org.junit.jupiter.api.Assertions.assertEquals;

@@ -63,7 +68,6 @@ import static org.junit.jupiter.api.Assertions.assertNotNull;
 import static org.junit.jupiter.api.Assertions.assertTrue;

 public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 	private static final Logger ourLog = LoggerFactory.getLogger(BulkExportUseCaseTest.class);

@@ -76,6 +80,29 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 	@Nested
 	public class SpecConformanceTests {
@Test
public void testBatchJobsAreOnlyReusedIfInProgress() throws IOException {
//Given a patient exists
Patient p = new Patient();
p.setId("Pat-1");
myClient.update().resource(p).execute();
//And Given we start a bulk export job
String pollingLocation = submitBulkExportForTypes("Patient");
String jobId = getJobIdFromPollingLocation(pollingLocation);
myBatch2JobHelper.awaitJobCompletion(jobId);
//When we execute another batch job, it should not have the same job id.
String secondPollingLocation = submitBulkExportForTypes("Patient");
String secondJobId = getJobIdFromPollingLocation(secondPollingLocation);
//Then the job id should be different
assertThat(secondJobId, not(equalTo(jobId)));
myBatch2JobHelper.awaitJobCompletion(secondJobId);
}
 	@Test
 	public void testPollingLocationContainsAllRequiredAttributesUponCompletion() throws IOException {
@@ -85,14 +112,8 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 		myClient.update().resource(p).execute();

 		//And Given we start a bulk export job
-		HttpGet httpGet = new HttpGet(myClient.getServerBase() + "/$export?_type=Patient");
-		httpGet.addHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC);
-		String pollingLocation;
-		try (CloseableHttpResponse status = ourHttpClient.execute(httpGet)) {
-			Header[] headers = status.getHeaders("Content-Location");
-			pollingLocation = headers[0].getValue();
-		}
-		String jobId = pollingLocation.substring(pollingLocation.indexOf("_jobId=") + 7);
+		String pollingLocation = submitBulkExportForTypes("Patient");
+		String jobId = getJobIdFromPollingLocation(pollingLocation);
 		myBatch2JobHelper.awaitJobCompletion(jobId);

 		//Then: When the poll shows as complete, all attributes should be filled.
@@ -113,6 +134,143 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 		assertThat(responseContent, containsString("\"error\" : [ ]"));
 		}
 	}
@NotNull
private String getJobIdFromPollingLocation(String pollingLocation) {
return pollingLocation.substring(pollingLocation.indexOf("_jobId=") + 7);
}
@Test
public void export_shouldExportPatientResource_whenTypeParameterOmitted() throws IOException {
//Given a patient exists
Patient p = new Patient();
p.setId("Pat-1");
myClient.update().resource(p).execute();
//And Given we start a bulk export job
HttpGet httpGet = new HttpGet(myClient.getServerBase() + "/$export");
httpGet.addHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC);
String pollingLocation;
try (CloseableHttpResponse status = ourHttpClient.execute(httpGet)) {
Header[] headers = status.getHeaders("Content-Location");
pollingLocation = headers[0].getValue();
}
String jobId = getJobIdFromPollingLocation(pollingLocation);
myBatch2JobHelper.awaitJobCompletion(jobId);
//Then: When the poll shows as complete, all attributes should be filled.
HttpGet statusGet = new HttpGet(pollingLocation);
String expectedOriginalUrl = myClient.getServerBase() + "/$export";
try (CloseableHttpResponse status = ourHttpClient.execute(statusGet)) {
String responseContent = IOUtils.toString(status.getEntity().getContent(), StandardCharsets.UTF_8);
ourLog.info(responseContent);
BulkExportResponseJson result = JsonUtil.deserialize(responseContent, BulkExportResponseJson.class);
assertThat(result.getRequest(), is(equalTo(expectedOriginalUrl)));
assertThat(result.getRequiresAccessToken(), is(equalTo(true)));
assertThat(result.getTransactionTime(), is(notNullValue()));
assertThat(result.getOutput(), is(not(empty())));
//We assert specifically on content as the deserialized version will "helpfully" fill in missing fields.
assertThat(responseContent, containsString("\"error\" : [ ]"));
}
}
@Test
public void export_shouldExportPatientAndObservationAndEncounterResources_whenTypeParameterOmitted() throws IOException {
Patient patient = new Patient();
patient.setId("Pat-1");
myClient.update().resource(patient).execute();
Observation observation = new Observation();
observation.setId("Obs-1");
myClient.update().resource(observation).execute();
Encounter encounter = new Encounter();
encounter.setId("Enc-1");
myClient.update().resource(encounter).execute();
HttpGet httpGet = new HttpGet(myClient.getServerBase() + "/$export");
httpGet.addHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC);
String pollingLocation;
try (CloseableHttpResponse status = ourHttpClient.execute(httpGet)) {
Header[] headers = status.getHeaders("Content-Location");
pollingLocation = headers[0].getValue();
}
String jobId = getJobIdFromPollingLocation(pollingLocation);
myBatch2JobHelper.awaitJobCompletion(jobId);
HttpGet statusGet = new HttpGet(pollingLocation);
String expectedOriginalUrl = myClient.getServerBase() + "/$export";
try (CloseableHttpResponse status = ourHttpClient.execute(statusGet)) {
String responseContent = IOUtils.toString(status.getEntity().getContent(), StandardCharsets.UTF_8);
BulkExportResponseJson result = JsonUtil.deserialize(responseContent, BulkExportResponseJson.class);
assertThat(result.getRequest(), is(equalTo(expectedOriginalUrl)));
assertThat(result.getRequiresAccessToken(), is(equalTo(true)));
assertThat(result.getTransactionTime(), is(notNullValue()));
assertEquals(result.getOutput().size(), 3);
assertEquals(1, result.getOutput().stream().filter(o -> o.getType().equals("Patient")).collect(Collectors.toList()).size());
assertEquals(1, result.getOutput().stream().filter(o -> o.getType().equals("Observation")).collect(Collectors.toList()).size());
assertEquals(1, result.getOutput().stream().filter(o -> o.getType().equals("Encounter")).collect(Collectors.toList()).size());
//We assert specifically on content as the deserialized version will "helpfully" fill in missing fields.
assertThat(responseContent, containsString("\"error\" : [ ]"));
}
}
@Test
public void export_shouldNotExportBinaryResource_whenTypeParameterOmitted() throws IOException {
Patient patient = new Patient();
patient.setId("Pat-1");
myClient.update().resource(patient).execute();
Binary binary = new Binary();
binary.setId("Bin-1");
myClient.update().resource(binary).execute();
HttpGet httpGet = new HttpGet(myClient.getServerBase() + "/$export");
httpGet.addHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC);
String pollingLocation;
try (CloseableHttpResponse status = ourHttpClient.execute(httpGet)) {
Header[] headers = status.getHeaders("Content-Location");
pollingLocation = headers[0].getValue();
}
String jobId = getJobIdFromPollingLocation(pollingLocation);
myBatch2JobHelper.awaitJobCompletion(jobId);
HttpGet statusGet = new HttpGet(pollingLocation);
String expectedOriginalUrl = myClient.getServerBase() + "/$export";
try (CloseableHttpResponse status = ourHttpClient.execute(statusGet)) {
String responseContent = IOUtils.toString(status.getEntity().getContent(), StandardCharsets.UTF_8);
BulkExportResponseJson result = JsonUtil.deserialize(responseContent, BulkExportResponseJson.class);
assertThat(result.getRequest(), is(equalTo(expectedOriginalUrl)));
assertThat(result.getRequiresAccessToken(), is(equalTo(true)));
assertThat(result.getTransactionTime(), is(notNullValue()));
assertEquals(result.getOutput().size(), 1);
assertEquals(1, result.getOutput().stream().filter(o -> o.getType().equals("Patient")).collect(Collectors.toList()).size());
assertEquals(0, result.getOutput().stream().filter(o -> o.getType().equals("Binary")).collect(Collectors.toList()).size());
//We assert specifically on content as the deserialized version will "helpfully" fill in missing fields.
assertThat(responseContent, containsString("\"error\" : [ ]"));
}
}
}
private String submitBulkExportForTypes(String... theTypes) throws IOException {
String typeString = String.join(",", theTypes);
HttpGet httpGet = new HttpGet(myClient.getServerBase() + "/$export?_type=" + typeString);
httpGet.addHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC);
String pollingLocation;
try (CloseableHttpResponse status = ourHttpClient.execute(httpGet)) {
Header[] headers = status.getHeaders("Content-Location");
pollingLocation = headers[0].getValue();
}
return pollingLocation;
}
@Nested
@@ -233,7 +391,6 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
}
@Nested
public class PatientBulkExportTests {


@@ -41,6 +41,8 @@ import javax.persistence.EntityManager;
import java.util.List;
import static org.junit.jupiter.api.Assertions.assertEquals;


@@ -24,6 +24,7 @@ import ca.uhn.fhir.rest.server.interceptor.consent.IConsentService;
import ca.uhn.fhir.util.BundleUtil;
import ca.uhn.fhir.util.StopWatch;
import ca.uhn.fhir.util.UrlUtil;
import ca.uhn.hapi.converters.server.VersionedApiConverterInterceptor;
import com.google.common.base.Charsets;
import com.google.common.collect.Lists;
import org.apache.commons.collections4.ListUtils;
@@ -282,6 +283,21 @@ public class ConsentInterceptorResourceProviderR4Test extends BaseResourceProviderR4Test {
});
}
@Test
public void testConsentWorksWithVersionedApiConverterInterceptor() {
myConsentInterceptor = new ConsentInterceptor(new IConsentService() {
});
ourRestServer.getInterceptorService().registerInterceptor(myConsentInterceptor);
ourRestServer.getInterceptorService().registerInterceptor(new VersionedApiConverterInterceptor());
myClient.create().resource(new Patient().setGender(Enumerations.AdministrativeGender.MALE).addName(new HumanName().setFamily("1"))).execute();
Bundle response = myClient.search().forResource(Patient.class).count(1).accept("application/fhir+json; fhirVersion=3.0").returnBundle(Bundle.class).execute();
assertEquals(1, response.getEntry().size());
assertNull(response.getTotalElement().getValue());
}
@Test
public void testHistoryAndBlockSome() {
create50Observations();


@@ -10,6 +10,7 @@ import org.hl7.fhir.r4.model.Bundle;
import org.hl7.fhir.r4.model.Bundle.BundleEntryComponent;
import org.hl7.fhir.r4.model.Bundle.BundleType;
import org.hl7.fhir.r4.model.Bundle.HTTPVerb;
import org.hl7.fhir.r4.model.CarePlan;
import org.hl7.fhir.r4.model.Condition;
import org.hl7.fhir.r4.model.Enumerations.AdministrativeGender;
import org.hl7.fhir.r4.model.OperationOutcome;
@@ -331,4 +332,25 @@ public class ResourceProviderR4BundleTest extends BaseResourceProviderR4Test {
return ids;
}
@Test
void testTransactionBundleEntryUri() {
CarePlan carePlan = new CarePlan();
carePlan.getText().setDivAsString("A CarePlan");
carePlan.setId("ACarePlan");
myClient.create().resource(carePlan).execute();
// GET CarePlans from server
Bundle bundle = myClient.search()
.byUrl(ourServerBase + "/CarePlan")
.returnBundle(Bundle.class).execute();
// Create and populate list of CarePlans
List<CarePlan> carePlans = new ArrayList<>();
bundle.getEntry().forEach(entry -> carePlans.add((CarePlan) entry.getResource()));
// Post CarePlans should not get: HAPI-2006: Unable to perform PUT, URL provided is invalid...
myClient.transaction().withResources(carePlans).execute();
}
}


@@ -10,7 +10,6 @@ import ca.uhn.fhir.jpa.model.entity.ResourceHistoryTable;
import ca.uhn.fhir.jpa.model.util.JpaConstants;
import ca.uhn.fhir.jpa.model.util.UcumServiceUtil;
import ca.uhn.fhir.jpa.search.SearchCoordinatorSvcImpl;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.jpa.term.ZipCollectionBuilder;
import ca.uhn.fhir.jpa.test.config.TestR4Config;
import ca.uhn.fhir.jpa.util.QueryParameterUtils;
@@ -25,7 +24,6 @@ import ca.uhn.fhir.rest.api.MethodOutcome;
import ca.uhn.fhir.rest.api.PreferReturnEnum;
import ca.uhn.fhir.rest.api.SearchTotalModeEnum;
import ca.uhn.fhir.rest.api.SummaryEnum;
import ca.uhn.fhir.rest.api.server.IBundleProvider;
import ca.uhn.fhir.rest.client.apache.ResourceEntity;
import ca.uhn.fhir.rest.client.api.IClientInterceptor;
import ca.uhn.fhir.rest.client.api.IGenericClient;
@@ -41,7 +39,6 @@ import ca.uhn.fhir.rest.param.ParamPrefixEnum;
import ca.uhn.fhir.rest.param.StringAndListParam;
import ca.uhn.fhir.rest.param.StringOrListParam;
import ca.uhn.fhir.rest.param.StringParam;
import ca.uhn.fhir.rest.param.TokenParam;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.rest.server.exceptions.PreconditionFailedException;
@@ -71,7 +68,6 @@ import org.apache.http.entity.ContentType;
import org.apache.http.entity.StringEntity;
import org.apache.http.message.BasicNameValuePair;
import org.apache.http.util.EntityUtils;
import org.apache.jena.rdf.model.ModelCon;
import org.hamcrest.Matchers;
import org.hl7.fhir.common.hapi.validation.validator.FhirInstanceValidator;
import org.hl7.fhir.instance.model.api.IAnyResource;
@@ -132,7 +128,6 @@ import org.hl7.fhir.r4.model.Organization;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.Patient;
import org.hl7.fhir.r4.model.Period;
import org.hl7.fhir.r4.model.Person;
import org.hl7.fhir.r4.model.Practitioner;
import org.hl7.fhir.r4.model.Procedure;
import org.hl7.fhir.r4.model.Quantity;
@@ -169,6 +164,7 @@ import org.springframework.transaction.support.TransactionCallbackWithoutResult;
import org.springframework.transaction.support.TransactionTemplate;
import javax.annotation.Nonnull;
import javax.sql.DataSource;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
@@ -3285,7 +3281,7 @@ public class ResourceProviderR4Test extends BaseResourceProviderR4Test {
assertEquals(1, ((Patient) history.getEntry().get(0).getResource()).getName().size());
assertEquals(HTTPVerb.DELETE, history.getEntry().get(1).getRequest().getMethodElement().getValue());
assertEquals("http://localhost:" + ourPort + "/fhir/context/Patient/" + id.getIdPart() + "/_history/2", history.getEntry().get(1).getRequest().getUrl());
assertEquals("Patient/" + id.getIdPart() + "/_history/2", history.getEntry().get(1).getRequest().getUrl());
assertEquals(null, history.getEntry().get(1).getResource());
assertEquals(id.withVersion("1").getValue(), history.getEntry().get(2).getResource().getId());


@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -13,20 +13,17 @@ import com.google.common.base.Charsets;
import org.apache.commons.io.IOUtils;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.ContentType;
import org.apache.http.entity.StringEntity;
import org.hamcrest.Matchers;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4b.model.Bundle;
import org.hl7.fhir.r4b.model.Bundle.BundleEntryComponent;
import org.hl7.fhir.r4b.model.CarePlan;
import org.hl7.fhir.r4b.model.CodeableConcept;
import org.hl7.fhir.r4b.model.Condition;
import org.hl7.fhir.r4b.model.DateTimeType;
import org.hl7.fhir.r4b.model.MedicationRequest;
import org.hl7.fhir.r4b.model.Observation;
import org.hl7.fhir.r4b.model.Observation.ObservationComponentComponent;
import org.hl7.fhir.r4b.model.OperationOutcome;
import org.hl7.fhir.r4b.model.Organization;
import org.hl7.fhir.r4b.model.Parameters;
import org.hl7.fhir.r4b.model.Patient;
@@ -464,6 +461,28 @@ public class ResourceProviderR4BTest extends BaseResourceProviderR4BTest {
assertThat(ids, Matchers.not(hasItem(o2Id)));
}
@Test
void testTransactionBundleEntryUri() {
CarePlan carePlan = new CarePlan();
carePlan.getText().setDivAsString("A CarePlan");
carePlan.setId("ACarePlan");
myClient.create().resource(carePlan).execute();
// GET CarePlans from server
Bundle bundle = myClient.search()
.byUrl(ourServerBase + "/CarePlan")
.returnBundle(Bundle.class).execute();
// Create and populate list of CarePlans
List<CarePlan> carePlans = new ArrayList<>();
bundle.getEntry().forEach(entry -> carePlans.add((CarePlan) entry.getResource()));
// Post CarePlans should not get: HAPI-2006: Unable to perform PUT, URL provided is invalid...
myClient.transaction().withResources(carePlans).execute();
}
private IIdType createOrganization(String methodName, String s) {
Organization o1 = new Organization();
o1.setName(methodName + s);


@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -19,6 +19,7 @@ import org.hamcrest.Matchers;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r5.model.Bundle;
import org.hl7.fhir.r5.model.Bundle.BundleEntryComponent;
import org.hl7.fhir.r5.model.CarePlan;
import org.hl7.fhir.r5.model.CodeableConcept;
import org.hl7.fhir.r5.model.Condition;
import org.hl7.fhir.r5.model.DateTimeType;
@@ -481,6 +482,27 @@ public class ResourceProviderR5Test extends BaseResourceProviderR5Test {
assertThat(ids, Matchers.not(hasItem(o2Id)));
}
@Test
void testTransactionBundleEntryUri() {
CarePlan carePlan = new CarePlan();
carePlan.getText().setDivAsString("A CarePlan");
carePlan.setId("ACarePlan");
myClient.create().resource(carePlan).execute();
// GET CarePlans from server
Bundle bundle = myClient.search()
.byUrl(ourServerBase + "/CarePlan")
.returnBundle(Bundle.class).execute();
// Create and populate list of CarePlans
List<CarePlan> carePlans = new ArrayList<>();
bundle.getEntry().forEach(entry -> carePlans.add((CarePlan) entry.getResource()));
// Post CarePlans should not get: HAPI-2006: Unable to perform PUT, URL provided is invalid...
myClient.transaction().withResources(carePlans).execute();
}
private IIdType createOrganization(String methodName, String s) {
Organization o1 = new Organization();
o1.setName(methodName + s);


@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -7,7 +7,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -24,6 +24,7 @@ public class AuthorizationConstants {
public static final int ORDER_CONSENT_INTERCEPTOR = 100;
public static final int ORDER_AUTH_INTERCEPTOR = 200;
public static final int ORDER_CONVERTER_INTERCEPTOR = 300;
private AuthorizationConstants() {
super();


@@ -369,17 +369,15 @@ public abstract class BaseResourceReturningMethodBinding extends BaseMethodBinding {
count = result.preferredPageSize();
}
Integer offsetI = RestfulServerUtils.tryToExtractNamedParameter(theRequest, Constants.PARAM_PAGINGOFFSET);
Integer offset = RestfulServerUtils.tryToExtractNamedParameter(theRequest, Constants.PARAM_PAGINGOFFSET);
if (offsetI == null || offsetI < 0) {
if (offset == null || offset < 0) {
offsetI = 0;
offset = 0;
}
Integer resultSize = result.size();
int start;
int start = offset;
if (resultSize != null) {
start = Math.max(0, Math.min(offsetI, resultSize - 1));
start = Math.max(0, Math.min(offset, resultSize));
} else {
start = offsetI;
}
ResponseEncoding responseEncoding = RestfulServerUtils.determineResponseEncodingNoDefault(theRequest, theServer.getDefaultResponseEncoding());
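For context, a minimal sketch (not part of the patch; class and method names are invented for illustration) of what the tightened clamping above means for paging: with a known result size, the start index is now capped at the size itself, so a page offset at or beyond the end yields an empty page instead of re-serving the last resource.

// Illustrative sketch only - mirrors the new clamping rule shown above.
public class OffsetClampSketch {
	static int clampStart(int offset, Integer resultSize) {
		// New rule: Math.min(offset, resultSize); previously Math.min(offset, resultSize - 1)
		return resultSize != null ? Math.max(0, Math.min(offset, resultSize)) : offset;
	}
	public static void main(String[] args) {
		System.out.println(clampStart(10, 10)); // 10 -> start == size, so the page is empty
		System.out.println(clampStart(5, 10));  // 5  -> normal paging from the sixth result
	}
}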


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-spring-boot-samples</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
</parent>
<artifactId>hapi-fhir-spring-boot-sample-client-apache</artifactId>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-spring-boot-samples</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
</parent>
<artifactId>hapi-fhir-spring-boot-sample-client-okhttp</artifactId>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-spring-boot-samples</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
</parent>
<artifactId>hapi-fhir-spring-boot-sample-server-jersey</artifactId>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-spring-boot</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
</parent>
<artifactId>hapi-fhir-spring-boot-samples</artifactId>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -21,18 +21,25 @@ package ca.uhn.fhir.jpa.migrate;
 */
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.jpa.migrate.entity.HapiMigrationEntity;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Optional;
import java.util.UUID;
import static org.apache.commons.lang3.StringUtils.isBlank;
/**
 * The approach used in this class is borrowed from org.flywaydb.community.database.ignite.thin.IgniteThinDatabase
 */
public class HapiMigrationLock implements AutoCloseable {
static final Integer LOCK_PID = -100;
private static final Logger ourLog = LoggerFactory.getLogger(HapiMigrationLock.class);
public static final int SLEEP_MILLIS_BETWEEN_LOCK_RETRIES = 1000;
public static final int MAX_RETRY_ATTEMPTS = 50;
public static final int DEFAULT_MAX_RETRY_ATTEMPTS = 50;
public static int ourMaxRetryAttempts = DEFAULT_MAX_RETRY_ATTEMPTS;
public static final String CLEAR_LOCK_TABLE_WITH_DESCRIPTION = "CLEAR_LOCK_TABLE_WITH_DESCRIPTION";
private final String myLockDescription = UUID.randomUUID().toString();
@@ -47,6 +54,7 @@ public class HapiMigrationLock implements AutoCloseable {
}
private void lock() {
cleanLockTableIfRequested();
int retryCount = 0;
do {
@@ -55,23 +63,57 @@ public class HapiMigrationLock implements AutoCloseable {
return;
}
retryCount++;
ourLog.info("Waiting for lock on " + this);
if (retryCount < ourMaxRetryAttempts) {
ourLog.info("Waiting for lock on {}. Retry {}/{}", myMigrationStorageSvc.getMigrationTablename(), retryCount, ourMaxRetryAttempts);
Thread.sleep(SLEEP_MILLIS_BETWEEN_LOCK_RETRIES);
}
} catch (InterruptedException ex) {
// Ignore - if interrupted, we still need to wait for lock to become available
}
} while (retryCount < MAX_RETRY_ATTEMPTS);
} while (retryCount < ourMaxRetryAttempts);
throw new HapiMigrationException(Msg.code(2153) + "Unable to obtain table lock - another database migration may be running. If no " +
String message = "Unable to obtain table lock - another database migration may be running. If no " +
"other database migration is running, then the previous migration did not shut down properly and the " +
"lock record needs to be deleted manually. The lock record is located in the " + myMigrationStorageSvc.getMigrationTablename() + " table with " +
"INSTALLED_RANK = " + HapiMigrationStorageSvc.LOCK_PID);
"INSTALLED_RANK = " + LOCK_PID;
Optional<HapiMigrationEntity> otherLockFound = myMigrationStorageSvc.findFirstByPidAndNotDescription(LOCK_PID, myLockDescription);
if (otherLockFound.isPresent()) {
message += " and DESCRIPTION = " + otherLockFound.get().getDescription();
}
throw new HapiMigrationException(Msg.code(2153) + message);
}
/**
*
* @return whether a lock record was successfully deleted
*/
boolean cleanLockTableIfRequested() {
String description = System.getProperty(CLEAR_LOCK_TABLE_WITH_DESCRIPTION);
if (isBlank(description)) {
description = System.getenv(CLEAR_LOCK_TABLE_WITH_DESCRIPTION);
}
if (isBlank(description)) {
return false;
}
ourLog.info("Repairing lock table. Removing row in " + myMigrationStorageSvc.getMigrationTablename() + " with INSTALLED_RANK = " + LOCK_PID + " and DESCRIPTION = " + description);
boolean result = myMigrationStorageSvc.deleteLockRecord(description);
if (result) {
ourLog.info("Successfully removed lock record");
} else {
ourLog.info("No lock record found");
}
return result;
} }
private boolean insertLockingRow() {
try {
return myMigrationStorageSvc.insertLockRecord(myLockDescription);
} catch (Exception e) {
ourLog.debug("Failed to insert lock record: {}", e.getMessage());
return false;
}
}
@@ -83,4 +125,8 @@ public class HapiMigrationLock implements AutoCloseable {
ourLog.error("Failed to delete migration lock record for description = [{}]", myLockDescription);
}
}
public static void setMaxRetryAttempts(int theMaxRetryAttempts) {
ourMaxRetryAttempts = theMaxRetryAttempts;
}
}
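For reference, a short sketch (illustrative only; the value shown is a placeholder for the DESCRIPTION reported in the HAPI-2153 error message) of how the new CLEAR_LOCK_TABLE_WITH_DESCRIPTION escape hatch might be used to clear a stale lock row before starting a migration:

// Illustrative sketch only. The property value must match the stale lock row's DESCRIPTION,
// which the HAPI-2153 error message now includes; an environment variable of the same name also works.
public class ClearStaleMigrationLockSketch {
	public static void main(String[] args) {
		String staleLockDescription = "REPLACE-WITH-UUID-FROM-ERROR-MESSAGE"; // placeholder
		System.setProperty("CLEAR_LOCK_TABLE_WITH_DESCRIPTION", staleLockDescription);
		// Start the migrator/application as usual; cleanLockTableIfRequested() deletes the
		// matching lock row (INSTALLED_RANK = -100) before the lock is re-acquired.
	}
}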


@@ -32,7 +32,6 @@ import java.util.Set;
public class HapiMigrationStorageSvc {
public static final String UNKNOWN_VERSION = "unknown";
private static final String LOCK_TYPE = "hapi-fhir-lock";
static final Integer LOCK_PID = -100;
private final HapiMigrationDao myHapiMigrationDao;
@@ -104,11 +103,11 @@ public class HapiMigrationStorageSvc {
verifyNoOtherLocksPresent(theLockDescription);
// Remove the locking row
return myHapiMigrationDao.deleteLockRecord(LOCK_PID, theLockDescription);
return myHapiMigrationDao.deleteLockRecord(HapiMigrationLock.LOCK_PID, theLockDescription);
}
void verifyNoOtherLocksPresent(String theLockDescription) {
Optional<HapiMigrationEntity> otherLockFound = myHapiMigrationDao.findFirstByPidAndNotDescription(LOCK_PID, theLockDescription);
Optional<HapiMigrationEntity> otherLockFound = myHapiMigrationDao.findFirstByPidAndNotDescription(HapiMigrationLock.LOCK_PID, theLockDescription);
// Check that there are no other locks in place. This should not happen!
if (otherLockFound.isPresent()) {
@@ -118,7 +117,7 @@ public class HapiMigrationStorageSvc {
public boolean insertLockRecord(String theLockDescription) {
HapiMigrationEntity entity = new HapiMigrationEntity();
entity.setPid(LOCK_PID);
entity.setPid(HapiMigrationLock.LOCK_PID);
entity.setType(LOCK_TYPE);
entity.setDescription(theLockDescription);
entity.setExecutionTime(0);
@@ -126,4 +125,8 @@ public class HapiMigrationStorageSvc {
return myHapiMigrationDao.save(entity);
}
public Optional<HapiMigrationEntity> findFirstByPidAndNotDescription(Integer theLockPid, String theLockDescription) {
return myHapiMigrationDao.findFirstByPidAndNotDescription(theLockPid, theLockDescription);
}
}


@@ -1,7 +1,9 @@
package ca.uhn.fhir.jpa.migrate;
import ca.uhn.fhir.interceptor.api.HookParams;
import ca.uhn.fhir.jpa.migrate.dao.HapiMigrationDao;
import ca.uhn.fhir.jpa.migrate.taskdef.BaseTask;
import ca.uhn.fhir.jpa.migrate.taskdef.NopTask;
import ca.uhn.test.concurrency.IPointcutLatch;
import ca.uhn.test.concurrency.PointcutLatch;
import org.apache.commons.dbcp2.BasicDataSource;
@@ -16,6 +18,7 @@ import org.springframework.jdbc.core.JdbcTemplate;
import javax.annotation.Nonnull;
import java.util.List;
import java.util.UUID;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
@@ -25,6 +28,7 @@ import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.hasSize;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.junit.jupiter.api.Assertions.fail;
class HapiMigratorIT {
private static final Logger ourLog = LoggerFactory.getLogger(HapiMigratorIT.class);
@@ -32,6 +36,7 @@ class HapiMigratorIT {
private final BasicDataSource myDataSource = BaseMigrationTest.getDataSource();
private final JdbcTemplate myJdbcTemplate = new JdbcTemplate(myDataSource);
private HapiMigrationStorageSvc myMigrationStorageSvc;
@BeforeEach
void before() {
@@ -39,12 +44,17 @@ class HapiMigratorIT {
migrator.createMigrationTableIfRequired();
Integer count = myJdbcTemplate.queryForObject("SELECT COUNT(*) FROM " + MIGRATION_TABLENAME, Integer.class);
assertTrue(count > 0);
HapiMigrationDao migrationDao = new HapiMigrationDao(myDataSource, DriverTypeEnum.H2_EMBEDDED, MIGRATION_TABLENAME);
myMigrationStorageSvc = new HapiMigrationStorageSvc(migrationDao);
}
@AfterEach
void after() {
myJdbcTemplate.execute("DROP TABLE " + MIGRATION_TABLENAME);
assertEquals(0, myDataSource.getNumActive());
HapiMigrationLock.setMaxRetryAttempts(HapiMigrationLock.DEFAULT_MAX_RETRY_ATTEMPTS);
System.clearProperty(HapiMigrationLock.CLEAR_LOCK_TABLE_WITH_DESCRIPTION);
}
@Test
@@ -76,8 +86,7 @@ class HapiMigratorIT {
LatchMigrationTask latchMigrationTask2 = new LatchMigrationTask("second new", "2");
LatchMigrationTask latchMigrationTask3 = new LatchMigrationTask("third repeat", "1");
HapiMigrator migrator2 = buildMigrator(latchMigrationTask2);
HapiMigrator migrator2 = buildMigrator(latchMigrationTask2, latchMigrationTask3);
migrator2.addTask(latchMigrationTask3);
// We only expect the first migration to run because the second one will block on the lock and by the time the lock
// is released, the first one will have already run so there will be nothing to do
@@ -108,10 +117,77 @@ class HapiMigratorIT {
assertThat(result2.succeededTasks, hasSize(1));
}
@Test
void test_twoSequentialCalls_noblock() throws InterruptedException, ExecutionException {
ExecutorService executor = Executors.newSingleThreadExecutor();
LatchMigrationTask latchMigrationTask = new LatchMigrationTask("first", "1");
HapiMigrator migrator = buildMigrator(latchMigrationTask);
assertEquals(0, countLockRecords());
{
latchMigrationTask.setExpectedCount(1);
Future<MigrationResult> future = executor.submit(() -> migrator.migrate());
latchMigrationTask.awaitExpected();
assertEquals(1, countLockRecords());
latchMigrationTask.release("1");
MigrationResult result = future.get();
assertEquals(0, countLockRecords());
assertThat(result.succeededTasks, hasSize(1));
}
{
Future<MigrationResult> future = executor.submit(() -> migrator.migrate());
MigrationResult result = future.get();
assertEquals(0, countLockRecords());
assertThat(result.succeededTasks, hasSize(0));
}
}
@Test
void test_oldLockFails_block() {
HapiMigrationLock.setMaxRetryAttempts(0);
String description = UUID.randomUUID().toString();
HapiMigrator migrator = buildMigrator();
myMigrationStorageSvc.insertLockRecord(description);
try {
migrator.migrate();
fail();
} catch (HapiMigrationException e) {
assertEquals("HAPI-2153: Unable to obtain table lock - another database migration may be running. If no other database migration is running, then the previous migration did not shut down properly and the lock record needs to be deleted manually. The lock record is located in the TEST_MIGRATOR_TABLE table with INSTALLED_RANK = -100 and DESCRIPTION = " + description, e.getMessage());
}
}
@Test
void test_oldLockWithSystemProperty_cleared() {
HapiMigrationLock.setMaxRetryAttempts(0);
String description = UUID.randomUUID().toString();
HapiMigrator migrator = buildMigrator(new NopTask("1", "1"));
myMigrationStorageSvc.insertLockRecord(description);
System.setProperty(HapiMigrationLock.CLEAR_LOCK_TABLE_WITH_DESCRIPTION, description);
MigrationResult result = migrator.migrate();
assertThat(result.succeededTasks, hasSize(1));
}
private int countLockRecords() {
return myJdbcTemplate.queryForObject("SELECT COUNT(*) FROM " + MIGRATION_TABLENAME + " WHERE \"installed_rank\" = " + HapiMigrationLock.LOCK_PID, Integer.class);
}
@Nonnull
private HapiMigrator buildMigrator(LatchMigrationTask theLatchMigrationTask) {
private HapiMigrator buildMigrator(BaseTask... theTasks) {
HapiMigrator retval = buildMigrator();
retval.addTask(theLatchMigrationTask);
for (BaseTask next : theTasks) {
retval.addTask(next);
}
return retval;
}


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>
<modelVersion>4.0.0</modelVersion>


@@ -43,11 +43,8 @@ public class BulkExportJobParametersValidator implements IJobParametersValidator
List<String> errorMsgs = new ArrayList<>();
// initial validation
List<String> resourceTypes = theParameters.getResourceTypes();
if (theParameters.getResourceTypes() == null || theParameters.getResourceTypes().isEmpty()) {
if (resourceTypes != null && !resourceTypes.isEmpty()) {
errorMsgs.add("Resource Types are required for an export job.");
}
else {
for (String resourceType : theParameters.getResourceTypes()) {
if (resourceType.equalsIgnoreCase("Binary")) {
errorMsgs.add("Bulk export of Binary resources is forbidden");


@@ -31,6 +31,7 @@ import ca.uhn.fhir.batch2.jobs.export.models.BulkExportJobParameters;
import ca.uhn.fhir.batch2.jobs.models.Id;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.jpa.api.config.DaoConfig;
import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.jpa.bulk.export.api.IBulkExportProcessor;
import ca.uhn.fhir.jpa.bulk.export.model.ExportPIDIteratorParameters;
import ca.uhn.fhir.rest.api.server.storage.ResourcePersistentId;


@@ -130,7 +130,7 @@ public class BulkExportJobParametersValidatorTest {
}
@Test
public void validate_omittedResourceTypes_returnsErrorMessages() {
public void validate_omittedResourceTypes_returnsNoErrorMessages() {
// setup
BulkExportJobParameters parameters = createSystemExportParameters();
parameters.setResourceTypes(null);
@@ -140,8 +140,7 @@ public class BulkExportJobParametersValidatorTest {
// verify
assertNotNull(results);
assertEquals(1, results.size());
assertEquals(0, results.size());
assertTrue(results.contains("Resource Types are required for an export job."));
}
@Test


@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -96,14 +96,9 @@ public class JobCoordinatorImpl implements IJobCoordinator {
if (isBlank(paramsString)) {
throw new InvalidRequestException(Msg.code(2065) + "No parameters supplied");
}
// if cache - use that first
if (theStartRequest.isUseCache()) {
FetchJobInstancesRequest request = new FetchJobInstancesRequest(theStartRequest.getJobDefinitionId(), theStartRequest.getParameters(),
FetchJobInstancesRequest request = new FetchJobInstancesRequest(theStartRequest.getJobDefinitionId(), theStartRequest.getParameters(), getStatesThatTriggerCache());
StatusEnum.QUEUED,
StatusEnum.IN_PROGRESS,
StatusEnum.COMPLETED
);
List<JobInstance> existing = myJobPersistence.fetchInstances(request, 0, 1000);
if (!existing.isEmpty()) {
@@ -142,6 +137,13 @@ public class JobCoordinatorImpl implements IJobCoordinator {
return response;
}
/**
* Cache will be used if an identical job is QUEUED or IN_PROGRESS. Otherwise a new one will be kicked off.
*/
private StatusEnum[] getStatesThatTriggerCache() {
return new StatusEnum[]{StatusEnum.QUEUED, StatusEnum.IN_PROGRESS};
}
@Override
public JobInstance getInstance(String theInstanceId) {
return myJobQuerySvc.fetchInstance(theInstanceId);


@@ -42,6 +42,7 @@ import java.util.Arrays;
import java.util.Optional;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotEquals;
import static org.junit.jupiter.api.Assertions.assertNull;
import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.junit.jupiter.api.Assertions.fail;
@@ -151,9 +152,10 @@ public class JobCoordinatorImplTest extends BaseBatch2Test {
}
@Test
public void startInstance_usingExistingCache_returnsExistingJobFirst() {
public void startInstance_usingExistingCache_returnsExistingIncompleteJobFirst() {
// setup
String instanceId = "completed-id";
String completedInstanceId = "completed-id";
String inProgressInstanceId = "someId";
JobInstanceStartRequest startRequest = new JobInstanceStartRequest();
startRequest.setJobDefinitionId(JOB_DEFINITION_ID);
startRequest.setUseCache(true);
@@ -162,32 +164,32 @@ public class JobCoordinatorImplTest extends BaseBatch2Test {
JobDefinition<?> def = createJobDefinition();
JobInstance existingInProgInstance = createInstance();
existingInProgInstance.setInstanceId("someId");
existingInProgInstance.setInstanceId(inProgressInstanceId);
existingInProgInstance.setStatus(StatusEnum.IN_PROGRESS);
JobInstance existingCompletedInstance = createInstance();
existingCompletedInstance.setStatus(StatusEnum.COMPLETED);
existingCompletedInstance.setInstanceId(instanceId);
existingCompletedInstance.setInstanceId(completedInstanceId);
// when
when(myJobDefinitionRegistry.getLatestJobDefinition(eq(JOB_DEFINITION_ID)))
.thenReturn(Optional.of(def));
when(myJobInstancePersister.fetchInstances(any(FetchJobInstancesRequest.class), anyInt(), anyInt()))
.thenReturn(Arrays.asList(existingInProgInstance, existingCompletedInstance));
.thenReturn(Arrays.asList(existingInProgInstance));
// test
Batch2JobStartResponse startResponse = mySvc.startInstance(startRequest);
// verify
assertEquals(instanceId, startResponse.getJobId()); // make sure it's the completed one
assertEquals(inProgressInstanceId, startResponse.getJobId()); // make sure it's the completed one
assertTrue(startResponse.isUsesCachedResult());
ArgumentCaptor<FetchJobInstancesRequest> requestArgumentCaptor = ArgumentCaptor.forClass(FetchJobInstancesRequest.class);
verify(myJobInstancePersister)
.fetchInstances(requestArgumentCaptor.capture(), anyInt(), anyInt());
FetchJobInstancesRequest req = requestArgumentCaptor.getValue();
assertEquals(3, req.getStatuses().size());
assertEquals(2, req.getStatuses().size());
assertTrue(
req.getStatuses().contains(StatusEnum.COMPLETED)
req.getStatuses().contains(StatusEnum.IN_PROGRESS)
&& req.getStatuses().contains(StatusEnum.IN_PROGRESS)
&& req.getStatuses().contains(StatusEnum.QUEUED)
);
}


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>
<modelVersion>4.0.0</modelVersion>


@@ -51,6 +51,6 @@ public class MdmBatch2Config {
JobDefinition clearJobDefinition = myApplicationContext.getBean(MDM_CLEAR_JOB_BEAN_NAME, JobDefinition.class);
myJobDefinitionRegistry.addJobDefinitionIfNotRegistered(clearJobDefinition);
JobDefinition submitJobDefinition = myApplicationContext.getBean(MDM_SUBMIT_JOB_BEAN_NAME, JobDefinition.class);
myJobDefinitionRegistry.addJobDefinitionIfNotRegistered(clearJobDefinition);
myJobDefinitionRegistry.addJobDefinitionIfNotRegistered(submitJobDefinition);
}
}


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>
<modelVersion>4.0.0</modelVersion>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -26,6 +26,7 @@ import ca.uhn.fhir.interceptor.api.HookParams;
import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.jpa.api.config.DaoConfig;
import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.jpa.api.model.Batch2JobInfo;
import ca.uhn.fhir.jpa.api.model.Batch2JobOperationResult;
import ca.uhn.fhir.jpa.api.model.BulkExportJobResults;
@@ -68,6 +69,7 @@ import org.springframework.beans.factory.annotation.Autowired;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Date;
@@ -100,6 +102,9 @@ public class BulkDataExportProvider {
@Autowired
private DaoConfig myDaoConfig;
@Autowired
private DaoRegistry myDaoRegistry;
/**
 * $export
 */
@@ -136,6 +141,13 @@ public class BulkDataExportProvider {
// Set the original request URL as part of the job information, as this is used in the poll-status-endpoint, and is needed for the report.
parameters.setOriginalRequestUrl(theRequestDetails.getCompleteUrl());
// If no _type parameter is provided, default to all resource types except Binary
if (theOptions.getResourceTypes() == null || theOptions.getResourceTypes().isEmpty()) {
List<String> resourceTypes = new ArrayList<>(myDaoRegistry.getRegisteredDaoTypes());
resourceTypes.remove("Binary");
parameters.setResourceTypes(resourceTypes);
}
// start job
Batch2JobStartResponse response = myJobRunner.startNewJob(parameters);
@@ -486,4 +498,9 @@ public class BulkDataExportProvider {
public void setDaoConfig(DaoConfig theDaoConfig) {
myDaoConfig = theDaoConfig;
}
@VisibleForTesting
public void setDaoRegistry(DaoRegistry theDaoRegistry) {
myDaoRegistry = theDaoRegistry;
}
}
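As a usage illustration (assumptions: a server at http://localhost:8000/fhir with this provider registered, plain Apache HttpClient), a system-level $export kicked off without _type now behaves as if every registered resource type except Binary had been requested:

// Illustrative sketch only - shows the new default-_type behaviour from a client's point of view.
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

public class SystemExportKickoffSketch {
	public static void main(String[] args) throws Exception {
		try (CloseableHttpClient client = HttpClients.createDefault()) {
			HttpGet kickOff = new HttpGet("http://localhost:8000/fhir/$export"); // no _type parameter
			kickOff.addHeader("Prefer", "respond-async");
			try (CloseableHttpResponse response = client.execute(kickOff)) {
				// Expect 202 Accepted; poll the Content-Location header for the job status.
				System.out.println(response.getStatusLine().getStatusCode());
				System.out.println(response.getFirstHeader("Content-Location").getValue());
			}
		}
	}
}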


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -4,7 +4,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -136,7 +136,7 @@ public class Dstu3BundleFactory implements IVersionSpecificBundleFactory {
if (httpVerb != null) {
entry.getRequest().getMethodElement().setValueAsString(httpVerb);
if (id != null) {
entry.getRequest().setUrl(id.getValue());
entry.getRequest().setUrl(id.toUnqualified().getValue());
}
}
if ("DELETE".equals(httpVerb)) {


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -138,7 +138,7 @@ public class R4BundleFactory implements IVersionSpecificBundleFactory {
if (httpVerb != null) {
entry.getRequest().getMethodElement().setValueAsString(httpVerb);
if (id != null) {
entry.getRequest().setUrl(id.getValue());
entry.getRequest().setUrl(id.toUnqualified().getValue());
}
}
if ("DELETE".equals(httpVerb)) {


@@ -146,6 +146,37 @@ public class PagingTest {
}
}
@Test()
public void testSendingSameRequestConsecutivelyResultsInSameResponse() throws Exception {
initBundleProvider(10);
myServerExtension.getRestfulServer().registerProvider(new DummyPatientResourceProvider());
myServerExtension.getRestfulServer().setPagingProvider(pagingProvider);
when(pagingProvider.canStoreSearchResults()).thenReturn(true);
when(pagingProvider.getDefaultPageSize()).thenReturn(10);
when(pagingProvider.getMaximumPageSize()).thenReturn(50);
when(pagingProvider.storeResultList(any(RequestDetails.class), any(IBundleProvider.class))).thenReturn("ABCD");
when(pagingProvider.retrieveResultList(any(RequestDetails.class), anyString())).thenReturn(ourBundleProvider);
String nextLink;
String base = "http://localhost:" + myServerExtension.getPort();
HttpGet get = new HttpGet(base + "/Patient?_getpagesoffset=10");
String responseContent;
try (CloseableHttpResponse resp = ourClient.execute(get)) {
assertEquals(200, resp.getStatusLine().getStatusCode());
responseContent = IOUtils.toString(resp.getEntity().getContent(), Charsets.UTF_8);
Bundle bundle = ourContext.newJsonParser().parseResource(Bundle.class, responseContent);
assertEquals(0, bundle.getEntry().size());
}
try (CloseableHttpResponse resp = ourClient.execute(get)) {
assertEquals(200, resp.getStatusLine().getStatusCode());
responseContent = IOUtils.toString(resp.getEntity().getContent(), Charsets.UTF_8);
Bundle bundle = ourContext.newJsonParser().parseResource(Bundle.class, responseContent);
assertEquals(0, bundle.getEntry().size());
}
}
private void checkParam(String theUri, String theCheckedParam, String theExpectedValue) {
Optional<String> paramValue = URLEncodedUtils.parse(theUri, CHARSET_UTF8).stream()
.filter(nameValuePair -> nameValuePair.getName().equals(theCheckedParam))


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.3.0-PRE2-SNAPSHOT</version>
<version>6.3.0-PRE3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>
