Rel 6 10 mb (#5441)

* Allow cached search with consent active when safe (#5387)

Allow the search cache to be used when consent is active, if it is safe to do so

* Change package installation behavior such that it updates the existing SearchParameter base with the remaining resources (#5376)

* Change package installation behavior such that it updates the existing SearchParameter base with the remaining resources

* Use resourceType in the package installer output to fix tests. Minor change to the resourceType condition. Update the changelog description to make it more readable.

* Transaction with conditional update fails if SearchNarrowingInterceptor is registered and partitioning is enabled (#5389)

* Transaction with conditional update fails if SearchNarrowingInterceptor is registered and partitioning is enabled - implementation

* Reverse chaining searches return an error when invoked with the parameter _lastUpdated. (#5177)

* version bump

* Bump to core release 6.0.22 (#5028)

* Bump to core release 6.0.16

* Bump to core version 6.0.20

* Fix errors thrown as a result of VersionSpecificWorkerContextWrapper

* Bump to core 6.0.22

* Resolve 5126: HFJ_RES_VER_PROV might cause migration errors on databases that automatically index the primary key (#5127)

* dropped old index FK_RESVERPROV_RES_PID on RES_PID column before adding IDX_RESVERPROV_RES_PID

* added changelog

* changed to a valid version number

* changed to a valid version number; changelog entries need to be ordered by version number...

* 5123 - Use DEFAULT partition for server-based requests if none specified (#5124)

5123 - Use DEFAULT partition for server-based requests if none specified

* consent remove-all suppresses the next link in the bundle (#5119)

* added FIXME with source of issue

* added FIXME with root cause

* Providing solution to the issue and removing fixmes.

* Providing changelog

* auto-formatting.

* Adding new test.

* Adding a new test for standard paging

* let's try this and see if it works...?

* fix tests

* cleanup to trigger a new run

* fixing tests

---------

Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* 5117 MDM Score for No Match Fields Should Not Be Included in Total Score (#5118)

* fix, test, changelog

---------

Co-authored-by: justindar <justin.dar@smilecdr.com>

* _source search parameter needs to support modifiers (#5095)

_source search parameter needs to support modifiers - added support for the :contains, :missing, and :above modifiers

* Fix HFQL docs (#5151)

* Expunge operation on CodeSystem may throw 500 internal error with a precondition failed message. (#5156)

* Initial failing test.

* Solution with changelog.

* fixing format.

* Addressing comment from code review.

* fixing failing test.

---------

Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* documentation update (#5154)

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* Fix hsql jdbc driver deps (#5168)

Avoid non-included classes in jdbc driver dependencies.

* $delete-expunge over 10k resources will now delete all resources (#5144)

* First commit with very rough fix and unit test.

* Refinements to ResourceIdListStep and Batch2DaoSvcImpl. Make LoadIdsStepTest pass. Enhance Batch2DaoSvcImplTest.

* Spotless

* Fix checkstyle errors.

* Fix test failures.

* Minor refactoring.  New unit test.  Finalize changelist.

* Spotless fix.

* Delete now useless code from unit test.

* Delete more useless code.

* Test pre-commit hook

* More spotless fixes.

* Address most code review feedback.

* Remove use of pageSize parameter and see if this breaks the pipeline.

* Fix the noUrl case by passing an unlimited Pageable instead. Effectively stop using page size for most databases.

* Deprecate the old method and have it call the new one by default.

* updating documentation (#5170)

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* _source search parameter modifiers for Subscription matching (#5159)

* _source search parameter modifiers for Subscription matching - test, implementation and changelog

* first fix

* tests and preliminary fixes

* wip, commit before switching to release branch.

* adding capability to handle _lastUpdated in reverse search (_has)

* adding changelog

* applying spotless.

* addressing code review comments.

---------

Co-authored-by: tadgh <garygrantgraham@gmail.com>
Co-authored-by: dotasek <david.otasek@smilecdr.com>
Co-authored-by: Steve Corbett <137920358+steve-corbett-smilecdr@users.noreply.github.com>
Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: justindar <justin.dar@smilecdr.com>
Co-authored-by: volodymyr-korzh <132366313+volodymyr-korzh@users.noreply.github.com>
Co-authored-by: Nathan Doef <n.doef@protonmail.com>
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
Co-authored-by: michaelabuckley <michaelabuckley@gmail.com>
Co-authored-by: Luke deGruchy <luke.degruchy@smilecdr.com>

* Br 20231019 add cr settings for cds hooks (#5394)

* Add settings used in CR CDS Services.  Remove config dependency on Spring Boot.

* Add changelog

* Use String.format rather than concat strings

* spotless apply

* Add javadoc

* Upgrade notes for the forced-id change (#5400)

Add upgrade notes for forced-id

* Clean stale search results more aggressively. (#5396)

Use bulk DML statements when cleaning the search cache.
The cleaner job now keeps working until a deadline based on the scheduling frequency.
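For illustration only, a minimal sketch of a deadline-based bulk cleanup loop like the one described above; the repository interface, method names, and batch size are assumptions for this sketch, not the actual HAPI FHIR types:

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.List;

class StaleSearchSweeper {
    private static final int BATCH_SIZE = 500; // assumed batch size

    private final SearchRepository repository; // hypothetical DAO abstraction

    StaleSearchSweeper(SearchRepository repository) {
        this.repository = repository;
    }

    /** Deletes stale searches in bulk until the deadline derived from the scheduling interval passes. */
    void sweep(long scheduleIntervalSeconds) {
        Instant deadline = Instant.now().plus(scheduleIntervalSeconds, ChronoUnit.SECONDS);
        while (Instant.now().isBefore(deadline)) {
            List<Long> staleIds = repository.findStaleSearchIds(BATCH_SIZE);
            if (staleIds.isEmpty()) {
                return; // nothing left to clean
            }
            repository.deleteSearchesInBulk(staleIds); // one bulk DML statement per batch
        }
    }

    /** Hypothetical repository interface used only for this sketch. */
    interface SearchRepository {
        List<Long> findStaleSearchIds(int limit);

        void deleteSearchesInBulk(List<Long> ids);
    }
}
```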

* bump version of clinical reasoning (#5406)

* Transaction fails if SearchNarrowingInterceptor is registered and partitioning is enabled - fix cross-tenant request failures (#5408)

* Transaction with conditional update fails if SearchNarrowingInterceptor is registered and partitioning is enabled - fix and tests added

* removed unused alias from SQL query of mdm-clear (#5416)

* Issue 5418 support Boolean class return type in BaseInterceptorService (#5421)

* Enable child classes to use Boolean class return type

* spotless

---------

Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>

* If AutoInflateBinaries is enabled, binaries are created on the disk only for the first resource entry of the bundle (#5420)

* If AutoInflateBinaries is enabled, binaries created on disk by bundled requests are created only for the first resource entry - fix

* Revert "Issue 5418 support Boolean class return type in BaseInterceptorService (#5421)" (#5423)

This reverts commit 4e295a59fb.

Co-authored-by: Nathan Doef <nathaniel.doef@smilecdr.com>

* Use new FHIR_ID column for sorting (#5405)

* Sort `_id` using new FHIR_ID column.
* Fix old tests that put client-assigned ids first.

* Better indexing for sort

* Bump core to 6.1.2.2 (#5425)

* Bump core to 6.1.2.1

Patch release that uses https for primary org.hl7.fhir.core package server

* Bump core to 6.1.2.2

* Make sure to always return a value for the Boolean class return type. (#5424)

Implement change in a non-disruptive way for overriders

Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>

* Add non-standard __pid SP for breaking ties cheaply during sorts. (#5428)

Add a non-standard __pid SP.

* Review changes for new _pid SP. (#5430)

Change name to _pid to match our standard and add warning.

* Fix VersionCanonicalizer conversion from R5 into DSTU2 for CapabilityStatement, Parameters and StructureDefinition (#5432)

* Fix VersionCanonicalizer conversion from R5 into DSTU2 for CapabilityStatement, Parameters and StructureDefinition.

* Fix spotless issue

* CVEs for 6.10.0 (#5433)

* Bump jetty

* Bump okio-jvm

* 8.2.0 mysql connector

* Jena and elastic bumps

* Fix test

* 5412 POST bundle on partition shows incorrect response.link (#5413)

* Initial fix and unit test provided

* spotless check

* Made relevant changes to make solution version agnostic

* relevant logic changes made

* spotless changes made

* New logic added to fix failing test cases

* formatting

* New logic to make the function more robust

* spotless checks

* Left a trailing slash in the tests

* Made relevant test changes and changed logic

* spotless changes

* Update hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/6_10_0/5412-during-partition-fullUrl-not-shown-in-response.yaml

changing changelog

Co-authored-by: volodymyr-korzh <132366313+volodymyr-korzh@users.noreply.github.com>

* Formatting requirements

---------

Co-authored-by: volodymyr-korzh <132366313+volodymyr-korzh@users.noreply.github.com>

* Resolve: we don't have guaranteed subscription delivery if a resource is too large (#5414)

* first fix

* - added the ability to handle null payload to SubscriptionDeliveringMessageSubscriber and SubscriptionDeliveringEmailSubscriber
- refactored code to reduce repeated code
- cleaned unnecessary comments and reformatted files

* Changed myResourceModifiedMessagePersistenceSvc to be autowired

* removed unused import

* added error handling when inflating the message to email and message subscriber

* reformatted code

* Fixing subscription tests with mocked IResourceModifiedMessagePersistenceSvc

* Changes by gary

* Reformatted file

* fixed failed tests

* implemented test for message and email delivery subscriber. Fixed logical error. Reformatted File.

* - implemented IT
- fixed logical error
- added changelog

* fix for cdr tests. NOTE: this makes the assumption that we will always succeed in inflating the message from the database in the tests that use SynchronousSubscriptionMatcherInterceptor

* resolve code review comments

* reformatted files

* fixed tests

* Fix for failing IT test in jpaserver-starter (#5435)

Co-authored-by: dotasek <dotasek.dev@gmail.com>

* wip

---------

Co-authored-by: michaelabuckley <michaelabuckley@gmail.com>
Co-authored-by: Martha Mitran <martha.mitran@smilecdr.com>
Co-authored-by: volodymyr-korzh <132366313+volodymyr-korzh@users.noreply.github.com>
Co-authored-by: TynerGjs <132295567+TynerGjs@users.noreply.github.com>
Co-authored-by: dotasek <david.otasek@smilecdr.com>
Co-authored-by: Steve Corbett <137920358+steve-corbett-smilecdr@users.noreply.github.com>
Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: justindar <justin.dar@smilecdr.com>
Co-authored-by: Nathan Doef <n.doef@protonmail.com>
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
Co-authored-by: Luke deGruchy <luke.degruchy@smilecdr.com>
Co-authored-by: Brenin Rhodes <brenin@alphora.com>
Co-authored-by: Justin McKelvy <60718638+Capt-Mac@users.noreply.github.com>
Co-authored-by: jmarchionatto <60409882+jmarchionatto@users.noreply.github.com>
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
Co-authored-by: Nathan Doef <nathaniel.doef@smilecdr.com>
Co-authored-by: LalithE <132382565+LalithE@users.noreply.github.com>
Co-authored-by: dotasek <dotasek.dev@gmail.com>
This commit is contained in:
Tadgh 2023-11-10 14:00:08 -08:00 committed by GitHub
parent 3bba9fb1f2
commit 777859ad00
136 changed files with 2823 additions and 1017 deletions

View File

@@ -263,10 +263,14 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & I
 		return myRegisteredPointcuts.contains(thePointcut);
 	}
 
+	protected Class<?> getBooleanReturnType() {
+		return boolean.class;
+	}
+
 	@Override
 	public boolean callHooks(POINTCUT thePointcut, HookParams theParams) {
 		assert haveAppropriateParams(thePointcut, theParams);
-		assert thePointcut.getReturnType() == void.class || thePointcut.getReturnType() == boolean.class;
+		assert thePointcut.getReturnType() == void.class || thePointcut.getReturnType() == getBooleanReturnType();
 
 		Object retValObj = doCallHooks(thePointcut, theParams, true);
 		return (Boolean) retValObj;
@@ -282,14 +286,16 @@
 		for (BaseInvoker nextInvoker : invokers) {
 			Object nextOutcome = nextInvoker.invoke(theParams);
 			Class<?> pointcutReturnType = thePointcut.getReturnType();
-			if (pointcutReturnType.equals(boolean.class)) {
+			if (pointcutReturnType.equals(getBooleanReturnType())) {
 				Boolean nextOutcomeAsBoolean = (Boolean) nextOutcome;
 				if (Boolean.FALSE.equals(nextOutcomeAsBoolean)) {
 					ourLog.trace("callHooks({}) for invoker({}) returned false", thePointcut, nextInvoker);
 					theRetVal = false;
 					break;
+				} else {
+					theRetVal = true;
 				}
-			} else if (pointcutReturnType.equals(void.class) == false) {
+			} else if (!pointcutReturnType.equals(void.class)) {
 				if (nextOutcome != null) {
 					theRetVal = nextOutcome;
 					break;
@@ -349,7 +355,7 @@
 		List<BaseInvoker> retVal;
 
-		if (haveMultiple == false) {
+		if (!haveMultiple) {
 
 			// The global list doesn't need to be sorted every time since it's sorted on
 			// insertion each time. Doing so is a waste of cycles..
@@ -485,9 +491,9 @@
 			myMethod = theHookMethod;
 
 			Class<?> returnType = theHookMethod.getReturnType();
-			if (myPointcut.getReturnType().equals(boolean.class)) {
+			if (myPointcut.getReturnType().equals(getBooleanReturnType())) {
 				Validate.isTrue(
-						boolean.class.equals(returnType) || void.class.equals(returnType),
+						getBooleanReturnType().equals(returnType) || void.class.equals(returnType),
 						"Method does not return boolean or void: %s",
 						theHookMethod);
 			} else if (myPointcut.getReturnType().equals(void.class)) {

View File

@@ -199,6 +199,8 @@ public class Constants {
 	public static final String PARAM_PRETTY_VALUE_FALSE = "false";
 	public static final String PARAM_PRETTY_VALUE_TRUE = "true";
 	public static final String PARAM_PROFILE = "_profile";
+	public static final String PARAM_PID = "_pid";
 	public static final String PARAM_QUERY = "_query";
 	public static final String PARAM_RESPONSE_URL = "response-url"; // Used in messaging
 	public static final String PARAM_REVINCLUDE = "_revinclude";

View File

@@ -458,7 +458,7 @@ public class VersionCanonicalizer {
 		@Override
 		public IBaseParameters parametersFromCanonical(Parameters theParameters) {
 			Resource converted = VersionConvertorFactory_10_40.convertResource(theParameters, ADVISOR_10_40);
-			return (IBaseParameters) reencodeToHl7Org(converted);
+			return (IBaseParameters) reencodeFromHl7Org(converted);
 		}
 
 		@Override
@@ -470,7 +470,7 @@
 		@Override
 		public IBaseResource structureDefinitionFromCanonical(StructureDefinition theResource) {
 			Resource converted = VersionConvertorFactory_10_50.convertResource(theResource, ADVISOR_10_50);
-			return reencodeToHl7Org(converted);
+			return reencodeFromHl7Org(converted);
 		}
 
 		@Override
@@ -514,7 +514,7 @@
 		@Override
 		public IBaseConformance capabilityStatementFromCanonical(CapabilityStatement theResource) {
 			Resource converted = VersionConvertorFactory_10_50.convertResource(theResource, ADVISOR_10_50);
-			return (IBaseConformance) reencodeToHl7Org(converted);
+			return (IBaseConformance) reencodeFromHl7Org(converted);
 		}
 
 		private Resource reencodeToHl7Org(IBaseResource theInput) {

View File

@ -2,22 +2,19 @@ package ca.uhn.hapi.converters.canonical;
import ca.uhn.fhir.context.FhirVersionEnum; import ca.uhn.fhir.context.FhirVersionEnum;
import ca.uhn.fhir.model.dstu2.composite.CodingDt; import ca.uhn.fhir.model.dstu2.composite.CodingDt;
import ca.uhn.fhir.model.dstu2.resource.Conformance;
import ca.uhn.fhir.util.HapiExtensions; import ca.uhn.fhir.util.HapiExtensions;
import org.apache.commons.lang3.StringUtils;
import org.hl7.fhir.convertors.factory.VersionConvertorFactory_40_50;
import org.hl7.fhir.instance.model.api.IBaseCoding; import org.hl7.fhir.instance.model.api.IBaseCoding;
import org.hl7.fhir.instance.model.api.IBaseHasExtensions;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IPrimitiveType;
import org.hl7.fhir.r4.model.CodeType; import org.hl7.fhir.r4.model.CodeType;
import org.hl7.fhir.r4.model.Coding; import org.hl7.fhir.r4.model.Coding;
import org.hl7.fhir.r5.model.Base; import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r5.model.CapabilityStatement;
import org.hl7.fhir.r5.model.Enumeration; import org.hl7.fhir.r5.model.Enumeration;
import org.hl7.fhir.r5.model.SearchParameter; import org.hl7.fhir.r5.model.SearchParameter;
import org.hl7.fhir.r5.model.StructureDefinition;
import org.junit.jupiter.api.Nested;
import org.junit.jupiter.api.Test; import org.junit.jupiter.api.Test;
import javax.annotation.Nonnull;
import java.util.List;
import java.util.stream.Collectors; import java.util.stream.Collectors;
import static ca.uhn.fhir.util.ExtensionUtil.getExtensionPrimitiveValues; import static ca.uhn.fhir.util.ExtensionUtil.getExtensionPrimitiveValues;
@ -25,40 +22,23 @@ import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.contains; import static org.hamcrest.Matchers.contains;
import static org.hamcrest.Matchers.empty; import static org.hamcrest.Matchers.empty;
import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
class VersionCanonicalizerTest { class VersionCanonicalizerTest {
@Nested
class VersionCanonicalizerR4 {
private static final FhirVersionEnum FHIR_VERSION = FhirVersionEnum.R4;
private static final VersionCanonicalizer ourCanonicalizer = new VersionCanonicalizer(FHIR_VERSION);
@Test @Test
public void testToCanonicalCoding() { public void testToCanonical_SearchParameterNoCustomResourceType_ConvertedCorrectly() {
VersionCanonicalizer canonicalizer = new VersionCanonicalizer(FhirVersionEnum.DSTU2);
IBaseCoding coding = new CodingDt("dstuSystem", "dstuCode");
Coding convertedCoding = canonicalizer.codingToCanonical(coding);
assertEquals("dstuCode", convertedCoding.getCode());
assertEquals("dstuSystem", convertedCoding.getSystem());
}
@Test
public void testFromCanonicalSearchParameter() {
VersionCanonicalizer canonicalizer = new VersionCanonicalizer(FhirVersionEnum.DSTU2);
SearchParameter inputR5 = new SearchParameter();
inputR5.setUrl("http://foo");
ca.uhn.fhir.model.dstu2.resource.SearchParameter outputDstu2 = (ca.uhn.fhir.model.dstu2.resource.SearchParameter) canonicalizer.searchParameterFromCanonical(inputR5);
assertEquals("http://foo", outputDstu2.getUrl());
}
@Test
public void testToCanonicalSearchParameter_NoCustomResourceType() {
// Setup
VersionCanonicalizer canonicalizer = new VersionCanonicalizer(FhirVersionEnum.R4);
org.hl7.fhir.r4.model.SearchParameter input = new org.hl7.fhir.r4.model.SearchParameter(); org.hl7.fhir.r4.model.SearchParameter input = new org.hl7.fhir.r4.model.SearchParameter();
input.addBase("Patient"); input.addBase("Patient");
input.addBase("Observation"); input.addBase("Observation");
input.addTarget("Organization"); input.addTarget("Organization");
// Test // Test
org.hl7.fhir.r5.model.SearchParameter actual = canonicalizer.searchParameterToCanonical(input); org.hl7.fhir.r5.model.SearchParameter actual = ourCanonicalizer.searchParameterToCanonical(input);
// Verify // Verify
assertThat(actual.getBase().stream().map(Enumeration::getCode).collect(Collectors.toList()), contains("Patient", "Observation")); assertThat(actual.getBase().stream().map(Enumeration::getCode).collect(Collectors.toList()), contains("Patient", "Observation"));
@ -69,10 +49,8 @@ class VersionCanonicalizerTest {
} }
@Test @Test
public void testToCanonicalSearchParameter_WithCustomResourceType() { public void testToCanonical_SearchParameterWithCustomResourceType__ConvertedCorrectly() {
// Setup // Setup
VersionCanonicalizer canonicalizer = new VersionCanonicalizer(FhirVersionEnum.R4);
org.hl7.fhir.r4.model.SearchParameter input = new org.hl7.fhir.r4.model.SearchParameter(); org.hl7.fhir.r4.model.SearchParameter input = new org.hl7.fhir.r4.model.SearchParameter();
input.addBase("Base1"); input.addBase("Base1");
input.addBase("Base2"); input.addBase("Base2");
@ -80,7 +58,7 @@ class VersionCanonicalizerTest {
input.addTarget("Target2"); input.addTarget("Target2");
// Test // Test
org.hl7.fhir.r5.model.SearchParameter actual = canonicalizer.searchParameterToCanonical(input); org.hl7.fhir.r5.model.SearchParameter actual = ourCanonicalizer.searchParameterToCanonical(input);
// Verify // Verify
assertThat(actual.getBase().stream().map(Enumeration::getCode).collect(Collectors.toList()), empty()); assertThat(actual.getBase().stream().map(Enumeration::getCode).collect(Collectors.toList()), empty());
@ -92,6 +70,52 @@ class VersionCanonicalizerTest {
assertThat(input.getTarget().stream().map(CodeType::getCode).toList(), contains("Target1", "Target2")); assertThat(input.getTarget().stream().map(CodeType::getCode).toList(), contains("Target1", "Target2"));
} }
}
@Nested
class VersionCanonicalizerDstu2 {
private static final FhirVersionEnum FHIR_VERSION = FhirVersionEnum.DSTU2;
private static final VersionCanonicalizer ourCanonicalizer = new VersionCanonicalizer(FHIR_VERSION);
@Test
public void testToCanonical_Coding_ConvertSuccessful() {
IBaseCoding coding = new CodingDt("dstuSystem", "dstuCode");
Coding convertedCoding = ourCanonicalizer.codingToCanonical(coding);
assertEquals("dstuCode", convertedCoding.getCode());
assertEquals("dstuSystem", convertedCoding.getSystem());
}
@Test
public void testFromCanonical_SearchParameter_ConvertSuccessful() {
SearchParameter inputR5 = new SearchParameter();
inputR5.setUrl("http://foo");
ca.uhn.fhir.model.dstu2.resource.SearchParameter outputDstu2 = (ca.uhn.fhir.model.dstu2.resource.SearchParameter) ourCanonicalizer.searchParameterFromCanonical(inputR5);
assertEquals("http://foo", outputDstu2.getUrl());
}
@Test
public void testFromCanonical_CapabilityStatement_ConvertSuccessful() {
CapabilityStatement inputR5 = new CapabilityStatement();
inputR5.setUrl("http://foo");
Conformance conformance = (Conformance) ourCanonicalizer.capabilityStatementFromCanonical(inputR5);
assertEquals("http://foo", conformance.getUrl());
}
@Test
public void testFromCanonical_StructureDefinition_ConvertSuccessful() {
StructureDefinition inputR5 = new StructureDefinition();
inputR5.setId("123");
ca.uhn.fhir.model.dstu2.resource.StructureDefinition structureDefinition = (ca.uhn.fhir.model.dstu2.resource.StructureDefinition) ourCanonicalizer.structureDefinitionFromCanonical(inputR5);
assertEquals("StructureDefinition/123", structureDefinition.getId().getValue());
}
@Test
public void testFromCanonical_Parameters_ConvertSuccessful() {
org.hl7.fhir.r4.model.Parameters inputR4 = new Parameters();
inputR4.setParameter("paramA", "1");
ca.uhn.fhir.model.dstu2.resource.Parameters parameters = (ca.uhn.fhir.model.dstu2.resource.Parameters) ourCanonicalizer.parametersFromCanonical(inputR4);
assertNotNull(parameters.getParameter());
assertEquals("paramA", parameters.getParameter().get(0).getName());
}
}
} }

View File

@ -0,0 +1,6 @@
---
type: fix
issue: 5176
jira: SMILE-6333
title: "Previously, the use of search parameter _lastUpdated as part of a reverse chaining search would return an error
message to the client. This issue has been fixed"

View File

@ -0,0 +1,7 @@
---
type: add
issue: 5366
jira: SMILE-5184
title: "The package installer overrides existing (built-in) SearchParameter with multiple base resources.
This is happening when installing US Core package for Practitioner.given as an example.
This change allows the existing SearchParameter to continue to exist with the remaining base resources."

View File

@ -0,0 +1,4 @@
---
type: add
issue: 5375
title: "Add settings for CDS Services using CDS on FHIR. Also removed the dependency on Spring Boot from the CR configs used by CDS Hooks."

View File

@ -0,0 +1,6 @@
---
type: perf
issue: 5387
title: "Enable the search cache for some requests even when a consent interceptor is active.
If no consent service uses canSeeResource (i.e. shouldProcessCanSeeResource() returns false);
or startOperation() returns AUTHORIZED; then the search cache is enabled."

View File

@ -0,0 +1,6 @@
---
type: fix
issue: 5388
title: "Previously, with partitioning enabled and `UrlBaseTenantIdentificationStrategy` used, registering
`SearchNarrowingInterceptor` would cause to incorrect resolution of `entry.request.url` parameter during
transaction bundle processing. This has been fixed."

View File

@ -0,0 +1,5 @@
---
type: perf
issue: 5395
title: "The background activity that clears stale search results now has higher throughput.
Busy servers should no longer accumulate dead stale search results."

View File

@ -0,0 +1,4 @@
---
type: fix
issue: 5404
title: "Cql translating bug where FHIRHelpers library function was erroring and blocking clinical reasoning content functionality"

View File

@ -0,0 +1,4 @@
---
type: perf
issue: 5405
title: "Sorting by _id now uses the FHIR_ID column on HFJ_RESOURCE and avoid joins."

View File

@ -0,0 +1,7 @@
---
type: add
issue: 5407
title: "Previously, when the payload of a subscription message exceeds the broker maximum message size, exception would
be thrown and retry will be performed indefinitely until the maximum message size is adjusted. Now, the message will be
successfully delivered for rest-hook and email subscriptions, while message subscriptions remains the same behavior as
before."

View File

@ -0,0 +1,4 @@
---
type: fix
issue: 5412
title: "Previously, with Partitioning enabled, submitting a bundle request would return a response with the partition name displayed twice in `response.link` property. This has been fixed."

View File

@ -0,0 +1,4 @@
---
type: fix
issue: 5415
title: "Previously, `$mdm-clear` jobs would fail on MSSQL. This is now fixed."

View File

@ -0,0 +1,6 @@
---
type: fix
issue: 5419
title: "Previously, when `AllowAutoInflateBinaries` was enabled in `JpaStorageSettings` and bundles with multiple
resources were submitted, binaries were created on the disk only for the first resource entry of the bundle.
This has been fixed."

View File

@ -0,0 +1,5 @@
---
type: add
issue: 5428
title: "Add support for non-standard _pid SearchParameter to the the JPA engine.
This new SP provides an efficient tie-breaking sort key."

View File

@ -0,0 +1,5 @@
---
type: fix
issue: 5431
jira: SMILE-5306
title: "Previously, using VersionCanonicalizer to convert a CapabilityStatement from R5 to DSTU2 would fail. This is now fixed."

View File

@@ -11,5 +11,5 @@
 <li>Thymeleaf (Testpage Overlay): 3.0.14.RELEASE -&gt; 3.1.2.RELEASE</li>
 <li>xpp3 (All): 1.1.4c.0 -&gt; 1.1.6</li>
 <li>HtmlUnit (All): 2.67.0 -&gt; 2.70.0</li>
-<li>org.hl7.fhir.core (All): 6.0.22.2 -&gt; 6.1.2</li>
+<li>org.hl7.fhir.core (All): 6.0.22.2 -&gt; 6.1.2.2</li>
 </ul>"

View File

@@ -1,3 +1,9 @@
+### Major Database Change
+
+This release makes performance changes to the database definition in a way that is incompatible with releases before 6.4.
+Running version 6.2 or older simultaneously with this release may produce errors when saving new resources.
+
+### Change Tracking and Subscriptions
 This release introduces a significant change to the mechanism performing submission of resource modification events
 to the message broker. Previously, an event would be submitted as part of the synchronous transaction
 modifying a resource. Synchronous submission yielded responsive publishing with the caveat that events would be dropped
@@ -8,6 +14,7 @@ database upon completion of the transaction and subsequently submitted to the br
 This new asynchronous submission mechanism will introduce a slight delay in event publishing. It is our view that such
 delay is largely compensated by the capability to retry submission upon failure which will eliminate event losses.
 
+### Tag, Security Label, and Profile changes
 There are some potentially breaking changes:
 * On resource retrieval and before storage, tags, security label and profile collections in resource meta will be

View File

@@ -24,3 +24,13 @@ The ConsentInterceptor requires a user-supplied instance of the [IConsentService
 ```java
 {{snippet:classpath:/ca/uhn/hapi/fhir/docs/ConsentInterceptors.java|service}}
 ```
+
+## Performance and Privacy
+
+Filtering search results in `canSeeResource()` requires inspecting every resource during a search and editing the results.
+This is slower than the normal path, and will prevent the reuse of the results from the search cache.
+
+The `willSeeResource()` operation supports reusing cached search results, but removed resources may be 'visible' as holes in returned bundles.
+
+Disabling `canSeeResource()` by returning `false` from `shouldProcessCanSeeResource()` will enable the search cache.
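For illustration, a minimal sketch (assumed class name, not part of this commit) of a consent service that keeps the search cache usable by skipping `canSeeResource()`:

```java
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.server.interceptor.consent.ConsentOutcome;
import ca.uhn.fhir.rest.server.interceptor.consent.IConsentContextServices;
import ca.uhn.fhir.rest.server.interceptor.consent.IConsentService;

public class CacheFriendlyConsentService implements IConsentService {

	@Override
	public ConsentOutcome startOperation(RequestDetails theRequestDetails, IConsentContextServices theContextServices) {
		// Authorizing the request up front also keeps the search cache usable.
		return ConsentOutcome.AUTHORIZED;
	}

	@Override
	public boolean shouldProcessCanSeeResource(RequestDetails theRequestDetails, IConsentContextServices theContextServices) {
		// Returning false skips canSeeResource() entirely, which allows cached search results to be reused.
		return false;
	}
}
```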

View File

@@ -22,6 +22,12 @@ Searching on Location.Position using `near` currently uses a box search, not a r
 The special `_filter` is only partially implemented.
 
+### _pid
+
+The JPA server implements a non-standard special parameter `_pid` which matches/sorts on the raw internal database id.
+This sort is useful for imposing a tie-breaking sort order in an efficient way.
+Note that this is an internal feature that may change or be removed in the future. Use with caution.
+
 <a name="uplifted-refchains"/>
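As an illustrative request (not from this commit), `_pid` can be appended as a final sort key to make paging order deterministic:

```http
GET [base]/Observation?_sort=date,_pid
```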

View File

@@ -96,7 +96,7 @@ public class HapiFhirHibernateJpaDialect extends HibernateJpaDialect {
 							+ makeErrorMessage(
 									messageToPrepend, "resourceIndexedCompositeStringUniqueConstraintFailure"));
 		}
 
-		if (constraintName.contains(ResourceTable.IDX_RES_FHIR_ID)) {
+		if (constraintName.contains(ResourceTable.IDX_RES_TYPE_FHIR_ID)) {
 			throw new ResourceVersionConflictException(
 					Msg.code(825) + makeErrorMessage(messageToPrepend, "forcedIdConstraintFailure"));
 		}

View File

@@ -114,7 +114,6 @@ import ca.uhn.fhir.jpa.search.builder.predicate.ComboNonUniqueSearchParameterPre
 import ca.uhn.fhir.jpa.search.builder.predicate.ComboUniqueSearchParameterPredicateBuilder;
 import ca.uhn.fhir.jpa.search.builder.predicate.CoordsPredicateBuilder;
 import ca.uhn.fhir.jpa.search.builder.predicate.DatePredicateBuilder;
-import ca.uhn.fhir.jpa.search.builder.predicate.ForcedIdPredicateBuilder;
 import ca.uhn.fhir.jpa.search.builder.predicate.NumberPredicateBuilder;
 import ca.uhn.fhir.jpa.search.builder.predicate.QuantityNormalizedPredicateBuilder;
 import ca.uhn.fhir.jpa.search.builder.predicate.QuantityPredicateBuilder;
@@ -613,12 +612,6 @@ public class JpaConfig {
 		return new DatePredicateBuilder(theSearchBuilder);
 	}
 
-	@Bean
-	@Scope("prototype")
-	public ForcedIdPredicateBuilder newForcedIdPredicateBuilder(SearchQueryBuilder theSearchBuilder) {
-		return new ForcedIdPredicateBuilder(theSearchBuilder);
-	}
-
 	@Bean
 	@Scope("prototype")
 	public NumberPredicateBuilder newNumberPredicateBuilder(SearchQueryBuilder theSearchBuilder) {

View File

@@ -53,7 +53,7 @@ public interface IMdmLinkJpaRepository
 	@Modifying
 	@Query(
 			value =
-					"DELETE FROM MPI_LINK_AUD f WHERE GOLDEN_RESOURCE_PID IN (:goldenPids) OR TARGET_PID IN (:goldenPids)",
+					"DELETE FROM MPI_LINK_AUD WHERE GOLDEN_RESOURCE_PID IN (:goldenPids) OR TARGET_PID IN (:goldenPids)",
 			nativeQuery = true)
 	void deleteLinksHistoryWithAnyReferenceToPids(@Param("goldenPids") List<Long> theResourcePids);

View File

@@ -20,8 +20,7 @@
 package ca.uhn.fhir.jpa.dao.data;
 
 import ca.uhn.fhir.jpa.entity.Search;
-import org.springframework.data.domain.Pageable;
-import org.springframework.data.domain.Slice;
+import com.google.errorprone.annotations.CanIgnoreReturnValue;
 import org.springframework.data.jpa.repository.JpaRepository;
 import org.springframework.data.jpa.repository.Modifying;
 import org.springframework.data.jpa.repository.Query;
@@ -30,6 +29,8 @@ import org.springframework.data.repository.query.Param;
 
 import java.util.Collection;
 import java.util.Date;
 import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Stream;
 
 public interface ISearchDao extends JpaRepository<Search, Long>, IHapiFhirJpaRepository {
@@ -38,10 +39,12 @@
 	@Query(
 			"SELECT s.myId FROM Search s WHERE (s.myCreated < :cutoff) AND (s.myExpiryOrNull IS NULL OR s.myExpiryOrNull < :now) AND (s.myDeleted IS NULL OR s.myDeleted = FALSE)")
-	Slice<Long> findWhereCreatedBefore(@Param("cutoff") Date theCutoff, @Param("now") Date theNow, Pageable thePage);
+	Stream<Long> findWhereCreatedBefore(@Param("cutoff") Date theCutoff, @Param("now") Date theNow);
 
-	@Query("SELECT s.myId FROM Search s WHERE s.myDeleted = TRUE")
-	Slice<Long> findDeleted(Pageable thePage);
+	@Query("SELECT new ca.uhn.fhir.jpa.dao.data.SearchIdAndResultSize(" + "s.myId, "
+			+ "(select max(sr.myOrder) as maxOrder from SearchResult sr where sr.mySearchPid = s.myId)) "
+			+ "FROM Search s WHERE s.myDeleted = TRUE")
+	Stream<SearchIdAndResultSize> findDeleted();
 
 	@Query(
 			"SELECT s FROM Search s WHERE s.myResourceType = :type AND s.mySearchQueryStringHash = :hash AND (s.myCreated > :cutoff) AND s.myDeleted = FALSE AND s.myStatus <> 'FAILED'")
@@ -54,10 +57,15 @@
 	int countDeleted();
 
 	@Modifying
-	@Query("UPDATE Search s SET s.myDeleted = :deleted WHERE s.myId = :pid")
-	void updateDeleted(@Param("pid") Long thePid, @Param("deleted") boolean theDeleted);
+	@Query("UPDATE Search s SET s.myDeleted = :deleted WHERE s.myId in (:pids)")
+	@CanIgnoreReturnValue
+	int updateDeleted(@Param("pids") Set<Long> thePid, @Param("deleted") boolean theDeleted);
 
 	@Modifying
 	@Query("DELETE FROM Search s WHERE s.myId = :pid")
 	void deleteByPid(@Param("pid") Long theId);
+
+	@Modifying
+	@Query("DELETE FROM Search s WHERE s.myId in (:pids)")
+	void deleteByPids(@Param("pids") Collection<Long> theSearchToDelete);
 }

View File

@@ -20,14 +20,18 @@
 package ca.uhn.fhir.jpa.dao.data;
 
 import ca.uhn.fhir.jpa.entity.SearchInclude;
+import com.google.errorprone.annotations.CanIgnoreReturnValue;
 import org.springframework.data.jpa.repository.JpaRepository;
 import org.springframework.data.jpa.repository.Modifying;
 import org.springframework.data.jpa.repository.Query;
 import org.springframework.data.repository.query.Param;
 
+import java.util.Collection;
+
 public interface ISearchIncludeDao extends JpaRepository<SearchInclude, Long>, IHapiFhirJpaRepository {
 
 	@Modifying
-	@Query(value = "DELETE FROM SearchInclude r WHERE r.mySearchPid = :search")
-	void deleteForSearch(@Param("search") Long theSearchPid);
+	@Query(value = "DELETE FROM SearchInclude r WHERE r.mySearchPid in (:search)")
+	@CanIgnoreReturnValue
+	int deleteForSearch(@Param("search") Collection<Long> theSearchPid);
 }

View File

@@ -20,6 +20,7 @@
 package ca.uhn.fhir.jpa.dao.data;
 
 import ca.uhn.fhir.jpa.entity.SearchResult;
+import com.google.errorprone.annotations.CanIgnoreReturnValue;
 import org.springframework.data.domain.Pageable;
 import org.springframework.data.domain.Slice;
 import org.springframework.data.jpa.repository.JpaRepository;
@@ -27,6 +28,7 @@ import org.springframework.data.jpa.repository.Modifying;
 import org.springframework.data.jpa.repository.Query;
 import org.springframework.data.repository.query.Param;
 
+import java.util.Collection;
 import java.util.List;
 
 public interface ISearchResultDao extends JpaRepository<SearchResult, Long>, IHapiFhirJpaRepository {
@@ -37,12 +39,19 @@
 	@Query(value = "SELECT r.myResourcePid FROM SearchResult r WHERE r.mySearchPid = :search")
 	List<Long> findWithSearchPidOrderIndependent(@Param("search") Long theSearchPid);
 
-	@Query(value = "SELECT r.myId FROM SearchResult r WHERE r.mySearchPid = :search")
-	Slice<Long> findForSearch(Pageable thePage, @Param("search") Long theSearchPid);
+	@Modifying
+	@Query("DELETE FROM SearchResult s WHERE s.mySearchPid IN :searchIds")
+	@CanIgnoreReturnValue
+	int deleteBySearchIds(@Param("searchIds") Collection<Long> theSearchIds);
 
 	@Modifying
-	@Query("DELETE FROM SearchResult s WHERE s.myId IN :ids")
-	void deleteByIds(@Param("ids") List<Long> theContent);
+	@Query(
+			"DELETE FROM SearchResult s WHERE s.mySearchPid = :searchId and s.myOrder >= :rangeStart and s.myOrder <= :rangeEnd")
+	@CanIgnoreReturnValue
+	int deleteBySearchIdInRange(
+			@Param("searchId") Long theSearchId,
+			@Param("rangeStart") int theRangeStart,
+			@Param("rangeEnd") int theRangeEnd);
 
 	@Query("SELECT count(r) FROM SearchResult r WHERE r.mySearchPid = :search")
 	int countForSearch(@Param("search") Long theSearchPid);

View File

@ -0,0 +1,37 @@
/*-
* #%L
* HAPI FHIR JPA Server
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.jpa.dao.data;
import java.util.Objects;
/**
* Record for search result returning the PK of a Search, and the number of associated SearchResults
*/
public class SearchIdAndResultSize {
/** Search PK */
public final long searchId;
/** Number of SearchResults attached */
public final int size;
public SearchIdAndResultSize(long theSearchId, Integer theSize) {
searchId = theSearchId;
size = Objects.requireNonNullElse(theSize, 0);
}
}

View File

@@ -37,21 +37,22 @@ public class SearchResult implements Serializable {
 
 	private static final long serialVersionUID = 1L;
 
+	@Deprecated(since = "6.10", forRemoval = true) // migrating to composite PK on searchPid,Order
 	@GeneratedValue(strategy = GenerationType.AUTO, generator = "SEQ_SEARCH_RES")
 	@SequenceGenerator(name = "SEQ_SEARCH_RES", sequenceName = "SEQ_SEARCH_RES")
 	@Id
 	@Column(name = "PID")
 	private Long myId;
 
-	@Column(name = "SEARCH_ORDER", nullable = false, insertable = true, updatable = false)
+	@Column(name = "SEARCH_PID", insertable = true, updatable = false, nullable = false)
+	private Long mySearchPid;
+
+	@Column(name = "SEARCH_ORDER", insertable = true, updatable = false, nullable = false)
 	private int myOrder;
 
 	@Column(name = "RESOURCE_PID", insertable = true, updatable = false, nullable = false)
 	private Long myResourcePid;
 
-	@Column(name = "SEARCH_PID", insertable = true, updatable = false, nullable = false)
-	private Long mySearchPid;
-
 	/**
 	 * Constructor
 	 */

View File

@@ -118,12 +118,19 @@ public class HapiFhirJpaMigrationTasks extends BaseMigrationTasks<VersionEnum> {
 		Builder.BuilderWithTableName hfjResource = version.onTable("HFJ_RESOURCE");
 		hfjResource.modifyColumn("20231018.2", "FHIR_ID").nonNullable();
 
+		hfjResource.dropIndex("20231027.1", "IDX_RES_FHIR_ID");
 		hfjResource
-				.addIndex("20231018.3", "IDX_RES_FHIR_ID")
+				.addIndex("20231027.2", "IDX_RES_TYPE_FHIR_ID")
 				.unique(true)
 				.online(true)
-				.includeColumns("RES_ID")
-				.withColumns("FHIR_ID", "RES_TYPE");
+				// include res_id and our deleted flag so we can satisfy Observation?_sort=_id from the index on
+				// platforms that support it.
+				.includeColumns("RES_ID, RES_DELETED_AT")
+				.withColumns("RES_TYPE", "FHIR_ID");
+
+		// For resolving references that don't supply the type.
+		hfjResource.addIndex("20231027.3", "IDX_RES_FHIR_ID").unique(false).withColumns("FHIR_ID");
 	}
 
 	protected void init680() {

View File

@@ -40,6 +40,7 @@ import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
 import ca.uhn.fhir.jpa.searchparam.registry.ISearchParamRegistryController;
 import ca.uhn.fhir.jpa.searchparam.util.SearchParameterHelper;
 import ca.uhn.fhir.rest.api.server.IBundleProvider;
+import ca.uhn.fhir.rest.api.server.RequestDetails;
 import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
 import ca.uhn.fhir.rest.param.StringParam;
 import ca.uhn.fhir.rest.param.TokenParam;
@@ -64,12 +65,13 @@ import org.springframework.beans.factory.annotation.Autowired;
 
 import java.io.IOException;
 import java.util.Collection;
+import java.util.HashSet;
 import java.util.List;
 import java.util.Optional;
-import javax.annotation.Nonnull;
 import javax.annotation.PostConstruct;
 
 import static ca.uhn.fhir.jpa.packages.util.PackageUtils.DEFAULT_INSTALL_TYPES;
+import static ca.uhn.fhir.util.SearchParameterUtil.getBaseAsStrings;
 import static org.apache.commons.lang3.StringUtils.defaultString;
 import static org.apache.commons.lang3.StringUtils.isBlank;
@@ -251,7 +253,7 @@ public class PackageInstallerSvcImpl implements IPackageInstallerSvc {
 		for (IBaseResource next : resources) {
 			try {
 				next = isStructureDefinitionWithoutSnapshot(next) ? generateSnapshot(next) : next;
-				create(next, theInstallationSpec, theOutcome);
+				install(next, theInstallationSpec, theOutcome);
 			} catch (Exception e) {
 				ourLog.warn(
 						"Failed to upload resource of type {} with ID {} - Error: {}",
@ -345,83 +347,42 @@ public class PackageInstallerSvcImpl implements IPackageInstallerSvc {
* ============================= Utility methods =============================== * ============================= Utility methods ===============================
*/ */
@VisibleForTesting @VisibleForTesting
void create( void install(
IBaseResource theResource, IBaseResource theResource,
PackageInstallationSpec theInstallationSpec, PackageInstallationSpec theInstallationSpec,
PackageInstallOutcomeJson theOutcome) { PackageInstallOutcomeJson theOutcome) {
IFhirResourceDao dao = myDaoRegistry.getResourceDao(theResource.getClass());
SearchParameterMap map = createSearchParameterMapFor(theResource);
IBundleProvider searchResult = searchResource(dao, map);
if (validForUpload(theResource)) {
if (searchResult.isEmpty()) {
ourLog.info("Creating new resource matching {}", map.toNormalizedQueryString(myFhirContext)); if (!validForUpload(theResource)) {
theOutcome.incrementResourcesInstalled(myFhirContext.getResourceType(theResource));
IIdType id = theResource.getIdElement();
if (id.isEmpty()) {
createResource(dao, theResource);
ourLog.info("Created resource with new id");
} else {
if (id.isIdPartValidLong()) {
String newIdPart = "npm-" + id.getIdPart();
id.setParts(id.getBaseUrl(), id.getResourceType(), newIdPart, id.getVersionIdPart());
}
try {
updateResource(dao, theResource);
ourLog.info("Created resource with existing id");
} catch (ResourceVersionConflictException exception) {
final Optional<IBaseResource> optResource = readResourceById(dao, id);
final String existingResourceUrlOrNull = optResource
.filter(MetadataResource.class::isInstance)
.map(MetadataResource.class::cast)
.map(MetadataResource::getUrl)
.orElse(null);
final String newResourceUrlOrNull = (theResource instanceof MetadataResource)
? ((MetadataResource) theResource).getUrl()
: null;
ourLog.error(
"Version conflict error: This is possibly due to a collision between ValueSets from different IGs that are coincidentally using the same resource ID: [{}] and new resource URL: [{}], with the exisitng resource having URL: [{}]. Ignoring this update and continuing: The first IG wins. ",
id.getIdPart(),
newResourceUrlOrNull,
existingResourceUrlOrNull,
exception);
}
}
} else {
if (theInstallationSpec.isReloadExisting()) {
ourLog.info("Updating existing resource matching {}", map.toNormalizedQueryString(myFhirContext));
theResource.setId(searchResult
.getResources(0, 1)
.get(0)
.getIdElement()
.toUnqualifiedVersionless());
DaoMethodOutcome outcome = updateResource(dao, theResource);
if (!outcome.isNop()) {
theOutcome.incrementResourcesInstalled(myFhirContext.getResourceType(theResource));
}
} else {
ourLog.info(
"Skipping update of existing resource matching {}",
map.toNormalizedQueryString(myFhirContext));
}
}
} else {
ourLog.warn( ourLog.warn(
"Failed to upload resource of type {} with ID {} - Error: Resource failed validation", "Failed to upload resource of type {} with ID {} - Error: Resource failed validation",
theResource.fhirType(), theResource.fhirType(),
theResource.getIdElement().getValue()); theResource.getIdElement().getValue());
return;
}
IFhirResourceDao dao = myDaoRegistry.getResourceDao(theResource.getClass());
SearchParameterMap map = createSearchParameterMapFor(theResource);
IBundleProvider searchResult = searchResource(dao, map);
String resourceQuery = map.toNormalizedQueryString(myFhirContext);
if (!searchResult.isEmpty() && !theInstallationSpec.isReloadExisting()) {
ourLog.info("Skipping update of existing resource matching {}", resourceQuery);
return;
}
if (!searchResult.isEmpty()) {
ourLog.info("Updating existing resource matching {}", resourceQuery);
}
IBaseResource existingResource =
!searchResult.isEmpty() ? searchResult.getResources(0, 1).get(0) : null;
boolean isInstalled = createOrUpdateResource(dao, theResource, existingResource);
if (isInstalled) {
theOutcome.incrementResourcesInstalled(myFhirContext.getResourceType(theResource));
} }
} }
private Optional<IBaseResource> readResourceById(IFhirResourceDao dao, IIdType id) { private Optional<IBaseResource> readResourceById(IFhirResourceDao dao, IIdType id) {
try { try {
return Optional.ofNullable(dao.read(id.toUnqualifiedVersionless(), newSystemRequestDetails())); return Optional.ofNullable(dao.read(id.toUnqualifiedVersionless(), createRequestDetails()));
} catch (Exception exception) { } catch (Exception exception) {
// ignore because we're running this query to help build the log // ignore because we're running this query to help build the log
@ -432,30 +393,112 @@ public class PackageInstallerSvcImpl implements IPackageInstallerSvc {
} }
private IBundleProvider searchResource(IFhirResourceDao theDao, SearchParameterMap theMap) { private IBundleProvider searchResource(IFhirResourceDao theDao, SearchParameterMap theMap) {
return theDao.search(theMap, newSystemRequestDetails()); return theDao.search(theMap, createRequestDetails());
} }
@Nonnull protected boolean createOrUpdateResource(
private SystemRequestDetails newSystemRequestDetails() { IFhirResourceDao theDao, IBaseResource theResource, IBaseResource theExistingResource) {
return new SystemRequestDetails().setRequestPartitionId(RequestPartitionId.defaultPartition()); final IIdType id = theResource.getIdElement();
if (theExistingResource == null && id.isEmpty()) {
ourLog.debug("Install resource without id will be created");
theDao.create(theResource, createRequestDetails());
return true;
} }
private void createResource(IFhirResourceDao theDao, IBaseResource theResource) { if (theExistingResource == null && !id.isEmpty() && id.isIdPartValidLong()) {
if (myPartitionSettings.isPartitioningEnabled()) { String newIdPart = "npm-" + id.getIdPart();
SystemRequestDetails requestDetails = newSystemRequestDetails(); id.setParts(id.getBaseUrl(), id.getResourceType(), newIdPart, id.getVersionIdPart());
theDao.create(theResource, requestDetails); }
boolean isExistingUpdated = updateExistingResourceIfNecessary(theDao, theResource, theExistingResource);
boolean shouldOverrideId = theExistingResource != null && !isExistingUpdated;
if (shouldOverrideId) {
ourLog.debug(
"Existing resource {} will be overridden with installed resource {}",
theExistingResource.getIdElement(),
id);
theResource.setId(theExistingResource.getIdElement().toUnqualifiedVersionless());
} else { } else {
theDao.create(theResource); ourLog.debug("Install resource {} will be created", id);
}
} }
DaoMethodOutcome updateResource(IFhirResourceDao theDao, IBaseResource theResource) { DaoMethodOutcome outcome = updateResource(theDao, theResource);
if (myPartitionSettings.isPartitioningEnabled()) { return outcome != null && !outcome.isNop();
SystemRequestDetails requestDetails = newSystemRequestDetails();
return theDao.update(theResource, requestDetails);
} else {
return theDao.update(theResource, new SystemRequestDetails());
} }
private boolean updateExistingResourceIfNecessary(
IFhirResourceDao theDao, IBaseResource theResource, IBaseResource theExistingResource) {
if (!"SearchParameter".equals(theResource.getClass().getSimpleName())) {
return false;
}
if (theExistingResource == null) {
return false;
}
if (theExistingResource
.getIdElement()
.getIdPart()
.equals(theResource.getIdElement().getIdPart())) {
return false;
}
Collection<String> remainingBaseList = new HashSet<>(getBaseAsStrings(myFhirContext, theExistingResource));
remainingBaseList.removeAll(getBaseAsStrings(myFhirContext, theResource));
if (remainingBaseList.isEmpty()) {
return false;
}
myFhirContext
.getResourceDefinition(theExistingResource)
.getChildByName("base")
.getMutator()
.setValue(theExistingResource, null);
for (String baseResourceName : remainingBaseList) {
myFhirContext.newTerser().addElement(theExistingResource, "base", baseResourceName);
}
ourLog.info(
"Existing SearchParameter {} will be updated with base {}",
theExistingResource.getIdElement().getIdPart(),
remainingBaseList);
updateResource(theDao, theExistingResource);
return true;
}
private DaoMethodOutcome updateResource(IFhirResourceDao theDao, IBaseResource theResource) {
DaoMethodOutcome outcome = null;
IIdType id = theResource.getIdElement();
RequestDetails requestDetails = createRequestDetails();
try {
outcome = theDao.update(theResource, requestDetails);
} catch (ResourceVersionConflictException exception) {
final Optional<IBaseResource> optResource = readResourceById(theDao, id);
final String existingResourceUrlOrNull = optResource
.filter(MetadataResource.class::isInstance)
.map(MetadataResource.class::cast)
.map(MetadataResource::getUrl)
.orElse(null);
final String newResourceUrlOrNull =
(theResource instanceof MetadataResource) ? ((MetadataResource) theResource).getUrl() : null;
ourLog.error(
"Version conflict error: This is possibly due to a collision between ValueSets from different IGs that are coincidentally using the same resource ID: [{}] and new resource URL: [{}], with the exisitng resource having URL: [{}]. Ignoring this update and continuing: The first IG wins. ",
id.getIdPart(),
newResourceUrlOrNull,
existingResourceUrlOrNull,
exception);
}
return outcome;
}
private RequestDetails createRequestDetails() {
SystemRequestDetails requestDetails = new SystemRequestDetails();
if (myPartitionSettings.isPartitioningEnabled()) {
requestDetails.setRequestPartitionId(RequestPartitionId.defaultPartition());
}
return requestDetails;
}
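To make the base-merge logic in updateExistingResourceIfNecessary concrete, here is a small illustrative sketch with hypothetical bases (not part of the change itself): the existing SearchParameter is kept and re-saved whenever it still declares bases that the incoming one does not.

// Hypothetical values: existing base = [Patient, Practitioner], incoming base = [Patient]
Collection<String> remainingBaseList = new HashSet<>(List.of("Patient", "Practitioner"));
remainingBaseList.removeAll(List.of("Patient"));
// remainingBaseList == [Practitioner]: the existing SearchParameter is rewritten with
// base [Practitioner] and updated, while the incoming resource is installed under its own id.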
boolean validForUpload(IBaseResource theResource) {
@@ -480,7 +523,7 @@ public class PackageInstallerSvcImpl implements IPackageInstallerSvc {
return false;
}
if (getBaseAsStrings(myFhirContext, theResource).isEmpty()) {
ourLog.warn(
"Failed to validate resource of type {} with url {} - Error: Resource base is empty",
theResource.fhirType(),
@@ -560,20 +603,21 @@ public class PackageInstallerSvcImpl implements IPackageInstallerSvc {
}
}
private SearchParameterMap createSearchParameterMapFor(IBaseResource theResource) {
String resourceType = theResource.getClass().getSimpleName();
if ("NamingSystem".equals(resourceType)) {
String uniqueId = extractUniqeIdFromNamingSystem(theResource);
return SearchParameterMap.newSynchronous().add("value", new StringParam(uniqueId).setExact(true));
} else if ("Subscription".equals(resourceType)) {
String id = extractSimpleValue(theResource, "id");
return SearchParameterMap.newSynchronous().add("_id", new TokenParam(id));
} else if ("SearchParameter".equals(resourceType)) {
return buildSearchParameterMapForSearchParameter(theResource);
} else if (resourceHasUrlElement(theResource)) {
String url = extractSimpleValue(theResource, "url");
return SearchParameterMap.newSynchronous().add("url", new UriParam(url));
} else {
TokenParam identifierToken = extractIdentifierFromOtherResourceTypes(theResource);
return SearchParameterMap.newSynchronous().add("identifier", identifierToken);
}
}
@@ -593,7 +637,7 @@ public class PackageInstallerSvcImpl implements IPackageInstallerSvc {
}
if (resourceHasUrlElement(theResource)) {
String url = extractSimpleValue(theResource, "url");
return SearchParameterMap.newSynchronous().add("url", new UriParam(url));
} else {
TokenParam identifierToken = extractIdentifierFromOtherResourceTypes(theResource);
@@ -601,32 +645,17 @@ public class PackageInstallerSvcImpl implements IPackageInstallerSvc {
}
}
private String extractUniqeIdFromNamingSystem(IBaseResource theResource) {
IBase uniqueIdComponent = (IBase) extractValue(theResource, "uniqueId");
if (uniqueIdComponent == null) {
throw new ImplementationGuideInstallationException(
Msg.code(1291) + "NamingSystem does not have uniqueId component.");
}
return extractSimpleValue(uniqueIdComponent, "value");
}
private TokenParam extractIdentifierFromOtherResourceTypes(IBaseResource theResource) {
Identifier identifier = (Identifier) extractValue(theResource, "identifier");
if (identifier != null) {
return new TokenParam(identifier.getSystem(), identifier.getValue());
} else {
@@ -635,6 +664,15 @@ public class PackageInstallerSvcImpl implements IPackageInstallerSvc {
}
}
private Object extractValue(IBase theResource, String thePath) {
return myFhirContext.newTerser().getSingleValueOrNull(theResource, thePath);
}
private String extractSimpleValue(IBase theResource, String thePath) {
IPrimitiveType<?> asPrimitiveType = (IPrimitiveType<?>) extractValue(theResource, thePath);
return (String) asPrimitiveType.getValue();
}
private boolean resourceHasUrlElement(IBaseResource resource) {
BaseRuntimeElementDefinition<?> def = myFhirContext.getElementDefinition(resource.getClass());
if (!(def instanceof BaseRuntimeElementCompositeDefinition)) {


@@ -25,12 +25,16 @@ import ca.uhn.fhir.jpa.model.sched.HapiJob;
import ca.uhn.fhir.jpa.model.sched.IHasScheduledJobs;
import ca.uhn.fhir.jpa.model.sched.ISchedulerService;
import ca.uhn.fhir.jpa.model.sched.ScheduledJobDefinition;
import ca.uhn.fhir.jpa.search.cache.DatabaseSearchCacheSvcImpl;
import ca.uhn.fhir.jpa.search.cache.ISearchCacheSvc;
import org.quartz.JobExecutionContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import static ca.uhn.fhir.jpa.search.cache.DatabaseSearchCacheSvcImpl.SEARCH_CLEANUP_JOB_INTERVAL_MILLIS;
/**
@@ -42,7 +46,6 @@ import static ca.uhn.fhir.jpa.search.cache.DatabaseSearchCacheSvcImpl.SEARCH_CLE
// in Smile.
//
public class StaleSearchDeletingSvcImpl implements IStaleSearchDeletingSvc, IHasScheduledJobs {
@Autowired
private JpaStorageSettings myStorageSettings;
@@ -53,7 +56,16 @@ public class StaleSearchDeletingSvcImpl implements IStaleSearchDeletingSvc, IHas
@Override
@Transactional(propagation = Propagation.NEVER)
public void pollForStaleSearchesAndDeleteThem() {
mySearchCacheSvc.pollForStaleSearchesAndDeleteThem(RequestPartitionId.allPartitions(), getDeadline());
}
/**
* Calculate a deadline to finish before the next scheduled run.
*/
protected Instant getDeadline() {
return Instant.ofEpochMilli(DatabaseSearchCacheSvcImpl.now())
// target a 90% duty-cycle to avoid confusing quartz
.plus((long) (SEARCH_CLEANUP_JOB_INTERVAL_MILLIS * 0.90), ChronoUnit.MILLIS);
}
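As a rough illustration of the deadline arithmetic (SEARCH_CLEANUP_JOB_INTERVAL_MILLIS is one minute, per DateUtils.MILLIS_PER_MINUTE; System.currentTimeMillis() stands in for DatabaseSearchCacheSvcImpl.now() here):

// Illustrative only: with a 60_000 ms cleanup interval the deadline lands ~54 s from now,
// leaving roughly 10% headroom before the next scheduled run.
Instant deadline = Instant.ofEpochMilli(System.currentTimeMillis())
        .plus((long) (60_000L * 0.90), ChronoUnit.MILLIS);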
@Override


@@ -43,7 +43,6 @@ import ca.uhn.fhir.jpa.search.builder.predicate.ComboNonUniqueSearchParameterPre
import ca.uhn.fhir.jpa.search.builder.predicate.ComboUniqueSearchParameterPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.CoordsPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.DatePredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.ICanMakeMissingParamPredicate;
import ca.uhn.fhir.jpa.search.builder.predicate.NumberPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.ParsedLocationParam;
@@ -69,10 +68,10 @@ import ca.uhn.fhir.parser.DataFormatException;
import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.QualifiedParamList;
import ca.uhn.fhir.rest.api.RestSearchParameterTypeEnum;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.param.CompositeParam;
import ca.uhn.fhir.rest.param.DateParam;
import ca.uhn.fhir.rest.param.DateRangeParam;
import ca.uhn.fhir.rest.param.HasParam;
import ca.uhn.fhir.rest.param.NumberParam;
import ca.uhn.fhir.rest.param.QuantityParam;
@@ -95,7 +94,6 @@ import com.healthmarketscience.sqlbuilder.ComboCondition;
import com.healthmarketscience.sqlbuilder.Condition;
import com.healthmarketscience.sqlbuilder.Expression;
import com.healthmarketscience.sqlbuilder.InCondition;
import com.healthmarketscience.sqlbuilder.SelectQuery;
import com.healthmarketscience.sqlbuilder.SetOperationQuery;
import com.healthmarketscience.sqlbuilder.Subquery;
@@ -123,6 +121,7 @@ import java.util.function.Supplier;
import java.util.stream.Collectors;
import javax.annotation.Nullable;
import static ca.uhn.fhir.jpa.search.builder.QueryStack.SearchForIdsParams.with;
import static ca.uhn.fhir.jpa.util.QueryParameterUtils.fromOperation;
import static ca.uhn.fhir.jpa.util.QueryParameterUtils.getChainedPart;
import static ca.uhn.fhir.jpa.util.QueryParameterUtils.getParamNameWithPrefix;
@@ -275,16 +274,21 @@ public class QueryStack {
}
public void addSortOnResourceId(boolean theAscending) {
ResourceTablePredicateBuilder resourceTablePredicateBuilder;
BaseJoiningPredicateBuilder firstPredicateBuilder = mySqlBuilder.getOrCreateFirstPredicateBuilder();
if (firstPredicateBuilder instanceof ResourceTablePredicateBuilder) {
resourceTablePredicateBuilder = (ResourceTablePredicateBuilder) firstPredicateBuilder;
} else {
resourceTablePredicateBuilder =
mySqlBuilder.addResourceTablePredicateBuilder(firstPredicateBuilder.getResourceIdColumn());
}
mySqlBuilder.addSortString(resourceTablePredicateBuilder.getColumnFhirId(), theAscending, myUseAggregate);
}
/** Sort on RES_ID -- used to break ties for reliable sort */
public void addSortOnResourcePID(boolean theAscending) {
BaseJoiningPredicateBuilder predicateBuilder = mySqlBuilder.getOrCreateFirstPredicateBuilder();
mySqlBuilder.addSortString(predicateBuilder.getResourceIdColumn(), theAscending);
}
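For illustration, a client-side sketch of how the two sorts can be combined (hypothetical usage; SortSpec and SearchParameterMap are existing HAPI types, and "_id"/"_pid" are the parameter names handled above): _id now sorts on HFJ_RESOURCE.FHIR_ID, with _pid available as a RES_ID tie-breaker.

// Hypothetical sketch: primary sort on the FHIR resource id, ties broken by the internal pid.
SearchParameterMap map = SearchParameterMap.newSynchronous();
map.setSort(new SortSpec("_id").setChain(new SortSpec("_pid")));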
public void addSortOnResourceLink(
@@ -1107,7 +1111,7 @@ public class QueryStack {
if (paramName.startsWith("_has:")) {
ourLog.trace("Handling double _has query: {}", paramName);
String qualifier = paramName.substring(4);
for (IQueryParameterType next : nextOrList) {
@@ -1160,26 +1164,30 @@
parameterName = parameterName.substring(0, colonIndex);
}
ResourceLinkPredicateBuilder resourceLinkTableJoin =
mySqlBuilder.addReferencePredicateBuilderReversed(this, theSourceJoinColumn);
Condition partitionPredicate = resourceLinkTableJoin.createPartitionIdPredicate(theRequestPartitionId);
List<String> paths = resourceLinkTableJoin.createResourceLinkPaths(
targetResourceType, paramReference, new ArrayList<>());
if (CollectionUtils.isEmpty(paths)) {
throw new InvalidRequestException(Msg.code(2305) + "Reference field does not exist: " + paramReference);
}
Condition typePredicate = BinaryCondition.equalTo(
resourceLinkTableJoin.getColumnTargetResourceType(),
mySqlBuilder.generatePlaceholder(theResourceType));
Condition pathPredicate = toEqualToOrInPredicate(
resourceLinkTableJoin.getColumnSourcePath(), mySqlBuilder.generatePlaceholders(paths));
Condition linkedPredicate =
searchForIdsWithAndOr(with().setSourceJoinColumn(resourceLinkTableJoin.getColumnSrcResourceId())
.setResourceName(targetResourceType)
.setParamName(parameterName)
.setAndOrParams(Collections.singletonList(orValues))
.setRequest(theRequest)
.setRequestPartitionId(theRequestPartitionId));
andPredicates.add(toAndPredicate(partitionPredicate, pathPredicate, typePredicate, linkedPredicate));
}
@@ -2270,57 +2278,125 @@
}
@Nullable
public Condition searchForIdsWithAndOr(SearchForIdsParams theSearchForIdsParams) {
if (theSearchForIdsParams.myAndOrParams.isEmpty()) {
return null;
}
switch (theSearchForIdsParams.myParamName) {
case IAnyResource.SP_RES_ID:
return createPredicateResourceId(
theSearchForIdsParams.mySourceJoinColumn,
theSearchForIdsParams.myAndOrParams,
theSearchForIdsParams.myResourceName,
null,
theSearchForIdsParams.myRequestPartitionId);
case Constants.PARAM_PID:
return createPredicateResourcePID(
theSearchForIdsParams.mySourceJoinColumn, theSearchForIdsParams.myAndOrParams);
case PARAM_HAS:
return createPredicateHas(
theSearchForIdsParams.mySourceJoinColumn,
theSearchForIdsParams.myResourceName,
theSearchForIdsParams.myAndOrParams,
theSearchForIdsParams.myRequest,
theSearchForIdsParams.myRequestPartitionId);
case Constants.PARAM_TAG:
case Constants.PARAM_PROFILE:
case Constants.PARAM_SECURITY:
if (myStorageSettings.getTagStorageMode() == JpaStorageSettings.TagStorageModeEnum.INLINE) {
return createPredicateSearchParameter(
theSearchForIdsParams.mySourceJoinColumn,
theSearchForIdsParams.myResourceName,
theSearchForIdsParams.myParamName,
theSearchForIdsParams.myAndOrParams,
theSearchForIdsParams.myRequest,
theSearchForIdsParams.myRequestPartitionId);
} else {
return createPredicateTag(
theSearchForIdsParams.mySourceJoinColumn,
theSearchForIdsParams.myAndOrParams,
theSearchForIdsParams.myParamName,
theSearchForIdsParams.myRequestPartitionId);
}
case Constants.PARAM_SOURCE:
return createPredicateSourceForAndList(
theSearchForIdsParams.mySourceJoinColumn, theSearchForIdsParams.myAndOrParams);
case Constants.PARAM_LASTUPDATED:
// this case statement handles a _lastUpdated query as part of a reverse search
// only (/Patient?_has:Encounter:patient:_lastUpdated=ge2023-10-24).
// performing a _lastUpdated query on a resource (/Patient?_lastUpdated=eq2023-10-24)
// is handled in {@link SearchBuilder#createChunkedQuery}.
return createReverseSearchPredicateLastUpdated(
theSearchForIdsParams.myAndOrParams, theSearchForIdsParams.mySourceJoinColumn);
default:
return createPredicateSearchParameter(
theSearchForIdsParams.mySourceJoinColumn,
theSearchForIdsParams.myResourceName,
theSearchForIdsParams.myParamName,
theSearchForIdsParams.myAndOrParams,
theSearchForIdsParams.myRequest,
theSearchForIdsParams.myRequestPartitionId);
}
}
/**
* Raw match on RES_ID
*/
private Condition createPredicateResourcePID(
DbColumn theSourceJoinColumn, List<List<IQueryParameterType>> theAndOrParams) {
DbColumn pidColumn = theSourceJoinColumn;
if (pidColumn == null) {
BaseJoiningPredicateBuilder predicateBuilder = mySqlBuilder.getOrCreateFirstPredicateBuilder();
pidColumn = predicateBuilder.getResourceIdColumn();
}
// we don't support any modifiers for now
Set<Long> pids = theAndOrParams.stream()
.map(orList -> orList.stream()
.map(v -> v.getValueAsQueryToken(myFhirContext))
.map(Long::valueOf)
.collect(Collectors.toSet()))
.reduce(Sets::intersection)
.orElse(Set.of());
if (pids.isEmpty()) {
mySqlBuilder.setMatchNothing();
return null;
}
return toEqualToOrInPredicate(pidColumn, mySqlBuilder.generatePlaceholders(pids));
}
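A small sketch of the AND/OR semantics implemented above (hypothetical values; Sets is Guava's com.google.common.collect.Sets, already used by the method): each AND-level list is reduced with Sets::intersection, so only pids present in every list survive.

// Hypothetical request: ?_pid=100,101,102&_pid=101,102,103
Set<Long> first = Set.of(100L, 101L, 102L);
Set<Long> second = Set.of(101L, 102L, 103L);
Set<Long> matched = Sets.intersection(first, second); // -> {101, 102}
// An empty intersection short-circuits the query via setMatchNothing().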
private Condition createReverseSearchPredicateLastUpdated(
List<List<IQueryParameterType>> theAndOrParams, DbColumn theSourceColumn) {
ResourceTablePredicateBuilder resourceTableJoin =
mySqlBuilder.addResourceTablePredicateBuilder(theSourceColumn);
List<Condition> andPredicates = new ArrayList<>(theAndOrParams.size());
for (List<IQueryParameterType> aList : theAndOrParams) {
if (!aList.isEmpty()) {
DateParam dateParam = (DateParam) aList.get(0);
DateRangeParam dateRangeParam = new DateRangeParam(dateParam);
Condition aCondition = mySqlBuilder.addPredicateLastUpdated(dateRangeParam, resourceTableJoin);
andPredicates.add(aCondition);
}
}
return toAndPredicate(andPredicates);
}
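For context, a rough sketch of how one AND-level value from the reverse-chain query named in the comment above becomes a date range (hypothetical date; DateParam and DateRangeParam are the HAPI types used in the method):

// Hypothetical request: /Patient?_has:Encounter:patient:_lastUpdated=ge2023-10-24
DateParam dateParam = new DateParam("ge2023-10-24");
DateRangeParam range = new DateRangeParam(dateParam);
// Each AND-level _lastUpdated value yields one such range, and the resulting predicates
// are ANDed against the joined HFJ_RESOURCE.RES_UPDATED column.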
@Nullable
private Condition createPredicateSearchParameter(
@Nullable DbColumn theSourceJoinColumn,
@@ -3020,4 +3096,82 @@ public class QueryStack {
theParamDefinition, myOrValues, myLeafTarget, myLeafParamName, myLeafPathPrefix, myQualifiers);
}
}
public static class SearchForIdsParams {
DbColumn mySourceJoinColumn;
String myResourceName;
String myParamName;
List<List<IQueryParameterType>> myAndOrParams;
RequestDetails myRequest;
RequestPartitionId myRequestPartitionId;
ResourceTablePredicateBuilder myResourceTablePredicateBuilder;
public static SearchForIdsParams with() {
return new SearchForIdsParams();
}
public DbColumn getSourceJoinColumn() {
return mySourceJoinColumn;
}
public SearchForIdsParams setSourceJoinColumn(DbColumn theSourceJoinColumn) {
mySourceJoinColumn = theSourceJoinColumn;
return this;
}
public String getResourceName() {
return myResourceName;
}
public SearchForIdsParams setResourceName(String theResourceName) {
myResourceName = theResourceName;
return this;
}
public String getParamName() {
return myParamName;
}
public SearchForIdsParams setParamName(String theParamName) {
myParamName = theParamName;
return this;
}
public List<List<IQueryParameterType>> getAndOrParams() {
return myAndOrParams;
}
public SearchForIdsParams setAndOrParams(List<List<IQueryParameterType>> theAndOrParams) {
myAndOrParams = theAndOrParams;
return this;
}
public RequestDetails getRequest() {
return myRequest;
}
public SearchForIdsParams setRequest(RequestDetails theRequest) {
myRequest = theRequest;
return this;
}
public RequestPartitionId getRequestPartitionId() {
return myRequestPartitionId;
}
public SearchForIdsParams setRequestPartitionId(RequestPartitionId theRequestPartitionId) {
myRequestPartitionId = theRequestPartitionId;
return this;
}
public ResourceTablePredicateBuilder getResourceTablePredicateBuilder() {
return myResourceTablePredicateBuilder;
}
public SearchForIdsParams setResourceTablePredicateBuilder(
ResourceTablePredicateBuilder theResourceTablePredicateBuilder) {
myResourceTablePredicateBuilder = theResourceTablePredicateBuilder;
return this;
}
}
}
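Taken together, callers now hand QueryStack a single parameter object instead of a long positional argument list. A representative sketch (hypothetical call-site variables) mirroring the call sites further below:

// Hypothetical usage of the fluent factory shown above.
Condition predicate = queryStack.searchForIdsWithAndOr(with()
        .setResourceName("Patient")
        .setParamName("identifier")
        .setAndOrParams(andOrParams)
        .setRequest(requestDetails)
        .setRequestPartitionId(requestPartitionId));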


@@ -131,6 +131,7 @@ import javax.persistence.criteria.CriteriaBuilder;
import static ca.uhn.fhir.jpa.model.util.JpaConstants.UNDESIRED_RESOURCE_LINKAGES_FOR_EVERYTHING_ON_PATIENT_INSTANCE;
import static ca.uhn.fhir.jpa.search.builder.QueryStack.LOCATION_POSITION;
import static ca.uhn.fhir.jpa.search.builder.QueryStack.SearchForIdsParams.with;
import static org.apache.commons.lang3.StringUtils.defaultString;
import static org.apache.commons.lang3.StringUtils.isBlank;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
@@ -281,14 +282,11 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
continue;
}
List<List<IQueryParameterType>> andOrParams = myParams.get(nextParamName);
Condition predicate = theQueryStack.searchForIdsWithAndOr(with().setResourceName(myResourceName)
.setParamName(nextParamName)
.setAndOrParams(andOrParams)
.setRequest(theRequest)
.setRequestPartitionId(myRequestPartitionId));
if (predicate != null) {
theSearchSqlBuilder.addPredicate(predicate);
}
@@ -840,6 +838,10 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
theQueryStack.addSortOnResourceId(ascending);
} else if (Constants.PARAM_PID.equals(theSort.getParamName())) {
theQueryStack.addSortOnResourcePID(ascending);
} else if (Constants.PARAM_LASTUPDATED.equals(theSort.getParamName())) {
theQueryStack.addSortOnLastUpdated(ascending);


@@ -1,51 +0,0 @@
/*-
* #%L
* HAPI FHIR JPA Server
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.jpa.search.builder.predicate;
import ca.uhn.fhir.jpa.search.builder.sql.SearchQueryBuilder;
import com.healthmarketscience.sqlbuilder.dbspec.basic.DbColumn;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class ForcedIdPredicateBuilder extends BaseJoiningPredicateBuilder {
private static final Logger ourLog = LoggerFactory.getLogger(ForcedIdPredicateBuilder.class);
private final DbColumn myColumnResourceId;
private final DbColumn myColumnForcedId;
/**
* Constructor
*/
public ForcedIdPredicateBuilder(SearchQueryBuilder theSearchSqlBuilder) {
super(theSearchSqlBuilder, theSearchSqlBuilder.addTable("HFJ_FORCED_ID"));
myColumnResourceId = getTable().addColumn("RESOURCE_PID");
myColumnForcedId = getTable().addColumn("FORCED_ID");
}
@Override
public DbColumn getResourceIdColumn() {
return myColumnResourceId;
}
public DbColumn getColumnForcedId() {
return myColumnForcedId;
}
}


@@ -51,7 +51,6 @@ import ca.uhn.fhir.model.primitive.IdDt;
import ca.uhn.fhir.parser.DataFormatException;
import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.RestSearchParameterTypeEnum;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.param.ReferenceParam;
import ca.uhn.fhir.rest.param.TokenParam;
@@ -87,6 +86,7 @@ import java.util.stream.Collectors;
import javax.annotation.Nonnull;
import javax.annotation.Nullable;
import static ca.uhn.fhir.jpa.search.builder.QueryStack.SearchForIdsParams.with;
import static org.apache.commons.lang3.StringUtils.isBlank;
import static org.apache.commons.lang3.StringUtils.trim;
@@ -456,14 +456,13 @@ public class ResourceLinkPredicateBuilder extends BaseJoiningPredicateBuilder im
List<Condition> andPredicates = new ArrayList<>();
List<List<IQueryParameterType>> chainParamValues = Collections.singletonList(orValues);
andPredicates.add(
childQueryFactory.searchForIdsWithAndOr(with().setSourceJoinColumn(myColumnTargetResourceId)
.setResourceName(subResourceName)
.setParamName(chain)
.setAndOrParams(chainParamValues)
.setRequest(theRequest)
.setRequestPartitionId(theRequestPartitionId)));
orPredicates.add(QueryParameterUtils.toAndPredicate(andPredicates));
}


@@ -19,6 +19,7 @@
*/
package ca.uhn.fhir.jpa.search.builder.predicate;
import ca.uhn.fhir.jpa.model.entity.ResourceTable;
import ca.uhn.fhir.jpa.search.builder.sql.SearchQueryBuilder;
import ca.uhn.fhir.jpa.util.QueryParameterUtils;
import com.healthmarketscience.sqlbuilder.BinaryCondition;
@@ -35,6 +36,7 @@ public class ResourceTablePredicateBuilder extends BaseJoiningPredicateBuilder {
private final DbColumn myColumnResType;
private final DbColumn myColumnLastUpdated;
private final DbColumn myColumnLanguage;
private final DbColumn myColumnFhirId;
/**
* Constructor
@@ -42,10 +44,11 @@ public class ResourceTablePredicateBuilder extends BaseJoiningPredicateBuilder {
public ResourceTablePredicateBuilder(SearchQueryBuilder theSearchSqlBuilder) {
super(theSearchSqlBuilder, theSearchSqlBuilder.addTable("HFJ_RESOURCE"));
myColumnResId = getTable().addColumn("RES_ID");
myColumnResType = getTable().addColumn(ResourceTable.RES_TYPE);
myColumnResDeletedAt = getTable().addColumn("RES_DELETED_AT");
myColumnLastUpdated = getTable().addColumn("RES_UPDATED");
myColumnLanguage = getTable().addColumn("RES_LANGUAGE");
myColumnFhirId = getTable().addColumn(ResourceTable.FHIR_ID);
}
@Override
@@ -77,4 +80,8 @@ public class ResourceTablePredicateBuilder extends BaseJoiningPredicateBuilder {
public DbColumn getColumnLastUpdated() {
return myColumnLastUpdated;
}
public DbColumn getColumnFhirId() {
return myColumnFhirId;
}
}


@@ -32,7 +32,6 @@ import ca.uhn.fhir.jpa.search.builder.predicate.ComboNonUniqueSearchParameterPre
import ca.uhn.fhir.jpa.search.builder.predicate.ComboUniqueSearchParameterPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.CoordsPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.DatePredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.NumberPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.QuantityNormalizedPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.QuantityPredicateBuilder;
@@ -62,7 +61,6 @@ import com.healthmarketscience.sqlbuilder.dbspec.basic.DbJoin;
import com.healthmarketscience.sqlbuilder.dbspec.basic.DbSchema;
import com.healthmarketscience.sqlbuilder.dbspec.basic.DbSpec;
import com.healthmarketscience.sqlbuilder.dbspec.basic.DbTable;
import org.hibernate.dialect.Dialect;
import org.hibernate.dialect.SQLServerDialect;
import org.hibernate.dialect.pagination.AbstractLimitHandler;
@@ -222,18 +220,6 @@ public class SearchQueryBuilder {
return mySqlBuilderFactory.dateIndexTable(this);
}
/**
* Add and return a predicate builder for selecting a forced ID. This is only intended for use with sorts so it can not
* be the root query.
*/
public ForcedIdPredicateBuilder addForcedIdPredicateBuilder(@Nonnull DbColumn theSourceJoinColumn) {
Validate.isTrue(theSourceJoinColumn != null);
ForcedIdPredicateBuilder retVal = mySqlBuilderFactory.newForcedIdPredicateBuilder(this);
addTableForSorting(retVal, theSourceJoinColumn);
return retVal;
}
/**
* Create, add and return a predicate builder (or a root query if no root query exists yet) for selecting on a NUMBER search parameter
*/
@@ -417,11 +403,6 @@
addTable(thePredicateBuilder, theSourceJoinColumn, SelectQuery.JoinType.INNER);
}
private void addTableForSorting(
BaseJoiningPredicateBuilder thePredicateBuilder, @Nullable DbColumn theSourceJoinColumn) {
addTable(thePredicateBuilder, theSourceJoinColumn, SelectQuery.JoinType.LEFT_OUTER);
}
private void addTable(
BaseJoiningPredicateBuilder thePredicateBuilder,
@Nullable DbColumn theSourceJoinColumn,
@@ -699,15 +680,24 @@
public ComboCondition addPredicateLastUpdated(DateRangeParam theDateRange) {
ResourceTablePredicateBuilder resourceTableRoot = getOrCreateResourceTablePredicateBuilder(false);
return addPredicateLastUpdated(theDateRange, resourceTableRoot);
}
public ComboCondition addPredicateLastUpdated(
DateRangeParam theDateRange, ResourceTablePredicateBuilder theResourceTablePredicateBuilder) {
List<Condition> conditions = new ArrayList<>(2);
BinaryCondition condition;
if (isNotEqualsComparator(theDateRange)) {
condition = createConditionForValueWithComparator(
LESSTHAN,
theResourceTablePredicateBuilder.getLastUpdatedColumn(),
theDateRange.getLowerBoundAsInstant());
conditions.add(condition);
condition = createConditionForValueWithComparator(
GREATERTHAN,
theResourceTablePredicateBuilder.getLastUpdatedColumn(),
theDateRange.getUpperBoundAsInstant());
conditions.add(condition);
return ComboCondition.or(conditions.toArray(new Condition[0]));
}
@@ -715,7 +705,7 @@
if (theDateRange.getLowerBoundAsInstant() != null) {
condition = createConditionForValueWithComparator(
GREATERTHAN_OR_EQUALS,
theResourceTablePredicateBuilder.getLastUpdatedColumn(),
theDateRange.getLowerBoundAsInstant());
conditions.add(condition);
}
@@ -723,7 +713,7 @@
if (theDateRange.getUpperBoundAsInstant() != null) {
condition = createConditionForValueWithComparator(
LESSTHAN_OR_EQUALS,
theResourceTablePredicateBuilder.getLastUpdatedColumn(),
theDateRange.getUpperBoundAsInstant());
conditions.add(condition);
}
@@ -757,7 +747,7 @@
List<Long> excludePids = JpaPid.toLongList(theExistingPidSetToExclude);
ourLog.trace("excludePids = {}", excludePids);
DbColumn resourceIdColumn = getOrCreateFirstPredicateBuilder().getResourceIdColumn();
InCondition predicate = new InCondition(resourceIdColumn, generatePlaceholders(excludePids));


@@ -24,7 +24,6 @@ import ca.uhn.fhir.jpa.search.builder.predicate.ComboNonUniqueSearchParameterPre
import ca.uhn.fhir.jpa.search.builder.predicate.ComboUniqueSearchParameterPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.CoordsPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.DatePredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.NumberPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.QuantityNormalizedPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.QuantityPredicateBuilder;
@@ -63,10 +62,6 @@ public class SqlObjectFactory {
return myApplicationContext.getBean(DatePredicateBuilder.class, theSearchSqlBuilder);
}
public ForcedIdPredicateBuilder newForcedIdPredicateBuilder(SearchQueryBuilder theSearchSqlBuilder) {
return myApplicationContext.getBean(ForcedIdPredicateBuilder.class, theSearchSqlBuilder);
}
public NumberPredicateBuilder numberIndexTable(SearchQueryBuilder theSearchSqlBuilder) {
return myApplicationContext.getBean(NumberPredicateBuilder.class, theSearchSqlBuilder);
}


@@ -25,29 +25,35 @@ import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
import ca.uhn.fhir.jpa.dao.data.ISearchDao;
import ca.uhn.fhir.jpa.dao.data.ISearchIncludeDao;
import ca.uhn.fhir.jpa.dao.data.ISearchResultDao;
import ca.uhn.fhir.jpa.dao.data.SearchIdAndResultSize;
import ca.uhn.fhir.jpa.dao.tx.HapiTransactionService;
import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService;
import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService.IExecutionBuilder;
import ca.uhn.fhir.jpa.entity.Search;
import ca.uhn.fhir.jpa.model.search.SearchStatusEnum;
import ca.uhn.fhir.system.HapiSystemProperties;
import com.google.common.annotations.VisibleForTesting;
import org.apache.commons.lang3.Validate;
import org.apache.commons.lang3.time.DateUtils;
import org.hibernate.Session;
import org.hl7.fhir.dstu3.model.InstantType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;
import java.sql.Connection;
import java.time.Instant;
import java.util.Collection;
import java.util.Date;
import java.util.HashSet;
import java.util.Optional;
import java.util.Set;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Stream;
import javax.annotation.Nonnull;
import javax.persistence.EntityManager;
public class DatabaseSearchCacheSvcImpl implements ISearchCacheSvc {
/*
@@ -56,13 +62,12 @@
* type query and this can fail if we have 1000s of params
*/
public static final int DEFAULT_MAX_RESULTS_TO_DELETE_IN_ONE_STMT = 500;
public static final int DEFAULT_MAX_RESULTS_TO_DELETE_IN_ONE_PAS = 50000;
public static final long SEARCH_CLEANUP_JOB_INTERVAL_MILLIS = DateUtils.MILLIS_PER_MINUTE;
public static final int DEFAULT_MAX_DELETE_CANDIDATES_TO_FIND = 2000;
private static final Logger ourLog = LoggerFactory.getLogger(DatabaseSearchCacheSvcImpl.class);
private static int ourMaximumResultsToDeleteInOneStatement = DEFAULT_MAX_RESULTS_TO_DELETE_IN_ONE_STMT;
private static int ourMaximumResultsToDeleteInOneCommit = DEFAULT_MAX_RESULTS_TO_DELETE_IN_ONE_PAS;
private static Long ourNowForUnitTests;
/*
* We give a bit of extra leeway just to avoid race conditions where a query result
@@ -74,6 +79,9 @@
@Autowired
private ISearchDao mySearchDao;
@Autowired
private EntityManager myEntityManager;
@Autowired
private ISearchResultDao mySearchResultDao;
@@ -169,14 +177,249 @@
return Optional.empty();
}
/**
* A transient worker for a single pass through stale-search deletion.
*/
class DeleteRun {
final RequestPartitionId myRequestPartitionId;
final Instant myDeadline;
final Date myCutoffForDeletion;
final Set<Long> myUpdateDeletedFlagBatch = new HashSet<>();
final Set<Long> myDeleteSearchBatch = new HashSet<>();
/** the Search pids of the SearchResults we plan to delete in a chunk */
final Set<Long> myDeleteSearchResultsBatch = new HashSet<>();
/**
* Number of results we have queued up in myDeleteSearchResultsBatch to delete.
* We try to keep this to a reasonable size to avoid long transactions that may escalate to a table lock.
*/
private int myDeleteSearchResultsBatchCount = 0;
DeleteRun(Instant theDeadline, Date theCutoffForDeletion, RequestPartitionId theRequestPartitionId) {
myDeadline = theDeadline;
myCutoffForDeletion = theCutoffForDeletion;
myRequestPartitionId = theRequestPartitionId;
}
/**
* Mark all ids in the mySearchesToMarkForDeletion buffer as deleted, and clear the buffer.
*/
public void flushDeleteMarks() {
if (myUpdateDeletedFlagBatch.isEmpty()) {
return;
}
ourLog.debug("Marking {} searches as deleted", myUpdateDeletedFlagBatch.size());
mySearchDao.updateDeleted(myUpdateDeletedFlagBatch, true);
myUpdateDeletedFlagBatch.clear();
commitOpenChanges();
}
/**
* Dig into the guts of our Hibernate session, flush any changes in the session, and commit the underlying connection.
*/
private void commitOpenChanges() {
// flush to force Hibernate to actually get a connection from the pool
myEntityManager.flush();
// get our connection from the underlying Hibernate session, and commit
//noinspection resource
myEntityManager.unwrap(Session.class).doWork(Connection::commit);
}
void throwIfDeadlineExpired() {
boolean result = Instant.ofEpochMilli(now()).isAfter(myDeadline);
if (result) {
throw new DeadlineException(
Msg.code(2443) + "Deadline expired while cleaning Search cache - " + myDeadline);
}
}
private int deleteMarkedSearchesInBatches() {
AtomicInteger deletedCounter = new AtomicInteger(0);
try (final Stream<SearchIdAndResultSize> toDelete = mySearchDao.findDeleted()) {
assert toDelete != null;
toDelete.forEach(nextSearchToDelete -> {
throwIfDeadlineExpired();
deleteSearchAndResults(nextSearchToDelete.searchId, nextSearchToDelete.size);
deletedCounter.incrementAndGet();
});
}
// flush anything left in the buffers
flushSearchResultDeletes();
flushSearchAndIncludeDeletes();
int deletedCount = deletedCounter.get();
ourLog.info("Deleted {} expired searches", deletedCount);
return deletedCount;
}
/**
* Schedule theSearchPid for deletion assuming it has theNumberOfResults SearchResults attached.
*
* We accumulate a batch of search pids for deletion, and then do a bulk DML as we reach a threshold number
* of SearchResults.
*
* @param theSearchPid pk of the Search
* @param theNumberOfResults the number of SearchResults attached
*/
private void deleteSearchAndResults(long theSearchPid, int theNumberOfResults) {
ourLog.trace("Buffering deletion of search pid {} and {} results", theSearchPid, theNumberOfResults);
myDeleteSearchBatch.add(theSearchPid);
if (theNumberOfResults > ourMaximumResultsToDeleteInOneCommit) {
// don't buffer this one - do it inline
deleteSearchResultsByChunk(theSearchPid, theNumberOfResults);
return;
}
myDeleteSearchResultsBatch.add(theSearchPid);
myDeleteSearchResultsBatchCount += theNumberOfResults;
if (myDeleteSearchResultsBatchCount > ourMaximumResultsToDeleteInOneCommit) {
flushSearchResultDeletes();
}
if (myDeleteSearchBatch.size() > ourMaximumResultsToDeleteInOneStatement) {
// flush the results to make sure we don't have any references.
flushSearchResultDeletes();
flushSearchAndIncludeDeletes();
}
}
/**
* If this Search has more results than our max delete size,
* delete it by itself in range chunks.
* @param theSearchPid the target Search pid
* @param theNumberOfResults the number of search results present
*/
private void deleteSearchResultsByChunk(long theSearchPid, int theNumberOfResults) {
ourLog.debug(
"Search {} is large: has {} results. Deleting results in chunks.",
theSearchPid,
theNumberOfResults);
for (int rangeEnd = theNumberOfResults; rangeEnd >= 0; rangeEnd -= ourMaximumResultsToDeleteInOneCommit) {
int rangeStart = rangeEnd - ourMaximumResultsToDeleteInOneCommit;
ourLog.trace("Deleting results for search {}: {} - {}", theSearchPid, rangeStart, rangeEnd);
mySearchResultDao.deleteBySearchIdInRange(theSearchPid, rangeStart, rangeEnd);
commitOpenChanges();
}
}
private void flushSearchAndIncludeDeletes() {
if (myDeleteSearchBatch.isEmpty()) {
return;
}
ourLog.debug("Deleting {} Search records", myDeleteSearchBatch.size());
// referential integrity requires we delete includes before the search
mySearchIncludeDao.deleteForSearch(myDeleteSearchBatch);
mySearchDao.deleteByPids(myDeleteSearchBatch);
myDeleteSearchBatch.clear();
commitOpenChanges();
}
private void flushSearchResultDeletes() {
if (myDeleteSearchResultsBatch.isEmpty()) {
return;
}
ourLog.debug(
"Deleting {} Search Results from {} searches",
myDeleteSearchResultsBatchCount,
myDeleteSearchResultsBatch.size());
mySearchResultDao.deleteBySearchIds(myDeleteSearchResultsBatch);
myDeleteSearchResultsBatch.clear();
myDeleteSearchResultsBatchCount = 0;
commitOpenChanges();
}
IExecutionBuilder getTxBuilder() {
return myTransactionService.withSystemRequest().withRequestPartitionId(myRequestPartitionId);
}
private void run() {
ourLog.debug("Searching for searches which are before {}", myCutoffForDeletion);
// this tx builder is not really for tx management.
// Instead, it is used to bind a Hibernate session + connection to this thread.
// We will run a streaming query to look for work, and then commit changes in batches during the loops.
getTxBuilder().execute(theStatus -> {
try {
markDeletedInBatches();
throwIfDeadlineExpired();
// Delete searches that are marked as deleted
int deletedCount = deleteMarkedSearchesInBatches();
throwIfDeadlineExpired();
if ((ourLog.isDebugEnabled() || HapiSystemProperties.isTestModeEnabled()) && (deletedCount > 0)) {
Long total = mySearchDao.count();
ourLog.debug("Deleted {} searches, {} remaining", deletedCount, total);
}
} catch (DeadlineException theTimeoutException) {
ourLog.warn(theTimeoutException.getMessage());
}
return null;
});
}
/**
* Stream through a list of pids before our cutoff, and set myDeleted=true in batches in a DML statement.
*/
private void markDeletedInBatches() {
try (Stream<Long> toMarkDeleted =
mySearchDao.findWhereCreatedBefore(myCutoffForDeletion, new Date(now()))) {
assert toMarkDeleted != null;
toMarkDeleted.forEach(nextSearchToDelete -> {
throwIfDeadlineExpired();
if (myUpdateDeletedFlagBatch.size() >= ourMaximumResultsToDeleteInOneStatement) {
flushDeleteMarks();
}
ourLog.trace("Marking search with PID {} as ready for deletion", nextSearchToDelete);
myUpdateDeletedFlagBatch.add(nextSearchToDelete);
});
flushDeleteMarks();
}
}
}
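To illustrate deleteSearchResultsByChunk above with hypothetical numbers: a Search holding 120,000 results against the 50,000-per-commit ceiling is processed in three descending ranges, each followed by commitOpenChanges().

// Illustrative walk-through of the chunking loop (not part of the change):
int results = 120_000;
int chunk = 50_000; // DEFAULT_MAX_RESULTS_TO_DELETE_IN_ONE_PAS
for (int rangeEnd = results; rangeEnd >= 0; rangeEnd -= chunk) {
    int rangeStart = rangeEnd - chunk;
    // passes: [70_000, 120_000], [20_000, 70_000], [-30_000, 20_000]
}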
/**
* Marker to abandon our delete run when we are over time.
*/
private static class DeadlineException extends RuntimeException {
public DeadlineException(String message) {
super(message);
}
}
@Override @Override
public void pollForStaleSearchesAndDeleteThem(RequestPartitionId theRequestPartitionId) { public void pollForStaleSearchesAndDeleteThem(RequestPartitionId theRequestPartitionId, Instant theDeadline) {
HapiTransactionService.noTransactionAllowed(); HapiTransactionService.noTransactionAllowed();
if (!myStorageSettings.isExpireSearchResults()) { if (!myStorageSettings.isExpireSearchResults()) {
return; return;
} }
final Date cutoff = getCutoff();
final DeleteRun run = new DeleteRun(theDeadline, cutoff, theRequestPartitionId);
run.run();
}
@Nonnull
private Date getCutoff() {
long cutoffMillis = myStorageSettings.getExpireSearchResultsAfterMillis();
if (myStorageSettings.getReuseCachedSearchResultsForMillis() != null) {
cutoffMillis = cutoffMillis + myStorageSettings.getReuseCachedSearchResultsForMillis();
@@ -189,108 +432,16 @@ public class DatabaseSearchCacheSvcImpl implements ISearchCacheSvc {
new InstantType(cutoff),
new InstantType(new Date(now())));
}
return cutoff;
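// Worked example of the cutoff arithmetic above: with expireSearchResultsAfterMillis = 60 minutes and
// reuseCachedSearchResultsForMillis = 15 minutes, cutoffMillis is 75 minutes, so only searches created
// more than 75 minutes ago become deletion candidates.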
ourLog.debug("Searching for searches which are before {}", cutoff);
// Mark searches as deleted if they should be
final Slice<Long> toMarkDeleted = myTransactionService
.withSystemRequestOnPartition(theRequestPartitionId)
.execute(theStatus -> mySearchDao.findWhereCreatedBefore(
cutoff, new Date(), PageRequest.of(0, ourMaximumSearchesToCheckForDeletionCandidacy)));
assert toMarkDeleted != null;
for (final Long nextSearchToDelete : toMarkDeleted) {
ourLog.debug("Deleting search with PID {}", nextSearchToDelete);
myTransactionService
.withSystemRequest()
.withRequestPartitionId(theRequestPartitionId)
.execute(t -> {
mySearchDao.updateDeleted(nextSearchToDelete, true);
return null;
});
}
// Delete searches that are marked as deleted
final Slice<Long> toDelete = myTransactionService
.withSystemRequestOnPartition(theRequestPartitionId)
.execute(theStatus ->
mySearchDao.findDeleted(PageRequest.of(0, ourMaximumSearchesToCheckForDeletionCandidacy)));
assert toDelete != null;
for (final Long nextSearchToDelete : toDelete) {
ourLog.debug("Deleting search with PID {}", nextSearchToDelete);
myTransactionService
.withSystemRequest()
.withRequestPartitionId(theRequestPartitionId)
.execute(t -> {
deleteSearch(nextSearchToDelete);
return null;
});
}
int count = toDelete.getContent().size();
if (count > 0) {
if (ourLog.isDebugEnabled() || HapiSystemProperties.isTestModeEnabled()) {
Long total = myTransactionService
.withSystemRequest()
.withRequestPartitionId(theRequestPartitionId)
.execute(t -> mySearchDao.count());
ourLog.debug("Deleted {} searches, {} remaining", count, total);
}
}
}
private void deleteSearch(final Long theSearchPid) {
mySearchDao.findById(theSearchPid).ifPresent(searchToDelete -> {
mySearchIncludeDao.deleteForSearch(searchToDelete.getId());
/*
* Note, we're only deleting up to 500 results in an individual search here. This
* is to prevent really long running transactions in cases where there are
* huge searches with tons of results in them. By the time we've gotten here
* we have marked the parent Search entity as deleted, so it's not such a
* huge deal to be only partially deleting search results. They'll get deleted
* eventually
*/
int max = ourMaximumResultsToDeleteInOnePass;
Slice<Long> resultPids = mySearchResultDao.findForSearch(PageRequest.of(0, max), searchToDelete.getId());
if (resultPids.hasContent()) {
List<List<Long>> partitions =
Lists.partition(resultPids.getContent(), ourMaximumResultsToDeleteInOneStatement);
for (List<Long> nextPartition : partitions) {
mySearchResultDao.deleteByIds(nextPartition);
}
}
// Only delete if we don't have results left in this search
if (resultPids.getNumberOfElements() < max) {
ourLog.debug(
"Deleting search {}/{} - Created[{}]",
searchToDelete.getId(),
searchToDelete.getUuid(),
new InstantType(searchToDelete.getCreated()));
mySearchDao.deleteByPid(searchToDelete.getId());
} else {
ourLog.debug(
"Purged {} search results for deleted search {}/{}",
resultPids.getSize(),
searchToDelete.getId(),
searchToDelete.getUuid());
}
});
}
@VisibleForTesting
public static void setMaximumSearchesToCheckForDeletionCandidacyForUnitTest(
int theMaximumSearchesToCheckForDeletionCandidacy) {
ourMaximumSearchesToCheckForDeletionCandidacy = theMaximumSearchesToCheckForDeletionCandidacy;
}
@VisibleForTesting
public static void setMaximumResultsToDeleteInOnePassForUnitTest(int theMaximumResultsToDeleteInOnePass) {
ourMaximumResultsToDeleteInOneCommit = theMaximumResultsToDeleteInOnePass;
}
@VisibleForTesting
public static void setMaximumResultsToDeleteInOneStatement(int theMaximumResultsToDelete) {
ourMaximumResultsToDeleteInOneStatement = theMaximumResultsToDelete;
}
@@ -302,7 +453,7 @@ public class DatabaseSearchCacheSvcImpl implements ISearchCacheSvc {
ourNowForUnitTests = theNowForUnitTests;
}
public static long now() {
if (ourNowForUnitTests != null) {
return ourNowForUnitTests;
}


@@ -23,6 +23,7 @@ import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.entity.Search;
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.Optional;
public interface ISearchCacheSvc {
@@ -86,5 +87,10 @@ public interface ISearchCacheSvc {
* if they have some other mechanism for expiring stale results other than manually looking for them
* and deleting them.
*/
void pollForStaleSearchesAndDeleteThem(RequestPartitionId theRequestPartitionId, Instant theDeadline);
@Deprecated(since = "6.10", forRemoval = true) // wipmb delete once cdr merges
default void pollForStaleSearchesAndDeleteThem(RequestPartitionId theRequestPartitionId) {
pollForStaleSearchesAndDeleteThem(theRequestPartitionId, Instant.now().plus(1, ChronoUnit.MINUTES));
}
}
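// Illustrative caller of the new deadline-aware signature (hypothetical, not part of this change set):
// a scheduled sweep that bounds each stale-search cleanup pass to roughly one minute of work, mirroring
// the deprecated default method above.
//
//     void sweepStaleSearches(ISearchCacheSvc theSearchCacheSvc) {
//         Instant deadline = Instant.now().plus(1, ChronoUnit.MINUTES);
//         theSearchCacheSvc.pollForStaleSearchesAndDeleteThem(RequestPartitionId.allPartitions(), deadline);
//     }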


@@ -35,6 +35,7 @@ import ca.uhn.fhir.jpa.subscription.async.AsyncResourceModifiedSubmitterSvc;
import ca.uhn.fhir.jpa.subscription.model.ResourceModifiedMessage;
import ca.uhn.fhir.model.primitive.IdDt;
import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
@@ -45,6 +46,7 @@ import org.slf4j.LoggerFactory;
import java.util.Date;
import java.util.List;
import java.util.Optional;
import static ca.uhn.fhir.jpa.model.entity.PersistedResourceModifiedMessageEntityPK.with;
@@ -92,9 +94,43 @@ public class ResourceModifiedMessagePersistenceSvcImpl implements IResourceModif
@Override
public ResourceModifiedMessage inflatePersistedResourceModifiedMessage(
ResourceModifiedMessage theResourceModifiedMessage) {
return inflateResourceModifiedMessageFromEntity(createEntityFrom(theResourceModifiedMessage));
}
@Override
public Optional<ResourceModifiedMessage> inflatePersistedResourceModifiedMessageOrNull(
ResourceModifiedMessage theResourceModifiedMessage) {
ResourceModifiedMessage inflatedResourceModifiedMessage = null;
try {
inflatedResourceModifiedMessage = inflatePersistedResourceModifiedMessage(theResourceModifiedMessage);
} catch (ResourceNotFoundException e) {
IdDt idDt = new IdDt(
theResourceModifiedMessage.getPayloadType(myFhirContext),
theResourceModifiedMessage.getPayloadId(),
theResourceModifiedMessage.getPayloadVersion());
ourLog.warn("Scheduled submission will be ignored since resource {} cannot be found", idDt.getIdPart(), e);
} catch (Exception ex) {
ourLog.error("Unknown error encountered on inflation of resources.", ex);
}
return Optional.ofNullable(inflatedResourceModifiedMessage);
}
@Override
public ResourceModifiedMessage createResourceModifiedMessageFromEntityWithoutInflation(
IPersistedResourceModifiedMessage thePersistedResourceModifiedMessage) {
ResourceModifiedMessage resourceModifiedMessage = getPayloadLessMessageFromString(
((ResourceModifiedEntity) thePersistedResourceModifiedMessage).getSummaryResourceModifiedMessage());
IdDt resourceId =
createIdDtFromResourceModifiedEntity((ResourceModifiedEntity) thePersistedResourceModifiedMessage);
resourceModifiedMessage.setPayloadId(resourceId);
return resourceModifiedMessage;
}
@Override
@@ -112,17 +148,13 @@ public class ResourceModifiedMessagePersistenceSvcImpl implements IResourceModif
protected ResourceModifiedMessage inflateResourceModifiedMessageFromEntity(
ResourceModifiedEntity theResourceModifiedEntity) {
String resourcePid =
theResourceModifiedEntity.getResourceModifiedEntityPK().getResourcePid();
String resourceVersion =
theResourceModifiedEntity.getResourceModifiedEntityPK().getResourceVersion();
String resourceType = theResourceModifiedEntity.getResourceType();
ResourceModifiedMessage retVal =
getPayloadLessMessageFromString(theResourceModifiedEntity.getSummaryResourceModifiedMessage());
SystemRequestDetails systemRequestDetails =
new SystemRequestDetails().setRequestPartitionId(retVal.getPartitionId());
IdDt resourceIdDt = createIdDtFromResourceModifiedEntity(theResourceModifiedEntity);
IFhirResourceDao dao = myDaoRegistry.getResourceDao(resourceType);
IBaseResource iBaseResource = dao.read(resourceIdDt, systemRequestDetails, true);
@@ -164,6 +196,16 @@ public class ResourceModifiedMessagePersistenceSvcImpl implements IResourceModif
}
}
private IdDt createIdDtFromResourceModifiedEntity(ResourceModifiedEntity theResourceModifiedEntity) {
String resourcePid =
theResourceModifiedEntity.getResourceModifiedEntityPK().getResourcePid();
String resourceVersion =
theResourceModifiedEntity.getResourceModifiedEntityPK().getResourceVersion();
String resourceType = theResourceModifiedEntity.getResourceType();
return new IdDt(resourceType, resourcePid, resourceVersion);
}
private static class PayloadLessResourceModifiedMessage extends ResourceModifiedMessage {
public PayloadLessResourceModifiedMessage(ResourceModifiedMessage theMsg) {


@@ -2979,7 +2979,7 @@ public class TermReadSvcImpl implements ITermReadSvc, IHasScheduledJobs {
if (resultList.size() > 1)
throw new NonUniqueResultException(Msg.code(911) + "More than one CodeSystem is pointed by forcedId: "
+ theForcedId + ". Was constraint " + ResourceTable.IDX_RES_TYPE_FHIR_ID + " removed?");
IFhirResourceDao<CodeSystem> csDao = myDaoRegistry.getResourceDao("CodeSystem");
IBaseResource cs = myJpaStorageResourceParser.toResource(resultList.get(0), false);


@@ -5,7 +5,6 @@ import ca.uhn.fhir.context.FhirVersionEnum;
import ca.uhn.fhir.context.support.IValidationSupport;
import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.api.model.DaoMethodOutcome;
import ca.uhn.fhir.jpa.dao.data.INpmPackageVersionDao;
import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService;
import ca.uhn.fhir.jpa.dao.tx.NonTransactionalHapiTransactionService;
@@ -13,18 +12,24 @@ import ca.uhn.fhir.jpa.model.config.PartitionSettings;
import ca.uhn.fhir.jpa.packages.loader.PackageResourceParsingSvc;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.jpa.searchparam.registry.ISearchParamRegistryController;
import ca.uhn.fhir.jpa.searchparam.util.SearchParameterHelper;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.server.SimpleBundleProvider;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.r4.model.CodeSystem;
import org.hl7.fhir.r4.model.CodeType;
import org.hl7.fhir.r4.model.Communication;
import org.hl7.fhir.r4.model.DocumentReference;
import org.hl7.fhir.r4.model.Enumerations;
import org.hl7.fhir.r4.model.IdType;
import org.hl7.fhir.r4.model.SearchParameter;
import org.hl7.fhir.r4.model.Subscription;
import org.hl7.fhir.utilities.npm.NpmPackage;
import org.hl7.fhir.utilities.npm.PackageGenerator;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.MethodSource;
import org.mockito.ArgumentCaptor;
import org.mockito.Captor;
import org.mockito.InjectMocks;
@@ -36,6 +41,9 @@ import javax.annotation.Nonnull;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Iterator;
import java.util.List;
import java.util.Optional;
@@ -45,12 +53,10 @@ import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.verifyNoInteractions;
import static org.mockito.Mockito.when;
@ExtendWith(MockitoExtension.class)
public class PackageInstallerSvcImplTest {
public static final String PACKAGE_VERSION = "1.0";
public static final String PACKAGE_ID_1 = "package1";
@@ -65,7 +71,13 @@ public class PackageInstallerSvcImplTest {
@Mock
private IFhirResourceDao<CodeSystem> myCodeSystemDao;
@Mock
private IFhirResourceDao<SearchParameter> mySearchParameterDao;
@Mock
private IValidationSupport myIValidationSupport;
@Mock
private SearchParameterHelper mySearchParameterHelper;
@Mock
private SearchParameterMap mySearchParameterMap;
@Spy
private FhirContext myCtx = FhirContext.forR4Cached();
@Spy
@@ -77,6 +89,15 @@ public class PackageInstallerSvcImplTest {
@InjectMocks
private PackageInstallerSvcImpl mySvc;
@Captor
private ArgumentCaptor<SearchParameterMap> mySearchParameterMapCaptor;
@Captor
private ArgumentCaptor<CodeSystem> myCodeSystemCaptor;
@Captor
private ArgumentCaptor<SearchParameter> mySearchParameterCaptor;
@Captor
private ArgumentCaptor<RequestDetails> myRequestDetailsCaptor;
@Test
public void testPackageCompatibility() {
mySvc.assertFhirVersionsAreCompatible("R4", "R4B");
@@ -206,19 +227,7 @@ public class PackageInstallerSvcImplTest {
cs.setUrl("http://my-code-system");
cs.setContent(CodeSystem.CodeSystemContentMode.COMPLETE);
PackageInstallationSpec spec = setupResourceInPackage(existingCs, cs, myCodeSystemDao);
NpmPackage pkg = createPackage(cs, PACKAGE_ID_1);
when(myPackageVersionDao.findByPackageIdAndVersion(any(), any())).thenReturn(Optional.empty());
when(myPackageCacheManager.installPackage(any())).thenReturn(pkg);
when(myDaoRegistry.getResourceDao(CodeSystem.class)).thenReturn(myCodeSystemDao);
when(myCodeSystemDao.search(any(), any())).thenReturn(new SimpleBundleProvider(existingCs));
when(myCodeSystemDao.update(any(),any(RequestDetails.class))).thenReturn(new DaoMethodOutcome());
PackageInstallationSpec spec = new PackageInstallationSpec();
spec.setName(PACKAGE_ID_1);
spec.setVersion(PACKAGE_VERSION);
spec.setInstallMode(PackageInstallationSpec.InstallModeEnum.STORE_AND_INSTALL);
spec.setPackageContents(packageToBytes(pkg));
// Test
mySvc.install(spec);
@@ -233,34 +242,108 @@ public class PackageInstallerSvcImplTest {
assertEquals("existingcs", codeSystem.getIdPart());
}
public enum InstallType {
CREATE, UPDATE_WITH_EXISTING, UPDATE, UPDATE_OVERRIDE
}
@Nonnull
private static byte[] packageToBytes(NpmPackage pkg) throws IOException {
ByteArrayOutputStream stream = new ByteArrayOutputStream();
pkg.save(stream);
byte[] bytes = stream.toByteArray();
return bytes;
}
public static List<Object[]> parameters() {
return List.of(
new Object[]{null, null, null, List.of("Patient"), InstallType.CREATE},
new Object[]{null, null, "us-core-patient-given", List.of("Patient"), InstallType.UPDATE},
new Object[]{"individual-given", List.of("Patient", "Practitioner"), "us-core-patient-given", List.of("Patient"), InstallType.UPDATE_WITH_EXISTING},
new Object[]{"patient-given", List.of("Patient"), "us-core-patient-given", List.of("Patient"), InstallType.UPDATE_OVERRIDE}
);
}
@ParameterizedTest
@MethodSource("parameters")
public void testCreateOrUpdate_withSearchParameter(String theExistingId, Collection<String> theExistingBase,
String theInstallId, Collection<String> theInstallBase,
InstallType theInstallType) throws IOException {
// Setup
SearchParameter existingSP = null;
if (theExistingId != null) {
existingSP = createSearchParameter(theExistingId, theExistingBase);
}
SearchParameter installSP = createSearchParameter(theInstallId, theInstallBase);
PackageInstallationSpec spec = setupResourceInPackage(existingSP, installSP, mySearchParameterDao);
// Test
mySvc.install(spec);
// Verify
if (theInstallType == InstallType.CREATE) {
verify(mySearchParameterDao, times(1)).create(mySearchParameterCaptor.capture(), myRequestDetailsCaptor.capture());
} else if (theInstallType == InstallType.UPDATE_WITH_EXISTING) {
verify(mySearchParameterDao, times(2)).update(mySearchParameterCaptor.capture(), myRequestDetailsCaptor.capture());
} else {
verify(mySearchParameterDao, times(1)).update(mySearchParameterCaptor.capture(), myRequestDetailsCaptor.capture());
}
Iterator<SearchParameter> iteratorSP = mySearchParameterCaptor.getAllValues().iterator();
if (theInstallType == InstallType.UPDATE_WITH_EXISTING) {
SearchParameter capturedSP = iteratorSP.next();
assertEquals(theExistingId, capturedSP.getIdPart());
List<String> expectedBase = new ArrayList<>(theExistingBase);
expectedBase.removeAll(theInstallBase);
assertEquals(expectedBase, capturedSP.getBase().stream().map(CodeType::getCode).toList());
}
SearchParameter capturedSP = iteratorSP.next();
if (theInstallType == InstallType.UPDATE_OVERRIDE) {
assertEquals(theExistingId, capturedSP.getIdPart());
} else {
assertEquals(theInstallId, capturedSP.getIdPart());
}
assertEquals(theInstallBase, capturedSP.getBase().stream().map(CodeType::getCode).toList());
}
private PackageInstallationSpec setupResourceInPackage(IBaseResource myExistingResource, IBaseResource myInstallResource,
IFhirResourceDao myFhirResourceDao) throws IOException {
NpmPackage pkg = createPackage(myInstallResource, myInstallResource.getClass().getSimpleName());
when(myPackageVersionDao.findByPackageIdAndVersion(any(), any())).thenReturn(Optional.empty());
when(myPackageCacheManager.installPackage(any())).thenReturn(pkg);
when(myDaoRegistry.getResourceDao(myInstallResource.getClass())).thenReturn(myFhirResourceDao);
when(myFhirResourceDao.search(any(), any())).thenReturn(myExistingResource != null ?
new SimpleBundleProvider(myExistingResource) : new SimpleBundleProvider());
if (myInstallResource.getClass().getSimpleName().equals("SearchParameter")) {
when(mySearchParameterHelper.buildSearchParameterMapFromCanonical(any())).thenReturn(Optional.of(mySearchParameterMap));
}
PackageInstallationSpec spec = new PackageInstallationSpec();
spec.setName(PACKAGE_ID_1);
spec.setVersion(PACKAGE_VERSION);
spec.setInstallMode(PackageInstallationSpec.InstallModeEnum.STORE_AND_INSTALL);
ByteArrayOutputStream stream = new ByteArrayOutputStream();
pkg.save(stream);
spec.setPackageContents(stream.toByteArray());
return spec;
}
@Nonnull
private NpmPackage createPackage(IBaseResource theResource, String theResourceType) {
PackageGenerator manifestGenerator = new PackageGenerator();
manifestGenerator.name(PACKAGE_ID_1);
manifestGenerator.version(PACKAGE_VERSION);
manifestGenerator.description("a package");
manifestGenerator.fhirVersions(List.of(FhirVersionEnum.R4.getFhirVersionString()));
String csString = myCtx.newJsonParser().encodeResourceToString(theResource);
NpmPackage pkg = NpmPackage.empty(manifestGenerator);
pkg.addFile("package", theResourceType + ".json", csString.getBytes(StandardCharsets.UTF_8), theResourceType);
String csString = myCtx.newJsonParser().encodeResourceToString(cs);
pkg.addFile("package", "cs.json", csString.getBytes(StandardCharsets.UTF_8), "CodeSystem");
return pkg;
}
private static SearchParameter createSearchParameter(String theId, Collection<String> theBase) {
SearchParameter searchParameter = new SearchParameter();
if (theId != null) {
searchParameter.setId(new IdType("SearchParameter", theId));
}
searchParameter.setCode("someCode");
theBase.forEach(base -> searchParameter.getBase().add(new CodeType(base)));
searchParameter.setExpression("someExpression");
return searchParameter;
}
}
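// The UPDATE_WITH_EXISTING scenario above encodes the expected base-merging behaviour: entries shared
// with the incoming SearchParameter are removed from the existing resource's base, and the incoming
// resource keeps its own base. A condensed sketch of that expectation (hypothetical helper, not the
// production installer code):
//
//     static List<String> remainingBaseForExisting(Collection<String> theExistingBase, Collection<String> theInstallBase) {
//         List<String> remaining = new ArrayList<>(theExistingBase);
//         remaining.removeAll(theInstallBase);
//         return remaining; // e.g. ["Patient", "Practitioner"] minus ["Patient"] -> ["Practitioner"]
//     }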


@@ -22,6 +22,7 @@ import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;
import static org.hamcrest.MatcherAssert.assertThat;
@@ -72,9 +73,9 @@ public class FhirResourceDaoR4SearchLastNAsyncIT extends BaseR4SearchLastN {
public void testLastNChunking() {
runInTransaction(() -> {
Set<Long> all = mySearchDao.findAll().stream().map(Search::getId).collect(Collectors.toSet());
mySearchDao.updateDeleted(all, true);
});
// Set up search parameters that will return 75 Observations.


@@ -76,7 +76,7 @@ import javax.persistence.Transient;
import javax.persistence.UniqueConstraint;
import javax.persistence.Version;
import static ca.uhn.fhir.jpa.model.entity.ResourceTable.IDX_RES_TYPE_FHIR_ID;
@Indexed(routingBinder = @RoutingBinderRef(type = ResourceTableRoutingBinder.class))
@Entity
@@ -84,12 +84,13 @@ import static ca.uhn.fhir.jpa.model.entity.ResourceTable.IDX_RES_FHIR_ID;
name = ResourceTable.HFJ_RESOURCE,
uniqueConstraints = {
@UniqueConstraint(
name = IDX_RES_TYPE_FHIR_ID,
columnNames = {"RES_TYPE", "FHIR_ID"})
},
indexes = {
// Do not reuse previously used index name: IDX_INDEXSTATUS, IDX_RES_TYPE
@Index(name = "IDX_RES_DATE", columnList = BaseHasResource.RES_UPDATED),
@Index(name = "IDX_RES_FHIR_ID", columnList = "FHIR_ID"),
@Index(
name = "IDX_RES_TYPE_DEL_UPDATED",
columnList = "RES_TYPE,RES_DELETED_AT,RES_UPDATED,PARTITION_ID,RES_ID"),
@@ -100,10 +101,11 @@ public class ResourceTable extends BaseHasResource implements Serializable, IBas
public static final int RESTYPE_LEN = 40;
public static final String HFJ_RESOURCE = "HFJ_RESOURCE";
public static final String RES_TYPE = "RES_TYPE";
public static final String FHIR_ID = "FHIR_ID";
private static final int MAX_LANGUAGE_LENGTH = 20;
private static final long serialVersionUID = 1L;
public static final int MAX_FORCED_ID_LENGTH = 100;
public static final String IDX_RES_TYPE_FHIR_ID = "IDX_RES_TYPE_FHIR_ID";
/**
* Holds the narrative text only - Used for Fulltext searching but not directly stored in the DB
@@ -381,7 +383,7 @@ public class ResourceTable extends BaseHasResource implements Serializable, IBas
* Will be null during insert time until the first read.
*/
@Column(
name = FHIR_ID,
// [A-Za-z0-9\-\.]{1,64} - https://www.hl7.org/fhir/datatypes.html#id
length = 64,
// we never update this after insert, and the Generator will otherwise "dirty" the object.


@@ -51,6 +51,8 @@ public class ResourceMetaParams {
Map<String, Class<? extends IQueryParameterAnd<?>>> resourceMetaAndParams = new HashMap<>();
resourceMetaParams.put(IAnyResource.SP_RES_ID, StringParam.class);
resourceMetaAndParams.put(IAnyResource.SP_RES_ID, StringAndListParam.class);
resourceMetaParams.put(Constants.PARAM_PID, TokenParam.class);
resourceMetaAndParams.put(Constants.PARAM_PID, TokenAndListParam.class);
resourceMetaParams.put(Constants.PARAM_TAG, TokenParam.class);
resourceMetaAndParams.put(Constants.PARAM_TAG, TokenAndListParam.class);
resourceMetaParams.put(Constants.PARAM_PROFILE, UriParam.class);


@@ -33,7 +33,9 @@ import ca.uhn.fhir.jpa.subscription.match.registry.ActiveSubscription;
import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionRegistry;
import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscription;
import ca.uhn.fhir.jpa.subscription.model.ResourceDeliveryMessage;
import ca.uhn.fhir.jpa.subscription.model.ResourceModifiedMessage;
import ca.uhn.fhir.rest.api.server.IBundleProvider;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import ca.uhn.fhir.util.BundleBuilder;
import com.google.common.annotations.VisibleForTesting;
import org.apache.commons.text.StringSubstitutor;
@@ -48,6 +50,7 @@ import org.springframework.messaging.MessagingException;
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import static ca.uhn.fhir.jpa.subscription.util.SubscriptionUtil.createRequestDetailForPartitionedRequest;
@@ -60,6 +63,9 @@ public abstract class BaseSubscriptionDeliverySubscriber implements MessageHandl
@Autowired
protected SubscriptionRegistry mySubscriptionRegistry;
@Autowired
protected IResourceModifiedMessagePersistenceSvc myResourceModifiedMessagePersistenceSvc;
@Autowired
private IInterceptorBroadcaster myInterceptorBroadcaster;
@@ -149,6 +155,13 @@ public abstract class BaseSubscriptionDeliverySubscriber implements MessageHandl
return builder.getBundle();
}
protected Optional<ResourceModifiedMessage> inflateResourceModifiedMessageFromDeliveryMessage(
ResourceDeliveryMessage theMsg) {
ResourceModifiedMessage payloadLess =
new ResourceModifiedMessage(theMsg.getPayloadId(myFhirContext), theMsg.getOperationType());
return myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessageOrNull(payloadLess);
}
@VisibleForTesting
public void setFhirContextForUnitTest(FhirContext theCtx) {
myFhirContext = theCtx;
@@ -174,6 +187,12 @@ public abstract class BaseSubscriptionDeliverySubscriber implements MessageHandl
myMatchUrlService = theMatchUrlService;
}
@VisibleForTesting
public void setResourceModifiedMessagePersistenceSvcForUnitTest(
IResourceModifiedMessagePersistenceSvc theResourceModifiedMessagePersistenceSvc) {
myResourceModifiedMessagePersistenceSvc = theResourceModifiedMessagePersistenceSvc;
}
public IInterceptorBroadcaster getInterceptorBroadcaster() {
return myInterceptorBroadcaster;
}


@@ -24,6 +24,7 @@ import ca.uhn.fhir.jpa.model.entity.StorageSettings;
import ca.uhn.fhir.jpa.subscription.match.deliver.BaseSubscriptionDeliverySubscriber;
import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscription;
import ca.uhn.fhir.jpa.subscription.model.ResourceDeliveryMessage;
import ca.uhn.fhir.jpa.subscription.model.ResourceModifiedMessage;
import ca.uhn.fhir.rest.api.EncodingEnum;
import com.google.common.annotations.VisibleForTesting;
import org.apache.commons.lang3.StringUtils;
@@ -33,6 +34,7 @@ import org.springframework.beans.factory.annotation.Autowired;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import static org.apache.commons.lang3.StringUtils.defaultString;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
@@ -73,7 +75,7 @@ public class SubscriptionDeliveringEmailSubscriber extends BaseSubscriptionDeliv
if (isNotBlank(subscription.getPayloadString())) {
EncodingEnum encoding = EncodingEnum.forContentType(subscription.getPayloadString());
if (encoding != null) {
payload = getPayloadStringFromMessageOrEmptyString(theMessage);
}
}
@@ -112,4 +114,24 @@ public class SubscriptionDeliveringEmailSubscriber extends BaseSubscriptionDeliv
public IEmailSender getEmailSender() {
return myEmailSender;
}
/**
* Get the payload string, fetching it from the DB when the payload is null.
*/
private String getPayloadStringFromMessageOrEmptyString(ResourceDeliveryMessage theMessage) {
String payload = theMessage.getPayloadString();
if (theMessage.getPayload(myCtx) != null) {
return payload;
}
Optional<ResourceModifiedMessage> inflatedMessage =
inflateResourceModifiedMessageFromDeliveryMessage(theMessage);
if (inflatedMessage.isEmpty()) {
return "";
}
payload = inflatedMessage.get().getPayloadString();
return payload;
}
}


@@ -39,6 +39,7 @@ import org.springframework.messaging.MessagingException;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.Optional;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
@@ -66,7 +67,7 @@ public class SubscriptionDeliveringMessageSubscriber extends BaseSubscriptionDel
IBaseResource payloadResource = createDeliveryBundleForPayloadSearchCriteria(
theSubscription, theWrappedMessageToSend.getPayload().getPayload(myFhirContext));
ResourceModifiedJsonMessage newWrappedMessageToSend =
convertDeliveryMessageToResourceModifiedJsonMessage(theSourceMessage, payloadResource);
theWrappedMessageToSend.setPayload(newWrappedMessageToSend.getPayload());
payloadId =
payloadResource.getIdElement().toUnqualifiedVersionless().getValue();
@@ -82,7 +83,7 @@ public class SubscriptionDeliveringMessageSubscriber extends BaseSubscriptionDel
.getValue());
}
private ResourceModifiedJsonMessage convertDeliveryMessageToResourceModifiedJsonMessage(
ResourceDeliveryMessage theMsg, IBaseResource thePayloadResource) {
ResourceModifiedMessage payload =
new ResourceModifiedMessage(myFhirContext, thePayloadResource, theMsg.getOperationType());
@@ -96,8 +97,17 @@ public class SubscriptionDeliveringMessageSubscriber extends BaseSubscriptionDel
public void handleMessage(ResourceDeliveryMessage theMessage) throws MessagingException, URISyntaxException {
CanonicalSubscription subscription = theMessage.getSubscription();
IBaseResource payloadResource = theMessage.getPayload(myFhirContext);
if (payloadResource == null) {
Optional<ResourceModifiedMessage> inflatedMsg =
inflateResourceModifiedMessageFromDeliveryMessage(theMessage);
if (inflatedMsg.isEmpty()) {
return;
}
payloadResource = inflatedMsg.get().getPayload(myFhirContext);
}
ResourceModifiedJsonMessage messageWrapperToSend =
convertDeliveryMessageToResourceModifiedJsonMessage(theMessage, payloadResource);
// Interceptor call: SUBSCRIPTION_BEFORE_MESSAGE_DELIVERY
HookParams params = new HookParams()


@@ -31,6 +31,7 @@ import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
import ca.uhn.fhir.rest.server.exceptions.ResourceGoneException;
import ca.uhn.fhir.rest.server.exceptions.UnprocessableEntityException;
import ca.uhn.fhir.subscription.SubscriptionConstants;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import ca.uhn.fhir.util.SubscriptionUtil;
import org.hl7.fhir.dstu2.model.Subscription;
import org.hl7.fhir.instance.model.api.IBaseResource;
@@ -41,6 +42,7 @@ import org.springframework.messaging.Message;
import org.springframework.messaging.MessageHandler;
import org.springframework.messaging.MessagingException;
import java.util.Optional;
import javax.annotation.Nonnull;
/**
@@ -64,6 +66,8 @@ public class SubscriptionActivatingSubscriber implements MessageHandler {
@Autowired
private StorageSettings myStorageSettings;
@Autowired
private IResourceModifiedMessagePersistenceSvc myResourceModifiedMessagePersistenceSvc;
/**
* Constructor
*/
@@ -86,6 +90,16 @@ public class SubscriptionActivatingSubscriber implements MessageHandler {
switch (payload.getOperationType()) {
case CREATE:
case UPDATE:
if (payload.getPayload(myFhirContext) == null) {
Optional<ResourceModifiedMessage> inflatedMsg =
myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessageOrNull(
payload);
if (inflatedMsg.isEmpty()) {
return;
}
payload = inflatedMsg.get();
}
activateSubscriptionIfRequired(payload.getNewPayload(myFhirContext));
break;
case TRANSACTION:
@@ -104,7 +118,7 @@ public class SubscriptionActivatingSubscriber implements MessageHandler {
*/
public synchronized boolean activateSubscriptionIfRequired(final IBaseResource theSubscription) {
// Grab the value for "Subscription.channel.type" so we can see if this
// subscriber applies.
CanonicalSubscriptionChannelType subscriptionChannelType =
mySubscriptionCanonicalizer.getChannelType(theSubscription);


@@ -25,6 +25,7 @@ import ca.uhn.fhir.interceptor.api.HookParams;
import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.jpa.searchparam.matcher.InMemoryMatchResult;
import ca.uhn.fhir.jpa.subscription.channel.api.PayloadTooLargeException;
import ca.uhn.fhir.jpa.subscription.channel.subscription.SubscriptionChannelRegistry;
import ca.uhn.fhir.jpa.subscription.match.registry.ActiveSubscription;
import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscription;
@@ -156,8 +157,21 @@ public class SubscriptionMatchDeliverer {
ourLog.warn("Failed to send message to Delivery Channel.");
}
} catch (RuntimeException e) {
if (e.getCause() instanceof PayloadTooLargeException) {
ourLog.warn("Failed to send message to Delivery Channel because the payload size is larger than broker "
+ "max message size. Retry is about to be performed without payload.");
ResourceDeliveryJsonMessage msgPayloadLess = nullOutPayload(theWrappedMsg);
trySendToDeliveryChannel(msgPayloadLess, theDeliveryChannel);
} else {
ourLog.error("Failed to send message to Delivery Channel", e);
throw new RuntimeException(Msg.code(7) + "Failed to send message to Delivery Channel", e);
}
}
}
private ResourceDeliveryJsonMessage nullOutPayload(ResourceDeliveryJsonMessage theWrappedMsg) {
ResourceDeliveryMessage resourceDeliveryMessage = theWrappedMsg.getPayload();
resourceDeliveryMessage.setPayloadToNull();
return new ResourceDeliveryJsonMessage(resourceDeliveryMessage);
}
}


@@ -30,6 +30,7 @@ import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionRegistry;
import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscription;
import ca.uhn.fhir.jpa.subscription.model.ResourceModifiedJsonMessage;
import ca.uhn.fhir.jpa.subscription.model.ResourceModifiedMessage;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.slf4j.Logger;
@@ -40,6 +41,7 @@ import org.springframework.messaging.MessageHandler;
import org.springframework.messaging.MessagingException;
import java.util.Collection;
import java.util.Optional;
import javax.annotation.Nonnull;
import static ca.uhn.fhir.rest.server.messaging.BaseResourceMessage.OperationTypeEnum.DELETE;
@@ -64,6 +66,9 @@ public class SubscriptionMatchingSubscriber implements MessageHandler {
@Autowired
private SubscriptionMatchDeliverer mySubscriptionMatchDeliverer;
@Autowired
private IResourceModifiedMessagePersistenceSvc myResourceModifiedMessagePersistenceSvc;
/**
* Constructor
*/
@@ -97,6 +102,16 @@ public class SubscriptionMatchingSubscriber implements MessageHandler {
return;
}
if (theMsg.getPayload(myFhirContext) == null) {
// inflate the message and ignore any resource that cannot be found.
Optional<ResourceModifiedMessage> inflatedMsg =
myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessageOrNull(theMsg);
if (inflatedMsg.isEmpty()) {
return;
}
theMsg = inflatedMsg.get();
}
// Interceptor call: SUBSCRIPTION_BEFORE_PERSISTED_RESOURCE_CHECKED
HookParams params = new HookParams().add(ResourceModifiedMessage.class, theMsg);
if (!myInterceptorBroadcaster.callHooks(Pointcut.SUBSCRIPTION_BEFORE_PERSISTED_RESOURCE_CHECKED, params)) {


@@ -20,9 +20,11 @@
package ca.uhn.fhir.jpa.subscription.submit.interceptor;
import ca.uhn.fhir.jpa.subscription.async.AsyncResourceModifiedProcessingSchedulerSvc;
import ca.uhn.fhir.jpa.subscription.channel.api.PayloadTooLargeException;
import ca.uhn.fhir.jpa.subscription.match.matcher.matching.IResourceModifiedConsumer;
import ca.uhn.fhir.jpa.subscription.model.ResourceModifiedMessage;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.messaging.MessageDeliveryException;
import org.springframework.transaction.support.TransactionSynchronizationAdapter;
import org.springframework.transaction.support.TransactionSynchronizationManager;
@@ -49,11 +51,33 @@ public class SynchronousSubscriptionMatcherInterceptor extends SubscriptionMatch
@Override
public void afterCommit() {
doSubmitResourceModified(theResourceModifiedMessage);
}
});
} else {
doSubmitResourceModified(theResourceModifiedMessage);
}
}
/**
* Submit the message through the broker channel to the matcher.
*
* Note: most of our subscription integration tests assume we can successfully inflate the message and therefore
* do not run with an actual database to persist the data. In these cases, submitting the complete message (i.e.
* with payload) is OK. However, there are a few tests that do not assume it and do run with an actual DB. For them,
* we should null out the payload body before submitting. This try-catch block only covers the case where the
* payload is too large, which is enough for now. However, for better practice we might want to consider splitting
* this interceptor into two, each for tests with/without DB connection.
* @param theResourceModifiedMessage
*/
private void doSubmitResourceModified(ResourceModifiedMessage theResourceModifiedMessage) {
try {
myResourceModifiedConsumer.submitResourceModified(theResourceModifiedMessage);
} catch (MessageDeliveryException e) {
if (e.getCause() instanceof PayloadTooLargeException) {
theResourceModifiedMessage.setPayloadToNull();
myResourceModifiedConsumer.submitResourceModified(theResourceModifiedMessage);
}
}
}
}


@@ -35,7 +35,6 @@ import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
import ca.uhn.fhir.subscription.api.IResourceModifiedConsumerWithRetries;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import org.apache.commons.lang3.Validate;
import org.hl7.fhir.r5.model.IdType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.event.ContextRefreshedEvent;
@@ -45,8 +44,6 @@ import org.springframework.messaging.MessageDeliveryException;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.support.TransactionCallback;
import java.util.Optional;
import static ca.uhn.fhir.jpa.subscription.match.matcher.subscriber.SubscriptionMatchingSubscriber.SUBSCRIPTION_MATCHING_CHANNEL_NAME;
/**
@@ -151,12 +148,11 @@ public class ResourceModifiedSubmitterSvc implements IResourceModifiedConsumer,
boolean wasDeleted = deletePersistedResourceModifiedMessage(
thePersistedResourceModifiedMessage.getPersistedResourceModifiedMessagePk());
// submit the resource modified message with empty payload, actual inflation is done by the matcher.
resourceModifiedMessage =
createResourceModifiedMessageWithoutInflation(thePersistedResourceModifiedMessage);
if (wasDeleted) {
// the PK did exist and we were able to deleted it, ie, we are the only one processing the message
resourceModifiedMessage = optionalResourceModifiedMessage.get();
submitResourceModified(resourceModifiedMessage);
}
} catch (MessageDeliveryException exception) {
@@ -186,32 +182,10 @@ public class ResourceModifiedSubmitterSvc implements IResourceModifiedConsumer,
};
}
private ResourceModifiedMessage createResourceModifiedMessageWithoutInflation(
IPersistedResourceModifiedMessage thePersistedResourceModifiedMessage) {
return myResourceModifiedMessagePersistenceSvc.createResourceModifiedMessageFromEntityWithoutInflation(
try {
resourceModifiedMessage = myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessage(
thePersistedResourceModifiedMessage);
} catch (ResourceNotFoundException e) {
IPersistedResourceModifiedMessagePK persistedResourceModifiedMessagePk =
thePersistedResourceModifiedMessage.getPersistedResourceModifiedMessagePk();
IdType idType = new IdType(
thePersistedResourceModifiedMessage.getResourceType(),
persistedResourceModifiedMessagePk.getResourcePid(),
persistedResourceModifiedMessagePk.getResourceVersion());
ourLog.warn(
"Scheduled submission will be ignored since resource {} cannot be found",
idType.asStringValue(),
e);
} catch (Exception ex) {
ourLog.error("Unknown error encountered on inflation of resources.", ex);
}
return Optional.ofNullable(resourceModifiedMessage);
}
private boolean deletePersistedResourceModifiedMessage(IPersistedResourceModifiedMessagePK theResourceModifiedPK) {


@@ -30,6 +30,7 @@ import ca.uhn.fhir.jpa.subscription.model.ResourceModifiedJsonMessage;
import ca.uhn.fhir.jpa.subscription.model.ResourceModifiedMessage;
import ca.uhn.fhir.jpa.topic.filter.InMemoryTopicFilterMatcher;
import ca.uhn.fhir.rest.api.RestOperationTypeEnum;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import ca.uhn.fhir.util.Logs;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.r5.model.SubscriptionTopic;
@@ -42,6 +43,7 @@ import org.springframework.messaging.MessagingException;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import javax.annotation.Nonnull;
public class SubscriptionTopicMatchingSubscriber implements MessageHandler {
@@ -73,6 +75,9 @@ import public class SubscriptionTopicMatchingSubscriber implements MessageHandler {
@Autowired
private InMemoryTopicFilterMatcher myInMemoryTopicFilterMatcher;
@Autowired
private IResourceModifiedMessagePersistenceSvc myResourceModifiedMessagePersistenceSvc;
public SubscriptionTopicMatchingSubscriber(FhirContext theFhirContext) {
myFhirContext = theFhirContext;
}
@@ -88,6 +93,16 @@ public class SubscriptionTopicMatchingSubscriber implements MessageHandler {
ResourceModifiedMessage msg = ((ResourceModifiedJsonMessage) theMessage).getPayload();
if (msg.getPayload(myFhirContext) == null) {
// inflate the message and ignore any resource that cannot be found.
Optional<ResourceModifiedMessage> inflatedMsg =
myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessageOrNull(msg);
if (inflatedMsg.isEmpty()) {
return;
}
msg = inflatedMsg.get();
}
// Interceptor call: SUBSCRIPTION_TOPIC_BEFORE_PERSISTED_RESOURCE_CHECKED
HookParams params = new HookParams().add(ResourceModifiedMessage.class, msg);
if (!myInterceptorBroadcaster.callHooks(


@ -8,10 +8,13 @@ import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.model.entity.StorageSettings;
import ca.uhn.fhir.jpa.searchparam.MatchUrlService;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.jpa.subscription.channel.api.IChannelFactory;
import ca.uhn.fhir.jpa.subscription.channel.api.IChannelProducer;
import ca.uhn.fhir.jpa.subscription.match.deliver.email.IEmailSender;
import ca.uhn.fhir.jpa.subscription.match.deliver.email.SubscriptionDeliveringEmailSubscriber;
import ca.uhn.fhir.jpa.subscription.match.deliver.message.SubscriptionDeliveringMessageSubscriber;
import ca.uhn.fhir.jpa.subscription.match.deliver.resthook.SubscriptionDeliveringRestHookSubscriber;
import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionRegistry;
@ -26,6 +29,7 @@ import ca.uhn.fhir.rest.client.api.IGenericClient;
import ca.uhn.fhir.rest.client.api.IRestfulClientFactory;
import ca.uhn.fhir.rest.server.SimpleBundleProvider;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import com.fasterxml.jackson.core.JsonProcessingException;
import org.hl7.fhir.r4.model.Bundle;
import org.hl7.fhir.r4.model.IdType;
@ -33,6 +37,8 @@ import org.hl7.fhir.r4.model.Patient;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.ValueSource;
import org.mockito.Answers;
import org.mockito.ArgumentCaptor;
import org.mockito.ArgumentMatchers;
@ -57,6 +63,7 @@ import static org.hamcrest.Matchers.hasSize;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertThrows;
import static org.junit.jupiter.api.Assertions.fail;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.anyString;
@ -71,6 +78,7 @@ public class BaseSubscriptionDeliverySubscriberTest {
private SubscriptionDeliveringRestHookSubscriber mySubscriber;
private SubscriptionDeliveringMessageSubscriber myMessageSubscriber;
private SubscriptionDeliveringEmailSubscriber myEmailSubscriber;
private final FhirContext myCtx = FhirContext.forR4();
@Mock
@ -96,6 +104,12 @@ public class BaseSubscriptionDeliverySubscriberTest {
@Mock
private MatchUrlService myMatchUrlService;
@Mock
private IResourceModifiedMessagePersistenceSvc myResourceModifiedMessagePersistenceSvc;
@Mock
private IEmailSender myEmailSender;
@BeforeEach
public void before() {
mySubscriber = new SubscriptionDeliveringRestHookSubscriber();
@ -109,8 +123,15 @@ public class BaseSubscriptionDeliverySubscriberTest {
myMessageSubscriber.setSubscriptionRegistryForUnitTest(mySubscriptionRegistry);
myMessageSubscriber.setDaoRegistryForUnitTest(myDaoRegistry);
myMessageSubscriber.setMatchUrlServiceForUnitTest(myMatchUrlService);
myMessageSubscriber.setResourceModifiedMessagePersistenceSvcForUnitTest(myResourceModifiedMessagePersistenceSvc);
myCtx.setRestfulClientFactory(myRestfulClientFactory);
when(myRestfulClientFactory.newGenericClient(any())).thenReturn(myGenericClient);
myEmailSubscriber = new SubscriptionDeliveringEmailSubscriber(myEmailSender);
myEmailSubscriber.setFhirContextForUnitTest(myCtx);
myEmailSubscriber.setInterceptorBroadcasterForUnitTest(myInterceptorBroadcaster);
myEmailSubscriber.setSubscriptionRegistryForUnitTest(mySubscriptionRegistry);
myEmailSubscriber.setResourceModifiedMessagePersistenceSvcForUnitTest(myResourceModifiedMessagePersistenceSvc);
}
@Test
@ -400,6 +421,38 @@ public class BaseSubscriptionDeliverySubscriberTest {
}
}
@ParameterizedTest
@ValueSource(strings = {"message", "email"})
public void testMessageAndEmailSubscriber_whenPayloadIsNull_shouldTryInflateMessage(String theSubscriber) {
// setup
when(myInterceptorBroadcaster.callHooks(any(), any())).thenReturn(true);
Patient patient = generatePatient();
CanonicalSubscription subscription = generateSubscription();
ResourceDeliveryMessage payload = new ResourceDeliveryMessage();
payload.setSubscription(subscription);
payload.setPayload(myCtx, patient, EncodingEnum.JSON);
payload.setOperationType(ResourceModifiedMessage.OperationTypeEnum.CREATE);
// mock the inflated message
when(myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessageOrNull(any())).thenReturn(any());
// this will null out the payload but keep the resource id and version.
payload.setPayloadToNull();
// execute & verify
switch (theSubscriber) {
case "message" ->
assertThrows(MessagingException.class, () -> myMessageSubscriber.handleMessage(new ResourceDeliveryJsonMessage(payload)));
case "email" ->
assertThrows(MessagingException.class, () -> myEmailSubscriber.handleMessage(new ResourceDeliveryJsonMessage(payload)));
}
verify(myResourceModifiedMessagePersistenceSvc, times(1)).inflatePersistedResourceModifiedMessageOrNull(any());
}
@Nonnull
private Patient generatePatient() {
Patient patient = new Patient();


@ -15,6 +15,7 @@ import ca.uhn.fhir.jpa.subscription.match.config.SubscriptionProcessorConfig;
import ca.uhn.fhir.jpa.subscription.match.deliver.email.IEmailSender;
import ca.uhn.fhir.jpa.subscription.submit.config.SubscriptionSubmitterConfig;
import ca.uhn.fhir.jpa.subscription.submit.interceptor.SubscriptionQueryValidator;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.springframework.beans.factory.annotation.Autowired;
@ -90,6 +91,11 @@ public class DaoSubscriptionMatcherTest {
public IEmailSender emailSender(){
return mock(IEmailSender.class);
}
@Bean
public IResourceModifiedMessagePersistenceSvc resourceModifiedMessagePersistenceSvc() {
return mock(IResourceModifiedMessagePersistenceSvc.class);
}
}
}


@ -2,8 +2,10 @@ package ca.uhn.fhir.jpa.subscription.module.cache;
import ca.uhn.fhir.jpa.subscription.channel.subscription.ISubscriptionDeliveryChannelNamer;
import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscription;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import org.hl7.fhir.dstu3.model.Subscription;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
@ -18,6 +20,9 @@ public class SubscriptionRegistrySharedTest extends BaseSubscriptionRegistryTest
private static final String OTHER_ID = "OTHER_ID"; private static final String OTHER_ID = "OTHER_ID";
@Autowired
private IResourceModifiedMessagePersistenceSvc myResourceModifiedMessagePersistenceSvc;
@Configuration
public static class SpringConfig {


@ -6,6 +6,7 @@ import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.model.sched.ISchedulerService;
import ca.uhn.fhir.jpa.searchparam.registry.ISearchParamProvider;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import com.google.common.collect.Lists;
import org.hl7.fhir.dstu3.model.Subscription;
import org.slf4j.Logger;
@ -62,4 +63,9 @@ public class TestSubscriptionDstu3Config {
return mock;
}
@Bean
public IResourceModifiedMessagePersistenceSvc resourceModifiedMessagePersistenceSvc() {
return mock(IResourceModifiedMessagePersistenceSvc.class);
}
}


@ -26,6 +26,7 @@ import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.MethodOutcome;
import ca.uhn.fhir.rest.server.IResourceProvider;
import ca.uhn.fhir.rest.server.RestfulServer;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import ca.uhn.fhir.test.utilities.JettyUtil;
import ca.uhn.test.concurrency.IPointcutLatch;
import ca.uhn.test.concurrency.PointcutLatch;
@ -54,6 +55,10 @@ import javax.servlet.http.HttpServletRequest;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.when;
public abstract class BaseBlockingQueueSubscribableChannelDstu3Test extends BaseSubscriptionDstu3Test {
public static final ChannelConsumerSettings CONSUMER_OPTIONS = new ChannelConsumerSettings().setConcurrentConsumers(1);
@ -100,6 +105,8 @@ public abstract class BaseBlockingQueueSubscribableChannelDstu3Test extends Base
IInterceptorService myInterceptorRegistry;
@Autowired
private ISubscriptionDeliveryChannelNamer mySubscriptionDeliveryChannelNamer;
@Autowired
private IResourceModifiedMessagePersistenceSvc myResourceModifiedMessagePersistenceSvc;
@BeforeEach
public void beforeReset() {
@ -140,6 +147,8 @@ public abstract class BaseBlockingQueueSubscribableChannelDstu3Test extends Base
public <T extends IBaseResource> T sendResource(T theResource, RequestPartitionId theRequestPartitionId) throws InterruptedException {
ResourceModifiedMessage msg = new ResourceModifiedMessage(myFhirContext, theResource, ResourceModifiedMessage.OperationTypeEnum.CREATE, null, theRequestPartitionId);
ResourceModifiedJsonMessage message = new ResourceModifiedJsonMessage(msg);
when(myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessageOrNull(any())).thenReturn(Optional.of(msg));
mySubscriptionMatchingPost.setExpectedCount(1);
ourSubscribableChannel.send(message);
mySubscriptionMatchingPost.awaitExpected();


@ -17,6 +17,7 @@ import ca.uhn.fhir.jpa.subscription.module.standalone.BaseBlockingQueueSubscriba
import ca.uhn.fhir.model.primitive.IdDt;
import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.server.messaging.BaseResourceModifiedMessage;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import ca.uhn.fhir.util.HapiExtensions;
import com.google.common.collect.Lists;
import org.hl7.fhir.dstu3.model.BooleanType;
@ -33,6 +34,7 @@ import org.mockito.Mockito;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import static ca.uhn.fhir.jpa.subscription.match.matcher.subscriber.SubscriptionCriteriaParser.TypeEnum.STARTYPE_EXPRESSION;
import static org.junit.jupiter.api.Assertions.assertEquals;
@ -434,6 +436,8 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri
SubscriptionCriteriaParser.SubscriptionCriteria mySubscriptionCriteria;
@Mock
SubscriptionMatchDeliverer mySubscriptionMatchDeliverer;
@Mock
IResourceModifiedMessagePersistenceSvc myResourceModifiedMessagePersistenceSvc;
@InjectMocks
SubscriptionMatchingSubscriber subscriber;
@ -445,6 +449,7 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri
when(myInterceptorBroadcaster.callHooks(
eq(Pointcut.SUBSCRIPTION_BEFORE_PERSISTED_RESOURCE_CHECKED), any(HookParams.class))).thenReturn(true);
when(mySubscriptionRegistry.getAll()).thenReturn(Collections.emptyList());
when(myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessageOrNull(any())).thenReturn(Optional.ofNullable(message));
subscriber.matchActiveSubscriptionsAndDeliver(message);
@ -465,6 +470,7 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri
when(myActiveSubscription.getCriteria()).thenReturn(mySubscriptionCriteria);
when(myActiveSubscription.getId()).thenReturn("Patient/123");
when(mySubscriptionCriteria.getType()).thenReturn(STARTYPE_EXPRESSION);
when(myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessageOrNull(any())).thenReturn(Optional.ofNullable(message));
subscriber.matchActiveSubscriptionsAndDeliver(message);
@ -486,6 +492,7 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri
when(myNonDeleteSubscription.getCriteria()).thenReturn(mySubscriptionCriteria);
when(myNonDeleteSubscription.getId()).thenReturn("Patient/123");
when(mySubscriptionCriteria.getType()).thenReturn(STARTYPE_EXPRESSION);
when(myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessageOrNull(any())).thenReturn(Optional.ofNullable(message));
subscriber.matchActiveSubscriptionsAndDeliver(message);
@ -505,6 +512,7 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri
when(myActiveSubscription.getId()).thenReturn("Patient/123"); when(myActiveSubscription.getId()).thenReturn("Patient/123");
when(mySubscriptionCriteria.getType()).thenReturn(STARTYPE_EXPRESSION); when(mySubscriptionCriteria.getType()).thenReturn(STARTYPE_EXPRESSION);
when(myCanonicalSubscription.getSendDeleteMessages()).thenReturn(true); when(myCanonicalSubscription.getSendDeleteMessages()).thenReturn(true);
when(myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessageOrNull(any())).thenReturn(Optional.ofNullable(message));
subscriber.matchActiveSubscriptionsAndDeliver(message);


@ -20,6 +20,7 @@ import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionRegistry;
import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscription;
import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscriptionChannelType;
import ca.uhn.fhir.rest.server.util.ISearchParamRegistry;
import ca.uhn.fhir.subscription.api.IResourceModifiedMessagePersistenceSvc;
import org.hl7.fhir.r4.model.IdType;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
@ -146,5 +147,10 @@ public class WebsocketConnectionValidatorTest {
return mock(IEmailSender.class);
}
@Bean
public IResourceModifiedMessagePersistenceSvc resourceModifiedMessagePersistenceSvc(){
return mock(IResourceModifiedMessagePersistenceSvc.class);
}
}
}


@ -2189,21 +2189,21 @@ public class FhirResourceDaoDstu2Test extends BaseJpaDstu2Test {
pm.setSort(new SortSpec(BaseResource.SP_RES_ID));
actual = toUnqualifiedVersionlessIds(myPatientDao.search(pm));
assertEquals(5, actual.size());
-assertThat(actual, contains(idMethodName, id1, id2, id3, id4));
+assertThat(actual, contains(id1, id2, id3, id4, idMethodName));
pm = new SearchParameterMap();
pm.add(Patient.SP_IDENTIFIER, new TokenParam("urn:system", methodName));
pm.setSort(new SortSpec(BaseResource.SP_RES_ID).setOrder(SortOrderEnum.ASC));
actual = toUnqualifiedVersionlessIds(myPatientDao.search(pm));
assertEquals(5, actual.size());
-assertThat(actual, contains(idMethodName, id1, id2, id3, id4));
+assertThat(actual, contains(id1, id2, id3, id4, idMethodName));
pm = new SearchParameterMap();
pm.add(Patient.SP_IDENTIFIER, new TokenParam("urn:system", methodName));
pm.setSort(new SortSpec(BaseResource.SP_RES_ID).setOrder(SortOrderEnum.DESC));
actual = toUnqualifiedVersionlessIds(myPatientDao.search(pm));
assertEquals(5, actual.size());
-assertThat(actual, contains(id4, id3, id2, id1, idMethodName));
+assertThat(actual, contains(idMethodName, id4, id3, id2, id1));
}
@Test


@ -3323,7 +3323,7 @@ public class FhirResourceDaoDstu3SearchNoFtTest extends BaseJpaDstu3Test {
map = new SearchParameterMap();
map.setSort(new SortSpec("_id", SortOrderEnum.ASC));
ids = toUnqualifiedVersionlessIdValues(myPatientDao.search(map));
-assertThat(ids, contains("Patient/AA", "Patient/AB", id1, id2));
+assertThat(ids, contains(id1, id2, "Patient/AA", "Patient/AB"));
}


@ -2788,21 +2788,21 @@ public class FhirResourceDaoDstu3Test extends BaseJpaDstu3Test {
pm.setSort(new SortSpec(IAnyResource.SP_RES_ID));
actual = toUnqualifiedVersionlessIds(myPatientDao.search(pm));
assertEquals(5, actual.size());
-assertThat(actual.toString(), actual, contains(idMethodName, id1, id2, id3, id4));
+assertThat(actual.toString(), actual, contains(id1, id2, id3, id4, idMethodName));
pm = new SearchParameterMap();
pm.add(Patient.SP_IDENTIFIER, new TokenParam("urn:system", methodName));
pm.setSort(new SortSpec(IAnyResource.SP_RES_ID).setOrder(SortOrderEnum.ASC));
actual = toUnqualifiedVersionlessIds(myPatientDao.search(pm));
assertEquals(5, actual.size());
-assertThat(actual.toString(), actual, contains(idMethodName, id1, id2, id3, id4));
+assertThat(actual.toString(), actual, contains(id1, id2, id3, id4, idMethodName));
pm = new SearchParameterMap();
pm.add(Patient.SP_IDENTIFIER, new TokenParam("urn:system", methodName));
pm.setSort(new SortSpec(IAnyResource.SP_RES_ID).setOrder(SortOrderEnum.DESC));
actual = toUnqualifiedVersionlessIds(myPatientDao.search(pm));
assertEquals(5, actual.size());
-assertThat(actual.toString(), actual, contains(id4, id3, id2, id1, idMethodName));
+assertThat(actual.toString(), actual, contains(idMethodName, id4, id3, id2, id1));
}
@Test


@ -138,7 +138,7 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest {
protected void dropForcedIdUniqueConstraint() {
runInTransaction(() -> {
myEntityManager.createNativeQuery("alter table " + ForcedId.HFJ_FORCED_ID + " drop constraint " + ForcedId.IDX_FORCEDID_TYPE_FID).executeUpdate();
-myEntityManager.createNativeQuery("alter table " + ResourceTable.HFJ_RESOURCE + " drop constraint " + ResourceTable.IDX_RES_FHIR_ID).executeUpdate();
+myEntityManager.createNativeQuery("alter table " + ResourceTable.HFJ_RESOURCE + " drop constraint " + ResourceTable.IDX_RES_TYPE_FHIR_ID).executeUpdate();
});
myHaveDroppedForcedIdUniqueConstraint = true;
}


@ -0,0 +1,150 @@
package ca.uhn.fhir.jpa.dao.r4;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.jpa.dao.TestDaoSearch;
import ca.uhn.fhir.jpa.test.BaseJpaTest;
import ca.uhn.fhir.jpa.test.config.TestHSearchAddInConfig;
import ca.uhn.fhir.jpa.test.config.TestR4Config;
import ca.uhn.fhir.jpa.util.SqlQuery;
import ca.uhn.fhir.jpa.util.SqlQueryList;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.storage.test.DaoTestDataBuilder;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.junit.jupiter.api.extension.RegisterExtension;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.TestContext;
import org.springframework.test.context.TestExecutionListeners;
import org.springframework.test.context.junit.jupiter.SpringExtension;
import org.springframework.test.context.support.DependencyInjectionTestExecutionListener;
import org.springframework.test.context.support.DirtiesContextTestExecutionListener;
import org.springframework.transaction.PlatformTransactionManager;
import java.util.ArrayList;
import java.util.List;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.empty;
import static org.hamcrest.Matchers.not;
/**
* Sandbox for implementing queries.
* This will NOT run during the build - use this class as a convenient
* place to explore, debug, profile, and optimize.
*/
@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes = {
TestR4Config.class,
TestHSearchAddInConfig.NoFT.class,
DaoTestDataBuilder.Config.class,
TestDaoSearch.Config.class
})
@DirtiesContext(classMode = DirtiesContext.ClassMode.AFTER_CLASS)
@TestExecutionListeners(listeners = {
DependencyInjectionTestExecutionListener.class
, FhirResourceDaoR4QuerySandbox.TestDirtiesContextTestExecutionListener.class
})
public class FhirResourceDaoR4QuerySandbox extends BaseJpaTest {
private static final Logger ourLog = LoggerFactory.getLogger(FhirResourceDaoR4QuerySandbox.class);
@Autowired
PlatformTransactionManager myTxManager;
@Autowired
FhirContext myFhirCtx;
@RegisterExtension
@Autowired
DaoTestDataBuilder myDataBuilder;
@Autowired
TestDaoSearch myTestDaoSearch;
@Override
protected PlatformTransactionManager getTxManager() {
return myTxManager;
}
@Override
protected FhirContext getFhirContext() {
return myFhirCtx;
}
List<String> myCapturedQueries = new ArrayList<>();
@BeforeEach
void registerLoggingInterceptor() {
registerInterceptor(new Object(){
@Hook(Pointcut.JPA_PERFTRACE_RAW_SQL)
public void captureSql(RequestDetails theRequestDetails, SqlQueryList theQueries) {
for (SqlQuery next : theQueries) {
String output = next.getSql(true, true, true);
ourLog.info("Query: {}", output);
myCapturedQueries.add(output);
}
}
});
}
@Test
public void testSearches_logQueries() {
myDataBuilder.createPatient();
myTestDaoSearch.searchForIds("Patient?name=smith");
assertThat(myCapturedQueries, not(empty()));
}
@Test
void testQueryByPid() {
// sentinel for over-match
myDataBuilder.createPatient();
String id = myDataBuilder.createPatient(
myDataBuilder.withBirthdate("1971-01-01"),
myDataBuilder.withActiveTrue(),
myDataBuilder.withFamily("Smith")).getIdPart();
myTestDaoSearch.assertSearchFindsOnly("search by server assigned id", "Patient?_pid=" + id, id);
}
@Test
void testQueryByPid_withOtherSPAvoidsResourceTable() {
// sentinel for over-match
myDataBuilder.createPatient();
String id = myDataBuilder.createPatient(
myDataBuilder.withBirthdate("1971-01-01"),
myDataBuilder.withActiveTrue(),
myDataBuilder.withFamily("Smith")).getIdPart();
myTestDaoSearch.assertSearchFindsOnly("search by server assigned id", "Patient?name=smith&_pid=" + id, id);
}
@Test
void testSortByPid() {
String id1 = myDataBuilder.createPatient(myDataBuilder.withFamily("Smithy")).getIdPart();
String id2 = myDataBuilder.createPatient(myDataBuilder.withFamily("Smithwick")).getIdPart();
String id3 = myDataBuilder.createPatient(myDataBuilder.withFamily("Smith")).getIdPart();
myTestDaoSearch.assertSearchFindsInOrder("sort by server assigned id", "Patient?family=smith&_sort=_pid", id1,id2,id3);
myTestDaoSearch.assertSearchFindsInOrder("reverse sort by server assigned id", "Patient?family=smith&_sort=-_pid", id3,id2,id1);
}
public static final class TestDirtiesContextTestExecutionListener extends DirtiesContextTestExecutionListener {
@Override
protected void beforeOrAfterTestClass(TestContext testContext, DirtiesContext.ClassMode requiredClassMode) throws Exception {
if (!testContext.getTestClass().getName().contains("$")) {
super.beforeOrAfterTestClass(testContext, requiredClassMode);
}
}
}
}
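
The two _pid tests above exercise searching and sorting on the server-assigned persistent id. For orientation, an equivalent client-side call against a server exposing the parameter could look like the following sketch (illustrative only; the base URL is a placeholder and is not part of this change):

FhirContext ctx = FhirContext.forR4();
IGenericClient client = ctx.newRestfulGenericClient("http://localhost:8080/fhir");
// Mirrors the "_sort=-_pid" case tested above: highest server-assigned id first.
Bundle sorted = client.search()
.byUrl("Patient?family=smith&_sort=-_pid")
.returnBundle(Bundle.class)
.execute();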


@ -176,6 +176,7 @@ import static ca.uhn.fhir.rest.param.ParamPrefixEnum.LESSTHAN;
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.LESSTHAN_OR_EQUALS;
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.NOT_EQUAL;
import static ca.uhn.fhir.test.utilities.CustomMatchersUtil.assertDoesNotContainAnyOf;
import static ca.uhn.fhir.util.DateUtils.convertDateToIso8601String;
import static org.apache.commons.lang3.StringUtils.countMatches;
import static org.apache.commons.lang3.StringUtils.leftPad;
import static org.hamcrest.CoreMatchers.is;
@ -447,6 +448,57 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
assertEquals(0, ids.size());
}
@Test
public void testHasEncounterAndLastUpdated() {
// setup
Patient patientA = new Patient();
String patientIdA = myPatientDao.create(patientA).getId().toUnqualifiedVersionless().getValue();
Patient patientB = new Patient();
String patientIdB = myPatientDao.create(patientB).getId().toUnqualifiedVersionless().getValue();
Encounter encounterA = new Encounter();
encounterA.getClass_().setSystem("http://snomed.info/sct").setCode("55822004");
encounterA.getSubject().setReference(patientIdA);
// record time between encounter A and B
TestUtil.sleepOneClick();
Date beforeA = new Date();
TestUtil.sleepOneClick();
myEncounterDao.create(encounterA);
Encounter encounterB = new Encounter();
encounterB.getClass_().setSystem("http://snomed.info/sct").setCode("55822005");
encounterB.getSubject().setReference(patientIdB);
// record time between encounter A and B
TestUtil.sleepOneClick();
Date beforeB = new Date();
TestUtil.sleepOneClick();
myEncounterDao.create(encounterB);
// execute
String criteriaA = "_has:Encounter:patient:_lastUpdated=ge" + convertDateToIso8601String(beforeA);
SearchParameterMap mapA = myMatchUrlService.translateMatchUrl(criteriaA, myFhirContext.getResourceDefinition(Patient.class));
mapA.setLoadSynchronous(true);
myCaptureQueriesListener.clear();
IBundleProvider resultA = myPatientDao.search(mapA);
myCaptureQueriesListener.logSelectQueries();
List<String> idsBeforeA = toUnqualifiedVersionlessIdValues(resultA);
String criteriaB = "_has:Encounter:patient:_lastUpdated=ge" + convertDateToIso8601String(beforeB);
SearchParameterMap mapB = myMatchUrlService.translateMatchUrl(criteriaB, myFhirContext.getResourceDefinition(Patient.class));
mapB.setLoadSynchronous(true);
IBundleProvider resultB = myPatientDao.search(mapB);
List<String> idsBeforeB = toUnqualifiedVersionlessIdValues(resultB);
// verify
assertEquals(2, idsBeforeA.size());
assertEquals(1, idsBeforeB.size());
}
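
The test above covers the reverse-chaining fix from the changelog: _has combined with _lastUpdated. Issued over REST with a generic client (as in the earlier sketch), the same query would look roughly like this; the timestamp is an arbitrary example value:

// Patients that are referenced by an Encounter updated on or after the cutoff.
Bundle recentlyTouched = client.search()
.byUrl("Patient?_has:Encounter:patient:_lastUpdated=ge2023-09-01T00:00:00Z")
.returnBundle(Bundle.class)
.execute();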
@Test
public void testGenderBirthdateHasCondition() {
Patient patient = new Patient();


@ -89,12 +89,12 @@ public class FhirResourceDaoR4SortTest extends BaseJpaR4Test {
map = new SearchParameterMap();
map.setSort(new SortSpec("_id", SortOrderEnum.ASC));
ids = toUnqualifiedVersionlessIdValues(myPatientDao.search(map));
-assertThat(ids, contains("Patient/AA", "Patient/AB", id1, id2));
+assertThat(ids, contains(id1, id2, "Patient/AA", "Patient/AB"));
map = new SearchParameterMap();
map.setSort(new SortSpec("_id", SortOrderEnum.DESC));
ids = toUnqualifiedVersionlessIdValues(myPatientDao.search(map));
-assertThat(ids, contains(id2, id1, "Patient/AB", "Patient/AA"));
+assertThat(ids, contains("Patient/AB", "Patient/AA", id2, id1));
}
@Test


@ -4,15 +4,17 @@ import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.dao.TestDaoSearch;
import ca.uhn.fhir.jpa.search.CompositeSearchParameterTestCases;
import ca.uhn.fhir.jpa.search.QuantitySearchParameterTestCases;
import ca.uhn.fhir.jpa.search.BaseSourceSearchParameterTestCases;
import ca.uhn.fhir.jpa.search.CompositeSearchParameterTestCases;
import ca.uhn.fhir.jpa.search.IIdSearchTestTemplate;
import ca.uhn.fhir.jpa.search.QuantitySearchParameterTestCases;
import ca.uhn.fhir.jpa.searchparam.MatchUrlService;
import ca.uhn.fhir.jpa.test.BaseJpaTest;
import ca.uhn.fhir.jpa.test.config.TestHSearchAddInConfig;
import ca.uhn.fhir.jpa.test.config.TestR4Config;
import ca.uhn.fhir.storage.test.BaseDateSearchDaoTests;
import ca.uhn.fhir.storage.test.DaoTestDataBuilder;
import ca.uhn.fhir.test.utilities.ITestDataBuilder;
import ca.uhn.fhir.test.utilities.ITestDataBuilder.ICreationArgument;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.Observation;
@ -36,9 +38,14 @@ import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.contains;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.hasItem;
import static org.hamcrest.Matchers.hasItems;
import static org.hamcrest.Matchers.not;
/**
* Verify that our query behaviour matches the spec.
* Note: we do not extend BaseJpaR4Test here.
* That does a full purge in @AfterEach which is a bit slow.
* Instead, this test tracks all created resources in DaoTestDataBuilder, and deletes them in teardown.
*/
@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes = {
TestR4Config.class,
@ -256,10 +263,10 @@ public class FhirResourceDaoR4StandardQueriesNoFTTest extends BaseJpaTest {
String idExM = withObservation(myDataBuilder.withObservationCode("http://example.org", "MValue")).getIdPart();
List<String> allIds = myTestDaoSearch.searchForIds("/Observation?_sort=code");
-assertThat(allIds, hasItems(idAlphaA, idAlphaM, idAlphaZ, idExA, idExD, idExM));
+assertThat(allIds, contains(idAlphaA, idAlphaM, idAlphaZ, idExA, idExD, idExM));
allIds = myTestDaoSearch.searchForIds("/Observation?_sort=code&code=http://example.org|");
-assertThat(allIds, hasItems(idExA, idExD, idExM));
+assertThat(allIds, contains(idExA, idExD, idExM));
}
}
}
@ -368,7 +375,7 @@ public class FhirResourceDaoR4StandardQueriesNoFTTest extends BaseJpaTest {
String idAlpha5 = withRiskAssessmentWithProbabilty(0.5).getIdPart();
List<String> allIds = myTestDaoSearch.searchForIds("/RiskAssessment?_sort=probability");
-assertThat(allIds, hasItems(idAlpha2, idAlpha5, idAlpha7));
+assertThat(allIds, contains(idAlpha2, idAlpha5, idAlpha7));
}
}
@ -491,12 +498,51 @@ public class FhirResourceDaoR4StandardQueriesNoFTTest extends BaseJpaTest {
String idAlpha5 = withObservationWithValueQuantity(0.5).getIdPart();
List<String> allIds = myTestDaoSearch.searchForIds("/Observation?_sort=value-quantity");
-assertThat(allIds, hasItems(idAlpha2, idAlpha5, idAlpha7));
+assertThat(allIds, contains(idAlpha2, idAlpha5, idAlpha7));
}
}
}
@Test
void testQueryByPid() {
// sentinel for over-match
myDataBuilder.createPatient();
String id = myDataBuilder.createPatient(
myDataBuilder.withBirthdate("1971-01-01"),
myDataBuilder.withActiveTrue(),
myDataBuilder.withFamily("Smith")).getIdPart();
myTestDaoSearch.assertSearchFindsOnly("search by server assigned id", "Patient?_pid=" + id, id);
myTestDaoSearch.assertSearchFindsOnly("search by server assigned id", "Patient?family=smith&_pid=" + id, id);
}
@Test
void testSortByPid() {
String id1 = myDataBuilder.createPatient(myDataBuilder.withFamily("Smithy")).getIdPart();
String id2 = myDataBuilder.createPatient(myDataBuilder.withFamily("Smithwick")).getIdPart();
String id3 = myDataBuilder.createPatient(myDataBuilder.withFamily("Smith")).getIdPart();
myTestDaoSearch.assertSearchFindsInOrder("sort by server assigned id", "Patient?family=smith&_sort=_pid", id1,id2,id3);
myTestDaoSearch.assertSearchFindsInOrder("reverse sort by server assigned id", "Patient?family=smith&_sort=-_pid", id3,id2,id1);
}
@Nested
public class IdSearch implements IIdSearchTestTemplate {
@Override
public TestDaoSearch getSearch() {
return myTestDaoSearch;
}
@Override
public ITestDataBuilder getBuilder() {
return myDataBuilder;
}
}
// todo mb re-enable this. Some of these fail!
@Disabled
@Nested


@ -3352,63 +3352,6 @@ public class FhirResourceDaoR4Test extends BaseJpaR4Test {
}
@Test
public void testSortById() {
String methodName = "testSortBTyId";
Patient p = new Patient();
p.addIdentifier().setSystem("urn:system").setValue(methodName);
IIdType id1 = myPatientDao.create(p, mySrd).getId().toUnqualifiedVersionless();
p = new Patient();
p.addIdentifier().setSystem("urn:system").setValue(methodName);
IIdType id2 = myPatientDao.create(p, mySrd).getId().toUnqualifiedVersionless();
p = new Patient();
p.setId(methodName + "1");
p.addIdentifier().setSystem("urn:system").setValue(methodName);
IIdType idMethodName1 = myPatientDao.update(p, mySrd).getId().toUnqualifiedVersionless();
assertEquals(methodName + "1", idMethodName1.getIdPart());
p = new Patient();
p.addIdentifier().setSystem("urn:system").setValue(methodName);
IIdType id3 = myPatientDao.create(p, mySrd).getId().toUnqualifiedVersionless();
p = new Patient();
p.setId(methodName + "2");
p.addIdentifier().setSystem("urn:system").setValue(methodName);
IIdType idMethodName2 = myPatientDao.update(p, mySrd).getId().toUnqualifiedVersionless();
assertEquals(methodName + "2", idMethodName2.getIdPart());
p = new Patient();
p.addIdentifier().setSystem("urn:system").setValue(methodName);
IIdType id4 = myPatientDao.create(p, mySrd).getId().toUnqualifiedVersionless();
SearchParameterMap pm;
List<IIdType> actual;
pm = SearchParameterMap.newSynchronous();
pm.add(Patient.SP_IDENTIFIER, new TokenParam("urn:system", methodName));
pm.setSort(new SortSpec(IAnyResource.SP_RES_ID));
actual = toUnqualifiedVersionlessIds(myPatientDao.search(pm));
assertEquals(6, actual.size());
assertThat(actual, contains(idMethodName1, idMethodName2, id1, id2, id3, id4));
pm = SearchParameterMap.newSynchronous();
pm.add(Patient.SP_IDENTIFIER, new TokenParam("urn:system", methodName));
pm.setSort(new SortSpec(IAnyResource.SP_RES_ID).setOrder(SortOrderEnum.ASC));
actual = toUnqualifiedVersionlessIds(myPatientDao.search(pm));
assertEquals(6, actual.size());
assertThat(actual, contains(idMethodName1, idMethodName2, id1, id2, id3, id4));
pm = SearchParameterMap.newSynchronous();
pm.add(Patient.SP_IDENTIFIER, new TokenParam("urn:system", methodName));
pm.setSort(new SortSpec(IAnyResource.SP_RES_ID).setOrder(SortOrderEnum.DESC));
actual = toUnqualifiedVersionlessIds(myPatientDao.search(pm));
assertEquals(6, actual.size());
assertThat(actual, contains(id4, id3, id2, id1, idMethodName2, idMethodName1));
}
@ParameterizedTest
@ValueSource(booleans = {true, false})
public void testSortByMissingAttribute(boolean theIndexMissingData) {


@ -1,7 +1,6 @@
package ca.uhn.fhir.jpa.dao.r4;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.api.svc.ISearchCoordinatorSvc;
import ca.uhn.fhir.jpa.dao.data.ISearchDao;
import ca.uhn.fhir.jpa.dao.data.ISearchResultDao;
import ca.uhn.fhir.jpa.entity.Search;
@ -16,6 +15,8 @@ import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.Date;
import java.util.UUID;
@ -31,22 +32,20 @@ public class SearchCoordinatorSvcImplTest extends BaseJpaR4Test {
@Autowired
private ISearchResultDao mySearchResultDao;
@Autowired
private ISearchCoordinatorSvc mySearchCoordinator;
@Autowired
private ISearchCacheSvc myDatabaseCacheSvc;
@AfterEach
public void after() {
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteInOnePassForUnitTest(DatabaseSearchCacheSvcImpl.DEFAULT_MAX_RESULTS_TO_DELETE_IN_ONE_PAS);
DatabaseSearchCacheSvcImpl.setMaximumSearchesToCheckForDeletionCandidacyForUnitTest(DEFAULT_MAX_DELETE_CANDIDATES_TO_FIND);
}
/**
* Semi-obsolete test. This used to test incremental deletion, but we now work until done or a timeout.
*/
@Test
public void testDeleteDontMarkPreviouslyMarkedSearchesAsDeleted() {
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteInOnePassForUnitTest(5);
DatabaseSearchCacheSvcImpl.setMaximumSearchesToCheckForDeletionCandidacyForUnitTest(10);
runInTransaction(()->{
mySearchResultDao.deleteAll();
@ -86,28 +85,12 @@ public class SearchCoordinatorSvcImplTest extends BaseJpaR4Test {
assertEquals(30, mySearchResultDao.count());
});
-myDatabaseCacheSvc.pollForStaleSearchesAndDeleteThem(RequestPartitionId.allPartitions());
+myDatabaseCacheSvc.pollForStaleSearchesAndDeleteThem(RequestPartitionId.allPartitions(), Instant.now().plus(10, ChronoUnit.SECONDS));
runInTransaction(()->{
// We should delete up to 10, but 3 don't get deleted since they have too many results to delete in one pass
assertEquals(13, mySearchDao.count());
assertEquals(3, mySearchDao.countDeleted());
// We delete a max of 5 results per search, so half are gone
assertEquals(15, mySearchResultDao.count());
});
myDatabaseCacheSvc.pollForStaleSearchesAndDeleteThem(RequestPartitionId.allPartitions());
runInTransaction(()->{
// Once again we attempt to delete 10, but the first 3 don't get deleted and still remain
// (total is 6 because 3 weren't deleted, and they blocked another 3 that might have been)
assertEquals(6, mySearchDao.count());
assertEquals(6, mySearchDao.countDeleted());
assertEquals(0, mySearchResultDao.count());
});
myDatabaseCacheSvc.pollForStaleSearchesAndDeleteThem(RequestPartitionId.allPartitions());
runInTransaction(()->{
assertEquals(0, mySearchDao.count());
assertEquals(0, mySearchDao.countDeleted());
// We delete a max of 5 results per search, so half are gone
assertEquals(0, mySearchResultDao.count());
});
}
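
The trimmed assertions above follow from the API change visible in this hunk: pollForStaleSearchesAndDeleteThem now takes a cutoff Instant and, as the new javadoc puts it, works until done or a timeout, rather than deleting a fixed batch per invocation. A hedged sketch of a scheduled caller under the new signature (the one-minute budget is an arbitrary choice, not from the source):

// Give each scheduled sweep up to one minute of work.
Instant deadline = Instant.now().plus(1, ChronoUnit.MINUTES);
myDatabaseCacheSvc.pollForStaleSearchesAndDeleteThem(RequestPartitionId.allPartitions(), deadline);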


@ -22,7 +22,6 @@ import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.stream.Collectors;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertFalse;
@ -53,7 +52,7 @@ public class PackageInstallerSvcImplCreateTest extends BaseJpaR4Test {
final NamingSystem namingSystem = new NamingSystem();
namingSystem.getUniqueId().add(new NamingSystem.NamingSystemUniqueIdComponent().setValue("123"));
-create(namingSystem);
+install(namingSystem);
assertEquals(1, myNamingSystemDao.search(SearchParameterMap.newSynchronous(), REQUEST_DETAILS).getAllResources().size());
}
@ -184,7 +183,7 @@ public class PackageInstallerSvcImplCreateTest extends BaseJpaR4Test {
}
private void createValueSetAndCallCreate(String theOid, String theResourceVersion, String theValueSetVersion, String theUrl, String theCopyright) throws IOException {
-create(createValueSet(theOid, theResourceVersion, theValueSetVersion, theUrl, theCopyright));
+install(createValueSet(theOid, theResourceVersion, theValueSetVersion, theUrl, theCopyright));
}
@Nonnull
@ -199,8 +198,8 @@ public class PackageInstallerSvcImplCreateTest extends BaseJpaR4Test {
return valueSetFromFirstIg;
}
-private void create(IBaseResource theResource) throws IOException {
-mySvc.create(theResource, createInstallationSpec(packageToBytes()), new PackageInstallOutcomeJson());
+private void install(IBaseResource theResource) throws IOException {
+mySvc.install(theResource, createInstallationSpec(packageToBytes()), new PackageInstallOutcomeJson());
}
@Nonnull


@ -38,7 +38,7 @@ public class PackageInstallerSvcImplRewriteHistoryTest extends BaseJpaR4Test {
// execute
// red-green this threw a NPE before the fix
-mySvc.updateResource(myConceptMapDao, conceptMap);
+mySvc.createOrUpdateResource(myConceptMapDao, conceptMap, null);
// verify
ConceptMap readConceptMap = myConceptMapDao.read(CONCEPT_MAP_TEST_ID);


@ -10,6 +10,7 @@ import ca.uhn.fhir.jpa.binary.interceptor.BinaryStorageInterceptor;
import ca.uhn.fhir.jpa.binstore.MemoryBinaryStorageSvcImpl;
import ca.uhn.fhir.jpa.model.entity.StorageSettings;
import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
import ca.uhn.fhir.model.primitive.IdDt;
import ca.uhn.fhir.rest.api.RestOperationTypeEnum;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.client.api.IClientInterceptor;
@ -18,13 +19,16 @@ import ca.uhn.fhir.rest.client.api.IHttpResponse;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.util.HapiExtensions;
import org.hl7.fhir.instance.model.api.IBaseHasExtensions;
import org.hl7.fhir.instance.model.api.IBaseMetaType;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.Binary;
import org.hl7.fhir.r4.model.Bundle;
import org.hl7.fhir.r4.model.DocumentReference;
import org.hl7.fhir.r4.model.Enumerations;
import org.hl7.fhir.r4.model.Extension;
import org.hl7.fhir.r4.model.IdType;
import org.hl7.fhir.r4.model.Patient;
import org.hl7.fhir.r4.model.Resource;
import org.hl7.fhir.r4.model.StringType;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
@ -32,7 +36,6 @@ import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.EnumSource;
import org.junit.jupiter.params.provider.ValueSource;
import org.mockito.junit.jupiter.MockitoExtension;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@ -62,6 +65,7 @@ public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
public static final byte[] FEW_BYTES = {4, 3, 2, 1};
public static final byte[] SOME_BYTES = {1, 2, 3, 4, 5, 6, 7, 8, 7, 6, 5, 4, 3, 2, 1, 8, 9, 0, 10, 9};
public static final byte[] SOME_BYTES_2 = {6, 7, 8, 7, 6, 5, 4, 3, 2, 1, 5, 5, 5, 6};
public static final byte[] SOME_BYTES_3 = {5, 5, 5, 6, 6, 7, 7, 7, 8, 8, 8};
private static final Logger ourLog = LoggerFactory.getLogger(BinaryStorageInterceptorR4Test.class);
@Autowired
@ -381,12 +385,8 @@ public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
// Create a resource with a big enough docRef
DocumentReference docRef = new DocumentReference();
-DocumentReference.DocumentReferenceContentComponent content = docRef.addContent();
-content.getAttachment().setContentType("application/octet-stream");
-content.getAttachment().setData(SOME_BYTES);
-DocumentReference.DocumentReferenceContentComponent content2 = docRef.addContent();
-content2.getAttachment().setContentType("application/octet-stream");
-content2.getAttachment().setData(SOME_BYTES_2);
+addDocumentAttachmentData(docRef, SOME_BYTES);
+addDocumentAttachmentData(docRef, SOME_BYTES_2);
DaoMethodOutcome outcome = myDocumentReferenceDao.create(docRef, mySrd);
// Make sure it was externalized
@ -422,18 +422,73 @@ public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
}
@Test
public void testCreateBinaryAttachments_bundleWithMultipleDocumentReferences_createdAndReadBackSuccessfully() {
// Create Patient
Patient patient = new Patient();
patient.addIdentifier().setSystem("urn:system").setValue("001");
patient.addName().addGiven("Johnny").setFamily("Walker");
// Create first DocumentReference with a big enough attachments
DocumentReference docRef = new DocumentReference();
addDocumentAttachmentData(docRef, SOME_BYTES);
addDocumentAttachmentData(docRef, SOME_BYTES_2);
// Create second DocumentReference with a big enough attachment
DocumentReference docRef2 = new DocumentReference();
addDocumentAttachmentData(docRef2, SOME_BYTES_3);
// Create Bundle
Bundle bundle = new Bundle();
bundle.setType(Bundle.BundleType.TRANSACTION);
// Patient entry component
addBundleEntry(bundle, patient, "Patient");
// First DocumentReference entry component
addBundleEntry(bundle, docRef, "DocumentReference");
// Second DocumentReference entry component
addBundleEntry(bundle, docRef2, "DocumentReference");
// Execute transaction
Bundle output = myClient.transaction().withBundle(bundle).execute();
ourLog.debug(myFhirContext.newXmlParser().setPrettyPrint(true).encodeResourceToString(output));
// Verify bundle response
assertEquals(3, output.getEntry().size());
output.getEntry().forEach(entry -> assertEquals("201 Created", entry.getResponse().getStatus()));
// Read back and verify first DocumentReference and attachments
IIdType firstDocRef = new IdType(output.getEntry().get(1).getResponse().getLocation());
DocumentReference firstDoc = myDocumentReferenceDao.read(firstDocRef, mySrd);
assertEquals("application/octet-stream", firstDoc.getContentFirstRep().getAttachment().getContentType());
assertArrayEquals(SOME_BYTES, firstDoc.getContentFirstRep().getAttachment().getData());
assertEquals("application/octet-stream", firstDoc.getContent().get(1).getAttachment().getContentType());
assertArrayEquals(SOME_BYTES_2, firstDoc.getContent().get(1).getAttachment().getData());
// Read back and verify second DocumentReference and attachment
IIdType secondDocRef = new IdType(output.getEntry().get(2).getResponse().getLocation());
DocumentReference secondDoc = myDocumentReferenceDao.read(secondDocRef, mySrd);
assertEquals("application/octet-stream", secondDoc.getContentFirstRep().getAttachment().getContentType());
assertArrayEquals(SOME_BYTES_3, secondDoc.getContentFirstRep().getAttachment().getData());
}
private void addBundleEntry(Bundle theBundle, Resource theResource, String theUrl) {
Bundle.BundleEntryComponent getComponent = new Bundle.BundleEntryComponent();
Bundle.BundleEntryRequestComponent requestComponent = new Bundle.BundleEntryRequestComponent();
requestComponent.setMethod(Bundle.HTTPVerb.POST);
requestComponent.setUrl(theUrl);
getComponent.setRequest(requestComponent);
getComponent.setResource(theResource);
getComponent.setFullUrl(IdDt.newRandomUuid().getValue());
theBundle.addEntry(getComponent);
}
@Test
public void testUpdateRejectsIncorrectBinary() {
// Create a resource with a big enough docRef
DocumentReference docRef = new DocumentReference();
DocumentReference.DocumentReferenceContentComponent content = docRef.addContent();
content.getAttachment().setContentType("application/octet-stream");
content.getAttachment().setData(SOME_BYTES);
DocumentReference.DocumentReferenceContentComponent content2 = docRef.addContent();
content2.getAttachment().setContentType("application/octet-stream");
content2.getAttachment().setData(SOME_BYTES_2);
addDocumentAttachmentData(docRef, SOME_BYTES);
addDocumentAttachmentData(docRef, SOME_BYTES_2);
DaoMethodOutcome outcome = myDocumentReferenceDao.create(docRef, mySrd);
// Make sure it was externalized
@ -449,13 +504,13 @@ public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
docRef = new DocumentReference();
docRef.setId(id.toUnqualifiedVersionless());
docRef.setStatus(Enumerations.DocumentReferenceStatus.CURRENT);
content = docRef.addContent();
DocumentReference.DocumentReferenceContentComponent content = docRef.addContent();
content.getAttachment().setContentType("application/octet-stream");
content.getAttachment().getDataElement().addExtension(
HapiExtensions.EXT_EXTERNALIZED_BINARY_ID,
new StringType(binaryId)
);
content2 = docRef.addContent();
DocumentReference.DocumentReferenceContentComponent content2 = docRef.addContent();
content2.getAttachment().setContentType("application/octet-stream");
content2.getAttachment().getDataElement().addExtension(
HapiExtensions.EXT_EXTERNALIZED_BINARY_ID,
@ -497,5 +552,10 @@ public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
}
private void addDocumentAttachmentData(DocumentReference theDocumentReference, byte[] theData) {
DocumentReference.DocumentReferenceContentComponent content = theDocumentReference.addContent();
content.getAttachment().setContentType("application/octet-stream");
content.getAttachment().setData(theData);
}
}

View File

@ -12,6 +12,7 @@ import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.PreferReturnEnum;
import ca.uhn.fhir.rest.api.RestOperationTypeEnum;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.client.api.IHttpResponse;
import ca.uhn.fhir.rest.client.interceptor.CapturingInterceptor;
import ca.uhn.fhir.rest.gclient.StringClientParam;
import ca.uhn.fhir.rest.server.exceptions.BaseServerResponseException;
@ -66,12 +67,15 @@ import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import static org.apache.commons.lang3.StringUtils.isEmpty;
import static org.apache.commons.lang3.StringUtils.leftPad;
import static org.awaitility.Awaitility.await;
import static org.hamcrest.CoreMatchers.containsString;
import static org.hamcrest.CoreMatchers.equalTo;
import static org.hamcrest.CoreMatchers.is;
import static org.hamcrest.CoreMatchers.not;
import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.hasItem;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.empty;
import static org.hamcrest.Matchers.not;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.blankOrNullString;
import static org.hamcrest.Matchers.hasSize;
@ -189,8 +193,64 @@ public class ConsentInterceptorResourceProviderR4IT extends BaseResourceProvider
myServer.getRestfulServer().getInterceptorService().registerInterceptor(myConsentInterceptor);
// Perform a search and only allow even
String context = "active consent - hide odd";
consentService.setTarget(new ConsentSvcCantSeeOddNumbered());
Bundle result = myClient
List<String> returnedIdValues = searchForObservations();
assertEquals(myObservationIdsEvenOnly.subList(0, 15), returnedIdValues);
assertResponseIsNotFromCache(context, capture.getLastResponse());
// Perform a search and only allow odd
context = "active consent - hide even";
consentService.setTarget(new ConsentSvcCantSeeEvenNumbered());
returnedIdValues = searchForObservations();
assertEquals(myObservationIdsOddOnly.subList(0, 15), returnedIdValues);
assertResponseIsNotFromCache(context, capture.getLastResponse());
// Perform a search and allow all with a PROCEED
context = "active consent - PROCEED on cache";
consentService.setTarget(new ConsentSvcNop(ConsentOperationStatusEnum.PROCEED));
returnedIdValues = searchForObservations();
assertEquals(myObservationIds.subList(0, 15), returnedIdValues);
assertResponseIsNotFromCache(context, capture.getLastResponse());
// Perform a search and allow all with an AUTHORIZED (no further checking)
context = "active consent - AUTHORIZED after a PROCEED";
consentService.setTarget(new ConsentSvcNop(ConsentOperationStatusEnum.AUTHORIZED));
returnedIdValues = searchForObservations();
assertEquals(myObservationIds.subList(0, 15), returnedIdValues);
// Perform a second search and allow all with an AUTHORIZED (no further checking)
// which means we should finally get one from the cache
context = "active consent - AUTHORIZED after AUTHORIZED";
consentService.setTarget(new ConsentSvcNop(ConsentOperationStatusEnum.AUTHORIZED));
returnedIdValues = searchForObservations();
assertEquals(myObservationIds.subList(0, 15), returnedIdValues);
assertResponseIsFromCache(context, capture.getLastResponse());
// Perform another search, now with an active consent interceptor that promises not to use canSeeResource.
// Should re-use cache result
context = "active consent - canSeeResource disabled, after AUTHORIZED - should reuse cache";
consentService.setTarget(new ConsentSvcNop(ConsentOperationStatusEnum.PROCEED, false));
returnedIdValues = searchForObservations();
assertEquals(myObservationIds.subList(0, 15), returnedIdValues);
assertResponseIsFromCache(context, capture.getLastResponse());
myClient.unregisterInterceptor(capture);
}
private static void assertResponseIsNotFromCache(String theContext, IHttpResponse lastResponse) {
List<String> cacheOutcome = lastResponse.getHeaders(Constants.HEADER_X_CACHE);
assertThat(theContext + " - No cache response headers", cacheOutcome, empty());
}
private static void assertResponseIsFromCache(String theContext, IHttpResponse lastResponse) {
List<String> cacheOutcome = lastResponse.getHeaders(Constants.HEADER_X_CACHE);
assertThat(theContext + " - Response came from cache", cacheOutcome, hasItem(matchesPattern("^HIT from .*")));
}
private List<String> searchForObservations() {
Bundle result;
result = myClient
.search()
.forResource("Observation")
.sort()
@ -199,77 +259,7 @@ public class ConsentInterceptorResourceProviderR4IT extends BaseResourceProvider
.count(15)
.execute();
List<IBaseResource> resources = BundleUtil.toListOfResources(myFhirContext, result);
List<String> returnedIdValues = toUnqualifiedVersionlessIdValues(resources);
return toUnqualifiedVersionlessIdValues(resources);
assertEquals(myObservationIdsEvenOnly.subList(0, 15), returnedIdValues);
List<String> cacheOutcome = capture.getLastResponse().getHeaders(Constants.HEADER_X_CACHE);
assertEquals(0, cacheOutcome.size());
// Perform a search and only allow odd
consentService.setTarget(new ConsentSvcCantSeeEvenNumbered());
result = myClient
.search()
.forResource("Observation")
.sort()
.ascending(Observation.SP_IDENTIFIER)
.returnBundle(Bundle.class)
.count(15)
.execute();
resources = BundleUtil.toListOfResources(myFhirContext, result);
returnedIdValues = toUnqualifiedVersionlessIdValues(resources);
assertEquals(myObservationIdsOddOnly.subList(0, 15), returnedIdValues);
cacheOutcome = capture.getLastResponse().getHeaders(Constants.HEADER_X_CACHE);
assertEquals(0, cacheOutcome.size());
// Perform a search and allow all with a PROCEED
consentService.setTarget(new ConsentSvcNop(ConsentOperationStatusEnum.PROCEED));
result = myClient
.search()
.forResource("Observation")
.sort()
.ascending(Observation.SP_IDENTIFIER)
.returnBundle(Bundle.class)
.count(15)
.execute();
resources = BundleUtil.toListOfResources(myFhirContext, result);
returnedIdValues = toUnqualifiedVersionlessIdValues(resources);
assertEquals(myObservationIds.subList(0, 15), returnedIdValues);
cacheOutcome = capture.getLastResponse().getHeaders(Constants.HEADER_X_CACHE);
assertEquals(0, cacheOutcome.size());
// Perform a search and allow all with an AUTHORIZED (no further checking)
consentService.setTarget(new ConsentSvcNop(ConsentOperationStatusEnum.AUTHORIZED));
result = myClient
.search()
.forResource("Observation")
.sort()
.ascending(Observation.SP_IDENTIFIER)
.returnBundle(Bundle.class)
.count(15)
.execute();
resources = BundleUtil.toListOfResources(myFhirContext, result);
returnedIdValues = toUnqualifiedVersionlessIdValues(resources);
assertEquals(myObservationIds.subList(0, 15), returnedIdValues);
cacheOutcome = capture.getLastResponse().getHeaders(Constants.HEADER_X_CACHE);
assertEquals(0, cacheOutcome.size());
// Perform a second search and allow all with an AUTHORIZED (no further checking)
// which means we should finally get one from the cache
consentService.setTarget(new ConsentSvcNop(ConsentOperationStatusEnum.AUTHORIZED));
result = myClient
.search()
.forResource("Observation")
.sort()
.ascending(Observation.SP_IDENTIFIER)
.returnBundle(Bundle.class)
.count(15)
.execute();
resources = BundleUtil.toListOfResources(myFhirContext, result);
returnedIdValues = toUnqualifiedVersionlessIdValues(resources);
assertEquals(myObservationIds.subList(0, 15), returnedIdValues);
cacheOutcome = capture.getLastResponse().getHeaders(Constants.HEADER_X_CACHE);
assertThat(cacheOutcome.get(0), matchesPattern("^HIT from .*"));
myClient.unregisterInterceptor(capture);
}
@Test
@ -528,6 +518,7 @@ public class ConsentInterceptorResourceProviderR4IT extends BaseResourceProvider
IConsentService svc = mock(IConsentService.class);
when(svc.startOperation(any(), any())).thenReturn(ConsentOutcome.PROCEED);
when(svc.shouldProcessCanSeeResource(any(), any())).thenReturn(true);
when(svc.canSeeResource(any(), any(), any())).thenReturn(ConsentOutcome.REJECT);
consentService.setTarget(svc);
@ -560,6 +551,7 @@ public class ConsentInterceptorResourceProviderR4IT extends BaseResourceProvider
IConsentService svc = mock(IConsentService.class);
when(svc.startOperation(any(), any())).thenReturn(ConsentOutcome.PROCEED);
when(svc.shouldProcessCanSeeResource(any(), any())).thenReturn(true);
when(svc.canSeeResource(any(RequestDetails.class), any(IBaseResource.class), any())).thenAnswer(t -> {
IBaseResource resource = t.getArgument(1, IBaseResource.class);
if (resource instanceof Organization) {
@ -998,16 +990,27 @@ public class ConsentInterceptorResourceProviderR4IT extends BaseResourceProvider
private static class ConsentSvcNop implements IConsentService {
private final ConsentOperationStatusEnum myOperationStatus;
private boolean myEnableCanSeeResource = true;
private ConsentSvcNop(ConsentOperationStatusEnum theOperationStatus) {
myOperationStatus = theOperationStatus;
}
private ConsentSvcNop(ConsentOperationStatusEnum theOperationStatus, boolean theEnableCanSeeResource) {
myOperationStatus = theOperationStatus;
myEnableCanSeeResource = theEnableCanSeeResource;
}
@Override
public ConsentOutcome startOperation(RequestDetails theRequestDetails, IConsentContextServices theContextServices) {
return new ConsentOutcome(myOperationStatus);
}
@Override
public boolean shouldProcessCanSeeResource(RequestDetails theRequestDetails, IConsentContextServices theContextServices) {
return myEnableCanSeeResource;
}
@Override
public ConsentOutcome canSeeResource(RequestDetails theRequestDetails, IBaseResource theResource, IConsentContextServices theContextServices) {
return new ConsentOutcome(ConsentOperationStatusEnum.PROCEED);

View File

@ -23,6 +23,7 @@ import ca.uhn.fhir.rest.server.RestfulServer;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.rest.server.exceptions.MethodNotAllowedException;
import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
import ca.uhn.fhir.rest.server.interceptor.auth.SearchNarrowingInterceptor;
import ca.uhn.fhir.rest.server.provider.ProviderConstants;
import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
import ca.uhn.fhir.test.utilities.ITestDataBuilder;
@ -40,11 +41,14 @@ import org.hl7.fhir.r4.model.Condition;
import org.hl7.fhir.r4.model.IdType;
import org.hl7.fhir.r4.model.Organization;
import org.hl7.fhir.r4.model.Patient;
import org.hl7.fhir.r4.model.Resource;
import org.hl7.fhir.r4.model.StringType;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Nested;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.ValueSource;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.Spy;
@ -65,6 +69,7 @@ import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.hasSize;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertNull;
import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.junit.jupiter.api.Assertions.fail;
@ -424,6 +429,55 @@ public class MultitenantServerR4Test extends BaseMultitenantResourceProviderR4Te
}
@Test
public void testTransactionPut_withSearchNarrowingInterceptor_createsPatient() {
// setup
IBaseResource patientA = buildPatient(withTenant(TENANT_B), withActiveTrue(), withId("1234a"),
withFamily("Family"), withGiven("Given"));
Bundle transactioBundle = new Bundle();
transactioBundle.setType(Bundle.BundleType.TRANSACTION);
transactioBundle.addEntry()
.setFullUrl("http://localhost:8000/TENANT-A/Patient/1234a")
.setResource((Resource) patientA)
.getRequest().setUrl("Patient/1234a").setMethod(Bundle.HTTPVerb.PUT);
myServer.registerInterceptor(new SearchNarrowingInterceptor());
// execute
myClient.transaction().withBundle(transactioBundle).execute();
// verify - read back using DAO
SystemRequestDetails requestDetails = new SystemRequestDetails();
requestDetails.setTenantId(TENANT_B);
Patient patient1 = myPatientDao.read(new IdType("Patient/1234a"), requestDetails);
assertEquals("Family", patient1.getName().get(0).getFamily());
}
@ParameterizedTest
@ValueSource(strings = {"Patient/1234a", "TENANT-B/Patient/1234a"})
public void testTransactionGet_withSearchNarrowingInterceptor_retrievesPatient(String theEntryUrl) {
// setup
createPatient(withTenant(TENANT_B), withActiveTrue(), withId("1234a"),
withFamily("Family"), withGiven("Given"));
Bundle transactioBundle = new Bundle();
transactioBundle.setType(Bundle.BundleType.TRANSACTION);
transactioBundle.addEntry()
.getRequest().setUrl(theEntryUrl).setMethod(Bundle.HTTPVerb.GET);
myServer.registerInterceptor(new SearchNarrowingInterceptor());
// execute
Bundle result = myClient.transaction().withBundle(transactioBundle).execute();
// verify
assertEquals(1, result.getEntry().size());
Patient retrievedPatient = (Patient) result.getEntry().get(0).getResource();
assertNotNull(retrievedPatient);
assertEquals("Family", retrievedPatient.getName().get(0).getFamily());
}
@Test
public void testDirectDaoAccess_PartitionInRequestDetails_Create() {

View File

@ -48,7 +48,7 @@ public class StaleSearchDeletingSvcR4Test extends BaseResourceProviderR4Test {
super.after();
DatabaseSearchCacheSvcImpl staleSearchDeletingSvc = AopTestUtils.getTargetObject(mySearchCacheSvc);
staleSearchDeletingSvc.setCutoffSlackForUnitTest(DatabaseSearchCacheSvcImpl.SEARCH_CLEANUP_JOB_INTERVAL_MILLIS);
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteForUnitTest(DatabaseSearchCacheSvcImpl.DEFAULT_MAX_RESULTS_TO_DELETE_IN_ONE_STMT);
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteInOneStatement(DatabaseSearchCacheSvcImpl.DEFAULT_MAX_RESULTS_TO_DELETE_IN_ONE_STMT);
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteInOnePassForUnitTest(DatabaseSearchCacheSvcImpl.DEFAULT_MAX_RESULTS_TO_DELETE_IN_ONE_PAS);
}
@ -108,7 +108,7 @@ public class StaleSearchDeletingSvcR4Test extends BaseResourceProviderR4Test {
@Test
public void testDeleteVeryLargeSearch() {
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteForUnitTest(10);
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteInOneStatement(10);
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteInOnePassForUnitTest(10);
runInTransaction(() -> {
@ -120,24 +120,21 @@ public class StaleSearchDeletingSvcR4Test extends BaseResourceProviderR4Test {
search.setResourceType("Patient");
search = mySearchEntityDao.save(search);
for (int i = 0; i < 15; i++) {
ResourceTable resource = new ResourceTable();
resource.setPublished(new Date());
resource.setUpdated(new Date());
resource.setResourceType("Patient");
resource = myResourceTableDao.saveAndFlush(resource);
for (int i = 0; i < 50; i++) {
SearchResult sr = new SearchResult(search);
sr.setOrder(i);
sr.setResourcePid(resource.getId());
mySearchResultDao.save(sr);
}
});
// It should take two passes to delete the search fully
// we are able to delete this in one pass.
runInTransaction(() -> assertEquals(1, mySearchEntityDao.count()));
myStaleSearchDeletingSvc.pollForStaleSearchesAndDeleteThem();
runInTransaction(() -> assertEquals(1, mySearchEntityDao.count()));
myStaleSearchDeletingSvc.pollForStaleSearchesAndDeleteThem();
runInTransaction(() -> assertEquals(0, mySearchEntityDao.count()));
@ -146,7 +143,7 @@ public class StaleSearchDeletingSvcR4Test extends BaseResourceProviderR4Test {
@Test
public void testDeleteVerySmallSearch() {
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteForUnitTest(10);
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteInOneStatement(10);
runInTransaction(() -> {
Search search = new Search();
@ -172,7 +169,7 @@ public class StaleSearchDeletingSvcR4Test extends BaseResourceProviderR4Test {
@Test
public void testDontDeleteSearchBeforeExpiry() {
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteForUnitTest(10);
DatabaseSearchCacheSvcImpl.setMaximumResultsToDeleteInOneStatement(10);
runInTransaction(() -> {
Search search = new Search();
@ -186,7 +183,7 @@ public class StaleSearchDeletingSvcR4Test extends BaseResourceProviderR4Test {
search.setCreated(DateUtils.addDays(new Date(), -10000));
search.setSearchType(SearchTypeEnum.SEARCH);
search.setResourceType("Patient");
search = mySearchEntityDao.save(search);
mySearchEntityDao.save(search);
});

View File

@ -253,22 +253,24 @@ public class MessageSubscriptionR4Test extends BaseSubscriptionsR4Test {
}
@Test
public void testPersistedResourceModifiedMessage_whenFetchFromDb_willEqualOriginalMessage() throws JsonProcessingException {
public void testMethodInflatePersistedResourceModifiedMessage_whenGivenResourceModifiedMessageWithEmptyPayload_willEqualOriginalMessage() {
mySubscriptionTestUtil.unregisterSubscriptionInterceptor();
// given
// setup
TransactionTemplate transactionTemplate = new TransactionTemplate(myTxManager);
Observation obs = sendObservation("zoop", "SNOMED-CT", "theExplicitSource", "theRequestId");
ResourceModifiedMessage originalResourceModifiedMessage = createResourceModifiedMessage(obs);
ResourceModifiedMessage resourceModifiedMessageWithEmptyPayload = createResourceModifiedMessage(obs);
resourceModifiedMessageWithEmptyPayload.setPayloadToNull();
transactionTemplate.execute(tx -> {
IPersistedResourceModifiedMessage persistedResourceModifiedMessage = myResourceModifiedMessagePersistenceSvc.persist(originalResourceModifiedMessage);
myResourceModifiedMessagePersistenceSvc.persist(originalResourceModifiedMessage);
// when
// execute
ResourceModifiedMessage restoredResourceModifiedMessage = myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessage(persistedResourceModifiedMessage);
ResourceModifiedMessage restoredResourceModifiedMessage = myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessage(resourceModifiedMessageWithEmptyPayload);
// then
// verify
assertEquals(toJson(originalResourceModifiedMessage), toJson(restoredResourceModifiedMessage));
assertEquals(originalResourceModifiedMessage, restoredResourceModifiedMessage);

View File

@ -105,7 +105,7 @@ public class ResourceModifiedSubmitterSvcTest {
// given
// a successful deletion implies that the message did exist.
when(myResourceModifiedMessagePersistenceSvc.deleteByPK(any())).thenReturn(true);
when(myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessage(any())).thenReturn(new ResourceModifiedMessage());
when(myResourceModifiedMessagePersistenceSvc.createResourceModifiedMessageFromEntityWithoutInflation(any())).thenReturn(new ResourceModifiedMessage());
// when
boolean wasProcessed = myResourceModifiedSubmitterSvc.submitPersisedResourceModifiedMessage(new ResourceModifiedEntity());
@ -134,7 +134,7 @@ public class ResourceModifiedSubmitterSvcTest {
// when
when(myResourceModifiedMessagePersistenceSvc.deleteByPK(any()))
.thenThrow(new RuntimeException(deleteExMsg));
when(myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessage(any()))
when(myResourceModifiedMessagePersistenceSvc.createResourceModifiedMessageFromEntityWithoutInflation(any()))
.thenThrow(new RuntimeException(inflationExMsg));
// test
@ -180,7 +180,7 @@ public class ResourceModifiedSubmitterSvcTest {
// when
when(myResourceModifiedMessagePersistenceSvc.deleteByPK(any()))
.thenReturn(true);
when(myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessage(any()))
when(myResourceModifiedMessagePersistenceSvc.createResourceModifiedMessageFromEntityWithoutInflation(any()))
.thenReturn(msg);
when(myChannelProducer.send(any()))
.thenThrow(new RuntimeException(exceptionString));
@ -206,7 +206,7 @@ public class ResourceModifiedSubmitterSvcTest {
// given
// deletion fails, someone else was faster and processed the message
when(myResourceModifiedMessagePersistenceSvc.deleteByPK(any())).thenReturn(false);
when(myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessage(any())).thenReturn(new ResourceModifiedMessage());
when(myResourceModifiedMessagePersistenceSvc.createResourceModifiedMessageFromEntityWithoutInflation(any())).thenReturn(new ResourceModifiedMessage());
// when
boolean wasProcessed = myResourceModifiedSubmitterSvc.submitPersisedResourceModifiedMessage(new ResourceModifiedEntity());
@ -223,7 +223,7 @@ public class ResourceModifiedSubmitterSvcTest {
public void testSubmitPersistedResourceModifiedMessage_whitErrorOnSending_willRollbackDeletion(){
// given
when(myResourceModifiedMessagePersistenceSvc.deleteByPK(any())).thenReturn(true);
when(myResourceModifiedMessagePersistenceSvc.inflatePersistedResourceModifiedMessage(any())).thenReturn(new ResourceModifiedMessage());
when(myResourceModifiedMessagePersistenceSvc.createResourceModifiedMessageFromEntityWithoutInflation(any())).thenReturn(new ResourceModifiedMessage());
// simulate failure writing to the channel
when(myChannelProducer.send(any())).thenThrow(new MessageDeliveryException("sendingError"));

View File

@ -46,6 +46,9 @@ import java.util.List;
import java.util.stream.Collectors;
import javax.annotation.Nonnull;
import static org.apache.commons.lang3.ArrayUtils.EMPTY_STRING_ARRAY;
import static org.hamcrest.Matchers.contains;
import static org.hamcrest.Matchers.containsInAnyOrder;
import static org.hamcrest.Matchers.everyItem;
import static org.hamcrest.Matchers.hasItems;
import static org.hamcrest.Matchers.in;
@ -105,6 +108,10 @@ public class TestDaoSearch {
assertSearchResultIds(theQueryUrl, theReason, hasItems(theIds));
}
public void assertSearchFinds(String theReason, String theQueryUrl, List<String> theIds) {
assertSearchFinds(theReason, theQueryUrl, theIds.toArray(EMPTY_STRING_ARRAY));
}
/**
* Assert that the FHIR search has theIds in the search results.
* @param theReason junit reason message
@ -117,6 +124,27 @@ public class TestDaoSearch {
assertSearchResultIds(theQueryUrl, theReason, hasItems(bareIds));
}
public void assertSearchFindsInOrder(String theReason, String theQueryUrl, String... theIds) {
List<String> ids = searchForIds(theQueryUrl);
MatcherAssert.assertThat(theReason, ids, contains(theIds));
}
public void assertSearchFindsInOrder(String theReason, String theQueryUrl, List<String> theIds) {
assertSearchFindsInOrder(theReason, theQueryUrl, theIds.toArray(EMPTY_STRING_ARRAY));
}
public void assertSearchFindsOnly(String theReason, String theQueryUrl, String... theIds) {
assertSearchIdsMatch(theReason, theQueryUrl, containsInAnyOrder(theIds));
}
public void assertSearchIdsMatch(
String theReason, String theQueryUrl, Matcher<? super Iterable<String>> theMatchers) {
List<String> ids = searchForIds(theQueryUrl);
MatcherAssert.assertThat(theReason, ids, theMatchers);
}
public void assertSearchResultIds(String theQueryUrl, String theReason, Matcher<Iterable<String>> matcher) {
List<String> ids = searchForIds(theQueryUrl);

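The new ordered and exhaustive matchers are easiest to read with a concrete caller. A minimal sketch, assuming a JPA test class in which TestDaoSearch is available as an autowired bean and three Patient resources with known ids already exist; the ids and the sort parameter below are illustrative only, not part of this change:

@Autowired
private TestDaoSearch myTestDaoSearch;

@Test
public void sortByBirthdate_returnsPatientsInOrder() {
	// exact-order assertion: fails if "pat-b" comes back before "pat-a"
	myTestDaoSearch.assertSearchFindsInOrder(
		"patients sorted by birthdate",
		"Patient?_sort=birthdate",
		"pat-a", "pat-b", "pat-c");

	// unordered but exhaustive assertion: fails if any extra resource is returned
	myTestDaoSearch.assertSearchFindsOnly(
		"only the three created patients match",
		"Patient?active=true",
		"pat-a", "pat-b", "pat-c");
}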
View File

@ -0,0 +1,74 @@
/*-
* #%L
* HAPI FHIR JPA Server Test Utilities
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.jpa.search;
import ca.uhn.fhir.jpa.dao.TestDaoSearch;
import ca.uhn.fhir.test.utilities.ITestDataBuilder;
import org.hl7.fhir.instance.model.api.IIdType;
import org.junit.jupiter.api.Test;
import java.util.List;
public interface IIdSearchTestTemplate {
TestDaoSearch getSearch();
ITestDataBuilder getBuilder();
@Test
default void testSearchByServerAssignedId_findsResource() {
IIdType id = getBuilder().createPatient();
getSearch().assertSearchFinds("search by server assigned id", "Patient?_id=" + id.getIdPart(), id);
}
@Test
default void testSearchByClientAssignedId_findsResource() {
ITestDataBuilder b = getBuilder();
b.createPatient(b.withId("client-assigned-id"));
getSearch()
.assertSearchFinds(
"search by client assigned id", "Patient?_id=client-assigned-id", "client-assigned-id");
}
/**
* The _id SP is defined as token, and there is no system.
* So sorting should be string order of the value.
*/
@Test
default void testSortById_treatsIdsAsString() {
ITestDataBuilder b = getBuilder();
b.createPatient(b.withId("client-assigned-id"));
IIdType serverId = b.createPatient();
b.createPatient(b.withId("0-sorts-before-other-numbers"));
getSearch()
.assertSearchFindsInOrder(
"sort by resource id",
"Patient?_sort=_id",
List.of("0-sorts-before-other-numbers", serverId.getIdPart(), "client-assigned-id"));
getSearch()
.assertSearchFindsInOrder(
"reverse sort by resource id",
"Patient?_sort=-_id",
List.of("client-assigned-id", serverId.getIdPart(), "0-sorts-before-other-numbers"));
}
}
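Because the template carries its tests as JUnit 5 default methods, a backend only needs to supply the two collaborators. A minimal sketch, assuming a hypothetical JPA test class whose Spring context exposes TestDaoSearch and an ITestDataBuilder bean; the class name and base class are illustrative, not part of this change:

import ca.uhn.fhir.jpa.dao.TestDaoSearch;
import ca.uhn.fhir.test.utilities.ITestDataBuilder;
import org.springframework.beans.factory.annotation.Autowired;

public class FhirResourceDaoR4IdSearchTest extends BaseJpaR4Test implements IIdSearchTestTemplate {

	@Autowired
	private TestDaoSearch myTestDaoSearch;

	@Autowired
	private ITestDataBuilder myTestDataBuilder;

	@Override
	public TestDaoSearch getSearch() {
		return myTestDaoSearch;
	}

	@Override
	public ITestDataBuilder getBuilder() {
		return myTestDataBuilder;
	}

	// The default @Test methods declared on IIdSearchTestTemplate (server-assigned id,
	// client-assigned id, and _id sort order) run against this class's beans unchanged.
}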

View File

@ -431,6 +431,11 @@ public abstract class BaseJpaTest extends BaseTest {
return deliveryLatch;
}
protected void registerInterceptor(Object theInterceptor) {
myRegisteredInterceptors.add(theInterceptor);
myInterceptorRegistry.registerInterceptor(theInterceptor);
}
protected void purgeHibernateSearch(EntityManager theEntityManager) {
runInTransaction(() -> {
if (myFulltestSearchSvc != null && !myFulltestSearchSvc.isDisabled()) {

View File

@ -65,6 +65,7 @@ public class ConnectionWrapper implements Connection {
@Override
public void commit() throws SQLException {
if (ourLog.isTraceEnabled()) { ourLog.trace("commit: {}", myWrap.hashCode()); }
myWrap.commit();
}

View File

@ -46,6 +46,7 @@ public class ConnectionWrapper implements Connection {
@Override
public void commit() throws SQLException {
if (ourLog.isTraceEnabled()) { ourLog.trace("Commit: {}", myWrap.hashCode()); }
myWrap.commit();
}

View File

@ -36,7 +36,7 @@ public class HapiFhirHibernateJpaDialectTest {
assertThat(outcome.getMessage(), containsString("this is a message"));
try {
mySvc.convertHibernateAccessException(new ConstraintViolationException("this is a message", new SQLException("reason"), ResourceTable.IDX_RES_FHIR_ID));
mySvc.convertHibernateAccessException(new ConstraintViolationException("this is a message", new SQLException("reason"), ResourceTable.IDX_RES_TYPE_FHIR_ID));
fail();
} catch (ResourceVersionConflictException e) {
assertThat(e.getMessage(), containsString("The operation has failed with a client-assigned ID constraint failure"));
@ -67,7 +67,7 @@ public class HapiFhirHibernateJpaDialectTest {
assertEquals("FOO", outcome.getMessage());
try {
PersistenceException exception = new PersistenceException("a message", new ConstraintViolationException("this is a message", new SQLException("reason"), ResourceTable.IDX_RES_FHIR_ID));
PersistenceException exception = new PersistenceException("a message", new ConstraintViolationException("this is a message", new SQLException("reason"), ResourceTable.IDX_RES_TYPE_FHIR_ID));
mySvc.translate(exception, "a message");
fail();
} catch (ResourceVersionConflictException e) {

View File

@ -26,6 +26,7 @@ import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
import ca.uhn.fhir.rest.api.server.SystemRestfulResponse;
import ca.uhn.fhir.rest.server.RestfulServer;
import ca.uhn.hapi.fhir.cdshooks.svc.cr.CdsCrSettings;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.opencds.cqf.fhir.utility.Ids;
@ -39,6 +40,9 @@ public interface ICdsConfigService {
@Nonnull
ObjectMapper getObjectMapper();
@Nonnull
CdsCrSettings getCdsCrSettings();
@Nullable
default DaoRegistry getDaoRegistry() {
return null;

View File

@ -0,0 +1,37 @@
/*-
* #%L
* HAPI FHIR - CDS Hooks
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.hapi.fhir.cdshooks.config;
import ca.uhn.fhir.cr.config.CrConfigCondition;
import ca.uhn.fhir.cr.config.RepositoryConfig;
import ca.uhn.fhir.cr.config.r4.ApplyOperationConfig;
import org.springframework.context.annotation.Conditional;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
/**
* This class exists as a wrapper for the CR configs required for CDS on FHIR to be loaded only when dependencies are met.
* Adding the condition to the configs themselves causes issues with downstream projects.
*
*/
@Configuration
@Conditional(CrConfigCondition.class)
@Import({RepositoryConfig.class, ApplyOperationConfig.class})
public class CdsCrConfig {}

View File

@ -35,6 +35,7 @@ import ca.uhn.hapi.fhir.cdshooks.svc.CdsConfigServiceImpl;
import ca.uhn.hapi.fhir.cdshooks.svc.CdsHooksContextBooter;
import ca.uhn.hapi.fhir.cdshooks.svc.CdsServiceRegistryImpl;
import ca.uhn.hapi.fhir.cdshooks.svc.cr.CdsCrServiceRegistry;
import ca.uhn.hapi.fhir.cdshooks.svc.cr.CdsCrSettings;
import ca.uhn.hapi.fhir.cdshooks.svc.cr.CdsServiceInterceptor;
import ca.uhn.hapi.fhir.cdshooks.svc.cr.ICdsCrService;
import ca.uhn.hapi.fhir.cdshooks.svc.cr.ICdsCrServiceFactory;
@ -56,12 +57,14 @@ import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import java.lang.reflect.Constructor;
import java.lang.reflect.InvocationTargetException;
import java.util.Optional;
@Configuration
@Import(CdsCrConfig.class)
public class CdsHooksConfig {
private static final Logger ourLog = LoggerFactory.getLogger(CdsHooksConfig.class);
@ -128,8 +131,8 @@ public class CdsHooksConfig {
}
try {
Constructor<? extends ICdsCrService> constructor =
clazz.get().getConstructor(RequestDetails.class, Repository.class);
clazz.get().getConstructor(RequestDetails.class, Repository.class, ICdsConfigService.class);
return constructor.newInstance(rd, repository);
return constructor.newInstance(rd, repository, theCdsConfigService);
} catch (NoSuchMethodException
| InvocationTargetException
| InstantiationException
@ -189,9 +192,11 @@ public class CdsHooksConfig {
@Bean
public ICdsConfigService cdsConfigService(
FhirContext theFhirContext, @Qualifier(CDS_HOOKS_OBJECT_MAPPER_FACTORY) ObjectMapper theObjectMapper) {
FhirContext theFhirContext,
@Qualifier(CDS_HOOKS_OBJECT_MAPPER_FACTORY) ObjectMapper theObjectMapper,
CdsCrSettings theCdsCrSettings) {
return new CdsConfigServiceImpl(
theFhirContext, theObjectMapper, myDaoRegistry, myRepositoryFactory, myRestfulServer);
theFhirContext, theObjectMapper, theCdsCrSettings, myDaoRegistry, myRepositoryFactory, myRestfulServer);
}
@Bean

View File

@ -24,6 +24,7 @@ import ca.uhn.fhir.cr.common.IRepositoryFactory;
import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.rest.server.RestfulServer;
import ca.uhn.hapi.fhir.cdshooks.api.ICdsConfigService;
import ca.uhn.hapi.fhir.cdshooks.svc.cr.CdsCrSettings;
import com.fasterxml.jackson.databind.ObjectMapper;
import javax.annotation.Nonnull;
@ -32,6 +33,7 @@ import javax.annotation.Nullable;
public class CdsConfigServiceImpl implements ICdsConfigService {
private final FhirContext myFhirContext;
private final ObjectMapper myObjectMapper;
private final CdsCrSettings myCdsCrSettings;
private final DaoRegistry myDaoRegistry;
private final IRepositoryFactory myRepositoryFactory;
private final RestfulServer myRestfulServer;
@ -39,11 +41,13 @@ public class CdsConfigServiceImpl implements ICdsConfigService {
public CdsConfigServiceImpl(
@Nonnull FhirContext theFhirContext,
@Nonnull ObjectMapper theObjectMapper,
@Nonnull CdsCrSettings theCdsCrSettings,
@Nullable DaoRegistry theDaoRegistry,
@Nullable IRepositoryFactory theRepositoryFactory,
@Nullable RestfulServer theRestfulServer) {
myFhirContext = theFhirContext;
myObjectMapper = theObjectMapper;
myCdsCrSettings = theCdsCrSettings;
myDaoRegistry = theDaoRegistry;
myRepositoryFactory = theRepositoryFactory;
myRestfulServer = theRestfulServer;
@ -61,6 +65,12 @@ public class CdsConfigServiceImpl implements ICdsConfigService {
return myObjectMapper;
}
@Nonnull
@Override
public CdsCrSettings getCdsCrSettings() {
return myCdsCrSettings;
}
@Nullable
@Override
public DaoRegistry getDaoRegistry() {

View File

@ -21,7 +21,16 @@ package ca.uhn.hapi.fhir.cdshooks.svc.cr;
import ca.uhn.fhir.context.FhirVersionEnum;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.hapi.fhir.cdshooks.api.json.*;
import ca.uhn.hapi.fhir.cdshooks.api.ICdsConfigService;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceRequestAuthorizationJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceRequestJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseCardJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseCardSourceJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseLinkJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseSuggestionActionJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseSuggestionJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseSystemActionJson;
import org.hl7.fhir.dstu3.model.Bundle;
import org.hl7.fhir.dstu3.model.CarePlan;
import org.hl7.fhir.dstu3.model.Endpoint;
@ -60,10 +69,13 @@ import static org.opencds.cqf.fhir.utility.dstu3.Parameters.part;
public class CdsCrServiceDstu3 implements ICdsCrService {
protected final RequestDetails myRequestDetails;
protected final Repository myRepository;
protected final ICdsConfigService myCdsConfigService;
protected CarePlan myResponse;
protected CdsServiceResponseJson myServiceResponse;
public CdsCrServiceDstu3(RequestDetails theRequestDetails, Repository theRepository) {
public CdsCrServiceDstu3(
RequestDetails theRequestDetails, Repository theRepository, ICdsConfigService theCdsConfigService) {
myCdsConfigService = theCdsConfigService;
myRequestDetails = theRequestDetails;
myRepository = theRepository;
}
@ -108,6 +120,12 @@ public class CdsCrServiceDstu3 implements ICdsCrService {
endpoint.addHeader(String.format(
"Authorization: %s %s",
tokenType, theJson.getServiceRequestAuthorizationJson().getAccessToken()));
if (theJson.getServiceRequestAuthorizationJson().getSubject() != null) {
endpoint.addHeader(String.format(
"%s: %s",
myCdsConfigService.getCdsCrSettings().getClientIdHeaderName(),
theJson.getServiceRequestAuthorizationJson().getSubject()));
}
}
parameters.addParameter(part(APPLY_PARAMETER_DATA_ENDPOINT, endpoint));
}

View File

@ -22,7 +22,17 @@ package ca.uhn.hapi.fhir.cdshooks.svc.cr;
import ca.uhn.fhir.context.FhirVersionEnum;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.hapi.fhir.cdshooks.api.json.*;
import ca.uhn.hapi.fhir.cdshooks.api.ICdsConfigService;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceIndicatorEnum;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceRequestAuthorizationJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceRequestJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseCardJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseCardSourceJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseLinkJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseSuggestionActionJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseSuggestionJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseSystemActionJson;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.r4.model.Bundle;
import org.hl7.fhir.r4.model.CanonicalType;
@ -61,10 +71,13 @@ import static org.opencds.cqf.fhir.utility.r4.Parameters.part;
public class CdsCrServiceR4 implements ICdsCrService {
protected final RequestDetails myRequestDetails;
protected final Repository myRepository;
protected final ICdsConfigService myCdsConfigService;
protected Bundle myResponseBundle;
protected CdsServiceResponseJson myServiceResponse;
public CdsCrServiceR4(RequestDetails theRequestDetails, Repository theRepository) {
public CdsCrServiceR4(
RequestDetails theRequestDetails, Repository theRepository, ICdsConfigService theCdsConfigService) {
myCdsConfigService = theCdsConfigService;
myRequestDetails = theRequestDetails;
myRepository = theRepository;
}
@ -109,8 +122,13 @@ public class CdsCrServiceR4 implements ICdsCrService {
endpoint.addHeader(String.format(
"Authorization: %s %s",
tokenType, theJson.getServiceRequestAuthorizationJson().getAccessToken()));
if (theJson.getServiceRequestAuthorizationJson().getSubject() != null) {
endpoint.addHeader(String.format(
"%s: %s",
myCdsConfigService.getCdsCrSettings().getClientIdHeaderName(),
theJson.getServiceRequestAuthorizationJson().getSubject()));
}
}
endpoint.addHeader("Epic-Client-ID: 2cb5af9f-f483-4e2a-aedc-54c3a31cb153");
parameters.addParameter(part(APPLY_PARAMETER_DATA_ENDPOINT, endpoint));
}
return parameters;

View File

@ -22,7 +22,17 @@ package ca.uhn.hapi.fhir.cdshooks.svc.cr;
import ca.uhn.fhir.context.FhirVersionEnum;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.hapi.fhir.cdshooks.api.json.*;
import ca.uhn.hapi.fhir.cdshooks.api.ICdsConfigService;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceIndicatorEnum;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceRequestAuthorizationJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceRequestJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseCardJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseCardSourceJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseLinkJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseSuggestionActionJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseSuggestionJson;
import ca.uhn.hapi.fhir.cdshooks.api.json.CdsServiceResponseSystemActionJson;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.r5.model.Bundle;
import org.hl7.fhir.r5.model.CanonicalType;
@ -61,10 +71,13 @@ import static org.opencds.cqf.fhir.utility.r5.Parameters.part;
public class CdsCrServiceR5 implements ICdsCrService {
protected final RequestDetails myRequestDetails;
protected final Repository myRepository;
protected final ICdsConfigService myCdsConfigService;
protected Bundle myResponseBundle;
protected CdsServiceResponseJson myServiceResponse;
public CdsCrServiceR5(RequestDetails theRequestDetails, Repository theRepository) {
public CdsCrServiceR5(
RequestDetails theRequestDetails, Repository theRepository, ICdsConfigService theCdsConfigService) {
myCdsConfigService = theCdsConfigService;
myRequestDetails = theRequestDetails;
myRepository = theRepository;
}
@ -109,6 +122,12 @@ public class CdsCrServiceR5 implements ICdsCrService {
endpoint.addHeader(String.format(
"Authorization: %s %s",
tokenType, theJson.getServiceRequestAuthorizationJson().getAccessToken()));
if (theJson.getServiceRequestAuthorizationJson().getSubject() != null) {
endpoint.addHeader(String.format(
"%s: %s",
myCdsConfigService.getCdsCrSettings().getClientIdHeaderName(),
theJson.getServiceRequestAuthorizationJson().getSubject()));
}
}
parameters.addParameter(part(APPLY_PARAMETER_DATA_ENDPOINT, endpoint));
}


@@ -0,0 +1,44 @@
+/*-
+ * #%L
+ * HAPI FHIR - CDS Hooks
+ * %%
+ * Copyright (C) 2014 - 2023 Smile CDR, Inc.
+ * %%
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *      http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ * #L%
+ */
+package ca.uhn.hapi.fhir.cdshooks.svc.cr;
+
+public class CdsCrSettings {
+	private final String DEFAULT_CLIENT_ID_HEADER_NAME = "client_id";
+
+	private String myClientIdHeaderName;
+
+	public static CdsCrSettings getDefault() {
+		CdsCrSettings settings = new CdsCrSettings();
+		settings.setClientIdHeaderName(settings.DEFAULT_CLIENT_ID_HEADER_NAME);
+		return settings;
+	}
+
+	public void setClientIdHeaderName(String theName) {
+		myClientIdHeaderName = theName;
+	}
+
+	public String getClientIdHeaderName() {
+		return myClientIdHeaderName;
+	}
+
+	public CdsCrSettings withClientIdHeaderName(String theName) {
+		myClientIdHeaderName = theName;
+		return this;
+	}
+}
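
A short usage sketch of the new settings holder (the custom header value below is just an example): getDefault() seeds the standard "client_id" name, and withClientIdHeaderName() lets a deployment swap in its own header.

import ca.uhn.hapi.fhir.cdshooks.svc.cr.CdsCrSettings;

public class CdsCrSettingsSketch {
	public static void main(String[] args) {
		// Default: the OAuth subject is forwarded as "client_id".
		CdsCrSettings defaults = CdsCrSettings.getDefault();
		System.out.println(defaults.getClientIdHeaderName()); // prints: client_id

		// Fluent override for deployments expecting a different header name.
		CdsCrSettings custom = CdsCrSettings.getDefault().withClientIdHeaderName("X-Client-Id");
		System.out.println(custom.getClientIdHeaderName()); // prints: X-Client-Id
	}
}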


@@ -5,6 +5,7 @@ import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
 import ca.uhn.hapi.fhir.cdshooks.api.ICdsHooksDaoAuthorizationSvc;
 import ca.uhn.hapi.fhir.cdshooks.controller.TestServerAppCtx;
 import ca.uhn.hapi.fhir.cdshooks.svc.CdsHooksContextBooter;
+import ca.uhn.hapi.fhir.cdshooks.svc.cr.CdsCrSettings;
 import org.hl7.fhir.instance.model.api.IBaseResource;
 import org.springframework.context.annotation.Bean;
 import org.springframework.context.annotation.Configuration;

@@ -16,6 +17,9 @@ public class TestCdsHooksConfig {
 		return FhirContext.forR4Cached();
 	}

+	@Bean
+	CdsCrSettings cdsCrSettings() { return CdsCrSettings.getDefault(); }
+
 	@Bean
 	public CdsHooksContextBooter cdsHooksContextBooter() {
 		CdsHooksContextBooter retVal = new CdsHooksContextBooter();


@@ -14,4 +14,6 @@ public abstract class BaseCrTest {
 	@Autowired
 	protected FhirContext myFhirContext;

+	@Autowired
+	protected CdsCrSettings myCdsCrSettings;
 }


@@ -2,13 +2,37 @@ package ca.uhn.hapi.fhir.cdshooks.svc.cr;
 import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
+import ca.uhn.hapi.fhir.cdshooks.api.ICdsConfigService;
+import ca.uhn.hapi.fhir.cdshooks.module.CdsHooksObjectMapperFactory;
+import ca.uhn.hapi.fhir.cdshooks.svc.CdsConfigServiceImpl;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import org.springframework.beans.factory.annotation.Qualifier;
 import org.springframework.context.annotation.Bean;
 import org.springframework.context.annotation.Configuration;

+import static ca.uhn.hapi.fhir.cdshooks.config.CdsHooksConfig.CDS_HOOKS_OBJECT_MAPPER_FACTORY;
+
 @Configuration
 public class TestCrConfig {

 	@Bean
 	FhirContext fhirContext() {
 		return FhirContext.forR4Cached();
 	}

+	@Bean(name = CDS_HOOKS_OBJECT_MAPPER_FACTORY)
+	public ObjectMapper objectMapper(FhirContext theFhirContext) {
+		return new CdsHooksObjectMapperFactory(theFhirContext).newMapper();
+	}
+
+	@Bean
+	CdsCrSettings cdsCrSettings() { return CdsCrSettings.getDefault(); }
+
+	@Bean
+	public ICdsConfigService cdsConfigService(
+			FhirContext theFhirContext,
+			@Qualifier(CDS_HOOKS_OBJECT_MAPPER_FACTORY) ObjectMapper theObjectMapper,
+			CdsCrSettings theCdsCrSettings) {
+		return new CdsConfigServiceImpl(
+				theFhirContext, theObjectMapper, theCdsCrSettings, null, null, null);
+	}
 }
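
Outside the test fixtures, the same override could be supplied by an application @Configuration; the sketch below (class name and header value are assumptions, not part of this change) registers a CdsCrSettings bean with a custom client ID header name, which the CDS-CR services then pick up through ICdsConfigService.

import ca.uhn.hapi.fhir.cdshooks.svc.cr.CdsCrSettings;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MyCdsCrSettingsConfig {
	@Bean
	public CdsCrSettings cdsCrSettings() {
		// Forward the OAuth subject under a custom header instead of the default "client_id".
		return CdsCrSettings.getDefault().withClientIdHeaderName("X-Client-Id");
	}
}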
