Compare commits

...

6 Commits

Author SHA1 Message Date
Tadgh 7a5731d078
Merge 6c4b50e1cd into 061390d76b 2024-11-27 08:24:37 -05:00
James Agnew 061390d76b
Add composite interceptor registry (#6511)
* Composite interceptor improvements

* Add composite interceptor registry

* Add changelog

* Composite Interceptor Broadcaster Improvements

* Fix compile error

* Update hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_8_0/6511-rework-composite-interceptor-broadcaster.yaml

Co-authored-by: Tadgh <garygrantgraham@gmail.com>

* Address review comments

* Test fixes

* Test fix

* Test fix

---------

Co-authored-by: Tadgh <garygrantgraham@gmail.com>
2024-11-27 07:14:48 -05:00
Michael Buckley 3b8569127e
Start removing dependency from FhirVersionEnum to FhirContext (#6512)
Deprecate path from FhirVersionEnum to FhirContext and replace usages.
2024-11-26 13:46:05 -05:00
JasonRoberts-smile 77fa7f7819
adapt template for reuse in CDA (#6500)
* adapt template for reuse in CDA

* use coerced onset date time value

* fix broken test
2024-11-26 09:14:18 -05:00
Tadgh 86c2c13e0f
7.6.0 Mergeback (#6494)
* 6323 resource creation deadlock (#6324)

* make map reads concurrent

* change log

* Custom version number for guaranteed build determinism

* licenses

* Expand translation cache (#6341)

* Expand translation cache

* Add changelog

* Correction to #6341 (#6342)

* Contained bug (#6402)

* Contained bug

* more tests

* changelog, tests, implementation

* code review

* backwards logic

* Fix Questionnaire doc (#6400)

* fix reindex optimizeStorage=ALL_VERSIONS (#6421)

* fixing broken rename of last step of reindex (#6429)

* Fixes for the translation of parameter issues as part of the output for $validate-code operation. (#6438)

* Move the validation providers to the test utilities package such that they can be reused.

* Update IValidationSupport.CodeValidationIssue structure such that it meets the FHIR specification. Update RemoteTerminologyServiceValidationSupport and VersionSpecificWorkerContextWrapper translation for issues.

* New tests for issue translation. Move test class to a different package so that we can add another test class.

* Some simplification in the issue API to simplify building of issues.

* Fix compilation errors.

* Update providers to allow multiple responses and add  support for the fetchCodeSystem call through method find.

* Setup the first test for resource validation with remote terminology providers.

* Fix NullPointerException

* avoid calling validateCode for CodeSystem where system is null

* Keep old public API methods (and class name) in IValidationSupport and mark them as deprecated to avoid breaking dependencies.

* Revert local change to debug for duplicate errors.

* Add more test cases for resource validation. Throw exception to signal missing test setup to make it obvious.

* Simplify test setup.

* Add some more javadoc

* Add javadoc for the new test class

* Add more tests

* Address code review comments in IValidationSupport.

* Add changelog

* Change Repository search interface from Map to Multimap (#6445)

Change Map to Multimap to support multiple and clauses.

* licenses

* fix interceptor hooks from requestDetails not getting called for STORAGE_PRECHECK_FOR_CACHED_SEARCH (#6436)

* fix interceptor hooks from requestDetails not getting called for STORAGE_PRECHECK_FOR_CACHED_SEARCH

* added unit tests and updated changelog

* added one more test case

* Add Adapter api (#6450)

Lightweight implementation of the adapter pattern.

* Bulk Import job status not changed after activation - failing test, fix, changelog (#6452)

* Update CR to 3.13.1 (#6433)

* Automated Migration Testing (HAPI-FHIR) - updated test migration scripts for 7_4_0 (#6439)

* Rel 7 6 CVE (#6446)

* Bump commons io

* Bump HS and Lucene, and hibernate

* Jetty bump for CVE

* Resolve CVES

* Bump with mismatch lucene

* Bump velocity template engine

* Revert bom bump

* wip

* Version bump

* move changelog, fix cve

* Bump commons-lang

* Replace imports

* wip

* fix HQL break

* remove dead code

* Fix changelog entry

* Bump org.hl7.fhir.core to 6.4.0 (#6454)

* Bump HAPI version + org.hl7.fhir.core to 6.4.0

* Apply spotless

* Correct a bug with duplicate parser IDs being assigned.  (#6456)

* Fix changelog entry

* wip

* compilation problem

* Correct tests

* changelog

* Update hapi-fhir-structures-r4/src/test/java/ca/uhn/fhir/parser/JsonParserR4Test.java

Co-authored-by: volodymyr-korzh <132366313+volodymyr-korzh@users.noreply.github.com>

* Correct a bug with duplicate parser IDs being assigned - test fixes

* Correct a bug with duplicate parser IDs being assigned - spotless

* Correct a bug with duplicate parser IDs being assigned - added test-utilities dependency to fhir-structures poms

* Correct a bug with duplicate parser IDs being assigned - test fixes

* Contained resources without assigned IDs are now assigned GUIDs - address comments

---------

Co-authored-by: volodymyr-korzh <132366313+volodymyr-korzh@users.noreply.github.com>
Co-authored-by: volodymyr <volodymyr.korzh@smilecdr.com>

* licenses

* ValueSet expansion fails if Hibernate Search configured to use Lucene - fixed incompatibility between Hibernate Search and Lucene versions (#6468)

* Change the migrator to avoid table locks when adding an index. (#6489)

* version bump

* Updating version to: 7.6.1 post release.

* Bump to 7.7.7.

* deadspace

* Profile reference param (#6501)

* updated QueryStack to not throw error with _profile as a ReferenceParam

* update QueryStack

* remove warnings

* cleanup cast

* cleanup test

* add changelog

* cleanup imports

* updates per review feedback

* updates per review feedback

---------

Co-authored-by: taha.attari@smilecdr.com <taha.attari@smilecdr.com>

* move changelog

---------

Co-authored-by: JasonRoberts-smile <85363818+JasonRoberts-smile@users.noreply.github.com>
Co-authored-by: James Agnew <jamesagnew@gmail.com>
Co-authored-by: Brenin Rhodes <brenin@alphora.com>
Co-authored-by: Emre Dincturk <74370953+mrdnctrk@users.noreply.github.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: Martha Mitran <martha.mitran@smiledigitalhealth.com>
Co-authored-by: Michael Buckley <michaelabuckley@gmail.com>
Co-authored-by: volodymyr-korzh <132366313+volodymyr-korzh@users.noreply.github.com>
Co-authored-by: dotasek <david.otasek@smilecdr.com>
Co-authored-by: volodymyr <volodymyr.korzh@smilecdr.com>
Co-authored-by: markiantorno <markiantorno@gmail.com>
Co-authored-by: Gary Graham <garygraham@smiledigitalhealth.com>
Co-authored-by: Taha <TahaAttari@users.noreply.github.com>
Co-authored-by: taha.attari@smilecdr.com <taha.attari@smilecdr.com>
2024-11-24 12:31:59 -08:00
Thomas Papke 6c4b50e1cd #5768 Upgrade to latest simple-java-mail 2024-09-01 17:15:49 +02:00
197 changed files with 4814 additions and 1716 deletions

View File

@@ -1293,7 +1293,15 @@ public class FhirContext {
 	 * @since 5.1.0
 	 */
 	public static FhirContext forCached(FhirVersionEnum theFhirVersionEnum) {
-		return ourStaticContexts.computeIfAbsent(theFhirVersionEnum, v -> new FhirContext(v));
+		return ourStaticContexts.computeIfAbsent(theFhirVersionEnum, FhirContext::forVersion);
+	}
+
+	/**
+	 * An uncached version of forCached()
+	 * @return a new FhirContext for theFhirVersionEnum
+	 */
+	public static FhirContext forVersion(FhirVersionEnum theFhirVersionEnum) {
+		return new FhirContext(theFhirVersionEnum);
 	}

 	private static Collection<Class<? extends IBaseResource>> toCollection(
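
Editor's note: a minimal usage sketch (not part of this changeset) contrasting the cached and uncached factories shown above; FhirVersionEnum.R4 is only an example version.

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.FhirVersionEnum;

public class FhirContextFactorySketch {
    public static void main(String[] args) {
        // forCached() hands back one shared context per FHIR version
        FhirContext cachedA = FhirContext.forCached(FhirVersionEnum.R4);
        FhirContext cachedB = FhirContext.forCached(FhirVersionEnum.R4);
        System.out.println(cachedA == cachedB); // true: same cached instance

        // forVersion() always builds a fresh context that can be configured independently
        FhirContext fresh = FhirContext.forVersion(FhirVersionEnum.R4);
        System.out.println(fresh == cachedA); // false: new uncached instance
    }
}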

View File

@@ -135,15 +135,19 @@ public enum FhirVersionEnum {
 	/**
 	 * Creates a new FhirContext for this FHIR version
+	 * @deprecated since 7.7. Use {@link FhirContext#forVersion(FhirVersionEnum)} instead
 	 */
+	@Deprecated(forRemoval = true, since = "7.7")
 	public FhirContext newContext() {
-		return new FhirContext(this);
+		return FhirContext.forVersion(this);
 	}

 	/**
 	 * Creates a new FhirContext for this FHIR version, or returns a previously created one if one exists. This
 	 * method uses {@link FhirContext#forCached(FhirVersionEnum)} to return a cached instance.
+	 * @deprecated since 7.7. Use {@link FhirContext#forCached(FhirVersionEnum)} instead
 	 */
+	@Deprecated(forRemoval = true, since = "7.7")
 	public FhirContext newContextCached() {
 		return FhirContext.forCached(this);
 	}
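
Editor's note: a migration sketch (not part of this changeset) for code that still calls the now-deprecated enum methods; the replacements behave the same way.

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.FhirVersionEnum;

public class NewContextMigrationSketch {
    public static void main(String[] args) {
        FhirVersionEnum version = FhirVersionEnum.R4;

        // Deprecated since 7.7:
        FhirContext viaEnum = version.newContext();
        FhirContext viaEnumCached = version.newContextCached();

        // Preferred replacements:
        FhirContext viaFactory = FhirContext.forVersion(version);
        FhirContext viaFactoryCached = FhirContext.forCached(version);
    }
}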

View File

@@ -440,74 +440,259 @@ public interface IValidationSupport {
 		return "Unknown " + getFhirContext().getVersion().getVersion() + " Validation Support";
 	}

+	/**
+	 * Defines codes in system <a href="http://hl7.org/fhir/issue-severity">http://hl7.org/fhir/issue-severity</a>.
+	 */
+	/* this enum would not be needed if we design/refactor to use org.hl7.fhir.r5.terminologies.utilities.ValidationResult */
 	enum IssueSeverity {
 		/**
 		 * The issue caused the action to fail, and no further checking could be performed.
 		 */
-		FATAL,
+		FATAL("fatal"),
 		/**
 		 * The issue is sufficiently important to cause the action to fail.
 		 */
-		ERROR,
+		ERROR("error"),
 		/**
 		 * The issue is not important enough to cause the action to fail, but may cause it to be performed suboptimally or in a way that is not as desired.
 		 */
-		WARNING,
+		WARNING("warning"),
 		/**
 		 * The issue has no relation to the degree of success of the action.
 		 */
-		INFORMATION
+		INFORMATION("information"),
+		/**
+		 * The operation was successful.
+		 */
+		SUCCESS("success");
+
+		// the spec for OperationOutcome mentions that a code from http://hl7.org/fhir/issue-severity is required
+		private final String myCode;
+
+		IssueSeverity(String theCode) {
+			myCode = theCode;
+		}
+
+		/**
+		 * Provide mapping to a code in system <a href="http://hl7.org/fhir/issue-severity">http://hl7.org/fhir/issue-severity</a>.
+		 * @return the code
+		 */
+		public String getCode() {
+			return myCode;
+		}
+
+		/**
+		 * Creates a {@link IssueSeverity} object from the given code.
+		 * @return the {@link IssueSeverity}
+		 */
+		public static IssueSeverity fromCode(String theCode) {
+			switch (theCode) {
+				case "fatal":
+					return FATAL;
+				case "error":
+					return ERROR;
+				case "warning":
+					return WARNING;
+				case "information":
+					return INFORMATION;
+				case "success":
+					return SUCCESS;
+				default:
+					return null;
+			}
+		}
 	}

-	enum CodeValidationIssueCode {
-		NOT_FOUND,
-		CODE_INVALID,
-		INVALID,
-		OTHER
-	}
-
-	enum CodeValidationIssueCoding {
-		VS_INVALID,
-		NOT_FOUND,
-		NOT_IN_VS,
-		INVALID_CODE,
-		INVALID_DISPLAY,
-		OTHER
-	}
-
-	class CodeValidationIssue {
-		private final String myMessage;
-		private final IssueSeverity mySeverity;
-		private final CodeValidationIssueCode myCode;
-		private final CodeValidationIssueCoding myCoding;
-
-		public CodeValidationIssue(
-				String theMessage,
-				IssueSeverity mySeverity,
-				CodeValidationIssueCode theCode,
-				CodeValidationIssueCoding theCoding) {
-			this.myMessage = theMessage;
-			this.mySeverity = mySeverity;
-			this.myCode = theCode;
-			this.myCoding = theCoding;
-		}
+	/**
+	 * Defines codes in system <a href="http://hl7.org/fhir/issue-type">http://hl7.org/fhir/issue-type</a>.
+	 * The binding is enforced as a part of validation logic in the FHIR Core Validation library where an exception is thrown.
+	 * Only a sub-set of these codes are defined as constants because they relate to validation,
+	 * If there are additional ones that come up, for Remote Terminology they are currently supported via
+	 * {@link IValidationSupport.CodeValidationIssue#CodeValidationIssue(String, IssueSeverity, String)}
+	 * while for internal validators, more constants can be added to make things easier and consistent.
+	 * This maps to resource OperationOutcome.issue.code.
+	 */
+	/* this enum would not be needed if we design/refactor to use org.hl7.fhir.r5.terminologies.utilities.ValidationResult */
+	class CodeValidationIssueCode {
+		public static final CodeValidationIssueCode NOT_FOUND = new CodeValidationIssueCode("not-found");
+		public static final CodeValidationIssueCode CODE_INVALID = new CodeValidationIssueCode("code-invalid");
+		public static final CodeValidationIssueCode INVALID = new CodeValidationIssueCode("invalid");
+
+		private final String myCode;
+
+		// this is intentionally not exposed
+		CodeValidationIssueCode(String theCode) {
+			myCode = theCode;
+		}
+
+		/**
+		 * Retrieve the corresponding code from system <a href="http://hl7.org/fhir/issue-type">http://hl7.org/fhir/issue-type</a>.
+		 * @return the code
+		 */
+		public String getCode() {
+			return myCode;
+		}
+	}
+
+	/**
+	 * Holds information about the details of a {@link CodeValidationIssue}.
+	 * This maps to resource OperationOutcome.issue.details.
+	 */
+	/* this enum would not be needed if we design/refactor to use org.hl7.fhir.r5.terminologies.utilities.ValidationResult */
+	class CodeValidationIssueDetails {
+		private final String myText;
+		private List<CodeValidationIssueCoding> myCodings;
+
+		public CodeValidationIssueDetails(String theText) {
+			myText = theText;
+		}
+
+		// intentionally not exposed
+		void addCoding(CodeValidationIssueCoding theCoding) {
+			getCodings().add(theCoding);
+		}
+
+		public CodeValidationIssueDetails addCoding(String theSystem, String theCode) {
+			if (myCodings == null) {
+				myCodings = new ArrayList<>();
+			}
+			myCodings.add(new CodeValidationIssueCoding(theSystem, theCode));
+			return this;
+		}
+
+		public String getText() {
+			return myText;
+		}
+
+		public List<CodeValidationIssueCoding> getCodings() {
+			if (myCodings == null) {
+				myCodings = new ArrayList<>();
+			}
+			return myCodings;
+		}
+	}
+
+	/**
+	 * Defines codes that can be part of the details of an issue.
+	 * There are some constants available (pre-defined) for codes for system <a href="http://hl7.org/fhir/tools/CodeSystem/tx-issue-type">http://hl7.org/fhir/tools/CodeSystem/tx-issue-type</a>.
+	 * This maps to resource OperationOutcome.issue.details.coding[0].code.
+	 */
+	class CodeValidationIssueCoding {
+		public static String TX_ISSUE_SYSTEM = "http://hl7.org/fhir/tools/CodeSystem/tx-issue-type";
+		public static CodeValidationIssueCoding VS_INVALID =
+				new CodeValidationIssueCoding(TX_ISSUE_SYSTEM, "vs-invalid");
+		public static final CodeValidationIssueCoding NOT_FOUND =
+				new CodeValidationIssueCoding(TX_ISSUE_SYSTEM, "not-found");
+		public static final CodeValidationIssueCoding NOT_IN_VS =
+				new CodeValidationIssueCoding(TX_ISSUE_SYSTEM, "not-in-vs");
+		public static final CodeValidationIssueCoding INVALID_CODE =
+				new CodeValidationIssueCoding(TX_ISSUE_SYSTEM, "invalid-code");
+		public static final CodeValidationIssueCoding INVALID_DISPLAY =
+				new CodeValidationIssueCoding(TX_ISSUE_SYSTEM, "vs-display");
+
+		private final String mySystem, myCode;
+
+		// this is intentionally not exposed
+		CodeValidationIssueCoding(String theSystem, String theCode) {
+			mySystem = theSystem;
+			myCode = theCode;
+		}
+
+		/**
+		 * Retrieve the corresponding code for the details of a validation issue.
+		 * @return the code
+		 */
+		public String getCode() {
+			return myCode;
+		}
+
+		/**
+		 * Retrieve the system for the details of a validation issue.
+		 * @return the system
+		 */
+		public String getSystem() {
+			return mySystem;
+		}
+	}
+
+	/**
+	 * This is a hapi-fhir internal version agnostic object holding information about a validation issue.
+	 * An alternative (which requires significant refactoring) would be to use org.hl7.fhir.r5.terminologies.utilities.ValidationResult instead.
+	 */
+	class CodeValidationIssue {
+		private final String myDiagnostics;
+		private final IssueSeverity mySeverity;
+		private final CodeValidationIssueCode myCode;
+		private CodeValidationIssueDetails myDetails;
+
+		public CodeValidationIssue(
+				String theDiagnostics, IssueSeverity theSeverity, CodeValidationIssueCode theTypeCode) {
+			this(theDiagnostics, theSeverity, theTypeCode, null);
+		}
+
+		public CodeValidationIssue(String theDiagnostics, IssueSeverity theSeverity, String theTypeCode) {
+			this(theDiagnostics, theSeverity, new CodeValidationIssueCode(theTypeCode), null);
+		}
+
+		public CodeValidationIssue(
+				String theDiagnostics,
+				IssueSeverity theSeverity,
+				CodeValidationIssueCode theType,
+				CodeValidationIssueCoding theDetailsCoding) {
+			myDiagnostics = theDiagnostics;
+			mySeverity = theSeverity;
+			myCode = theType;
+			// reuse the diagnostics message as a detail text message
+			myDetails = new CodeValidationIssueDetails(theDiagnostics);
+			myDetails.addCoding(theDetailsCoding);
+		}

+		/**
+		 * @deprecated Please use {@link #getDiagnostics()} instead.
+		 */
+		@Deprecated(since = "7.4.6")
 		public String getMessage() {
-			return myMessage;
+			return getDiagnostics();
+		}
+
+		public String getDiagnostics() {
+			return myDiagnostics;
 		}

 		public IssueSeverity getSeverity() {
 			return mySeverity;
 		}

+		/**
+		 * @deprecated Please use {@link #getType()} instead.
+		 */
+		@Deprecated(since = "7.4.6")
 		public CodeValidationIssueCode getCode() {
+			return getType();
+		}
+
+		public CodeValidationIssueCode getType() {
 			return myCode;
 		}

+		/**
+		 * @deprecated Please use {@link #getDetails()} instead. That has support for multiple codings.
+		 */
+		@Deprecated(since = "7.4.6")
 		public CodeValidationIssueCoding getCoding() {
-			return myCoding;
+			return myDetails != null
+					? myDetails.getCodings().stream().findFirst().orElse(null)
+					: null;
+		}
+
+		public void setDetails(CodeValidationIssueDetails theDetails) {
+			this.myDetails = theDetails;
+		}
+
+		public CodeValidationIssueDetails getDetails() {
+			return myDetails;
+		}
+
+		public boolean hasIssueDetailCode(@Nonnull String theCode) {
+			// this method is system agnostic at the moment but it can be restricted if needed
+			return myDetails.getCodings().stream().anyMatch(coding -> theCode.equals(coding.getCode()));
 		}
 	}
@@ -671,6 +856,10 @@ public interface IValidationSupport {
 		}
 	}

+	/**
+	 * This is a hapi-fhir internal version agnostic object holding information about the validation result.
+	 * An alternative (which requires significant refactoring) would be to use org.hl7.fhir.r5.terminologies.utilities.ValidationResult.
+	 */
 	class CodeValidationResult {
 		public static final String SOURCE_DETAILS = "sourceDetails";
 		public static final String RESULT = "result";
@@ -686,7 +875,7 @@ public interface IValidationSupport {
 		private String myDisplay;
 		private String mySourceDetails;

-		private List<CodeValidationIssue> myCodeValidationIssues;
+		private List<CodeValidationIssue> myIssues;

 		public CodeValidationResult() {
 			super();
@@ -771,20 +960,45 @@ public interface IValidationSupport {
 			return this;
 		}

+		/**
+		 * @deprecated Please use method {@link #getIssues()} instead.
+		 */
+		@Deprecated(since = "7.4.6")
 		public List<CodeValidationIssue> getCodeValidationIssues() {
-			if (myCodeValidationIssues == null) {
-				myCodeValidationIssues = new ArrayList<>();
-			}
-			return myCodeValidationIssues;
+			return getIssues();
 		}

+		/**
+		 * @deprecated Please use method {@link #setIssues(List)} instead.
+		 */
+		@Deprecated(since = "7.4.6")
 		public CodeValidationResult setCodeValidationIssues(List<CodeValidationIssue> theCodeValidationIssues) {
-			myCodeValidationIssues = new ArrayList<>(theCodeValidationIssues);
+			return setIssues(theCodeValidationIssues);
+		}
+
+		/**
+		 * @deprecated Please use method {@link #addIssue(CodeValidationIssue)} instead.
+		 */
+		@Deprecated(since = "7.4.6")
+		public CodeValidationResult addCodeValidationIssue(CodeValidationIssue theCodeValidationIssue) {
+			getCodeValidationIssues().add(theCodeValidationIssue);
 			return this;
 		}

-		public CodeValidationResult addCodeValidationIssue(CodeValidationIssue theCodeValidationIssue) {
-			getCodeValidationIssues().add(theCodeValidationIssue);
+		public List<CodeValidationIssue> getIssues() {
+			if (myIssues == null) {
+				myIssues = new ArrayList<>();
+			}
+			return myIssues;
+		}
+
+		public CodeValidationResult setIssues(List<CodeValidationIssue> theIssues) {
+			myIssues = new ArrayList<>(theIssues);
+			return this;
+		}
+
+		public CodeValidationResult addIssue(CodeValidationIssue theCodeValidationIssue) {
+			getIssues().add(theCodeValidationIssue);
 			return this;
 		}
@@ -811,17 +1025,19 @@ public interface IValidationSupport {
 		public String getSeverityCode() {
 			String retVal = null;
 			if (getSeverity() != null) {
-				retVal = getSeverity().name().toLowerCase();
+				retVal = getSeverity().getCode();
 			}
 			return retVal;
 		}

 		/**
-		 * Sets an issue severity as a string code. Value must be the name of
-		 * one of the enum values in {@link IssueSeverity}. Value is case-insensitive.
+		 * Sets an issue severity using a severity code. Please use method {@link #setSeverity(IssueSeverity)} instead.
+		 * @param theSeverityCode the code
+		 * @return the current {@link CodeValidationResult} instance
 		 */
-		public CodeValidationResult setSeverityCode(@Nonnull String theIssueSeverity) {
-			setSeverity(IssueSeverity.valueOf(theIssueSeverity.toUpperCase()));
+		@Deprecated(since = "7.4.6")
+		public CodeValidationResult setSeverityCode(@Nonnull String theSeverityCode) {
+			setSeverity(IssueSeverity.fromCode(theSeverityCode));
 			return this;
 		}
@@ -838,6 +1054,11 @@ public interface IValidationSupport {
 			if (isNotBlank(getSourceDetails())) {
 				ParametersUtil.addParameterToParametersString(theContext, retVal, SOURCE_DETAILS, getSourceDetails());
 			}
+			/*
+			should translate issues as well, except that is version specific code, so it requires more refactoring
+			or replace the current class with org.hl7.fhir.r5.terminologies.utilities.ValidationResult
+			@see VersionSpecificWorkerContextWrapper#getIssuesForCodeValidation
+			*/
 			return retVal;
 		}
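
Editor's note: a sketch (not from this diff) of how a terminology validator might populate the reworked issue API; it assumes the nested types are used via ca.uhn.fhir.context.support.IValidationSupport, and the example code/system values are hypothetical.

import ca.uhn.fhir.context.support.IValidationSupport.CodeValidationIssue;
import ca.uhn.fhir.context.support.IValidationSupport.CodeValidationIssueCode;
import ca.uhn.fhir.context.support.IValidationSupport.CodeValidationIssueCoding;
import ca.uhn.fhir.context.support.IValidationSupport.CodeValidationResult;
import ca.uhn.fhir.context.support.IValidationSupport.IssueSeverity;

public class CodeValidationIssueSketch {
    public static void main(String[] args) {
        // One structured issue: severity + issue-type code + tx-issue-type detail coding
        CodeValidationIssue issue = new CodeValidationIssue(
                "Unknown code 'http://loinc.org#1234-5'",
                IssueSeverity.ERROR,
                CodeValidationIssueCode.CODE_INVALID,
                CodeValidationIssueCoding.INVALID_CODE);

        CodeValidationResult result = new CodeValidationResult()
                .setSeverity(IssueSeverity.ERROR)
                .setMessage("Unknown code 'http://loinc.org#1234-5'")
                .addIssue(issue);

        // Callers can now inspect structured detail codes instead of parsing message text
        boolean invalidCode = result.getIssues().stream()
                .anyMatch(i -> i.hasIssueDetailCode("invalid-code"));
        System.out.println(invalidCode); // true
    }
}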

View File

@@ -46,7 +46,11 @@ public @interface Hook {
 	 * and allowable values can be positive or negative or 0.
 	 * <p>
 	 * If no order is specified, or the order is set to <code>0</code> (the default order),
-	 * the order specified at the interceptor type level will take precedence.
+	 * the order specified at the {@link Interceptor#order() interceptor type level} will be used.
+	 * </p>
+	 * <p>
+	 * Note that if two hook methods have the same order, then the order of execution is undefined. If
+	 * order is important, then an order must always be explicitly stated.
 	 * </p>
 	 */
 	int order() default Interceptor.DEFAULT_ORDER;
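
Editor's note: a sketch (not from this diff) of the ordering rules described in the javadoc above; the pointcut is only an example. The type-level @Interceptor order is the default, and an explicit @Hook order overrides it for that one method.

import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Interceptor;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.rest.api.server.RequestDetails;

@Interceptor(order = 100)
public class OrderedHooksSketch {

    // No method-level order: inherits order 100 from the @Interceptor annotation
    @Hook(Pointcut.SERVER_INCOMING_REQUEST_POST_PROCESSED)
    public void logRequest(RequestDetails theRequestDetails) {
        // ...
    }

    // Explicit order: runs before hooks with a larger order value at the same pointcut
    @Hook(value = Pointcut.SERVER_INCOMING_REQUEST_POST_PROCESSED, order = -10)
    public void earlyCheck(RequestDetails theRequestDetails) {
        // ...
    }
}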

View File

@@ -19,6 +19,7 @@
  */
 package ca.uhn.fhir.interceptor.api;

+import java.util.List;
 import java.util.function.Supplier;

 public interface IBaseInterceptorBroadcaster<POINTCUT extends IPointcut> {
@@ -73,4 +74,15 @@ public interface IBaseInterceptorBroadcaster<POINTCUT extends IPointcut> {
 	 * @since 4.0.0
 	 */
 	boolean hasHooks(POINTCUT thePointcut);
+
+	List<IInvoker> getInvokersForPointcut(POINTCUT thePointcut);
+
+	interface IInvoker extends Comparable<IInvoker> {
+
+		Object invoke(HookParams theParams);
+
+		int getOrder();
+
+		Object getInterceptor();
+	}
 }
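
Editor's note: a sketch (hypothetical composite broadcaster, not part of this changeset) of what the new getInvokersForPointcut() contract enables: merging invokers from several registries and firing them as one, using the static helpers added to BaseInterceptorService further down in this compare. The registry names are illustrative.

import ca.uhn.fhir.interceptor.api.HookParams;
import ca.uhn.fhir.interceptor.api.IBaseInterceptorBroadcaster.IInvoker;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.interceptor.executor.BaseInterceptorService;
import ca.uhn.fhir.interceptor.executor.InterceptorService;
import java.util.Arrays;
import java.util.List;

public class CompositeBroadcasterSketch {

    public static Object fireAcross(
            InterceptorService theGlobalRegistry,
            InterceptorService theRequestRegistry,
            Pointcut thePointcut,
            HookParams theParams) {

        // union() expects the "global" list first and returns a new, order-sorted list
        List<IInvoker> invokers = BaseInterceptorService.union(Arrays.asList(
                theGlobalRegistry.getInvokersForPointcut(thePointcut),
                theRequestRegistry.getInvokersForPointcut(thePointcut)));

        // callInvokers() applies the usual boolean/object return semantics across the merged list
        return BaseInterceptorService.callInvokers(thePointcut, theParams, invokers);
    }
}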

View File

@@ -27,6 +27,8 @@ public interface IPointcut {
 	@Nonnull
 	Class<?> getReturnType();

+	Class<?> getBooleanReturnTypeForEnum();
+
 	@Nonnull
 	List<String> getParameterTypes();

View File

@@ -26,6 +26,7 @@ import ca.uhn.fhir.rest.server.exceptions.AuthenticationException;
 import ca.uhn.fhir.rest.server.exceptions.BaseServerResponseException;
 import ca.uhn.fhir.validation.ValidationResult;
 import jakarta.annotation.Nonnull;
+import org.apache.commons.lang3.Validate;
 import org.hl7.fhir.instance.model.api.IBaseConformance;

 import java.io.Writer;
@@ -3107,6 +3108,10 @@ public enum Pointcut implements IPointcut {
 			@Nonnull Class<?> theReturnType,
 			@Nonnull ExceptionHandlingSpec theExceptionHandlingSpec,
 			String... theParameterTypes) {
+
+		// This enum uses the lowercase-b boolean type to indicate boolean return pointcuts
+		Validate.isTrue(!theReturnType.equals(Boolean.class), "Return type Boolean not allowed here, must be boolean");
+
 		myReturnType = theReturnType;
 		myExceptionHandlingSpec = theExceptionHandlingSpec;
 		myParameterTypes = Collections.unmodifiableList(Arrays.asList(theParameterTypes));
@@ -3132,6 +3137,11 @@ public enum Pointcut implements IPointcut {
 		return myReturnType;
 	}

+	@Override
+	public Class<?> getBooleanReturnTypeForEnum() {
+		return boolean.class;
+	}
+
 	@Override
 	@Nonnull
 	public List<String> getParameterTypes() {

View File

@@ -20,6 +20,7 @@
 package ca.uhn.fhir.interceptor.executor;

 import ca.uhn.fhir.i18n.Msg;
+import ca.uhn.fhir.interceptor.api.Hook;
 import ca.uhn.fhir.interceptor.api.HookParams;
 import ca.uhn.fhir.interceptor.api.IBaseInterceptorBroadcaster;
 import ca.uhn.fhir.interceptor.api.IBaseInterceptorService;
@@ -57,12 +58,13 @@ import java.util.HashMap;
 import java.util.IdentityHashMap;
 import java.util.List;
 import java.util.Map;
-import java.util.Objects;
 import java.util.Optional;
 import java.util.concurrent.atomic.AtomicInteger;
 import java.util.function.Predicate;
 import java.util.stream.Collectors;

+import static org.apache.commons.lang3.ObjectUtils.defaultIfNull;
+
 public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 		implements IBaseInterceptorService<POINTCUT>, IBaseInterceptorBroadcaster<POINTCUT> {
 	private static final Logger ourLog = LoggerFactory.getLogger(BaseInterceptorService.class);
@@ -74,12 +76,11 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 			AttributeKey.stringKey("hapifhir.interceptor.method_name");

 	private final List<Object> myInterceptors = new ArrayList<>();
-	private final ListMultimap<POINTCUT, BaseInvoker> myGlobalInvokers = ArrayListMultimap.create();
-	private final ListMultimap<POINTCUT, BaseInvoker> myAnonymousInvokers = ArrayListMultimap.create();
+	private final ListMultimap<POINTCUT, IInvoker> myGlobalInvokers = ArrayListMultimap.create();
+	private final ListMultimap<POINTCUT, IInvoker> myAnonymousInvokers = ArrayListMultimap.create();
 	private final Object myRegistryMutex = new Object();
 	private final Class<POINTCUT> myPointcutType;
 	private volatile EnumSet<POINTCUT> myRegisteredPointcuts;
-	private String myName;
 	private boolean myWarnOnInterceptorWithNoHooks = true;

 	/**
@@ -93,10 +94,11 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 	 * Constructor
 	 *
 	 * @param theName The name for this registry (useful for troubleshooting)
+	 * @deprecated The name parameter is not used for anything
 	 */
+	@Deprecated(since = "8.0.0", forRemoval = true)
 	public BaseInterceptorService(Class<POINTCUT> thePointcutType, String theName) {
 		super();
-		myName = theName;
 		myPointcutType = thePointcutType;
 		rebuildRegisteredPointcutSet();
 	}
@@ -113,13 +115,17 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 		return myInterceptors;
 	}

-	public void setName(String theName) {
-		myName = theName;
+	/**
+	 * @deprecated This value is not used anywhere
+	 */
+	@Deprecated(since = "8.0.0", forRemoval = true)
+	public void setName(@SuppressWarnings("unused") String theName) {
+		// nothing
 	}

 	protected void registerAnonymousInterceptor(POINTCUT thePointcut, Object theInterceptor, BaseInvoker theInvoker) {
-		Validate.notNull(thePointcut);
-		Validate.notNull(theInterceptor);
+		Validate.notNull(thePointcut, "thePointcut must not be null");
+		Validate.notNull(theInterceptor, "theInterceptor must not be null");
 		synchronized (myRegistryMutex) {
 			myAnonymousInvokers.put(thePointcut, theInvoker);
 			if (!isInterceptorAlreadyRegistered(theInterceptor)) {
@@ -179,9 +185,9 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 	}

 	private void unregisterInterceptorsIf(
-			Predicate<Object> theShouldUnregisterFunction, ListMultimap<POINTCUT, BaseInvoker> theGlobalInvokers) {
+			Predicate<Object> theShouldUnregisterFunction, ListMultimap<POINTCUT, IInvoker> theGlobalInvokers) {
 		synchronized (myRegistryMutex) {
-			for (Map.Entry<POINTCUT, BaseInvoker> nextInvoker : new ArrayList<>(theGlobalInvokers.entries())) {
+			for (Map.Entry<POINTCUT, IInvoker> nextInvoker : new ArrayList<>(theGlobalInvokers.entries())) {
 				if (theShouldUnregisterFunction.test(nextInvoker.getValue().getInterceptor())) {
 					unregisterInterceptor(nextInvoker.getValue().getInterceptor());
 				}
@@ -265,7 +271,7 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 		assert haveAppropriateParams(thePointcut, theParams);
 		assert thePointcut.getReturnType() != void.class;

-		return doCallHooks(thePointcut, theParams, null);
+		return doCallHooks(thePointcut, theParams);
 	}

 	@Override
@@ -282,116 +288,47 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 		assert haveAppropriateParams(thePointcut, theParams);
 		assert thePointcut.getReturnType() == void.class || thePointcut.getReturnType() == getBooleanReturnType();

-		Object retValObj = doCallHooks(thePointcut, theParams, true);
+		Object retValObj = doCallHooks(thePointcut, theParams);
+		retValObj = defaultIfNull(retValObj, true);
 		return (Boolean) retValObj;
 	}

-	private Object doCallHooks(POINTCUT thePointcut, HookParams theParams, Object theRetVal) {
-		// use new list for loop to avoid ConcurrentModificationException in case invoker gets added while looping
-		List<BaseInvoker> invokers = new ArrayList<>(getInvokersForPointcut(thePointcut));
-
-		/*
-		 * Call each hook in order
-		 */
-		for (BaseInvoker nextInvoker : invokers) {
-			Object nextOutcome = nextInvoker.invoke(theParams);
-			Class<?> pointcutReturnType = thePointcut.getReturnType();
-			if (pointcutReturnType.equals(getBooleanReturnType())) {
-				Boolean nextOutcomeAsBoolean = (Boolean) nextOutcome;
-				if (Boolean.FALSE.equals(nextOutcomeAsBoolean)) {
-					ourLog.trace("callHooks({}) for invoker({}) returned false", thePointcut, nextInvoker);
-					theRetVal = false;
-					break;
-				} else {
-					theRetVal = true;
-				}
-			} else if (!pointcutReturnType.equals(void.class)) {
-				if (nextOutcome != null) {
-					theRetVal = nextOutcome;
-					break;
-				}
-			}
-		}
-
-		return theRetVal;
+	private Object doCallHooks(POINTCUT thePointcut, HookParams theParams) {
+		List<IInvoker> invokers = getInvokersForPointcut(thePointcut);
+		return callInvokers(thePointcut, theParams, invokers);
 	}

 	@VisibleForTesting
 	List<Object> getInterceptorsWithInvokersForPointcut(POINTCUT thePointcut) {
 		return getInvokersForPointcut(thePointcut).stream()
-				.map(BaseInvoker::getInterceptor)
+				.map(IInvoker::getInterceptor)
 				.collect(Collectors.toList());
 	}

 	/**
-	 * Returns an ordered list of invokers for the given pointcut. Note that
-	 * a new and stable list is returned to.. do whatever you want with it.
+	 * Returns a list of all invokers registered for the given pointcut. The list
+	 * is ordered by the invoker order (specified on the {@link Interceptor#order()}
+	 * and {@link Hook#order()} values.
+	 *
+	 * @return The list returned by this method will always be a newly created list, so it will be stable and can be modified.
 	 */
-	private List<BaseInvoker> getInvokersForPointcut(POINTCUT thePointcut) {
-		List<BaseInvoker> invokers;
+	@Override
+	public List<IInvoker> getInvokersForPointcut(POINTCUT thePointcut) {
+		List<IInvoker> invokers;
 		synchronized (myRegistryMutex) {
-			List<BaseInvoker> globalInvokers = myGlobalInvokers.get(thePointcut);
-			List<BaseInvoker> anonymousInvokers = myAnonymousInvokers.get(thePointcut);
-			List<BaseInvoker> threadLocalInvokers = null;
-			invokers = union(globalInvokers, anonymousInvokers, threadLocalInvokers);
+			List<IInvoker> globalInvokers = myGlobalInvokers.get(thePointcut);
+			List<IInvoker> anonymousInvokers = myAnonymousInvokers.get(thePointcut);
+			invokers = union(Arrays.asList(globalInvokers, anonymousInvokers));
 		}
 		return invokers;
 	}

-	/**
-	 * First argument must be the global invoker list!!
-	 */
-	@SafeVarargs
-	private List<BaseInvoker> union(List<BaseInvoker>... theInvokersLists) {
-		List<BaseInvoker> haveOne = null;
-		boolean haveMultiple = false;
-		for (List<BaseInvoker> nextInvokerList : theInvokersLists) {
-			if (nextInvokerList == null || nextInvokerList.isEmpty()) {
-				continue;
-			}
-			if (haveOne == null) {
-				haveOne = nextInvokerList;
-			} else {
-				haveMultiple = true;
-			}
-		}
-		if (haveOne == null) {
-			return Collections.emptyList();
-		}
-		List<BaseInvoker> retVal;
-		if (!haveMultiple) {
-			// The global list doesn't need to be sorted every time since it's sorted on
-			// insertion each time. Doing so is a waste of cycles..
-			if (haveOne == theInvokersLists[0]) {
-				retVal = haveOne;
-			} else {
-				retVal = new ArrayList<>(haveOne);
-				retVal.sort(Comparator.naturalOrder());
-			}
-		} else {
-			retVal = Arrays.stream(theInvokersLists)
-					.filter(Objects::nonNull)
-					.flatMap(Collection::stream)
-					.sorted()
-					.collect(Collectors.toList());
-		}
-		return retVal;
-	}
-
 	/**
 	 * Only call this when assertions are enabled, it's expensive
 	 */
-	final boolean haveAppropriateParams(POINTCUT thePointcut, HookParams theParams) {
+	public static boolean haveAppropriateParams(IPointcut thePointcut, HookParams theParams) {
 		if (theParams.getParamsForType().values().size()
 				!= thePointcut.getParameterTypes().size()) {
 			throw new IllegalArgumentException(Msg.code(1909)
@@ -430,7 +367,7 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 	}

 	private List<HookInvoker> scanInterceptorAndAddToInvokerMultimap(
-			Object theInterceptor, ListMultimap<POINTCUT, BaseInvoker> theInvokers) {
+			Object theInterceptor, ListMultimap<POINTCUT, IInvoker> theInvokers) {
 		Class<?> interceptorClass = theInterceptor.getClass();
 		int typeOrder = determineOrder(interceptorClass);
@@ -452,7 +389,7 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 		// Make sure we're always sorted according to the order declared in @Order
 		for (POINTCUT nextPointcut : theInvokers.keys()) {
-			List<BaseInvoker> nextInvokerList = theInvokers.get(nextPointcut);
+			List<IInvoker> nextInvokerList = theInvokers.get(nextPointcut);
 			nextInvokerList.sort(Comparator.naturalOrder());
 		}
@@ -483,6 +420,108 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 	protected abstract Optional<HookDescriptor> scanForHook(Method nextMethod);

+	public static Object callInvokers(IPointcut thePointcut, HookParams theParams, List<IInvoker> invokers) {
+		Object retVal = null;
+		/*
+		 * Call each hook in order
+		 */
+		for (IInvoker nextInvoker : invokers) {
+			Object nextOutcome = nextInvoker.invoke(theParams);
+			Class<?> pointcutReturnType = thePointcut.getReturnType();
+			if (pointcutReturnType.equals(thePointcut.getBooleanReturnTypeForEnum())) {
+				Boolean nextOutcomeAsBoolean = (Boolean) nextOutcome;
+				if (Boolean.FALSE.equals(nextOutcomeAsBoolean)) {
+					ourLog.trace("callHooks({}) for invoker({}) returned false", thePointcut, nextInvoker);
+					retVal = false;
+					break;
+				} else {
+					retVal = true;
+				}
+			} else if (!pointcutReturnType.equals(void.class)) {
+				if (nextOutcome != null) {
+					retVal = nextOutcome;
+					break;
+				}
+			}
+		}
+		return retVal;
+	}
+
+	/**
+	 * First argument must be the global invoker list!!
+	 */
+	public static List<IInvoker> union(List<List<IInvoker>> theInvokersLists) {
+		List<IInvoker> haveOne = null;
+		boolean haveMultiple = false;
+		for (List<IInvoker> nextInvokerList : theInvokersLists) {
+			if (nextInvokerList == null || nextInvokerList.isEmpty()) {
+				continue;
+			}
+			if (haveOne == null) {
+				haveOne = nextInvokerList;
+			} else {
+				haveMultiple = true;
+			}
+		}
+		if (haveOne == null) {
+			return Collections.emptyList();
+		}
+		List<IInvoker> retVal;
+		if (!haveMultiple) {
+			// The global list doesn't need to be sorted every time since it's sorted on
+			// insertion each time. Doing so is a waste of cycles..
+			if (haveOne == theInvokersLists.get(0)) {
+				retVal = haveOne;
+			} else {
+				retVal = new ArrayList<>(haveOne);
+				retVal.sort(Comparator.naturalOrder());
+			}
+		} else {
+			int totalSize = 0;
+			for (List<IInvoker> list : theInvokersLists) {
+				totalSize += list.size();
+			}
+			retVal = new ArrayList<>(totalSize);
+			for (List<IInvoker> list : theInvokersLists) {
+				retVal.addAll(list);
+			}
+			retVal.sort(Comparator.naturalOrder());
+		}
+		return retVal;
+	}
+
+	protected static <T extends Annotation> Optional<T> findAnnotation(
+			AnnotatedElement theObject, Class<T> theHookClass) {
+		T annotation;
+		if (theObject instanceof Method) {
+			annotation = MethodUtils.getAnnotation((Method) theObject, theHookClass, true, true);
+		} else {
+			annotation = theObject.getAnnotation(theHookClass);
+		}
+		return Optional.ofNullable(annotation);
+	}
+
+	private static int determineOrder(Class<?> theInterceptorClass) {
+		return findAnnotation(theInterceptorClass, Interceptor.class)
+				.map(Interceptor::order)
+				.orElse(Interceptor.DEFAULT_ORDER);
+	}
+
+	private static String toErrorString(List<String> theParameterTypes) {
+		return theParameterTypes.stream().sorted().collect(Collectors.joining(","));
+	}
+
 	private class HookInvoker extends BaseInvoker {

 		private final Method myMethod;
@@ -501,10 +540,11 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 			myMethod = theHookMethod;
 			Class<?> returnType = theHookMethod.getReturnType();
-			if (myPointcut.getReturnType().equals(getBooleanReturnType())) {
+			if (myPointcut.getReturnType().equals(myPointcut.getBooleanReturnTypeForEnum())) {
 				Validate.isTrue(
-						getBooleanReturnType().equals(returnType) || void.class.equals(returnType),
-						"Method does not return boolean or void: %s",
+						myPointcut.getBooleanReturnTypeForEnum().equals(returnType) || void.class.equals(returnType),
+						"Method does not return %s or void: %s",
+						myPointcut.getBooleanReturnTypeForEnum().getSimpleName(),
 						theHookMethod);
 			} else if (myPointcut.getReturnType().equals(void.class)) {
 				Validate.isTrue(void.class.equals(returnType), "Method does not return void: %s", theHookMethod);
@@ -541,7 +581,7 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 		 * @return Returns true/false if the hook method returns a boolean, returns true otherwise
 		 */
 		@Override
-		Object invoke(HookParams theParams) {
+		public Object invoke(HookParams theParams) {
 			Object[] args = new Object[myParameterTypes.length];
 			for (int i = 0; i < myParameterTypes.length; i++) {
@@ -610,7 +650,7 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 		}
 	}

-	protected abstract static class BaseInvoker implements Comparable<BaseInvoker> {
+	public abstract static class BaseInvoker implements IInvoker {

 		private final int myOrder;
 		private final Object myInterceptor;
@@ -620,36 +660,19 @@ public abstract class BaseInterceptorService<POINTCUT extends Enum<POINTCUT> & IPointcut>
 			myOrder = theOrder;
 		}

+		@Override
 		public Object getInterceptor() {
 			return myInterceptor;
 		}

-		abstract Object invoke(HookParams theParams);
+		@Override
+		public int getOrder() {
+			return myOrder;
+		}

 		@Override
-		public int compareTo(BaseInvoker theInvoker) {
-			return myOrder - theInvoker.myOrder;
+		public int compareTo(IInvoker theInvoker) {
+			return myOrder - theInvoker.getOrder();
 		}
 	}
-
-	protected static <T extends Annotation> Optional<T> findAnnotation(
-			AnnotatedElement theObject, Class<T> theHookClass) {
-		T annotation;
-		if (theObject instanceof Method) {
-			annotation = MethodUtils.getAnnotation((Method) theObject, theHookClass, true, true);
-		} else {
-			annotation = theObject.getAnnotation(theHookClass);
-		}
-		return Optional.ofNullable(annotation);
-	}
-
-	private static int determineOrder(Class<?> theInterceptorClass) {
-		return findAnnotation(theInterceptorClass, Interceptor.class)
-				.map(Interceptor::order)
-				.orElse(Interceptor.DEFAULT_ORDER);
-	}
-
-	private static String toErrorString(List<String> theParameterTypes) {
-		return theParameterTypes.stream().sorted().collect(Collectors.joining(","));
-	}
 }
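
Editor's note: a sketch (not part of this changeset) of how a hook interacts with the boolean-return convention implemented in callInvokers() above; the pointcut and header name are illustrative only.

import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Interceptor;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.rest.api.server.RequestDetails;

@Interceptor
public class ApiKeyCheckSketch {

    @Hook(Pointcut.SERVER_INCOMING_REQUEST_POST_PROCESSED)
    public boolean requireApiKey(RequestDetails theRequestDetails) {
        // Returning false stops the remaining invokers and tells the caller that
        // normal processing should be aborted; returning true lets the chain continue.
        return theRequestDetails.getHeader("x-api-key") != null;
    }
}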

View File

@@ -64,8 +64,8 @@ public class InterceptorService extends BaseInterceptorService<Pointcut>
 	@Override
 	public void registerAnonymousInterceptor(Pointcut thePointcut, int theOrder, IAnonymousInterceptor theInterceptor) {
-		Validate.notNull(thePointcut);
-		Validate.notNull(theInterceptor);
+		Validate.notNull(thePointcut, "thePointcut must not be null");
+		Validate.notNull(theInterceptor, "theInterceptor must not be null");
 		BaseInvoker invoker = new AnonymousLambdaInvoker(thePointcut, theInterceptor, theOrder);
 		registerAnonymousInterceptor(thePointcut, theInterceptor, invoker);
 	}
@@ -81,7 +81,7 @@ public class InterceptorService extends BaseInterceptorService<Pointcut>
 		}

 		@Override
-		Object invoke(HookParams theParams) {
+		public Object invoke(HookParams theParams) {
 			myHook.invoke(myPointcut, theParams);
 			return true;
 		}
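
Editor's note: a sketch (not part of this changeset) of registering an anonymous, lambda-style interceptor; the int argument is the order used when invokers are sorted, and the pointcut is only an example.

import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.interceptor.executor.InterceptorService;
import ca.uhn.fhir.rest.api.server.RequestDetails;

public class AnonymousInterceptorSketch {
    public static void main(String[] args) {
        InterceptorService interceptorService = new InterceptorService();
        interceptorService.registerAnonymousInterceptor(
                Pointcut.SERVER_INCOMING_REQUEST_POST_PROCESSED,
                0,
                (pointcut, params) -> {
                    RequestDetails requestDetails = params.get(RequestDetails.class);
                    // inspect or log the request here
                });
    }
}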

View File

@@ -20,6 +20,7 @@
 package ca.uhn.fhir.narrative2;

 import ca.uhn.fhir.context.FhirContext;
+import ca.uhn.fhir.context.FhirVersionEnum;
 import ca.uhn.fhir.util.BundleUtil;
 import org.apache.commons.lang3.tuple.Pair;
 import org.hl7.fhir.instance.model.api.IBaseBundle;
@@ -42,7 +43,8 @@ public class NarrativeGeneratorTemplateUtils {
 	 * Given a Bundle as input, are any entries present with a given resource type
 	 */
 	public boolean bundleHasEntriesWithResourceType(IBaseBundle theBaseBundle, String theResourceType) {
-		FhirContext ctx = theBaseBundle.getStructureFhirVersionEnum().newContextCached();
+		FhirVersionEnum fhirVersionEnum = theBaseBundle.getStructureFhirVersionEnum();
+		FhirContext ctx = FhirContext.forCached(fhirVersionEnum);
 		List<Pair<String, IBaseResource>> entryResources =
 				BundleUtil.getBundleEntryUrlsAndResources(ctx, theBaseBundle);
 		return entryResources.stream()

View File

@@ -105,7 +105,6 @@ public abstract class BaseParser implements IParser {
 	private static final Set<String> notEncodeForContainedResource =
 			new HashSet<>(Arrays.asList("security", "versionId", "lastUpdated"));

-	private FhirTerser.ContainedResources myContainedResources;
 	private boolean myEncodeElementsAppliesToChildResourcesOnly;
 	private final FhirContext myContext;
 	private Collection<String> myDontEncodeElements;
@@ -183,12 +182,15 @@ public abstract class BaseParser implements IParser {
 	}

 	private String determineReferenceText(
-			IBaseReference theRef, CompositeChildElement theCompositeChildElement, IBaseResource theResource) {
+			IBaseReference theRef,
+			CompositeChildElement theCompositeChildElement,
+			IBaseResource theResource,
+			EncodeContext theContext) {
 		IIdType ref = theRef.getReferenceElement();
 		if (isBlank(ref.getIdPart())) {
 			String reference = ref.getValue();
 			if (theRef.getResource() != null) {
-				IIdType containedId = getContainedResources().getResourceId(theRef.getResource());
+				IIdType containedId = theContext.getContainedResources().getResourceId(theRef.getResource());
 				if (containedId != null && !containedId.isEmpty()) {
 					if (containedId.isLocal()) {
 						reference = containedId.getValue();
@@ -262,7 +264,8 @@ public abstract class BaseParser implements IParser {
 	@Override
 	public final void encodeResourceToWriter(IBaseResource theResource, Writer theWriter)
 			throws IOException, DataFormatException {
-		EncodeContext encodeContext = new EncodeContext(this, myContext.getParserOptions());
+		EncodeContext encodeContext =
+				new EncodeContext(this, myContext.getParserOptions(), new FhirTerser.ContainedResources());
 		encodeResourceToWriter(theResource, theWriter, encodeContext);
 	}
@@ -285,7 +288,8 @@ public abstract class BaseParser implements IParser {
 		} else if (theElement instanceof IPrimitiveType) {
 			theWriter.write(((IPrimitiveType<?>) theElement).getValueAsString());
 		} else {
-			EncodeContext encodeContext = new EncodeContext(this, myContext.getParserOptions());
+			EncodeContext encodeContext =
+					new EncodeContext(this, myContext.getParserOptions(), new FhirTerser.ContainedResources());
 			encodeToWriter(theElement, theWriter, encodeContext);
 		}
 	}
@@ -404,10 +408,6 @@ public abstract class BaseParser implements IParser {
 		return elementId;
 	}

-	FhirTerser.ContainedResources getContainedResources() {
-		return myContainedResources;
-	}
-
 	@Override
 	public Set<String> getDontStripVersionsFromReferencesAtPaths() {
 		return myDontStripVersionsFromReferencesAtPaths;
@@ -539,10 +539,11 @@ public abstract class BaseParser implements IParser {
 		return mySuppressNarratives;
 	}

-	protected boolean isChildContained(BaseRuntimeElementDefinition<?> childDef, boolean theIncludedResource) {
+	protected boolean isChildContained(
+			BaseRuntimeElementDefinition<?> childDef, boolean theIncludedResource, EncodeContext theContext) {
 		return (childDef.getChildType() == ChildTypeEnum.CONTAINED_RESOURCES
 						|| childDef.getChildType() == ChildTypeEnum.CONTAINED_RESOURCE_LIST)
-				&& getContainedResources().isEmpty() == false
+				&& theContext.getContainedResources().isEmpty() == false
 				&& theIncludedResource == false;
 	}
@@ -788,7 +789,8 @@ public abstract class BaseParser implements IParser {
 		 */
 		if (next instanceof IBaseReference) {
 			IBaseReference nextRef = (IBaseReference) next;
-			String refText = determineReferenceText(nextRef, theCompositeChildElement, theResource);
+			String refText =
+					determineReferenceText(nextRef, theCompositeChildElement, theResource, theEncodeContext);
 			if (!StringUtils.equals(refText, nextRef.getReferenceElement().getValue())) {

 				if (retVal == theValues) {
@@ -980,7 +982,7 @@ public abstract class BaseParser implements IParser {
 		return true;
 	}

-	protected void containResourcesInReferences(IBaseResource theResource) {
+	protected void containResourcesInReferences(IBaseResource theResource, EncodeContext theContext) {

 		/*
 		 * If a UUID is present in Bundle.entry.fullUrl but no value is present
@@ -1003,7 +1005,7 @@ public abstract class BaseParser implements IParser {
 			}
 		}

-		myContainedResources = getContext().newTerser().containResources(theResource);
+		theContext.setContainedResources(getContext().newTerser().containResources(theResource));
 	}

 	static class ChildNameAndDef {
@@ -1034,8 +1036,12 @@ public abstract class BaseParser implements IParser {
 		private final List<EncodeContextPath> myEncodeElementPaths;
 		private final Set<String> myEncodeElementsAppliesToResourceTypes;
 		private final List<EncodeContextPath> myDontEncodeElementPaths;
+		private FhirTerser.ContainedResources myContainedResources;

-		public EncodeContext(BaseParser theParser, ParserOptions theParserOptions) {
+		public EncodeContext(
+				BaseParser theParser,
+				ParserOptions theParserOptions,
+				FhirTerser.ContainedResources theContainedResources) {
 			Collection<String> encodeElements = theParser.myEncodeElements;
 			Collection<String> dontEncodeElements = theParser.myDontEncodeElements;
 			if (isSummaryMode()) {
@@ -1058,6 +1064,8 @@ public abstract class BaseParser implements IParser {
 						dontEncodeElements.stream().map(EncodeContextPath::new).collect(Collectors.toList());
 			}

+			myContainedResources = theContainedResources;
+
 			myEncodeElementsAppliesToResourceTypes =
 					ParserUtil.determineApplicableResourceTypesForTerserPaths(myEncodeElementPaths);
 		}
@@ -1065,6 +1073,14 @@ public abstract class BaseParser implements IParser {
 		private Map<Key, List<BaseParser.CompositeChildElement>> getCompositeChildrenCache() {
 			return myCompositeChildrenCache;
 		}
+
+		public FhirTerser.ContainedResources getContainedResources() {
+			return myContainedResources;
+		}
+
+		public void setContainedResources(FhirTerser.ContainedResources theContainedResources) {
+			myContainedResources = theContainedResources;
+		}
 	}

 	protected class CompositeChildElement {
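
Editor's note: a sketch (not part of this changeset) of the scenario this refactoring targets: contained-resource bookkeeping now lives in the per-call EncodeContext instead of a parser field, so the local ids assigned during one encode cannot leak into another. The resource content below is illustrative.

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.FhirVersionEnum;
import org.hl7.fhir.r4.model.Observation;
import org.hl7.fhir.r4.model.Patient;
import org.hl7.fhir.r4.model.Reference;

public class ContainedEncodeSketch {
    public static void main(String[] args) {
        FhirContext ctx = FhirContext.forCached(FhirVersionEnum.R4);

        Patient patient = new Patient();
        patient.addName().setFamily("Simpson");

        Observation obs = new Observation();
        Reference subject = new Reference();
        subject.setResource(patient); // no id assigned yet, so the parser must generate one
        obs.setSubject(subject);

        // The parser adds the Patient to Observation.contained, assigns it a local id,
        // and rewrites the subject reference to point at that id.
        String json = ctx.newJsonParser().setPrettyPrint(true).encodeResourceToString(obs);
        System.out.println(json);
    }
}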

View File

@@ -54,6 +54,7 @@ import ca.uhn.fhir.parser.json.JsonLikeStructure;
 import ca.uhn.fhir.parser.json.jackson.JacksonStructure;
 import ca.uhn.fhir.rest.api.EncodingEnum;
 import ca.uhn.fhir.util.ElementUtil;
+import ca.uhn.fhir.util.FhirTerser;
 import org.apache.commons.lang3.StringUtils;
 import org.apache.commons.lang3.Validate;
 import org.apache.commons.text.WordUtils;
@@ -386,12 +387,14 @@ public class JsonParser extends BaseParser implements IJsonLikeParser {
 			}
 			case CONTAINED_RESOURCE_LIST:
 			case CONTAINED_RESOURCES: {
-				List<IBaseResource> containedResources = getContainedResources().getContainedResources();
+				List<IBaseResource> containedResources =
+						theEncodeContext.getContainedResources().getContainedResources();
 				if (containedResources.size() > 0) {
 					beginArray(theEventWriter, theChildName);

 					for (IBaseResource next : containedResources) {
-						IIdType resourceId = getContainedResources().getResourceId(next);
+						IIdType resourceId =
+								theEncodeContext.getContainedResources().getResourceId(next);
 						String value = resourceId.getValue();
 						encodeResourceToJsonStreamWriter(
 								theResDef,
@@ -554,7 +557,8 @@ public class JsonParser extends BaseParser implements IJsonLikeParser {
 			if (nextValue == null || nextValue.isEmpty()) {
 				if (nextValue instanceof BaseContainedDt) {
-					if (theContainedResource || getContainedResources().isEmpty()) {
+					if (theContainedResource
+							|| theEncodeContext.getContainedResources().isEmpty()) {
 						continue;
 					}
 				} else {
@@ -838,7 +842,8 @@ public class JsonParser extends BaseParser implements IJsonLikeParser {
 					+ theResource.getStructureFhirVersionEnum());
 		}

-		EncodeContext encodeContext = new EncodeContext(this, getContext().getParserOptions());
+		EncodeContext encodeContext =
+				new EncodeContext(this, getContext().getParserOptions(), new FhirTerser.ContainedResources());
 		String resourceName = getContext().getResourceType(theResource);
 		encodeContext.pushPath(resourceName, true);
 		doEncodeResourceToJsonLikeWriter(theResource, theJsonLikeWriter, encodeContext);
@@ -894,7 +899,7 @@ public class JsonParser extends BaseParser implements IJsonLikeParser {
 		}

 		if (!theContainedResource) {
-			containResourcesInReferences(theResource);
+			containResourcesInReferences(theResource, theEncodeContext);
 		}

 		RuntimeResourceDefinition resDef = getContext().getResourceDefinition(theResource);


@@ -191,7 +191,7 @@ public class RDFParser extends BaseParser {
 		}
 		if (!containedResource) {
-			containResourcesInReferences(resource);
+			containResourcesInReferences(resource, encodeContext);
 		}
 		if (!(resource instanceof IAnyResource)) {
@@ -354,7 +354,7 @@ public class RDFParser extends BaseParser {
 		try {
 			if (element == null || element.isEmpty()) {
-				if (!isChildContained(childDef, includedResource)) {
+				if (!isChildContained(childDef, includedResource, theEncodeContext)) {
 					return rdfModel;
 				}
 			}


@@ -295,7 +295,7 @@ public class XmlParser extends BaseParser {
 		try {
 			if (theElement == null || theElement.isEmpty()) {
-				if (isChildContained(childDef, theIncludedResource)) {
+				if (isChildContained(childDef, theIncludedResource, theEncodeContext)) {
 					// We still want to go in..
 				} else {
 					return;
@@ -359,8 +359,10 @@ public class XmlParser extends BaseParser {
 			 * theEventWriter.writeStartElement("contained"); encodeResourceToXmlStreamWriter(next, theEventWriter, true, fixContainedResourceId(next.getId().getValue()));
 			 * theEventWriter.writeEndElement(); }
 			 */
-			for (IBaseResource next : getContainedResources().getContainedResources()) {
-				IIdType resourceId = getContainedResources().getResourceId(next);
+			for (IBaseResource next :
+					theEncodeContext.getContainedResources().getContainedResources()) {
+				IIdType resourceId =
+						theEncodeContext.getContainedResources().getResourceId(next);
 				theEventWriter.writeStartElement("contained");
 				String value = resourceId.getValue();
 				encodeResourceToXmlStreamWriter(
@@ -682,7 +684,7 @@ public class XmlParser extends BaseParser {
 		}
 		if (!theContainedResource) {
-			containResourcesInReferences(theResource);
+			containResourcesInReferences(theResource, theEncodeContext);
 		}
 		theEventWriter.writeStartElement(resDef.getName());


@@ -28,6 +28,8 @@ import ca.uhn.fhir.rest.server.exceptions.ForbiddenOperationException;
 import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
 import ca.uhn.fhir.rest.server.exceptions.NotImplementedOperationException;
 import com.google.common.annotations.Beta;
+import com.google.common.collect.ArrayListMultimap;
+import com.google.common.collect.Multimap;
 import org.hl7.fhir.instance.model.api.IBaseBundle;
 import org.hl7.fhir.instance.model.api.IBaseConformance;
 import org.hl7.fhir.instance.model.api.IBaseParameters;
@@ -231,6 +233,23 @@ public interface Repository {
 	// Querying starts here
+	/**
+	 * Searches this repository
+	 *
+	 * @see <a href="https://www.hl7.org/fhir/http.html#search">FHIR search</a>
+	 *
+	 * @param <B> a Bundle type
+	 * @param <T> a Resource type
+	 * @param bundleType the class of the Bundle type to return
+	 * @param resourceType the class of the Resource type to search
+	 * @param searchParameters the searchParameters for this search
+	 * @return a Bundle with the results of the search
+	 */
+	default <B extends IBaseBundle, T extends IBaseResource> B search(
+			Class<B> bundleType, Class<T> resourceType, Multimap<String, List<IQueryParameterType>> searchParameters) {
+		return this.search(bundleType, resourceType, searchParameters, Collections.emptyMap());
+	}
 	/**
 	 * Searches this repository
 	 *
@@ -264,9 +283,32 @@
 	<B extends IBaseBundle, T extends IBaseResource> B search(
 			Class<B> bundleType,
 			Class<T> resourceType,
-			Map<String, List<IQueryParameterType>> searchParameters,
+			Multimap<String, List<IQueryParameterType>> searchParameters,
 			Map<String, String> headers);
+	/**
+	 * Searches this repository
+	 *
+	 * @see <a href="https://www.hl7.org/fhir/http.html#search">FHIR search</a>
+	 *
+	 * @param <B> a Bundle type
+	 * @param <T> a Resource type
+	 * @param bundleType the class of the Bundle type to return
+	 * @param resourceType the class of the Resource type to search
+	 * @param searchParameters the searchParameters for this search
+	 * @param headers headers for this request, typically key-value pairs of HTTP headers
+	 * @return a Bundle with the results of the search
+	 */
+	default <B extends IBaseBundle, T extends IBaseResource> B search(
+			Class<B> bundleType,
+			Class<T> resourceType,
+			Map<String, List<IQueryParameterType>> searchParameters,
+			Map<String, String> headers) {
+		ArrayListMultimap<String, List<IQueryParameterType>> multimap = ArrayListMultimap.create();
+		searchParameters.forEach(multimap::put);
+		return this.search(bundleType, resourceType, multimap, headers);
+	}
 	// Paging starts here
 	/**


@@ -38,7 +38,6 @@ import ca.uhn.fhir.model.api.IResource;
 import ca.uhn.fhir.model.api.ISupportsUndeclaredExtensions;
 import ca.uhn.fhir.model.base.composite.BaseContainedDt;
 import ca.uhn.fhir.model.base.composite.BaseResourceReferenceDt;
-import ca.uhn.fhir.model.primitive.IdDt;
 import ca.uhn.fhir.model.primitive.StringDt;
 import ca.uhn.fhir.parser.DataFormatException;
 import com.google.common.collect.Lists;
@@ -61,6 +60,7 @@ import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.Collection;
 import java.util.Collections;
+import java.util.Comparator;
 import java.util.HashMap;
 import java.util.HashSet;
 import java.util.IdentityHashMap;
@@ -70,6 +70,7 @@ import java.util.Map;
 import java.util.Objects;
 import java.util.Optional;
 import java.util.Set;
+import java.util.UUID;
 import java.util.regex.Matcher;
 import java.util.regex.Pattern;
 import java.util.stream.Collectors;
@@ -77,16 +78,28 @@ import java.util.stream.Collectors;
 import static org.apache.commons.lang3.StringUtils.defaultString;
 import static org.apache.commons.lang3.StringUtils.isBlank;
 import static org.apache.commons.lang3.StringUtils.isNotBlank;
-import static org.apache.commons.lang3.StringUtils.substring;
 public class FhirTerser {
 	private static final Pattern COMPARTMENT_MATCHER_PATH =
 			Pattern.compile("([a-zA-Z.]+)\\.where\\(resolve\\(\\) is ([a-zA-Z]+)\\)");
 	private static final String USER_DATA_KEY_CONTAIN_RESOURCES_COMPLETED =
 			FhirTerser.class.getName() + "_CONTAIN_RESOURCES_COMPLETED";
 	private final FhirContext myContext;
+	/**
+	 * This comparator sorts IBaseReferences, and places any that are missing an ID at the end. Those with an ID go to the front.
+	 */
+	private static final Comparator<IBaseReference> REFERENCES_WITH_IDS_FIRST =
+			Comparator.nullsLast(Comparator.comparing(ref -> {
+				if (ref.getResource() == null) return true;
+				if (ref.getResource().getIdElement() == null) return true;
+				if (ref.getResource().getIdElement().getValue() == null) return true;
+				return false;
+			}));
 	public FhirTerser(FhirContext theContext) {
 		super();
 		myContext = theContext;
@@ -1418,6 +1431,13 @@ public class FhirTerser {
 	private void containResourcesForEncoding(
 			ContainedResources theContained, IBaseResource theResource, boolean theModifyResource) {
 		List<IBaseReference> allReferences = getAllPopulatedChildElementsOfType(theResource, IBaseReference.class);
+		// Note that we process all contained resources that have arrived here with an ID first, so
+		// that we don't accidentally auto-assign an ID which may collide with a resource we have yet to process.
+		// See: https://github.com/hapifhir/hapi-fhir/issues/6403
+		allReferences.sort(REFERENCES_WITH_IDS_FIRST);
 		for (IBaseReference next : allReferences) {
 			IBaseResource resource = next.getResource();
 			if (resource == null && next.getReferenceElement().isLocal()) {
@@ -1437,11 +1457,11 @@ public class FhirTerser {
 			IBaseResource resource = next.getResource();
 			if (resource != null) {
 				if (resource.getIdElement().isEmpty() || resource.getIdElement().isLocal()) {
-					if (theContained.getResourceId(resource) != null) {
-						// Prevent infinite recursion if there are circular loops in the contained resources
+					IIdType id = theContained.addContained(resource);
+					if (id == null) {
 						continue;
 					}
-					IIdType id = theContained.addContained(resource);
 					if (theModifyResource) {
 						getContainedResourceList(theResource).add(resource);
 						next.setReference(id.getValue());
@@ -1768,8 +1788,6 @@ public class FhirTerser {
 	}
 	public static class ContainedResources {
-		private long myNextContainedId = 1;
 		private List<IBaseResource> myResourceList;
 		private IdentityHashMap<IBaseResource, IIdType> myResourceToIdMap;
 		private Map<String, IBaseResource> myExistingIdToContainedResourceMap;
@@ -1782,6 +1800,11 @@ public class FhirTerser {
 		}
 		public IIdType addContained(IBaseResource theResource) {
+			if (this.getResourceId(theResource) != null) {
+				// Prevent infinite recursion if there are circular loops in the contained resources
+				return null;
+			}
 			IIdType existing = getResourceToIdMap().get(theResource);
 			if (existing != null) {
 				return existing;
@@ -1789,16 +1812,7 @@ public class FhirTerser {
 			IIdType newId = theResource.getIdElement();
 			if (isBlank(newId.getValue())) {
-				newId.setValue("#" + myNextContainedId++);
+				newId.setValue("#" + UUID.randomUUID());
-			} else {
-				// Avoid auto-assigned contained IDs colliding with pre-existing ones
-				String idPart = newId.getValue();
-				if (substring(idPart, 0, 1).equals("#")) {
-					idPart = idPart.substring(1);
-					if (StringUtils.isNumeric(idPart)) {
-						myNextContainedId = Long.parseLong(idPart) + 1;
-					}
-				}
 			}
 			getResourceToIdMap().put(theResource, newId);
@@ -1862,45 +1876,5 @@ public class FhirTerser {
 		public boolean hasExistingIdToContainedResource() {
 			return myExistingIdToContainedResourceMap != null;
 		}
-		public void assignIdsToContainedResources() {
-			if (!getContainedResources().isEmpty()) {
-				/*
-				 * The idea with the code block below:
-				 *
-				 * We want to preserve any IDs that were user-assigned, so that if it's really
-				 * important to someone that their contained resource have the ID of #FOO
-				 * or #1 we will keep that.
-				 *
-				 * For any contained resources where no ID was assigned by the user, we
-				 * want to manually create an ID but make sure we don't reuse an existing ID.
-				 */
-				Set<String> ids = new HashSet<>();
-				// Gather any user assigned IDs
-				for (IBaseResource nextResource : getContainedResources()) {
-					if (getResourceToIdMap().get(nextResource) != null) {
-						ids.add(getResourceToIdMap().get(nextResource).getValue());
-					}
-				}
-				// Automatically assign IDs to the rest
-				for (IBaseResource nextResource : getContainedResources()) {
-					while (getResourceToIdMap().get(nextResource) == null) {
-						String nextCandidate = "#" + myNextContainedId;
-						myNextContainedId++;
-						if (!ids.add(nextCandidate)) {
-							continue;
-						}
-						getResourceToIdMap().put(nextResource, new IdDt(nextCandidate));
-					}
-				}
-			}
-		}
 	}
 }


@@ -170,6 +170,7 @@ public enum VersionEnum {
 	V7_5_0,
 	V7_6_0,
+	V7_6_1,
 	V7_7_0,
 	V7_8_0;


@@ -0,0 +1,62 @@
/*-
* #%L
* HAPI FHIR - Core Library
* %%
* Copyright (C) 2014 - 2024 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.util.adapters;
import jakarta.annotation.Nonnull;
import java.util.HashSet;
import java.util.Optional;
import java.util.Set;
import java.util.stream.Stream;
public class AdapterManager implements IAdapterManager {
public static final AdapterManager INSTANCE = new AdapterManager();
Set<IAdapterFactory> myAdapterFactories = new HashSet<>();
/**
* Hidden to force shared use of the public INSTANCE.
*/
AdapterManager() {}
public <T> @Nonnull Optional<T> getAdapter(Object theObject, Class<T> theTargetType) {
// todo this can be sped up with a cache of type->Factory.
return myAdapterFactories.stream()
.filter(nextFactory -> nextFactory.getAdapters().stream().anyMatch(theTargetType::isAssignableFrom))
.flatMap(nextFactory -> {
var adapter = nextFactory.getAdapter(theObject, theTargetType);
// can't use Optional.stream() because our Android target is API level 26 / JDK 8.
if (adapter.isPresent()) {
return Stream.of(adapter.get());
} else {
return Stream.empty();
}
})
.findFirst();
}
public void registerFactory(@Nonnull IAdapterFactory theFactory) {
myAdapterFactories.add(theFactory);
}
public void unregisterFactory(@Nonnull IAdapterFactory theFactory) {
myAdapterFactories.remove(theFactory);
}
}


@@ -0,0 +1,48 @@
/*-
* #%L
* HAPI FHIR - Core Library
* %%
* Copyright (C) 2014 - 2024 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.util.adapters;
import java.util.Optional;
public class AdapterUtils {
/**
* Main entry point for adapter calls.
* Implements three conversions: cast to the target type, use IAdaptable if present, or lastly try the AdapterManager.INSTANCE.
* @param theObject the object to be adapted
* @param theTargetType the type of the adapter requested
*/
static <T> Optional<T> adapt(Object theObject, Class<T> theTargetType) {
if (theTargetType.isInstance(theObject)) {
//noinspection unchecked
return Optional.of((T) theObject);
}
if (theObject instanceof IAdaptable) {
IAdaptable adaptable = (IAdaptable) theObject;
var adapted = adaptable.getAdapter(theTargetType);
if (adapted.isPresent()) {
return adapted;
}
}
return AdapterManager.INSTANCE.getAdapter(theObject, theTargetType);
}
}


@@ -0,0 +1,38 @@
/*-
* #%L
* HAPI FHIR - Core Library
* %%
* Copyright (C) 2014 - 2024 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.util.adapters;
import jakarta.annotation.Nonnull;
import java.util.Optional;
/**
* Generic version of Eclipse IAdaptable interface.
*/
public interface IAdaptable {
/**
* Get an adapter of requested type.
* @param theTargetType the desired type of the adapter
* @return an adapter of theTargetType if possible, or empty.
*/
default <T> @Nonnull Optional<T> getAdapter(@Nonnull Class<T> theTargetType) {
return AdapterUtils.adapt(this, theTargetType);
}
}


@@ -0,0 +1,44 @@
/*-
* #%L
* HAPI FHIR - Core Library
* %%
* Copyright (C) 2014 - 2024 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.util.adapters;
import java.util.Collection;
import java.util.Optional;
/**
* Interface for an external service that builds adapters for targets.
*/
public interface IAdapterFactory {
/**
* Build an adapter for the target.
* May return empty() even if the target type is listed in getAdapters() when
* the factory fails to convert a particular instance.
*
* @param theObject the object to be adapted.
* @param theAdapterType the target type
* @return the adapter, if possible.
*/
<T> Optional<T> getAdapter(Object theObject, Class<T> theAdapterType);
/**
* @return the collection of adapter target types handled by this factory.
*/
Collection<Class<?>> getAdapters();
}


@@ -0,0 +1,29 @@
/*-
* #%L
* HAPI FHIR - Core Library
* %%
* Copyright (C) 2014 - 2024 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.util.adapters;
import java.util.Optional;
/**
* Get an adapter.
*/
public interface IAdapterManager {
<T> Optional<T> getAdapter(Object theTarget, Class<T> theAdapter);
}


@@ -0,0 +1,39 @@
/*-
* #%L
* HAPI FHIR - Core Library
* %%
* Copyright (C) 2014 - 2024 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
/**
* Implements the Adapter pattern to allow external classes to extend/adapt existing classes.
* Useful for extending interfaces that are closed to modification, or restricted for classpath reasons.
* <p>
* For clients, the main entry point is {@link ca.uhn.fhir.util.adapters.AdapterUtils#adapt(java.lang.Object, java.lang.Class)}
* which will attempt to cast to the target type, or build an adapter of the target type.
* </p>
* <p>
* For implementors, you can support adaptation via two mechanisms:
* <ul>
* <li>by implementing {@link ca.uhn.fhir.util.adapters.IAdaptable} directly on a class to provide supported adapters,
* <li>or when the class is closed to direct modification, you can implement
* an instance of {@link ca.uhn.fhir.util.adapters.IAdapterFactory} and register
* it with the public {@link ca.uhn.fhir.util.adapters.AdapterManager#INSTANCE}.</li>
* </ul>
* The AdapterUtils.adapt() supports both of these.
* </p>
* Inspired by the Eclipse runtime.
*/
package ca.uhn.fhir.util.adapters;
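
As an editorial illustration (not part of this change set) of the two extension mechanisms described in the package documentation above, here is a minimal, hypothetical sketch. The `IPrettyPrinter` target type, `PrettyPrinterFactory`, and `AdapterSketch` names are invented for the example; only `AdapterManager` and `IAdapterFactory` come from the new package.

```java
import ca.uhn.fhir.util.adapters.AdapterManager;
import ca.uhn.fhir.util.adapters.IAdapterFactory;

import java.util.Collection;
import java.util.List;
import java.util.Optional;

// Hypothetical adapter target type, invented for this sketch.
interface IPrettyPrinter {
	String prettyPrint();
}

// A factory for classes that are closed to modification. Classes you do control
// could instead implement IAdaptable directly and skip the factory entirely.
class PrettyPrinterFactory implements IAdapterFactory {
	@Override
	public <T> Optional<T> getAdapter(Object theObject, Class<T> theAdapterType) {
		if (theAdapterType.isAssignableFrom(IPrettyPrinter.class)) {
			IPrettyPrinter printer = theObject::toString; // trivially adapt via toString()
			return Optional.of(theAdapterType.cast(printer));
		}
		return Optional.empty();
	}

	@Override
	public Collection<Class<?>> getAdapters() {
		return List.of(IPrettyPrinter.class);
	}
}

class AdapterSketch {
	static Optional<IPrettyPrinter> adaptToPrettyPrinter(Object theAnyObject) {
		// One-time registration with the shared, public manager instance.
		AdapterManager.INSTANCE.registerFactory(new PrettyPrinterFactory());
		return AdapterManager.INSTANCE.getAdapter(theAnyObject, IPrettyPrinter.class);
	}
}
```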


@@ -0,0 +1,78 @@
package ca.uhn.fhir.util.adapters;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.Test;
import java.util.Collection;
import java.util.List;
import java.util.Optional;
import static org.assertj.core.api.Assertions.assertThat;
public class AdapterManagerTest {
AdapterManager myAdapterManager = new AdapterManager();
@AfterAll
static void tearDown() {
assertThat(AdapterManager.INSTANCE.myAdapterFactories)
.withFailMessage("Don't dirty the public instance").isEmpty();
}
@Test
void testRegisterFactory_providesAdapter() {
// given
myAdapterManager.registerFactory(new StringToIntFactory());
// when
var result = myAdapterManager.getAdapter("22", Integer.class);
// then
assertThat(result).contains(22);
}
@Test
void testRegisterFactory_wrongTypeStillEmpty() {
// given
myAdapterManager.registerFactory(new StringToIntFactory());
// when
var result = myAdapterManager.getAdapter("22", Float.class);
// then
assertThat(result).isEmpty();
}
@Test
void testUnregisterFactory_providesEmpty() {
// given active factory, now gone.
StringToIntFactory factory = new StringToIntFactory();
myAdapterManager.registerFactory(factory);
myAdapterManager.getAdapter("22", Integer.class);
myAdapterManager.unregisterFactory(factory);
// when
var result = myAdapterManager.getAdapter("22", Integer.class);
// then
assertThat(result).isEmpty();
}
static class StringToIntFactory implements IAdapterFactory {
@Override
public <T> Optional<T> getAdapter(Object theObject, Class<T> theAdapterType) {
if (theObject instanceof String s) {
if (theAdapterType.isAssignableFrom(Integer.class)) {
@SuppressWarnings("unchecked")
T i = (T) Integer.valueOf(s);
return Optional.of(i);
}
}
return Optional.empty();
}
public Collection<Class<?>> getAdapters() {
return List.of(Integer.class);
}
}
}


@@ -0,0 +1,123 @@
package ca.uhn.fhir.util.adapters;
import jakarta.annotation.Nonnull;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.Test;
import java.util.Collection;
import java.util.Optional;
import java.util.Set;
import static org.assertj.core.api.Assertions.assertThat;
class AdapterUtilsTest {
final private IAdapterFactory myTestFactory = new TestAdaptorFactory();
@AfterEach
void tearDown() {
AdapterManager.INSTANCE.unregisterFactory(myTestFactory);
}
@Test
void testNullDoesNotAdapt() {
// when
var adapted = AdapterUtils.adapt(null, InterfaceA.class);
// then
assertThat(adapted).isEmpty();
}
@Test
void testAdaptObjectImplementingInterface() {
// given
var object = new ClassB();
// when
var adapted = AdapterUtils.adapt(object, InterfaceA.class);
// then
assertThat(adapted)
.isPresent()
.get().isInstanceOf(InterfaceA.class);
assertThat(adapted.get()).withFailMessage("Use object since it implements interface").isSameAs(object);
}
@Test
void testAdaptObjectImplementingAdaptorSupportingInterface() {
// given
var object = new SelfAdaptableClass();
// when
var adapted = AdapterUtils.adapt(object, InterfaceA.class);
// then
assertThat(adapted)
.isPresent()
.get().isInstanceOf(InterfaceA.class);
}
@Test
void testAdaptObjectViaAdapterManager() {
// given
var object = new ManagerAdaptableClass();
AdapterManager.INSTANCE.registerFactory(myTestFactory);
// when
var adapted = AdapterUtils.adapt(object, InterfaceA.class);
// then
assertThat(adapted)
.isPresent()
.get().isInstanceOf(InterfaceA.class);
}
interface InterfaceA {
}
static class ClassB implements InterfaceA {
}
/** class that can adapt itself to IAdaptable */
static class SelfAdaptableClass implements IAdaptable {
@Nonnull
@Override
public <T> Optional<T> getAdapter(@Nonnull Class<T> theTargetType) {
if (theTargetType.isAssignableFrom(InterfaceA.class)) {
T value = theTargetType.cast(buildInterfaceAWrapper(this));
return Optional.of(value);
}
return Optional.empty();
}
}
private static @Nonnull InterfaceA buildInterfaceAWrapper(Object theObject) {
return new InterfaceA() {};
}
/** Class that relies on an external IAdapterFactory */
static class ManagerAdaptableClass {
}
static class TestAdaptorFactory implements IAdapterFactory {
@Override
public <T> Optional<T> getAdapter(Object theObject, Class<T> theAdapterType) {
if (theObject instanceof ManagerAdaptableClass && theAdapterType == InterfaceA.class) {
T adapter = theAdapterType.cast(buildInterfaceAWrapper(theObject));
return Optional.of(adapter);
}
return Optional.empty();
}
@Override
public Collection<Class<?>> getAdapters() {
return Set.of(InterfaceA.class);
}
}
}


@@ -668,7 +668,7 @@ public abstract class BaseCommand implements Comparable<BaseCommand> {
 	protected void parseFhirContext(CommandLine theCommandLine) throws ParseException {
 		FhirVersionEnum versionEnum = parseFhirVersion(theCommandLine);
-		myFhirCtx = versionEnum.newContext();
+		myFhirCtx = FhirContext.forVersion(versionEnum);
 	}
 	public abstract void run(CommandLine theCommandLine) throws ParseException, ExecutionException;


@@ -98,7 +98,7 @@ public class VersionCanonicalizer {
 	private final FhirContext myContext;
 	public VersionCanonicalizer(FhirVersionEnum theTargetVersion) {
-		this(theTargetVersion.newContextCached());
+		this(FhirContext.forCached(theTargetVersion));
 	}
 	public VersionCanonicalizer(FhirContext theTargetContext) {


@@ -0,0 +1,4 @@
---
type: change
jira: SMILE-9161
title: "Contained resources which arrive without assigned IDs are now assigned GUIDs, as opposed to monotonically increasing numeric IDs. This avoids a whole class of issues related to processing order and collisions."


@@ -0,0 +1,6 @@
---
type: fix
jira: SMILE-9161
title: "Fixed a rare bug in the JSON Parser, wherein client-assigned contained resource IDs could collide with server-assigned contained IDs. For example if a
resource had a client-assigned contained ID of `#2`, and a contained resource with no ID, then depending on the processing order, the parser could occasionally
provide duplicate contained resource IDs, leading to non-deterministic behaviour."


@@ -0,0 +1,6 @@
---
type: fix
issue: 6419
title: "Previously, on Postgres, the `$reindex` operation with `optimizeStorage` set to `ALL_VERSIONS` would process
only a subset of versions if there were more than 100 versions to be processed for a resource. This has been fixed
so that all versions of the resource are now processed."


@@ -0,0 +1,5 @@
---
type: fix
issue: 6420
title: "Previously, when the `$reindex` operation is run for a single FHIR resource with `optimizeStorage` set to
`ALL_VERSIONS`, none of the versions of the resource were processed in `hfj_res_ver` table. This has been fixed."


@@ -0,0 +1,7 @@
---
type: fix
issue: 6422
title: "Previously, since 7.4.4 the validation issue detail codes were not translated correctly for Remote Terminology
validateCode calls. The detail code used was `invalid-code` for all use-cases which resulted in profile binding strength
not being applied to the issue severity as expected when validating resources against a profile.
This has been fixed and issue detail codes are translated correctly."


@@ -0,0 +1,8 @@
---
type: fix
issue: 6440
title: "Previously, if an `IInterceptorBroadcaster` was set in a `RequestDetails` object,
`STORAGE_PRECHECK_FOR_CACHED_SEARCH` hooks that were registered to that `IInterceptorBroadcaster` were not
called. Also, if an `IInterceptorBroadcaster` was set in the `RequestDetails` object, the boolean return values of the hooks
registered to that `IInterceptorBroadcaster` were not taken into account. This second issue existed for all pointcuts
that returned a boolean type, not just for `STORAGE_PRECHECK_FOR_CACHED_SEARCH`. These issues have now been fixed."


@@ -0,0 +1,4 @@
---
type: add
issue: 6445
title: "Add Multimap versions of the search() methods to Repository to support queries like `Patient?_tag=a&_tag=b`"


@@ -0,0 +1,6 @@
---
type: fix
issue: 6451
jira: SMILE-9089
title: "Previously, activating `BulkDataImport` job would not change jobs status to `RUNNING`,
causing it to be processed multiple times instead of single time. This has been fixed."


@@ -0,0 +1,5 @@
---
type: fix
issue: 6467
title: "Fixed an incompatibility between Hibernate Search and Lucene versions that caused ValueSet expansion to fail
when Hibernate Search was configured to use Lucene."


@@ -0,0 +1,4 @@
---
type: perf
issue: 6489
title: "Change the migrator to avoid table locks when adding an index. This allows systems to continue running during upgrade."


@@ -4,7 +4,7 @@
 title: "The version of a few dependencies have been bumped to more recent versions
 (dependent HAPI modules listed in brackets):
 <ul>
-<li>org.hl7.fhir.core (Base): 6.3.18 -&gt; 6.3.25</li>
+<li>org.hl7.fhir.core (Base): 6.3.18 -&gt; 6.4.0</li>
 <li>Bower/Moment.js (hapi-fhir-testpage-overlay): 2.27.0 -&gt; 2.29.4</li>
 <li>htmlunit (Base): 3.9.0 -&gt; 3.11.0</li>
 <li>Elasticsearch (Base): 8.11.1 -&gt; 8.14.3</li>


@@ -0,0 +1,5 @@
---
type: fix
issue: 6502
backport: 7.6.1
title: "Support ReferenceParam in addition to UriParam for `_profile` in queries using the SearchParameterMap to match the change in the specification from DSTU3 to R4."


@@ -0,0 +1,8 @@
---
type: add
issue: 6511
title: "Interceptors can be defined against the registry on the RestfulServer, or on
the registry in the JPA repository. Because these are separate registries, the order()
attribute on the Hook annotation isn't correctly processed today across the two
registries. The CompositeInterceptorRegistry has been reworked so ordering will be
respected across both registries."
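
A hedged sketch of the `order()` attribute that this entry refers to; the pointcut and class name are chosen only for illustration, and whether a given hook should run earlier or later is application-specific.

```java
import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Interceptor;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.rest.api.server.RequestDetails;

@Interceptor
public class ExampleOrderedInterceptor {

	// With the reworked broadcaster, this order value is intended to be honoured relative
	// to hooks on the same pointcut even when those hooks were registered on the other
	// registry (RestfulServer vs. JPA repository).
	@Hook(value = Pointcut.SERVER_INCOMING_REQUEST_POST_PROCESSED, order = 10)
	public void logIncomingRequest(RequestDetails theRequestDetails) {
		// Lower order values run first, so this hook runs after any hook with order < 10.
	}
}
```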


@@ -0,0 +1,4 @@
---
type: remove
issue: 6512
title: "The methods on FhirVersionEnum which produces a FhirContext (newContext() ,and newContextCached()) have been deprecated, and will be removed."


@@ -48,24 +48,24 @@ Additional parameters have been added to support CQL evaluation.
 The following parameters are supported for the `Questionnaire/$populate` operation:
 | Parameter | Type | Description |
-|---------------------|---------------|-------------|
+|---------------------|--------------------|-------------|
 | questionnaire | Questionnaire | The Questionnaire to populate. Used when the operation is invoked at the 'type' level. |
 | canonical | canonical | The canonical identifier for the Questionnaire (optionally version-specific). |
 | url | uri | Canonical URL of the Questionnaire when invoked at the resource type level. This is exclusive with the questionnaire and canonical parameters. |
 | version | string | Version of the Questionnaire when invoked at the resource type level. This is exclusive with the questionnaire and canonical parameters. |
 | subject | Reference | The resource that is to be the QuestionnaireResponse.subject. The QuestionnaireResponse instance will reference the provided subject. |
 | context | | Resources containing information to be used to help populate the QuestionnaireResponse. |
-| context.name | string | The name of the launchContext or root Questionnaire variable the passed content should be used as for population purposes. The name SHALL correspond to a launchContext or variable delared at the root of the Questionnaire. |
-| context.reference | Reference | The actual resource (or resources) to use as the value of the launchContext or variable. |
+| context.name | string | The name of the launchContext or root Questionnaire variable the passed content should be used as for population purposes. The name SHALL correspond to a launchContext or variable declared at the root of the Questionnaire. |
+| context.content | Reference/Resource | The actual resource (or reference) to use as the value of the launchContext or variable. |
 | local | boolean | Whether the server should use what resources and other knowledge it has about the referenced subject when pre-populating answers to questions. |
 | launchContext | Extension | The [Questionnaire Launch Context](https://hl7.org/fhir/uv/sdc/StructureDefinition-sdc-questionnaire-launchContext.html) extension containing Resources that provide context for form processing logic (pre-population) when creating/displaying/editing a QuestionnaireResponse. |
 | parameters | Parameters | Any input parameters defined in libraries referenced by the Questionnaire. |
 | useServerData | boolean | Whether to use data from the server performing the evaluation. |
 | data | Bundle | Data to be made available during CQL evaluation. |
 | dataEndpoint | Endpoint | An endpoint to use to access data referenced by retrieve operations in libraries referenced by the Questionnaire. |
 | contentEndpoint | Endpoint | An endpoint to use to access content (i.e. libraries) referenced by the Questionnaire. |
 | terminologyEndpoint | Endpoint | An endpoint to use to access terminology (i.e. valuesets, codesystems, and membership testing) referenced by the Questionnaire. |
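
Not part of the change above, but as a hedged sketch of how a few of these parameters might be supplied from a HAPI generic client (the server base URL, Questionnaire url, and subject are placeholders, and the exact contents of the returned Parameters depend on the server):

```java
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.BooleanType;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.Questionnaire;
import org.hl7.fhir.r4.model.Reference;
import org.hl7.fhir.r4.model.UriType;

public class PopulateInvocationSketch {
	public static void main(String[] args) {
		FhirContext ctx = FhirContext.forR4Cached();
		IGenericClient client = ctx.newRestfulGenericClient("http://example.org/fhir");

		Parameters inParams = new Parameters();
		inParams.addParameter("url", new UriType("http://example.org/Questionnaire/demo"));
		inParams.addParameter("subject", new Reference("Patient/123"));
		inParams.addParameter("useServerData", new BooleanType(true));

		// Type-level invocation of Questionnaire/$populate
		Parameters outParams = client.operation()
				.onType(Questionnaire.class)
				.named("$populate")
				.withParameters(inParams)
				.execute();

		System.out.println(ctx.newJsonParser().setPrettyPrint(true).encodeResourceToString(outParams));
	}
}
```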
 ## Extract


@@ -27,7 +27,6 @@ import ca.uhn.fhir.i18n.Msg;
 import ca.uhn.fhir.interceptor.model.RequestPartitionId;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
 import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
-import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
 import ca.uhn.fhir.jpa.api.model.PersistentIdToForcedIdMap;
 import ca.uhn.fhir.jpa.api.svc.IIdHelperService;
 import ca.uhn.fhir.jpa.bulk.export.api.IBulkExportProcessor;
@@ -349,10 +348,9 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
 	 * Get a ISearchBuilder for the given resource type.
 	 */
 	protected ISearchBuilder<JpaPid> getSearchBuilderForResourceType(String theResourceType) {
-		IFhirResourceDao<?> dao = myDaoRegistry.getResourceDao(theResourceType);
 		RuntimeResourceDefinition def = myContext.getResourceDefinition(theResourceType);
 		Class<? extends IBaseResource> typeClass = def.getImplementingClass();
-		return mySearchBuilderFactory.newSearchBuilder(dao, theResourceType, typeClass);
+		return mySearchBuilderFactory.newSearchBuilder(theResourceType, typeClass);
 	}
 	protected RuntimeSearchParam getPatientSearchParamForCurrentResourceType(String theResourceType) {


@@ -217,6 +217,12 @@ public class BulkDataImportSvcImpl implements IBulkDataImportSvc, IHasScheduledJ
 		String biJobId = null;
 		try {
 			biJobId = processJob(bulkImportJobEntity);
+			// set job status to RUNNING so it would not be processed again
+			myTxTemplate.execute(t -> {
+				bulkImportJobEntity.setStatus(BulkImportJobStatusEnum.RUNNING);
+				myJobDao.save(bulkImportJobEntity);
+				return null;
+			});
 		} catch (Exception e) {
 			ourLog.error("Failure while preparing bulk export extract", e);
 			myTxTemplate.execute(t -> {
@@ -256,6 +262,7 @@
 	}
 	@Override
+	@Transactional
 	public JobInfo getJobStatus(String theBiJobId) {
 		BulkImportJobEntity theJob = findJobByBiJobId(theBiJobId);
 		return new JobInfo()


@@ -25,7 +25,6 @@ import ca.uhn.fhir.i18n.Msg;
 import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
 import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
-import ca.uhn.fhir.jpa.api.dao.IDao;
 import ca.uhn.fhir.jpa.api.svc.IIdHelperService;
 import ca.uhn.fhir.jpa.api.svc.ISearchCoordinatorSvc;
 import ca.uhn.fhir.jpa.dao.ISearchBuilder;
@@ -158,10 +157,8 @@ public class SearchConfig {
 	@Bean(name = ISearchBuilder.SEARCH_BUILDER_BEAN_NAME)
 	@Scope("prototype")
-	public ISearchBuilder newSearchBuilder(
-			IDao theDao, String theResourceName, Class<? extends IBaseResource> theResourceType) {
+	public ISearchBuilder newSearchBuilder(String theResourceName, Class<? extends IBaseResource> theResourceType) {
 		return new SearchBuilder(
-				theDao,
 				theResourceName,
 				myStorageSettings,
 				myEntityManagerFactory,


@@ -1057,8 +1057,9 @@ public abstract class BaseHapiFhirDao<T extends IBaseResource> extends BaseStora
 		// Interceptor broadcast: JPA_PERFTRACE_INFO
 		if (!presenceCount.isEmpty()) {
-			if (CompositeInterceptorBroadcaster.hasHooks(
-					Pointcut.JPA_PERFTRACE_INFO, myInterceptorBroadcaster, theRequest)) {
+			IInterceptorBroadcaster compositeBroadcaster =
+					CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
+			if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_INFO)) {
 				StorageProcessingMessage message = new StorageProcessingMessage();
 				message.setMessage(
 						"For " + entity.getIdDt().toUnqualifiedVersionless().getValue() + " added "
@@ -1068,8 +1069,7 @@
 						.add(RequestDetails.class, theRequest)
 						.addIfMatchesType(ServletRequestDetails.class, theRequest)
 						.add(StorageProcessingMessage.class, message);
-				CompositeInterceptorBroadcaster.doCallHooks(
-						myInterceptorBroadcaster, theRequest, Pointcut.JPA_PERFTRACE_INFO, params);
+				compositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_INFO, params);
 			}
 		}
 	}
@@ -1092,8 +1092,10 @@
 		// Interceptor broadcast: JPA_PERFTRACE_INFO
 		if (!searchParamAddRemoveCount.isEmpty()) {
-			if (CompositeInterceptorBroadcaster.hasHooks(
-					Pointcut.JPA_PERFTRACE_INFO, myInterceptorBroadcaster, theRequest)) {
+			IInterceptorBroadcaster compositeBroadcaster =
+					CompositeInterceptorBroadcaster.newCompositeBroadcaster(
+							myInterceptorBroadcaster, theRequest);
+			if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_INFO)) {
 				StorageProcessingMessage message = new StorageProcessingMessage();
 				message.setMessage("For "
 						+ entity.getIdDt().toUnqualifiedVersionless().getValue() + " added "
@@ -1104,8 +1106,7 @@
 						.add(RequestDetails.class, theRequest)
 						.addIfMatchesType(ServletRequestDetails.class, theRequest)
 						.add(StorageProcessingMessage.class, message);
-				CompositeInterceptorBroadcaster.doCallHooks(
-						myInterceptorBroadcaster, theRequest, Pointcut.JPA_PERFTRACE_INFO, params);
+				compositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_INFO, params);
 			}
 		}
 	}


@ -134,6 +134,7 @@ import org.hl7.fhir.r4.model.Parameters.ParametersParameterComponent;
import org.springframework.beans.factory.annotation.Autowired; import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.domain.PageRequest; import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Slice; import org.springframework.data.domain.Slice;
import org.springframework.data.domain.Sort;
import org.springframework.transaction.PlatformTransactionManager; import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.Propagation; import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional; import org.springframework.transaction.annotation.Transactional;
@ -226,15 +227,15 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
@Nullable @Nullable
public static <T extends IBaseResource> T invokeStoragePreShowResources( public static <T extends IBaseResource> T invokeStoragePreShowResources(
IInterceptorBroadcaster theInterceptorBroadcaster, RequestDetails theRequest, T retVal) { IInterceptorBroadcaster theInterceptorBroadcaster, RequestDetails theRequest, T retVal) {
if (CompositeInterceptorBroadcaster.hasHooks( IInterceptorBroadcaster compositeBroadcaster =
Pointcut.STORAGE_PRESHOW_RESOURCES, theInterceptorBroadcaster, theRequest)) { CompositeInterceptorBroadcaster.newCompositeBroadcaster(theInterceptorBroadcaster, theRequest);
if (compositeBroadcaster.hasHooks(Pointcut.STORAGE_PRESHOW_RESOURCES)) {
SimplePreResourceShowDetails showDetails = new SimplePreResourceShowDetails(retVal); SimplePreResourceShowDetails showDetails = new SimplePreResourceShowDetails(retVal);
HookParams params = new HookParams() HookParams params = new HookParams()
.add(IPreResourceShowDetails.class, showDetails) .add(IPreResourceShowDetails.class, showDetails)
.add(RequestDetails.class, theRequest) .add(RequestDetails.class, theRequest)
.addIfMatchesType(ServletRequestDetails.class, theRequest); .addIfMatchesType(ServletRequestDetails.class, theRequest);
CompositeInterceptorBroadcaster.doCallHooks( compositeBroadcaster.callHooks(Pointcut.STORAGE_PRESHOW_RESOURCES, params);
theInterceptorBroadcaster, theRequest, Pointcut.STORAGE_PRESHOW_RESOURCES, params);
//noinspection unchecked //noinspection unchecked
retVal = (T) showDetails.getResource( retVal = (T) showDetails.getResource(
0); // TODO GGG/JA : getting resource 0 is interesting. We apparently allow null values in the list. 0); // TODO GGG/JA : getting resource 0 is interesting. We apparently allow null values in the list.
@ -250,15 +251,15 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
RequestDetails theRequest, RequestDetails theRequest,
IIdType theId, IIdType theId,
IBaseResource theResource) { IBaseResource theResource) {
if (CompositeInterceptorBroadcaster.hasHooks( IInterceptorBroadcaster compositeBroadcaster =
Pointcut.STORAGE_PREACCESS_RESOURCES, theInterceptorBroadcaster, theRequest)) { CompositeInterceptorBroadcaster.newCompositeBroadcaster(theInterceptorBroadcaster, theRequest);
if (compositeBroadcaster.hasHooks(Pointcut.STORAGE_PREACCESS_RESOURCES)) {
SimplePreResourceAccessDetails accessDetails = new SimplePreResourceAccessDetails(theResource); SimplePreResourceAccessDetails accessDetails = new SimplePreResourceAccessDetails(theResource);
HookParams params = new HookParams() HookParams params = new HookParams()
.add(IPreResourceAccessDetails.class, accessDetails) .add(IPreResourceAccessDetails.class, accessDetails)
.add(RequestDetails.class, theRequest) .add(RequestDetails.class, theRequest)
.addIfMatchesType(ServletRequestDetails.class, theRequest); .addIfMatchesType(ServletRequestDetails.class, theRequest);
CompositeInterceptorBroadcaster.doCallHooks( compositeBroadcaster.callHooks(Pointcut.STORAGE_PREACCESS_RESOURCES, params);
theInterceptorBroadcaster, theRequest, Pointcut.STORAGE_PREACCESS_RESOURCES, params);
if (accessDetails.isDontReturnResourceAtIndex(0)) { if (accessDetails.isDontReturnResourceAtIndex(0)) {
throw new ResourceNotFoundException(Msg.code(1995) + "Resource " + theId + " is not known"); throw new ResourceNotFoundException(Msg.code(1995) + "Resource " + theId + " is not known");
} }
@ -1584,15 +1585,15 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
} }
private Optional<T> invokeStoragePreAccessResources(RequestDetails theRequest, T theResource) { private Optional<T> invokeStoragePreAccessResources(RequestDetails theRequest, T theResource) {
if (CompositeInterceptorBroadcaster.hasHooks( IInterceptorBroadcaster compositeBroadcaster =
Pointcut.STORAGE_PREACCESS_RESOURCES, myInterceptorBroadcaster, theRequest)) { CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
if (compositeBroadcaster.hasHooks(Pointcut.STORAGE_PREACCESS_RESOURCES)) {
SimplePreResourceAccessDetails accessDetails = new SimplePreResourceAccessDetails(theResource); SimplePreResourceAccessDetails accessDetails = new SimplePreResourceAccessDetails(theResource);
HookParams params = new HookParams() HookParams params = new HookParams()
.add(IPreResourceAccessDetails.class, accessDetails) .add(IPreResourceAccessDetails.class, accessDetails)
.add(RequestDetails.class, theRequest) .add(RequestDetails.class, theRequest)
.addIfMatchesType(ServletRequestDetails.class, theRequest); .addIfMatchesType(ServletRequestDetails.class, theRequest);
CompositeInterceptorBroadcaster.doCallHooks( compositeBroadcaster.callHooks(Pointcut.STORAGE_PREACCESS_RESOURCES, params);
myInterceptorBroadcaster, theRequest, Pointcut.STORAGE_PREACCESS_RESOURCES, params);
if (accessDetails.isDontReturnResourceAtIndex(0)) { if (accessDetails.isDontReturnResourceAtIndex(0)) {
return Optional.empty(); return Optional.empty();
} }
@ -1697,9 +1698,15 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
if (theOptimizeStorageMode == ReindexParameters.OptimizeStorageModeEnum.ALL_VERSIONS) { if (theOptimizeStorageMode == ReindexParameters.OptimizeStorageModeEnum.ALL_VERSIONS) {
int pageSize = 100; int pageSize = 100;
 for (int page = 0; ((long) page * pageSize) < entity.getVersion(); page++) {
+// We need to sort the pages, because we are updating the same data we are paging through.
+// If not sorted explicitly, a database like Postgres returns the same data multiple times on
+// different pages as the underlying data gets updated.
+PageRequest pageRequest = PageRequest.of(page, pageSize, Sort.by("myId"));
 Slice<ResourceHistoryTable> historyEntities =
 myResourceHistoryTableDao.findForResourceIdAndReturnEntitiesAndFetchProvenance(
-PageRequest.of(page, pageSize), entity.getId(), historyEntity.getVersion());
+pageRequest, entity.getId(), historyEntity.getVersion());
 for (ResourceHistoryTable next : historyEntities) {
 reindexOptimizeStorageHistoryEntity(entity, next);
 }
@@ -2096,7 +2103,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
 }
 ISearchBuilder<JpaPid> builder =
-mySearchBuilderFactory.newSearchBuilder(this, getResourceName(), getResourceType());
+mySearchBuilderFactory.newSearchBuilder(getResourceName(), getResourceType());
 List<JpaPid> ids = new ArrayList<>();
@@ -2129,8 +2136,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
 myRequestPartitionHelperService.determineReadPartitionForRequestForSearchType(
 theRequest, myResourceName, theParams, theConditionalOperationTargetOrNull);
-ISearchBuilder<JpaPid> builder =
-mySearchBuilderFactory.newSearchBuilder(this, getResourceName(), getResourceType());
+ISearchBuilder<JpaPid> builder = mySearchBuilderFactory.newSearchBuilder(getResourceName(), getResourceType());
 String uuid = UUID.randomUUID().toString();
@@ -2168,7 +2174,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
 .withPropagation(Propagation.REQUIRED)
 .searchList(() -> {
 ISearchBuilder<JpaPid> builder =
-mySearchBuilderFactory.newSearchBuilder(this, getResourceName(), getResourceType());
+mySearchBuilderFactory.newSearchBuilder(getResourceName(), getResourceType());
 Stream<JpaPid> pidStream =
 builder.createQueryStream(theParams, searchRuntimeDetails, theRequest, requestPartitionId);
@@ -2184,7 +2190,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
 @Nonnull
 private Stream<T> pidsToResource(RequestDetails theRequest, Stream<JpaPid> pidStream) {
 ISearchBuilder<JpaPid> searchBuilder =
-mySearchBuilderFactory.newSearchBuilder(this, getResourceName(), getResourceType());
+mySearchBuilderFactory.newSearchBuilder(getResourceName(), getResourceType());
 @SuppressWarnings("unchecked")
 Stream<T> resourceStream = (Stream<T>) new QueryChunker<>()
 .chunk(pidStream, SearchBuilder.getMaximumPageSize())
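
The added sort in the reindex loop above targets a general Spring Data pitfall: when a loop updates the same rows it is paging through, a query with no deterministic ordering can hand a row back on two different pages, or skip it, as Postgres rewrites the updated rows. A minimal sketch of the safe pattern, with hypothetical entity and repository names (only PageRequest, Sort and the paging types are real Spring Data API):

import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Sort;
import org.springframework.data.jpa.repository.JpaRepository;

// MyEntity and MyEntityRepository are hypothetical stand-ins for the real HAPI entities.
interface MyEntityRepository extends JpaRepository<MyEntity, Long> {}

class StableReindexPager {
    private final MyEntityRepository myRepository;

    StableReindexPager(MyEntityRepository theRepository) {
        myRepository = theRepository;
    }

    void reindexAll(int thePageSize) {
        for (int page = 0; ; page++) {
            // Pinning the page to a stable sort key keeps page boundaries consistent
            // even though the loop rewrites the rows it has just read.
            Page<MyEntity> slice = myRepository.findAll(PageRequest.of(page, thePageSize, Sort.by("myId")));
            slice.forEach(entity -> {
                entity.recalculateIndexColumns(); // hypothetical mutation of the paged rows
                myRepository.save(entity);
            });
            if (!slice.hasNext()) {
                return;
            }
        }
    }
}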

View File

@@ -205,7 +205,7 @@ public abstract class BaseHapiFhirSystemDao<T extends IBaseBundle, MT> extends B
 *
 * However, for realistic average workloads, this should reduce the number of round trips.
 */
-if (idChunk.size() >= 2) {
+if (!idChunk.isEmpty()) {
 List<ResourceTable> entityChunk = prefetchResourceTableHistoryAndProvenance(idChunk);
 if (thePreFetchIndexes) {

View File

@@ -516,8 +516,9 @@ public class FulltextSearchSvcImpl implements IFulltextSearchSvc {
 */
 @SuppressWarnings("rawtypes")
 private void logQuery(SearchQueryOptionsStep theQuery, RequestDetails theRequestDetails) {
-if (CompositeInterceptorBroadcaster.hasHooks(
-Pointcut.JPA_PERFTRACE_INFO, myInterceptorBroadcaster, theRequestDetails)) {
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequestDetails);
+if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_INFO)) {
 StorageProcessingMessage storageProcessingMessage = new StorageProcessingMessage();
 String queryString = theQuery.toQuery().queryString();
 storageProcessingMessage.setMessage(queryString);
@@ -525,8 +526,7 @@ public class FulltextSearchSvcImpl implements IFulltextSearchSvc {
 .add(RequestDetails.class, theRequestDetails)
 .addIfMatchesType(ServletRequestDetails.class, theRequestDetails)
 .add(StorageProcessingMessage.class, storageProcessingMessage);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, theRequestDetails, Pointcut.JPA_PERFTRACE_INFO, params);
+compositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_INFO, params);
 }
 }
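
The FulltextSearchSvcImpl hunk above shows the pattern that the rest of this compare applies everywhere: build one composite broadcaster from the server-level broadcaster plus the RequestDetails, check hasHooks(...) before assembling HookParams, then fire callHooks(...). A rough sketch of that shape, using only calls that appear in this diff (import packages and the surrounding class are assumptions):

import ca.uhn.fhir.interceptor.api.HookParams;
import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.jpa.model.search.StorageProcessingMessage; // package assumed
import ca.uhn.fhir.jpa.util.CompositeInterceptorBroadcaster; // package assumed
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;

class PerfTraceBroadcastExample {
    private final IInterceptorBroadcaster myInterceptorBroadcaster;

    PerfTraceBroadcastExample(IInterceptorBroadcaster theInterceptorBroadcaster) {
        myInterceptorBroadcaster = theInterceptorBroadcaster;
    }

    void tracePerformance(RequestDetails theRequestDetails, String theMessage) {
        // Merge the server-level and request-level interceptor registries once per request.
        IInterceptorBroadcaster compositeBroadcaster =
                CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequestDetails);

        // Only pay for HookParams construction when a JPA_PERFTRACE_INFO hook is actually registered.
        if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_INFO)) {
            StorageProcessingMessage msg = new StorageProcessingMessage();
            msg.setMessage(theMessage);
            HookParams params = new HookParams()
                    .add(RequestDetails.class, theRequestDetails)
                    .addIfMatchesType(ServletRequestDetails.class, theRequestDetails)
                    .add(StorageProcessingMessage.class, msg);
            compositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_INFO, params);
        }
    }
}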

View File

@@ -150,6 +150,6 @@ public interface IBatch2WorkChunkRepository
 @Param("status") WorkChunkStatusEnum theStatus);
 @Query(
-"SELECT new ca.uhn.fhir.batch2.model.BatchWorkChunkStatusDTO(e.myTargetStepId, e.myStatus, min(e.myStartTime), max(e.myEndTime), avg(e.myEndTime - e.myStartTime), count(*)) FROM Batch2WorkChunkEntity e WHERE e.myInstanceId=:instanceId GROUP BY e.myTargetStepId, e.myStatus")
+"SELECT new ca.uhn.fhir.batch2.model.BatchWorkChunkStatusDTO(e.myTargetStepId, e.myStatus, min(e.myStartTime), max(e.myEndTime), avg(cast((e.myEndTime - e.myStartTime) as long)), count(*)) FROM Batch2WorkChunkEntity e WHERE e.myInstanceId=:instanceId GROUP BY e.myTargetStepId, e.myStatus")
 List<BatchWorkChunkStatusDTO> fetchWorkChunkStatusForInstance(@Param("instanceId") String theInstanceId);
 }

View File

@@ -124,12 +124,15 @@ public class ExpungeEverythingService implements IExpungeEverythingService {
 final AtomicInteger counter = new AtomicInteger();
 // Notify Interceptors about pre-action call
-HookParams hooks = new HookParams()
-.add(AtomicInteger.class, counter)
-.add(RequestDetails.class, theRequest)
-.addIfMatchesType(ServletRequestDetails.class, theRequest);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, theRequest, Pointcut.STORAGE_PRESTORAGE_EXPUNGE_EVERYTHING, hooks);
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
+if (compositeBroadcaster.hasHooks(Pointcut.STORAGE_PRESTORAGE_EXPUNGE_EVERYTHING)) {
+HookParams hooks = new HookParams()
+.add(AtomicInteger.class, counter)
+.add(RequestDetails.class, theRequest)
+.addIfMatchesType(ServletRequestDetails.class, theRequest);
+compositeBroadcaster.callHooks(Pointcut.STORAGE_PRESTORAGE_EXPUNGE_EVERYTHING, hooks);
+}
 ourLog.info("BEGINNING GLOBAL $expunge");
 Propagation propagation = Propagation.REQUIRES_NEW;

View File

@@ -256,8 +256,9 @@ public class JpaResourceExpungeService implements IResourceExpungeService<JpaPid
 ResourceHistoryTable theVersion,
 IdDt theId) {
 final AtomicInteger counter = new AtomicInteger();
-if (CompositeInterceptorBroadcaster.hasHooks(
-Pointcut.STORAGE_PRESTORAGE_EXPUNGE_RESOURCE, myInterceptorBroadcaster, theRequestDetails)) {
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequestDetails);
+if (compositeBroadcaster.hasHooks(Pointcut.STORAGE_PRESTORAGE_EXPUNGE_RESOURCE)) {
 IBaseResource resource = myJpaStorageResourceParser.toResource(theVersion, false);
 HookParams params = new HookParams()
 .add(AtomicInteger.class, counter)
@@ -265,8 +266,7 @@ public class JpaResourceExpungeService implements IResourceExpungeService<JpaPid
 .add(IBaseResource.class, resource)
 .add(RequestDetails.class, theRequestDetails)
 .addIfMatchesType(ServletRequestDetails.class, theRequestDetails);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, theRequestDetails, Pointcut.STORAGE_PRESTORAGE_EXPUNGE_RESOURCE, params);
+compositeBroadcaster.callHooks(Pointcut.STORAGE_PRESTORAGE_EXPUNGE_RESOURCE, params);
 }
 theRemainingCount.addAndGet(-1 * counter.get());
 }

View File

@@ -106,13 +106,15 @@ public class DeleteConflictService {
 }
 // Notify Interceptors about pre-action call
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
 HookParams hooks = new HookParams()
 .add(DeleteConflictList.class, theDeleteConflicts)
 .add(RequestDetails.class, theRequest)
 .addIfMatchesType(ServletRequestDetails.class, theRequest)
 .add(TransactionDetails.class, theTransactionDetails);
-return (DeleteConflictOutcome) CompositeInterceptorBroadcaster.doCallHooksAndReturnObject(
-myInterceptorBroadcaster, theRequest, Pointcut.STORAGE_PRESTORAGE_DELETE_CONFLICTS, hooks);
+return (DeleteConflictOutcome)
+compositeBroadcaster.callHooksAndReturnObject(Pointcut.STORAGE_PRESTORAGE_DELETE_CONFLICTS, hooks);
 }
 private void addConflictsToList(

View File

@@ -155,17 +155,19 @@ public class ThreadSafeResourceDeleterSvc {
 TransactionDetails theTransactionDetails,
 IdDt nextSource,
 IFhirResourceDao<?> dao) {
-// Interceptor call: STORAGE_CASCADE_DELETE
 // Remove the version so we grab the latest version to delete
 IBaseResource resource = dao.read(nextSource.toVersionless(), theRequest);
+// Interceptor call: STORAGE_CASCADE_DELETE
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
 HookParams params = new HookParams()
 .add(RequestDetails.class, theRequest)
 .addIfMatchesType(ServletRequestDetails.class, theRequest)
 .add(DeleteConflictList.class, theConflictList)
 .add(IBaseResource.class, resource);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, theRequest, Pointcut.STORAGE_CASCADE_DELETE, params);
+compositeBroadcaster.callHooks(Pointcut.STORAGE_CASCADE_DELETE, params);
 return dao.delete(resource.getIdElement(), theConflictList, theRequest, theTransactionDetails);
 }
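
For context on the hook being fired above, a consumer-side sketch: an interceptor listening to STORAGE_CASCADE_DELETE receives the parameters this method adds to HookParams (RequestDetails, the DeleteConflictList, and the resource about to be deleted). The class below is illustrative only, and the import packages are assumptions:

import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Interceptor;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.jpa.api.model.DeleteConflictList; // package assumed
import ca.uhn.fhir.rest.api.server.RequestDetails;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

@Interceptor
class CascadeDeleteAuditInterceptor {
    private static final Logger ourLog = LoggerFactory.getLogger(CascadeDeleteAuditInterceptor.class);

    // Hook methods may declare any subset of the parameters registered for the pointcut.
    @Hook(Pointcut.STORAGE_CASCADE_DELETE)
    public void onCascadeDelete(RequestDetails theRequestDetails, DeleteConflictList theConflicts, IBaseResource theResource) {
        ourLog.info("Cascade delete is about to remove {}", theResource.getIdElement().getValue());
    }
}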

View File

@@ -27,7 +27,6 @@ import ca.uhn.fhir.interceptor.model.ReadPartitionIdRequestDetails;
 import ca.uhn.fhir.interceptor.model.RequestPartitionId;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
 import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
-import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
 import ca.uhn.fhir.jpa.api.svc.ISearchCoordinatorSvc;
 import ca.uhn.fhir.jpa.dao.HistoryBuilder;
 import ca.uhn.fhir.jpa.dao.HistoryBuilderFactory;
@@ -189,15 +188,17 @@ public class PersistedJpaBundleProvider implements IBundleProvider {
 retVal.add(myJpaStorageResourceParser.toResource(resource, true));
 }
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, myRequest);
 // Interceptor call: STORAGE_PREACCESS_RESOURCES
-{
+if (compositeBroadcaster.hasHooks(Pointcut.STORAGE_PREACCESS_RESOURCES)) {
 SimplePreResourceAccessDetails accessDetails = new SimplePreResourceAccessDetails(retVal);
 HookParams params = new HookParams()
 .add(IPreResourceAccessDetails.class, accessDetails)
 .add(RequestDetails.class, myRequest)
 .addIfMatchesType(ServletRequestDetails.class, myRequest);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, myRequest, Pointcut.STORAGE_PREACCESS_RESOURCES, params);
+compositeBroadcaster.callHooks(Pointcut.STORAGE_PREACCESS_RESOURCES, params);
 for (int i = retVal.size() - 1; i >= 0; i--) {
 if (accessDetails.isDontReturnResourceAtIndex(i)) {
@@ -207,14 +208,13 @@ public class PersistedJpaBundleProvider implements IBundleProvider {
 }
 // Interceptor broadcast: STORAGE_PRESHOW_RESOURCES
-{
+if (compositeBroadcaster.hasHooks(Pointcut.STORAGE_PRESHOW_RESOURCES)) {
 SimplePreResourceShowDetails showDetails = new SimplePreResourceShowDetails(retVal);
 HookParams params = new HookParams()
 .add(IPreResourceShowDetails.class, showDetails)
 .add(RequestDetails.class, myRequest)
 .addIfMatchesType(ServletRequestDetails.class, myRequest);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, myRequest, Pointcut.STORAGE_PRESHOW_RESOURCES, params);
+compositeBroadcaster.callHooks(Pointcut.STORAGE_PRESHOW_RESOURCES, params);
 retVal = showDetails.toList();
 }
@@ -254,9 +254,8 @@ public class PersistedJpaBundleProvider implements IBundleProvider {
 String resourceName = mySearchEntity.getResourceType();
 Class<? extends IBaseResource> resourceType =
 myContext.getResourceDefinition(resourceName).getImplementingClass();
-IFhirResourceDao<?> dao = myDaoRegistry.getResourceDao(resourceName);
-final ISearchBuilder sb = mySearchBuilderFactory.newSearchBuilder(dao, resourceName, resourceType);
+final ISearchBuilder sb = mySearchBuilderFactory.newSearchBuilder(resourceName, resourceType);
 RequestPartitionId requestPartitionId = getRequestPartitionId();
 // we request 1 more resource than we need

View File

@@ -374,8 +374,7 @@ public class SearchCoordinatorSvcImpl implements ISearchCoordinatorSvc<JpaPid> {
 Class<? extends IBaseResource> resourceTypeClass =
 myContext.getResourceDefinition(theResourceType).getImplementingClass();
-final ISearchBuilder<JpaPid> sb =
-mySearchBuilderFactory.newSearchBuilder(theCallingDao, theResourceType, resourceTypeClass);
+final ISearchBuilder<JpaPid> sb = mySearchBuilderFactory.newSearchBuilder(theResourceType, resourceTypeClass);
 sb.setFetchSize(mySyncSize);
 final Integer loadSynchronousUpTo = getLoadSynchronousUpToOrNull(theCacheControlDirective);
@@ -599,18 +598,19 @@ public class SearchCoordinatorSvcImpl implements ISearchCoordinatorSvc<JpaPid> {
 .withRequest(theRequestDetails)
 .withRequestPartitionId(theRequestPartitionId)
 .execute(() -> {
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(
+myInterceptorBroadcaster, theRequestDetails);
 // Interceptor call: STORAGE_PRECHECK_FOR_CACHED_SEARCH
 HookParams params = new HookParams()
 .add(SearchParameterMap.class, theParams)
 .add(RequestDetails.class, theRequestDetails)
 .addIfMatchesType(ServletRequestDetails.class, theRequestDetails);
-Object outcome = CompositeInterceptorBroadcaster.doCallHooksAndReturnObject(
-myInterceptorBroadcaster,
-theRequestDetails,
-Pointcut.STORAGE_PRECHECK_FOR_CACHED_SEARCH,
-params);
-if (Boolean.FALSE.equals(outcome)) {
+boolean canUseCache =
+compositeBroadcaster.callHooks(Pointcut.STORAGE_PRECHECK_FOR_CACHED_SEARCH, params);
+if (!canUseCache) {
 return null;
 }
@@ -626,11 +626,7 @@ public class SearchCoordinatorSvcImpl implements ISearchCoordinatorSvc<JpaPid> {
 .add(SearchParameterMap.class, theParams)
 .add(RequestDetails.class, theRequestDetails)
 .addIfMatchesType(ServletRequestDetails.class, theRequestDetails);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster,
-theRequestDetails,
-Pointcut.JPA_PERFTRACE_SEARCH_REUSING_CACHED,
-params);
+compositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_SEARCH_REUSING_CACHED, params);
 return myPersistedJpaBundleProviderFactory.newInstance(theRequestDetails, searchToUse.getUuid());
 });
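
One behavioural nuance in the hunk above: the old code only treated an explicit Boolean.FALSE returned from doCallHooksAndReturnObject as a veto, while the new code reads the boolean returned by callHooks directly, so any STORAGE_PRECHECK_FOR_CACHED_SEARCH hook that returns false now prevents reuse of a cached search for that request. A sketch of such a hook (class name and rule are illustrative):

import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Interceptor;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.rest.api.server.RequestDetails;

@Interceptor
class CacheOptOutInterceptor {

    // Returning false tells the search coordinator not to reuse a cached search result.
    @Hook(Pointcut.STORAGE_PRECHECK_FOR_CACHED_SEARCH)
    public boolean precheckForCachedSearch(RequestDetails theRequestDetails, SearchParameterMap theParams) {
        // Illustrative rule: honour a client-supplied no-cache header.
        return !"no-cache".equals(theRequestDetails.getHeader("Cache-Control"));
    }
}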

View File

@@ -27,7 +27,6 @@ import ca.uhn.fhir.interceptor.api.Pointcut;
 import ca.uhn.fhir.interceptor.model.RequestPartitionId;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
 import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
-import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
 import ca.uhn.fhir.jpa.dao.IResultIterator;
 import ca.uhn.fhir.jpa.dao.ISearchBuilder;
 import ca.uhn.fhir.jpa.dao.SearchBuilderFactory;
@@ -181,12 +180,16 @@ public class SynchronousSearchSvcImpl implements ISynchronousSearchSvc {
 }
 JpaPreResourceAccessDetails accessDetails = new JpaPreResourceAccessDetails(pids, () -> theSb);
-HookParams params = new HookParams()
-.add(IPreResourceAccessDetails.class, accessDetails)
-.add(RequestDetails.class, theRequestDetails)
-.addIfMatchesType(ServletRequestDetails.class, theRequestDetails);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, theRequestDetails, Pointcut.STORAGE_PREACCESS_RESOURCES, params);
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(
+myInterceptorBroadcaster, theRequestDetails);
+if (compositeBroadcaster.hasHooks(Pointcut.STORAGE_PREACCESS_RESOURCES)) {
+HookParams params = new HookParams()
+.add(IPreResourceAccessDetails.class, accessDetails)
+.add(RequestDetails.class, theRequestDetails)
+.addIfMatchesType(ServletRequestDetails.class, theRequestDetails);
+compositeBroadcaster.callHooks(Pointcut.STORAGE_PREACCESS_RESOURCES, params);
+}
 for (int i = pids.size() - 1; i >= 0; i--) {
 if (accessDetails.isDontReturnResourceAtIndex(i)) {
@@ -279,12 +282,9 @@ public class SynchronousSearchSvcImpl implements ISynchronousSearchSvc {
 RequestPartitionId theRequestPartitionId) {
 final String searchUuid = UUID.randomUUID().toString();
-IFhirResourceDao<?> callingDao = myDaoRegistry.getResourceDao(theResourceType);
 Class<? extends IBaseResource> resourceTypeClass =
 myContext.getResourceDefinition(theResourceType).getImplementingClass();
-final ISearchBuilder sb =
-mySearchBuilderFactory.newSearchBuilder(callingDao, theResourceType, resourceTypeClass);
+final ISearchBuilder sb = mySearchBuilderFactory.newSearchBuilder(theResourceType, resourceTypeClass);
 sb.setFetchSize(mySyncSize);
 return executeQuery(
 theSearchParameterMap,

View File

@@ -210,7 +210,7 @@ public class QueryStack {
 CoordsPredicateBuilder coordsBuilder = (CoordsPredicateBuilder) builder;
 List<List<IQueryParameterType>> params = theParams.get(theParamName);
-if (params.size() > 0 && params.get(0).size() > 0) {
+if (!params.isEmpty() && !params.get(0).isEmpty()) {
 IQueryParameterType param = params.get(0).get(0);
 ParsedLocationParam location = ParsedLocationParam.from(theParams, param);
 double latitudeValue = location.getLatitudeValue();
@@ -2134,6 +2134,10 @@ public class QueryStack {
 if (nextParam.getModifier() == TokenParamModifier.NOT) {
 paramInverted = true;
 }
+} else if (nextOrParam instanceof ReferenceParam) {
+ReferenceParam nextParam = (ReferenceParam) nextOrParam;
+code = nextParam.getValue();
+system = null;
 } else {
 UriParam nextParam = (UriParam) nextOrParam;
 code = nextParam.getValue();
@@ -2160,8 +2164,10 @@ public class QueryStack {
 }
 }
-UriParam nextParam = (UriParam) nextParamUncasted;
-if (isNotBlank(nextParam.getValue())) {
+if (nextParamUncasted instanceof ReferenceParam
+&& isNotBlank(((ReferenceParam) nextParamUncasted).getValue())) {
+return true;
+} else if (nextParamUncasted instanceof UriParam && isNotBlank(((UriParam) nextParamUncasted).getValue())) {
 return true;
 }
 }

View File

@@ -33,7 +33,6 @@ import ca.uhn.fhir.interceptor.api.Pointcut;
 import ca.uhn.fhir.interceptor.model.RequestPartitionId;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
 import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
-import ca.uhn.fhir.jpa.api.dao.IDao;
 import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
 import ca.uhn.fhir.jpa.api.svc.IIdHelperService;
 import ca.uhn.fhir.jpa.api.svc.ResolveIdentityMode;
@@ -192,7 +191,6 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 private final FhirContext myContext;
 private final IIdHelperService<JpaPid> myIdHelperService;
 private final JpaStorageSettings myStorageSettings;
-private final IDao myCallingDao;
 @PersistenceContext(type = PersistenceContextType.TRANSACTION)
 protected EntityManager myEntityManager;
@@ -220,7 +218,6 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 */
 @SuppressWarnings({"rawtypes", "unchecked"})
 public SearchBuilder(
-IDao theDao,
 String theResourceName,
 JpaStorageSettings theStorageSettings,
 HapiFhirLocalContainerEntityManagerFactoryBean theEntityManagerFactory,
@@ -235,7 +232,6 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 FhirContext theContext,
 IIdHelperService theIdHelperService,
 Class<? extends IBaseResource> theResourceType) {
-myCallingDao = theDao;
 myResourceName = theResourceName;
 myResourceType = theResourceType;
 myStorageSettings = theStorageSettings;
@@ -426,15 +422,15 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 if (theSearchRuntimeDetails != null) {
 theSearchRuntimeDetails.setFoundIndexMatchesCount(resultCount);
-HookParams params = new HookParams()
-.add(RequestDetails.class, theRequest)
-.addIfMatchesType(ServletRequestDetails.class, theRequest)
-.add(SearchRuntimeDetails.class, theSearchRuntimeDetails);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster,
-theRequest,
-Pointcut.JPA_PERFTRACE_INDEXSEARCH_QUERY_COMPLETE,
-params);
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
+if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_INDEXSEARCH_QUERY_COMPLETE)) {
+HookParams params = new HookParams()
+.add(RequestDetails.class, theRequest)
+.addIfMatchesType(ServletRequestDetails.class, theRequest)
+.add(SearchRuntimeDetails.class, theSearchRuntimeDetails);
+compositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_INDEXSEARCH_QUERY_COMPLETE, params);
+}
 }
 // can we skip the database entirely and return the pid list from here?
@@ -1393,8 +1389,10 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 String searchIdOrDescription = theParameters.getSearchIdOrDescription();
 List<String> desiredResourceTypes = theParameters.getDesiredResourceTypes();
 boolean hasDesiredResourceTypes = desiredResourceTypes != null && !desiredResourceTypes.isEmpty();
-if (CompositeInterceptorBroadcaster.hasHooks(
-Pointcut.JPA_PERFTRACE_RAW_SQL, myInterceptorBroadcaster, theParameters.getRequestDetails())) {
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, request);
+if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_RAW_SQL)) {
 CurrentThreadCaptureQueriesListener.startCapturing();
 }
 if (matches.isEmpty()) {
@@ -1498,17 +1496,16 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 w.getMillisAndRestart(),
 searchIdOrDescription);
-if (CompositeInterceptorBroadcaster.hasHooks(
-Pointcut.JPA_PERFTRACE_RAW_SQL, myInterceptorBroadcaster, request)) {
-callRawSqlHookWithCurrentThreadQueries(request);
+if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_RAW_SQL)) {
+callRawSqlHookWithCurrentThreadQueries(request, compositeBroadcaster);
 }
 // Interceptor call: STORAGE_PREACCESS_RESOURCES
 // This can be used to remove results from the search result details before
 // the user has a chance to know that they were in the results
 if (!allAdded.isEmpty()) {
-if (CompositeInterceptorBroadcaster.hasHooks(
-Pointcut.STORAGE_PREACCESS_RESOURCES, myInterceptorBroadcaster, request)) {
+if (compositeBroadcaster.hasHooks(Pointcut.STORAGE_PREACCESS_RESOURCES)) {
 List<JpaPid> includedPidList = new ArrayList<>(allAdded);
 JpaPreResourceAccessDetails accessDetails =
 new JpaPreResourceAccessDetails(includedPidList, () -> this);
@@ -1516,8 +1513,7 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 .add(IPreResourceAccessDetails.class, accessDetails)
 .add(RequestDetails.class, request)
 .addIfMatchesType(ServletRequestDetails.class, request);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, request, Pointcut.STORAGE_PREACCESS_RESOURCES, params);
+compositeBroadcaster.callHooks(Pointcut.STORAGE_PREACCESS_RESOURCES, params);
 for (int i = includedPidList.size() - 1; i >= 0; i--) {
 if (accessDetails.isDontReturnResourceAtIndex(i)) {
@@ -1813,17 +1809,18 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 /**
 * Calls Performance Trace Hook
-* @param request the request deatils
-* Sends a raw SQL query to the Pointcut for raw SQL queries.
+*
+* @param request the request deatils
+* Sends a raw SQL query to the Pointcut for raw SQL queries.
 */
-private void callRawSqlHookWithCurrentThreadQueries(RequestDetails request) {
+private void callRawSqlHookWithCurrentThreadQueries(
+RequestDetails request, IInterceptorBroadcaster theCompositeBroadcaster) {
 SqlQueryList capturedQueries = CurrentThreadCaptureQueriesListener.getCurrentQueueAndStopCapturing();
 HookParams params = new HookParams()
 .add(RequestDetails.class, request)
 .addIfMatchesType(ServletRequestDetails.class, request)
 .add(SqlQueryList.class, capturedQueries);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, request, Pointcut.JPA_PERFTRACE_RAW_SQL, params);
+theCompositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_RAW_SQL, params);
 }
 @Nullable
@@ -2095,16 +2092,19 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 indexStrings.sort(Comparator.naturalOrder());
 // Interceptor broadcast: JPA_PERFTRACE_INFO
-String indexStringForLog = indexStrings.size() > 1 ? indexStrings.toString() : indexStrings.get(0);
-StorageProcessingMessage msg = new StorageProcessingMessage()
-.setMessage("Using " + theComboParam.getComboSearchParamType() + " index(es) for query for search: "
-+ indexStringForLog);
-HookParams params = new HookParams()
-.add(RequestDetails.class, theRequest)
-.addIfMatchesType(ServletRequestDetails.class, theRequest)
-.add(StorageProcessingMessage.class, msg);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, theRequest, Pointcut.JPA_PERFTRACE_INFO, params);
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
+if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_INFO)) {
+String indexStringForLog = indexStrings.size() > 1 ? indexStrings.toString() : indexStrings.get(0);
+StorageProcessingMessage msg = new StorageProcessingMessage()
+.setMessage("Using " + theComboParam.getComboSearchParamType() + " index(es) for query for search: "
++ indexStringForLog);
+HookParams params = new HookParams()
+.add(RequestDetails.class, theRequest)
+.addIfMatchesType(ServletRequestDetails.class, theRequest)
+.add(StorageProcessingMessage.class, msg);
+compositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_INFO, params);
+}
 switch (requireNonNull(theComboParam.getComboSearchParamType())) {
 case UNIQUE:
@@ -2330,6 +2330,7 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 private final boolean myHavePerfTraceFoundIdHook;
 private final SortSpec mySort;
 private final Integer myOffset;
+private final IInterceptorBroadcaster myCompositeBroadcaster;
 private boolean myFirst = true;
 private IncludesIterator myIncludesIterator;
 /**
@@ -2368,16 +2369,16 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 mySort = myParams.getSort();
 myOffset = myParams.getOffset();
 myRequest = theRequest;
+myCompositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
 // everything requires fetching recursively all related resources
 if (myParams.getEverythingMode() != null) {
 myFetchIncludesForEverythingOperation = true;
 }
-myHavePerfTraceFoundIdHook = CompositeInterceptorBroadcaster.hasHooks(
-Pointcut.JPA_PERFTRACE_SEARCH_FOUND_ID, myInterceptorBroadcaster, myRequest);
-myHaveRawSqlHooks = CompositeInterceptorBroadcaster.hasHooks(
-Pointcut.JPA_PERFTRACE_RAW_SQL, myInterceptorBroadcaster, myRequest);
+myHavePerfTraceFoundIdHook = myCompositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_SEARCH_FOUND_ID);
+myHaveRawSqlHooks = myCompositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_RAW_SQL);
 }
 private void fetchNext() {
@@ -2479,7 +2480,7 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 } finally {
 // search finished - fire hooks
 if (myHaveRawSqlHooks) {
-callRawSqlHookWithCurrentThreadQueries(myRequest);
+callRawSqlHookWithCurrentThreadQueries(myRequest, myCompositeBroadcaster);
 }
 }
@@ -2488,8 +2489,7 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 .add(RequestDetails.class, myRequest)
 .addIfMatchesType(ServletRequestDetails.class, myRequest)
 .add(SearchRuntimeDetails.class, mySearchRuntimeDetails);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, myRequest, Pointcut.JPA_PERFTRACE_SEARCH_FIRST_RESULT_LOADED, params);
+myCompositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_SEARCH_FIRST_RESULT_LOADED, params);
 myFirst = false;
 }
@@ -2498,8 +2498,7 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 .add(RequestDetails.class, myRequest)
 .addIfMatchesType(ServletRequestDetails.class, myRequest)
 .add(SearchRuntimeDetails.class, mySearchRuntimeDetails);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, myRequest, Pointcut.JPA_PERFTRACE_SEARCH_SELECT_COMPLETE, params);
+myCompositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_SEARCH_SELECT_COMPLETE, params);
 }
 }
@@ -2523,8 +2522,7 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 HookParams params = new HookParams()
 .add(Integer.class, System.identityHashCode(this))
 .add(Object.class, theNextLong);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, myRequest, Pointcut.JPA_PERFTRACE_SEARCH_FOUND_ID, params);
+myCompositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_SEARCH_FOUND_ID, params);
 }
 private void sendProcessingMsgAndFirePerformanceHook() {
@@ -2627,14 +2625,18 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 firePerformanceMessage(theRequest, theMessage, Pointcut.JPA_PERFTRACE_WARNING);
 }
-private void firePerformanceMessage(RequestDetails theRequest, String theMessage, Pointcut pointcut) {
-StorageProcessingMessage message = new StorageProcessingMessage();
-message.setMessage(theMessage);
-HookParams params = new HookParams()
-.add(RequestDetails.class, theRequest)
-.addIfMatchesType(ServletRequestDetails.class, theRequest)
-.add(StorageProcessingMessage.class, message);
-CompositeInterceptorBroadcaster.doCallHooks(myInterceptorBroadcaster, theRequest, pointcut, params);
+private void firePerformanceMessage(RequestDetails theRequest, String theMessage, Pointcut thePointcut) {
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
+if (compositeBroadcaster.hasHooks(thePointcut)) {
+StorageProcessingMessage message = new StorageProcessingMessage();
+message.setMessage(theMessage);
+HookParams params = new HookParams()
+.add(RequestDetails.class, theRequest)
+.addIfMatchesType(ServletRequestDetails.class, theRequest)
+.add(StorageProcessingMessage.class, message);
+compositeBroadcaster.callHooks(thePointcut, params);
+}
 }
 public static int getMaximumPageSize() {

View File

@@ -53,14 +53,17 @@ public class StorageInterceptorHooksFacade {
 SearchParameterMap theParams,
 Search search,
 RequestPartitionId theRequestPartitionId) {
-HookParams params = new HookParams()
-.add(ICachedSearchDetails.class, search)
-.add(RequestDetails.class, theRequestDetails)
-.addIfMatchesType(ServletRequestDetails.class, theRequestDetails)
-.add(SearchParameterMap.class, theParams)
-.add(RequestPartitionId.class, theRequestPartitionId);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, theRequestDetails, Pointcut.STORAGE_PRESEARCH_REGISTERED, params);
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequestDetails);
+if (compositeBroadcaster.hasHooks(Pointcut.STORAGE_PRESEARCH_REGISTERED)) {
+HookParams params = new HookParams()
+.add(ICachedSearchDetails.class, search)
+.add(RequestDetails.class, theRequestDetails)
+.addIfMatchesType(ServletRequestDetails.class, theRequestDetails)
+.add(SearchParameterMap.class, theParams)
+.add(RequestPartitionId.class, theRequestPartitionId);
+compositeBroadcaster.callHooks(Pointcut.STORAGE_PRESEARCH_REGISTERED, params);
+}
 }
 // private IInterceptorBroadcaster myInterceptorBroadcaster;
 }

View File

@@ -407,12 +407,16 @@ public class ResourceLinkPredicateBuilder extends BaseJoiningPredicateBuilder im
 }
 String message = builder.toString();
 StorageProcessingMessage msg = new StorageProcessingMessage().setMessage(message);
-HookParams params = new HookParams()
-.add(RequestDetails.class, theRequest)
-.addIfMatchesType(ServletRequestDetails.class, theRequest)
-.add(StorageProcessingMessage.class, msg);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, theRequest, Pointcut.JPA_PERFTRACE_WARNING, params);
+
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
+if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_WARNING)) {
+HookParams params = new HookParams()
+.add(RequestDetails.class, theRequest)
+.addIfMatchesType(ServletRequestDetails.class, theRequest)
+.add(StorageProcessingMessage.class, msg);
+compositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_WARNING, params);
+}
 }
 /**

View File

@@ -109,15 +109,18 @@ public class UriPredicateBuilder extends BaseSearchParamPredicateBuilder {
 + "] param[" + theParamName + "]";
 ourLog.info(msg);
-StorageProcessingMessage message = new StorageProcessingMessage();
-ourLog.warn(msg);
-message.setMessage(msg);
-HookParams params = new HookParams()
-.add(RequestDetails.class, theRequestDetails)
-.addIfMatchesType(ServletRequestDetails.class, theRequestDetails)
-.add(StorageProcessingMessage.class, message);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, theRequestDetails, Pointcut.JPA_PERFTRACE_WARNING, params);
+IInterceptorBroadcaster compositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(
+myInterceptorBroadcaster, theRequestDetails);
+if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_WARNING)) {
+StorageProcessingMessage message = new StorageProcessingMessage();
+message.setMessage(msg);
+HookParams params = new HookParams()
+.add(RequestDetails.class, theRequestDetails)
+.addIfMatchesType(ServletRequestDetails.class, theRequestDetails)
+.add(StorageProcessingMessage.class, message);
+compositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_WARNING, params);
+}
 long hashIdentity = BaseResourceIndexedSearchParam.calculateHashIdentity(
 getPartitionSettings(), getRequestPartitionId(), getResourceType(), theParamName);

View File

@@ -26,7 +26,6 @@ import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster;
 import ca.uhn.fhir.interceptor.api.Pointcut;
 import ca.uhn.fhir.interceptor.model.RequestPartitionId;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
-import ca.uhn.fhir.jpa.api.dao.IDao;
 import ca.uhn.fhir.jpa.dao.IResultIterator;
 import ca.uhn.fhir.jpa.dao.ISearchBuilder;
 import ca.uhn.fhir.jpa.dao.SearchBuilderFactory;
@@ -95,7 +94,6 @@ public class SearchTask implements Callable<Void> {
 protected final FhirContext myContext;
 protected final ISearchResultCacheSvc mySearchResultCacheSvc;
 private final SearchParameterMap myParams;
-private final IDao myCallingDao;
 private final String myResourceType;
 private final ArrayList<JpaPid> mySyncedPids = new ArrayList<>();
 private final CountDownLatch myInitialCollectionLatch = new CountDownLatch(1);
@@ -113,6 +111,7 @@ public class SearchTask implements Callable<Void> {
 private final JpaStorageSettings myStorageSettings;
 private final ISearchCacheSvc mySearchCacheSvc;
 private final IPagingProvider myPagingProvider;
+private final IInterceptorBroadcaster myCompositeBroadcaster;
 private Search mySearch;
 private boolean myAbortRequested;
 private int myCountSavedTotal = 0;
@@ -149,7 +148,6 @@ public class SearchTask implements Callable<Void> {
 // values
 myOnRemove = theCreationParams.OnRemove;
 mySearch = theCreationParams.Search;
-myCallingDao = theCreationParams.CallingDao;
 myParams = theCreationParams.Params;
 myResourceType = theCreationParams.ResourceType;
 myRequest = theCreationParams.Request;
@@ -158,9 +156,11 @@ public class SearchTask implements Callable<Void> {
 myLoadingThrottleForUnitTests = theCreationParams.getLoadingThrottleForUnitTests();
 mySearchRuntimeDetails = new SearchRuntimeDetails(myRequest, mySearch.getUuid());
-mySearchRuntimeDetails.setQueryString(myParams.toNormalizedQueryString(myCallingDao.getContext()));
+mySearchRuntimeDetails.setQueryString(myParams.toNormalizedQueryString(myContext));
 myRequestPartitionId = theCreationParams.RequestPartitionId;
 myParentTransaction = ElasticApm.currentTransaction();
+myCompositeBroadcaster =
+CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, myRequest);
 }
 protected RequestPartitionId getRequestPartitionId() {
@@ -203,7 +203,7 @@ public class SearchTask implements Callable<Void> {
 private ISearchBuilder newSearchBuilder() {
 Class<? extends IBaseResource> resourceTypeClass =
 myContext.getResourceDefinition(myResourceType).getImplementingClass();
-return mySearchBuilderFactory.newSearchBuilder(myCallingDao, myResourceType, resourceTypeClass);
+return mySearchBuilderFactory.newSearchBuilder(myResourceType, resourceTypeClass);
 }
 @Nonnull
@@ -280,7 +280,7 @@ public class SearchTask implements Callable<Void> {
 .withRequest(myRequest)
 .withRequestPartitionId(myRequestPartitionId)
 .withPropagation(Propagation.REQUIRES_NEW)
-.execute(() -> doSaveSearch());
+.execute(this::doSaveSearch);
 }
 @SuppressWarnings("rawtypes")
@@ -307,8 +307,7 @@ public class SearchTask implements Callable<Void> {
 .add(RequestDetails.class, mySearchRuntimeDetails.getRequestDetails())
 .addIfMatchesType(
 ServletRequestDetails.class, mySearchRuntimeDetails.getRequestDetails());
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, myRequest, Pointcut.STORAGE_PREACCESS_RESOURCES, params);
+myCompositeBroadcaster.callHooks(Pointcut.STORAGE_PREACCESS_RESOURCES, params);
 for (int i = unsyncedPids.size() - 1; i >= 0; i--) {
 if (accessDetails.isDontReturnResourceAtIndex(i)) {
@@ -454,15 +453,13 @@ public class SearchTask implements Callable<Void> {
 .add(RequestDetails.class, myRequest)
 .addIfMatchesType(ServletRequestDetails.class, myRequest)
 .add(SearchRuntimeDetails.class, mySearchRuntimeDetails);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, myRequest, Pointcut.JPA_PERFTRACE_SEARCH_COMPLETE, params);
+myCompositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_SEARCH_COMPLETE, params);
 } else {
 HookParams params = new HookParams()
 .add(RequestDetails.class, myRequest)
 .addIfMatchesType(ServletRequestDetails.class, myRequest)
 .add(SearchRuntimeDetails.class, mySearchRuntimeDetails);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, myRequest, Pointcut.JPA_PERFTRACE_SEARCH_PASS_COMPLETE, params);
+myCompositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_SEARCH_PASS_COMPLETE, params);
 }
 ourLog.trace(
@@ -516,8 +513,7 @@ public class SearchTask implements Callable<Void> {
 .add(RequestDetails.class, myRequest)
 .addIfMatchesType(ServletRequestDetails.class, myRequest)
 .add(SearchRuntimeDetails.class, mySearchRuntimeDetails);
-CompositeInterceptorBroadcaster.doCallHooks(
-myInterceptorBroadcaster, myRequest, Pointcut.JPA_PERFTRACE_SEARCH_FAILED, params);
+myCompositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_SEARCH_FAILED, params);
 saveSearch();
 span.captureException(t);

View File

@@ -1056,7 +1056,8 @@ public class TermReadSvcImpl implements ITermReadSvc, IHasScheduledJobs {
 if (theExpansionOptions != null
 && !theExpansionOptions.isFailOnMissingCodeSystem()
 // Code system is unknown, therefore NOT_FOUND
-&& e.getCodeValidationIssue().getCoding() == CodeValidationIssueCoding.NOT_FOUND) {
+&& e.getCodeValidationIssue()
+.hasIssueDetailCode(CodeValidationIssueCoding.NOT_FOUND.getCode())) {
 return;
 }
 throw new InternalErrorException(Msg.code(888) + e);
@@ -2203,7 +2204,7 @@ public class TermReadSvcImpl implements ITermReadSvc, IHasScheduledJobs {
 .setSeverity(IssueSeverity.ERROR)
 .setCodeSystemVersion(theCodeSystemVersion)
 .setMessage(theMessage)
-.addCodeValidationIssue(new CodeValidationIssue(
+.addIssue(new CodeValidationIssue(
 theMessage,
 IssueSeverity.ERROR,
 CodeValidationIssueCode.CODE_INVALID,

View File

@@ -214,9 +214,7 @@ public class JpaBulkExportProcessorTest {
 when(myBulkExportHelperService.createSearchParameterMapsForResourceType(any(RuntimeResourceDefinition.class), eq(parameters), any(boolean.class)))
 .thenReturn(maps);
 // from getSearchBuilderForLocalResourceType
-when(myDaoRegistry.getResourceDao(anyString()))
-.thenReturn(mockDao);
-when(mySearchBuilderFactory.newSearchBuilder(eq(mockDao), eq(parameters.getResourceType()), any()))
+when(mySearchBuilderFactory.newSearchBuilder(eq(parameters.getResourceType()), any()))
 .thenReturn(searchBuilder);
 // ret
 when(searchBuilder.createQuery(
@@ -304,9 +302,7 @@ public class JpaBulkExportProcessorTest {
 when(myBulkExportHelperService.createSearchParameterMapsForResourceType(any(RuntimeResourceDefinition.class), eq(parameters), any(boolean.class)))
 .thenReturn(Collections.singletonList(new SearchParameterMap()));
 // from getSearchBuilderForLocalResourceType
-when(myDaoRegistry.getResourceDao(not(eq("Group"))))
-.thenReturn(mockDao);
-when(mySearchBuilderFactory.newSearchBuilder(eq(mockDao), eq(parameters.getResourceType()), any()))
+when(mySearchBuilderFactory.newSearchBuilder(eq(parameters.getResourceType()), any()))
 .thenReturn(searchBuilder);
 // ret
 when(searchBuilder.createQuery(
@@ -432,9 +428,7 @@ public class JpaBulkExportProcessorTest {
 when(myIdHelperService.getPidOrNull(eq(getPartitionIdFromParams(thePartitioned)), eq(groupResource)))
 .thenReturn(groupId);
 // getMembersFromGroupWithFilter
-when(myDaoRegistry.getResourceDao(eq("Patient")))
-.thenReturn(patientDao);
-when(mySearchBuilderFactory.newSearchBuilder(eq(patientDao), eq("Patient"), eq(Patient.class)))
+when(mySearchBuilderFactory.newSearchBuilder(eq("Patient"), eq(Patient.class)))
 .thenReturn(patientSearchBuilder);
 RuntimeResourceDefinition patientDef = myFhirContext.getResourceDefinition("Patient");
 SearchParameterMap patientSpMap = new SearchParameterMap();
@@ -447,9 +441,7 @@ public class JpaBulkExportProcessorTest {
 RuntimeResourceDefinition observationDef = myFhirContext.getResourceDefinition("Observation");
 when(myBulkExportHelperService.createSearchParameterMapsForResourceType(eq(observationDef), eq(parameters), any(boolean.class)))
 .thenReturn(Collections.singletonList(observationSpMap));
-when(myDaoRegistry.getResourceDao((eq("Observation"))))
-.thenReturn(observationDao);
-when(mySearchBuilderFactory.newSearchBuilder(eq(observationDao), eq("Observation"), eq(Observation.class)))
+when(mySearchBuilderFactory.newSearchBuilder(eq("Observation"), eq(Observation.class)))
 .thenReturn(observationSearchBuilder);
 when(observationSearchBuilder.loadIncludes(
 any(SearchBuilderLoadIncludesParameters.class)
@@ -520,10 +512,7 @@ public class JpaBulkExportProcessorTest {
 any(ExportPIDIteratorParameters.class),
 any(boolean.class)
 )).thenReturn(Collections.singletonList(new SearchParameterMap()));
-when(myDaoRegistry.getResourceDao(eq("Patient")))
-.thenReturn(dao);
 when(mySearchBuilderFactory.newSearchBuilder(
-any(IFhirResourceDao.class),
 anyString(),
 any()
 )).thenReturn(searchBuilder);

View File

@@ -23,8 +23,8 @@ Comments: AllergyIntolerance.note[x].text (separated by <br />)
 </thead>
 <tbody>
 <th:block th:each="entry : ${resource.entry}" th:object="${entry.getResource()}">
-<th:block th:with="extension=${entry.getResource().getExtensionByUrl('http://hl7.org/fhir/StructureDefinition/narrativeLink').getValue().getValue()}">
-<tr th:id="${#strings.arraySplit(extension, '#')[1]}">
+<th:block th:with="extension=${entry.getResource().getExtensionByUrl('http://hl7.org/fhir/StructureDefinition/narrativeLink')}">
+<tr th:id="${extension != null} ? ${#strings.arraySplit(extension.getValue().getValue(), '#')[1]} : ''">
 <td th:insert="IpsUtilityFragments :: codeableConcept (cc=*{getCode()},attr='display')">Allergen</td>
 <td th:insert="~{IpsUtilityFragments :: codeableConcept (cc=*{getClinicalStatus()},attr='code')}">Status</td>
 <td th:insert="~{IpsUtilityFragments :: concat (list=*{getCategory()},attr='value')}">Category</td>
@@ -33,7 +33,7 @@ Comments: AllergyIntolerance.note[x].text (separated by <br />)
 <td th:insert="~{IpsUtilityFragments :: concat (list=*{getNote()},attr='text')}">Comments</td>
 <th:block th:if="*{hasOnsetDateTimeType()}">
-<td th:text="*{getOnsetDateTimeType().getValue()}">Onset</td>
+<td th:text="*{getOnsetDateTimeType().getValueAsString()}">Onset</td>
 </th:block>
 <th:block th:if="*{hasOnsetStringType()}">
 <td th:text="*{getOnsetStringType().getValue()}">Onset</td>

View File

@@ -220,7 +220,7 @@ public class IpsGeneratorSvcImplTest {
 HtmlTable table = (HtmlTable) tables.get(0);
 int onsetIndex = 6;
 assertEquals("Onset", table.getHeader().getRows().get(0).getCell(onsetIndex).asNormalizedText());
-assertEquals(new DateTimeType("2020-02-03T11:22:33Z").getValue().toString(), table.getBodies().get(0).getRows().get(0).getCell(onsetIndex).asNormalizedText());
+assertEquals(new DateTimeType("2020-02-03T11:22:33Z").getValueAsString(), table.getBodies().get(0).getRows().get(0).getCell(onsetIndex).asNormalizedText());
 assertEquals("Some Onset", table.getBodies().get(0).getRows().get(1).getCell(onsetIndex).asNormalizedText());
 assertEquals("", table.getBodies().get(0).getRows().get(2).getCell(onsetIndex).asNormalizedText());
 }
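The switch from getValue() to getValueAsString() in the template and in this assertion avoids time-zone-dependent rendering: getValue() returns a java.util.Date whose toString() uses the JVM default zone, while getValueAsString() keeps the original FHIR lexical form. A minimal sketch of the difference (the printed Date form is illustrative and depends on the local zone):

    // Sketch, assuming the HAPI FHIR R4 structures are on the classpath.
    import org.hl7.fhir.r4.model.DateTimeType;

    public class OnsetRenderingSketch {
        public static void main(String[] args) {
            DateTimeType onset = new DateTimeType("2020-02-03T11:22:33Z");
            // Zone-dependent, e.g. "Mon Feb 03 06:22:33 EST 2020" when the default zone is America/New_York
            System.out.println(onset.getValue());
            // Always the original lexical value: "2020-02-03T11:22:33Z"
            System.out.println(onset.getValueAsString());
        }
    }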


@@ -1,6 +1,7 @@
 package ca.uhn.fhir.jpa.mdm.helper;
 import ca.uhn.fhir.context.FhirContext;
+import ca.uhn.fhir.interceptor.api.HookParams;
 import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster;
 import ca.uhn.fhir.interceptor.api.IInterceptorService;
 import ca.uhn.fhir.interceptor.api.Pointcut;
@@ -23,6 +24,7 @@ import org.springframework.beans.factory.annotation.Autowired;
 import java.util.function.Supplier;
 import static org.awaitility.Awaitility.await;
+import static org.mockito.ArgumentMatchers.any;
 import static org.mockito.Mockito.when;
 /**
@@ -78,6 +80,7 @@ public abstract class BaseMdmHelper implements BeforeEachCallback, AfterEachCall
 //they are coming from an external HTTP Request.
 MockitoAnnotations.initMocks(this);
 when(myMockSrd.getInterceptorBroadcaster()).thenReturn(myMockInterceptorBroadcaster);
+when(myMockInterceptorBroadcaster.callHooks(any(Pointcut.class), any(HookParams.class))).thenReturn(true);
 when(myMockSrd.getServletRequest()).thenReturn(myMockServletRequest);
 when(myMockSrd.getServer()).thenReturn(myMockRestfulServer);
 when(myMockSrd.getRequestId()).thenReturn("MOCK_REQUEST");


@@ -941,8 +941,9 @@ public class SearchParamExtractorService {
 if (myPartitionSettings.getAllowReferencesAcrossPartitions() == ALLOWED_UNQUALIFIED) {
 	// Interceptor: Pointcut.JPA_CROSS_PARTITION_REFERENCE_DETECTED
-	if (CompositeInterceptorBroadcaster.hasHooks(
-			Pointcut.JPA_RESOLVE_CROSS_PARTITION_REFERENCE, myInterceptorBroadcaster, theRequest)) {
+	IInterceptorBroadcaster compositeBroadcaster =
+			CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
+	if (compositeBroadcaster.hasHooks(Pointcut.JPA_RESOLVE_CROSS_PARTITION_REFERENCE)) {
 		CrossPartitionReferenceDetails referenceDetails = new CrossPartitionReferenceDetails(
 			theRequestPartitionId,
 			theSourceResourceName,
@@ -950,12 +951,8 @@ public class SearchParamExtractorService {
 			theRequest,
 			theTransactionDetails);
 		HookParams params = new HookParams(referenceDetails);
-		targetResource =
-				(IResourceLookup<JpaPid>) CompositeInterceptorBroadcaster.doCallHooksAndReturnObject(
-						myInterceptorBroadcaster,
-						theRequest,
-						Pointcut.JPA_RESOLVE_CROSS_PARTITION_REFERENCE,
-						params);
+		targetResource = (IResourceLookup<JpaPid>) compositeBroadcaster.callHooksAndReturnObject(
+				Pointcut.JPA_RESOLVE_CROSS_PARTITION_REFERENCE, params);
 	} else {
 		targetResource = myResourceLinkResolver.findTargetResource(
 			RequestPartitionId.allPartitions(),
@@ -1089,8 +1086,9 @@ public class SearchParamExtractorService {
 }
 // If extraction generated any warnings, broadcast an error
-if (CompositeInterceptorBroadcaster.hasHooks(
-		Pointcut.JPA_PERFTRACE_WARNING, theInterceptorBroadcaster, theRequestDetails)) {
+IInterceptorBroadcaster compositeBroadcaster =
+		CompositeInterceptorBroadcaster.newCompositeBroadcaster(theInterceptorBroadcaster, theRequestDetails);
+if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_WARNING)) {
 	for (String next : theSearchParamSet.getWarnings()) {
 		StorageProcessingMessage messageHolder = new StorageProcessingMessage();
 		messageHolder.setMessage(next);
@@ -1098,8 +1096,7 @@ public class SearchParamExtractorService {
 			.add(RequestDetails.class, theRequestDetails)
 			.addIfMatchesType(ServletRequestDetails.class, theRequestDetails)
 			.add(StorageProcessingMessage.class, messageHolder);
-		CompositeInterceptorBroadcaster.doCallHooks(
-				theInterceptorBroadcaster, theRequestDetails, Pointcut.JPA_PERFTRACE_WARNING, params);
+		compositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_WARNING, params);
 	}
 }
 }
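The hunks above all follow the same rework: instead of passing the broadcaster and the request into static CompositeInterceptorBroadcaster helpers at every call site, the service now builds one composite broadcaster per request and asks it directly. A condensed sketch of the resulting call pattern (variable names are taken from the hunks and are illustrative, not the full class):

    // Build the composite once for this request, then probe and fire pointcuts through it.
    IInterceptorBroadcaster compositeBroadcaster =
            CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequestDetails);
    if (compositeBroadcaster.hasHooks(Pointcut.JPA_PERFTRACE_WARNING)) {
        HookParams params = new HookParams()
                .add(RequestDetails.class, theRequestDetails)
                .add(StorageProcessingMessage.class, messageHolder);
        compositeBroadcaster.callHooks(Pointcut.JPA_PERFTRACE_WARNING, params);
    }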


@@ -1,14 +1,20 @@
 package ca.uhn.fhir.jpa.searchparam.extractor;
+import ca.uhn.fhir.interceptor.api.HookParams;
 import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster;
 import ca.uhn.fhir.interceptor.api.Pointcut;
 import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
+import ca.uhn.fhir.test.utilities.MockInvoker;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.extension.ExtendWith;
 import org.mockito.Mock;
 import org.mockito.junit.jupiter.MockitoExtension;
+import java.util.concurrent.atomic.AtomicInteger;
+import java.util.function.Consumer;
+import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.mockito.ArgumentMatchers.any;
 import static org.mockito.ArgumentMatchers.eq;
 import static org.mockito.Mockito.times;
@@ -36,14 +42,17 @@ public class SearchParamExtractorServiceTest {
 searchParamSet.addWarning("help i'm a bug");
 searchParamSet.addWarning("Spiff");
-when(myJpaInterceptorBroadcaster.hasHooks(any())).thenReturn(true);
-when(myJpaInterceptorBroadcaster.callHooks(any(), any())).thenReturn(true);
+AtomicInteger counter = new AtomicInteger();
+when(myJpaInterceptorBroadcaster.hasHooks(eq(Pointcut.JPA_PERFTRACE_WARNING))).thenReturn(true);
+when(myJpaInterceptorBroadcaster.getInvokersForPointcut(eq(Pointcut.JPA_PERFTRACE_WARNING))).thenReturn(MockInvoker.list((Consumer<HookParams>) params->counter.incrementAndGet()));
 ServletRequestDetails requestDetails = new ServletRequestDetails(myRequestInterceptorBroadcaster);
 SearchParamExtractorService.handleWarnings(requestDetails, myJpaInterceptorBroadcaster, searchParamSet);
-verify(myJpaInterceptorBroadcaster, times(2)).callHooks(eq(Pointcut.JPA_PERFTRACE_WARNING), any());
-verify(myRequestInterceptorBroadcaster, times(2)).callHooks(eq(Pointcut.JPA_PERFTRACE_WARNING), any());
+verify(myJpaInterceptorBroadcaster, times(3)).hasHooks(eq(Pointcut.JPA_PERFTRACE_WARNING));
+verify(myRequestInterceptorBroadcaster, times(2)).hasHooks(eq(Pointcut.JPA_PERFTRACE_WARNING));
+assertEquals(2, counter.get());
 }
 }
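The test rework above shows the stubbing style that goes with the new broadcaster API: rather than answering a blanket callHooks(any(), any()), the mock reports hooks for a specific pointcut and hands back invokers built with the MockInvoker test utility. A compact sketch of that pattern (the mock name myBroadcaster and the chosen pointcut are illustrative):

    // Stub one pointcut with a counting invoker, exercise the code under test, then assert the count.
    AtomicInteger counter = new AtomicInteger();
    when(myBroadcaster.hasHooks(eq(Pointcut.JPA_PERFTRACE_WARNING))).thenReturn(true);
    when(myBroadcaster.getInvokersForPointcut(eq(Pointcut.JPA_PERFTRACE_WARNING)))
            .thenReturn(MockInvoker.list((Consumer<HookParams>) params -> counter.incrementAndGet()));
    // ... call the production code that broadcasts JPA_PERFTRACE_WARNING twice ...
    assertEquals(2, counter.get());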


@@ -118,12 +118,15 @@ public class SubscriptionMatcherInterceptor {
 ResourceModifiedMessage msg = createResourceModifiedMessage(theNewResource, theOperationType, theRequest);
 // Interceptor call: SUBSCRIPTION_RESOURCE_MODIFIED
-HookParams params = new HookParams().add(ResourceModifiedMessage.class, msg);
-boolean outcome = CompositeInterceptorBroadcaster.doCallHooks(
-		myInterceptorBroadcaster, theRequest, Pointcut.SUBSCRIPTION_RESOURCE_MODIFIED, params);
+IInterceptorBroadcaster compositeBroadcaster =
+		CompositeInterceptorBroadcaster.newCompositeBroadcaster(myInterceptorBroadcaster, theRequest);
+if (compositeBroadcaster.hasHooks(Pointcut.SUBSCRIPTION_RESOURCE_MODIFIED)) {
+	HookParams params = new HookParams().add(ResourceModifiedMessage.class, msg);
+	boolean outcome = compositeBroadcaster.callHooks(Pointcut.SUBSCRIPTION_RESOURCE_MODIFIED, params);
 	if (!outcome) {
 		return;
+	}
 }
 processResourceModifiedMessage(msg);


@@ -383,8 +383,8 @@ public class SubscriptionTriggeringSvcImpl implements ISubscriptionTriggeringSvc
 String resourceType = myFhirContext.getResourceType(theJobDetails.getCurrentSearchResourceType());
 RuntimeResourceDefinition resourceDef =
 		myFhirContext.getResourceDefinition(theJobDetails.getCurrentSearchResourceType());
-ISearchBuilder searchBuilder = mySearchBuilderFactory.newSearchBuilder(
-		resourceDao, resourceType, resourceDef.getImplementingClass());
+ISearchBuilder searchBuilder =
+		mySearchBuilderFactory.newSearchBuilder(resourceType, resourceDef.getImplementingClass());
 List<IBaseResource> listToPopulate = new ArrayList<>();
 myTransactionService.withRequest(null).execute(() -> {
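This hunk is the production-side instance of a signature change that recurs throughout the test diffs below: search builders are now created from the resource type name and class alone, without the resource DAO. A minimal before/after sketch (variable names follow the hunk):

    // Before: the factory needed the DAO as well as the type information.
    // ISearchBuilder searchBuilder = mySearchBuilderFactory.newSearchBuilder(resourceDao, resourceType, resourceDef.getImplementingClass());

    // After: only the resource type name and its implementing class are required.
    RuntimeResourceDefinition resourceDef = myFhirContext.getResourceDefinition(resourceType);
    ISearchBuilder searchBuilder =
            mySearchBuilderFactory.newSearchBuilder(resourceType, resourceDef.getImplementingClass());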


@@ -113,7 +113,7 @@ public class SubscriptionsDstu2Test extends BaseResourceProviderDstu2Test {
 myClient.create().resource(subs).execute();
 fail("");
 } catch (UnprocessableEntityException e) {
-	assertThat(e.getMessage()).contains("Unknown SubscriptionStatus code 'aaaaa'");
+	assertThat(e.getMessage()).containsAnyOf("invalid value aaaaa", "Unknown SubscriptionStatus code 'aaaaa'");
 }
 }


@@ -52,7 +52,7 @@ public class BaseSearchSvc {
 protected static final FhirContext ourCtx = FhirContext.forDstu3Cached();
 public void after() {
-	verify(mySearchBuilderFactory, atMost(myExpectedNumberOfSearchBuildersCreated)).newSearchBuilder(any(), any(), any());
+	verify(mySearchBuilderFactory, atMost(myExpectedNumberOfSearchBuildersCreated)).newSearchBuilder(any(), any());
 }
 protected List<JpaPid> createPidSequence(int to) {


@@ -76,6 +76,7 @@ import static org.mockito.Mockito.doAnswer;
 import static org.mockito.Mockito.lenient;
 import static org.mockito.Mockito.mock;
 import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.verifyNoInteractions;
 import static org.mockito.Mockito.when;
 @SuppressWarnings({"unchecked"})
@@ -91,7 +92,7 @@ public class SearchCoordinatorSvcImplTest extends BaseSearchSvc {
 @Mock
 private ISearchResultCacheSvc mySearchResultCacheSvc;
 private Search myCurrentSearch;
-@Mock
+@Mock(strictness = Mock.Strictness.STRICT_STUBS)
 private IInterceptorBroadcaster myInterceptorBroadcaster;
 @Mock
 private SearchBuilderFactory<JpaPid> mySearchBuilderFactory;
@@ -289,7 +290,7 @@ public class SearchCoordinatorSvcImplTest extends BaseSearchSvc {
 }
 private void initSearches() {
-	when(mySearchBuilderFactory.newSearchBuilder(any(), any(), any())).thenReturn(mySearchBuilder);
+	when(mySearchBuilderFactory.newSearchBuilder(any(), any())).thenReturn(mySearchBuilder);
 }
 private void initAsyncSearches() {
@@ -318,8 +319,8 @@ public class SearchCoordinatorSvcImplTest extends BaseSearchSvc {
 SlowIterator iter = new SlowIterator(pids.iterator(), 500);
 when(mySearchBuilder.createQuery(same(params), any(), any(), nullable(RequestPartitionId.class))).thenReturn(iter);
 mockSearchTask();
-when(myInterceptorBroadcaster.callHooks(any(), any()))
-	.thenReturn(true);
+when(myInterceptorBroadcaster.hasHooks(any())).thenReturn(true);
+when(myInterceptorBroadcaster.getInvokersForPointcut(any())).thenReturn(List.of());
 ourLog.info("Registering the first search");
 new Thread(() -> mySvc.registerSearch(myCallingDao, params, "Patient", new CacheControlDirective(), null, RequestPartitionId.allPartitions())).start();
@@ -437,7 +438,7 @@ public class SearchCoordinatorSvcImplTest extends BaseSearchSvc {
 @Test
 public void testLoadSearchResultsFromDifferentCoordinator() {
-	when(mySearchBuilderFactory.newSearchBuilder(any(), any(), any())).thenReturn(mySearchBuilder);
+	when(mySearchBuilderFactory.newSearchBuilder(any(), any())).thenReturn(mySearchBuilder);
 final String uuid = UUID.randomUUID().toString();
@@ -517,7 +518,7 @@ public class SearchCoordinatorSvcImplTest extends BaseSearchSvc {
 @Test
 public void testSynchronousSearch() {
-	when(mySearchBuilderFactory.newSearchBuilder(any(), any(), any())).thenReturn(mySearchBuilder);
+	when(mySearchBuilderFactory.newSearchBuilder(any(), any())).thenReturn(mySearchBuilder);
 SearchParameterMap params = new SearchParameterMap();
 params.setLoadSynchronous(true);
@@ -531,7 +532,7 @@ public class SearchCoordinatorSvcImplTest extends BaseSearchSvc {
 @Test
 public void testSynchronousSearchWithOffset() {
-	when(mySearchBuilderFactory.newSearchBuilder(any(), any(), any())).thenReturn(mySearchBuilder);
+	when(mySearchBuilderFactory.newSearchBuilder(any(), any())).thenReturn(mySearchBuilder);
 SearchParameterMap params = new SearchParameterMap();
 params.setOffset(10);
@@ -544,7 +545,7 @@ public class SearchCoordinatorSvcImplTest extends BaseSearchSvc {
 @Test
 public void testSynchronousSearchUpTo() {
-	when(mySearchBuilderFactory.newSearchBuilder(any(), any(), any())).thenReturn(mySearchBuilder);
+	when(mySearchBuilderFactory.newSearchBuilder(any(), any())).thenReturn(mySearchBuilder);
 int loadUpto = 30;
 SearchParameterMap params = new SearchParameterMap();
@@ -584,7 +585,6 @@ public class SearchCoordinatorSvcImplTest extends BaseSearchSvc {
 @Test
 public void testFetchAllResultsReturnsNull() {
 when(myDaoRegistry.getResourceDao(anyString())).thenReturn(myCallingDao);
-when(myCallingDao.getContext()).thenReturn(ourCtx);
 Search search = new Search();
 search.setUuid("0000-1111");


@@ -41,7 +41,7 @@ public class SynchronousSearchSvcImplTest extends BaseSearchSvc {
 @Test
 public void testSynchronousSearch() {
-	when(mySearchBuilderFactory.newSearchBuilder(any(), any(), any()))
+	when(mySearchBuilderFactory.newSearchBuilder(any(), any()))
 		.thenReturn(mySearchBuilder);
 SearchParameterMap params = new SearchParameterMap();
@@ -65,7 +65,7 @@ public class SynchronousSearchSvcImplTest extends BaseSearchSvc {
 @Test
 public void testSynchronousSearchWithOffset() {
-	when(mySearchBuilderFactory.newSearchBuilder(any(), any(), any())).thenReturn(mySearchBuilder);
+	when(mySearchBuilderFactory.newSearchBuilder(any(), any())).thenReturn(mySearchBuilder);
 SearchParameterMap params = new SearchParameterMap();
 params.setCount(10);
@@ -87,7 +87,7 @@ public class SynchronousSearchSvcImplTest extends BaseSearchSvc {
 @Test
 public void testSynchronousSearchUpTo() {
-	when(mySearchBuilderFactory.newSearchBuilder(any(), any(), any())).thenReturn(mySearchBuilder);
+	when(mySearchBuilderFactory.newSearchBuilder(any(), any())).thenReturn(mySearchBuilder);
 when(myStorageSettings.getDefaultTotalMode()).thenReturn(null);
 SearchParameterMap params = new SearchParameterMap();


@@ -15,6 +15,7 @@ import java.util.List;
 import java.util.Set;
 import static org.assertj.core.api.Assertions.assertThat;
+import static org.mockito.Mockito.when;
 public class TerminologySvcImplDstu2Test extends BaseJpaDstu2Test {
@@ -29,6 +30,8 @@ public class TerminologySvcImplDstu2Test extends BaseJpaDstu2Test {
 List<FhirVersionIndependentConcept> concepts;
 Set<String> codes;
+when(mySrd.getInterceptorBroadcaster()).thenReturn(null);
 ValueSet upload = new ValueSet();
 upload.setId(new IdDt("testVs"));
 upload.setUrl("http://myVs");
@@ -61,6 +64,8 @@ public class TerminologySvcImplDstu2Test extends BaseJpaDstu2Test {
 List<FhirVersionIndependentConcept> concepts;
 Set<String> codes;
+when(mySrd.getInterceptorBroadcaster()).thenReturn(null);
 ValueSet upload = new ValueSet();
 upload.setId(new IdDt("testVs"));
 upload.setUrl("http://myVs");


@@ -881,6 +881,11 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 public void testFetchInstanceAndWorkChunkStatus() {
 // Setup
+Date date1 = new Date();
+Date date2 = new Date();
 List<String> chunkIds = new ArrayList<>();
 JobInstance instance = createInstance();
 String instanceId = mySvc.storeNewInstance(instance);


@@ -33,6 +33,7 @@ import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
 import ca.uhn.fhir.rest.server.provider.ProviderConstants;
 import ca.uhn.fhir.rest.server.tenant.UrlBaseTenantIdentificationStrategy;
 import ca.uhn.fhir.test.utilities.HttpClientExtension;
+import ca.uhn.fhir.test.utilities.MockInvoker;
 import ca.uhn.fhir.test.utilities.server.RestfulServerExtension;
 import ca.uhn.fhir.util.JsonUtil;
 import ca.uhn.fhir.util.SearchParameterUtil;
@@ -1054,18 +1055,18 @@ public class BulkDataExportProviderR4Test {
 AtomicBoolean initiateCalled = new AtomicBoolean(false);
 // when
-when(myInterceptorBroadcaster.callHooks(eq(Pointcut.STORAGE_PRE_INITIATE_BULK_EXPORT), any(HookParams.class)))
-	.thenAnswer((args) -> {
+when(myInterceptorBroadcaster.hasHooks(eq(Pointcut.STORAGE_PRE_INITIATE_BULK_EXPORT))).thenReturn(true);
+when(myInterceptorBroadcaster.hasHooks(eq(Pointcut.STORAGE_INITIATE_BULK_EXPORT))).thenReturn(true);
+when(myInterceptorBroadcaster.getInvokersForPointcut(eq(Pointcut.STORAGE_PRE_INITIATE_BULK_EXPORT))).thenReturn(MockInvoker.list(params->{
 	assertFalse(initiateCalled.get());
 	assertFalse(preInitiateCalled.getAndSet(true));
 	return true;
-});
-when(myInterceptorBroadcaster.callHooks(eq(Pointcut.STORAGE_INITIATE_BULK_EXPORT), any(HookParams.class)))
-	.thenAnswer((args) -> {
+}));
+when(myInterceptorBroadcaster.getInvokersForPointcut(eq(Pointcut.STORAGE_INITIATE_BULK_EXPORT))).thenReturn(MockInvoker.list(params->{
 	assertTrue(preInitiateCalled.get());
 	assertFalse(initiateCalled.getAndSet(true));
 	return true;
-});
+}));
 when(myJobCoordinator.startInstance(isNotNull(), any()))
 	.thenReturn(createJobStartResponse());


@@ -30,6 +30,7 @@ import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
 import ca.uhn.fhir.rest.server.provider.ProviderConstants;
 import ca.uhn.fhir.rest.server.tenant.UrlBaseTenantIdentificationStrategy;
 import ca.uhn.fhir.test.utilities.HttpClientExtension;
+import ca.uhn.fhir.test.utilities.MockInvoker;
 import ca.uhn.fhir.test.utilities.server.RestfulServerExtension;
 import ca.uhn.fhir.util.JsonUtil;
 import ca.uhn.fhir.util.SearchParameterUtil;
@@ -1057,18 +1058,18 @@ public class BulkDataExportProviderR5Test {
 AtomicBoolean initiateCalled = new AtomicBoolean(false);
 // when
-when(myInterceptorBroadcaster.callHooks(eq(Pointcut.STORAGE_PRE_INITIATE_BULK_EXPORT), any(HookParams.class)))
-	.thenAnswer((args) -> {
+when(myInterceptorBroadcaster.hasHooks(eq(Pointcut.STORAGE_PRE_INITIATE_BULK_EXPORT))).thenReturn(true);
+when(myInterceptorBroadcaster.getInvokersForPointcut(eq(Pointcut.STORAGE_PRE_INITIATE_BULK_EXPORT))).thenReturn(MockInvoker.list(params -> {
 	assertFalse(initiateCalled.get());
 	assertFalse(preInitiateCalled.getAndSet(true));
 	return true;
-});
-when(myInterceptorBroadcaster.callHooks(eq(Pointcut.STORAGE_INITIATE_BULK_EXPORT), any(HookParams.class)))
-	.thenAnswer((args) -> {
+}));
+when(myInterceptorBroadcaster.hasHooks(eq(Pointcut.STORAGE_INITIATE_BULK_EXPORT))).thenReturn(true);
+when(myInterceptorBroadcaster.getInvokersForPointcut(eq(Pointcut.STORAGE_INITIATE_BULK_EXPORT))).thenReturn(MockInvoker.list(params -> {
 	assertTrue(preInitiateCalled.get());
 	assertFalse(initiateCalled.getAndSet(true));
 	return true;
-});
+}));
 when(myJobCoordinator.startInstance(isNotNull(), any()))
 	.thenReturn(createJobStartResponse());


@@ -17,6 +17,7 @@ import ca.uhn.fhir.jpa.bulk.imprt.api.IBulkDataImportSvc;
 import ca.uhn.fhir.jpa.bulk.imprt.model.ActivateJobResult;
 import ca.uhn.fhir.jpa.bulk.imprt.model.BulkImportJobFileJson;
 import ca.uhn.fhir.jpa.bulk.imprt.model.BulkImportJobJson;
+import ca.uhn.fhir.jpa.bulk.imprt.model.BulkImportJobStatusEnum;
 import ca.uhn.fhir.jpa.bulk.imprt.model.JobFileRowProcessingModeEnum;
 import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
 import ca.uhn.fhir.jpa.subscription.channel.api.ChannelConsumerSettings;
@@ -166,6 +167,8 @@ public class BulkDataImportR4Test extends BaseJpaR4Test implements ITestDataBuil
 ActivateJobResult activateJobOutcome = mySvc.activateNextReadyJob();
 assertTrue(activateJobOutcome.isActivated);
+// validate that job changed status from READY to RUNNING
+assertEquals(BulkImportJobStatusEnum.RUNNING, mySvc.getJobStatus(jobId).getStatus());
 JobInstance instance = myBatch2JobHelper.awaitJobCompletion(activateJobOutcome.jobId, 60);
 assertNotNull(instance);
@@ -196,6 +199,8 @@ public class BulkDataImportR4Test extends BaseJpaR4Test implements ITestDataBuil
 ActivateJobResult activateJobOutcome = mySvc.activateNextReadyJob();
 assertTrue(activateJobOutcome.isActivated);
+// validate that job changed status from READY to RUNNING
+assertEquals(BulkImportJobStatusEnum.RUNNING, mySvc.getJobStatus(jobId).getStatus());
 JobInstance instance = myBatch2JobHelper.awaitJobCompletion(activateJobOutcome.jobId);
 assertNotNull(instance);


@@ -320,7 +320,7 @@ class BaseHapiFhirResourceDaoTest {
 mySvc.setTransactionService(myTransactionService);
 when(myRequestPartitionHelperSvc.determineReadPartitionForRequestForSearchType(any(), any(), any(), any())).thenReturn(mock(RequestPartitionId.class));
-when(mySearchBuilderFactory.newSearchBuilder(any(), any(), any())).thenReturn(myISearchBuilder);
+when(mySearchBuilderFactory.newSearchBuilder(any(), any())).thenReturn(myISearchBuilder);
 when(myISearchBuilder.createQuery(any(), any(), any(), any())).thenReturn(mock(IResultIterator.class));
 lenient().when(myStorageSettings.getInternalSynchronousSearchSize()).thenReturn(5000);


@@ -1,6 +1,5 @@
 package ca.uhn.fhir.jpa.dao.r4;
-import ca.uhn.fhir.interceptor.api.HookParams;
 import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster;
 import ca.uhn.fhir.interceptor.api.Pointcut;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
@@ -10,9 +9,9 @@ import ca.uhn.fhir.jpa.search.reindex.ResourceReindexingSvcImpl;
 import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
 import ca.uhn.fhir.jpa.util.SpringObjectCaster;
 import ca.uhn.fhir.rest.server.util.ISearchParamRegistry;
+import ca.uhn.fhir.test.utilities.MockInvoker;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
-import org.mockito.ArgumentMatchers;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
@@ -47,21 +46,22 @@ public abstract class BaseComboParamsR4Test extends BaseJpaR4Test {
 when(myInterceptorBroadcaster.hasHooks(eq(Pointcut.JPA_PERFTRACE_WARNING))).thenReturn(true);
 when(myInterceptorBroadcaster.hasHooks(eq(Pointcut.JPA_PERFTRACE_INFO))).thenReturn(true);
-when(myInterceptorBroadcaster.callHooks(eq(Pointcut.JPA_PERFTRACE_INFO), ArgumentMatchers.any(HookParams.class))).thenAnswer(t -> {
-	HookParams params = t.getArgument(1, HookParams.class);
+when(myInterceptorBroadcaster.hasHooks(eq(Pointcut.JPA_PERFTRACE_SEARCH_REUSING_CACHED))).thenReturn(true);
+when(myInterceptorBroadcaster.hasHooks(eq(Pointcut.STORAGE_PRECHECK_FOR_CACHED_SEARCH))).thenReturn(true);
+when(myInterceptorBroadcaster.getInvokersForPointcut(eq(Pointcut.JPA_PERFTRACE_INFO))).thenReturn(MockInvoker.list(params->{
 	myMessages.add("INFO " + params.get(StorageProcessingMessage.class).getMessage());
-	return null;
-});
-when(myInterceptorBroadcaster.callHooks(eq(Pointcut.JPA_PERFTRACE_WARNING), ArgumentMatchers.any(HookParams.class))).thenAnswer(t -> {
-	HookParams params = t.getArgument(1, HookParams.class);
+}));
+when(myInterceptorBroadcaster.getInvokersForPointcut(eq(Pointcut.JPA_PERFTRACE_WARNING))).thenReturn(MockInvoker.list(params->{
 	myMessages.add("WARN " + params.get(StorageProcessingMessage.class).getMessage());
-	return null;
-});
-when(myInterceptorBroadcaster.callHooks(eq(Pointcut.JPA_PERFTRACE_SEARCH_REUSING_CACHED), ArgumentMatchers.any(HookParams.class))).thenAnswer(t -> {
-	HookParams params = t.getArgument(1, HookParams.class);
+}));
+when(myInterceptorBroadcaster.getInvokersForPointcut(eq(Pointcut.JPA_PERFTRACE_SEARCH_REUSING_CACHED))).thenReturn(MockInvoker.list(params->{
 	myMessages.add("REUSING CACHED SEARCH");
-	return null;
-});
+}));
+// allow searches to use cached results
+when(myInterceptorBroadcaster.getInvokersForPointcut(eq(Pointcut.STORAGE_PRECHECK_FOR_CACHED_SEARCH))).thenReturn(MockInvoker.list(params->true));
 }
 @AfterEach
@@ -80,4 +80,5 @@ public abstract class BaseComboParamsR4Test extends BaseJpaR4Test {
 ourLog.info("Messages:\n {}", String.join("\n ", myMessages));
 }
+}


@@ -1,6 +1,7 @@
 package ca.uhn.fhir.jpa.dao.r4;
 import ca.uhn.fhir.i18n.Msg;
+import ca.uhn.fhir.jpa.api.model.DaoMethodOutcome;
 import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
 import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
 import ca.uhn.fhir.rest.api.server.IBundleProvider;
@@ -23,6 +24,8 @@ import org.hl7.fhir.r4.model.Reference;
 import org.hl7.fhir.r4.model.ServiceRequest;
 import org.hl7.fhir.r4.model.ServiceRequest.ServiceRequestIntent;
 import org.hl7.fhir.r4.model.ServiceRequest.ServiceRequestStatus;
+import org.hl7.fhir.r4.model.Specimen;
+import org.hl7.fhir.r4.model.StringType;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;


@@ -28,6 +28,7 @@ import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
 import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
 import ca.uhn.fhir.rest.server.exceptions.ResourceVersionConflictException;
 import ca.uhn.fhir.rest.server.exceptions.UnprocessableEntityException;
+import ca.uhn.fhir.test.utilities.UuidUtils;
 import ca.uhn.fhir.util.BundleBuilder;
 import ca.uhn.fhir.util.ClasspathUtil;
 import org.apache.commons.lang3.StringUtils;
@@ -81,6 +82,7 @@ import java.util.concurrent.Executors;
 import java.util.concurrent.Future;
 import java.util.stream.Collectors;
+import static ca.uhn.fhir.test.utilities.UuidUtils.HASH_UUID_PATTERN;
 import static org.assertj.core.api.Assertions.assertThat;
 import static org.assertj.core.api.Assertions.assertThatThrownBy;
 import static org.junit.jupiter.api.Assertions.assertEquals;
@@ -738,7 +740,8 @@ public class FhirResourceDaoR4CreateTest extends BaseJpaR4Test {
 String encoded = myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(p);
 ourLog.info("Input: {}", encoded);
-assertThat(encoded).contains("#1");
+String organizationUuid = UuidUtils.findFirstUUID(encoded);
+assertNotNull(organizationUuid);
 IIdType id = myPatientDao.create(p).getId().toUnqualifiedVersionless();
@@ -746,10 +749,12 @@ public class FhirResourceDaoR4CreateTest extends BaseJpaR4Test {
 encoded = myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(p);
 ourLog.info("Output: {}", encoded);
-assertThat(encoded).contains("#1");
+String organizationUuidParsed = UuidUtils.findFirstUUID(encoded);
+assertNotNull(organizationUuidParsed);
+assertEquals(organizationUuid, organizationUuidParsed);
 Organization org = (Organization) p.getManagingOrganization().getResource();
-assertEquals("#1", org.getId());
+assertEquals("#" + organizationUuid, org.getId());
 assertThat(org.getMeta().getTag()).hasSize(1);
 }


@@ -2722,7 +2722,7 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test
 ourLog.debug(myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(output));
 myCaptureQueriesListener.logSelectQueriesForCurrentThread();
-assertEquals(3, myCaptureQueriesListener.countSelectQueriesForCurrentThread());
+assertEquals(2, myCaptureQueriesListener.countSelectQueriesForCurrentThread());
 myCaptureQueriesListener.logInsertQueriesForCurrentThread();
 assertEquals(2, myCaptureQueriesListener.countInsertQueriesForCurrentThread());
 myCaptureQueriesListener.logUpdateQueriesForCurrentThread();


@@ -428,7 +428,7 @@ public class FhirResourceDaoR4VersionedReferenceTest extends BaseJpaR4Test {
 IdType observationId = new IdType(outcome.getEntry().get(1).getResponse().getLocation());
 // Make sure we're not introducing any extra DB operations
-assertThat(myCaptureQueriesListener.logSelectQueries()).hasSize(3);
+assertThat(myCaptureQueriesListener.logSelectQueries()).hasSize(2);
 // Read back and verify that reference is now versioned
 observation = myObservationDao.read(observationId);
@@ -463,7 +463,7 @@ public class FhirResourceDaoR4VersionedReferenceTest extends BaseJpaR4Test {
 IdType observationId = new IdType(outcome.getEntry().get(1).getResponse().getLocation());
 // Make sure we're not introducing any extra DB operations
-assertThat(myCaptureQueriesListener.logSelectQueries()).hasSize(4);
+assertThat(myCaptureQueriesListener.logSelectQueries()).hasSize(3);
 // Read back and verify that reference is now versioned
 observation = myObservationDao.read(observationId);


@@ -1,5 +1,6 @@
 package ca.uhn.fhir.jpa.dao.r4;
+import static ca.uhn.fhir.test.utilities.UuidUtils.HASH_UUID_PATTERN;
 import static org.junit.jupiter.api.Assertions.assertNull;
 import static org.junit.jupiter.api.Assertions.assertTrue;
 import static org.junit.jupiter.api.Assertions.assertFalse;
@@ -3219,8 +3220,8 @@ public class FhirSystemDaoR4Test extends BaseJpaR4SystemTest {
 String id = outcome.getEntry().get(0).getResponse().getLocation();
 patient = myPatientDao.read(new IdType(id));
-assertEquals("#1", patient.getManagingOrganization().getReference());
-assertEquals("#1", patient.getContained().get(0).getId());
+assertThat(patient.getManagingOrganization().getReference()).containsPattern(HASH_UUID_PATTERN);
+assertEquals(patient.getManagingOrganization().getReference(), patient.getContained().get(0).getId());
 }
 @Nonnull
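Both this hunk and the FhirResourceDaoR4CreateTest changes above reflect the same behavioural shift: automatically assigned contained-resource IDs are now placeholder UUIDs rather than sequential values like "#1", so the assertions locate the generated ID and compare against it instead of hard-coding it. A condensed sketch of the new assertion style, using the UuidUtils helpers imported above:

    // The contained Organization's id is whatever UUID the server assigned; match the shape, then
    // check that the managing organization reference points at that same contained resource.
    Patient patient = myPatientDao.read(new IdType(id));
    String reference = patient.getManagingOrganization().getReference();
    assertThat(reference).containsPattern(HASH_UUID_PATTERN);   // "#<uuid>" instead of "#1"
    assertEquals(reference, patient.getContained().get(0).getId());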


@@ -263,6 +263,8 @@ public class ResourceProviderR4Test extends BaseResourceProviderR4Test {
 myStorageSettings.setSearchPreFetchThresholds(new JpaStorageSettings().getSearchPreFetchThresholds());
 }
 @Test
 public void testParameterWithNoValueThrowsError_InvalidChainOnCustomSearch() throws IOException {
 SearchParameter searchParameter = new SearchParameter();


@@ -180,6 +180,59 @@ public class ReindexTaskTest extends BaseJpaR4Test {
 }
+@Test
+public void testOptimizeStorage_AllVersions_SingleResourceWithMultipleVersion() {
+	// this difference of this test from testOptimizeStorage_AllVersions is that this one has only 1 resource
+	// (with multiple versions) in the db. There was a bug where if only one resource were being re-indexed, the
+	// resource wasn't processed for optimize storage.
+	// Setup
+	IIdType patientId = createPatient(withActiveTrue());
+	for (int i = 0; i < 10; i++) {
+		Patient p = new Patient();
+		p.setId(patientId.toUnqualifiedVersionless());
+		p.setActive(true);
+		p.addIdentifier().setValue(String.valueOf(i));
+		myPatientDao.update(p, mySrd);
+	}
+	// Move resource text to compressed storage, which we don't write to anymore but legacy
+	// data may exist that was previously stored there, so we're simulating that.
+	List<ResourceHistoryTable> allHistoryEntities = runInTransaction(() -> myResourceHistoryTableDao.findAll());
+	allHistoryEntities.forEach(t->relocateResourceTextToCompressedColumn(t.getResourceId(), t.getVersion()));
+	runInTransaction(()->{
+		assertEquals(11, myResourceHistoryTableDao.count());
+		for (ResourceHistoryTable history : myResourceHistoryTableDao.findAll()) {
+			assertNull(history.getResourceTextVc());
+			assertNotNull(history.getResource());
+		}
+	});
+	// execute
+	JobInstanceStartRequest startRequest = new JobInstanceStartRequest();
+	startRequest.setJobDefinitionId(JOB_REINDEX);
+	startRequest.setParameters(
+		new ReindexJobParameters()
+			.setOptimizeStorage(ReindexParameters.OptimizeStorageModeEnum.ALL_VERSIONS)
+			.setReindexSearchParameters(ReindexParameters.ReindexSearchParametersEnum.NONE)
+	);
+	Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(mySrd, startRequest);
+	myBatch2JobHelper.awaitJobCompletion(startResponse);
+	// validate
+	runInTransaction(()->{
+		assertEquals(11, myResourceHistoryTableDao.count());
+		for (ResourceHistoryTable history : myResourceHistoryTableDao.findAll()) {
+			assertNotNull(history.getResourceTextVc());
+			assertNull(history.getResource());
+		}
+	});
+	Patient patient = myPatientDao.read(patientId, mySrd);
+	assertTrue(patient.getActive());
+}
 @Test
 public void testOptimizeStorage_AllVersions_CopyProvenanceEntityData() {
 // Setup


@@ -1,29 +1,21 @@
-package ca.uhn.fhir.jpa.provider.r4;
+package ca.uhn.fhir.jpa.validation;
 import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.jpa.config.JpaConfig;
 import ca.uhn.fhir.jpa.model.util.JpaConstants;
 import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
-import ca.uhn.fhir.rest.annotation.IdParam;
-import ca.uhn.fhir.rest.annotation.Operation;
-import ca.uhn.fhir.rest.annotation.OperationParam;
-import ca.uhn.fhir.rest.annotation.RequiredParam;
-import ca.uhn.fhir.rest.annotation.Search;
-import ca.uhn.fhir.rest.param.UriParam;
-import ca.uhn.fhir.rest.server.IResourceProvider;
 import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
 import ca.uhn.fhir.test.utilities.server.RestfulServerExtension;
-import jakarta.servlet.http.HttpServletRequest;
+import ca.uhn.fhir.test.utilities.validation.IValidationProviders;
+import ca.uhn.fhir.test.utilities.validation.IValidationProvidersR4;
 import org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport;
 import org.hl7.fhir.common.hapi.validation.support.ValidationSupportChain;
-import org.hl7.fhir.instance.model.api.IBaseResource;
+import org.hl7.fhir.instance.model.api.IBaseParameters;
 import org.hl7.fhir.r4.model.BooleanType;
 import org.hl7.fhir.r4.model.CodeSystem;
 import org.hl7.fhir.r4.model.CodeType;
 import org.hl7.fhir.r4.model.Coding;
-import org.hl7.fhir.r4.model.IdType;
 import org.hl7.fhir.r4.model.Parameters;
-import org.hl7.fhir.r4.model.StringType;
 import org.hl7.fhir.r4.model.UriType;
 import org.hl7.fhir.r4.model.ValueSet;
 import org.junit.jupiter.api.AfterEach;
@@ -33,9 +25,7 @@ import org.junit.jupiter.api.extension.RegisterExtension;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.beans.factory.annotation.Qualifier;
-import java.util.ArrayList;
-import java.util.List;
+import static ca.uhn.fhir.jpa.model.util.JpaConstants.OPERATION_VALIDATE_CODE;
 import static org.assertj.core.api.Assertions.assertThat;
 import static org.assertj.core.api.AssertionsForClassTypes.assertThatExceptionOfType;
 import static org.junit.jupiter.api.Assertions.assertEquals;
@@ -43,15 +33,15 @@ import static org.junit.jupiter.api.Assertions.assertFalse;
 import static org.junit.jupiter.api.Assertions.assertTrue;
 import static org.junit.jupiter.api.Assertions.fail;
-/*
+/**
  * This set of integration tests that instantiates and injects an instance of
  * {@link org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport}
  * into the ValidationSupportChain, which tests the logic of dynamically selecting the correct Remote Terminology
- * implementation. It also exercises the code found in
- * {@link org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport#invokeRemoteValidateCode}
+ * implementation. It also exercises the validateCode output translation code found in
+ * {@link org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport}
  */
-public class ValidateCodeOperationWithRemoteTerminologyR4Test extends BaseResourceProviderR4Test {
-	private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(ValidateCodeOperationWithRemoteTerminologyR4Test.class);
+public class ValidateCodeWithRemoteTerminologyR4Test extends BaseResourceProviderR4Test {
+	private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(ValidateCodeWithRemoteTerminologyR4Test.class);
 	private static final String DISPLAY = "DISPLAY";
 	private static final String DISPLAY_BODY_MASS_INDEX = "Body mass index (BMI) [Ratio]";
 	private static final String CODE_BODY_MASS_INDEX = "39156-5";
@@ -64,8 +54,8 @@ public class ValidateCodeWithRemoteTerminologyR4Test extends BaseResourceProviderR4Test {
 	protected static RestfulServerExtension ourRestfulServerExtension = new RestfulServerExtension(ourCtx);
 	private RemoteTerminologyServiceValidationSupport mySvc;
-	private MyCodeSystemProvider myCodeSystemProvider;
-	private MyValueSetProvider myValueSetProvider;
+	private IValidationProviders.MyValidationProvider<CodeSystem> myCodeSystemProvider;
+	private IValidationProviders.MyValidationProvider<ValueSet> myValueSetProvider;
 	@Autowired
 	@Qualifier(JpaConfig.JPA_VALIDATION_SUPPORT_CHAIN)
@@ -76,8 +66,8 @@ public class ValidateCodeWithRemoteTerminologyR4Test extends BaseResourceProviderR4Test {
 	String baseUrl = "http://localhost:" + ourRestfulServerExtension.getPort();
 	mySvc = new RemoteTerminologyServiceValidationSupport(ourCtx, baseUrl);
 	myValidationSupportChain.addValidationSupport(0, mySvc);
-	myCodeSystemProvider = new MyCodeSystemProvider();
-	myValueSetProvider = new MyValueSetProvider();
+	myCodeSystemProvider = new IValidationProvidersR4.MyCodeSystemProviderR4();
+	myValueSetProvider = new IValidationProvidersR4.MyValueSetProviderR4();
 	ourRestfulServerExtension.registerProvider(myCodeSystemProvider);
 	ourRestfulServerExtension.registerProvider(myValueSetProvider);
 	}
@@ -103,11 +93,11 @@ public class ValidateCodeWithRemoteTerminologyR4Test extends BaseResourceProviderR4Test {
 	@Test
 	public void validateCodeOperationOnCodeSystem_byCodingAndUrl_usingBuiltInCodeSystems() {
-		myCodeSystemProvider.myReturnCodeSystems = new ArrayList<>();
-		myCodeSystemProvider.myReturnCodeSystems.add((CodeSystem) new CodeSystem().setId("CodeSystem/v2-0247"));
-		myCodeSystemProvider.myReturnParams = new Parameters();
-		myCodeSystemProvider.myReturnParams.addParameter("result", true);
-		myCodeSystemProvider.myReturnParams.addParameter("display", DISPLAY);
+		final String code = "P";
+		final String system = CODE_SYSTEM_V2_0247_URI;;
+		Parameters params = new Parameters().addParameter("result", true).addParameter("display", DISPLAY);
+		setupCodeSystemValidateCode(system, code, params);
 	logAllConcepts();
@@ -115,8 +105,8 @@ public class ValidateCodeWithRemoteTerminologyR4Test extends BaseResourceProviderR4Test {
 	.operation()
 	.onType(CodeSystem.class)
 	.named(JpaConstants.OPERATION_VALIDATE_CODE)
-	.withParameter(Parameters.class, "coding", new Coding().setSystem(CODE_SYSTEM_V2_0247_URI).setCode("P"))
-	.andParameter("url", new UriType(CODE_SYSTEM_V2_0247_URI))
+	.withParameter(Parameters.class, "coding", new Coding().setSystem(system).setCode(code))
+	.andParameter("url", new UriType(system))
 	.execute();
 	String resp = myFhirContext.newXmlParser().setPrettyPrint(true).encodeResourceToString(respParam);
@@ -128,7 +118,7 @@ public class ValidateCodeWithRemoteTerminologyR4Test extends BaseResourceProviderR4Test {
 	@Test
 	public void validateCodeOperationOnCodeSystem_byCodingAndUrlWhereCodeSystemIsUnknown_returnsFalse() {
-		myCodeSystemProvider.myReturnCodeSystems = new ArrayList<>();
+		myCodeSystemProvider.setShouldThrowExceptionForResourceNotFound(false);
 	Parameters respParam = myClient
 	.operation()
@@ -166,21 +156,21 @@ public class ValidateCodeWithRemoteTerminologyR4Test extends BaseResourceProviderR4Test {
 	@Test
 	public void validateCodeOperationOnValueSet_byUrlAndSystem_usingBuiltInCodeSystems() {
-		myCodeSystemProvider.myReturnCodeSystems = new ArrayList<>();
-		myCodeSystemProvider.myReturnCodeSystems.add((CodeSystem) new CodeSystem().setId("CodeSystem/list-example-use-codes"));
-		myValueSetProvider.myReturnValueSets = new ArrayList<>();
-		myValueSetProvider.myReturnValueSets.add((ValueSet) new ValueSet().setId("ValueSet/list-example-codes"));
-		myValueSetProvider.myReturnParams = new Parameters();
-		myValueSetProvider.myReturnParams.addParameter("result", true);
-		myValueSetProvider.myReturnParams.addParameter("display", DISPLAY);
+		final String code = "alerts";
+		final String system = "http://terminology.hl7.org/CodeSystem/list-example-use-codes";
+		final String valueSetUrl = "http://hl7.org/fhir/ValueSet/list-example-codes";
+		Parameters params = new Parameters().addParameter("result", true).addParameter("display", DISPLAY);
+		setupValueSetValidateCode(valueSetUrl, system, code, params);
+		setupCodeSystemValidateCode(system, code, params);
 	Parameters respParam = myClient
 	.operation()
 	.onType(ValueSet.class)
 	.named(JpaConstants.OPERATION_VALIDATE_CODE)
-	.withParameter(Parameters.class, "code", new CodeType("alerts"))
-	.andParameter("system", new UriType("http://terminology.hl7.org/CodeSystem/list-example-use-codes"))
-	.andParameter("url", new UriType("http://hl7.org/fhir/ValueSet/list-example-codes"))
+	.withParameter(Parameters.class, "code", new CodeType(code))
+	.andParameter("system", new UriType(system))
+	.andParameter("url", new UriType(valueSetUrl))
 	.useHttpGet()
 	.execute();
@@ -193,21 +183,20 @@ public class ValidateCodeWithRemoteTerminologyR4Test extends BaseResourceProviderR4Test {
 	@Test
 	public void validateCodeOperationOnValueSet_byUrlSystemAndCode() {
-		myCodeSystemProvider.myReturnCodeSystems = new ArrayList<>();
-		myCodeSystemProvider.myReturnCodeSystems.add((CodeSystem) new CodeSystem().setId("CodeSystem/list-example-use-codes"));
-		myValueSetProvider.myReturnValueSets = new ArrayList<>();
-		myValueSetProvider.myReturnValueSets.add((ValueSet) new ValueSet().setId("ValueSet/list-example-codes"));
-		myValueSetProvider.myReturnParams = new Parameters();
-		myValueSetProvider.myReturnParams.addParameter("result", true);
-		myValueSetProvider.myReturnParams.addParameter("display", DISPLAY_BODY_MASS_INDEX);
+		final String code = CODE_BODY_MASS_INDEX;
+		final String system = "http://terminology.hl7.org/CodeSystem/list-example-use-codes";
+		final String valueSetUrl = "http://hl7.org/fhir/ValueSet/list-example-codes";
+		Parameters params = new Parameters().addParameter("result", true).addParameter("display", DISPLAY_BODY_MASS_INDEX);
+		setupValueSetValidateCode(valueSetUrl, system, code, params);
 	Parameters respParam = myClient
 	.operation()
 	.onType(ValueSet.class)
 	.named(JpaConstants.OPERATION_VALIDATE_CODE)
-	.withParameter(Parameters.class, "code", new CodeType(CODE_BODY_MASS_INDEX))
-	.andParameter("url", new UriType("https://loinc.org"))
-	.andParameter("system", new UriType("http://loinc.org"))
+	.withParameter(Parameters.class, "code", new CodeType(code))
+	.andParameter("url", new UriType(valueSetUrl))
+	.andParameter("system", new UriType(system))
 	.execute();
 	String resp = myFhirContext.newXmlParser().setPrettyPrint(true).encodeResourceToString(respParam);
@@ -219,7 +208,7 @@ public class ValidateCodeWithRemoteTerminologyR4Test extends BaseResourceProviderR4Test {
 	@Test
 	public void validateCodeOperationOnValueSet_byCodingAndUrlWhereValueSetIsUnknown_returnsFalse() {
-		myValueSetProvider.myReturnValueSets = new ArrayList<>();
+		myValueSetProvider.setShouldThrowExceptionForResourceNotFound(false);
 	Parameters respParam = myClient
 	.operation()
@@ -238,70 +227,18 @@ public class ValidateCodeWithRemoteTerminologyR4Test extends BaseResourceProviderR4Test {
 	" - Unknown or unusable ValueSet[" + UNKNOWN_VALUE_SYSTEM_URI + "]");
 	}
-	@SuppressWarnings("unused")
-	private static class MyCodeSystemProvider implements IResourceProvider {
-		private List<CodeSystem> myReturnCodeSystems;
-		private Parameters myReturnParams;
-		@Operation(name = "validate-code", idempotent = true, returnParameters = {
-			@OperationParam(name = "result", type = BooleanType.class, min = 1),
-			@OperationParam(name = "message", type = StringType.class),
-			@OperationParam(name = "display", type = StringType.class)
-		})
-		public Parameters validateCode(
-			HttpServletRequest theServletRequest,
-			@IdParam(optional = true) IdType theId,
-			@OperationParam(name = "url", min = 0, max = 1) UriType theCodeSystemUrl,
-			@OperationParam(name = "code", min = 0, max = 1) CodeType theCode,
-			@OperationParam(name = "display", min = 0, max = 1) StringType theDisplay
-		) {
-			return myReturnParams;
-		}
-		@Search
-		public List<CodeSystem> find(@RequiredParam(name = "url") UriParam theUrlParam) {
-			assert myReturnCodeSystems != null;
-			return myReturnCodeSystems;
-		}
-		@Override
-		public Class<? extends IBaseResource> getResourceType() {
-			return CodeSystem.class;
-		}
-	}
+	private void setupValueSetValidateCode(String theUrl, String theSystem, String theCode, IBaseParameters theResponseParams) {
+		ValueSet valueSet = myValueSetProvider.addTerminologyResource(theUrl);
+		myValueSetProvider.addTerminologyResource(theSystem);
+		myValueSetProvider.addTerminologyResponse(OPERATION_VALIDATE_CODE, valueSet.getUrl(), theCode, theResponseParams);
+		// we currently do this because VersionSpecificWorkerContextWrapper has logic to infer the system when missing
+		// based on the ValueSet by calling ValidationSupportUtils#extractCodeSystemForCode.
+		valueSet.getCompose().addInclude().setSystem(theSystem);
+	}
-	@SuppressWarnings("unused")
-	private static class MyValueSetProvider implements IResourceProvider {
-		private Parameters myReturnParams;
-		private List<ValueSet> myReturnValueSets;
-		@Operation(name = "validate-code", idempotent = true, returnParameters = {
-			@OperationParam(name = "result", type = BooleanType.class, min = 1),
-			@OperationParam(name = "message", type = StringType.class),
-			@OperationParam(name = "display", type = StringType.class)
-		})
-		public Parameters validateCode(
-			HttpServletRequest theServletRequest,
-			@IdParam(optional = true) IdType theId,
-			@OperationParam(name = "url", min = 0, max = 1) UriType theValueSetUrl,
-			@OperationParam(name = "code", min = 0, max = 1) CodeType theCode,
-			@OperationParam(name = "system", min = 0, max = 1) UriType theSystem,
-			@OperationParam(name = "display", min = 0, max = 1) StringType theDisplay,
-			@OperationParam(name = "valueSet") ValueSet theValueSet
-		) {
-			return myReturnParams;
-		}
-		@Search
-		public List<ValueSet> find(@RequiredParam(name = "url") UriParam theUrlParam) {
-			assert myReturnValueSets != null;
-			return myReturnValueSets;
-		}
-		@Override
-		public Class<? extends IBaseResource> getResourceType() {
-			return ValueSet.class;
-		}
-	}
+	private void setupCodeSystemValidateCode(String theUrl, String theCode, IBaseParameters theResponseParams) {
+		CodeSystem codeSystem = myCodeSystemProvider.addTerminologyResource(theUrl);
+		myCodeSystemProvider.addTerminologyResponse(OPERATION_VALIDATE_CODE, codeSystem.getUrl(), theCode, theResponseParams);
+	}
 }


@ -0,0 +1,261 @@
package ca.uhn.fhir.jpa.validation;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.jpa.config.JpaConfig;
import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
import ca.uhn.fhir.rest.api.MethodOutcome;
import ca.uhn.fhir.test.utilities.server.RestfulServerExtension;
import ca.uhn.fhir.test.utilities.validation.IValidationProviders;
import ca.uhn.fhir.test.utilities.validation.IValidationProvidersR4;
import ca.uhn.fhir.util.ClasspathUtil;
import org.apache.commons.lang3.StringUtils;
import org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport;
import org.hl7.fhir.common.hapi.validation.support.ValidationSupportChain;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.r4.model.CodeSystem;
import org.hl7.fhir.r4.model.Encounter;
import org.hl7.fhir.r4.model.Observation;
import org.hl7.fhir.r4.model.OperationOutcome;
import org.hl7.fhir.r4.model.Procedure;
import org.hl7.fhir.r4.model.Reference;
import org.hl7.fhir.r4.model.StructureDefinition;
import org.hl7.fhir.r4.model.ValueSet;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import java.util.List;
import static ca.uhn.fhir.jpa.model.util.JpaConstants.OPERATION_VALIDATE_CODE;
import static org.assertj.core.api.Assertions.assertThat;
/**
* Tests resource validation with Remote Terminology bindings.
* To create a new test, you need to do three things:
* (1) the resource profile, if a custom one is needed, should be stored in the FHIR repository;
* (2) all the CodeSystem and ValueSet terminology resources need to be added to the corresponding resource provider.
* At the moment only placeholder CodeSystem/ValueSet resources are returned, with id and url populated. So far there
* has been no need to load the full resources, but that can be done if some logic requires it.
* This is a minimal setup;
* (3) the Remote Terminology operation responses needed for the test must be added to the corresponding
* resource provider. The intention is to record and reuse the responses of an actual terminology server,
* e.g. <a href="https://r4.ontoserver.csiro.au/fhir/">OntoServer</a>.
* This is done because unit tests cannot always catch bugs introduced by changes in OntoServer, the FHIR Validator
* library, or both.
* The responses are in Parameters resource format, where the issues parameter is an OperationOutcome resource.
* @see #setupValueSetValidateCode
* @see #setupCodeSystemValidateCode
*/
public class ValidateWithRemoteTerminologyTest extends BaseResourceProviderR4Test {
private static final FhirContext ourCtx = FhirContext.forR4Cached();
@RegisterExtension
protected static RestfulServerExtension ourRestfulServerExtension = new RestfulServerExtension(ourCtx);
private RemoteTerminologyServiceValidationSupport mySvc;
@Autowired
@Qualifier(JpaConfig.JPA_VALIDATION_SUPPORT_CHAIN)
private ValidationSupportChain myValidationSupportChain;
private IValidationProviders.MyValidationProvider<CodeSystem> myCodeSystemProvider;
private IValidationProviders.MyValidationProvider<ValueSet> myValueSetProvider;
@BeforeEach
public void before() {
String baseUrl = "http://localhost:" + ourRestfulServerExtension.getPort();
mySvc = new RemoteTerminologyServiceValidationSupport(ourCtx, baseUrl);
myValidationSupportChain.addValidationSupport(0, mySvc);
myCodeSystemProvider = new IValidationProvidersR4.MyCodeSystemProviderR4();
myValueSetProvider = new IValidationProvidersR4.MyValueSetProviderR4();
ourRestfulServerExtension.registerProvider(myCodeSystemProvider);
ourRestfulServerExtension.registerProvider(myValueSetProvider);
}
@AfterEach
public void after() {
myValidationSupportChain.removeValidationSupport(mySvc);
ourRestfulServerExtension.getRestfulServer().getInterceptorService().unregisterAllInterceptors();
ourRestfulServerExtension.unregisterProvider(myCodeSystemProvider);
ourRestfulServerExtension.unregisterProvider(myValueSetProvider);
}
@Test
public void validate_withProfileWithValidCodesFromAllBindingTypes_returnsNoErrors() {
// setup
final StructureDefinition profileEncounter = ClasspathUtil.loadResource(ourCtx, StructureDefinition.class, "validation/encounter/profile-encounter-custom.json");
myClient.update().resource(profileEncounter).execute();
final String statusCode = "planned";
final String classCode = "IMP";
final String identifierTypeCode = "VN";
final String statusSystem = "http://hl7.org/fhir/encounter-status"; // implied system
final String classSystem = "http://terminology.hl7.org/CodeSystem/v3-ActCode";
final String identifierTypeSystem = "http://terminology.hl7.org/CodeSystem/v2-0203";
setupValueSetValidateCode("http://hl7.org/fhir/ValueSet/encounter-status", "http://hl7.org/fhir/encounter-status", statusCode, "validation/encounter/validateCode-ValueSet-encounter-status.json");
setupValueSetValidateCode("http://terminology.hl7.org/ValueSet/v3-ActEncounterCode", "http://terminology.hl7.org/CodeSystem/v3-ActCode", classCode, "validation/encounter/validateCode-ValueSet-v3-ActEncounterCode.json");
setupValueSetValidateCode("http://hl7.org/fhir/ValueSet/identifier-type", "http://hl7.org/fhir/identifier-type", identifierTypeCode, "validation/encounter/validateCode-ValueSet-identifier-type.json");
setupCodeSystemValidateCode(statusSystem, statusCode, "validation/encounter/validateCode-CodeSystem-encounter-status.json");
setupCodeSystemValidateCode(classSystem, classCode, "validation/encounter/validateCode-CodeSystem-v3-ActCode.json");
setupCodeSystemValidateCode(identifierTypeSystem, identifierTypeCode, "validation/encounter/validateCode-CodeSystem-v2-0203.json");
Encounter encounter = new Encounter();
encounter.getMeta().addProfile("http://example.ca/fhir/StructureDefinition/profile-encounter");
// required binding
encounter.setStatus(Encounter.EncounterStatus.fromCode(statusCode));
// preferred binding
encounter.getClass_()
.setSystem(classSystem)
.setCode(classCode)
.setDisplay("inpatient encounter");
// extensible binding
encounter.addIdentifier()
.getType().addCoding()
.setSystem(identifierTypeSystem)
.setCode(identifierTypeCode)
.setDisplay("Visit number");
// execute
List<String> errors = getValidationErrors(encounter);
// verify
assertThat(errors).isEmpty();
}
@Test
public void validate_withInvalidCode_returnsErrors() {
// setup
final String statusCode = "final";
final String code = "10xx";
final String statusSystem = "http://hl7.org/fhir/observation-status";
final String loincSystem = "http://loinc.org";
final String system = "http://fhir.infoway-inforoute.ca/io/psca/CodeSystem/ICD9CM";
setupValueSetValidateCode("http://hl7.org/fhir/ValueSet/observation-status", statusSystem, statusCode, "validation/observation/validateCode-ValueSet-observation-status.json");
setupValueSetValidateCode("http://hl7.org/fhir/ValueSet/observation-codes", loincSystem, statusCode, "validation/observation/validateCode-ValueSet-codes.json");
setupCodeSystemValidateCode(statusSystem, statusCode, "validation/observation/validateCode-CodeSystem-observation-status.json");
setupCodeSystemValidateCode(system, code, "validation/observation/validateCode-CodeSystem-ICD9CM.json");
Observation obs = new Observation();
obs.setStatus(Observation.ObservationStatus.fromCode(statusCode));
obs.getCode().addCoding().setCode(code).setSystem(system);
// execute
List<String> errors = getValidationErrors(obs);
assertThat(errors).hasSize(1);
// verify
assertThat(errors.get(0))
.contains("Unknown code '10xx' in the CodeSystem 'http://fhir.infoway-inforoute.ca/io/psca/CodeSystem/ICD9CM");
}
@Test
public void validate_withProfileWithInvalidCode_returnsErrors() {
// setup
String profile = "http://example.ca/fhir/StructureDefinition/profile-procedure";
StructureDefinition profileProcedure = ClasspathUtil.loadResource(myFhirContext, StructureDefinition.class, "validation/procedure/profile-procedure.json");
myClient.update().resource(profileProcedure).execute();
final String statusCode = "completed";
final String procedureCode1 = "417005";
final String procedureCode2 = "xx417005";
final String statusSystem = "http://hl7.org/fhir/event-status";
final String snomedSystem = "http://snomed.info/sct";
setupValueSetValidateCode("http://hl7.org/fhir/ValueSet/event-status", statusSystem, statusCode, "validation/procedure/validateCode-ValueSet-event-status.json");
setupValueSetValidateCode("http://hl7.org/fhir/ValueSet/procedure-code", snomedSystem, procedureCode1, "validation/procedure/validateCode-ValueSet-procedure-code-valid.json");
setupValueSetValidateCode("http://hl7.org/fhir/ValueSet/procedure-code", snomedSystem, procedureCode2, "validation/procedure/validateCode-ValueSet-procedure-code-invalid.json");
setupCodeSystemValidateCode(statusSystem, statusCode, "validation/procedure/validateCode-CodeSystem-event-status.json");
setupCodeSystemValidateCode(snomedSystem, procedureCode1, "validation/procedure/validateCode-CodeSystem-snomed-valid.json");
setupCodeSystemValidateCode(snomedSystem, procedureCode2, "validation/procedure/validateCode-CodeSystem-snomed-invalid.json");
Procedure procedure = new Procedure();
procedure.setSubject(new Reference("Patient/P1"));
procedure.setStatus(Procedure.ProcedureStatus.fromCode(statusCode));
procedure.getCode().addCoding().setSystem(snomedSystem).setCode(procedureCode1);
procedure.getCode().addCoding().setSystem(snomedSystem).setCode(procedureCode2);
procedure.getMeta().addProfile(profile);
// execute
List<String> errors = getValidationErrors(procedure);
// TODO: there is currently some duplication in the errors returned. This needs to be investigated and fixed.
// assertThat(errors).hasSize(1);
// verify
// note that we're not selecting an explicit version (the latest is used), so the message verification does not include it.
assertThat(StringUtils.join("", errors))
.contains("Unknown code 'xx417005' in the CodeSystem 'http://snomed.info/sct'")
.doesNotContain("The provided code 'http://snomed.info/sct#xx417005' was not found in the value set 'http://hl7.org/fhir/ValueSet/procedure-code")
.doesNotContain("http://snomed.info/sct#417005");
}
@Test
public void validate_withProfileWithSlicingWithValidCode_returnsNoErrors() {
// setup
String profile = "http://example.ca/fhir/StructureDefinition/profile-procedure-with-slicing";
StructureDefinition profileProcedure = ClasspathUtil.loadResource(myFhirContext, StructureDefinition.class, "validation/procedure/profile-procedure-slicing.json");
myClient.update().resource(profileProcedure).execute();
final String statusCode = "completed";
final String procedureCode = "no-procedure-info";
final String statusSystem = "http://hl7.org/fhir/event-status";
final String snomedSystem = "http://snomed.info/sct";
final String absentUnknownSystem = "http://hl7.org/fhir/uv/ips/CodeSystem/absent-unknown-uv-ips";
setupValueSetValidateCode("http://hl7.org/fhir/ValueSet/event-status", statusSystem, statusCode, "validation/procedure/validateCode-ValueSet-event-status.json");
setupValueSetValidateCode("http://hl7.org/fhir/ValueSet/procedure-code", snomedSystem, procedureCode, "validation/procedure/validateCode-ValueSet-procedure-code-invalid-slice.json");
setupValueSetValidateCode("http://hl7.org/fhir/uv/ips/ValueSet/absent-or-unknown-procedures-uv-ips", absentUnknownSystem, procedureCode, "validation/procedure/validateCode-ValueSet-absent-or-unknown-procedure.json");
setupCodeSystemValidateCode(statusSystem, statusCode, "validation/procedure/validateCode-CodeSystem-event-status.json");
setupCodeSystemValidateCode(absentUnknownSystem, procedureCode, "validation/procedure/validateCode-CodeSystem-absent-or-unknown.json");
Procedure procedure = new Procedure();
procedure.setSubject(new Reference("Patient/P1"));
procedure.setStatus(Procedure.ProcedureStatus.fromCode(statusCode));
procedure.getCode().addCoding().setSystem(absentUnknownSystem).setCode(procedureCode);
procedure.getMeta().addProfile(profile);
// execute
List<String> errors = getValidationErrors(procedure);
assertThat(errors).hasSize(0);
}
private void setupValueSetValidateCode(String theUrl, String theSystem, String theCode, String theTerminologyResponseFile) {
ValueSet valueSet = myValueSetProvider.addTerminologyResource(theUrl);
myCodeSystemProvider.addTerminologyResource(theSystem);
myValueSetProvider.addTerminologyResponse(OPERATION_VALIDATE_CODE, valueSet.getUrl(), theCode, ourCtx, theTerminologyResponseFile);
// we currently do this because VersionSpecificWorkerContextWrapper has logic to infer the system when missing
// based on the ValueSet by calling ValidationSupportUtils#extractCodeSystemForCode.
valueSet.getCompose().addInclude().setSystem(theSystem);
// you will notice each of these calls also requires a call to setupCodeSystemValidateCode;
// that is necessary because VersionSpecificWorkerContextWrapper#validateCodeInValueSet
// also attempts a validateCode against the CodeSystem after the validateCode against the ValueSet
}
private void setupCodeSystemValidateCode(String theUrl, String theCode, String theTerminologyResponseFile) {
CodeSystem codeSystem = myCodeSystemProvider.addTerminologyResource(theUrl);
myCodeSystemProvider.addTerminologyResponse(OPERATION_VALIDATE_CODE, codeSystem.getUrl(), theCode, ourCtx, theTerminologyResponseFile);
}
private List<String> getValidationErrors(IBaseResource theResource) {
MethodOutcome resultProcedure = myClient.validate().resource(theResource).execute();
OperationOutcome operationOutcome = (OperationOutcome) resultProcedure.getOperationOutcome();
return operationOutcome.getIssue().stream()
.filter(issue -> issue.getSeverity() == OperationOutcome.IssueSeverity.ERROR)
.map(OperationOutcome.OperationOutcomeIssueComponent::getDiagnostics)
.toList();
}
}
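For illustration only, here is a minimal sketch of how a further test case could be added to this class following the three steps described in the class javadoc, reusing the setup helpers above. The profile path, ValueSet/CodeSystem URLs, code and response fixture paths below are placeholders, not files or URLs that exist in the repository:

	@Test
	public void validate_withHypotheticalProfile_returnsNoErrors() {
		// (1) store the custom profile, if one is needed (classpath file is a placeholder)
		StructureDefinition profile = ClasspathUtil.loadResource(myFhirContext, StructureDefinition.class, "validation/example/profile-example.json");
		myClient.update().resource(profile).execute();

		// (2) + (3) register placeholder terminology resources and recorded validate-code responses
		// (URLs, code and fixture paths are placeholders)
		setupValueSetValidateCode("http://example.org/fhir/ValueSet/example", "http://example.org/fhir/CodeSystem/example", "some-code", "validation/example/validateCode-ValueSet-example.json");
		setupCodeSystemValidateCode("http://example.org/fhir/CodeSystem/example", "some-code", "validation/example/validateCode-CodeSystem-example.json");

		// build the resource under test and collect validation errors
		Observation obs = new Observation();
		obs.setStatus(Observation.ObservationStatus.FINAL);
		obs.getCode().addCoding().setSystem("http://example.org/fhir/CodeSystem/example").setCode("some-code");

		List<String> errors = getValidationErrors(obs);
		assertThat(errors).isEmpty();
	}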


@ -0,0 +1,49 @@
{
"resourceType": "StructureDefinition",
"id": "profile-encounter",
"url": "http://example.ca/fhir/StructureDefinition/profile-encounter",
"version": "0.11.0",
"name": "EncounterProfile",
"title": "Encounter Profile",
"status": "active",
"date": "2022-10-15T12:00:00+00:00",
"publisher": "Example Organization",
"fhirVersion": "4.0.1",
"kind": "resource",
"abstract": false,
"type": "Encounter",
"baseDefinition": "http://hl7.org/fhir/StructureDefinition/Encounter",
"derivation": "constraint",
"differential": {
"element": [
{
"id": "Encounter.identifier.type.coding",
"path": "Encounter.identifier.type.coding",
"min": 1,
"max": "1",
"mustSupport": true
},
{
"id": "Encounter.identifier.type.coding.system",
"path": "Encounter.identifier.type.coding.system",
"min": 1,
"fixedUri": "http://terminology.hl7.org/CodeSystem/v2-0203",
"mustSupport": true
},
{
"id": "Encounter.identifier.type.coding.code",
"path": "Encounter.identifier.type.coding.code",
"min": 1,
"fixedCode": "VN",
"mustSupport": true
},
{
"id": "Encounter.identifier.type.coding.display",
"path": "Encounter.identifier.type.coding.display",
"min": 1,
"fixedString": "Visit number",
"mustSupport": true
}
]
}
}


@ -0,0 +1,59 @@
{
"resourceType": "Parameters",
"parameter": [
{
"name": "result",
"valueBoolean": true
},
{
"name": "code",
"valueCode": "planned"
},
{
"name": "system",
"valueUri": "http://hl7.org/fhir/encounter-status"
},
{
"name": "version",
"valueString": "5.0.0-ballot"
},
{
"name": "display",
"valueString": "Planned"
},
{
"name": "issues",
"resource": {
"resourceType": "OperationOutcome",
"issue": [
{
"severity": "information",
"code": "business-rule",
"details": {
"coding": [
{
"system": "http://hl7.org/fhir/tools/CodeSystem/tx-issue-type",
"code": "status-check"
}
],
"text": "Reference to trial-use CodeSystem http://hl7.org/fhir/encounter-status|5.0.0-ballot"
}
},
{
"severity": "information",
"code": "business-rule",
"details": {
"coding": [
{
"system": "http://hl7.org/fhir/tools/CodeSystem/tx-issue-type",
"code": "status-check"
}
],
"text": "Reference to draft CodeSystem http://hl7.org/fhir/encounter-status|5.0.0-ballot"
}
}
]
}
}
]
}


@ -0,0 +1,25 @@
{
"resourceType": "Parameters",
"parameter": [
{
"name": "result",
"valueBoolean": true
},
{
"name": "code",
"valueCode": "VN"
},
{
"name": "system",
"valueUri": "http://terminology.hl7.org/CodeSystem/v2-0203"
},
{
"name": "version",
"valueString": "3.0.0"
},
{
"name": "display",
"valueString": "Visit number"
}
]
}


@ -0,0 +1,46 @@
{
"resourceType": "Parameters",
"parameter": [
{
"name": "result",
"valueBoolean": true
},
{
"name": "code",
"valueCode": "IMP"
},
{
"name": "system",
"valueUri": "http://terminology.hl7.org/CodeSystem/v3-ActCode"
},
{
"name": "version",
"valueString": "2018-08-12"
},
{
"name": "display",
"valueString": "inpatient encounter"
},
{
"name": "issues",
"resource": {
"resourceType": "OperationOutcome",
"issue": [
{
"severity": "information",
"code": "business-rule",
"details": {
"coding": [
{
"system": "http://hl7.org/fhir/tools/CodeSystem/tx-issue-type",
"code": "status-check"
}
],
"text": "Reference to draft CodeSystem http://terminology.hl7.org/CodeSystem/v3-ActCode|2018-08-12"
}
}
]
}
}
]
}


@ -0,0 +1,59 @@
{
"resourceType": "Parameters",
"parameter": [
{
"name": "result",
"valueBoolean": true
},
{
"name": "code",
"valueCode": "planned"
},
{
"name": "system",
"valueUri": "http://hl7.org/fhir/encounter-status"
},
{
"name": "version",
"valueString": "5.0.0-ballot"
},
{
"name": "display",
"valueString": "Planned"
},
{
"name": "issues",
"resource": {
"resourceType": "OperationOutcome",
"issue": [
{
"severity": "information",
"code": "business-rule",
"details": {
"coding": [
{
"system": "http://hl7.org/fhir/tools/CodeSystem/tx-issue-type",
"code": "status-check"
}
],
"text": "Reference to trial-use CodeSystem http://hl7.org/fhir/encounter-status|5.0.0-ballot"
}
},
{
"severity": "information",
"code": "business-rule",
"details": {
"coding": [
{
"system": "http://hl7.org/fhir/tools/CodeSystem/tx-issue-type",
"code": "status-check"
}
],
"text": "Reference to draft CodeSystem http://hl7.org/fhir/encounter-status|5.0.0-ballot"
}
}
]
}
}
]
}


@ -0,0 +1,52 @@
{
"resourceType": "Parameters",
"parameter": [
{
"name": "result",
"valueBoolean": false
},
{
"name": "code",
"valueCode": "VN"
},
{
"name": "system",
"valueUri": "http://terminology.hl7.org/CodeSystem/v2-0203"
},
{
"name": "version",
"valueString": "3.0.0"
},
{
"name": "issues",
"resource": {
"resourceType": "OperationOutcome",
"issue": [
{
"severity": "error",
"code": "code-invalid",
"details": {
"coding": [
{
"system": "http://hl7.org/fhir/tools/CodeSystem/tx-issue-type",
"code": "not-in-vs"
}
],
"text": "The provided code 'http://terminology.hl7.org/CodeSystem/v2-0203#VN' was not found in the value set 'http://hl7.org/fhir/ValueSet/identifier-type|5.0.0-ballot'"
},
"location": [
"code"
],
"expression": [
"code"
]
}
]
}
},
{
"name": "message",
"valueString": "The provided code 'http://terminology.hl7.org/CodeSystem/v2-0203#VN' was not found in the value set 'http://hl7.org/fhir/ValueSet/identifier-type|5.0.0-ballot'"
}
]
}


@ -0,0 +1,59 @@
{
"resourceType": "Parameters",
"parameter": [
{
"name": "result",
"valueBoolean": true
},
{
"name": "code",
"valueCode": "IMP"
},
{
"name": "system",
"valueUri": "http://terminology.hl7.org/CodeSystem/v3-ActCode"
},
{
"name": "version",
"valueString": "2018-08-12"
},
{
"name": "display",
"valueString": "inpatient encounter"
},
{
"name": "issues",
"resource": {
"resourceType": "OperationOutcome",
"issue": [
{
"severity": "information",
"code": "business-rule",
"details": {
"coding": [
{
"system": "http://hl7.org/fhir/tools/CodeSystem/tx-issue-type",
"code": "status-check"
}
],
"text": "Reference to trial-use ValueSet http://terminology.hl7.org/ValueSet/v3-ActEncounterCode|2014-03-26"
}
},
{
"severity": "information",
"code": "business-rule",
"details": {
"coding": [
{
"system": "http://hl7.org/fhir/tools/CodeSystem/tx-issue-type",
"code": "status-check"
}
],
"text": "Reference to draft CodeSystem http://terminology.hl7.org/CodeSystem/v3-ActCode|2018-08-12"
}
}
]
}
}
]
}


@ -0,0 +1,48 @@
{
"resourceType": "Parameters",
"parameter": [
{
"name": "result",
"valueBoolean": false
},
{
"name": "code",
"valueCode": "10xx"
},
{
"name": "system",
"valueUri": "http://fhir.infoway-inforoute.ca/io/psca/CodeSystem/ICD9CM"
},
{
"name": "issues",
"resource": {
"resourceType": "OperationOutcome",
"issue": [
{
"severity": "error",
"code": "code-invalid",
"details": {
"coding": [
{
"system": "http://hl7.org/fhir/tools/CodeSystem/tx-issue-type",
"code": "invalid-code"
}
],
"text": "Unknown code '10xx' in the CodeSystem 'http://fhir.infoway-inforoute.ca/io/psca/CodeSystem/ICD9CM' version '0.1.0'"
},
"location": [
"code"
],
"expression": [
"code"
]
}
]
}
},
{
"name": "message",
"valueString": "Unknown code '10xx' in the CodeSystem 'http://fhir.infoway-inforoute.ca/io/psca/CodeSystem/ICD9CM' version '0.1.0'"
}
]
}

Some files were not shown because too many files have changed in this diff.