6.6.0 Mergeback (#4924)

* Force Verify tests

* fix ITs (#4809)

* fix RestHookTestR5IT

* fix intermittent

---------

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* Fix migrator error on Oracle (#4814)

* Fix Oracle SQL error

* Add changelog

* Update clinical reasoning version (#4816)

* Update clinical reasoning version

* Update version

* Update version

* Clean-up and more wireup of evaluationSettings

* Add changelog

---------

Co-authored-by: Jonathan Percival <jonathan.i.percival@gmail.com>

* Opening the care-gaps endpoint for GET. (#4823)

Co-authored-by: Chalma Maadaadi <chalma@alphora.com>

* added version to mdm golden resource tag (#4820)

Co-authored-by: Long Ma <long@smilecdr.com>

* Update the changelog for 4697 to be more descriptive (#4827)

* Update the changelog for 4697 to be more descriptive

* Futher tweaks of the changelog

* Fixes a bug with tags.  (#4813)

* Test, fix

* Drop constraint, add migration

* Add changelog

* Fix userSelected null vs false

* Fix merge

* Fix up checkstyle whining

* One more failure

* Fix test

* wip

* changelog clarity

Co-authored-by: James Agnew <jamesagnew@gmail.com>

* change index

---------

Co-authored-by: Michael Buckley <michaelabuckley@gmail.com>
Co-authored-by: James Agnew <jamesagnew@gmail.com>

* fix migration issue (#4830)

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* Create correct version enum

* Remove superfluous migration

* fixing test (#4835)

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* email subscription, throw NullPointerException (#4790)

* fix bug

* Favoring constructor initialization to autowiring.

* enhancing test.

* Making class LoggingEmailSender available outside of the hapi-fhir-japserver-uhnfhirtest module.

* Passing all tests.

* adding changelog.

* Bumping version to 6.5.20-SNAPSHOT

* addressing code review comment.

---------

Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* Add docs for CR operations (#4855)

* Add docs for CR operations

* Correct changelog and javadoc for $package

* Add documentation for $apply parameters

* Add additional documentation for $package

* Cleanup

* Cleanup

* Cleanup

* Address review comments

* Add documentation for $care-gaps operation. (#4862)

* Add documentation for -gaps.

* addressing the comments.

---------

Co-authored-by: Chalma Maadaadi <chalma@alphora.com>

* 4853 validation does not error when display is not the same as the display defined in the codesystem 2 (#4854)

* added failing test

* implemented the solution

* changed test name

* added change log

* Update hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/6_6_0/4853-validation-does-not-error-when-display-is-not-the-same-as-the-display-defined-in-the-codesystem-2.yaml

Co-authored-by: James Agnew <jamesagnew@gmail.com>

---------

Co-authored-by: Steven Li <steven@smilecdr.com>
Co-authored-by: James Agnew <jamesagnew@gmail.com>

* fixing patient everything operator (#4845)

* fixing patient everything operator

* review fix

---------

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* fix link

* Move image file

* Bundle resources containing over 100 references to the same Organization will fail with HAPI-2207 (#4871)

* Add failing unit test.

* Fix JpaId Long equality comparison to use ! equals() instead of !=, which fails for different instances of the same Long value.

* Add changelog.

* added warn message and test (#4848)

* added warn message and test

* code review fixes

---------

Co-authored-by: Long Ma <long@smilecdr.com>

* Issue 4804 full table scan on mpi link during mdm clear (#4805)

* version bump for next release  (#4793)

* version bump

* Bump to correctnumber

* Version Enum and folder

* Remove interim from list

* wip

* Fix operation on nested type-choices in FhirPatch implementation (#4783)

* Fix operation on nested type-choices in FhirPatch implementation

* Add credit for #4783

---------

Co-authored-by: James Agnew <jamesagnew@gmail.com>

* #4468 fix previous link offset no cache pagination (#4489)

* #4468 Add test reproducing the issue

* #4468 Fix previous link for no cache offset pagination

* #4468 Use unchecked URI parsing

* Credit for #4489

---------

Co-authored-by: James Agnew <jamesagnew@gmail.com>

* Changelog and data generating test

* Add MdmLink index

* Avoid double link deletion

* Use ThreadLocal safely

---------

Co-authored-by: Tadgh <garygrantgraham@gmail.com>
Co-authored-by: Zach Smith <85943952+zachdoctolib@users.noreply.github.com>
Co-authored-by: James Agnew <jamesagnew@gmail.com>
Co-authored-by: Aleksej Parovysnik <100864000+alparodev@users.noreply.github.com>
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>

* Fix erroneous batch 2 $export 75% complete count when the job is COMPLETE (#4859)

* Add failing unit test.

* Add conditional logic to the InstanceProgress progress percentage to disregard the incomplete count if this is called from the reduction step.  This is to get around a race condition in which a work chunk is QUEUED and not yet complete when the reduction step calculates the progress.

* Add final.

* Add changelog.

* disable wars (#4877)

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* 4868 fix paging hapi (#4870)

* fixing some offset and adding a test

* fixing the offset paging

* Removing duplicate

---------

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
Co-authored-by: Aleksej Parovysnik <100864000+alparodev@users.noreply.github.com>

* 4875-binary-access-write-doest-trigger-STORAGE-BINARY-ASSIGN-BLOB-ID-PREFIX-pointcut (#4876)

* Add failing test

* Add failing test

* Fix and changelog

* Pass content type parameter

* Back to auto wiring the context

* Invoke interceptor only when getting blobId, not also when storing it

* Avoid breaking implementers

* Address review comment

* Add new exception Msg code

* Fix broken test

---------

Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>

* Fix batch job (bulk export) processed record count (#4879)

* Remove racy stats recalc.

* Throw 404 when requesting $export of non-existent Group or Patient (#4890)

* Remove default implementation intended only for interim backwards compatibility (#4894)

Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>

* Rule apply patient export (#4893)

* Test, fix, and changelog

* Better partition resolution

* Add checks based on rule applier

* Fix ordering failure due to hash set

* Allow empty auth interceptor

* Fix up operation type on invocation

* Add more tests, make hack implementation for patient instance level operation

* Tighten test name

* Changelog

* Default method

* remove dead method

* Remove dead autowire

---------

Co-authored-by: Michael Buckley <michaelabuckley@gmail.com>

* cve pom changes (#4898)

Co-authored-by: Long Ma <long@smilecdr.com>

* backport subscription topic bean cleanup (#4904)

* 4891 bulk export do not recurse unasked for resources (#4895)

* updating tests

* fixing bulk export to not fetch resources not requested

* cleanup

* cleanup

* more warning suppressing

* adding error code

* blah

* fix test

* review fixes

---------

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* lowers log level to remove bootup noise (#4908)

* CVE rel 6 6 (#4907)

* cve pom changes

* bump javax.el to jakarta.el

---------

Co-authored-by: Long Ma <long@smilecdr.com>

* Issue 4905 post binary failure invoking interceptor for pointcuts storage preshow resources (#4906)

* Initial failing test

* Avoid applying binary blob id prefix multiple times

* Remove recently introduced method not needed anymore

---------

Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>

* Enhance LogbackCaptureTestExtension (#4869)

* repro bug with test, fix bug

* ken informed me he resolved this bug on master, so i'm switching to use his solution

* disable wars

* review feedback

* review feedback

* review feedback again

---------

Co-authored-by: josie <josie.vandewetering@smilecdr.com>
Co-authored-by: Ken Stevens <ken@smilecdr.com>

* Resolve 4863 from release branch searchparametercanonicalizer does not account for search parameters for custom resources types when converting dstu23 into runtimesearchparam (#4887)

* Modified canonicalizeSearchParameterDstu2 and 3, now correctly detect search parameters for custom resources

* Canonicalizers now correctly handle search parameters for custom resources

* created changelog

* Modification based on comments:
- remove Resource from target field when there are custom resource types
- fixed changelog typo
- removed unnecessary variable providesMembershipInCompartments

* Added tests for the SearchParameterCanonicalizer to test if base and target of RuntimeSearchParam is set as expected for DSTU2, DSTU3, R4, R4B, and R5 resources

* Fixed typo and removed commented code

* re-ordered init methods

* Update changelog

Co-authored-by: Tadgh <garygrantgraham@gmail.com>

* modifications following first code review.

---------

Co-authored-by: Tadgh <garygrantgraham@gmail.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* License

* Remove _lastUpdated filtering of _revincludes. (#4899)

Remove _lastUpdated filtering of _revincludes.

* 4910-dm-migration-error-for-oracle-19c (#4916)

* Remove all_constraints references which break in oracle 19c

* Add changelog

---------

Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>

* 4873 empty fhirid causes missing resource (#4874)

* add check for empty fhirid string and add test

* add test for populateid

* changelog

* version bump

* version bump

* reverse version bump

* Back to 6.5.21-SNAPSHOT.

---------

Co-authored-by: justindar <justin.dar@smilecdr.com>
Co-authored-by: Luke deGruchy <luke.degruchy@smilecdr.com>

* Fix include canonical url performance (#4919)

Use hash_identity for canonical join

* License

* Version bump

* Fix failure in test

* Licenses

* Review comments for pipeline

* Dead entry

* other typo

---------

Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: James Agnew <jamesagnew@gmail.com>
Co-authored-by: Brenin Rhodes <brenin@alphora.com>
Co-authored-by: Jonathan Percival <jonathan.i.percival@gmail.com>
Co-authored-by: chalmarm <44471040+chalmarm@users.noreply.github.com>
Co-authored-by: Chalma Maadaadi <chalma@alphora.com>
Co-authored-by: longma1 <32119004+longma1@users.noreply.github.com>
Co-authored-by: Long Ma <long@smilecdr.com>
Co-authored-by: Michael Buckley <michaelabuckley@gmail.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
Co-authored-by: Sam Gunter <123124187+samguntersmilecdr@users.noreply.github.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
Co-authored-by: StevenXLi <stevenli_8118@hotmail.com>
Co-authored-by: Steven Li <steven@smilecdr.com>
Co-authored-by: Luke deGruchy <luke.degruchy@smilecdr.com>
Co-authored-by: jmarchionatto <60409882+jmarchionatto@users.noreply.github.com>
Co-authored-by: Zach Smith <85943952+zachdoctolib@users.noreply.github.com>
Co-authored-by: Aleksej Parovysnik <100864000+alparodev@users.noreply.github.com>
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
Co-authored-by: Josie <80289977+pepsiofficial@users.noreply.github.com>
Co-authored-by: josie <josie.vandewetering@smilecdr.com>
Co-authored-by: TynerGjs <132295567+TynerGjs@users.noreply.github.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: justindar <justin.dar@smilecdr.com>
Tadgh 2023-05-20 23:38:35 -04:00 committed by GitHub
parent 307e52f88f
commit 805e80e61f
117 changed files with 7088 additions and 556 deletions

View File

@ -1,4 +1,4 @@
 ---
 type: add
 issue: 4697
-title: "Added R4 support for Questionnaire/$prepopulate and PlanDefinition/$package operations. These are operations are intended to support extended DaVinci DTR and SDC uses cases."
+title: "Added R4 support for Questionnaire/$prepopulate, Questionnaire/$package and PlanDefinition/$package operations. These are operations are intended to support extended DaVinci DTR and SDC uses cases."

View File

@ -0,0 +1,6 @@
---
type: fix
issue: 4789
title: "Previously, there was the possibility for a race condition to happen in the initialization
of the email subscription processing component that would result in email not being sent out. This
issue has been fixed."

View File

@ -0,0 +1,5 @@
---
type: fix
issue: 4804
jira: SMILE-5145
title: "Improved performance of `mdm-clear` operation by adding index and avoiding redundant deletion."

View File

@ -0,0 +1,11 @@
---
type: fix
issue: 4844
title: "/Patient/{patientid}/$everything?_type={resource types}
would omit resources that were not directly related to the Patient
resource (even if those resources were specified in the _type list).
This was in conflict with /Patient/{patientid}/$everything operation,
which did return said resources.
This has been fixed so both return all related resources, even if
those resources are not directly related to the Patient resource.
"

View File

@ -0,0 +1,5 @@
---
type: fix
issue: 4846
title: "Previously, the job maintenance service would throw an exception when it encountered an unknown job definition, which prevented maintenance from running on every job instance after it.
Now maintenance will skip over unknown job definitions and log a warning message indicating that a job definition is missing."

View File

@ -0,0 +1,5 @@
---
type: fix
issue: 4853
title: "Previously, when validating resources that contained a display in a Coding/CodeableConcept different from the
display defined in the referenced CodeSystem, no error was returned in the outcome. This is now fixed."

View File

@ -0,0 +1,5 @@
---
type: fix
issue: 4860
title: "Previously, an $export that completed successfully could report a progress percentage of less than 100%.
This has now been fixed."

View File

@ -0,0 +1,4 @@
---
type: add
issue: 4861
title: "Add documentation for $care-gaps operation"

View File

@ -0,0 +1,5 @@
---
type: fix
issue: 4863
title: "Previously, the SearchParameterCanonicalizer did not correctly convert SearchParameters for DSTU2 and DSTU3 custom
resources into RuntimeSearchParams. This is now fixed."

View File

@ -0,0 +1,5 @@
---
type: fix
issue: 4872
title: "Previously, POSTing a Bundle with over 100 references to the same resource would fail with HAPI-2207 'Multiple resources match'.
This has been fixed."

View File

@ -0,0 +1,4 @@
---
type: fix
issue: 4873
title: "Previously, if the fhirId in ResourceTable happened to be set to an empty string, the resourceId would be missing when trying to generate the full ID string. This has now been fixed."

View File

@ -0,0 +1,5 @@
---
type: fix
issue: 4875
title: "Previously, the `$binary-access-write` operation did not trigger the `STORAGE_BINARY_ASSIGN_BLOB_ID_PREFIX` pointcut.
This has been fixed."

View File

@ -0,0 +1,5 @@
---
type: fix
issue: 4886
title: "Requests to start an $export of Patient or Group will now fail with 404 ResourceNotFound when the target
resources do not exist. Before, the system would start a bulk export background job which would then fail."

View File

@ -0,0 +1,7 @@
---
type: fix
issue: 4891
title: "Initiating a bulk export with a _type filter would sometimes return
resource types not specified in the filter.
This has been fixed.
"

View File

@ -0,0 +1,4 @@
---
type: fix
issue: 4893
title: "Update the IRuleBuilder to support Patient Export rules via the new `patientExportOnPatient` method on the IRuleBuilder. Previously, it was accidentally using Group Export rules."

View File

@ -0,0 +1,4 @@
---
type: fix
issue: 4910
title: "Removed some references to the `all_constraints` table in Oracle database migration tasks, which were causing errors with version 19c."

View File

@ -0,0 +1,4 @@
---
type: perf
issue: 4915
title: "Includes by canonical url now use an indexed query, and are much faster."

View File

@ -0,0 +1,4 @@
---
type: fix
issue: 4878
title: "Batch jobs occasionally reported a processed record count of zero (0). This has been corrected."

View File

@ -0,0 +1,4 @@
---
type: fix
issue: 4896
title: "The _lastUpdated query parameter is no longer applied to _include or _revinclude search results."

View File

@ -0,0 +1,73 @@
# Care Gaps
## Overview
A gap in care refers to a discrepancy or gap in a patient's care that has been identified through analysis of their medical records, history, and current health status.
These gaps can include missing or incomplete information, unmet health needs, and opportunities for preventative care or intervention. Identifying and addressing care gaps can help improve the quality of care provided to patients, reduce healthcare costs, and ultimately lead to better health outcomes.
Example: "This woman was supposed to have a breast cancer screening this year but did not. Let's reach out to her and get that scheduled."
A Gaps in Care Report is designed to communicate actual or perceived gaps in care between systems, such as the payer's system and the provider's EMR. The report provides opportunities for providers to deliver missing care and/or to communicate care provision data to payers. The report may also provide information on upcoming care opportunities, i.e., prospective gaps.
The gaps in care flow is between a provider and a measurement organization's system performing analytics.
<a href="/hapi-fhir/docs/images/caregapsflow.png"><img src="/hapi-fhir/docs/images/caregapsflow.png" alt="Care Gaps Flow" style="margin-left: 15px; margin-bottom: 15px;" /></a><sub><sup>Sourced from [Implementation Guide](http://hl7.org/fhir/us/davinci-deqm/2023Jan/gaps-in-care-reporting.html)</sup></sub>
The Gaps in Care Reporting uses the [DEQM Individual MeasureReport Profile](http://hl7.org/fhir/us/davinci-deqm/2023Jan/StructureDefinition-indv-measurereport-deqm.html). This allows the Gaps in Care Reporting to use the same machinery as the Individual Reporting to calculate measures and represent the results of individual calculation.
The following resources are used in the Gaps in Care Reporting Scenario:
| Report Type | Profile Name | Link to Profile |
|---------------|:---------------------------------------:|-----------------------------------------------------------------------------------------------------------------------------------|
| Bundle | DEQM Gaps In Care Bundle Profile | [DEQM Gaps In Care Bundle Profile](http://hl7.org/fhir/us/davinci-deqm/2023Jan/StructureDefinition-gaps-bundle-deqm.html) |
| Composition | DEQM Gaps In Care Composition Profile | [DEQM Gaps In Care Composition Profile](http://hl7.org/fhir/us/davinci-deqm/2023Jan/StructureDefinition-gaps-composition-deqm.html) |
| DetectedIssue | DEQM Gaps In Care DetectedIssue Profile | [DEQM Gaps In Care Detected Profile](http://hl7.org/fhir/us/davinci-deqm/2023Jan/StructureDefinition-gaps-detectedissue-deqm.html) |
| Group | DEQM Gaps In Care Group Profile | [DEQM Gaps In Care Group Profile](http://hl7.org/fhir/us/davinci-deqm/2023Jan/StructureDefinition-gaps-group-deqm.html) |
| MeasureReport | DEQM Gaps In Care MeasureReport Profile | [DEQM Gaps In Care MeasureReport Profile](http://hl7.org/fhir/us/davinci-deqm/2023Jan/StructureDefinition-indv-measurereport-deqm.html) |
## Gaps in Care Reporting
[Gaps through period](http://hl7.org/fhir/us/davinci-deqm/2023Jan/index.html#glossary) is the time period defined by a Client for running the Gaps in Care Report.
* When the [gaps through period](http://hl7.org/fhir/us/davinci-deqm/2023Jan/index.html#glossary) ends on a date that is in the future, the Gaps in Care Reporting is said to look for care gaps prospectively. In this scenario, it provides providers with opportunities to assess anticipated [open gaps](http://build.fhir.org/ig/HL7/davinci-deqm/index.html#glossary) and take proper actions to close the gaps.
* When the [gaps through period](http://hl7.org/fhir/us/davinci-deqm/2023Jan/index.html#glossary) ends on a date that is in the past, the Gaps in Care Reporting is said to look for care gaps retrospectively. In the retrospective scenario, identified [open gaps](http://build.fhir.org/ig/HL7/davinci-deqm/index.html#glossary) can no longer be acted upon to meet the quality measure.
| Use Case | care-gaps Operation | Gaps Through Period Start Date | Gaps Through Period End Date | Report Calculated Date | Colorectal Cancer Screening - Colonoscopy Date | Gaps in Care Report |
|---------------|:---------------------------------------:|---------------------------------------------------------------------------------------------------------------------------------------|------------------------------|------------------------|------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Prospective Use Case | $care-gaps?periodStart=2021-01-01&periodEnd=2021-06-30&subject=Patient/123&measureId=EXM130-7.3.000&status=open-gap | 2021-01-01 | 2021-06-30 | 2021-04-01 | Example: patient had colonoscopy on 2011-05-03 | Returns gaps through 2021-06-30. The Gaps in Care Report indicates the patient has an [open gap](http://build.fhir.org/ig/HL7/davinci-deqm/index.html#glossary) for the colorectal cancer screening measure, since by 2021-06-30 the colonoscopy would be over 10 years old. |
| Retrospective Use Case | $care-gaps?periodStart=2020-01-01&periodEnd=2020-12-31&subject=Patient/123&measureId=EXM130-7.3.000&status=open-gap | 2020-01-01 | 2020-12-31 | 2021-04-01 | Example: patient had colonoscopy on 2011-05-03 | Returns gaps through 2020-12-31. The Gaps in Care Report indicates the patient has a [closed gap](http://build.fhir.org/ig/HL7/davinci-deqm/index.html#glossary) for the colorectal cancer screening measure, since as of 2020-12-31 the procedure occurred within the specified 10-year timeframe. |
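The open-versus-closed determination illustrated in the table above boils down to a date-window check. A minimal Python sketch of that logic (for illustration only: the real logic is defined by the measure's CQL, and the function name here is hypothetical):

```python
from datetime import date

def colonoscopy_gap_status(colonoscopy_date: date, gaps_through_end: date,
                           lookback_years: int = 10) -> str:
    """Return 'closed-gap' if the colonoscopy falls within the lookback
    window ending at the gaps-through period end, else 'open-gap'.
    Simplified illustration; the actual measure logic lives in CQL."""
    try:
        window_start = gaps_through_end.replace(year=gaps_through_end.year - lookback_years)
    except ValueError:  # handle a Feb 29 period end
        window_start = gaps_through_end.replace(month=2, day=28,
                                                year=gaps_through_end.year - lookback_years)
    return "closed-gap" if window_start <= colonoscopy_date <= gaps_through_end else "open-gap"

# Prospective run: by 2021-06-30 the 2011-05-03 colonoscopy is over 10 years old.
print(colonoscopy_gap_status(date(2011, 5, 3), date(2021, 6, 30)))   # open-gap
# Retrospective run: on 2020-12-31 it is still within the 10-year window.
print(colonoscopy_gap_status(date(2011, 5, 3), date(2020, 12, 31)))  # closed-gap
```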
## Operations
HAPI FHIR implements the [$care-gaps](http://hl7.org/fhir/us/davinci-deqm/2023Jan/OperationDefinition-care-gaps.html) operation.
## Care Gaps
The `$care-gaps` operation is used to run a Gaps in Care Report.
### Testing care gaps on HAPI FHIR
HAPI FHIR supports the `$care-gaps` operation. The steps below identify an open gap on sample data and then, after a remediation step, generate a report showing the gap as closed.
All the sample files used below are available in the [hapi-fhir](https://github.com/hapifhir/hapi-fhir/tree/master/hapi-fhir-storage-cr/src/test/resources) codebase under the resources folder.
1. Submit payer content
```bash
POST http://localhost/fhir/ CaregapsColorectalCancerScreeningsFHIR-bundle.json
```
2. Submit payer org data
```bash
POST http://localhost/fhir/ CaregapsAuthorAndReporter.json
```
3. Submit provider data
```bash
POST http://localhost/fhir/Measure/ColorectalCancerScreeningsFHIR/$submit-data CaregapsPatientData.json
```
4. The provider runs the `$care-gaps` operation to identify the open gap.
```bash
GET http://localhost/fhir/Measure/$care-gaps?periodStart=2020-01-01&periodEnd=2020-12-31&status=open-gap&status=closed-gap&subject=Patient/end-to-end-EXM130&measureId=ColorectalCancerScreeningsFHIR
```
5. The provider fixes the gap.
```bash
POST http://localhost/fhir/Measure/ColorectalCancerScreeningsFHIR/$submit-data CaregapsSubmitDataCloseGap.json
```
6. The provider runs the `$care-gaps` operation again to verify the gap is closed.
```bash
GET http://localhost/fhir/Measure/$care-gaps?periodStart=2020-01-01&periodEnd=2020-12-31&status=open-gap&status=closed-gap&subject=Patient/end-to-end-EXM130&measureId=ColorectalCancerScreeningsFHIR
```
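Query strings like the ones in steps 4 and 6 are easy to get wrong by hand. A small helper for assembling the `$care-gaps` request URL (a convenience sketch: the helper name is hypothetical, and the base URL should be replaced with your server's):

```python
from urllib.parse import urlencode

def care_gaps_url(base: str, period_start: str, period_end: str,
                  subject: str, measure_id: str, statuses: list[str]) -> str:
    """Assemble a $care-gaps GET URL; the 'status' parameter may repeat."""
    params = [("periodStart", period_start), ("periodEnd", period_end)]
    params += [("status", s) for s in statuses]
    params += [("subject", subject), ("measureId", measure_id)]
    # safe="/" keeps reference values like Patient/123 readable
    return f"{base}/Measure/$care-gaps?{urlencode(params, safe='/')}"

url = care_gaps_url("http://localhost/fhir", "2020-01-01", "2020-12-31",
                    "Patient/end-to-end-EXM130", "ColorectalCancerScreeningsFHIR",
                    ["open-gap", "closed-gap"])
print(url)
```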

View File

@ -0,0 +1,270 @@
# PlanDefinition
## Introduction
The FHIR Clinical Reasoning Module defines the [PlanDefinition resource](https://www.hl7.org/fhir/plandefinition.html) and several [associated operations](https://www.hl7.org/fhir/plandefinition-operations.html). A plan definition is a pre-defined group of actions to be taken in particular circumstances, often including conditional elements, options, and other decision points. The resource is flexible enough to be used to represent a variety of workflows, as well as clinical decision support and quality improvement assets, including order sets, protocols, and decision support rules.
PlanDefinitions can contain hierarchical groups of action definitions, where each action definition describes an activity to be performed (often in terms of an ActivityDefinition resource), and each group defines additional behavior, relationships, and applicable conditions between the actions in the overall definition.
In addition to describing what should take place, each action in a plan definition can specify when and whether the action should take place. For when the action should be taken, the trigger element specifies the action should be taken in response to some trigger occurring (such as a particular point in a workflow being reached, or as the result of a prescription being ordered). For whether the action should be taken, the condition element can be used to provide an expression that evaluates to true or false to indicate the applicability of the action to the specific context.
The process of applying a PlanDefinition to a particular context typically produces request resources representing the actions that should be performed, grouped within a RequestOrchestration to capture relationships between the resulting request resources.
Each ActivityDefinition is used to construct a specific resource, based on the definition of the activity and combined with contextual information for the particular patient that the plan definition is being applied to.
```json
{
"resourceType": "PlanDefinition",
"id": "opioidcds-04",
"url": "http://hl7.org/fhir/ig/opioid-cds/PlanDefinition/opioidcds-04",
"identifier": [
{
"system": "urn:ietf:rfc:3986",
"value": "urn:oid:2.16.840.1.113883.4.642.11.4"
},
{
"use": "official",
"value": "cdc-opioid-guidance"
}
],
"version": "0.1.0",
"name": "Cdcopioid04",
"title": "CDC Opioid Prescribing Guideline Recommendation #4",
"type": {
"coding": [
{
"system": "http://terminology.hl7.org/CodeSystem/plan-definition-type",
"code": "eca-rule",
"display": "ECA Rule"
}
]
},
"status": "draft",
"date": "2018-03-19",
"publisher": "Centers for Disease Control and Prevention (CDC)",
"description": "When starting opioid therapy for chronic pain, clinicians should prescribe immediate-release opioids instead of extended-release/long-acting (ER/LA) opioids.",
"useContext": [
{
"code": {
"system": "http://terminology.hl7.org/CodeSystem/usage-context-type",
"code": "focus",
"display": "Clinical Focus"
},
"valueCodeableConcept": {
"coding": [
{
"system": "http://snomed.info/sct",
"code": "182888003",
"display": "Medication requested (situation)"
}
]
}
},
{
"code": {
"system": "http://terminology.hl7.org/CodeSystem/usage-context-type",
"code": "focus",
"display": "Clinical Focus"
},
"valueCodeableConcept": {
"coding": [
{
"system": "http://snomed.info/sct",
"code": "82423001",
"display": "Chronic pain (finding)"
}
]
}
}
],
"jurisdiction": [
{
"coding": [
{
"system": "urn:iso:std:iso:3166",
"code": "US",
"display": "United States of America"
}
]
}
],
"purpose": "CDC's Guideline for Prescribing Opioids for Chronic Pain is intended to improve communication between providers and patients about the risks and benefits of opioid therapy for chronic pain, improve the safety and effectiveness of pain treatment, and reduce the risks associated with long-term opioid therapy, including opioid use disorder and overdose. The Guideline is not intended for patients who are in active cancer treatment, palliative care, or end-of-life care.",
"usage": "Providers should use caution when prescribing extended-release/long-acting (ER/LA) opioids as they carry a higher risk and negligible benefit compared to immediate-release opioids.",
"copyright": "© CDC 2016+.",
"topic": [
{
"text": "Opioid Prescribing"
}
],
"author": [
{
"name": "Kensaku Kawamoto, MD, PhD, MHS"
},
{
"name": "Bryn Rhodes"
},
{
"name": "Floyd Eisenberg, MD, MPH"
},
{
"name": "Robert McClure, MD, MPH"
}
],
"relatedArtifact": [
{
"type": "documentation",
"display": "CDC guideline for prescribing opioids for chronic pain",
"document": {
"url": "https://guidelines.gov/summaries/summary/50153/cdc-guideline-for-prescribing-opioids-for-chronic-pain---united-states-2016#420"
}
},
{
"type": "documentation",
"display": "MME Conversion Tables",
"document": {
"url": "https://www.cdc.gov/drugoverdose/pdf/calculating_total_daily_dose-a.pdf"
}
}
],
"library": [
"http://example.org/fhir/Library/opioidcds-recommendation-04"
],
"action": [
{
"title": "Extended-release opioid prescription triggered.",
"description": "Checking if the trigger prescription meets the inclusion criteria for recommendation #4 workflow.",
"documentation": [
{
"type": "documentation",
"document": {
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/cqf-strengthOfRecommendation",
"valueCodeableConcept": {
"coding": [
{
"system": "http://terminology.hl7.org/CodeSystem/recommendation-strength",
"code": "strong",
"display": "Strong"
}
]
}
},
{
"url": "http://hl7.org/fhir/StructureDefinition/cqf-qualityOfEvidence",
"valueCodeableConcept": {
"coding": [
{
"system": "http://terminology.hl7.org/CodeSystem/evidence-quality",
"code": "low",
"display": "Low quality"
}
]
}
}
]
}
}
],
"trigger": [
{
"type": "named-event",
"name": "medication-prescribe"
}
],
"condition": [
{
"kind": "applicability",
"expression": {
"description": "Check whether the opioid prescription for the existing patient is extended-release without any opioids-with-abuse-potential prescribed in the past 90 days.",
"language": "text/cql-identifier",
"expression": "Inclusion Criteria"
}
}
],
"groupingBehavior": "visual-group",
"selectionBehavior": "exactly-one",
"dynamicValue": [
{
"path": "action.title",
"expression": {
"language": "text/cql-identifier",
"expression": "Get Summary"
}
},
{
"path": "action.description",
"expression": {
"language": "text/cql-identifier",
"expression": "Get Detail"
}
},
{
"path": "activity.extension",
"expression": {
"language": "text/cql-identifier",
"expression": "Get Indicator"
}
}
],
"action": [
{
"description": "Will prescribe immediate release"
},
{
"description": "Risk of overdose carefully considered and outweighed by benefit; snooze 3 mo"
},
{
"description": "N/A - see comment; snooze 3 mo"
}
]
}
]
}
```
## Operations
HAPI implements the [$apply](http://hl7.org/fhir/uv/cpg/OperationDefinition-cpg-plandefinition-apply.html) operation. Support for additional operations is planned.
## Apply
The `$apply` operation applies a PlanDefinition to a given context. This implementation follows the [FHIR Specification](https://www.hl7.org/fhir/plandefinition.html#12.23.4.3) and supports the [FHIR Clinical Guidelines IG](http://hl7.org/fhir/uv/cpg/index.html). In addition, an R5 version of apply is made available for R4 instances. This will cause $apply to return a Bundle of resources instead of a CarePlan. This can be invoked with `$r5.apply`.
### Example PlanDefinition
Some example PlanDefinition workflows are available in the [opioid-cds-r4](https://github.com/cqframework/opioid-cds-r4) IG. Full Bundles with all the required supporting resources are available [here](https://github.com/cqframework/opioid-cds-r4/tree/1e543f781138f3d85404b7f65a92ff713519ef2c/bundles). You can download a Bundle and load it on your server as a transaction:
```bash
POST http://your-server-base/fhir opioidcds-10-patient-view-bundle.json
```
These Bundles do not include example Patient clinical data. Applying a PlanDefinition can be invoked with:
```bash
GET http://your-server-base/fhir/PlanDefinition/opioidcds-10-patient-view/$apply?subject=Patient/patientId&encounter=Encounter/encounterId&practitioner=Practitioner/practitionerId
```
### Additional Parameters
The following additional parameters are supported for the `$apply` and `$r5.apply` operations:
| Parameter | Type | Description |
|-----------|------------|-------------|
| organization | String | The organization in context |
| userType | String | The type of user initiating the request, e.g. patient, healthcare provider, or specific type of healthcare provider (physician, nurse, etc.) |
| userLanguage | String | Preferred language of the person using the system |
| userTaskContext | String | The task the system user is performing, e.g. laboratory results review, medication list review, etc. This information can be used to tailor decision support outputs, such as recommended information resources |
| setting | String | The current setting of the request (inpatient, outpatient, etc.) |
| settingContext | String | Additional detail about the setting of the request, if any |
| parameters | Parameters | Any input parameters defined in libraries referenced by the PlanDefinition. |
| data | Bundle | Data to be made available to the PlanDefinition evaluation. |
| dataEndpoint | Endpoint | An endpoint to use to access data referenced by retrieve operations in libraries referenced by the PlanDefinition. |
| contentEndpoint | Endpoint | An endpoint to use to access content (i.e. libraries) referenced by the PlanDefinition. |
| terminologyEndpoint | Endpoint | An endpoint to use to access terminology (i.e. valuesets, codesystems, and membership testing) referenced by the PlanDefinition. |
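For example, simple context parameters from the table above can be supplied as query parameters on the GET invocation (the resource IDs here are placeholders):
```bash
GET http://your-server-base/fhir/PlanDefinition/opioidcds-10-patient-view/$apply?subject=Patient/patientId&userType=provider&setting=outpatient
```
Complex parameters such as `parameters`, `data`, and the endpoint parameters cannot be expressed in a URL and would need to be sent in the body of a POST as a FHIR Parameters resource.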
## Package
The `package` operation for [PlanDefinition](https://www.hl7.org/fhir/plandefinition.html) will generate a Bundle of resources that includes the PlanDefinition as well as any related resources which can then be shared. This implementation follows the [CRMI IG](https://build.fhir.org/ig/HL7/crmi-ig/branches/master/index.html) guidance for [packaging artifacts](https://build.fhir.org/ig/HL7/crmi-ig/branches/master/packaging.html).
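Assuming the operation is exposed at the instance level as `$package` (as defined in the CRMI IG), it could be invoked against the example PlanDefinition above like so:
```bash
GET http://your-server-base/fhir/PlanDefinition/opioidcds-10-patient-view/$package
```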

View File

@ -0,0 +1,499 @@
# Questionnaires
## Introduction
The FHIR Clinical Reasoning Module defines the [Questionnaire resource](https://www.hl7.org/fhir/questionnaire.html). A Questionnaire is an organized collection of questions intended to solicit information from patients, providers or other individuals involved in the healthcare domain. They may be simple flat lists of questions or can be hierarchically organized in groups and sub-groups, each containing questions. The Questionnaire defines the questions to be asked, how they are ordered and grouped, any intervening instructional text and what the constraints are on the allowed answers. The results of a Questionnaire can be communicated using the QuestionnaireResponse resource.
Questionnaires cover the need to communicate data originating from forms used in medical history examinations, research questionnaires and sometimes full clinical specialty records. In many systems this data is collected using user-defined screens and forms. Questionnaires define specifics about data capture - exactly what questions were asked, in what order, what choices for answers were, etc. Each of these questions is part of the Questionnaire, and as such the Questionnaire is a separately identifiable Resource, whereas the individual questions are not. (Questionnaire questions can be linked to shared data elements using the Questionnaire.item.definition element.)
In addition to its use as a means for capturing data, Questionnaires can also be useful as a mechanism of defining a standardized 'presentation' of data that might already exist. For example, a peri-natal form or diabetes management form. In this use, the benefit is to expose a large volume of data in a predictable way that can be defined outside the user-interface design of the relevant system. The form might allow data to be edited or might be read-only. In some cases, the QuestionnaireResponse might not be intended to be persisted.
## Operations
HAPI implements the following operations from the [Structured Data Capture IG](https://hl7.org/fhir/uv/sdc/index.html):
* [$populate](https://hl7.org/fhir/uv/sdc/OperationDefinition-Questionnaire-populate.html)
* [$extract](http://hl7.org/fhir/uv/sdc/OperationDefinition-QuestionnaireResponse-extract.html)
Support for additional operations is planned.
## Populate
The `populate` operation generates a [QuestionnaireResponse](https://www.hl7.org/fhir/questionnaireresponse.html) based on a specific [Questionnaire](https://www.hl7.org/fhir/questionnaire.html), filling in answers to questions where possible based on information provided as part of the operation or already known by the server about the subject of the Questionnaire.
### Example Questionnaire
```json
{
"resourceType": "Questionnaire",
"id": "ASLPA1",
"meta": {
"versionId": "1",
"lastUpdated": "2023-05-09T19:02:10.538-06:00",
"source": "#jucRbegv3NMJkZ8X"
},
"extension": [
{
"url": "http://hl7.org/fhir/uv/cpg/StructureDefinition/cpg-knowledgeCapability",
"valueCode": "shareable"
},
{
"url": "http://hl7.org/fhir/uv/cpg/StructureDefinition/cpg-knowledgeCapability",
"valueCode": "computable"
},
{
"url": "http://hl7.org/fhir/uv/cpg/StructureDefinition/cpg-knowledgeCapability",
"valueCode": "publishable"
},
{
"url": "http://hl7.org/fhir/uv/cpg/StructureDefinition/cpg-knowledgeRepresentationLevel",
"valueCode": "structured"
},
{
"url": "http://hl7.org/fhir/StructureDefinition/cqf-library",
"valueCanonical": "http://example.org/sdh/dtr/aslp/Library/ASLPDataElements"
}
],
"url": "http://example.org/sdh/dtr/aslp/Questionnaire/ASLPA1",
"name": "ASLPA1",
"title": "ASLP.A1 Adult Sleep Studies",
"status": "active",
"experimental": false,
"description": "Adult Sleep Studies Prior Authorization Form",
"useContext": [
{
"code": {
"system": "http://terminology.hl7.org/CodeSystem/usage-context-type",
"code": "task",
"display": "Workflow Task"
},
"valueCodeableConcept": {
"coding": [
{
"system": "http://fhir.org/guides/nachc/hiv-cds/CodeSystem/activity-codes",
"code": "ASLP.A1",
"display": "Adult Sleep Studies"
}
]
}
}
],
"item": [
{
"extension": [
{
"url": "http://hl7.org/fhir/uv/sdc/StructureDefinition/sdc-questionnaire-itemPopulationContext",
"valueExpression": {
"language": "text/cql-identifier",
"expression": "Sleep Study"
}
}
],
"linkId": "0",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-sleep-study-order",
"text": "A sleep study procedure being ordered",
"type": "group",
"repeats": true,
"item": [
{
"linkId": "1",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-sleep-study-order#ServiceRequest.code",
"text": "A sleep study procedure being ordered",
"type": "choice",
"answerValueSet": "http://example.org/sdh/dtr/aslp/ValueSet/aslp-a1-de1-codes-grouper"
},
{
"linkId": "2",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-sleep-study-order#ServiceRequest.occurrence[x]",
"text": "Date of the procedure",
"type": "dateTime"
}
]
},
{
"extension": [
{
"url": "http://hl7.org/fhir/uv/sdc/StructureDefinition/sdc-questionnaire-initialExpression",
"valueExpression": {
"language": "text/cql-identifier",
"expression": "Diagnosis of Obstructive Sleep Apnea"
}
}
],
"linkId": "3",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-diagnosis-of-obstructive-sleep-apnea#Condition.code",
"text": "Diagnosis of Obstructive Sleep Apnea",
"type": "choice",
"answerValueSet": "http://example.org/sdh/dtr/aslp/ValueSet/aslp-a1-de17"
},
{
"extension": [
{
"url": "http://hl7.org/fhir/uv/sdc/StructureDefinition/sdc-questionnaire-initialExpression",
"valueExpression": {
"language": "text/cql-identifier",
"expression": "History of Hypertension"
}
}
],
"linkId": "4",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-history-of-hypertension#Observation.value[x]",
"text": "History of Hypertension",
"type": "boolean"
},
{
"extension": [
{
"url": "http://hl7.org/fhir/uv/sdc/StructureDefinition/sdc-questionnaire-initialExpression",
"valueExpression": {
"language": "text/cql-identifier",
"expression": "History of Diabetes"
}
}
],
"linkId": "5",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-history-of-diabetes#Observation.value[x]",
"text": "History of Diabetes",
"type": "boolean"
},
{
"extension": [
{
"url": "http://hl7.org/fhir/uv/sdc/StructureDefinition/sdc-questionnaire-initialExpression",
"valueExpression": {
"language": "text/cql-identifier",
"expression": "Neck Circumference"
}
}
],
"linkId": "6",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-height#Observation.value[x]",
"text": "Neck circumference (in inches)",
"type": "quantity"
},
{
"extension": [
{
"url": "http://hl7.org/fhir/uv/sdc/StructureDefinition/sdc-questionnaire-initialExpression",
"valueExpression": {
"language": "text/cql-identifier",
"expression": "Height"
}
}
],
"linkId": "7",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-height#Observation.value[x]",
"text": "Height (in inches)",
"type": "quantity"
},
{
"extension": [
{
"url": "http://hl7.org/fhir/uv/sdc/StructureDefinition/sdc-questionnaire-initialExpression",
"valueExpression": {
"language": "text/cql-identifier",
"expression": "Weight"
}
}
],
"linkId": "8",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-weight#Observation.value[x]",
"text": "Weight (in pounds)",
"type": "quantity"
},
{
"extension": [
{
"url": "http://hl7.org/fhir/uv/sdc/StructureDefinition/sdc-questionnaire-initialExpression",
"valueExpression": {
"language": "text/cql-identifier",
"expression": "BMI"
}
}
],
"linkId": "9",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-bmi#Observation.value[x]",
"text": "Body mass index (BMI)",
"type": "quantity"
}
]
}
```
### Example QuestionnaireResponse
```json
{
"resourceType": "QuestionnaireResponse",
"id": "ASLPA1-positive-response",
"extension": [
{
"url": "http://hl7.org/fhir/us/davinci-dtr/StructureDefinition/dtr-questionnaireresponse-questionnaire",
"valueReference": {
"reference": "#ASLPA1-positive"
}
}
],
"questionnaire": "http://example.org/sdh/dtr/aslp/Questionnaire/ASLPA1",
"status": "in-progress",
"subject": {
"reference": "Patient/positive"
},
"item": [
{
"linkId": "0",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-sleep-study-order",
"text": "A sleep study procedure being ordered",
"item": [
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/questionnaireresponse-author",
"valueReference": {
"reference": "http://cqframework.org/fhir/Device/clinical-quality-language"
}
}
],
"linkId": "1",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-sleep-study-order#ServiceRequest.code",
"text": "A sleep study procedure being ordered",
"answer": [
{
"valueCoding": {
"system": "http://example.org/sdh/dtr/aslp/CodeSystem/aslp-codes",
"code": "ASLP.A1.DE2",
"display": "Home sleep apnea testing (HSAT)"
}
}
]
},
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/questionnaireresponse-author",
"valueReference": {
"reference": "http://cqframework.org/fhir/Device/clinical-quality-language"
}
}
],
"linkId": "2",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-sleep-study-order#ServiceRequest.occurrence[x]",
"text": "Date of the procedure",
"answer": [
{
"valueDateTime": "2023-04-10T08:00:00.000Z"
}
]
}
]
},
{
"linkId": "0",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-sleep-study-order",
"text": "A sleep study procedure being ordered",
"item": [
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/questionnaireresponse-author",
"valueReference": {
"reference": "http://cqframework.org/fhir/Device/clinical-quality-language"
}
}
],
"linkId": "1",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-sleep-study-order#ServiceRequest.code",
"text": "A sleep study procedure being ordered",
"answer": [
{
"valueCoding": {
"system": "http://example.org/sdh/dtr/aslp/CodeSystem/aslp-codes",
"code": "ASLP.A1.DE14",
"display": "Artificial intelligence (AI)"
}
}
]
},
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/questionnaireresponse-author",
"valueReference": {
"reference": "http://cqframework.org/fhir/Device/clinical-quality-language"
}
}
],
"linkId": "2",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-sleep-study-order#ServiceRequest.occurrence[x]",
"text": "Date of the procedure",
"answer": [
{
"valueDateTime": "2023-04-15T08:00:00.000Z"
}
]
}
]
},
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/questionnaireresponse-author",
"valueReference": {
"reference": "http://cqframework.org/fhir/Device/clinical-quality-language"
}
}
],
"linkId": "3",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-diagnosis-of-obstructive-sleep-apnea#Condition.code",
"text": "Diagnosis of Obstructive Sleep Apnea",
"answer": [
{
"valueCoding": {
"system": "http://example.org/sdh/dtr/aslp/CodeSystem/aslp-codes",
"code": "ASLP.A1.DE17",
"display": "Obstructive sleep apnea (OSA)"
}
}
]
},
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/questionnaireresponse-author",
"valueReference": {
"reference": "http://cqframework.org/fhir/Device/clinical-quality-language"
}
}
],
"linkId": "4",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-history-of-hypertension#Observation.value[x]",
"text": "History of Hypertension",
"answer": [
{
"valueBoolean": true
}
]
},
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/questionnaireresponse-author",
"valueReference": {
"reference": "http://cqframework.org/fhir/Device/clinical-quality-language"
}
}
],
"linkId": "5",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-history-of-diabetes#Observation.value[x]",
"text": "History of Diabetes",
"answer": [
{
"valueBoolean": true
}
]
},
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/questionnaireresponse-author",
"valueReference": {
"reference": "http://cqframework.org/fhir/Device/clinical-quality-language"
}
}
],
"linkId": "6",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-height#Observation.value[x]",
"text": "Neck circumference (in inches)",
"answer": [
{
"valueQuantity": {
"value": 16,
"unit": "[in_i]",
"system": "http://unitsofmeasure.org",
"code": "[in_i]"
}
}
]
},
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/questionnaireresponse-author",
"valueReference": {
"reference": "http://cqframework.org/fhir/Device/clinical-quality-language"
}
}
],
"linkId": "7",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-height#Observation.value[x]",
"text": "Height (in inches)",
"answer": [
{
"valueQuantity": {
"value": 69,
"unit": "[in_i]",
"system": "http://unitsofmeasure.org",
"code": "[in_i]"
}
}
]
},
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/questionnaireresponse-author",
"valueReference": {
"reference": "http://cqframework.org/fhir/Device/clinical-quality-language"
}
}
],
"linkId": "8",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-weight#Observation.value[x]",
"text": "Weight (in pounds)",
"answer": [
{
"valueQuantity": {
"value": 185,
"unit": "[lb_av]",
"system": "http://unitsofmeasure.org",
"code": "[lb_av]"
}
}
]
},
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/questionnaireresponse-author",
"valueReference": {
"reference": "http://cqframework.org/fhir/Device/clinical-quality-language"
}
}
],
"linkId": "9",
"definition": "http://example.org/sdh/dtr/aslp/StructureDefinition/aslp-bmi#Observation.value[x]",
"text": "Body mass index (BMI)",
"answer": [
{
"valueQuantity": {
"value": 16.2,
"unit": "kg/m2",
"system": "http://unitsofmeasure.org",
"code": "kg/m2"
}
}
]
}
]
}
```
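A QuestionnaireResponse like the one above can be generated by invoking `$populate` on the example Questionnaire. As a sketch, using the SDC `subject` parameter (additional population context depends on your server's configuration and available data):
```bash
GET http://your-server-base/fhir/Questionnaire/ASLPA1/$populate?subject=Patient/positive
```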
## Extract
The `extract` operation takes a completed [QuestionnaireResponse](https://www.hl7.org/fhir/questionnaireresponse.html) and converts it to a Bundle of resources by using metadata embedded in the [Questionnaire](https://www.hl7.org/fhir/questionnaire.html) the QuestionnaireResponse is based on. The extracted resources might include Observations, MedicationStatements and other standard FHIR resources which can then be shared and manipulated. When invoking the $extract operation, care should be taken that the submitted QuestionnaireResponse is itself valid. If not, the extract operation could fail (with appropriate OperationOutcomes) or, more problematically, might succeed but produce incorrect output.
This implementation allows for both [Observation based](https://hl7.org/fhir/uv/sdc/extraction.html#observation-based-extraction) and [Definition based](https://hl7.org/fhir/uv/sdc/extraction.html#definition-based-extraction) extraction.
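Assuming the operation is exposed at the instance level as defined in the SDC IG, extraction could be invoked against the example QuestionnaireResponse above like so:
```bash
GET http://your-server-base/fhir/QuestionnaireResponse/ASLPA1-positive-response/$extract
```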
## Package
The `package` operation for [Questionnaire](https://www.hl7.org/fhir/questionnaire.html) will generate a Bundle of resources that includes the Questionnaire as well as any related Library or ValueSet resources which can then be shared. This implementation follows the [CRMI IG](https://build.fhir.org/ig/HL7/crmi-ig/branches/master/index.html) guidance for [packaging artifacts](https://build.fhir.org/ig/HL7/crmi-ig/branches/master/packaging.html).

```diff
@@ -88,7 +88,10 @@ page.server_jpa_batch.introduction=Batch Introduction
 section.clinical_reasoning.title=Clinical Reasoning
 page.clinical_reasoning.overview=Clinical Reasoning Overview
 page.clinical_reasoning.cql=CQL
+page.clinical_reasoning.caregaps=Care Gaps
 page.clinical_reasoning.measures=Measures
+page.clinical_reasoning.plan_definitions=PlanDefinitions
+page.clinical_reasoning.questionnaires=Questionnaires
 section.interceptors.title=Interceptors
 page.interceptors.interceptors=Interceptors Overview
```


```diff
@@ -315,7 +315,7 @@
 		</dependency>
 		<dependency>
 			<groupId>org.glassfish</groupId>
-			<artifactId>javax.el</artifactId>
+			<artifactId>jakarta.el</artifactId>
 		</dependency>
 		<!-- Note that we need this dependency to send log4j logging requests to slf4j -->
```

```diff
@@ -24,6 +24,7 @@ import ca.uhn.fhir.jpa.binary.api.StoredDetails;
 import ca.uhn.fhir.jpa.binary.svc.BaseBinaryStorageSvcImpl;
 import ca.uhn.fhir.jpa.dao.data.IBinaryStorageEntityDao;
 import ca.uhn.fhir.jpa.model.entity.BinaryStorageEntity;
+import ca.uhn.fhir.rest.api.server.RequestDetails;
 import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
 import com.google.common.hash.HashingInputStream;
 import com.google.common.io.ByteStreams;
@@ -36,6 +37,7 @@ import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.transaction.annotation.Propagation;
 import org.springframework.transaction.annotation.Transactional;
 
+import javax.annotation.Nonnull;
 import javax.persistence.EntityManager;
 import javax.persistence.PersistenceContext;
 import javax.persistence.PersistenceContextType;
@@ -55,9 +57,11 @@ public class DatabaseBlobBinaryStorageSvcImpl extends BaseBinaryStorageSvcImpl {
 	@Autowired
 	private IBinaryStorageEntityDao myBinaryStorageEntityDao;
 
+	@Nonnull
 	@Override
 	@Transactional(propagation = Propagation.REQUIRED)
-	public StoredDetails storeBlob(IIdType theResourceId, String theBlobIdOrNull, String theContentType, InputStream theInputStream) throws IOException {
+	public StoredDetails storeBlob(IIdType theResourceId, String theBlobIdOrNull, String theContentType,
+			InputStream theInputStream, RequestDetails theRequestDetails) throws IOException {
 
 		/*
 		 * Note on transactionality: This method used to have a propagation value of SUPPORTS and then do the actual
@@ -70,17 +74,16 @@ public class DatabaseBlobBinaryStorageSvcImpl extends BaseBinaryStorageSvcImpl {
 		HashingInputStream hashingInputStream = createHashingInputStream(theInputStream);
 		CountingInputStream countingInputStream = createCountingInputStream(hashingInputStream);
 
-		String id = super.provideIdForNewBlob(theBlobIdOrNull);
-
 		BinaryStorageEntity entity = new BinaryStorageEntity();
 		entity.setResourceId(theResourceId.toUnqualifiedVersionless().getValue());
-		entity.setBlobId(id);
 		entity.setBlobContentType(theContentType);
 		entity.setPublished(publishedDate);
 
 		Session session = (Session) myEntityManager.getDelegate();
 		LobHelper lobHelper = session.getLobHelper();
 		byte[] loadedStream = IOUtils.toByteArray(countingInputStream);
+		String id = super.provideIdForNewBlob(theBlobIdOrNull, loadedStream, theRequestDetails, theContentType);
+		entity.setBlobId(id);
 
 		Blob dataBlob = lobHelper.createBlob(loadedStream);
 		entity.setBlob(dataBlob);
@@ -105,7 +108,7 @@ public class DatabaseBlobBinaryStorageSvcImpl extends BaseBinaryStorageSvcImpl {
 	public StoredDetails fetchBlobDetails(IIdType theResourceId, String theBlobId) {
 
 		Optional<BinaryStorageEntity> entityOpt = myBinaryStorageEntityDao.findByIdAndResourceId(theBlobId, theResourceId.toUnqualifiedVersionless().getValue());
-		if (entityOpt.isPresent() == false) {
+		if (entityOpt.isEmpty()) {
 			return null;
 		}
@@ -121,7 +124,7 @@ public class DatabaseBlobBinaryStorageSvcImpl extends BaseBinaryStorageSvcImpl {
 	@Override
 	public boolean writeBlob(IIdType theResourceId, String theBlobId, OutputStream theOutputStream) throws IOException {
 		Optional<BinaryStorageEntity> entityOpt = myBinaryStorageEntityDao.findByIdAndResourceId(theBlobId, theResourceId.toUnqualifiedVersionless().getValue());
-		if (entityOpt.isPresent() == false) {
+		if (entityOpt.isEmpty()) {
 			return false;
 		}
```

View File

@ -38,6 +38,7 @@ import ca.uhn.fhir.jpa.dao.SearchBuilderFactory;
import ca.uhn.fhir.jpa.dao.mdm.MdmExpansionCacheSvc; import ca.uhn.fhir.jpa.dao.mdm.MdmExpansionCacheSvc;
import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService; import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService;
import ca.uhn.fhir.jpa.model.dao.JpaPid; import ca.uhn.fhir.jpa.model.dao.JpaPid;
import ca.uhn.fhir.jpa.model.search.SearchBuilderLoadIncludesParameters;
import ca.uhn.fhir.jpa.model.search.SearchRuntimeDetails; import ca.uhn.fhir.jpa.model.search.SearchRuntimeDetails;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap; import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.jpa.util.QueryChunker; import ca.uhn.fhir.jpa.util.QueryChunker;
@ -67,8 +68,8 @@ import org.springframework.beans.factory.annotation.Autowired;
import javax.annotation.Nonnull; import javax.annotation.Nonnull;
import javax.persistence.EntityManager; import javax.persistence.EntityManager;
import java.io.IOException;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap; import java.util.HashMap;
import java.util.HashSet; import java.util.HashSet;
import java.util.Iterator; import java.util.Iterator;
@ -106,6 +107,7 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
@Autowired @Autowired
private IIdHelperService<JpaPid> myIdHelperService; private IIdHelperService<JpaPid> myIdHelperService;
@SuppressWarnings("rawtypes")
@Autowired @Autowired
private IMdmLinkDao myMdmLinkDao; private IMdmLinkDao myMdmLinkDao;
@ -145,7 +147,8 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
}); });
} }
private LinkedHashSet<JpaPid> getPidsForPatientStyleExport(ExportPIDIteratorParameters theParams, String resourceType, String theJobId, String theChunkId, RuntimeResourceDefinition def) { @SuppressWarnings("unchecked")
private LinkedHashSet<JpaPid> getPidsForPatientStyleExport(ExportPIDIteratorParameters theParams, String resourceType, String theJobId, String theChunkId, RuntimeResourceDefinition def) throws IOException {
LinkedHashSet<JpaPid> pids = new LinkedHashSet<>(); LinkedHashSet<JpaPid> pids = new LinkedHashSet<>();
// Patient // Patient
if (myStorageSettings.getIndexMissingFields() == JpaStorageSettings.IndexEnabledEnum.DISABLED) { if (myStorageSettings.getIndexMissingFields() == JpaStorageSettings.IndexEnabledEnum.DISABLED) {
@ -170,7 +173,7 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
Logs.getBatchTroubleshootingLog().debug("Executing query for bulk export job[{}] chunk[{}]: {}", theJobId, theChunkId, map.toNormalizedQueryString(myContext)); Logs.getBatchTroubleshootingLog().debug("Executing query for bulk export job[{}] chunk[{}]: {}", theJobId, theChunkId, map.toNormalizedQueryString(myContext));
IResultIterator<JpaPid> resultIterator = searchBuilder.createQuery(map, searchRuntime, new SystemRequestDetails(), theParams.getPartitionIdOrAllPartitions()); try (IResultIterator<JpaPid> resultIterator = searchBuilder.createQuery(map, searchRuntime, new SystemRequestDetails(), theParams.getPartitionIdOrAllPartitions())) {
int pidCount = 0; int pidCount = 0;
while (resultIterator.hasNext()) { while (resultIterator.hasNext()) {
if (pidCount % 10000 == 0) { if (pidCount % 10000 == 0) {
@ -181,6 +184,7 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
} }
} }
} }
}
return pids; return pids;
} }
@ -209,7 +213,8 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
return referenceOrListParam; return referenceOrListParam;
} }
private LinkedHashSet<JpaPid> getPidsForSystemStyleExport(ExportPIDIteratorParameters theParams, String theJobId, String theChunkId, RuntimeResourceDefinition theDef) { @SuppressWarnings("unchecked")
private LinkedHashSet<JpaPid> getPidsForSystemStyleExport(ExportPIDIteratorParameters theParams, String theJobId, String theChunkId, RuntimeResourceDefinition theDef) throws IOException {
LinkedHashSet<JpaPid> pids = new LinkedHashSet<>(); LinkedHashSet<JpaPid> pids = new LinkedHashSet<>();
// System // System
List<SearchParameterMap> maps = myBulkExportHelperSvc.createSearchParameterMapsForResourceType(theDef, theParams, true); List<SearchParameterMap> maps = myBulkExportHelperSvc.createSearchParameterMapsForResourceType(theDef, theParams, true);
@ -219,10 +224,10 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
Logs.getBatchTroubleshootingLog().debug("Executing query for bulk export job[{}] chunk[{}]: {}", theJobId, theChunkId, map.toNormalizedQueryString(myContext)); Logs.getBatchTroubleshootingLog().debug("Executing query for bulk export job[{}] chunk[{}]: {}", theJobId, theChunkId, map.toNormalizedQueryString(myContext));
// requires a transaction // requires a transaction
IResultIterator<JpaPid> resultIterator = searchBuilder.createQuery(map, try (IResultIterator<JpaPid> resultIterator = searchBuilder.createQuery(map,
new SearchRuntimeDetails(null, theJobId), new SearchRuntimeDetails(null, theJobId),
null, null,
theParams.getPartitionIdOrAllPartitions()); theParams.getPartitionIdOrAllPartitions())) {
int pidCount = 0; int pidCount = 0;
while (resultIterator.hasNext()) { while (resultIterator.hasNext()) {
if (pidCount % 10000 == 0) { if (pidCount % 10000 == 0) {
@ -232,10 +237,11 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
pids.add(resultIterator.next()); pids.add(resultIterator.next());
} }
} }
}
return pids; return pids;
} }
private LinkedHashSet<JpaPid> getPidsForGroupStyleExport(ExportPIDIteratorParameters theParams, String theResourceType, RuntimeResourceDefinition theDef) { private LinkedHashSet<JpaPid> getPidsForGroupStyleExport(ExportPIDIteratorParameters theParams, String theResourceType, RuntimeResourceDefinition theDef) throws IOException {
 LinkedHashSet<JpaPid> pids;
 if (theResourceType.equalsIgnoreCase("Patient")) {
@@ -250,17 +256,28 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
 		return pids;
 	}

-	private LinkedHashSet<JpaPid> getRelatedResourceTypePids(ExportPIDIteratorParameters theParams, RuntimeResourceDefinition theDef) {
+	private LinkedHashSet<JpaPid> getRelatedResourceTypePids(ExportPIDIteratorParameters theParams, RuntimeResourceDefinition theDef) throws IOException {
 		LinkedHashSet<JpaPid> pids = new LinkedHashSet<>();
+		// expand the group pid -> list of patients in that group (list of patient pids)
 		Set<JpaPid> expandedMemberResourceIds = expandAllPatientPidsFromGroup(theParams);
-		assert expandedMemberResourceIds != null && !expandedMemberResourceIds.isEmpty();
+		assert !expandedMemberResourceIds.isEmpty();
 		Logs.getBatchTroubleshootingLog().debug("{} has been expanded to members:[{}]", theParams.getGroupId(), expandedMemberResourceIds);
-		//Next, let's search for the target resources, with their correct patient references, chunked.
-		//The results will be jammed into myReadPids
+		// for each patient pid ->
+		// search for the target resources, with their correct patient references, chunked.
+		// The results will be jammed into myReadPids
 		QueryChunker<JpaPid> queryChunker = new QueryChunker<>();
 		queryChunker.chunk(expandedMemberResourceIds, QUERY_CHUNK_SIZE, (idChunk) -> {
+			try {
 				queryResourceTypeWithReferencesToPatients(pids, idChunk, theParams, theDef);
+			} catch (IOException ex) {
+				// we will never see this;
+				// SearchBuilder#QueryIterator does not (nor can ever) throw
+				// an IOException... but Java requires the check,
+				// so we'll put a log here (just in the off chance)
+				ourLog.error("Couldn't close query iterator ", ex);
+				throw new RuntimeException(Msg.code(2346) + "Couldn't close query iterator", ex);
+			}
 		});
 		return pids;
 	}
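The `QueryChunker` call above batches the expanded member PIDs so each generated query stays under database parameter-count limits, and the `try`/`catch` inside the lambda exists only because a lambda body cannot propagate the checked `IOException`. A minimal stand-in for that chunking (illustrative only, not the HAPI class):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical stand-in for QueryChunker: runs a consumer over fixed-size
// sub-lists so large IN (...) clauses stay under parameter limits.
public class ChunkDemo {
	public static <T> void chunk(List<T> theItems, int theChunkSize, Consumer<List<T>> theConsumer) {
		for (int i = 0; i < theItems.size(); i += theChunkSize) {
			theConsumer.accept(theItems.subList(i, Math.min(i + theChunkSize, theItems.size())));
		}
	}

	// Helper used to observe the batch sizes the consumer sees.
	public static List<Integer> chunkSizes(int theItemCount, int theChunkSize) {
		List<Integer> sizes = new ArrayList<>();
		List<Integer> items = new ArrayList<>();
		for (int i = 0; i < theItemCount; i++) {
			items.add(i);
		}
		chunk(items, theChunkSize, batch -> sizes.add(batch.size()));
		return sizes;
	}
}
```

With 5 items and a chunk size of 2, the consumer runs three times over batches of 2, 2, and 1.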
@@ -333,7 +350,8 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
 	 * In case we are doing a Group Bulk Export and resourceType `Patient` is requested, we can just return the group members,
 	 * possibly expanded by MDM, and don't have to go and fetch other resource DAOs.
 	 */
-	private LinkedHashSet<JpaPid> getExpandedPatientList(ExportPIDIteratorParameters theParameters) {
+	@SuppressWarnings("unchecked")
+	private LinkedHashSet<JpaPid> getExpandedPatientList(ExportPIDIteratorParameters theParameters) throws IOException {
 		List<JpaPid> members = getMembersFromGroupWithFilter(theParameters, true);
 		List<IIdType> ids = members.stream().map(member -> new IdDt("Patient/" + member)).collect(Collectors.toList());
 		ourLog.info("While extracting patients from a group, we found {} patients.", ids.size());
@@ -362,7 +380,8 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
 	 *
 	 * @return A list of strings representing the Patient IDs of the members (e.g. ["P1", "P2", "P3"]
 	 */
-	private List<JpaPid> getMembersFromGroupWithFilter(ExportPIDIteratorParameters theParameters, boolean theConsiderSince) {
+	@SuppressWarnings("unchecked")
+	private List<JpaPid> getMembersFromGroupWithFilter(ExportPIDIteratorParameters theParameters, boolean theConsiderSince) throws IOException {
 		RuntimeResourceDefinition def = myContext.getResourceDefinition("Patient");
 		List<JpaPid> resPids = new ArrayList<>();
@@ -373,15 +392,16 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
 		for (SearchParameterMap map : maps) {
 			ISearchBuilder<JpaPid> searchBuilder = getSearchBuilderForResourceType("Patient");
 			ourLog.debug("Searching for members of group {} with job instance {} with map {}", theParameters.getGroupId(), theParameters.getInstanceId(), map);
-			IResultIterator<JpaPid> resultIterator = searchBuilder.createQuery(map,
+			try (IResultIterator<JpaPid> resultIterator = searchBuilder.createQuery(map,
 				new SearchRuntimeDetails(null, theParameters.getInstanceId()),
 				null,
-				theParameters.getPartitionIdOrAllPartitions());
+				theParameters.getPartitionIdOrAllPartitions())) {
 				while (resultIterator.hasNext()) {
 					resPids.add(resultIterator.next());
 				}
 			}
+		}
 		return resPids;
 	}
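The hunk above moves the result iterator into try-with-resources, which is what forces `throws IOException` up the call chain. A self-contained sketch of the pattern, using a hypothetical closeable iterator rather than the real `IResultIterator`:

```java
import java.io.Closeable;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Simplified stand-in for IResultIterator: an Iterator that is also Closeable,
// so callers must release the underlying database cursor when done.
class ClosingIterator implements Iterator<Long>, Closeable {
	private final Iterator<Long> myDelegate;
	boolean myClosed;

	ClosingIterator(List<Long> theRows) {
		myDelegate = theRows.iterator();
	}

	public boolean hasNext() { return myDelegate.hasNext(); }
	public Long next() { return myDelegate.next(); }
	public void close() throws IOException { myClosed = true; }
}

public class TryWithResourcesDemo {
	// Drains the iterator inside try-with-resources so close() always runs,
	// which is why the method must declare `throws IOException`.
	public static List<Long> drain(ClosingIterator theIterator) throws IOException {
		List<Long> pids = new ArrayList<>();
		try (ClosingIterator it = theIterator) {
			while (it.hasNext()) {
				pids.add(it.next());
			}
		}
		return pids;
	}
}
```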
@@ -401,6 +421,7 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
 	/**
 	 * @param thePidTuples
 	 */
+	@SuppressWarnings({ "unchecked", "rawtypes" })
 	private void populateMdmResourceCache(List<MdmPidTuple<JpaPid>> thePidTuples) {
 		if (myMdmExpansionCacheSvc.hasBeenPopulated()) {
 			return;
@@ -443,14 +464,16 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
 		}
 	}

+	// gets all the resources related to each patient provided in the list of thePatientPids
+	@SuppressWarnings("unchecked")
 	private void queryResourceTypeWithReferencesToPatients(Set<JpaPid> theReadPids,
-		List<JpaPid> JpaPidChunk,
+		List<JpaPid> thePatientPids,
 		ExportPIDIteratorParameters theParams,
-		RuntimeResourceDefinition theDef) {
+		RuntimeResourceDefinition theDef) throws IOException {
 		//Convert Resource Persistent IDs to actual client IDs.
-		Set<JpaPid> pidSet = new HashSet<>(JpaPidChunk);
-		Set<String> resourceIds = myIdHelperService.translatePidsToFhirResourceIds(pidSet);
+		Set<JpaPid> pidSet = new HashSet<>(thePatientPids);
+		Set<String> patientIds = myIdHelperService.translatePidsToFhirResourceIds(pidSet);

 		//Build SP map
 		//First, inject the _typeFilters and _since from the export job
@@ -461,29 +484,49 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
 		validateSearchParametersForGroup(expandedSpMap, theParams.getResourceType());

 		// Fetch and cache a search builder for this resource type
+		// filter by ResourceType
 		ISearchBuilder<JpaPid> searchBuilder = getSearchBuilderForResourceType(theParams.getResourceType());

 		// Now, further filter the query with patient references defined by the chunk of IDs we have.
+		// filter by PatientIds
 		if (PATIENT_BULK_EXPORT_FORWARD_REFERENCE_RESOURCE_TYPES.contains(theParams.getResourceType())) {
-			filterSearchByHasParam(resourceIds, expandedSpMap, theParams);
+			filterSearchByHasParam(patientIds, expandedSpMap, theParams);
 		} else {
-			filterSearchByResourceIds(resourceIds, expandedSpMap, theParams);
+			filterSearchByResourceIds(patientIds, expandedSpMap, theParams);
 		}

 		//Execute query and all found pids to our local iterator.
 		RequestPartitionId partitionId = theParams.getPartitionIdOrAllPartitions();
-		IResultIterator<JpaPid> resultIterator = searchBuilder.createQuery(expandedSpMap,
+		try (IResultIterator<JpaPid> resultIterator = searchBuilder.createQuery(expandedSpMap,
 			new SearchRuntimeDetails(null, theParams.getInstanceId()),
 			null,
-			partitionId);
+			partitionId)) {
 			while (resultIterator.hasNext()) {
 				theReadPids.add(resultIterator.next());
 			}
+		}

-		// add _include to results to support ONC
-		Set<Include> includes = Collections.singleton(new Include("*", true));
+		// Construct our Includes filter
+		// We use this to recursively fetch resources of interest
+		// (but should only request those the user has requested/can see)
+		Set<Include> includes = new HashSet<>();
+		for (String resourceType : theParams.getRequestedResourceTypes()) {
+			includes.add(new Include(resourceType + ":*", true));
+		}
 		SystemRequestDetails requestDetails = new SystemRequestDetails().setRequestPartitionId(partitionId);
-		Set<JpaPid> includeIds = searchBuilder.loadIncludes(myContext, myEntityManager, theReadPids, includes, false, expandedSpMap.getLastUpdated(), theParams.getInstanceId(), requestDetails, null);
+		SearchBuilderLoadIncludesParameters<JpaPid> loadIncludesParameters = new SearchBuilderLoadIncludesParameters<>();
+		loadIncludesParameters.setFhirContext(myContext);
+		loadIncludesParameters.setMatches(theReadPids);
+		loadIncludesParameters.setEntityManager(myEntityManager);
+		loadIncludesParameters.setRequestDetails(requestDetails);
+		loadIncludesParameters.setIncludeFilters(includes);
+		loadIncludesParameters.setReverseMode(false);
+		loadIncludesParameters.setLastUpdated(expandedSpMap.getLastUpdated());
+		loadIncludesParameters.setSearchIdOrDescription(theParams.getInstanceId());
+		loadIncludesParameters.setDesiredResourceTypes(theParams.getRequestedResourceTypes());
+		Set<JpaPid> includeIds = searchBuilder.loadIncludes(loadIncludesParameters);
 		// gets rid of the Patient duplicates
 		theReadPids.addAll(includeIds.stream().filter((id) -> !id.getResourceType().equals("Patient")).collect(Collectors.toSet()));
 	}
@@ -530,7 +573,7 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
 	 *
 	 * @return a Set of Strings representing the resource IDs of all members of a group.
 	 */
-	private Set<JpaPid> expandAllPatientPidsFromGroup(ExportPIDIteratorParameters theParams) {
+	private Set<JpaPid> expandAllPatientPidsFromGroup(ExportPIDIteratorParameters theParams) throws IOException {
 		Set<JpaPid> expandedIds = new HashSet<>();
 		RequestPartitionId partitionId = theParams.getPartitionIdOrAllPartitions();
 		SystemRequestDetails requestDetails = new SystemRequestDetails().setRequestPartitionId(partitionId);
@@ -551,6 +594,7 @@ public class JpaBulkExportProcessor implements IBulkExportProcessor<JpaPid> {
 		return expandedIds;
 	}

+	@SuppressWarnings({"rawtypes", "unchecked"})
 	private Set<JpaPid> performMembershipExpansionViaMdmTable(JpaPid pidOrNull) {
 		List<MdmPidTuple<JpaPid>> goldenPidTargetPidTuples = myMdmLinkDao.expandPidsFromGroupPidGivenMatchResult(pidOrNull, MdmMatchResultEnum.MATCH);
 		//Now lets translate these pids into resource IDs


@@ -423,7 +423,9 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
 		String resourceIdBeforeStorage = theResource.getIdElement().getIdPart();
 		boolean resourceHadIdBeforeStorage = isNotBlank(resourceIdBeforeStorage);
 		boolean resourceIdWasServerAssigned = theResource.getUserData(JpaConstants.RESOURCE_ID_SERVER_ASSIGNED) == Boolean.TRUE;
+		if (resourceHadIdBeforeStorage) {
 			entity.setFhirId(resourceIdBeforeStorage);
+		}

 		HookParams hookParams;


@@ -26,6 +26,7 @@ import org.hl7.fhir.instance.model.api.IBaseBooleanDatatype;
 import org.hl7.fhir.instance.model.api.IBaseCoding;

 import java.lang.reflect.Field;
+import java.util.HashMap;
 import java.util.Map;
 import java.util.concurrent.ConcurrentHashMap;


@@ -53,11 +53,11 @@ import java.util.Date;
 @Table(name = "MPI_LINK", uniqueConstraints = {
 	// TODO GGG DROP this index, and instead use the below one
 	@UniqueConstraint(name = "IDX_EMPI_PERSON_TGT", columnNames = {"PERSON_PID", "TARGET_PID"}),
-	// v---- this one
-	//TODO GGG revisit adding this: @UniqueConstraint(name = "IDX_EMPI_GR_TGT", columnNames = {"GOLDEN_RESOURCE_PID", "TARGET_PID"}),
 	//TODO GGG Should i make individual indices for PERSON/TARGET?
 }, indexes = {
-	@Index(name = "IDX_EMPI_MATCH_TGT_VER", columnList = "MATCH_RESULT, TARGET_PID, VERSION")
+	@Index(name = "IDX_EMPI_MATCH_TGT_VER", columnList = "MATCH_RESULT, TARGET_PID, VERSION"),
+	// v---- this one
+	@Index(name = "IDX_EMPI_GR_TGT", columnList = "GOLDEN_RESOURCE_PID, TARGET_PID")
 })
 @Audited
 // This is the table name generated by default by envers, but we set it explicitly for clarity


@@ -45,6 +45,7 @@ import ca.uhn.fhir.util.ClasspathUtil;
 import ca.uhn.fhir.util.VersionEnum;
 import software.amazon.awssdk.utils.StringUtils;

+import javax.persistence.Index;
 import java.util.Arrays;
 import java.util.HashMap;
 import java.util.List;
@@ -383,6 +384,14 @@ public class HapiFhirJpaMigrationTasks extends BaseMigrationTasks<VersionEnum> {
 			linkTable.addForeignKey("20230424.5", "FK_RESLINK_TARGET")
 				.toColumn("TARGET_RESOURCE_ID").references("HFJ_RESOURCE", "RES_ID");
 		}
+
+		{
+			version.onTable("MPI_LINK")
+				.addIndex("20230504.1", "IDX_EMPI_GR_TGT")
+				.unique(false)
+				.withColumns("GOLDEN_RESOURCE_PID", "TARGET_PID");
+		}
 	}

 	protected void init640() {


@@ -48,8 +48,10 @@ import ca.uhn.fhir.jpa.entity.ResourceSearchView;
 import ca.uhn.fhir.jpa.interceptor.JpaPreResourceAccessDetails;
 import ca.uhn.fhir.jpa.model.config.PartitionSettings;
 import ca.uhn.fhir.jpa.model.dao.JpaPid;
+import ca.uhn.fhir.jpa.model.entity.BaseResourceIndexedSearchParam;
 import ca.uhn.fhir.jpa.model.entity.IBaseResourceEntity;
 import ca.uhn.fhir.jpa.model.entity.ResourceTag;
+import ca.uhn.fhir.jpa.model.search.SearchBuilderLoadIncludesParameters;
 import ca.uhn.fhir.jpa.model.search.SearchRuntimeDetails;
 import ca.uhn.fhir.jpa.model.search.StorageProcessingMessage;
 import ca.uhn.fhir.jpa.search.SearchConstants;
@@ -79,6 +81,7 @@ import ca.uhn.fhir.rest.api.SortSpec;
 import ca.uhn.fhir.rest.api.server.IPreResourceAccessDetails;
 import ca.uhn.fhir.rest.api.server.RequestDetails;
 import ca.uhn.fhir.rest.param.DateRangeParam;
+import ca.uhn.fhir.rest.param.ParameterUtil;
 import ca.uhn.fhir.rest.param.ReferenceParam;
 import ca.uhn.fhir.rest.param.StringParam;
 import ca.uhn.fhir.rest.param.TokenParam;
@@ -95,6 +98,7 @@ import com.healthmarketscience.sqlbuilder.Condition;
 import org.apache.commons.lang3.StringUtils;
 import org.apache.commons.lang3.Validate;
 import org.apache.commons.lang3.math.NumberUtils;
+import org.apache.commons.lang3.tuple.Pair;
 import org.hl7.fhir.instance.model.api.IAnyResource;
 import org.hl7.fhir.instance.model.api.IBaseResource;
 import org.slf4j.Logger;
@@ -105,6 +109,7 @@ import org.springframework.jdbc.core.SingleColumnRowMapper;
 import org.springframework.transaction.support.TransactionSynchronizationManager;

 import javax.annotation.Nonnull;
+import javax.annotation.Nullable;
 import javax.persistence.EntityManager;
 import javax.persistence.PersistenceContext;
 import javax.persistence.PersistenceContextType;
@@ -126,7 +131,6 @@ import java.util.Set;
 import java.util.stream.Collectors;

 import static ca.uhn.fhir.jpa.search.builder.QueryStack.LOCATION_POSITION;
-import static org.apache.commons.lang3.StringUtils.countMatches;
 import static org.apache.commons.lang3.StringUtils.defaultString;
 import static org.apache.commons.lang3.StringUtils.isBlank;
 import static org.apache.commons.lang3.StringUtils.isNotBlank;
@@ -1108,26 +1112,62 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 	 * The JpaPid returned will have resource type populated.
 	 */
 	@Override
-	public Set<JpaPid> loadIncludes(FhirContext theContext, EntityManager theEntityManager, Collection<JpaPid> theMatches, Collection<Include> theIncludes,
-		boolean theReverseMode, DateRangeParam theLastUpdated, String theSearchIdOrDescription, RequestDetails theRequest, Integer theMaxCount) {
-		if (theMatches.size() == 0) {
+	public Set<JpaPid> loadIncludes(
+		FhirContext theContext,
+		EntityManager theEntityManager,
+		Collection<JpaPid> theMatches,
+		Collection<Include> theIncludes,
+		boolean theReverseMode,
+		DateRangeParam theLastUpdated,
+		String theSearchIdOrDescription,
+		RequestDetails theRequest,
+		Integer theMaxCount
+	) {
+		SearchBuilderLoadIncludesParameters<JpaPid> parameters = new SearchBuilderLoadIncludesParameters<>();
+		parameters.setFhirContext(theContext);
+		parameters.setEntityManager(theEntityManager);
+		parameters.setMatches(theMatches);
+		parameters.setIncludeFilters(theIncludes);
+		parameters.setReverseMode(theReverseMode);
+		parameters.setLastUpdated(theLastUpdated);
+		parameters.setSearchIdOrDescription(theSearchIdOrDescription);
+		parameters.setRequestDetails(theRequest);
+		parameters.setMaxCount(theMaxCount);
+		return loadIncludes(parameters);
+	}
+
+	@Override
+	public Set<JpaPid> loadIncludes(SearchBuilderLoadIncludesParameters<JpaPid> theParameters) {
+		Collection<JpaPid> matches = theParameters.getMatches();
+		Collection<Include> currentIncludes = theParameters.getIncludeFilters();
+		boolean reverseMode = theParameters.isReverseMode();
+		EntityManager entityManager = theParameters.getEntityManager();
+		Integer maxCount = theParameters.getMaxCount();
+		FhirContext fhirContext = theParameters.getFhirContext();
+		DateRangeParam lastUpdated = theParameters.getLastUpdated();
+		RequestDetails request = theParameters.getRequestDetails();
+		String searchIdOrDescription = theParameters.getSearchIdOrDescription();
+		List<String> desiredResourceTypes = theParameters.getDesiredResourceTypes();
+		boolean hasDesiredResourceTypes = desiredResourceTypes != null && !desiredResourceTypes.isEmpty();
+		if (matches.size() == 0) {
 			return new HashSet<>();
 		}
-		if (theIncludes == null || theIncludes.isEmpty()) {
+		if (currentIncludes == null || currentIncludes.isEmpty()) {
 			return new HashSet<>();
 		}
-		String searchPidFieldName = theReverseMode ? MY_TARGET_RESOURCE_PID : MY_SOURCE_RESOURCE_PID;
-		String findPidFieldName = theReverseMode ? MY_SOURCE_RESOURCE_PID : MY_TARGET_RESOURCE_PID;
-		String findResourceTypeFieldName = theReverseMode ? MY_SOURCE_RESOURCE_TYPE : MY_TARGET_RESOURCE_TYPE;
+		String searchPidFieldName = reverseMode ? MY_TARGET_RESOURCE_PID : MY_SOURCE_RESOURCE_PID;
+		String findPidFieldName = reverseMode ? MY_SOURCE_RESOURCE_PID : MY_TARGET_RESOURCE_PID;
+		String findResourceTypeFieldName = reverseMode ? MY_SOURCE_RESOURCE_TYPE : MY_TARGET_RESOURCE_TYPE;
 		String findVersionFieldName = null;
-		if (!theReverseMode && myStorageSettings.isRespectVersionsForSearchIncludes()) {
+		if (!reverseMode && myStorageSettings.isRespectVersionsForSearchIncludes()) {
 			findVersionFieldName = MY_TARGET_RESOURCE_VERSION;
 		}
-		List<JpaPid> nextRoundMatches = new ArrayList<>(theMatches);
+		List<JpaPid> nextRoundMatches = new ArrayList<>(matches);
 		HashSet<JpaPid> allAdded = new HashSet<>();
-		HashSet<JpaPid> original = new HashSet<>(theMatches);
-		ArrayList<Include> includes = new ArrayList<>(theIncludes);
+		HashSet<JpaPid> original = new HashSet<>(matches);
+		ArrayList<Include> includes = new ArrayList<>(currentIncludes);
 		int roundCounts = 0;
 		StopWatch w = new StopWatch();
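The hunk above replaces a nine-argument `loadIncludes` signature with a parameter object, keeping the old overload as a thin delegate so new inputs (here `desiredResourceTypes`) can be added without touching every caller. A stripped-down sketch of that refactor (illustrative names, not HAPI's actual bean):

```java
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

// Sketch of the parameter-object pattern: a fluent bean collects the inputs,
// and the legacy overload delegates to the bean-based variant.
public class LoadIncludesParams<T> {
	private Collection<T> myMatches = Collections.emptyList();
	private boolean myReverseMode;

	public Collection<T> getMatches() { return myMatches; }
	public LoadIncludesParams<T> setMatches(Collection<T> theMatches) { myMatches = theMatches; return this; }
	public boolean isReverseMode() { return myReverseMode; }
	public LoadIncludesParams<T> setReverseMode(boolean theReverseMode) { myReverseMode = theReverseMode; return this; }

	// Old-style overload simply delegates to the parameter-object variant.
	public static <T> Set<T> loadIncludes(Collection<T> theMatches, boolean theReverseMode) {
		return loadIncludes(new LoadIncludesParams<T>().setMatches(theMatches).setReverseMode(theReverseMode));
	}

	public static <T> Set<T> loadIncludes(LoadIncludesParams<T> theParams) {
		// Placeholder body: the real method runs the include-expansion loop.
		return theParams.getMatches().isEmpty() ? new HashSet<>() : new HashSet<>(theParams.getMatches());
	}
}
```

The payoff is that adding a tenth input later is one new setter, not a signature change across every call site.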
@@ -1161,42 +1201,62 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 			sqlBuilder.append("SELECT r.").append(findPidFieldName);
 			sqlBuilder.append(", r.").append(findResourceTypeFieldName);
 			if (findVersionFieldName != null) {
-				sqlBuilder.append(", r." + findVersionFieldName);
+				sqlBuilder.append(", r.").append(findVersionFieldName);
 			}
 			sqlBuilder.append(" FROM ResourceLink r WHERE ");
 			sqlBuilder.append("r.");
-			sqlBuilder.append(searchPidFieldName);
+			sqlBuilder.append(searchPidFieldName); // (rev mode) target_resource_id | source_resource_id
 			sqlBuilder.append(" IN (:target_pids)");
-			// Technically if the request is a qualified star (e.g. _include=Observation:*) we
-			// should always be checking the source resource type on the resource link. We don't
-			// actually index that column though by default, so in order to try and be efficient
-			// we don't actually include it for includes (but we do for revincludes). This is
-			// because for an include it doesn't really make sense to include a different
-			// resource type than the one you are searching on.
-			if (wantResourceType != null && theReverseMode) {
+			/*
+			 * We need to set the resource type in 2 cases only:
+			 * 1) we are in $everything mode
+			 * (where we only want to fetch specific resource types, regardless of what is
+			 * available to fetch)
+			 * 2) we are doing revincludes
+			 *
+			 * Technically if the request is a qualified star (e.g. _include=Observation:*) we
+			 * should always be checking the source resource type on the resource link. We don't
+			 * actually index that column though by default, so in order to try and be efficient
+			 * we don't actually include it for includes (but we do for revincludes). This is
+			 * because for an include, it doesn't really make sense to include a different
+			 * resource type than the one you are searching on.
+			 */
+			if (wantResourceType != null
+				&& (reverseMode || (myParams != null && myParams.getEverythingMode() != null))
+			) {
+				// because mySourceResourceType is not part of the HFJ_RES_LINK
+				// index, this might not be the most optimal performance.
+				// but it is for an $everything operation (and maybe we should update the index)
 				sqlBuilder.append(" AND r.mySourceResourceType = :want_resource_type");
 			} else {
 				wantResourceType = null;
 			}
 			// When calling $everything on a Patient instance, we don't want to recurse into new Patient resources
 			// (e.g. via Provenance, List, or Group) when in an $everything operation
 			if (myParams != null && myParams.getEverythingMode() == SearchParameterMap.EverythingModeEnum.PATIENT_INSTANCE) {
 				sqlBuilder.append(" AND r.myTargetResourceType != 'Patient'");
 				sqlBuilder.append(" AND r.mySourceResourceType != 'Provenance'");
 			}
+			if (hasDesiredResourceTypes) {
+				sqlBuilder.append(" AND r.myTargetResourceType IN (:desired_target_resource_types)");
+			}
 			String sql = sqlBuilder.toString();
 			List<Collection<JpaPid>> partitions = partition(nextRoundMatches, getMaximumPageSize());
 			for (Collection<JpaPid> nextPartition : partitions) {
-				TypedQuery<?> q = theEntityManager.createQuery(sql, Object[].class);
+				TypedQuery<?> q = entityManager.createQuery(sql, Object[].class);
 				q.setParameter("target_pids", JpaPid.toLongList(nextPartition));
 				if (wantResourceType != null) {
 					q.setParameter("want_resource_type", wantResourceType);
 				}
-				if (theMaxCount != null) {
-					q.setMaxResults(theMaxCount);
-				}
+				if (maxCount != null) {
+					q.setMaxResults(maxCount);
+				}
+				if (hasDesiredResourceTypes) {
+					q.setParameter("desired_target_resource_types", String.join(", ", desiredResourceTypes));
+				}
 				List<?> results = q.getResultList();
 				for (Object nextRow : results) {
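The code above (and the next hunk) follows a common pattern: optional predicates are appended to the SQL text while their bind values are recorded in a map, and the map is later replayed onto the query with `forEach(q::setParameter)`. A hypothetical, self-contained version of that assembly (the table and column names are placeholders, not the actual HAPI query text):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Sketch: build SQL and its named-parameter map in lockstep, so every
// appended predicate is guaranteed to have a matching bind value.
public class QueryAssemblyDemo {
	public static Map.Entry<String, Map<String, Object>> buildQuery(Set<String> theTargetTypes) {
		StringBuilder sql = new StringBuilder(
			"SELECT r.target_resource_id FROM hfj_res_link r WHERE r.src_resource_id IN (:target_pids)");
		Map<String, Object> params = new HashMap<>();
		if (theTargetTypes != null && !theTargetTypes.isEmpty()) {
			if (theTargetTypes.size() == 1) {
				// single type: equality predicate
				sql.append(" AND r.target_resource_type = :target_resource_type");
				params.put("target_resource_type", theTargetTypes.iterator().next());
			} else {
				// multiple types: IN predicate bound to the whole collection
				sql.append(" AND r.target_resource_type IN (:target_resource_types)");
				params.put("target_resource_types", theTargetTypes);
			}
		}
		return Map.entry(sql.toString(), params);
	}
}
```

The caller then does `params.forEach(q::setParameter)` against a JPA `Query`, mirroring the `localReferenceQueryParams` usage in the diff.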
@@ -1220,7 +1280,6 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 				}
 			}
 		} else {
 			List<String> paths;
 			// Start replace
@@ -1229,7 +1288,7 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
 			if (isBlank(resType)) {
 				continue;
 			}
-			RuntimeResourceDefinition def = theContext.getResourceDefinition(resType);
+			RuntimeResourceDefinition def = fhirContext.getResourceDefinition(resType);
 			if (def == null) {
 				ourLog.warn("Unknown resource type in include/revinclude=" + nextInclude.getValue());
 				continue;
@ -1249,77 +1308,58 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
paths = param.getPathsSplitForResourceType(resType); paths = param.getPathsSplitForResourceType(resType);
// end replace // end replace
String targetResourceType = defaultString(nextInclude.getParamTargetType(), null); Set<String> targetResourceTypes = computeTargetResourceTypes(nextInclude, param);
for (String nextPath : paths) { for (String nextPath : paths) {
boolean haveTargetTypesDefinedByParam = param.hasTargets();
String findPidFieldSqlColumn = findPidFieldName.equals(MY_SOURCE_RESOURCE_PID) ? "src_resource_id" : "target_resource_id"; String findPidFieldSqlColumn = findPidFieldName.equals(MY_SOURCE_RESOURCE_PID) ? "src_resource_id" : "target_resource_id";
String fieldsToLoad = "r." + findPidFieldSqlColumn + " AS " + RESOURCE_ID_ALIAS; String fieldsToLoad = "r." + findPidFieldSqlColumn + " AS " + RESOURCE_ID_ALIAS;
if (findVersionFieldName != null) { if (findVersionFieldName != null) {
fieldsToLoad += ", r.target_resource_version AS " + RESOURCE_VERSION_ALIAS; fieldsToLoad += ", r.target_resource_version AS " + RESOURCE_VERSION_ALIAS;
} }
// Query for includes lookup has consider 2 cases // Query for includes lookup has 2 cases
// Case 1: Where target_resource_id is available in hfj_res_link table for local references // Case 1: Where target_resource_id is available in hfj_res_link table for local references
// Case 2: Where target_resource_id is null in hfj_res_link table and referred by a canonical url in target_resource_url // Case 2: Where target_resource_id is null in hfj_res_link table and referred by a canonical url in target_resource_url
// Case 1: // Case 1:
Map<String, Object> localReferenceQueryParams = new HashMap<>();
String searchPidFieldSqlColumn = searchPidFieldName.equals(MY_TARGET_RESOURCE_PID) ? "target_resource_id" : "src_resource_id"; String searchPidFieldSqlColumn = searchPidFieldName.equals(MY_TARGET_RESOURCE_PID) ? "target_resource_id" : "src_resource_id";
StringBuilder resourceIdBasedQuery = new StringBuilder("SELECT " + fieldsToLoad + StringBuilder localReferenceQuery = new StringBuilder("SELECT " + fieldsToLoad +
" FROM hfj_res_link r " + " FROM hfj_res_link r " +
" WHERE r.src_path = :src_path AND " + " WHERE r.src_path = :src_path AND " +
" r.target_resource_id IS NOT NULL AND " + " r.target_resource_id IS NOT NULL AND " +
" r." + searchPidFieldSqlColumn + " IN (:target_pids) "); " r." + searchPidFieldSqlColumn + " IN (:target_pids) ");
if (targetResourceType != null) { localReferenceQueryParams.put("src_path", nextPath);
resourceIdBasedQuery.append(" AND r.target_resource_type = :target_resource_type "); // we loop over target_pids later.
} else if (haveTargetTypesDefinedByParam) { if (targetResourceTypes != null) {
resourceIdBasedQuery.append(" AND r.target_resource_type in (:target_resource_types) "); if (targetResourceTypes.size() == 1) {
localReferenceQuery.append(" AND r.target_resource_type = :target_resource_type ");
localReferenceQueryParams.put("target_resource_type", targetResourceTypes.iterator().next());
} else {
localReferenceQuery.append(" AND r.target_resource_type in (:target_resource_types) ");
localReferenceQueryParams.put("target_resource_types", targetResourceTypes);
}
} }
// Case 2: // Case 2:
String fieldsToLoadFromSpidxUriTable = "rUri.res_id"; Pair<String, Map<String, Object>> canonicalQuery = buildCanonicalUrlQuery(findVersionFieldName, searchPidFieldSqlColumn, targetResourceTypes);
// to match the fields loaded in union
if (fieldsToLoad.split(",").length > 1) {
for (int i = 0; i < fieldsToLoad.split(",").length - 1; i++) {
fieldsToLoadFromSpidxUriTable += ", NULL";
}
}
//@formatter:off
StringBuilder resourceUrlBasedQuery = new StringBuilder("SELECT " + fieldsToLoadFromSpidxUriTable +
" FROM hfj_res_link r " +
" JOIN hfj_spidx_uri rUri ON ( " +
" r.target_resource_url = rUri.sp_uri AND " +
" rUri.sp_name = 'url' ");
if (targetResourceType != null) {
resourceUrlBasedQuery.append(" AND rUri.res_type = :target_resource_type ");
} else if (haveTargetTypesDefinedByParam) {
resourceUrlBasedQuery.append(" AND rUri.res_type IN (:target_resource_types) ");
}
resourceUrlBasedQuery.append(" ) ");
resourceUrlBasedQuery.append(
" WHERE r.src_path = :src_path AND " +
" r.target_resource_id IS NULL AND " +
" r." + searchPidFieldSqlColumn + " IN (:target_pids) ");
//@formatter:on
-String sql = resourceIdBasedQuery + " UNION " + resourceUrlBasedQuery;
+String sql = localReferenceQuery + " UNION " + canonicalQuery.getLeft();
List<Collection<JpaPid>> partitions = partition(nextRoundMatches, getMaximumPageSize());
for (Collection<JpaPid> nextPartition : partitions) {
-Query q = theEntityManager.createNativeQuery(sql, Tuple.class);
-q.setParameter("src_path", nextPath);
+Query q = entityManager.createNativeQuery(sql, Tuple.class);
q.setParameter("target_pids", JpaPid.toLongList(nextPartition));
-if (targetResourceType != null) {
-q.setParameter("target_resource_type", targetResourceType);
-} else if (haveTargetTypesDefinedByParam) {
-q.setParameter("target_resource_types", param.getTargets());
-}
+localReferenceQueryParams.forEach(q::setParameter);
+canonicalQuery.getRight().forEach(q::setParameter);
-if (theMaxCount != null) {
-q.setMaxResults(theMaxCount);
+if (maxCount != null) {
+q.setMaxResults(maxCount);
}
+@SuppressWarnings("unchecked")
List<Tuple> results = q.getResultList();
for (Tuple result : results) {
if (result != null) {
@@ -1336,44 +1376,38 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
}
}
-if (theReverseMode) {
-if (theLastUpdated != null && (theLastUpdated.getLowerBoundAsInstant() != null || theLastUpdated.getUpperBoundAsInstant() != null)) {
-pidsToInclude = new HashSet<>(QueryParameterUtils.filterResourceIdsByLastUpdated(theEntityManager, theLastUpdated, pidsToInclude));
-}
-}
nextRoundMatches.clear();
for (JpaPid next : pidsToInclude) {
-if (original.contains(next) == false && allAdded.contains(next) == false) {
+if (!original.contains(next) && !allAdded.contains(next)) {
nextRoundMatches.add(next);
}
}
addedSomeThisRound = allAdded.addAll(pidsToInclude);
-if (theMaxCount != null && allAdded.size() >= theMaxCount) {
+if (maxCount != null && allAdded.size() >= maxCount) {
break;
}
-} while (includes.size() > 0 && nextRoundMatches.size() > 0 && addedSomeThisRound);
+} while (!includes.isEmpty() && !nextRoundMatches.isEmpty() && addedSomeThisRound);
allAdded.removeAll(original);
-ourLog.info("Loaded {} {} in {} rounds and {} ms for search {}", allAdded.size(), theReverseMode ? "_revincludes" : "_includes", roundCounts, w.getMillisAndRestart(), theSearchIdOrDescription);
+ourLog.info("Loaded {} {} in {} rounds and {} ms for search {}", allAdded.size(), reverseMode ? "_revincludes" : "_includes", roundCounts, w.getMillisAndRestart(), searchIdOrDescription);
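The do/while loop above is a breadth-first expansion: each round queries the includes of the PIDs discovered in the previous round and stops once a round adds nothing new (or the max count is hit). A minimal graph-based sketch of that control flow, with a plain adjacency map standing in for the SQL lookups (the map and integer PIDs are illustrative, not HAPI types):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class IncludeRoundsSketch {
	// Expand references round by round until a round yields nothing new,
	// mirroring the do/while loop in loadIncludes.
	static Set<Integer> expand(Set<Integer> original, Map<Integer, List<Integer>> refs) {
		Set<Integer> allAdded = new HashSet<>();
		List<Integer> nextRoundMatches = new ArrayList<>(original);
		boolean addedSomeThisRound;
		do {
			// one "round": gather everything the current frontier points at
			Set<Integer> pidsToInclude = new HashSet<>();
			for (Integer pid : nextRoundMatches) {
				pidsToInclude.addAll(refs.getOrDefault(pid, List.of()));
			}
			// next frontier = only the genuinely new PIDs
			nextRoundMatches.clear();
			for (Integer next : pidsToInclude) {
				if (!original.contains(next) && !allAdded.contains(next)) {
					nextRoundMatches.add(next);
				}
			}
			addedSomeThisRound = allAdded.addAll(pidsToInclude);
		} while (!nextRoundMatches.isEmpty() && addedSomeThisRound);
		// the caller only wants the newly discovered resources
		allAdded.removeAll(original);
		return allAdded;
	}

	public static void main(String[] args) {
		Map<Integer, List<Integer>> refs = Map.of(1, List.of(2), 2, List.of(3), 3, List.of(1));
		// starting from 1, the expansion transitively reaches 2 and 3 but excludes 1 itself
		System.out.println(expand(Set.of(1), refs));
	}
}
```

Note how the cycle 3 → 1 terminates cleanly: PID 1 is in `original`, so it never re-enters the frontier.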
// Interceptor call: STORAGE_PREACCESS_RESOURCES
// This can be used to remove results from the search result details before
// the user has a chance to know that they were in the results
-if (allAdded.size() > 0) {
-if (CompositeInterceptorBroadcaster.hasHooks(Pointcut.STORAGE_PREACCESS_RESOURCES, myInterceptorBroadcaster, theRequest)) {
+if (!allAdded.isEmpty()) {
+if (CompositeInterceptorBroadcaster.hasHooks(Pointcut.STORAGE_PREACCESS_RESOURCES, myInterceptorBroadcaster, request)) {
List<JpaPid> includedPidList = new ArrayList<>(allAdded);
JpaPreResourceAccessDetails accessDetails = new JpaPreResourceAccessDetails(includedPidList, () -> this);
HookParams params = new HookParams()
.add(IPreResourceAccessDetails.class, accessDetails)
-.add(RequestDetails.class, theRequest)
-.addIfMatchesType(ServletRequestDetails.class, theRequest);
-CompositeInterceptorBroadcaster.doCallHooks(myInterceptorBroadcaster, theRequest, Pointcut.STORAGE_PREACCESS_RESOURCES, params);
+.add(RequestDetails.class, request)
+.addIfMatchesType(ServletRequestDetails.class, request);
+CompositeInterceptorBroadcaster.doCallHooks(myInterceptorBroadcaster, request, Pointcut.STORAGE_PREACCESS_RESOURCES, params);
for (int i = includedPidList.size() - 1; i >= 0; i--) {
if (accessDetails.isDontReturnResourceAtIndex(i)) {
@@ -1389,6 +1423,62 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
return allAdded;
}
@Nullable
private static Set<String> computeTargetResourceTypes(Include nextInclude, RuntimeSearchParam param) {
String targetResourceType = defaultString(nextInclude.getParamTargetType(), null);
boolean haveTargetTypesDefinedByParam = param.hasTargets();
Set<String> targetResourceTypes;
if (targetResourceType != null) {
targetResourceTypes = Set.of(targetResourceType);
} else if (haveTargetTypesDefinedByParam) {
targetResourceTypes = param.getTargets();
} else {
// all types!
targetResourceTypes = null;
}
return targetResourceTypes;
}
@Nonnull
private Pair<String, Map<String, Object>> buildCanonicalUrlQuery(String theVersionFieldName, String thePidFieldSqlColumn, Set<String> theTargetResourceTypes) {
String fieldsToLoadFromSpidxUriTable = "rUri.res_id";
if (theVersionFieldName != null) {
// canonical-uri references aren't versioned, but we need to match the column count for the UNION
fieldsToLoadFromSpidxUriTable += ", NULL";
}
// The logical join will be by hfj_spidx_uri on sp_name='url' and sp_uri=target_resource_url.
// But sp_name isn't indexed, so we use hash_identity instead.
if (theTargetResourceTypes == null) {
// hash_identity includes the resource type. So a null wildcard must be replaced with a list of all types.
theTargetResourceTypes = myDaoRegistry.getRegisteredDaoTypes();
}
assert !theTargetResourceTypes.isEmpty();
Set<Long> identityHashesForTypes = theTargetResourceTypes.stream()
.map(type-> BaseResourceIndexedSearchParam.calculateHashIdentity(myPartitionSettings, myRequestPartitionId, type, "url"))
.collect(Collectors.toSet());
Map<String, Object> canonicalUriQueryParams = new HashMap<>();
StringBuilder canonicalUrlQuery = new StringBuilder(
"SELECT " + fieldsToLoadFromSpidxUriTable +
" FROM hfj_res_link r " +
" JOIN hfj_spidx_uri rUri ON ( ");
// join on hash_identity and sp_uri - indexed in IDX_SP_URI_HASH_IDENTITY_V2
if (theTargetResourceTypes.size() == 1) {
canonicalUrlQuery.append(" rUri.hash_identity = :uri_identity_hash ");
canonicalUriQueryParams.put("uri_identity_hash", identityHashesForTypes.iterator().next());
} else {
canonicalUrlQuery.append(" rUri.hash_identity in (:uri_identity_hashes) ");
canonicalUriQueryParams.put("uri_identity_hashes", identityHashesForTypes);
}
canonicalUrlQuery.append(" AND r.target_resource_url = rUri.sp_uri )" +
" WHERE r.src_path = :src_path AND " +
" r.target_resource_id IS NULL AND " +
" r." + thePidFieldSqlColumn + " IN (:target_pids) ");
return Pair.of(canonicalUrlQuery.toString(), canonicalUriQueryParams);
}
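The hash_identity trick used in `buildCanonicalUrlQuery` above (precompute a hash over the resource type and parameter name so the lookup can probe a single indexed column instead of filtering on the unindexed `sp_name`) can be sketched in isolation. The hashing function below is a deliberately simplified stand-in, not HAPI's actual `calculateHashIdentity`, and the map plays the role of the `IDX_SP_URI_HASH_IDENTITY_V2` index:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

public class HashIdentitySketch {
	// Illustrative stand-in for calculateHashIdentity: collapse the
	// (resourceType, paramName) pair into one long so a single indexed
	// column can be probed instead of two unindexed ones.
	static long hashIdentity(String resourceType, String paramName) {
		return Objects.hash(resourceType, paramName);
	}

	public static void main(String[] args) {
		// Simulated index keyed by hash_identity, standing in for the DB index
		Map<Long, String> index = new HashMap<>();
		index.put(hashIdentity("Questionnaire", "url"), "res_id=42");

		// Lookup by precomputed hash instead of filtering on sp_name = 'url'
		String hit = index.get(hashIdentity("Questionnaire", "url"));
		System.out.println(hit);
	}
}
```

Because the hash bakes in the resource type, a "match any type" query cannot use a single hash; that is why the code above replaces a null wildcard with one hash per registered resource type.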
private List<Collection<JpaPid>> partition(Collection<JpaPid> theNextRoundMatches, int theMaxLoad) {
if (theNextRoundMatches.size() <= theMaxLoad) {
return Collections.singletonList(theNextRoundMatches);
@@ -1557,6 +1647,9 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
return myResourceName;
}
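The `partition` helper above is truncated by the diff; its job is to chunk the PID set so the `IN (:target_pids)` list stays under the page-size limit. A plausible self-contained completion under that assumption (generic element type instead of `JpaPid`, so it runs standalone):

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;

public class PartitionSketch {
	// Split a collection into sublists of at most maxLoad elements,
	// mirroring how target_pids are fed to the SQL IN clause in batches.
	static <T> List<Collection<T>> partition(Collection<T> matches, int maxLoad) {
		if (matches.size() <= maxLoad) {
			// common case: everything fits in one batch
			return Collections.singletonList(matches);
		}
		List<Collection<T>> result = new ArrayList<>();
		List<T> asList = new ArrayList<>(matches);
		for (int i = 0; i < asList.size(); i += maxLoad) {
			result.add(asList.subList(i, Math.min(i + maxLoad, asList.size())));
		}
		return result;
	}

	public static void main(String[] args) {
		// five elements with a batch size of two -> chunks of sizes 2, 2, 1
		System.out.println(partition(List.of(1, 2, 3, 4, 5), 2).size());
	}
}
```

Batching like this matters on databases that cap the number of bind values in an `IN` list (Oracle's 1000-element limit being the classic example).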
/**
* IncludesIterator, used to recursively fetch resources from the provided list of PIDs
*/
public class IncludesIterator extends BaseIterator<JpaPid> implements Iterator<JpaPid> {
private final RequestDetails myRequest;
@@ -1574,7 +1667,23 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
while (myNext == null) {
if (myCurrentIterator == null) {
-Set<Include> includes = Collections.singleton(new Include("*", true));
+Set<Include> includes = new HashSet<>();
if (myParams.containsKey(Constants.PARAM_TYPE)) {
for (List<IQueryParameterType> typeList : myParams.get(Constants.PARAM_TYPE)) {
for (IQueryParameterType type : typeList) {
String queryString = ParameterUtil.unescape(type.getValueAsQueryToken(myContext));
for (String resourceType : queryString.split(",")) {
String rt = resourceType.trim();
if (isNotBlank(rt)) {
includes.add(new Include(rt + ":*", true));
}
}
}
}
}
if (includes.isEmpty()) {
includes.add(new Include("*", true));
}
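The `_type` handling added above splits each comma-separated token into resource types and turns each into a recursive wildcard include, falling back to `*` when nothing usable was given. The same logic can be sketched standalone; the plain strings below stand in for HAPI's `Include` objects, and the unescaping step is omitted:

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class TypeFilterSketch {
	// Turn raw "_type" values like "Patient,Observation" into per-type
	// wildcard include expressions; empty input falls back to the global "*".
	static Set<String> toIncludes(List<String> typeParams) {
		Set<String> includes = new LinkedHashSet<>();
		for (String queryString : typeParams) {
			for (String resourceType : queryString.split(",")) {
				String rt = resourceType.trim();
				if (!rt.isEmpty()) {
					includes.add(rt + ":*");
				}
			}
		}
		if (includes.isEmpty()) {
			// no usable _type values: include everything, as before this change
			includes.add("*");
		}
		return includes;
	}

	public static void main(String[] args) {
		System.out.println(toIncludes(List.of("Patient, Observation"))); // [Patient:*, Observation:*]
		System.out.println(toIncludes(List.of())); // [*]
	}
}
```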
Set<JpaPid> newPids = loadIncludes(myContext, myEntityManager, myCurrentPids, includes, false, getParams().getLastUpdated(), mySearchUuid, myRequest, null);
myCurrentIterator = newPids.iterator();
}
@@ -1604,6 +1713,9 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
}
/**
* Basic Query iterator, used to fetch the results of a query.
*/
private final class QueryIterator extends BaseIterator<JpaPid> implements IResultIterator<JpaPid> {
private final SearchRuntimeDetails mySearchRuntimeDetails;
@@ -1627,8 +1739,8 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
myOffset = myParams.getOffset();
myRequest = theRequest;
-// Includes are processed inline for $everything query when we don't have a '_type' specified
-if (myParams.getEverythingMode() != null && !myParams.containsKey(Constants.PARAM_TYPE)) {
+// everything requires fetching recursively all related resources
+if (myParams.getEverythingMode() != null) {
myFetchIncludesForEverythingOperation = true;
}
@@ -1638,7 +1750,6 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
}
private void fetchNext() {
try {
if (myHaveRawSqlHooks) {
CurrentThreadCaptureQueriesListener.startCapturing();
@@ -1656,6 +1767,7 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
}
}
// assigns the results iterator
initializeIteratorQuery(myOffset, myMaxResultsToFetch);
if (myAlsoIncludePids == null) {
@@ -1663,9 +1775,8 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
}
}
if (myNext == null) {
for (Iterator<JpaPid> myPreResultsIterator = myAlsoIncludePids.iterator(); myPreResultsIterator.hasNext(); ) {
JpaPid next = myPreResultsIterator.next();
if (next != null)
@@ -1724,6 +1835,8 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
}
if (myNext == null) {
// if we got here, it means the current JpaPid has already been processed
// and we will decide (here) if we need to fetch related resources recursively
if (myFetchIncludesForEverythingOperation) {
myIncludesIterator = new IncludesIterator(myPidSet, myRequest);
myFetchIncludesForEverythingOperation = false;
@@ -1750,6 +1863,7 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
mySearchRuntimeDetails.setFoundMatchesCount(myPidSet.size());
} finally {
// search finished - fire hooks
if (myHaveRawSqlHooks) {
SqlQueryList capturedQueries = CurrentThreadCaptureQueriesListener.getCurrentQueueAndStopCapturing();
HookParams params = new HookParams()

View File

@@ -40,6 +40,7 @@ import javax.persistence.PersistenceContextType;
import javax.persistence.Query;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.Arrays;
public class SearchQueryExecutor implements ISearchQueryExecutor {
@@ -119,7 +120,7 @@ public class SearchQueryExecutor implements ISearchQueryExecutor {
hibernateQuery.setParameter(i, args[i - 1]);
}
-ourLog.trace("About to execute SQL: {}", sql);
+ourLog.trace("About to execute SQL: {}. Parameters: {}", sql, Arrays.toString(args));
/*
* These settings help to ensure that we use a search cursor

View File

@@ -2200,6 +2200,7 @@ public class TermReadSvcImpl implements ITermReadSvc, IHasScheduledJobs {
public IValidationSupport.CodeValidationResult validateCode(@Nonnull ValidationSupportContext theValidationSupportContext, @Nonnull ConceptValidationOptions theOptions, String theCodeSystemUrl, String theCode, String theDisplay, String theValueSetUrl) {
//TODO GGG TRY TO JUST AUTO_PASS HERE AND SEE WHAT HAPPENS.
invokeRunnableForUnitTest();
theOptions.setValidateDisplay(isNotBlank(theDisplay));
if (isNotBlank(theValueSetUrl)) {
return validateCodeInValueSet(theValidationSupportContext, theOptions, theValueSetUrl, theCodeSystemUrl, theCode, theDisplay);

View File

@@ -0,0 +1,34 @@
/*-
* #%L
* HAPI FHIR JPA Server
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.jpa.util;
import ca.uhn.fhir.jpa.subscription.match.deliver.email.EmailDetails;
import ca.uhn.fhir.jpa.subscription.match.deliver.email.IEmailSender;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class LoggingEmailSender implements IEmailSender {
private static final Logger ourLog = LoggerFactory.getLogger(LoggingEmailSender.class);
@Override
public void send(EmailDetails theDetails) {
ourLog.info("Not sending subscription email to: {}", theDetails.getTo());
}
}

View File

@@ -198,25 +198,6 @@ public class QueryParameterUtils {
return lastUpdatedPredicates;
}
-public static List<JpaPid> filterResourceIdsByLastUpdated(EntityManager theEntityManager, final DateRangeParam theLastUpdated, Collection<JpaPid> thePids) {
-if (thePids.isEmpty()) {
-return Collections.emptyList();
-}
-CriteriaBuilder builder = theEntityManager.getCriteriaBuilder();
-CriteriaQuery<Long> cq = builder.createQuery(Long.class);
-Root<ResourceTable> from = cq.from(ResourceTable.class);
-cq.select(from.get("myId").as(Long.class));
-List<Predicate> lastUpdatedPredicates = createLastUpdatedPredicates(theLastUpdated, builder, from);
-List<Long> longIds = thePids.stream().map(JpaPid::getId).collect(Collectors.toList());
-lastUpdatedPredicates.add(from.get("myId").as(Long.class).in(longIds));
-cq.where(toPredicateArray(lastUpdatedPredicates));
-TypedQuery<Long> query = theEntityManager.createQuery(cq);
-return query.getResultList().stream().map(JpaPid::fromId).collect(Collectors.toList());
-}
public static void verifySearchHasntFailedOrThrowInternalErrorException(Search theSearch) {
if (theSearch.getStatus() == SearchStatusEnum.FAILED) {
Integer status = theSearch.getFailureCode();

View File

@@ -16,6 +16,7 @@ import ca.uhn.fhir.jpa.dao.mdm.MdmExpansionCacheSvc;
import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService;
import ca.uhn.fhir.jpa.dao.tx.NonTransactionalHapiTransactionService;
import ca.uhn.fhir.jpa.model.dao.JpaPid;
import ca.uhn.fhir.jpa.model.search.SearchBuilderLoadIncludesParameters;
import ca.uhn.fhir.jpa.model.search.SearchRuntimeDetails;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.mdm.api.MdmMatchResultEnum;
@@ -25,8 +26,6 @@ import ca.uhn.fhir.model.primitive.IdDt;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
import ca.uhn.fhir.rest.api.server.bulk.BulkDataExportOptions;
-import ca.uhn.fhir.rest.api.server.storage.BaseResourcePersistentId;
-import ca.uhn.fhir.rest.api.server.storage.IResourcePersistentId;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.Group;
@@ -43,7 +42,6 @@ import org.mockito.Mock;
import org.mockito.Spy;
import org.mockito.junit.jupiter.MockitoExtension;
-import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
@@ -57,6 +55,7 @@ import java.util.Optional;
import java.util.Set;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.containsString;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertNotNull;
@@ -67,11 +66,10 @@ import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.anyString;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.ArgumentMatchers.nullable;
-import static org.mockito.Mockito.times;
import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;
-import static org.hamcrest.Matchers.containsString;
@ExtendWith(MockitoExtension.class)
public class JpaBulkExportProcessorTest {
@@ -366,8 +364,10 @@ public class JpaBulkExportProcessorTest {
}
// source is: isExpandMdm, (whether or not to test on a specific partition)
@ParameterizedTest
@CsvSource({"false, false", "false, true", "true, true", "true, false"})
@SuppressWarnings({"rawtypes", "unchecked"})
public void getResourcePidIterator_groupExportStyleWithNonPatientResource_returnsIterator(boolean theMdm, boolean thePartitioned) {
// setup
ExportPIDIteratorParameters parameters = createExportParameters(BulkDataExportOptions.ExportStyle.GROUP);
@@ -436,8 +436,9 @@ public class JpaBulkExportProcessorTest {
.thenReturn(observationDao);
when(mySearchBuilderFactory.newSearchBuilder(eq(observationDao), eq("Observation"), eq(Observation.class)))
.thenReturn(observationSearchBuilder);
-when(observationSearchBuilder.loadIncludes(any(), any(), eq(observationPidSet), any(), eq(false), any(), any(),
-any(SystemRequestDetails.class), any()))
+when(observationSearchBuilder.loadIncludes(
+any(SearchBuilderLoadIncludesParameters.class)
+))
.thenReturn(new HashSet<>());
// ret
@@ -471,10 +472,12 @@ public class JpaBulkExportProcessorTest {
ArgumentCaptor<SystemRequestDetails> groupDaoReadSystemRequestDetailsCaptor = ArgumentCaptor.forClass(SystemRequestDetails.class);
verify(groupDao).read(any(IIdType.class), groupDaoReadSystemRequestDetailsCaptor.capture());
validatePartitionId(thePartitioned, groupDaoReadSystemRequestDetailsCaptor.getValue().getRequestPartitionId());
-ArgumentCaptor<SystemRequestDetails> searchBuilderLoadIncludesRequestDetailsCaptor = ArgumentCaptor.forClass(SystemRequestDetails.class);
-verify(observationSearchBuilder).loadIncludes(any(), any(), eq(observationPidSet), any(), eq(false), any(), any(),
-searchBuilderLoadIncludesRequestDetailsCaptor.capture(), any());
-validatePartitionId(thePartitioned, searchBuilderLoadIncludesRequestDetailsCaptor.getValue().getRequestPartitionId());
+ArgumentCaptor<SearchBuilderLoadIncludesParameters> searchBuilderLoadIncludesRequestDetailsCaptor = ArgumentCaptor.forClass(SearchBuilderLoadIncludesParameters.class);
+verify(observationSearchBuilder).loadIncludes(searchBuilderLoadIncludesRequestDetailsCaptor.capture());
+SearchBuilderLoadIncludesParameters param = searchBuilderLoadIncludesRequestDetailsCaptor.getValue();
+assertTrue(param.getRequestDetails() instanceof SystemRequestDetails);
+SystemRequestDetails details = (SystemRequestDetails) param.getRequestDetails();
+validatePartitionId(thePartitioned, details.getRequestPartitionId());
}
@ParameterizedTest

View File

@@ -0,0 +1,143 @@
package ca.uhn.fhir.mdm.batch2.clear;
import ca.uhn.fhir.jpa.entity.MdmLink;
import ca.uhn.fhir.jpa.model.dao.JpaPid;
import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
import ca.uhn.fhir.jpa.test.config.TestR4Config;
import ca.uhn.fhir.mdm.api.MdmLinkSourceEnum;
import ca.uhn.fhir.mdm.api.MdmMatchResultEnum;
import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
import ca.uhn.fhir.util.StopWatch;
import org.apache.commons.dbcp2.BasicDataSource;
import org.hibernate.dialect.PostgreSQL9Dialect;
import org.hl7.fhir.r4.model.Coding;
import org.hl7.fhir.r4.model.Patient;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import org.postgresql.Driver;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.annotation.Configuration;
import org.springframework.test.context.ContextConfiguration;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import static ca.uhn.fhir.mdm.api.MdmConstants.CODE_GOLDEN_RECORD;
import static ca.uhn.fhir.mdm.api.MdmConstants.CODE_HAPI_MDM_MANAGED;
import static ca.uhn.fhir.mdm.api.MdmConstants.SYSTEM_GOLDEN_RECORD_STATUS;
import static ca.uhn.fhir.mdm.api.MdmConstants.SYSTEM_MDM_MANAGED;
import static org.junit.jupiter.api.Assertions.assertTrue;
@Disabled("Keeping as a sandbox to be used whenever we need a lot of MdmLinks in DB for performance testing")
@ContextConfiguration(classes = {MdmLinkSlowDeletionSandboxIT.TestDataSource.class})
public class MdmLinkSlowDeletionSandboxIT extends BaseJpaR4Test {
private static final Logger ourLog = LoggerFactory.getLogger(MdmLinkSlowDeletionSandboxIT.class);
private final int ourMdmLinksToCreate = 1_000_000;
private final int ourLogMdmLinksEach = 1_000;
@Override
public void afterPurgeDatabase() {
// keep the generated data!
// super.afterPurgeDatabase();
}
@Disabled
@Test
void createMdmLinks() {
generatePatientsAndMdmLinks(ourMdmLinksToCreate);
long totalLinks = myMdmLinkDao.count();
ourLog.info("Total links in DB: {}", totalLinks);
assertTrue(totalLinks > 0);
}
private void generatePatientsAndMdmLinks(int theLinkCount) {
StopWatch sw = new StopWatch();
int totalMdmLinksCreated = 0;
for (int i = 0; i < theLinkCount; i++) {
List<JpaPid> patientIds = createMdmLinkPatients();
createMdmLink(patientIds.get(0), patientIds.get(1));
totalMdmLinksCreated++;
if (totalMdmLinksCreated % ourLogMdmLinksEach == 0) {
ourLog.info("Total MDM links created: {} in {} - ETA: {}", totalMdmLinksCreated, sw,
sw.getEstimatedTimeRemaining(totalMdmLinksCreated, ourMdmLinksToCreate));
}
}
}
private void createMdmLink(JpaPid thePidSource, JpaPid thePidTarget) {
MdmLink link = new MdmLink();
link.setGoldenResourcePersistenceId( thePidSource );
link.setSourcePersistenceId( thePidTarget );
Date now = new Date();
link.setCreated(now);
link.setUpdated(now);
link.setVersion("1");
link.setLinkSource(MdmLinkSourceEnum.MANUAL);
link.setMatchResult(MdmMatchResultEnum.MATCH);
link.setMdmSourceType("Patient");
link.setEidMatch(false);
link.setHadToCreateNewGoldenResource(true);
link.setRuleCount(6L);
link.setScore(.8);
link.setVector(61L);
runInTransaction(() -> myEntityManager.persist(link));
}
private List<JpaPid> createMdmLinkPatients() {
List<JpaPid> patientIds = new ArrayList<>();
for (int i = 0; i < 2; i++) {
Patient patient = new Patient();
patient.addName().setFamily(String.format("lastn-%07d", i)).addGiven(String.format("name-%07d", i));
if (i % 2 == 1) {
patient.getMeta()
.addTag(new Coding().setSystem(SYSTEM_MDM_MANAGED).setCode(CODE_HAPI_MDM_MANAGED));
} else {
patient.getMeta()
.addTag(new Coding().setSystem(SYSTEM_GOLDEN_RECORD_STATUS).setCode(CODE_GOLDEN_RECORD));
}
Long pId = myPatientDao.create(patient, new SystemRequestDetails()).getId().getIdPartAsLong();
JpaPid jpaPid = JpaPid.fromIdAndResourceType(pId, "Patient");
patientIds.add(jpaPid);
}
return patientIds;
}
@Configuration
public static class TestDataSource extends TestR4Config {
@Override
public String getHibernateDialect() {
return PostgreSQL9Dialect.class.getName();
// return Oracle12cDialect.class.getName();
}
@Override
public void setConnectionProperties(BasicDataSource theDataSource) {
theDataSource.setDriver(new Driver());
theDataSource.setUrl("jdbc:postgresql://localhost/mdm_link_perf");
theDataSource.setMaxWaitMillis(30000);
theDataSource.setUsername("cdr");
theDataSource.setPassword("smileCDR");
theDataSource.setMaxTotal(ourMaxThreads);
// theDataSource.setDriver(DriverTypeEnum.ORACLE_12C);
// theDataSource.setUrl("jdbc:oracle:thin:@localhost:1527/cdr.localdomain");
// theDataSource.setMaxWaitMillis(30000);
// theDataSource.setUsername("cdr");
// theDataSource.setPassword("smileCDR");
// theDataSource.setMaxTotal(ourMaxThreads);
}
}
}

View File

@@ -883,7 +883,7 @@ public class ResourceTable extends BaseHasResource implements Serializable, IBas
}
private void populateId(IIdType retVal) {
-if (myFhirId != null) {
+if (myFhirId != null && !myFhirId.isEmpty()) {
retVal.setValue(getResourceType() + '/' + myFhirId + '/' + Constants.PARAM_HISTORY + '/' + getVersion());
} else if (getTransientForcedId() != null) {
// Avoid a join query if possible

View File

@@ -1,8 +1,15 @@
package ca.uhn.fhir.jpa.model.entity;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.model.primitive.IdDt;
import org.hl7.fhir.r4.model.Patient;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.junit.jupiter.api.Assertions.*;
public class ResourceTableTest {
@@ -16,5 +23,26 @@ public class ResourceTableTest {
}
}
@ParameterizedTest
@CsvSource(value={
"123, 123, Patient/123/_history/1",
", 123, Patient/123/_history/1",
"null, 456, Patient/456/_history/1"
},nullValues={"null"})
public void testPopulateId(String theFhirId, String theForcedId, String theExpected) {
// Given
ResourceTable t = new ResourceTable();
t.setFhirId(theFhirId);
ForcedId forcedId = new ForcedId();
forcedId.setForcedId(theForcedId);
t.setForcedId(forcedId);
t.setResourceType(new Patient().getResourceType().name());
t.setVersionForUnitTest(1);
// When
IdDt actual = t.getIdDt();
// Then
assertEquals(theExpected, actual.getValue());
}
}

View File

@@ -27,30 +27,48 @@ import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.model.api.ExtensionDt;
import ca.uhn.fhir.rest.api.RestSearchParameterTypeEnum;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.util.DatatypeUtil;
import ca.uhn.fhir.util.ExtensionUtil;
import ca.uhn.fhir.util.FhirTerser;
import ca.uhn.fhir.util.HapiExtensions;
import ca.uhn.fhir.util.PhoneticEncoderUtil;
import org.apache.commons.lang3.StringUtils;
import org.hl7.fhir.dstu3.model.Extension;
import org.hl7.fhir.dstu3.model.SearchParameter;
import org.hl7.fhir.instance.model.api.IBase;
import org.hl7.fhir.instance.model.api.IBaseDatatype;
import org.hl7.fhir.instance.model.api.IBaseExtension;
import org.hl7.fhir.instance.model.api.IBaseHasExtensions;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.instance.model.api.IPrimitiveType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;
import static org.apache.commons.lang3.StringUtils.isBlank;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
import static org.apache.commons.lang3.StringUtils.startsWith;

@Service
public class SearchParameterCanonicalizer {
	private static final Logger ourLog = LoggerFactory.getLogger(SearchParameterCanonicalizer.class);

	private final FhirContext myFhirContext;
	private final FhirTerser myTerser;

	@Autowired
	public SearchParameterCanonicalizer(FhirContext theFhirContext) {
		myFhirContext = theFhirContext;
		myTerser = myFhirContext.newTerser();
	}

	private static Collection<String> toStrings(Collection<? extends IPrimitiveType<String>> theBase) {
private static Collection<String> toStrings(Collection<? extends IPrimitiveType<String>> theBase) { private static Collection<String> toStrings(Collection<? extends IPrimitiveType<String>> theBase) {
@@ -95,6 +113,14 @@ public class SearchParameterCanonicalizer {
		String name = theNextSp.getCode();
		String description = theNextSp.getDescription();
		String path = theNextSp.getXpath();
		Collection<String> baseResource = toStrings(Collections.singletonList(theNextSp.getBaseElement()));
		List<String> baseCustomResources = extractDstu2CustomResourcesFromExtensions(theNextSp, HapiExtensions.EXTENSION_SEARCHPARAM_CUSTOM_BASE_RESOURCE);
		if (!baseCustomResources.isEmpty()) {
			baseResource = Collections.singleton(baseCustomResources.get(0));
		}
		RestSearchParameterTypeEnum paramType = null;
		RuntimeSearchParam.RuntimeSearchParamStatusEnum status = null;
		if (theNextSp.getTypeElement().getValueAsEnum() != null) {
@@ -138,8 +164,11 @@ public class SearchParameterCanonicalizer {
				break;
			}
		}
		Set<String> targetResources = DatatypeUtil.toStringSet(theNextSp.getTarget());
		List<String> targetCustomResources = extractDstu2CustomResourcesFromExtensions(theNextSp, HapiExtensions.EXTENSION_SEARCHPARAM_CUSTOM_TARGET_RESOURCE);
		maybeAddCustomResourcesToResources(targetResources, targetCustomResources);

		if (isBlank(name) || isBlank(path)) {
			if (paramType != RestSearchParameterTypeEnum.COMPOSITE) {
@@ -164,14 +193,19 @@ public class SearchParameterCanonicalizer {
		}

		List<RuntimeSearchParam.Component> components = Collections.emptyList();
		return new RuntimeSearchParam(id, uri, name, description, path, paramType, Collections.emptySet(), targetResources, status, unique, components, baseResource);
	}
	private RuntimeSearchParam canonicalizeSearchParameterDstu3(org.hl7.fhir.dstu3.model.SearchParameter theNextSp) {
		String name = theNextSp.getCode();
		String description = theNextSp.getDescription();
		String path = theNextSp.getExpression();
		List<String> baseResources = new ArrayList<>(toStrings(theNextSp.getBase()));
		List<String> baseCustomResources = extractDstu3CustomResourcesFromExtensions(theNextSp, HapiExtensions.EXTENSION_SEARCHPARAM_CUSTOM_BASE_RESOURCE);
		maybeAddCustomResourcesToResources(baseResources, baseCustomResources);
		RestSearchParameterTypeEnum paramType = null;
		RuntimeSearchParam.RuntimeSearchParamStatusEnum status = null;
		if (theNextSp.getType() != null) {
@@ -222,8 +256,11 @@ public class SearchParameterCanonicalizer {
				break;
			}
		}
		Set<String> targetResources = DatatypeUtil.toStringSet(theNextSp.getTarget());
		List<String> targetCustomResources = extractDstu3CustomResourcesFromExtensions(theNextSp, HapiExtensions.EXTENSION_SEARCHPARAM_CUSTOM_TARGET_RESOURCE);
		maybeAddCustomResourcesToResources(targetResources, targetCustomResources);

		if (isBlank(name) || isBlank(path) || paramType == null) {
			if (paramType != RestSearchParameterTypeEnum.COMPOSITE) {
@@ -252,35 +289,23 @@ public class SearchParameterCanonicalizer {
			components.add(new RuntimeSearchParam.Component(next.getExpression(), next.getDefinition().getReferenceElement().toUnqualifiedVersionless().getValue()));
		}
		return new RuntimeSearchParam(id, uri, name, description, path, paramType, Collections.emptySet(), targetResources, status, unique, components, baseResources);
	}
	private RuntimeSearchParam canonicalizeSearchParameterR4Plus(IBaseResource theNextSp) {
		String name = myTerser.getSinglePrimitiveValueOrNull(theNextSp, "code");
		String description = myTerser.getSinglePrimitiveValueOrNull(theNextSp, "description");
		String path = myTerser.getSinglePrimitiveValueOrNull(theNextSp, "expression");

		Set<String> baseResources = extractR4PlusResources("base", theNextSp);
		List<String> baseCustomResources = extractR4PlusCustomResourcesFromExtensions(theNextSp, HapiExtensions.EXTENSION_SEARCHPARAM_CUSTOM_BASE_RESOURCE);

		maybeAddCustomResourcesToResources(baseResources, baseCustomResources);

		RestSearchParameterTypeEnum paramType = null;
		RuntimeSearchParam.RuntimeSearchParamStatusEnum status = null;
		switch (myTerser.getSinglePrimitiveValue(theNextSp, "type").orElse("")) {
			case "composite":
				paramType = RestSearchParameterTypeEnum.COMPOSITE;
				break;
@@ -309,7 +334,7 @@ public class SearchParameterCanonicalizer {
				paramType = RestSearchParameterTypeEnum.SPECIAL;
				break;
		}
		switch (myTerser.getSinglePrimitiveValue(theNextSp, "status").orElse("")) {
			case "active":
				status = RuntimeSearchParam.RuntimeSearchParamStatusEnum.ACTIVE;
				break;
@@ -323,24 +348,11 @@ public class SearchParameterCanonicalizer {
				status = RuntimeSearchParam.RuntimeSearchParamStatusEnum.UNKNOWN;
				break;
		}
		Set<String> targetResources = extractR4PlusResources("target", theNextSp);
		List<String> targetCustomResources = extractR4PlusCustomResourcesFromExtensions(theNextSp, HapiExtensions.EXTENSION_SEARCHPARAM_CUSTOM_TARGET_RESOURCE);

		maybeAddCustomResourcesToResources(targetResources, targetCustomResources);

		if (isBlank(name) || isBlank(path) || paramType == null) {
			if ("_text".equals(name) || "_content".equals(name)) {
@@ -351,7 +363,7 @@ public class SearchParameterCanonicalizer {
		}

		IIdType id = theNextSp.getIdElement();
		String uri = myTerser.getSinglePrimitiveValueOrNull(theNextSp, "url");

		ComboSearchParamType unique = null;

		String value = ((IBaseHasExtensions) theNextSp).getExtension()
@@ -369,9 +381,9 @@ public class SearchParameterCanonicalizer {
		}

		List<RuntimeSearchParam.Component> components = new ArrayList<>();
		for (IBase next : myTerser.getValues(theNextSp, "component")) {
			String expression = myTerser.getSinglePrimitiveValueOrNull(next, "expression");
			String definition = myTerser.getSinglePrimitiveValueOrNull(next, "definition");
			if (startsWith(definition, "/SearchParameter/")) {
				definition = definition.substring(1);
			}
@@ -379,7 +391,15 @@ public class SearchParameterCanonicalizer {
			components.add(new RuntimeSearchParam.Component(expression, definition));
		}
		return new RuntimeSearchParam(id, uri, name, description, path, paramType, Collections.emptySet(), targetResources, status, unique, components, baseResources);
	}

	private Set<String> extractR4PlusResources(String thePath, IBaseResource theNextSp) {
		return myTerser
			.getValues(theNextSp, thePath, IPrimitiveType.class)
			.stream()
			.map(IPrimitiveType::getValueAsString)
			.collect(Collectors.toSet());
	}
	/**
@@ -427,5 +447,62 @@ public class SearchParameterCanonicalizer {
		}
	}
private List<String> extractDstu2CustomResourcesFromExtensions(ca.uhn.fhir.model.dstu2.resource.SearchParameter theSearchParameter, String theExtensionUrl) {
List<ExtensionDt> customSpExtensionDt = theSearchParameter.getUndeclaredExtensionsByUrl(theExtensionUrl);
return customSpExtensionDt.stream()
.map(theExtensionDt -> theExtensionDt.getValueAsPrimitive().getValueAsString())
.filter(StringUtils::isNotBlank)
.collect(Collectors.toList());
}
private List<String> extractDstu3CustomResourcesFromExtensions(org.hl7.fhir.dstu3.model.SearchParameter theSearchParameter, String theExtensionUrl) {
List<Extension> customSpExtensions = theSearchParameter.getExtensionsByUrl(theExtensionUrl);
return customSpExtensions.stream()
.map(theExtension -> theExtension.getValueAsPrimitive().getValueAsString())
.filter(StringUtils::isNotBlank)
.collect(Collectors.toList());
}
private List<String> extractR4PlusCustomResourcesFromExtensions(IBaseResource theSearchParameter, String theExtensionUrl) {
List<String> retVal = new ArrayList<>();
if (theSearchParameter instanceof IBaseHasExtensions) {
((IBaseHasExtensions) theSearchParameter)
.getExtension()
.stream()
.filter(t -> theExtensionUrl.equals(t.getUrl()))
.filter(t -> t.getValue() instanceof IPrimitiveType)
.map(t -> ((IPrimitiveType<?>) t.getValue()))
.map(IPrimitiveType::getValueAsString)
.filter(StringUtils::isNotBlank)
.forEach(retVal::add);
}
return retVal;
}
private <T extends Collection<String>> void maybeAddCustomResourcesToResources(T theResources, List<String> theCustomResources) {
// SearchParameter base and target components require strict binding to ResourceType for dstu[2|3], R4, R4B
// and to Version Independent Resource Types for R5.
//
// To handle custom resources, we set a placeholder of type 'Resource' in the base or target component and define
// the custom resource by adding a corresponding extension with url HapiExtensions.EXTENSION_SEARCHPARAM_CUSTOM_BASE_RESOURCE
// or HapiExtensions.EXTENSION_SEARCHPARAM_CUSTOM_TARGET_RESOURCE with the name of the custom resource.
//
// To provide a base/target list that contains both the resources and customResources, we need to remove the placeholders
// from the theResources and add theCustomResources.
if (!theCustomResources.isEmpty()){
theResources.removeAll(Collections.singleton("Resource"));
theResources.addAll(theCustomResources);
}
}
}
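The comment inside `maybeAddCustomResourcesToResources` describes the placeholder swap: a `Resource` placeholder in the base/target component stands in for a custom resource named in an extension. A hypothetical standalone copy of that helper makes the effect easy to see in isolation (class and method names are assumptions for illustration):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

class CustomResourceMerger {
    // Standalone copy of the placeholder-swap rule described above: when custom
    // resources were declared via extension, drop the "Resource" placeholder
    // from the base/target list and add the custom resource names instead.
    static void maybeAddCustomResources(List<String> theResources, List<String> theCustomResources) {
        if (!theCustomResources.isEmpty()) {
            theResources.removeAll(Collections.singleton("Resource"));
            theResources.addAll(theCustomResources);
        }
    }
}
```

Note that when no custom resources were declared, the `Resource` placeholder is left in place, which is exactly why the tests below assert `not(contains("Resource"))` only for the custom-resource case.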


@@ -2,11 +2,19 @@ package ca.uhn.fhir.jpa.searchparam.registry;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.RuntimeSearchParam;
import ca.uhn.fhir.model.api.ExtensionDt;
import ca.uhn.fhir.model.dstu2.valueset.ConformanceResourceStatusEnum;
import ca.uhn.fhir.model.dstu2.valueset.ResourceTypeEnum;
import ca.uhn.fhir.model.dstu2.valueset.SearchParamTypeEnum;
import ca.uhn.fhir.model.primitive.StringDt;
import ca.uhn.fhir.rest.api.RestSearchParameterTypeEnum;
import ca.uhn.hapi.converters.canonical.VersionCanonicalizer;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.r4.model.BaseResource;
import org.hl7.fhir.r4.model.Enumerations;
import org.hl7.fhir.r4.model.SearchParameter;
import org.hl7.fhir.r4.model.StringType;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.extension.ExtendWith;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.ValueSource;
@@ -14,15 +22,104 @@ import org.mockito.junit.jupiter.MockitoExtension;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import static ca.uhn.fhir.util.HapiExtensions.EXTENSION_SEARCHPARAM_CUSTOM_BASE_RESOURCE;
import static ca.uhn.fhir.util.HapiExtensions.EXTENSION_SEARCHPARAM_CUSTOM_TARGET_RESOURCE;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.contains;
import static org.hamcrest.Matchers.containsInAnyOrder;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.not;
import static org.junit.jupiter.api.Assertions.assertEquals;

@ExtendWith(MockitoExtension.class)
public class SearchParameterCanonicalizerTest {
	private static final Logger ourLog = LoggerFactory.getLogger(SearchParameterCanonicalizerTest.class);
ca.uhn.fhir.model.dstu2.resource.SearchParameter initSearchParamDstu2(){
ca.uhn.fhir.model.dstu2.resource.SearchParameter sp = new ca.uhn.fhir.model.dstu2.resource.SearchParameter();
sp.setId("SearchParameter/meal-chef");
sp.setUrl("http://example.org/SearchParameter/meal-chef");
sp.setBase(ResourceTypeEnum.RESOURCE);
sp.setCode("chef");
sp.setType(SearchParamTypeEnum.REFERENCE);
sp.setStatus(ConformanceResourceStatusEnum.ACTIVE);
sp.setXpath("Meal.chef | Observation.subject");
sp.addTarget(ResourceTypeEnum.RESOURCE);
sp.addTarget(ResourceTypeEnum.OBSERVATION);
sp.addUndeclaredExtension(new ExtensionDt(false, EXTENSION_SEARCHPARAM_CUSTOM_BASE_RESOURCE, new StringDt("Meal")));
sp.addUndeclaredExtension(new ExtensionDt(false, EXTENSION_SEARCHPARAM_CUSTOM_TARGET_RESOURCE, new StringDt("Chef")));
return sp;
}
org.hl7.fhir.dstu3.model.SearchParameter initSearchParamDstu3(){
org.hl7.fhir.dstu3.model.SearchParameter sp = new org.hl7.fhir.dstu3.model.SearchParameter();
sp.setId("SearchParameter/meal-chef");
sp.setUrl("http://example.org/SearchParameter/meal-chef");
sp.addBase("Resource");
sp.addBase("Patient");
sp.setCode("chef");
sp.setType(org.hl7.fhir.dstu3.model.Enumerations.SearchParamType.REFERENCE);
sp.setStatus(org.hl7.fhir.dstu3.model.Enumerations.PublicationStatus.ACTIVE);
sp.setExpression("Meal.chef | Observation.subject");
sp.addTarget("Resource");
sp.addTarget("Observation");
sp.addExtension(EXTENSION_SEARCHPARAM_CUSTOM_BASE_RESOURCE, new org.hl7.fhir.dstu3.model.StringType("Meal"));
sp.addExtension(EXTENSION_SEARCHPARAM_CUSTOM_TARGET_RESOURCE, new org.hl7.fhir.dstu3.model.StringType("Chef"));
return sp;
}
IBaseResource initSearchParamR4(){
SearchParameter sp = new SearchParameter();
sp.setId("SearchParameter/meal-chef");
sp.setUrl("http://example.org/SearchParameter/meal-chef");
sp.addBase("Resource");
sp.addBase("Patient");
sp.setCode("chef");
sp.setType(Enumerations.SearchParamType.REFERENCE);
sp.setStatus(Enumerations.PublicationStatus.ACTIVE);
sp.setExpression("Meal.chef | Observation.subject");
sp.addTarget("Resource");
sp.addTarget("Observation");
sp.addExtension(EXTENSION_SEARCHPARAM_CUSTOM_BASE_RESOURCE, new StringType("Meal"));
sp.addExtension(EXTENSION_SEARCHPARAM_CUSTOM_TARGET_RESOURCE, new StringType("Chef"));
return sp;
}
IBaseResource initSearchParamR4B(){
org.hl7.fhir.r4b.model.SearchParameter sp = new org.hl7.fhir.r4b.model.SearchParameter();
sp.setId("SearchParameter/meal-chef");
sp.setUrl("http://example.org/SearchParameter/meal-chef");
sp.addBase("Resource");
sp.addBase("Patient");
sp.setCode("chef");
sp.setType(org.hl7.fhir.r4b.model.Enumerations.SearchParamType.REFERENCE);
sp.setStatus(org.hl7.fhir.r4b.model.Enumerations.PublicationStatus.ACTIVE);
sp.setExpression("Meal.chef | Observation.subject");
sp.addTarget("Resource");
sp.addTarget("Observation");
sp.addExtension(EXTENSION_SEARCHPARAM_CUSTOM_BASE_RESOURCE, new org.hl7.fhir.r4b.model.StringType("Meal"));
sp.addExtension(EXTENSION_SEARCHPARAM_CUSTOM_TARGET_RESOURCE, new org.hl7.fhir.r4b.model.StringType("Chef"));
return sp;
}
IBaseResource initSearchParamR5(){
org.hl7.fhir.r5.model.SearchParameter sp = new org.hl7.fhir.r5.model.SearchParameter();
sp.setId("SearchParameter/meal-chef");
sp.setUrl("http://example.org/SearchParameter/meal-chef");
sp.addBase(org.hl7.fhir.r5.model.Enumerations.VersionIndependentResourceTypesAll.RESOURCE);
sp.addBase(org.hl7.fhir.r5.model.Enumerations.VersionIndependentResourceTypesAll.PATIENT);
sp.setCode("chef");
sp.setType(org.hl7.fhir.r5.model.Enumerations.SearchParamType.REFERENCE);
sp.setStatus(org.hl7.fhir.r5.model.Enumerations.PublicationStatus.ACTIVE);
sp.setExpression("Meal.chef | Observation.subject");
sp.addTarget(org.hl7.fhir.r5.model.Enumerations.VersionIndependentResourceTypesAll.RESOURCE);
sp.addTarget(org.hl7.fhir.r5.model.Enumerations.VersionIndependentResourceTypesAll.OBSERVATION);
sp.addExtension(EXTENSION_SEARCHPARAM_CUSTOM_BASE_RESOURCE, new org.hl7.fhir.r5.model.StringType("Meal"));
sp.addExtension(EXTENSION_SEARCHPARAM_CUSTOM_TARGET_RESOURCE, new org.hl7.fhir.r5.model.StringType("Chef"));
return sp;
}
	@ParameterizedTest
	@ValueSource(booleans = {false, true})
	public void testCanonicalizeSearchParameterWithCustomType(boolean theConvertToR5) {
@@ -37,7 +134,6 @@ public class SearchParameterCanonicalizerTest {
		sp.setExpression("Meal.chef | Observation.subject");
		sp.addTarget("Chef");
		sp.addTarget("Observation");
		IBaseResource searchParamToCanonicalize = sp;
		SearchParameterCanonicalizer svc;
		if (theConvertToR5) {
@@ -57,7 +153,51 @@ public class SearchParameterCanonicalizerTest {
		assertThat(output.getPathsSplit(), containsInAnyOrder("Meal.chef", "Observation.subject"));
		assertThat(output.getBase(), containsInAnyOrder("Meal", "Patient"));
		assertThat(output.getTargets(), contains("Chef", "Observation"));
}
@ParameterizedTest
@ValueSource(strings = {"Dstu2", "Dstu3", "R4", "R4B", "R5"})
public void testCanonicalizeSearchParameterWithCustomTypeAllVersion(String version) {
SearchParameterCanonicalizer svc;
IBaseResource searchParamToCanonicalize;
switch (version){
case "Dstu2":
searchParamToCanonicalize = initSearchParamDstu2();
svc = new SearchParameterCanonicalizer(FhirContext.forDstu2Cached());
break;
case "Dstu3":
searchParamToCanonicalize = initSearchParamDstu3();
svc = new SearchParameterCanonicalizer(FhirContext.forDstu3Cached());
break;
case "R4":
searchParamToCanonicalize = initSearchParamR4();
svc = new SearchParameterCanonicalizer(FhirContext.forR4Cached());
break;
case "R4B":
searchParamToCanonicalize = initSearchParamR4B();
svc = new SearchParameterCanonicalizer(FhirContext.forR4BCached());
break;
default:
searchParamToCanonicalize = initSearchParamR5();
svc = new SearchParameterCanonicalizer(FhirContext.forR5Cached());
break;
}
RuntimeSearchParam output = svc.canonicalizeSearchParameter(searchParamToCanonicalize);
assertEquals("chef", output.getName());
assertEquals(RestSearchParameterTypeEnum.REFERENCE, output.getParamType());
assertEquals(RuntimeSearchParam.RuntimeSearchParamStatusEnum.ACTIVE, output.getStatus());
assertThat(output.getPathsSplit(), containsInAnyOrder("Meal.chef", "Observation.subject"));
// DSTU2 Resources must only have 1 base
if ("Dstu2".equals(version)){
assertThat(output.getBase(), containsInAnyOrder("Meal"));
} else {
assertThat(output.getBase(), containsInAnyOrder("Meal", "Patient"));
}
assertThat(output.getTargets(), containsInAnyOrder("Chef", "Observation"));
assertThat(output.getBase(), not(contains("Resource")));
assertThat(output.getTargets(), not(contains("Resource")));
	}
}


@@ -24,17 +24,21 @@ import ca.uhn.fhir.jpa.subscription.match.deliver.email.SubscriptionDeliveringEm
import ca.uhn.fhir.jpa.subscription.match.deliver.message.SubscriptionDeliveringMessageSubscriber;
import ca.uhn.fhir.jpa.subscription.match.deliver.resthook.SubscriptionDeliveringRestHookSubscriber;
import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscriptionChannelType;
import org.springframework.context.ApplicationContext;
import org.springframework.messaging.MessageHandler;

import java.util.Optional;

public class SubscriptionDeliveryHandlerFactory {
	protected ApplicationContext myApplicationContext;

	private IEmailSender myEmailSender;

	public SubscriptionDeliveryHandlerFactory(ApplicationContext theApplicationContext, IEmailSender theEmailSender) {
		myApplicationContext = theApplicationContext;
		myEmailSender = theEmailSender;
	}

	protected SubscriptionDeliveringEmailSubscriber newSubscriptionDeliveringEmailSubscriber(IEmailSender theEmailSender) {
		return myApplicationContext.getBean(SubscriptionDeliveringEmailSubscriber.class, theEmailSender);
@@ -60,7 +64,4 @@ public class SubscriptionDeliveryHandlerFactory {
		}
	}
}
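The change above (from the "Favoring constructor initialization to autowiring" commit) replaces field `@Autowired` injection with constructor injection. A minimal hypothetical sketch of the pattern, runnable without Spring (interface and class names are illustrative, not the real HAPI types):

```java
// Illustrative constructor-injection pattern: dependencies arrive through the
// constructor, so the object can never be observed in a partially-wired state
// and is trivial to construct in tests with a stub or lambda.
interface EmailSender {
    void send(String theAddress, String theBody);
}

class DeliveryHandlerFactory {
    private final EmailSender myEmailSender;
    private int mySendCount;

    DeliveryHandlerFactory(EmailSender theEmailSender) {
        myEmailSender = theEmailSender;
    }

    void deliver(String theAddress, String theBody) {
        myEmailSender.send(theAddress, theBody);
        mySendCount++;
    }

    int getSendCount() {
        return mySendCount;
    }
}
```

Constructor injection also lets the dependency field be `final`, which rules out the NullPointerException the commit message describes when a setter is never called.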


@@ -41,6 +41,7 @@ import ca.uhn.fhir.jpa.subscription.match.matcher.subscriber.SubscriptionRegiste
import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionLoader;
import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionRegistry;
import ca.uhn.fhir.jpa.subscription.model.config.SubscriptionModelConfig;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Import;
import org.springframework.context.annotation.Primary;
@@ -94,8 +95,8 @@ public class SubscriptionProcessorConfig {
	}

	@Bean
	public SubscriptionDeliveryHandlerFactory subscriptionDeliveryHandlerFactory(ApplicationContext theApplicationContext, IEmailSender theEmailSender) {
		return new SubscriptionDeliveryHandlerFactory(theApplicationContext, theEmailSender);
	}

	@Bean


@@ -25,6 +25,7 @@ import ca.uhn.fhir.jpa.subscription.match.deliver.BaseSubscriptionDeliverySubscr
import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscription;
import ca.uhn.fhir.jpa.subscription.model.ResourceDeliveryMessage;
import ca.uhn.fhir.rest.api.EncodingEnum;
import com.google.common.annotations.VisibleForTesting;
import org.apache.commons.lang3.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -103,4 +104,9 @@ public class SubscriptionDeliveringEmailSubscriber extends BaseSubscriptionDeliv
	public void setEmailSender(IEmailSender theEmailSender) {
		myEmailSender = theEmailSender;
	}

	@VisibleForTesting
	public IEmailSender getEmailSender() {
		return myEmailSender;
	}
}


@@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR Subscription Server
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.jpa.subscription.match.matcher.subscriber;

import ca.uhn.fhir.interceptor.model.RequestPartitionId;


@@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR Subscription Server
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.jpa.topic;

import ca.uhn.fhir.interceptor.model.RequestPartitionId;


@@ -12,6 +12,7 @@ import ca.uhn.fhir.jpa.searchparam.config.SearchParamConfig;
import ca.uhn.fhir.jpa.searchparam.registry.ISearchParamProvider;
import ca.uhn.fhir.jpa.subscription.channel.subscription.SubscriptionChannelFactory;
import ca.uhn.fhir.jpa.subscription.match.config.SubscriptionProcessorConfig;
import ca.uhn.fhir.jpa.subscription.match.deliver.email.IEmailSender;
import ca.uhn.fhir.jpa.subscription.submit.config.SubscriptionSubmitterConfig;
import ca.uhn.fhir.jpa.subscription.submit.interceptor.SubscriptionQueryValidator;
import org.junit.jupiter.api.Test;
@@ -85,6 +86,10 @@ public class DaoSubscriptionMatcherTest {
		return mock(IRequestPartitionHelperSvc.class);
	}

	@Bean
	public IEmailSender emailSender() {
		return mock(IEmailSender.class);
	}
	}
}


@@ -11,6 +11,7 @@ import ca.uhn.fhir.jpa.subscription.channel.impl.LinkedBlockingChannelFactory;
 import ca.uhn.fhir.jpa.subscription.channel.subscription.IChannelNamer;
 import ca.uhn.fhir.jpa.subscription.channel.subscription.SubscriptionChannelFactory;
 import ca.uhn.fhir.jpa.subscription.match.config.SubscriptionProcessorConfig;
+import ca.uhn.fhir.jpa.subscription.match.deliver.email.IEmailSender;
 import ca.uhn.fhir.jpa.subscription.module.config.MockFhirClientSearchParamProvider;
 import ca.uhn.fhir.jpa.subscription.util.SubscriptionDebugLogInterceptor;
 import ca.uhn.fhir.model.primitive.IdDt;
@@ -101,5 +102,10 @@ public abstract class BaseSubscriptionTest {
         public IChannelNamer channelNamer() {
             return (theNameComponent, theChannelSettings) -> theNameComponent;
         }
+
+        @Bean
+        public IEmailSender emailSender() {
+            return mock(IEmailSender.class);
+        }
     }
 }


@@ -12,6 +12,7 @@ import ca.uhn.fhir.jpa.searchparam.matcher.SearchParamMatcher;
 import ca.uhn.fhir.jpa.subscription.channel.config.SubscriptionChannelConfig;
 import ca.uhn.fhir.jpa.subscription.channel.subscription.SubscriptionChannelFactory;
 import ca.uhn.fhir.jpa.subscription.match.config.SubscriptionProcessorConfig;
+import ca.uhn.fhir.jpa.subscription.match.deliver.email.IEmailSender;
 import ca.uhn.fhir.jpa.subscription.match.deliver.websocket.WebsocketConnectionValidator;
 import ca.uhn.fhir.jpa.subscription.match.deliver.websocket.WebsocketValidationResponse;
 import ca.uhn.fhir.jpa.subscription.match.registry.ActiveSubscription;
@@ -140,6 +141,10 @@ public class WebsocketConnectionValidatorTest {
         public IResourceChangeListenerRegistry resourceChangeListenerRegistry() {
             return mock(IResourceChangeListenerRegistry.class, RETURNS_DEEP_STUBS);
         }
+        @Bean
+        public IEmailSender emailSender() {
+            return mock(IEmailSender.class);
+        }
     }
 }


@@ -220,6 +220,15 @@ public class Batch2CoordinatorIT extends BaseJpaR4Test {
         // Since there was only one chunk, the job should proceed without requiring a maintenance pass
         myBatch2JobHelper.awaitJobCompletion(batchJobId);
         myLastStepLatch.awaitExpected();
+
+        final List<JobInstance> jobInstances = myJobPersistence.fetchInstances(10, 0);
+        assertEquals(1, jobInstances.size());
+        final JobInstance jobInstance = jobInstances.get(0);
+        assertEquals(StatusEnum.COMPLETED, jobInstance.getStatus());
+        assertEquals(1.0, jobInstance.getProgress());
     }

     private void createThreeStepReductionJob(
@@ -361,6 +370,15 @@ public class Batch2CoordinatorIT extends BaseJpaR4Test {
                 testInfo + i
             ));
         }
+
+        final List<JobInstance> jobInstances = myJobPersistence.fetchInstances(10, 0);
+        assertEquals(1, jobInstances.size());
+        final JobInstance jobInstance = jobInstances.get(0);
+        assertEquals(StatusEnum.COMPLETED, jobInstance.getStatus());
+        assertEquals(1.0, jobInstance.getProgress());
     }

     @Test


@@ -1,6 +1,7 @@
 package ca.uhn.fhir.jpa.batch2;

 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
+import ca.uhn.fhir.jpa.api.model.Batch2JobInfo;
 import ca.uhn.fhir.jpa.api.model.BulkExportJobResults;
 import ca.uhn.fhir.jpa.api.model.BulkExportParameters;
 import ca.uhn.fhir.jpa.api.svc.IBatch2JobRunner;
@@ -33,6 +34,7 @@ import java.util.Collections;
 import java.util.HashSet;
 import java.util.List;
 import java.util.Map;
+import java.util.Objects;
 import java.util.Set;
 import java.util.concurrent.BlockingQueue;
 import java.util.concurrent.ExecutionException;
@@ -44,6 +46,7 @@ import java.util.concurrent.TimeUnit;
 import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.emptyOrNullString;
+import static org.hamcrest.Matchers.equalTo;
 import static org.hamcrest.Matchers.hasItem;
 import static org.hamcrest.Matchers.not;
 import static org.junit.jupiter.api.Assertions.assertEquals;
@@ -181,7 +184,8 @@ public class BulkDataErrorAbuseTest extends BaseResourceProviderR4Test {
     private void verifyBulkExportResults(String theInstanceId, List<String> theContainedList, List<String> theExcludedList) {
         // Iterate over the files
-        String report = myJobRunner.getJobInfo(theInstanceId).getReport();
+        Batch2JobInfo jobInfo = myJobRunner.getJobInfo(theInstanceId);
+        String report = jobInfo.getReport();
         ourLog.debug("Export job {} report: {}", theInstanceId, report);
         if (!theContainedList.isEmpty()) {
             assertThat("report for instance " + theInstanceId + " is empty", report, not(emptyOrNullString()));
@@ -227,6 +231,10 @@ public class BulkDataErrorAbuseTest extends BaseResourceProviderR4Test {
         for (String excludedString : theExcludedList) {
             assertThat("export doesn't have expected ids", foundIds, not(hasItem(excludedString)));
         }
+
+        assertThat(jobInfo.getCombinedRecordsProcessed(), equalTo(2));
+
+        ourLog.info("Job {} ok", theInstanceId);
     }

     private String startJob(BulkDataExportOptions theOptions) {


@@ -10,7 +10,7 @@ import ca.uhn.fhir.jpa.api.model.DaoMethodOutcome;
 import ca.uhn.fhir.jpa.binary.api.IBinaryStorageSvc;
 import ca.uhn.fhir.jpa.binary.api.StoredDetails;
 import ca.uhn.fhir.jpa.binary.provider.BinaryAccessProvider;
-import ca.uhn.fhir.mdm.util.MessageHelper;
+import ca.uhn.fhir.rest.api.server.RequestDetails;
 import ca.uhn.fhir.rest.server.RestfulServer;
 import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
 import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
@@ -39,6 +39,7 @@ import static org.mockito.ArgumentMatchers.any;
 import static org.mockito.ArgumentMatchers.anyBoolean;
 import static org.mockito.ArgumentMatchers.eq;
 import static org.mockito.ArgumentMatchers.isNull;
+import static org.mockito.Mockito.doReturn;
 import static org.mockito.Mockito.spy;
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
@@ -66,8 +67,6 @@ public class BinaryAccessProviderTest {
     @Spy
     protected IBinaryStorageSvc myBinaryStorageSvc;
     @Autowired
-    private MessageHelper myMessageHelper;
-    @Autowired
     private IInterceptorBroadcaster myInterceptorBroadcaster;
@@ -157,7 +156,7 @@
     }

     @Test
-    public void testBinaryAccessRead_WithoutAttachmentId_NullData() throws IOException {
+    public void testBinaryAccessRead_WithoutAttachmentId_NullData() {
         DocumentReference docRef = new DocumentReference();
         DocumentReference.DocumentReferenceContentComponent content = docRef.addContent();
         content.getAttachment().setContentType("application/octet-stream");
@@ -257,7 +256,7 @@
         when(theServletRequest.getContentLength()).thenReturn(15);
         when(myBinaryStorageSvc.shouldStoreBlob(15, docRef.getIdElement(), "Integer")).thenReturn(true);
         myRequestDetails.setServletRequest(theServletRequest);
-        when(myBinaryStorageSvc.storeBlob(eq(docRef.getIdElement()), isNull(), eq("Integer"), any(InputStream.class))).thenReturn(sd);
+        doReturn(sd).when(myBinaryStorageSvc).storeBlob(eq(docRef.getIdElement()), isNull(), eq("Integer"), any(InputStream.class), any(RequestDetails.class));
         myRequestDetails.setRequestContents(SOME_BYTES);

         try {
@@ -266,7 +265,7 @@
             assertEquals(docRef.getId(), outcome.getIdElement().getValue());
         } catch (IOException e) {
         }
-        verify(myBinaryStorageSvc, times(1)).storeBlob(any(), any(), any(), any());
+        verify(myBinaryStorageSvc, times(1)).storeBlob(any(), any(), any(), any(), any(ServletRequestDetails.class));
     }

     @Test


@@ -5,6 +5,7 @@ import ca.uhn.fhir.jpa.binary.api.StoredDetails;
 import ca.uhn.fhir.jpa.model.entity.BinaryStorageEntity;
 import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
 import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
+import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
 import org.hl7.fhir.r4.model.IdType;
 import org.junit.jupiter.api.Test;
 import org.springframework.beans.factory.annotation.Autowired;
@@ -52,7 +53,7 @@ public class DatabaseBlobBinaryStorageSvcImplTest extends BaseJpaR4Test {
         ByteArrayInputStream inputStream = new ByteArrayInputStream(SOME_BYTES);
         String contentType = "image/png";
         IdType resourceId = new IdType("Binary/123");
-        StoredDetails outcome = mySvc.storeBlob(resourceId, null, contentType, inputStream);
+        StoredDetails outcome = mySvc.storeBlob(resourceId, null, contentType, inputStream, new ServletRequestDetails());

         myCaptureQueriesListener.logAllQueriesForCurrentThread();
@@ -105,7 +106,7 @@
         ByteArrayInputStream inputStream = new ByteArrayInputStream(SOME_BYTES);
         String contentType = "image/png";
         IdType resourceId = new IdType("Binary/123");
-        StoredDetails outcome = mySvc.storeBlob(resourceId, "ABCDEFG", contentType, inputStream);
+        StoredDetails outcome = mySvc.storeBlob(resourceId, "ABCDEFG", contentType, inputStream, new ServletRequestDetails());
         assertEquals("ABCDEFG", outcome.getBlobId());

         myCaptureQueriesListener.logAllQueriesForCurrentThread();
@@ -163,7 +164,7 @@
         ByteArrayInputStream inputStream = new ByteArrayInputStream(SOME_BYTES);
         String contentType = "image/png";
         IdType resourceId = new IdType("Binary/123");
-        StoredDetails outcome = mySvc.storeBlob(resourceId, null, contentType, inputStream);
+        StoredDetails outcome = mySvc.storeBlob(resourceId, null, contentType, inputStream, new ServletRequestDetails());
         String blobId = outcome.getBlobId();

         // Expunge
@@ -185,7 +186,7 @@
         ByteArrayInputStream inputStream = new ByteArrayInputStream(SOME_BYTES);
         String contentType = "image/png";
         IdType resourceId = new IdType("Binary/123");
-        StoredDetails outcome = mySvc.storeBlob(resourceId, null, contentType, inputStream);
+        StoredDetails outcome = mySvc.storeBlob(resourceId, null, contentType, inputStream, new ServletRequestDetails());

         // Right ID
         ByteArrayOutputStream capture = new ByteArrayOutputStream();


@@ -1,9 +1,12 @@
 package ca.uhn.fhir.jpa.binstore;

+import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.i18n.Msg;
+import ca.uhn.fhir.interceptor.executor.InterceptorService;
 import ca.uhn.fhir.jpa.binary.api.StoredDetails;
 import ca.uhn.fhir.rest.server.exceptions.PayloadTooLargeException;
 import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
+import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
 import org.apache.commons.io.FileUtils;
 import org.hl7.fhir.instance.model.api.IIdType;
 import org.hl7.fhir.r4.model.IdType;
@@ -34,6 +37,8 @@ public class FilesystemBinaryStorageSvcImplTest {
     public void before() {
         myPath = new File("./target/fstmp");
         mySvc = new FilesystemBinaryStorageSvcImpl(myPath.getAbsolutePath());
+        mySvc.setFhirContextForTests(FhirContext.forR4Cached());
+        mySvc.setInterceptorBroadcasterForTests(new InterceptorService());
     }

     @AfterEach
@@ -45,7 +50,7 @@ public class FilesystemBinaryStorageSvcImplTest {
     public void testStoreAndRetrieve() throws IOException {
         IIdType id = new IdType("Patient/123");
         String contentType = "image/png";
-        StoredDetails outcome = mySvc.storeBlob(id, null, contentType, new ByteArrayInputStream(SOME_BYTES));
+        StoredDetails outcome = mySvc.storeBlob(id, null, contentType, new ByteArrayInputStream(SOME_BYTES), new ServletRequestDetails());

         ourLog.info("Got id: {}", outcome);
@@ -68,7 +73,7 @@
         IIdType id = new IdType("Patient/123");
         String contentType = "image/png";
         String blobId = "ABCDEFGHIJKLMNOPQRSTUV";
-        StoredDetails outcome = mySvc.storeBlob(id, blobId, contentType, new ByteArrayInputStream(SOME_BYTES));
+        StoredDetails outcome = mySvc.storeBlob(id, blobId, contentType, new ByteArrayInputStream(SOME_BYTES), new ServletRequestDetails());
         assertEquals(blobId, outcome.getBlobId());

         ourLog.info("Got id: {}", outcome);
@@ -103,7 +108,7 @@
     public void testExpunge() throws IOException {
         IIdType id = new IdType("Patient/123");
         String contentType = "image/png";
-        StoredDetails outcome = mySvc.storeBlob(id, null, contentType, new ByteArrayInputStream(SOME_BYTES));
+        StoredDetails outcome = mySvc.storeBlob(id, null, contentType, new ByteArrayInputStream(SOME_BYTES), new ServletRequestDetails());

         ourLog.info("Got id: {}", outcome);
@@ -129,7 +134,7 @@
         IIdType id = new IdType("Patient/123");
         String contentType = "image/png";
         try {
-            mySvc.storeBlob(id, null, contentType, new ByteArrayInputStream(SOME_BYTES));
+            mySvc.storeBlob(id, null, contentType, new ByteArrayInputStream(SOME_BYTES), new ServletRequestDetails());
             fail();
         } catch (PayloadTooLargeException e) {
             assertEquals(Msg.code(1343) + "Binary size exceeds maximum: 5", e.getMessage());


@@ -9,7 +9,7 @@ import static org.junit.jupiter.api.Assertions.assertThrows;
 public class NullBinaryStorageSvcImplTest {

-    private NullBinaryStorageSvcImpl mySvc = new NullBinaryStorageSvcImpl();
+    private final NullBinaryStorageSvcImpl mySvc = new NullBinaryStorageSvcImpl();

     @Test
     public void shouldStoreBlob() {
@@ -18,43 +18,31 @@

     @Test
     public void storeBlob() {
-        assertThrows(UnsupportedOperationException.class, () -> {
-            mySvc.storeBlob(null, null, null, null);
-        });
+        assertThrows(UnsupportedOperationException.class, () -> mySvc.storeBlob(null, null, null, null, null));
     }

     @Test
     public void fetchBlobDetails() {
-        assertThrows(UnsupportedOperationException.class, () -> {
-            mySvc.fetchBlobDetails(null, null);
-        });
+        assertThrows(UnsupportedOperationException.class, () -> mySvc.fetchBlobDetails(null, null));
     }

     @Test
     public void writeBlob() {
-        assertThrows(UnsupportedOperationException.class, () -> {
-            mySvc.writeBlob(null, null, null);
-        });
+        assertThrows(UnsupportedOperationException.class, () -> mySvc.writeBlob(null, null, null));
     }

     @Test
     public void expungeBlob() {
-        assertThrows(UnsupportedOperationException.class, () -> {
-            mySvc.expungeBlob(null, null);
-        });
+        assertThrows(UnsupportedOperationException.class, () -> mySvc.expungeBlob(null, null));
     }

     @Test
     public void fetchBlob() {
-        assertThrows(UnsupportedOperationException.class, () -> {
-            mySvc.fetchBlob(null, null);
-        });
+        assertThrows(UnsupportedOperationException.class, () -> mySvc.fetchBlob(null, null));
     }

     @Test
     public void newBlobId() {
-        assertThrows(UnsupportedOperationException.class, () -> {
-            mySvc.newBlobId();
-        });
+        assertThrows(UnsupportedOperationException.class, () -> mySvc.newBlobId());
     }
 }


@@ -5,6 +5,7 @@ import ca.uhn.fhir.interceptor.model.ReadPartitionIdRequestDetails;
 import ca.uhn.fhir.interceptor.model.RequestPartitionId;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
 import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
+import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
 import ca.uhn.fhir.jpa.api.model.Batch2JobInfo;
 import ca.uhn.fhir.jpa.api.model.Batch2JobOperationResult;
 import ca.uhn.fhir.jpa.api.model.BulkExportJobResults;
@@ -98,6 +99,8 @@ public class BulkDataExportProviderTest {
     private final HttpClientExtension myClient = new HttpClientExtension();
     @Mock
     private IBatch2JobRunner myJobRunner;
+    @Mock
+    IFhirResourceDao myFhirResourceDao;
     @InjectMocks
     private BulkDataExportProvider myProvider;
     @RegisterExtension
@@ -140,6 +143,8 @@
         myProvider.setStorageSettings(myStorageSettings);
         DaoRegistry daoRegistry = mock(DaoRegistry.class);
         lenient().when(daoRegistry.getRegisteredDaoTypes()).thenReturn(Set.of("Patient", "Observation", "Encounter"));
+        lenient().when(daoRegistry.getResourceDao(anyString())).thenReturn(myFhirResourceDao);
         myProvider.setDaoRegistry(daoRegistry);
     }


@@ -19,12 +19,13 @@ import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
 import ca.uhn.fhir.jpa.util.BulkExportUtils;
 import ca.uhn.fhir.parser.IParser;
 import ca.uhn.fhir.rest.api.Constants;
+import ca.uhn.fhir.rest.api.MethodOutcome;
 import ca.uhn.fhir.rest.api.server.RequestDetails;
 import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
 import ca.uhn.fhir.rest.api.server.bulk.BulkDataExportOptions;
 import ca.uhn.fhir.util.BundleBuilder;
 import ca.uhn.fhir.util.JsonUtil;
-import ca.uhn.fhir.util.SearchParameterUtil;
+import ca.uhn.fhir.util.UrlUtil;
 import com.google.common.collect.Sets;
 import org.apache.commons.io.Charsets;
 import org.apache.commons.io.IOUtils;
@@ -43,8 +44,12 @@ import org.hl7.fhir.r4.model.Extension;
 import org.hl7.fhir.r4.model.Group;
 import org.hl7.fhir.r4.model.IdType;
 import org.hl7.fhir.r4.model.InstantType;
+import org.hl7.fhir.r4.model.Meta;
 import org.hl7.fhir.r4.model.Observation;
+import org.hl7.fhir.r4.model.Organization;
+import org.hl7.fhir.r4.model.Parameters;
 import org.hl7.fhir.r4.model.Patient;
+import org.hl7.fhir.r4.model.Practitioner;
 import org.hl7.fhir.r4.model.Reference;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
@@ -81,6 +86,7 @@ import static org.hamcrest.Matchers.equalTo;
 import static org.hamcrest.Matchers.hasSize;
 import static org.hamcrest.Matchers.not;
 import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertFalse;
 import static org.junit.jupiter.api.Assertions.assertNotNull;
 import static org.junit.jupiter.api.Assertions.assertTrue;
 import static org.junit.jupiter.api.Assertions.fail;
@@ -507,6 +513,9 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
         myStorageSettings.setBulkExportFileMaximumCapacity(JpaStorageSettings.DEFAULT_BULK_EXPORT_FILE_MAXIMUM_CAPACITY);
     }

+    // TODO reenable 4637
+    // Reenable when bulk exports that return no results work as expected
+    @Disabled
     @Test
     public void testPatientExportIgnoresResourcesNotInPatientCompartment() {
         Patient patient = new Patient();
@@ -522,6 +531,7 @@
         obs2.setId("obs-excluded");
         myObservationDao.update(obs2);

+        // test
         HashSet<String> types = Sets.newHashSet("Patient", "Observation");
         BulkExportJobResults bulkExportJobResults = startPatientBulkExportJobAndAwaitResults(types, new HashSet<String>(), "ha");
         Map<String, List<IBaseResource>> typeToResources = convertJobResultsToResources(bulkExportJobResults);
@ -887,30 +897,163 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
assertThat(typeToContents.get("Observation"), containsString("obs-included")); assertThat(typeToContents.get("Observation"), containsString("obs-included"));
assertThat(typeToContents.get("Observation"), not(containsString("obs-excluded"))); assertThat(typeToContents.get("Observation"), not(containsString("obs-excluded")));
}
@Test
public void testGroupBulkExportWithTypeFilter_ReturnsOnlyResourcesInTypeFilter() {
// setup
IParser parser = myFhirContext.newJsonParser();
{
String patientStr = """
{
"resourceType": "Patient",
"id": "f201"
}
""";
Patient patient = parser.parseResource(Patient.class, patientStr);
myClient.update().resource(patient).execute();
}
{
String practitionerStr = """
{
"resourceType": "Practitioner",
"id": "f201"
}
""";
Practitioner practitioner = parser.parseResource(Practitioner.class, practitionerStr);
myClient.update().resource(practitioner).execute();
}
{
String orgString = """
{
"resourceType": "Organization",
"id": "f201"
}
""";
Organization organization = parser.parseResource(Organization.class, orgString);
myClient.update().resource(organization).execute();
}
{
String bundleStr = """
{
"resourceType": "Bundle",
"id": "bundle-transaction",
"meta": {
"lastUpdated": "2021-04-19T20:24:48.194+00:00"
},
"type": "transaction",
"entry": [
{
"fullUrl": "http://example.org/fhir/Encounter/E1",
"resource": {
"resourceType": "Encounter",
"id": "E1",
"subject": {
"reference": "Patient/f201",
"display": "Roel"
},
"participant": [
{
"individual": {
"reference": "Practitioner/f201"
}
}
],
"serviceProvider": {
"reference": "Organization/f201"
}
},
"request": {
"method": "PUT",
"url": "Encounter/E1"
}
},
{
"fullUrl": "http://example.org/fhir/Encounter/E2",
"resource": {
"resourceType": "Encounter",
"id": "E2",
"subject": {
"reference": "Patient/f201",
"display": "Roel"
},
"participant": [
{
"individual": {
"reference": "Practitioner/f201"
}
}
],
"serviceProvider": {
"reference": "Organization/f201"
}
},
"request": {
"method": "PUT",
"url": "Encounter/A2"
}
},
{
"fullUrl": "http://example.org/fhir/Group/G3",
"resource": {
"resourceType": "Group",
"id": "G3",
"text": {
"status": "additional"
},
"type": "person",
"actual": true,
"member": [
{
"entity": {
"reference": "Patient/f201"
},
"period": {
"start": "2021-01-01"
}
},
{
"entity": {
"reference": "Patient/f201"
},
"period": {
"start": "2021-01-01"
}
}
]
},
"request": {
"method": "PUT",
"url": "Group/G3"
}
}
]
}
""";
Bundle bundle = parser.parseResource(Bundle.class, bundleStr);
myClient.transaction().withBundle(bundle).execute();
}
// test
HashSet<String> resourceTypes = Sets.newHashSet("Encounter");
BulkExportJobResults results = startGroupBulkExportJobAndAwaitCompletion(
resourceTypes,
new HashSet<>(),
"G3" // ID from Transaction Bundle
);
Map<String, List<IBaseResource>> stringListMap = convertJobResultsToResources(results);
assertFalse(stringListMap.containsKey("Organization"), String.join(",", stringListMap.keySet()));
assertFalse(stringListMap.containsKey("Patient"), String.join(",", stringListMap.keySet()));
assertTrue(stringListMap.containsKey("Encounter"), String.join(",", stringListMap.keySet()));
assertThat(stringListMap.get("Encounter"), hasSize(2));
} }
@Test @Test
public void testGroupBulkExportWithTypeFilter() { public void testGroupBulkExportWithTypeFilter() {
// Create some resources // Create some resources
Patient patient = new Patient(); Group g = createGroupWithPatients();
patient.setId("PF"); String groupId = g.getIdPart();
patient.setGender(Enumerations.AdministrativeGender.FEMALE);
patient.setActive(true);
myClient.update().resource(patient).execute();
patient = new Patient();
patient.setId("PM");
patient.setGender(Enumerations.AdministrativeGender.MALE);
patient.setActive(true);
myClient.update().resource(patient).execute();
Group group = new Group();
group.setId("Group/G");
group.setActive(true);
group.addMember().getEntity().setReference("Patient/PF");
group.addMember().getEntity().setReference("Patient/PM");
myClient.update().resource(group).execute();
//Create an observation for each patient //Create an observation for each patient
Observation femaleObs = new Observation(); Observation femaleObs = new Observation();
@@ -923,9 +1066,11 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
maleObs.setId("obs-male");
myClient.update().resource(maleObs).execute();
+ // test
HashSet<String> resourceTypes = Sets.newHashSet("Observation", "Patient");
HashSet<String> filters = Sets.newHashSet("Patient?gender=female");
- BulkExportJobResults results = startGroupBulkExportJobAndAwaitCompletion(resourceTypes, filters, "G");
+ BulkExportJobResults results = startGroupBulkExportJobAndAwaitCompletion(resourceTypes, filters, groupId);
Map<String, List<IBaseResource>> stringListMap = convertJobResultsToResources(results);
assertThat(stringListMap.get("Observation"), hasSize(1));
assertThat(stringListMap.get("Patient"), hasSize(1));
@@ -978,10 +1123,8 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
coverage.setId("coverage-female");
myClient.update().resource(coverage).execute();
- HashSet<String> resourceTypes = Sets.newHashSet(SearchParameterUtil.getAllResourceTypesThatAreInPatientCompartment(myFhirContext));
HashSet<String> filters = Sets.newHashSet();
- BulkExportJobResults results = startGroupBulkExportJobAndAwaitCompletion(resourceTypes, filters, "G");
+ BulkExportJobResults results = startGroupBulkExportJobAndAwaitCompletion(new HashSet<>(), filters, "G");
Map<String, List<IBaseResource>> typeToResource = convertJobResultsToResources(results);
assertThat(typeToResource.keySet(), hasSize(4));
assertThat(typeToResource.get("Group"), hasSize(1));
@@ -1053,7 +1196,6 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
coverage.setId("coverage-included");
myClient.update().resource(coverage).execute();
HashSet<String> resourceTypes = Sets.newHashSet("Observation", "Coverage");
BulkExportJobResults bulkExportJobResults = startGroupBulkExportJobAndAwaitCompletion(resourceTypes, new HashSet<>(), "G2");
@@ -1159,6 +1301,29 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
}
private Group createGroupWithPatients() {
Patient patient = new Patient();
patient.setId("PF");
patient.setGender(Enumerations.AdministrativeGender.FEMALE);
patient.setActive(true);
myClient.update().resource(patient).execute();
patient = new Patient();
patient.setId("PM");
patient.setGender(Enumerations.AdministrativeGender.MALE);
patient.setActive(true);
myClient.update().resource(patient).execute();
Group group = new Group();
group.setId("Group/G");
group.setActive(true);
group.addMember().getEntity().setReference("Patient/PF");
group.addMember().getEntity().setReference("Patient/PM");
myClient.update().resource(group).execute();
return group;
}
private Map<String, String> convertJobResultsToStringContents(BulkExportJobResults theResults) {
Map<String, String> typeToResources = new HashMap<>();
for (Map.Entry<String, List<String>> entry : theResults.getResourceTypeToBinaryIds().entrySet()) {
@@ -1206,29 +1371,91 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
return startBulkExportJobAndAwaitCompletion(BulkDataExportOptions.ExportStyle.SYSTEM, theResourceTypes, theFilters, null);
}
- BulkExportJobResults startBulkExportJobAndAwaitCompletion(BulkDataExportOptions.ExportStyle theExportStyle, Set<String> theResourceTypes, Set<String> theFilters, String theGroupOrPatientId) {
- BulkDataExportOptions options = new BulkDataExportOptions();
- options.setResourceTypes(theResourceTypes);
- options.setFilters(theFilters);
- options.setExportStyle(theExportStyle);
+ BulkExportJobResults startBulkExportJobAndAwaitCompletion(
+ BulkDataExportOptions.ExportStyle theExportStyle,
+ Set<String> theResourceTypes,
+ Set<String> theFilters,
+ String theGroupOrPatientId
+ ) {
+ Parameters parameters = new Parameters();
+ parameters.addParameter(JpaConstants.PARAM_EXPORT_OUTPUT_FORMAT, Constants.CT_FHIR_NDJSON);
+ if (theFilters != null && !theFilters.isEmpty()) {
+ for (String typeFilter : theFilters) {
+ parameters.addParameter(
+ JpaConstants.PARAM_EXPORT_TYPE_FILTER,
+ typeFilter
+ );
+ }
+ }
+ if (theResourceTypes != null && !theResourceTypes.isEmpty()) {
+ parameters.addParameter(
+ JpaConstants.PARAM_EXPORT_TYPE,
+ String.join(",", theResourceTypes)
+ );
+ }
+ MethodOutcome outcome;
if (theExportStyle == BulkDataExportOptions.ExportStyle.GROUP) {
- options.setGroupId(new IdType("Group", theGroupOrPatientId));
- }
- if (theExportStyle == BulkDataExportOptions.ExportStyle.PATIENT && theGroupOrPatientId != null) {
+ outcome = myClient
+ .operation()
+ .onInstance("Group/" + theGroupOrPatientId)
+ .named(JpaConstants.OPERATION_EXPORT)
+ .withParameters(parameters)
+ .returnMethodOutcome()
+ .withAdditionalHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC)
+ .execute();
+ } else if (theExportStyle == BulkDataExportOptions.ExportStyle.PATIENT && theGroupOrPatientId != null) {
//TODO add support for this actual processor.
- //options.setPatientId(new IdType("Patient", theGroupOrPatientId));
+ fail("Bulk Exports that return no data do not return");
+ outcome = myClient
+ .operation()
+ .onInstance("Patient/" + theGroupOrPatientId)
+ .named(JpaConstants.OPERATION_EXPORT)
+ .withParameters(parameters)
+ .returnMethodOutcome()
+ .withAdditionalHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC)
+ .execute();
+ } else {
+ // system request
+ outcome = myClient
+ .operation()
+ .onServer()
+ .named(JpaConstants.OPERATION_EXPORT)
+ .withParameters(parameters)
+ .returnMethodOutcome()
+ .withAdditionalHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC)
+ .execute();
}
- options.setOutputFormat(Constants.CT_FHIR_NDJSON);
- Batch2JobStartResponse startResponse = myJobRunner.startNewJob(BulkExportUtils.createBulkExportJobParametersFromExportOptions(options));
- assertNotNull(startResponse);
- myBatch2JobHelper.awaitJobCompletion(startResponse.getInstanceId(), 60);
- await().atMost(300, TimeUnit.SECONDS).until(() -> myJobRunner.getJobInfo(startResponse.getInstanceId()).getReport() != null);
- String report = myJobRunner.getJobInfo(startResponse.getInstanceId()).getReport();
+ assertNotNull(outcome);
+ assertEquals(202, outcome.getResponseStatusCode());
+ String pollLocation = null;
+ for (String header : outcome.getResponseHeaders().keySet()) {
+ // headers are in lowercase
+ // constants are in Pascal Case
+ // :(
+ if (header.equalsIgnoreCase(Constants.HEADER_CONTENT_LOCATION)) {
+ pollLocation = outcome.getResponseHeaders().get(header).get(0);
+ break;
+ }
+ }
+ assertNotNull(pollLocation);
+ UrlUtil.UrlParts parts = UrlUtil.parseUrl(pollLocation);
+ assertTrue(isNotBlank(parts.getParams()));
+ Map<String, String[]> queryParams = UrlUtil.parseQueryString(parts.getParams());
+ assertTrue(queryParams.containsKey(JpaConstants.PARAM_EXPORT_POLL_STATUS_JOB_ID));
+ String jobInstanceId = queryParams.get(JpaConstants.PARAM_EXPORT_POLL_STATUS_JOB_ID)[0];
+ assertNotNull(jobInstanceId);
+ myBatch2JobHelper.awaitJobCompletion(jobInstanceId, 60);
+ await().atMost(300, TimeUnit.SECONDS).until(() -> myJobRunner.getJobInfo(jobInstanceId).getReport() != null);
+ String report = myJobRunner.getJobInfo(jobInstanceId).getReport();
BulkExportJobResults results = JsonUtil.deserialize(report, BulkExportJobResults.class);
return results;
}
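The rewritten helper follows the FHIR Bulk Data async pattern: kick off with `Prefer: respond-async`, receive a 202 with a `Content-Location` poll URL, and pull the job id out of its query string. That last step can be sketched with plain JDK parsing; the URL shape and the `_jobId` parameter name below are illustrative assumptions, not the actual `JpaConstants` values:

```java
import java.net.URI;
import java.util.HashMap;
import java.util.Map;

public class PollLocationParser {

	// Splits a query string like "a=1&b=2" into a map (illustrative; the real
	// test uses HAPI's UrlUtil.parseQueryString for this).
	static Map<String, String> parseQuery(String theQuery) {
		Map<String, String> params = new HashMap<>();
		if (theQuery == null || theQuery.isEmpty()) {
			return params;
		}
		for (String pair : theQuery.split("&")) {
			int eq = pair.indexOf('=');
			if (eq > 0) {
				params.put(pair.substring(0, eq), pair.substring(eq + 1));
			}
		}
		return params;
	}

	// Pulls a job id out of an async-export poll location.
	static String extractJobId(String thePollLocation, String theParamName) {
		URI uri = URI.create(thePollLocation);
		return parseQuery(uri.getRawQuery()).get(theParamName);
	}

	public static void main(String[] args) {
		String poll = "http://localhost:8000/$export-poll-status?_jobId=abc-123";
		System.out.println(extractJobId(poll, "_jobId")); // prints abc-123
	}
}
```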


@@ -6,11 +6,15 @@ import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
import ca.uhn.fhir.model.api.Include;
import ca.uhn.fhir.rest.api.server.IBundleProvider;
+ import ca.uhn.fhir.rest.param.DateParam;
+ import ca.uhn.fhir.rest.param.DateRangeParam;
+ import ca.uhn.fhir.rest.param.ParamPrefixEnum;
import ca.uhn.fhir.rest.param.TokenParam;
import ca.uhn.fhir.rest.server.SimpleBundleProvider;
import org.hamcrest.Matcher;
import org.hamcrest.Matchers;
import org.hamcrest.collection.IsIterableContainingInAnyOrder;
+ import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.BodyStructure;
import org.hl7.fhir.r4.model.CarePlan;
import org.hl7.fhir.r4.model.Enumerations;
@@ -22,7 +26,11 @@ import org.hl7.fhir.r4.model.Reference;
import org.hl7.fhir.r4.model.SearchParameter;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.Test;
+ import org.springframework.transaction.support.TransactionTemplate;
+ import java.sql.Date;
+ import java.time.Instant;
+ import java.time.temporal.ChronoUnit;
import java.util.Collection;
import java.util.List;
import java.util.stream.Collectors;
@@ -263,4 +271,43 @@ public class FhirResourceDaoR4SearchIncludeTest extends BaseJpaR4Test {
myCarePlanDao.update(carePlan);
}
}
/**
* https://github.com/hapifhir/hapi-fhir/issues/4896
*/
@Test
void testLastUpdatedDoesNotApplyToForwardOrRevIncludes() {
// given
Instant now = Instant.now();
IIdType org = createOrganization();
IIdType patId = createPatient(withReference("managingOrganization", org));
IIdType groupId = createGroup(withGroupMember(patId));
IIdType careTeam = createResource("CareTeam", withSubject(patId));
// backdate the Group and CareTeam
int updatedCount = new TransactionTemplate(myTxManager).execute((status)->
myEntityManager
.createQuery("update ResourceTable set myUpdated = :new_updated where myId in (:target_ids)")
.setParameter("new_updated", Date.from(now.minus(1, ChronoUnit.HOURS)))
.setParameter("target_ids", List.of(groupId.getIdPartAsLong(), careTeam.getIdPartAsLong(), org.getIdPartAsLong()))
.executeUpdate());
assertEquals(3, updatedCount, "backdated the Organization, CareTeam and Group");
// when
// "Patient?_lastUpdated=gt2023-01-01&_revinclude=Group:member&_revinclude=CareTeam:subject&_include=Patient:organization");
SearchParameterMap map = new SearchParameterMap();
map.setLastUpdated(new DateRangeParam(new DateParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, Date.from(now))));
map.addInclude(new Include("Patient:organization"));
map.addRevInclude(new Include("Group:member"));
map.addRevInclude(new Include("CareTeam:subject"));
IBundleProvider outcome = myPatientDao.search(map, mySrd);
List<String> ids = toUnqualifiedVersionlessIdValues(outcome);
// then
assertThat(ids, Matchers.containsInAnyOrder(patId.getValue(), groupId.getValue(), careTeam.getValue(), org.getValue()));
}
}
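The behavior this new test pins down — `_lastUpdated` constrains only the primary matches, while `_include`/`_revinclude` resources are added afterwards without re-applying the date filter — can be sketched abstractly (the record and helper below are hypothetical illustrations, not HAPI internals):

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

public class IncludeFilterSketch {

	// Minimal stand-in for a stored resource: an id plus its last-updated stamp.
	record Res(String id, Instant updated) {}

	// The date filter applies to the match set only; includes are appended unfiltered.
	static List<String> search(List<Res> matches, List<Res> includes, Instant since) {
		List<String> ids = new ArrayList<>();
		for (Res r : matches) {
			if (!r.updated().isBefore(since)) {
				ids.add(r.id());
			}
		}
		for (Res r : includes) {
			ids.add(r.id()); // no date check here, by design
		}
		return ids;
	}

	public static void main(String[] args) {
		Instant now = Instant.now();
		List<Res> matches = List.of(new Res("Patient/1", now));
		List<Res> includes = List.of(new Res("Group/2", now.minusSeconds(3600)));
		// The backdated Group is still returned because includes bypass _lastUpdated.
		System.out.println(search(matches, includes, now.minusSeconds(60)));
	}
}
```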


@@ -678,14 +678,14 @@ public class FhirResourceDaoR4ValidateTest extends BaseJpaR4Test {
obs.getCode().getCoding().clear();
obs.getCategory().clear();
obs.getCategoryFirstRep().addCoding().setSystem("http://terminology.hl7.org/CodeSystem/observation-category").setCode("vital-signs");
- obs.getCode().getCodingFirstRep().setSystem("http://loinc.org").setCode("CODE4").setDisplay("Display 3");
+ obs.getCode().getCodingFirstRep().setSystem("http://loinc.org").setCode("CODE4").setDisplay("Display 4");
oo = validateAndReturnOutcome(obs);
assertEquals("No issues detected during validation", oo.getIssueFirstRep().getDiagnostics(), encode(oo));
myCaptureQueriesListener.logSelectQueriesForCurrentThread();
myCaptureQueriesListener.clear();
obs.getText().setStatus(Narrative.NarrativeStatus.GENERATED);
- obs.getCode().getCodingFirstRep().setSystem("http://loinc.org").setCode("CODE4").setDisplay("Display 3");
+ obs.getCode().getCodingFirstRep().setSystem("http://loinc.org").setCode("CODE4").setDisplay("Display 4");
oo = validateAndReturnOutcome(obs);
assertEquals("No issues detected during validation", oo.getIssueFirstRep().getDiagnostics(), encode(oo));
myCaptureQueriesListener.logSelectQueriesForCurrentThread();
@@ -737,7 +737,7 @@ public class FhirResourceDaoR4ValidateTest extends BaseJpaR4Test {
obs.setStatus(ObservationStatus.FINAL);
obs.setValue(new StringType("This is the value"));
obs.getText().setStatus(Narrative.NarrativeStatus.GENERATED);
- obs.getCode().getCodingFirstRep().setSystem("http://loinc.org").setCode("123-4").setDisplay("Display 3");
+ obs.getCode().getCodingFirstRep().setSystem("http://loinc.org").setCode("123-4").setDisplay("Code 123 4");
OperationOutcome oo;
@@ -807,7 +807,7 @@ public class FhirResourceDaoR4ValidateTest extends BaseJpaR4Test {
obs.setStatus(ObservationStatus.FINAL);
obs.setValue(new StringType("This is the value"));
obs.getText().setStatus(Narrative.NarrativeStatus.GENERATED);
- obs.getCode().getCodingFirstRep().setSystem("http://loinc.org").setCode("123-4").setDisplay("Display 3");
+ obs.getCode().getCodingFirstRep().setSystem("http://loinc.org").setCode("123-4").setDisplay("Code 123 4");
OperationOutcome oo;
@@ -878,7 +878,7 @@ public class FhirResourceDaoR4ValidateTest extends BaseJpaR4Test {
obs.setStatus(ObservationStatus.FINAL);
obs.setValue(new StringType("This is the value"));
obs.getText().setStatus(Narrative.NarrativeStatus.GENERATED);
- obs.getCode().getCodingFirstRep().setSystem("http://loinc.org").setCode("123-4").setDisplay("Display 3");
+ obs.getCode().getCodingFirstRep().setSystem("http://loinc.org").setCode("123-4").setDisplay("Code 123 4");
// Non-existent target
obs.setSubject(new Reference("Group/123"));
@@ -1381,6 +1381,43 @@ public class FhirResourceDaoR4ValidateTest extends BaseJpaR4Test {
}
@Test
public void testValidateUsingExternallyDefinedCodeMisMatchDisplay_ShouldError() {
CodeSystem codeSystem = new CodeSystem();
codeSystem.setUrl("http://foo");
codeSystem.setContent(CodeSystem.CodeSystemContentMode.NOTPRESENT);
IIdType csId = myCodeSystemDao.create(codeSystem).getId();
TermCodeSystemVersion csv = new TermCodeSystemVersion();
csv.addConcept().setCode("bar").setDisplay("Bar Code");
myTermCodeSystemStorageSvc.storeNewCodeSystemVersion(codeSystem, csv, mySrd, Collections.emptyList(), Collections.emptyList());
// Validate a resource containing this codesystem in a field with an extendable binding
Patient patient = new Patient();
patient.getText().setStatus(Narrative.NarrativeStatus.GENERATED).setDivAsString("<div>hello</div>");
patient
.addIdentifier()
.setSystem("http://example.com")
.setValue("12345")
.getType()
.addCoding()
.setSystem("http://foo")
.setCode("bar")
.setDisplay("not bar code");
MethodOutcome outcome = myPatientDao.validate(patient, null, encode(patient), EncodingEnum.JSON, ValidationModeEnum.CREATE, null, mySrd);
OperationOutcome oo = (OperationOutcome) outcome.getOperationOutcome();
ourLog.debug(myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(oo));
// It would be ok for this to produce 0 issues, or just an information message too
assertEquals(2, OperationOutcomeUtil.getIssueCount(myFhirContext, oo));
assertThat(OperationOutcomeUtil.getFirstIssueDetails(myFhirContext, oo),
containsString("None of the codings provided are in the value set 'IdentifierType'"));
assertThat(OperationOutcomeUtil.getFirstIssueDetails(myFhirContext, oo),
containsString("a coding should come from this value set unless it has no suitable code (note that the validator cannot judge what is suitable) (codes = http://foo#bar)"));
assertEquals(OperationOutcome.IssueSeverity.ERROR, oo.getIssue().get(1).getSeverity());
assertThat(oo.getIssue().get(1).getDiagnostics(), containsString("Unable to validate code http://foo#bar - Concept Display "));
}
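The failing condition this test exercises — a submitted `Coding.display` that disagrees with the display defined in the CodeSystem — can be illustrated with a minimal lookup. The map and messages below are hypothetical stand-ins, not the validator's real machinery:

```java
import java.util.Map;
import java.util.Optional;

public class DisplayCheckSketch {

	// code -> canonical display, standing in for a terminology lookup
	static final Map<String, String> CODE_SYSTEM = Map.of("bar", "Bar Code");

	// Returns an error message when the supplied display does not match the
	// CodeSystem's definition; empty means the coding validates cleanly.
	static Optional<String> checkDisplay(String code, String display) {
		String expected = CODE_SYSTEM.get(code);
		if (expected == null) {
			return Optional.of("Unknown code: " + code);
		}
		if (!expected.equalsIgnoreCase(display)) {
			return Optional.of("Concept Display \"" + display
				+ "\" does not match expected \"" + expected + "\"");
		}
		return Optional.empty();
	}

	public static void main(String[] args) {
		System.out.println(checkDisplay("bar", "not bar code")); // mismatch -> error present
		System.out.println(checkDisplay("bar", "Bar Code"));     // matches -> Optional.empty
	}
}
```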
private OperationOutcome doTestValidateResourceContainingProfileDeclaration(String methodName, EncodingEnum enc) throws IOException {
Bundle vss = loadResourceFromClasspath(Bundle.class, "/org/hl7/fhir/r4/model/valueset/valuesets.xml");
myValueSetDao.update((ValueSet) findResourceByIdInBundle(vss, "observation-status"), mySrd);


@@ -105,6 +105,7 @@ import java.util.Set;
import java.util.UUID;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
+ import java.util.stream.IntStream;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.contains;
@@ -4110,6 +4111,41 @@ public class FhirSystemDaoR4Test extends BaseJpaR4SystemTest {
}
@Test
public void testOrganizationOver100ReferencesFromBundleNoMultipleResourcesMatchError() throws IOException {
myStorageSettings.setAllowInlineMatchUrlReferences(true);
// The bug involves an invalid Long equality comparison, so we need a generated organization ID much larger than 1.
IntStream.range(0, 150).forEach(myint -> {
Patient patient = new Patient();
patient.addIdentifier().setSystem("http://www.ghh.org/identifiers").setValue("condreftestpatid1");
myPatientDao.create(patient, mySrd);
});
final Organization organization = new Organization();
organization.addIdentifier().setSystem("https://github.com/synthetichealth/synthea")
.setValue("9395b8cb-702c-3c5d-926e-1c3524fd6560");
organization.setName("PCP1401");
myOrganizationDao.create(organization, mySrd);
// This bundle needs to have over 100 resources, each referring to the same organization above.
// If there are 100 or less, then TransactionProcessor.preFetchConditionalUrls() will work off the same Long instance for the Organization JpaId
// and the Long == Long equality comparison will work
final InputStream resourceAsStream = getClass().getResourceAsStream("/bundle-refers-to-same-organization.json");
assertNotNull(resourceAsStream);
final String input = IOUtils.toString(resourceAsStream, StandardCharsets.UTF_8);
final Bundle bundle = myFhirContext.newJsonParser().parseResource(Bundle.class, input);
try {
mySystemDao.transaction(mySrd, bundle);
} catch (PreconditionFailedException thePreconditionFailedException) {
if (thePreconditionFailedException.getMessage().contains(Msg.code(2207))) {
fail("This test has failed with HAPI-2207, exactly the condition we aim to prevent");
}
// else let the Exception bubble up
}
}
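The Long identity-comparison pitfall this test guards against comes from autoboxing: `Long.valueOf` is only guaranteed to cache values in [-128, 127], so database-generated ids beyond that range produce distinct boxed objects, and `==` fails where `.equals()` succeeds:

```java
public class LongEqualityDemo {
	public static void main(String[] args) {
		Long small1 = 127L, small2 = 127L; // inside the guaranteed cache range
		Long big1 = 1066L, big2 = 1066L;   // typical generated-id territory

		System.out.println(small1 == small2);  // true: both refer to the cached instance
		System.out.println(big1 == big2);      // false on standard JDKs: distinct boxes
		System.out.println(big1.equals(big2)); // true: the correct value comparison
	}
}
```

This is why `preFetchConditionalUrls()` only appeared to work while the same cached `Long` instance happened to be reused.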
@Test
public void testTransactionWithInlineMatchUrlNoMatches() throws Exception {
myStorageSettings.setAllowInlineMatchUrlReferences(true);


@@ -1121,7 +1121,6 @@ public class AuthorizationInterceptorJpaR4Test extends BaseResourceProviderR4Test {
ourLog.debug(myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(resp));
}
@Test
public void testOperationEverything_SomeIncludedResourcesNotAuthorized() {
Patient pt1 = new Patient();


@@ -1,5 +1,6 @@
package ca.uhn.fhir.jpa.provider.r4;
+ import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.HookParams;
import ca.uhn.fhir.interceptor.api.IAnonymousInterceptor;
import ca.uhn.fhir.interceptor.api.Pointcut;
@@ -24,6 +25,7 @@ import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.ByteArrayEntity;
import org.apache.http.entity.ContentType;
+ import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.Attachment;
import org.hl7.fhir.r4.model.Binary;
@@ -49,14 +51,17 @@ import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.matchesPattern;
import static org.hamcrest.Matchers.notNullValue;
+ import static org.hamcrest.Matchers.startsWith;
import static org.junit.jupiter.api.Assertions.assertArrayEquals;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertFalse;
+ import static org.junit.jupiter.api.Assertions.assertNull;
import static org.junit.jupiter.api.Assertions.fail;
import static org.mockito.Mockito.any;
import static org.mockito.Mockito.doAnswer;
import static org.mockito.Mockito.eq;
import static org.mockito.Mockito.mock;
+ import static org.mockito.Mockito.spy;
import static org.mockito.Mockito.timeout;
import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;
@@ -252,7 +257,7 @@ public class BinaryAccessProviderR4Test extends BaseResourceProviderR4Test {
Attachment attachment = ref.getContentFirstRep().getAttachment();
assertEquals(ContentType.IMAGE_JPEG.getMimeType(), attachment.getContentType());
assertEquals(15, attachment.getSize());
- assertEquals(null, attachment.getData());
+ assertNull(attachment.getData());
assertEquals("2", ref.getMeta().getVersionId());
attachmentId = attachment.getDataElement().getExtensionString(HapiExtensions.EXT_EXTERNALIZED_BINARY_ID);
assertThat(attachmentId, matchesPattern("[a-zA-Z0-9]{100}"));
@@ -311,7 +316,7 @@ public class BinaryAccessProviderR4Test extends BaseResourceProviderR4Test {
Attachment attachment = ref.getContentFirstRep().getAttachment();
assertEquals(ContentType.IMAGE_JPEG.getMimeType(), attachment.getContentType());
assertEquals(15, attachment.getSize());
- assertEquals(null, attachment.getData());
+ assertNull(attachment.getData());
assertEquals("2", ref.getMeta().getVersionId());
attachmentId = attachment.getDataElement().getExtensionString(HapiExtensions.EXT_EXTERNALIZED_BINARY_ID);
assertThat(attachmentId, matchesPattern("[a-zA-Z0-9]{100}"));
@@ -398,7 +403,7 @@ public class BinaryAccessProviderR4Test extends BaseResourceProviderR4Test {
assertArrayEquals(SOME_BYTES_2, attachment.getData());
assertEquals("2", ref.getMeta().getVersionId());
attachmentId = attachment.getExtensionString(HapiExtensions.EXT_EXTERNALIZED_BINARY_ID);
- assertEquals(null, attachmentId);
+ assertNull(attachmentId);
}
@@ -447,7 +452,7 @@ public class BinaryAccessProviderR4Test extends BaseResourceProviderR4Test {
Binary target = myFhirContext.newJsonParser().parseResource(Binary.class, response);
assertEquals(ContentType.IMAGE_JPEG.getMimeType(), target.getContentType());
- assertEquals(null, target.getData());
+ assertNull(target.getData());
assertEquals("2", target.getMeta().getVersionId());
attachmentId = target.getDataElement().getExtensionString(HapiExtensions.EXT_EXTERNALIZED_BINARY_ID);
assertThat(attachmentId, matchesPattern("[a-zA-Z0-9]{100}"));
@@ -512,7 +517,7 @@ public class BinaryAccessProviderR4Test extends BaseResourceProviderR4Test {
Binary target = myFhirContext.newJsonParser().parseResource(Binary.class, response);
assertEquals(ContentType.IMAGE_JPEG.getMimeType(), target.getContentType());
- assertEquals(null, target.getData());
+ assertNull(target.getData());
assertEquals("2", target.getMeta().getVersionId());
attachmentId = target.getDataElement().getExtensionString(HapiExtensions.EXT_EXTERNALIZED_BINARY_ID);
assertThat(attachmentId, matchesPattern("[a-zA-Z0-9]{100}"));
@@ -538,6 +543,66 @@ public class BinaryAccessProviderR4Test extends BaseResourceProviderR4Test {
}
static class BinaryBlobIdInterceptor {
@Hook(Pointcut.STORAGE_BINARY_ASSIGN_BLOB_ID_PREFIX)
public String provideBlobIdForBinary(RequestDetails theRequestDetails, IBaseResource theResource) {
ourLog.info("Received binary for prefixing!");
return "test-blob-id-prefix";
}
}
@Test
public void testWriteLargeBinaryToDocumentReference_callsBlobIdPrefixHook() throws IOException {
byte[] bytes = new byte[1234];
for (int i = 0; i < bytes.length; i++) {
bytes[i] = (byte) (((float)Byte.MAX_VALUE) * Math.random());
}
DocumentReference dr = new DocumentReference();
dr.addContent().getAttachment()
.setContentType("application/pdf")
.setSize(12345)
.setTitle("hello")
.setCreationElement(new DateTimeType("2002"));
IIdType id = myClient.create().resource(dr).execute().getId().toUnqualifiedVersionless();
BinaryBlobIdInterceptor interceptor = spy(new BinaryBlobIdInterceptor());
myInterceptorRegistry.registerInterceptor(interceptor);
try {
// Write using the operation
String path = myServerBase +
"/DocumentReference/" + id.getIdPart() + "/" +
JpaConstants.OPERATION_BINARY_ACCESS_WRITE +
"?path=DocumentReference.content.attachment";
HttpPost post = new HttpPost(path);
post.setEntity(new ByteArrayEntity(bytes, ContentType.IMAGE_JPEG));
post.addHeader("Accept", "application/fhir+json; _pretty=true");
String attachmentId;
try (CloseableHttpResponse resp = ourHttpClient.execute(post)) {
assertEquals(200, resp.getStatusLine().getStatusCode());
assertThat(resp.getEntity().getContentType().getValue(), containsString("application/fhir+json"));
String response = IOUtils.toString(resp.getEntity().getContent(), Constants.CHARSET_UTF8);
ourLog.info("Response: {}", response);
DocumentReference target = myFhirContext.newJsonParser().parseResource(DocumentReference.class, response);
assertNull(target.getContentFirstRep().getAttachment().getData());
assertEquals("2", target.getMeta().getVersionId());
attachmentId = target.getContentFirstRep().getAttachment().getDataElement().getExtensionString(HapiExtensions.EXT_EXTERNALIZED_BINARY_ID);
assertThat(attachmentId, startsWith("test-blob-id-prefix"));
}
verify(interceptor, timeout(5_000).times(1)).provideBlobIdForBinary(any(), any());
verifyNoMoreInteractions(interceptor);
} finally {
myInterceptorRegistry.unregisterInterceptor(interceptor);
}
}
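The interceptor pattern exercised above — a `STORAGE_BINARY_ASSIGN_BLOB_ID_PREFIX` hook contributes a prefix and the storage service combines it with its own generated id — can be sketched as follows. The combining rule shown (prefix, dash, random suffix) is an assumption for illustration, not HAPI's exact blob-id format:

```java
import java.util.UUID;
import java.util.function.Supplier;

public class BlobIdSketch {

	// Stand-in for the hook: returns a prefix, or null when no interceptor
	// wants to influence the blob id.
	static String assignBlobId(Supplier<String> prefixHook) {
		String random = UUID.randomUUID().toString().replace("-", "");
		String prefix = prefixHook.get();
		return prefix == null ? random : prefix + "-" + random;
	}

	public static void main(String[] args) {
		System.out.println(assignBlobId(() -> "test-blob-id-prefix")); // test-blob-id-prefix-<random>
		System.out.println(assignBlobId(() -> null));                  // just <random>
	}
}
```

This is why the test asserts with `startsWith("test-blob-id-prefix")` rather than matching the full id: the suffix is random on every write.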
@Test
public void testWriteLargeBinaryToDocumentReference() throws IOException {
byte[] bytes = new byte[134696];
@@ -577,7 +642,7 @@ public class BinaryAccessProviderR4Test extends BaseResourceProviderR4Test {
DocumentReference target = myFhirContext.newJsonParser().parseResource(DocumentReference.class, response);
- assertEquals(null, target.getContentFirstRep().getAttachment().getData());
+ assertNull(target.getContentFirstRep().getAttachment().getData());
assertEquals("2", target.getMeta().getVersionId());
attachmentId = target.getContentFirstRep().getAttachment().getDataElement().getExtensionString(HapiExtensions.EXT_EXTERNALIZED_BINARY_ID);
assertThat(attachmentId, matchesPattern("[a-zA-Z0-9]{100}"));


@@ -10,10 +10,10 @@ import ca.uhn.fhir.jpa.binary.interceptor.BinaryStorageInterceptor;
import ca.uhn.fhir.jpa.binstore.MemoryBinaryStorageSvcImpl;
import ca.uhn.fhir.jpa.model.entity.StorageSettings;
import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.client.api.IClientInterceptor;
import ca.uhn.fhir.rest.client.api.IHttpRequest;
import ca.uhn.fhir.rest.client.api.IHttpResponse;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.util.HapiExtensions;
import org.hl7.fhir.instance.model.api.IBaseHasExtensions;
@@ -27,14 +27,14 @@ import org.hl7.fhir.r4.model.StringType;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.junit.jupiter.MockitoExtension;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import java.util.stream.Collectors;
import static org.hamcrest.CoreMatchers.is;
import static org.hamcrest.CoreMatchers.notNullValue;
import static org.hamcrest.MatcherAssert.assertThat;
@@ -46,7 +46,12 @@ import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertNull;
import static org.junit.jupiter.api.Assertions.fail;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.spy;
import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;
@ExtendWith(MockitoExtension.class)
public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
public static final byte[] FEW_BYTES = {4, 3, 2, 1};
@@ -57,7 +62,7 @@ public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
@Autowired
private JpaStorageSettings myStorageSettings;
@Autowired
private StorageSettings myOldStorageSettings;
@Autowired
private MemoryBinaryStorageSvcImpl myStorageSvc;
@@ -89,7 +94,7 @@ public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
myInterceptorRegistry.unregisterInterceptor(myBinaryStorageInterceptor);
}
private static class BinaryFilePrefixingInterceptor {
@Hook(Pointcut.STORAGE_BINARY_ASSIGN_BLOB_ID_PREFIX)
public String provideFilenameForBinary(RequestDetails theRequestDetails, IBaseResource theResource) {
@@ -117,7 +122,7 @@ public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
DaoMethodOutcome outcome = myBinaryDao.create(binary, mySrd);
// Make sure it was externalized
outcome.getId().toUnqualifiedVersionless();
String encoded = myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(outcome.getResource());
ourLog.info("Encoded: {}", encoded);
assertThat(encoded, containsString(HapiExtensions.EXT_EXTERNALIZED_BINARY_ID));
@@ -125,6 +130,34 @@ public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
myInterceptorRegistry.unregisterInterceptor(interceptor);
}
private static class BinaryBlobIdPrefixInterceptor {
@Hook(Pointcut.STORAGE_BINARY_ASSIGN_BLOB_ID_PREFIX)
public String provideBlobIdForBinary(RequestDetails theRequestDetails, IBaseResource theResource) {
ourLog.info("Received binary for prefixing! {}", theResource.getIdElement());
return "prefix-test-blob-id-";
}
}
@Test
public void testExternalizingBinaryFromRequestTriggersPointcutOnce() {
BinaryBlobIdPrefixInterceptor interceptor = spy(new BinaryBlobIdPrefixInterceptor());
myInterceptorRegistry.registerInterceptor(interceptor);
// Create a plain Binary resource
Binary binary = new Binary();
binary.setContentType("application/octet-stream");
binary.setData(SOME_BYTES);
DaoMethodOutcome outcome = myBinaryDao.create(binary, mySrd);
// Make sure blobId prefix was set and pointcut called only once
outcome.getId().toUnqualifiedVersionless();
String encoded = myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(outcome.getResource());
ourLog.info("Encoded: {}", encoded);
assertThat(encoded, containsString("\"valueString\": \"prefix-test-blob-id-"));
verify(interceptor, times(1)).provideBlobIdForBinary(any(), any());
myInterceptorRegistry.unregisterInterceptor(interceptor);
}
@Test
public void testCreateAndRetrieveBinary_ServerAssignedId_ExternalizedBinary() {
@@ -219,7 +252,7 @@ public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
}
static class ContentTypeStrippingInterceptor implements IClientInterceptor {
@Override
public void interceptRequest(IHttpRequest theRequest) {
@@ -228,7 +261,7 @@ public class BinaryStorageInterceptorR4Test extends BaseResourceProviderR4Test {
}
@Override
public void interceptResponse(IHttpResponse theResponse) {
}
}
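The tests above lean on Mockito's `spy`/`verify(times(1))` to prove the blob-id pointcut fires exactly once per stored binary. The same exactly-once guarantee can be sketched in plain Java with a counting hook; the class and method names here are illustrative stand-ins, not HAPI's API:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative only: a blob-id-prefix hook that records how often the
// storage layer invoked it, so a test can assert "exactly once".
public class CountingPrefixHook {
	private final AtomicInteger invocations = new AtomicInteger();

	// Stands in for the STORAGE_BINARY_ASSIGN_BLOB_ID_PREFIX hook method
	public String provideBlobIdForBinary(Object theRequestDetails, Object theResource) {
		invocations.incrementAndGet();
		return "prefix-test-blob-id-";
	}

	public int invocationCount() {
		return invocations.get();
	}

	public static int demo() {
		CountingPrefixHook hook = new CountingPrefixHook();
		// The storage service would call the hook once per stored blob
		hook.provideBlobIdForBinary(null, null);
		return hook.invocationCount();
	}
}
```

The count-based check is what `verify(interceptor, times(1))` does under the hood, minus the proxying.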


@@ -0,0 +1,66 @@
package ca.uhn.fhir.jpa.provider.r4;
import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
import ca.uhn.fhir.model.primitive.IdDt;
import ca.uhn.fhir.model.primitive.StringDt;
import ca.uhn.fhir.rest.api.MethodOutcome;
import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.StringType;
import org.junit.jupiter.api.Test;
import static ca.uhn.fhir.jpa.model.util.JpaConstants.OPERATION_EXPORT;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.equalTo;
import static org.junit.jupiter.api.Assertions.assertThrows;
public class BulkExportProviderR4Test extends BaseResourceProviderR4Test {
@Test
void testBulkExport_groupNotExists_throws404() {
// given no data
ResourceNotFoundException e = assertThrows(ResourceNotFoundException.class,
() -> myClient
.operation().onInstance("Group/ABC_not_exist").named(OPERATION_EXPORT)
.withNoParameters(Parameters.class)
.withAdditionalHeader("Prefer", "respond-async")
.returnMethodOutcome()
.execute(),
"$export of missing Group throws 404");
assertThat(e.getStatusCode(), equalTo(404));
}
@Test
void testBulkExport_patientNotExists_throws404() {
// given no data
ResourceNotFoundException e = assertThrows(ResourceNotFoundException.class,
() -> myClient
.operation().onInstance("Patient/ABC_not_exist").named(OPERATION_EXPORT)
.withNoParameters(Parameters.class)
.withAdditionalHeader("Prefer", "respond-async")
.returnMethodOutcome()
.execute(),
"$export of missing Patient throws 404");
assertThat(e.getStatusCode(), equalTo(404));
}
@Test
void testBulkExport_typePatientIdNotExists_throws404() {
// given no data
ResourceNotFoundException e = assertThrows(ResourceNotFoundException.class,
() -> myClient
.operation().onType("Patient").named(OPERATION_EXPORT)
.withParameter(Parameters.class, "patient", new StringType("Patient/abc-no-way"))
.withAdditionalHeader("Prefer", "respond-async")
.returnMethodOutcome()
.execute(),
"Patient/$export with missing patient throws 404");
assertThat(e.getStatusCode(), equalTo(404));
}
}
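All three tests above share one shape: run the `$export` call, expect it to throw, and assert on the HTTP status carried by the exception. A minimal stand-in for that shape, using a hypothetical exception type rather than HAPI's `ResourceNotFoundException`:

```java
// Illustrative sketch of the assertThrows-then-check-status pattern used above.
public class Expect404Demo {
	// Hypothetical stand-in for ResourceNotFoundException
	static class NotFound extends RuntimeException {
		final int statusCode = 404;
	}

	// Runs the action, returns the status code of the expected failure,
	// and fails loudly if the action unexpectedly succeeds.
	static int statusOf(Runnable theAction) {
		try {
			theAction.run();
		} catch (NotFound e) {
			return e.statusCode;
		}
		throw new AssertionError("expected the operation to throw a 404");
	}

	public static int demo() {
		return statusOf(() -> {
			// e.g. $export on a Group that does not exist
			throw new NotFound();
		});
	}
}
```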


@@ -1,12 +1,15 @@
package ca.uhn.fhir.jpa.provider.r4;
import ca.uhn.fhir.jpa.model.util.JpaConstants;
import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
import ca.uhn.fhir.parser.IParser;
import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.EncodingEnum;
import com.google.common.base.Charsets;
import org.apache.commons.io.IOUtils;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.Account;
import org.hl7.fhir.r4.model.AdverseEvent;
import org.hl7.fhir.r4.model.AllergyIntolerance;
@@ -45,6 +48,7 @@ import org.hl7.fhir.r4.model.FamilyMemberHistory;
import org.hl7.fhir.r4.model.Flag;
import org.hl7.fhir.r4.model.Goal;
import org.hl7.fhir.r4.model.Group;
import org.hl7.fhir.r4.model.IdType;
import org.hl7.fhir.r4.model.ImagingStudy;
import org.hl7.fhir.r4.model.Immunization;
import org.hl7.fhir.r4.model.ImmunizationEvaluation;
@@ -62,6 +66,7 @@ import org.hl7.fhir.r4.model.MolecularSequence;
import org.hl7.fhir.r4.model.NutritionOrder;
import org.hl7.fhir.r4.model.Observation;
import org.hl7.fhir.r4.model.Organization;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.Patient;
import org.hl7.fhir.r4.model.Person;
import org.hl7.fhir.r4.model.Practitioner;
@@ -81,6 +86,7 @@ import org.hl7.fhir.r4.model.VisionPrescription;
import org.junit.jupiter.api.Test;
import java.io.IOException;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.TreeSet;
@@ -88,7 +94,9 @@ import java.util.TreeSet;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.hasItem;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertNull;
import static org.junit.jupiter.api.Assertions.assertTrue;
public class JpaPatientEverythingTest extends BaseResourceProviderR4Test {
@@ -1626,6 +1634,168 @@ public class JpaPatientEverythingTest extends BaseResourceProviderR4Test {
assertThat(actual, hasItem(medicationAdministrationId));
}
@Test
public void everything_typeFilterWithRecursivelyRelatedResources_shouldReturnSameAsNonTypeFilteredEverything() {
String testBundle;
{
testBundle = """
{
"resourceType": "Bundle",
"type": "transaction",
"entry": [
{
"fullUrl": "https://interop.providence.org:8000/Patient/385235",
"resource": {
"resourceType": "Patient",
"id": "385235",
"active": true,
"name": [
{
"family": "TESTING",
"given": [
"TESTER",
"T"
]
}
],
"gender": "female"
},
"request": {
"method": "POST"
}
},
{
"fullUrl": "https://interop.providence.org:8000/Encounter/385236",
"resource": {
"resourceType": "Encounter",
"id": "385236",
"subject": {
"reference": "Patient/385235"
}
},
"request": {
"method": "POST"
}
},
{
"fullUrl": "https://interop.providence.org:8000/Observation/385237",
"resource": {
"resourceType": "Observation",
"id": "385237",
"subject": {
"reference": "Patient/385235"
},
"encounter": {
"reference": "Encounter/385236"
},
"performer": [
{
"reference": "Practitioner/79070"
},
{
"reference": "Practitioner/8454"
}
],
"valueQuantity": {
"value": 100.9,
"unit": "%",
"system": "http://unitsofmeasure.org",
"code": "%"
}
},
"request": {
"method": "POST"
}
},
{
"fullUrl": "https://interop.providence.org:8000/Practitioner/8454",
"resource": {
"resourceType": "Practitioner",
"id": "8454"
},
"request": {
"method": "POST"
}
},
{
"fullUrl": "https://interop.providence.org:8000/Practitioner/79070",
"resource": {
"resourceType": "Practitioner",
"id": "79070",
"active": true
},
"request": {
"method": "POST"
}
}
]
}
""";
}
IParser parser = myFhirContext.newJsonParser();
Bundle inputBundle = parser.parseResource(Bundle.class, testBundle);
int resourceCount = inputBundle.getEntry().size();
HashSet<String> resourceTypes = new HashSet<>();
for (Bundle.BundleEntryComponent entry : inputBundle.getEntry()) {
resourceTypes.add(entry.getResource().getResourceType().name());
}
// there are 2 practitioners in the bundle
assertEquals(4, resourceTypes.size());
// pre-seed the resources
Bundle responseBundle = myClient.transaction()
.withBundle(inputBundle)
.execute();
assertNotNull(responseBundle);
assertEquals(resourceCount, responseBundle.getEntry().size());
IIdType patientId = null;
for (Bundle.BundleEntryComponent entry : responseBundle.getEntry()) {
assertEquals("201 Created", entry.getResponse().getStatus());
if (entry.getResponse().getLocation().contains("Patient")) {
patientId = new IdType(entry.getResponse().getLocation());
}
}
assertNotNull(patientId);
assertNotNull(patientId.getIdPart());
ourLog.debug("------ EVERYTHING");
// test without types filter
{
Bundle response = myClient.operation()
.onInstance(String.format("Patient/%s", patientId.getIdPart()))
.named(JpaConstants.OPERATION_EVERYTHING)
.withNoParameters(Parameters.class)
.returnResourceType(Bundle.class)
.execute();
assertNotNull(response);
assertEquals(resourceCount, response.getEntry().size());
for (Bundle.BundleEntryComponent entry : response.getEntry()) {
assertTrue(resourceTypes.contains(entry.getResource().getResourceType().name()));
}
}
ourLog.debug("------- EVERYTHING WITH TYPES");
// test with types filter
{
Parameters parameters = new Parameters();
parameters.addParameter(Constants.PARAM_TYPE, String.join(",", resourceTypes));
Bundle response = myClient.operation()
.onInstance(String.format("Patient/%s", patientId.getIdPart()))
.named(JpaConstants.OPERATION_EVERYTHING)
.withParameters(parameters)
.returnResourceType(Bundle.class)
.execute();
assertNotNull(response);
assertEquals(resourceCount, response.getEntry().size());
for (Bundle.BundleEntryComponent entry : response.getEntry()) {
assertTrue(resourceTypes.contains(entry.getResource().getResourceType().name()));
}
}
}
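The `_type`-filter assertion above boils down to a set invariant: filtering `$everything` by every resource type actually present must return the same entries as no filter at all. A plain-Java sketch of that invariant (helper names are ours, not HAPI's):

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Illustrative: the invariant checked by the $everything _type-filter test.
public class TypeFilterInvariant {
	// True when the filtered result kept every entry and every kept
	// entry belongs to one of the requested types.
	static boolean sameResults(List<String> theUnfiltered, List<String> theFiltered, Set<String> theTypes) {
		for (String type : theFiltered) {
			if (!theTypes.contains(type)) {
				return false; // filter leaked an unrequested type
			}
		}
		return theUnfiltered.size() == theFiltered.size(); // nothing was dropped
	}

	public static boolean demo() {
		// Mirrors the seeded bundle: 5 resources across 4 distinct types
		List<String> everything = List.of("Patient", "Encounter", "Observation", "Practitioner", "Practitioner");
		Set<String> types = new HashSet<>(everything); // 4 distinct type names
		return types.size() == 4 && sameResults(everything, everything, types);
	}
}
```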
private Set<String> getActualEverythingResultIds(String patientId) throws IOException {
Bundle bundle;
HttpGet get = new HttpGet(myClient.getServerBase() + "/" + patientId + "/$everything?_format=json");


@@ -5,11 +5,14 @@ import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
import ca.uhn.fhir.jpa.search.SearchCoordinatorSvcImpl;
import ca.uhn.fhir.jpa.util.QueryParameterUtils;
import ca.uhn.fhir.parser.StrictErrorHandler;
import ca.uhn.fhir.rest.api.CacheControlDirective;
import ca.uhn.fhir.rest.api.SearchTotalModeEnum;
import org.hl7.fhir.r4.model.Bundle;
import org.hl7.fhir.r4.model.Patient;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.ValueSource;
import org.springframework.test.util.AopTestUtils;
import static org.apache.commons.lang3.StringUtils.leftPad;
@@ -44,9 +47,16 @@ public class PagingMultinodeProviderR4Test extends BaseResourceProviderR4Test {
mySearchCoordinatorSvcRaw = AopTestUtils.getTargetObject(mySearchCoordinatorSvc);
}
/**
 * @param theUseCacheBoolean - true if we're using offset search,
 * false if we're using paging id
 */
@ParameterizedTest
@ValueSource(booleans = {true, false})
public void testSearch(boolean theUseCacheBoolean) {
for (int i = 0; i < 100; i++) {
Patient patient = new Patient();
String id = "A" + leftPad(Integer.toString(i), 3, '0');
@@ -55,7 +65,9 @@ public class PagingMultinodeProviderR4Test extends BaseResourceProviderR4Test {
patient.addName().setFamily(id);
myPatientDao.update(patient, mySrd).getId().toUnqualifiedVersionless();
}
CacheControlDirective directive = new CacheControlDirective();
directive.setNoStore(theUseCacheBoolean);
Bundle found;
@@ -63,32 +75,58 @@ public class PagingMultinodeProviderR4Test extends BaseResourceProviderR4Test {
mySearchCoordinatorSvcRaw.setSyncSizeForUnitTests(10);
mySearchCoordinatorSvcRaw.setNeverUseLocalSearchForUnitTests(true);
String[][] resultsPages = new String[][]{
new String[]{"Patient/A000", "Patient/A001", "Patient/A002", "Patient/A003", "Patient/A004", "Patient/A005", "Patient/A006", "Patient/A007", "Patient/A008", "Patient/A009"},
new String[]{"Patient/A010", "Patient/A011", "Patient/A012", "Patient/A013", "Patient/A014", "Patient/A015", "Patient/A016", "Patient/A017", "Patient/A018", "Patient/A019"},
new String[]{"Patient/A020", "Patient/A021", "Patient/A022", "Patient/A023", "Patient/A024", "Patient/A025", "Patient/A026", "Patient/A027", "Patient/A028", "Patient/A029"},
new String[]{"Patient/A030", "Patient/A031", "Patient/A032", "Patient/A033", "Patient/A034", "Patient/A035", "Patient/A036", "Patient/A037", "Patient/A038", "Patient/A039"}
};
// page forward
int index = 0;
found = myClient
.search()
.forResource(Patient.class)
.sort().ascending(Patient.SP_FAMILY)
.count(10)
.totalMode(SearchTotalModeEnum.ACCURATE)
.cacheControl(directive)
.offset(0)
.returnBundle(Bundle.class)
.execute();
assertThat(toUnqualifiedVersionlessIdValues(found), contains(resultsPages[index++]));
found = myClient
.loadPage()
.next(found)
.cacheControl(directive)
.execute();
assertThat(toUnqualifiedVersionlessIdValues(found), contains(resultsPages[index++]));
found = myClient
.loadPage()
.next(found)
.cacheControl(directive)
.execute();
assertThat(toUnqualifiedVersionlessIdValues(found), contains(resultsPages[index++]));
found = myClient
.loadPage()
.next(found)
.cacheControl(directive)
.execute();
assertThat(toUnqualifiedVersionlessIdValues(found), contains(resultsPages[index]));
// page backwards
while (index > 0) {
ourLog.info("Fetching back page {}", index);
found = myClient
.loadPage()
.previous(found)
.cacheControl(directive)
.execute();
assertThat(toUnqualifiedVersionlessIdValues(found), contains(resultsPages[--index]));
}
}


@@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR JPA Server Test Utilities
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.jpa.dao;
import ca.uhn.fhir.interceptor.api.IInterceptorService;


@@ -46,7 +46,7 @@ import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;
import static org.junit.jupiter.api.Assertions.assertTrue;
public abstract class QuantitySearchParameterTestCases implements ITestDataBuilder.WithSupport {
final Support myTestDataBuilder;
final TestDaoSearch myTestDaoSearch;


@@ -30,6 +30,7 @@ import ca.uhn.fhir.jpa.model.config.PartitionSettings;
import ca.uhn.fhir.jpa.searchparam.submit.config.SearchParamSubmitterConfig;
import ca.uhn.fhir.jpa.subscription.channel.config.SubscriptionChannelConfig;
import ca.uhn.fhir.jpa.subscription.match.config.SubscriptionProcessorConfig;
import ca.uhn.fhir.jpa.subscription.match.deliver.email.IEmailSender;
import ca.uhn.fhir.jpa.subscription.match.deliver.resthook.SubscriptionDeliveringRestHookSubscriber;
import ca.uhn.fhir.jpa.subscription.submit.config.SubscriptionSubmitterConfig;
import ca.uhn.fhir.jpa.term.TermCodeSystemDeleteJobSvcWithUniTestFailures;
@@ -37,6 +38,7 @@ import ca.uhn.fhir.jpa.term.api.ITermCodeSystemDeleteJobSvc;
import ca.uhn.fhir.jpa.test.Batch2JobHelper;
import ca.uhn.fhir.jpa.test.util.StoppableSubscriptionDeliveringRestHookSubscriber;
import ca.uhn.fhir.jpa.test.util.SubscriptionTestUtil;
import ca.uhn.fhir.jpa.util.LoggingEmailSender;
import ca.uhn.fhir.system.HapiTestSystemProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@@ -112,4 +114,9 @@ public class TestJPAConfig {
public IBinaryStorageSvc binaryStorage() {
return new MemoryBinaryStorageSvcImpl();
}
@Bean
public IEmailSender emailSender() {
return new LoggingEmailSender();
}
}


@@ -15,7 +15,6 @@ import ca.uhn.fhir.test.utilities.ITestDataBuilder;
import ca.uhn.test.util.LogbackCaptureTestExtension;
import ch.qos.logback.classic.Level;
import org.hamcrest.MatcherAssert;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.junit.jupiter.api.BeforeEach;
@@ -260,10 +259,6 @@ class FhirQueryRuleImplTest implements ITestDataBuilder {
@Nested
public class MisconfigurationChecks {
/**
 * in case an unsupported perm snuck through the front door.
 * Each scope provides positive perm, so unsupported means we can't vote yes. Abstain.
@@ -321,7 +316,6 @@ class FhirQueryRuleImplTest implements ITestDataBuilder {
}
}
// We need the builder to set AppliesTypeEnum, and then use that to build the matcher expression.
private AuthorizationInterceptor.Verdict applyRuleToResource(IBaseResource theResource) {


@@ -1,5 +1,7 @@
package ca.uhn.fhir.jpa.binstore;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.interceptor.executor.InterceptorService;
import org.junit.jupiter.api.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -14,6 +16,9 @@ public class BaseBinaryStorageSvcImplTest {
@Test
public void testNewRandomId() {
MemoryBinaryStorageSvcImpl svc = new MemoryBinaryStorageSvcImpl();
svc.setFhirContextForTests(FhirContext.forR4Cached());
svc.setInterceptorBroadcasterForTests(new InterceptorService());
String id = svc.newBlobId();
ourLog.info(id);
assertThat(id, matchesPattern("^[a-zA-Z0-9]{100}$"));


@@ -16,6 +16,7 @@ import ca.uhn.fhir.jpa.subscription.match.config.SubscriptionProcessorConfig;
import ca.uhn.fhir.jpa.subscription.match.config.WebsocketDispatcherConfig;
import ca.uhn.fhir.jpa.subscription.match.deliver.email.IEmailSender;
import ca.uhn.fhir.jpa.subscription.submit.config.SubscriptionSubmitterConfig;
import ca.uhn.fhir.jpa.util.LoggingEmailSender;
import ca.uhn.fhir.rest.server.interceptor.IServerInterceptor;
import ca.uhn.fhir.rest.server.interceptor.LoggingInterceptor;
import ca.uhn.fhirtest.ScheduledSubscriptionDeleter;


@@ -1,15 +0,0 @@
package ca.uhn.fhirtest.config;
import ca.uhn.fhir.jpa.subscription.match.deliver.email.EmailDetails;
import ca.uhn.fhir.jpa.subscription.match.deliver.email.IEmailSender;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class LoggingEmailSender implements IEmailSender {
private static final Logger ourLog = LoggerFactory.getLogger(LoggingEmailSender.class);
@Override
public void send(EmailDetails theDetails) {
ourLog.info("Not sending subscription email to: {}", theDetails.getTo());
}
}


@@ -49,6 +49,10 @@ public class MdmStorageInterceptor implements IMdmStorageInterceptor {
private static final Logger ourLog = LoggerFactory.getLogger(MdmStorageInterceptor.class);
// Used to bypass removing the MDM links associated with a resource while the mdm-clear batch job is
// running, since that job deletes all links up front and per-resource deletion would only cost performance
private static final ThreadLocal<Boolean> ourLinksDeletedBeforehand = ThreadLocal.withInitial(() -> Boolean.FALSE);
@Autowired @Autowired
private IExpungeEverythingService myExpungeEverythingService; private IExpungeEverythingService myExpungeEverythingService;
@Autowired @Autowired
@ -124,11 +128,14 @@ public class MdmStorageInterceptor implements IMdmStorageInterceptor {
@Hook(Pointcut.STORAGE_PRESTORAGE_RESOURCE_DELETED) @Hook(Pointcut.STORAGE_PRESTORAGE_RESOURCE_DELETED)
public void deleteMdmLinks(RequestDetails theRequest, IBaseResource theResource) { public void deleteMdmLinks(RequestDetails theRequest, IBaseResource theResource) {
if (!myMdmSettings.isSupportedMdmType(myFhirContext.getResourceType(theResource))) { if (ourLinksDeletedBeforehand.get()) {
return; return;
} }
if (myMdmSettings.isSupportedMdmType(myFhirContext.getResourceType(theResource))) {
myMdmLinkDeleteSvc.deleteWithAnyReferenceTo(theResource); myMdmLinkDeleteSvc.deleteWithAnyReferenceTo(theResource);
} }
}
private void forbidIfModifyingExternalEidOnTarget(IBaseResource theNewResource, IBaseResource theOldResource) { private void forbidIfModifyingExternalEidOnTarget(IBaseResource theNewResource, IBaseResource theOldResource) {
List<CanonicalEID> newExternalEids = Collections.emptyList(); List<CanonicalEID> newExternalEids = Collections.emptyList();
@ -219,4 +226,13 @@ public class MdmStorageInterceptor implements IMdmStorageInterceptor {
ourLog.debug("Expunging MdmLink records with reference to {}", theResource.getIdElement()); ourLog.debug("Expunging MdmLink records with reference to {}", theResource.getIdElement());
theCounter.addAndGet(myMdmLinkDeleteSvc.deleteWithAnyReferenceTo(theResource)); theCounter.addAndGet(myMdmLinkDeleteSvc.deleteWithAnyReferenceTo(theResource));
} }
public static void setLinksDeletedBeforehand() {
ourLinksDeletedBeforehand.set(Boolean.TRUE);
}
public static void resetLinksDeletedBeforehand() {
ourLinksDeletedBeforehand.remove();
}
} }
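The hunk above adds a ThreadLocal flag that a batch job can set so the delete hook skips per-resource link cleanup. A minimal standalone sketch of the intended usage pattern (hypothetical caller names, not the actual mdm-clear job code) shows why the reset belongs in a finally block:

```java
// Sketch of the set/reset discipline for a ThreadLocal bypass flag.
// The names here mirror the diff but this is not HAPI FHIR code.
public class LinksDeletedGuardSketch {
    private static final ThreadLocal<Boolean> ourLinksDeletedBeforehand =
            ThreadLocal.withInitial(() -> Boolean.FALSE);

    static void setLinksDeletedBeforehand() { ourLinksDeletedBeforehand.set(Boolean.TRUE); }
    static void resetLinksDeletedBeforehand() { ourLinksDeletedBeforehand.remove(); }

    // Stands in for the interceptor's early-return check in deleteMdmLinks()
    static boolean shouldSkipLinkCleanup() { return ourLinksDeletedBeforehand.get(); }

    public static void main(String[] args) {
        setLinksDeletedBeforehand();
        try {
            System.out.println(shouldSkipLinkCleanup()); // true: hook is bypassed
        } finally {
            // Always reset, or the flag leaks to the next task run on this pooled thread
            resetLinksDeletedBeforehand();
        }
        System.out.println(shouldSkipLinkCleanup()); // false again
    }
}
```

The try/finally matters because batch jobs typically run on pooled threads; a ThreadLocal left set would silently disable link cleanup for unrelated later work on the same thread.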

View File

@@ -257,7 +257,7 @@ public class TransactionDetails {
 private boolean matchUrlWithDiffIdExists(String theConditionalUrl, @Nonnull IResourcePersistentId thePersistentId) {
 if (myResolvedMatchUrls.containsKey(theConditionalUrl) && myResolvedMatchUrls.get(theConditionalUrl) != NOT_FOUND) {
-return myResolvedMatchUrls.get(theConditionalUrl).getId() != thePersistentId.getId();
+return !myResolvedMatchUrls.get(theConditionalUrl).getId().equals(thePersistentId.getId());
 }
 return false;
 }
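The fix above replaces reference comparison (`!=`) with `.equals()`. A standalone illustration (not HAPI code) of why this matters when IDs are boxed Longs: `!=` compares object identity, which only coincidentally "works" for values inside the JVM's Long cache (-128 to 127):

```java
// Why != on boxed IDs is a latent bug: Long.valueOf caches only small values,
// so identical large IDs are distinct objects and != wrongly reports "different".
public class BoxedIdComparison {
    public static void main(String[] args) {
        Long cachedA = 100L, cachedB = 100L;     // inside the Long cache: same object
        Long bigA = 10_000L, bigB = 10_000L;     // outside the cache: distinct objects
        System.out.println(cachedA != cachedB);  // false -- looks correct by accident
        System.out.println(bigA != bigB);        // true  -- wrongly reports a different ID
        System.out.println(!bigA.equals(bigB));  // false -- the correct comparison
    }
}
```

In practice such a bug passes small-scale tests (low IDs fall in the cache) and only surfaces once real persistent IDs grow large, which is what makes the `.equals()` change worth calling out.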

View File

@@ -52,7 +52,6 @@ import java.util.Collections;
 import java.util.HashSet;
 import java.util.IdentityHashMap;
 import java.util.List;
-import java.util.Objects;
 import java.util.Set;
 import java.util.concurrent.atomic.AtomicInteger;
@@ -412,13 +411,36 @@ public class AuthorizationInterceptor implements IRuleApplier {
 @Hook(Pointcut.STORAGE_INITIATE_BULK_EXPORT)
 public void initiateBulkExport(RequestDetails theRequestDetails, BulkDataExportOptions theBulkExportOptions, Pointcut thePointcut) {
+// RestOperationTypeEnum restOperationType = determineRestOperationTypeFromBulkExportOptions(theBulkExportOptions);
 RestOperationTypeEnum restOperationType = RestOperationTypeEnum.EXTENDED_OPERATION_SERVER;
 if (theRequestDetails != null) {
 theRequestDetails.setAttribute(REQUEST_ATTRIBUTE_BULK_DATA_EXPORT_OPTIONS, theBulkExportOptions);
 }
 applyRulesAndFailIfDeny(restOperationType, theRequestDetails, null, null, null, thePointcut);
 }
+/**
+ * TODO GGG This method should eventually be used when invoking the rules applier; however, we currently rely on the
+ * incorrect behaviour of passing down `EXTENDED_OPERATION_SERVER`.
+ */
+private RestOperationTypeEnum determineRestOperationTypeFromBulkExportOptions(BulkDataExportOptions theBulkExportOptions) {
+RestOperationTypeEnum restOperationType = RestOperationTypeEnum.EXTENDED_OPERATION_SERVER;
+BulkDataExportOptions.ExportStyle exportStyle = theBulkExportOptions.getExportStyle();
+if (exportStyle.equals(BulkDataExportOptions.ExportStyle.SYSTEM)) {
+restOperationType = RestOperationTypeEnum.EXTENDED_OPERATION_SERVER;
+} else if (exportStyle.equals(BulkDataExportOptions.ExportStyle.PATIENT)) {
+if (theBulkExportOptions.getPatientIds().size() == 1) {
+restOperationType = RestOperationTypeEnum.EXTENDED_OPERATION_INSTANCE;
+} else {
+restOperationType = RestOperationTypeEnum.EXTENDED_OPERATION_TYPE;
+}
+} else if (exportStyle.equals(BulkDataExportOptions.ExportStyle.GROUP)) {
+restOperationType = RestOperationTypeEnum.EXTENDED_OPERATION_INSTANCE;
+}
+return restOperationType;
+}
 private void checkPointcutAndFailIfDeny(RequestDetails theRequestDetails, Pointcut thePointcut, @Nonnull IBaseResource theInputResource) {
 applyRulesAndFailIfDeny(theRequestDetails.getRestOperationType(), theRequestDetails, theInputResource, theInputResource.getIdElement(), null, thePointcut);
 }

View File

@@ -53,6 +53,14 @@ public interface IAuthRuleBuilderRuleBulkExport {
 return patientExportOnGroup(theFocusResourceId.getValue());
 }
+IAuthRuleBuilderRuleBulkExportWithTarget patientExportOnPatient(@Nonnull String theFocusResourceId);
+default IAuthRuleBuilderRuleBulkExportWithTarget patientExportOnPatient(@Nonnull IIdType theFocusResourceId) {
+return patientExportOnPatient(theFocusResourceId.getValue());
+}
 /**
 * Allow/deny <b>patient-level</b> export rule applies to the Group with the given resource ID, e.g. <code>Group/123</code>
 *

View File

@@ -836,6 +836,16 @@ public class RuleBuilder implements IAuthRuleBuilder {
 return new RuleBuilderBulkExportWithTarget(rule);
 }
+@Override
+public IAuthRuleBuilderRuleBulkExportWithTarget patientExportOnPatient(@Nonnull String theFocusResourceId) {
+RuleBulkExportImpl rule = new RuleBulkExportImpl(myRuleName);
+rule.setAppliesToPatientExport(theFocusResourceId);
+rule.setMode(myRuleMode);
+myRules.add(rule);
+return new RuleBuilderBulkExportWithTarget(rule);
+}
 @Override
 public IAuthRuleBuilderRuleBulkExportWithTarget patientExportOnGroup(@Nonnull String theFocusResourceId) {
 RuleBulkExportImpl rule = new RuleBulkExportImpl(myRuleName);

View File

@@ -30,6 +30,7 @@ import org.hl7.fhir.instance.model.api.IIdType;
 import java.util.Collection;
 import java.util.Objects;
 import java.util.Set;
+import java.util.stream.Collectors;
 import static org.apache.commons.collections4.CollectionUtils.isEmpty;
 import static org.apache.commons.collections4.CollectionUtils.isNotEmpty;
@@ -37,6 +38,7 @@ import static org.apache.commons.lang3.StringUtils.isNotBlank;
 public class RuleBulkExportImpl extends BaseRule {
 private String myGroupId;
+private String myPatientId;
 private BulkDataExportOptions.ExportStyle myWantExportStyle;
 private Collection<String> myResourceTypes;
 private boolean myWantAnyStyle;
@@ -83,6 +85,19 @@ public class RuleBulkExportImpl extends BaseRule {
 return newVerdict(theOperation, theRequestDetails, theInputResource, theInputResourceId, theOutputResource, theRuleApplier);
 }
 }
+// TODO This is a _bad bad bad implementation_ but we are out of time.
+// 1. If a claimed resource ID is present in the parameters, and the permission contains one, check for membership
+// 2. If not a member, Deny.
+if (myWantExportStyle == BulkDataExportOptions.ExportStyle.PATIENT && isNotBlank(myPatientId) && options.getPatientIds() != null) {
+String expectedPatientId = new IdDt(myPatientId).toUnqualifiedVersionless().getValue();
+String actualPatientIds = options.getPatientIds().stream().map(t -> t.toUnqualifiedVersionless().getValue()).collect(Collectors.joining(","));
+if (actualPatientIds.contains(expectedPatientId)) {
+return newVerdict(theOperation, theRequestDetails, theInputResource, theInputResourceId, theOutputResource, theRuleApplier);
+} else {
+return new AuthorizationInterceptor.Verdict(PolicyEnum.DENY, this);
+}
+}
 return null;
 }
@@ -96,6 +111,11 @@ public class RuleBulkExportImpl extends BaseRule {
 myGroupId = theGroupId;
 }
+public void setAppliesToPatientExport(String thePatientId) {
+myWantExportStyle = BulkDataExportOptions.ExportStyle.PATIENT;
+myPatientId = thePatientId;
+}
 public void setAppliesToSystem() {
 myWantExportStyle = BulkDataExportOptions.ExportStyle.SYSTEM;
 }
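The added membership check (which the diff's own TODO flags as a stopgap) joins the requested patient IDs into one comma-separated string and calls `contains()`. A standalone sketch (not HAPI code) shows the substring pitfall of that approach, and how collecting to a `Set` gives exact membership instead:

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Substring matching on a joined ID string can produce false positives:
// "Patient/12" is a substring of "Patient/123". A Set checks whole IDs.
public class PatientIdMembership {
    public static void main(String[] args) {
        List<String> requested = List.of("Patient/123", "Patient/456");

        String joined = String.join(",", requested);
        System.out.println(joined.contains("Patient/12"));  // true -- false positive

        Set<String> asSet = new HashSet<>(requested);
        System.out.println(asSet.contains("Patient/12"));   // false -- exact match only
        System.out.println(asSet.contains("Patient/123"));  // true
    }
}
```

For an authorization rule a false positive means an ALLOW verdict for a patient the rule never named, so the set-based form is the safer direction for the eventual cleanup the TODO promises.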

View File

@@ -13,6 +13,9 @@ import org.mockito.junit.jupiter.MockitoExtension;
 import java.util.HashSet;
 import java.util.Set;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.nullValue;
+import static org.hamcrest.Matchers.is;
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertNull;
 import static org.mockito.ArgumentMatchers.any;
@@ -104,4 +107,56 @@ public class RuleBulkExportImplTest {
 assertEquals(PolicyEnum.ALLOW, verdict.getDecision());
 }
+@Test
+public void testPatientExportRulesInBounds() {
+//Given
+RuleBulkExportImpl myRule = new RuleBulkExportImpl("b");
+myRule.setAppliesToPatientExport("Patient/123");
+myRule.setMode(PolicyEnum.ALLOW);
+BulkDataExportOptions options = new BulkDataExportOptions();
+options.setExportStyle(BulkDataExportOptions.ExportStyle.PATIENT);
+options.setPatientIds(Set.of(new IdDt("Patient/123")));
+when(myRequestDetails.getAttribute(any())).thenReturn(options);
+//When
+AuthorizationInterceptor.Verdict verdict = myRule.applyRule(myOperation, myRequestDetails, null, null, null, myRuleApplier, myFlags, myPointcut);
+//Then: We permit the request, as a patient ID that was requested is honoured by this rule.
+assertEquals(PolicyEnum.ALLOW, verdict.getDecision());
+}
+@Test
+public void testPatientExportRulesOutOfBounds() {
+//Given
+RuleBulkExportImpl myRule = new RuleBulkExportImpl("b");
+myRule.setAppliesToPatientExport("Patient/123");
+myRule.setMode(PolicyEnum.ALLOW);
+BulkDataExportOptions options = new BulkDataExportOptions();
+options.setExportStyle(BulkDataExportOptions.ExportStyle.PATIENT);
+options.setPatientIds(Set.of(new IdDt("Patient/456")));
+when(myRequestDetails.getAttribute(any())).thenReturn(options);
+//When
+AuthorizationInterceptor.Verdict verdict = myRule.applyRule(myOperation, myRequestDetails, null, null, null, myRuleApplier, myFlags, myPointcut);
+//Then: we should deny the request, as the requested export does not contain the patient permitted.
+assertEquals(PolicyEnum.DENY, verdict.getDecision());
+}
+@Test
+public void testPatientExportRulesOnTypeLevelExport() {
+//Given
+RuleBulkExportImpl myRule = new RuleBulkExportImpl("b");
+myRule.setAppliesToPatientExport("Patient/123");
+myRule.setMode(PolicyEnum.ALLOW);
+BulkDataExportOptions options = new BulkDataExportOptions();
+options.setExportStyle(BulkDataExportOptions.ExportStyle.PATIENT);
+when(myRequestDetails.getAttribute(any())).thenReturn(options);
+//When
+AuthorizationInterceptor.Verdict verdict = myRule.applyRule(myOperation, myRequestDetails, null, null, null, myRuleApplier, myFlags, myPointcut);
+//Then: We make no claims about type-level export on Patient.
+assertEquals(null, verdict);
+}
 }

View File

@@ -160,11 +160,9 @@ public class DropIndexTask extends BaseTableTask {
 @Language("SQL") String dropConstraintSql = "ALTER TABLE " + getTableName() + " DROP CONSTRAINT ?";
 findAndDropConstraint(findConstraintSql, dropConstraintSql);
 } else if (getDriverType() == DriverTypeEnum.ORACLE_12C) {
-@Language("SQL") String findConstraintSql = "SELECT DISTINCT constraint_name FROM user_cons_columns WHERE constraint_name = ? AND table_name = ?";
+@Language("SQL") String findConstraintSql = "SELECT constraint_name FROM user_constraints WHERE constraint_name = ? AND table_name = ?";
 @Language("SQL") String dropConstraintSql = "ALTER TABLE " + getTableName() + " DROP CONSTRAINT ?";
 findAndDropConstraint(findConstraintSql, dropConstraintSql);
-findConstraintSql = "SELECT DISTINCT constraint_name FROM all_constraints WHERE index_name = ? AND table_name = ?";
-findAndDropConstraint(findConstraintSql, dropConstraintSql);
 } else if (getDriverType() == DriverTypeEnum.MSSQL_2012) {
 // Legacy deletion for SQL Server unique indexes
 @Language("SQL") String findConstraintSql = "SELECT tc.CONSTRAINT_NAME FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS AS tc WHERE tc.CONSTRAINT_NAME = ? AND tc.TABLE_NAME = ?";

View File

@@ -164,10 +164,10 @@ public class ModifyColumnTask extends BaseTableColumnTypeTask {
 case ORACLE_12C:
 @Language("SQL") String findNullableConstraintSql =
 "SELECT acc.owner, acc.table_name, acc.column_name, search_condition_vc " +
-"FROM all_cons_columns acc, all_constraints ac " +
-"WHERE acc.constraint_name = ac.constraint_name " +
-"AND acc.table_name = ac.table_name " +
-"AND ac.constraint_type = ? " +
+"FROM all_cons_columns acc, user_constraints uc " +
+"WHERE acc.constraint_name = uc.constraint_name " +
+"AND acc.table_name = uc.table_name " +
+"AND uc.constraint_type = ? " +
 "AND acc.table_name = ? " +
 "AND acc.column_name = ? " +
 "AND search_condition_vc = ? ";
@@ -176,9 +176,8 @@
 params[1] = tableName.toUpperCase();
 params[2] = columnName.toUpperCase();
 params[3] = "\"" + columnName.toUpperCase() + "\" IS NOT NULL";
-List<Map<String, Object>> queryResults = getConnectionProperties().getTxTemplate().execute(t -> {
-return getConnectionProperties().newJdbcTemplate().query(findNullableConstraintSql, params, new ColumnMapRowMapper());
-});
+List<Map<String, Object>> queryResults = getConnectionProperties().getTxTemplate().execute(t ->
+getConnectionProperties().newJdbcTemplate().query(findNullableConstraintSql, params, new ColumnMapRowMapper()));
 // If this query returns a row then the existence of that row indicates that a NOT NULL constraint exists
 // on this Column and we must override whatever result was previously calculated and set it to false
 if (queryResults != null && queryResults.size() > 0 && queryResults.get(0) != null && !queryResults.get(0).isEmpty()) {

View File

@@ -19,8 +19,10 @@
 */
 package ca.uhn.fhir.batch2.jobs.config;
+import ca.uhn.fhir.batch2.api.IJobCoordinator;
 import ca.uhn.fhir.batch2.jobs.parameters.UrlPartitioner;
 import ca.uhn.fhir.batch2.jobs.services.Batch2JobRunnerImpl;
+import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.jpa.api.svc.IBatch2JobRunner;
 import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc;
 import ca.uhn.fhir.jpa.searchparam.MatchUrlService;
@@ -33,7 +35,7 @@ public class BatchCommonCtx {
 }
 @Bean
-public IBatch2JobRunner batch2JobRunner() {
-return new Batch2JobRunnerImpl();
+public IBatch2JobRunner batch2JobRunner(IJobCoordinator theJobCoordinator, FhirContext theFhirContext) {
+return new Batch2JobRunnerImpl(theJobCoordinator, theFhirContext);
 }
 }

View File

@@ -71,10 +71,20 @@ public class FetchResourceIdsStep implements IFirstJobStepWorker<BulkExportJobPa
 providerParams.setExpandMdm(params.isExpandMdm());
 providerParams.setPartitionId(params.getPartitionId());
+/*
+ * We set all the requested resource types here so that
+ * when we recursively fetch resource types for a given patient/group
+ * we don't recurse into types that were not requested.
+ */
+providerParams.setRequestedResourceTypes(params.getResourceTypes());
 int submissionCount = 0;
 try {
 Set<BatchResourceId> submittedBatchResourceIds = new HashSet<>();
+/*
+ * We will fetch ids for each resource type in the ResourceTypes (_type filter).
+ */
 for (String resourceType : params.getResourceTypes()) {
 providerParams.setResourceType(resourceType);

View File

@@ -103,6 +103,9 @@ public class BulkExportJobParameters extends BulkExportJobBase {
 }
 public List<String> getResourceTypes() {
+if (myResourceTypes == null) {
+myResourceTypes = new ArrayList<>();
+}
 return myResourceTypes;
 }
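The hunk above lazily initializes the list inside the getter so callers never see null. A minimal standalone sketch of that pattern (not the HAPI class itself):

```java
import java.util.ArrayList;
import java.util.List;

// Null-safe lazy getter: the field starts null but callers can iterate or
// add elements immediately, with no null checks at every call site.
public class LazyListHolder {
    private List<String> myResourceTypes;

    public List<String> getResourceTypes() {
        if (myResourceTypes == null) {
            myResourceTypes = new ArrayList<>();
        }
        return myResourceTypes;
    }

    public static void main(String[] args) {
        LazyListHolder holder = new LazyListHolder();
        System.out.println(holder.getResourceTypes().size()); // 0, never a NullPointerException
        holder.getResourceTypes().add("Patient");
        System.out.println(holder.getResourceTypes().size()); // 1
    }
}
```

The trade-off is that the getter now mutates state, so it is not safe for unsynchronized concurrent first calls; for a job-parameters object populated on one thread, as here, that is typically acceptable.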

View File

@@ -25,6 +25,7 @@ import ca.uhn.fhir.batch2.jobs.export.BulkExportUtil;
 import ca.uhn.fhir.batch2.jobs.export.models.BulkExportJobParameters;
 import ca.uhn.fhir.batch2.model.JobInstance;
 import ca.uhn.fhir.batch2.model.JobInstanceStartRequest;
+import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.i18n.Msg;
 import ca.uhn.fhir.jpa.api.model.Batch2JobInfo;
 import ca.uhn.fhir.jpa.api.model.Batch2JobOperationResult;
@@ -35,7 +36,6 @@ import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse;
 import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
 import ca.uhn.fhir.util.Batch2JobDefinitionConstants;
 import org.slf4j.Logger;
-import org.springframework.beans.factory.annotation.Autowired;
 import javax.annotation.Nonnull;
@@ -44,8 +44,14 @@ import static org.slf4j.LoggerFactory.getLogger;
 public class Batch2JobRunnerImpl implements IBatch2JobRunner {
 private static final Logger ourLog = getLogger(IBatch2JobRunner.class);
-@Autowired
-private IJobCoordinator myJobCoordinator;
+private final IJobCoordinator myJobCoordinator;
+private final FhirContext myFhirContext;
+public Batch2JobRunnerImpl(IJobCoordinator theJobCoordinator, FhirContext theFhirContext) {
+myFhirContext = theFhirContext;
+myJobCoordinator = theJobCoordinator;
+}
 @Override
 public Batch2JobStartResponse startNewJob(Batch2BaseJobParameters theParameters) {
@@ -104,6 +110,7 @@ public class Batch2JobRunnerImpl implements IBatch2JobRunner {
 info.setEndTime(theInstance.getEndTime());
 info.setReport(theInstance.getReport());
 info.setErrorMsg(theInstance.getErrorMessage());
+info.setCombinedRecordsProcessed(theInstance.getCombinedRecordsProcessed());
 if (Batch2JobDefinitionConstants.BULK_EXPORT.equals(theInstance.getJobDefinitionId())) {
 BulkExportJobParameters parameters = theInstance.getParameters(BulkExportJobParameters.class);
 info.setRequestPartitionId(parameters.getPartitionId());
@@ -114,7 +121,8 @@ public class Batch2JobRunnerImpl implements IBatch2JobRunner {
 private Batch2JobStartResponse startBatch2BulkExportJob(BulkExportParameters theParameters) {
 JobInstanceStartRequest request = createStartRequest(theParameters);
-request.setParameters(BulkExportJobParameters.createFromExportJobParameters(theParameters));
+BulkExportJobParameters parameters = BulkExportJobParameters.createFromExportJobParameters(theParameters);
+request.setParameters(parameters);
 return myJobCoordinator.startInstance(request);
 }

View File

@@ -52,7 +52,7 @@
 </dependency>
 <dependency>
 <groupId>org.glassfish</groupId>
-<artifactId>javax.el</artifactId>
+<artifactId>jakarta.el</artifactId>
 </dependency>
 <!-- test -->
<!-- test --> <!-- test -->

View File

@@ -121,7 +121,6 @@ public class JobCoordinatorImpl implements IJobCoordinator {
 myJobParameterJsonValidator.validateJobParameters(theStartRequest, jobDefinition);
 IJobPersistence.CreateResult instanceAndFirstChunk =
 myTransactionService.withSystemRequest().execute(() ->
 myJobPersistence.onCreateWithFirstChunk(jobDefinition, theStartRequest.getParameters()));

View File

@@ -86,7 +86,7 @@ public class ReductionStepDataSink<PT extends IModelJson, IT extends IModelJson,
 * here. Until then though, this is safer.
 */
-progress.updateInstance(instance);
+progress.updateInstanceForReductionStep(instance);
 instance.setReport(dataString);
 instance.setStatus(StatusEnum.COMPLETED);

View File

@@ -29,7 +29,6 @@ import ca.uhn.fhir.batch2.model.JobWorkCursor;
 import ca.uhn.fhir.batch2.model.JobWorkNotification;
 import ca.uhn.fhir.batch2.model.StatusEnum;
 import ca.uhn.fhir.batch2.model.WorkChunkStatusEnum;
-import ca.uhn.fhir.batch2.progress.InstanceProgress;
 import ca.uhn.fhir.batch2.progress.JobInstanceProgressCalculator;
 import ca.uhn.fhir.batch2.progress.JobInstanceStatusUpdater;
 import ca.uhn.fhir.model.api.IModelJson;
@@ -139,14 +138,6 @@ public class JobInstanceProcessor {
 if (theInstance.isFinished() && !theInstance.isWorkChunksPurged()) {
 myJobPersistence.deleteChunksAndMarkInstanceAsChunksPurged(theInstance.getInstanceId());
-// update final statistics.
-// wipmb For 6.8 - do we need to run stats again? If the status changed to finished, then we just ran them above.
-InstanceProgress progress = myJobInstanceProgressCalculator.calculateInstanceProgress(theInstance.getInstanceId());
-myJobPersistence.updateInstance(theInstance.getInstanceId(), instance->{
-progress.updateInstance(instance);
-return true;
-});
 }
 }

View File

@@ -77,7 +77,7 @@ import java.util.concurrent.TimeUnit;
 * </p>
 */
 public class JobMaintenanceServiceImpl implements IJobMaintenanceService, IHasScheduledJobs {
-private static final Logger ourLog = Logs.getBatchTroubleshootingLog();
+static final Logger ourLog = Logs.getBatchTroubleshootingLog();
 public static final int INSTANCES_PER_PASS = 100;
 public static final String SCHEDULED_JOB_ID = JobMaintenanceScheduledJob.class.getName();
@@ -218,6 +218,7 @@
 for (JobInstance instance : instances) {
 String instanceId = instance.getInstanceId();
+if (myJobDefinitionRegistry.getJobDefinition(instance.getJobDefinitionId(), instance.getJobDefinitionVersion()).isPresent()) {
 if (processedInstanceIds.add(instanceId)) {
 myJobDefinitionRegistry.setJobDefinition(instance);
 JobInstanceProcessor jobInstanceProcessor = new JobInstanceProcessor(myJobPersistence,
@@ -226,6 +227,10 @@
 jobInstanceProcessor.process();
 }
 }
+else {
+ourLog.warn("Job definition {} for instance {} is currently unavailable", instance.getJobDefinitionId(), instanceId);
+}
+}
 if (instances.size() < INSTANCES_PER_PASS) {
 break;

View File

@@ -211,7 +211,7 @@ public enum StatusEnum {
 if (!canTransition) {
 // we have a bug?
-ourLog.warn("Tried to execute an illegal state transition. [origStatus={}, newStatus={}]", theOrigStatus, theNewStatus);
+ourLog.debug("Tried to execute an illegal state transition. [origStatus={}, newStatus={}]", theOrigStatus, theNewStatus);
 }
 return canTransition;
 }
} }

View File

@@ -105,13 +105,30 @@ public class InstanceProgress {
 }
 }
+/**
+ * Signal to the progress calculator to skip the incomplete work chunk count when determining the completed percentage.
+ * <p/>
+ * This is a hack: the reason we do this is to get around a race condition in which all work chunks are complete but
+ * the last chunk is still in QUEUED status and will only be marked COMPLETE later.
+ *
+ * @param theInstance The Batch 2 {@link JobInstance} that we're updating
+ */
+public void updateInstanceForReductionStep(JobInstance theInstance) {
+updateInstance(theInstance, true);
+}
+public void updateInstance(JobInstance theInstance) {
+updateInstance(theInstance, false);
+}
 /**
 * Update the job instance with status information.
 * We shouldn't read any values from theInstance here -- just write.
 *
 * @param theInstance the instance to update with progress statistics
 */
-public void updateInstance(JobInstance theInstance) {
+public void updateInstance(JobInstance theInstance, boolean theCalledFromReducer) {
+ourLog.debug("updateInstance {}: {}", theInstance.getInstanceId(), this);
 if (myEarliestStartTime != null) {
 theInstance.setStartTime(myEarliestStartTime);
 }
@@ -122,7 +139,9 @@ public class InstanceProgress {
 theInstance.setCombinedRecordsProcessed(myRecordsProcessed);
 if (getChunkCount() > 0) {
-double percentComplete = (double) (myCompleteChunkCount) / (double) getChunkCount();
+final int chunkCount = getChunkCount();
+final int conditionalChunkCount = theCalledFromReducer ? (chunkCount - myIncompleteChunkCount) : chunkCount;
+final double percentComplete = (double) (myCompleteChunkCount) / (double) conditionalChunkCount;
 theInstance.setProgress(percentComplete);
 }
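The arithmetic behind the reduction-step adjustment above can be sketched numerically. This is a standalone reproduction of the formula only (hypothetical method name, not the HAPI class): with the reduction chunk itself still QUEUED, the unadjusted formula can never report 100%, while subtracting the incomplete chunk from the denominator does.

```java
// Numeric sketch: 10 chunks total, 9 complete, 1 incomplete (the reduction
// chunk itself, still QUEUED). The plain formula is stuck at 90%; the
// reducer-adjusted denominator reports 100%.
public class ProgressSketch {
    static double percentComplete(int completeChunks, int totalChunks,
                                  int incompleteChunks, boolean calledFromReducer) {
        // Mirrors the diff: skip incomplete chunks only when called from the reducer
        int denominator = calledFromReducer ? (totalChunks - incompleteChunks) : totalChunks;
        return (double) completeChunks / denominator;
    }

    public static void main(String[] args) {
        System.out.println(percentComplete(9, 10, 1, false)); // 0.9
        System.out.println(percentComplete(9, 10, 1, true));  // 1.0
    }
}
```

Note the adjusted form assumes every incomplete chunk is the about-to-finish reduction chunk; if other chunks were genuinely incomplete it would overstate progress, which is consistent with the javadoc calling it a hack around a race condition.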

View File

@@ -19,11 +19,16 @@ import ca.uhn.fhir.batch2.model.WorkChunkStatusEnum;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
 import ca.uhn.fhir.jpa.model.sched.ISchedulerService;
 import ca.uhn.fhir.jpa.subscription.channel.api.IChannelProducer;
+import ca.uhn.test.util.LogbackCaptureTestExtension;
+import ch.qos.logback.classic.Level;
+import ch.qos.logback.classic.Logger;
+import ch.qos.logback.classic.spi.ILoggingEvent;
 import com.google.common.collect.Lists;
 import org.hl7.fhir.r4.model.DateTimeType;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.extension.ExtendWith;
+import org.junit.jupiter.api.extension.RegisterExtension;
 import org.mockito.ArgumentCaptor;
 import org.mockito.Captor;
 import org.mockito.Mock;
@@ -53,6 +58,7 @@ import static org.mockito.ArgumentMatchers.any;
 import static org.mockito.ArgumentMatchers.anyBoolean;
 import static org.mockito.ArgumentMatchers.anyInt;
 import static org.mockito.ArgumentMatchers.eq;
+import static org.mockito.Mockito.never;
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.verifyNoMoreInteractions;
@@ -61,6 +67,8 @@ import static org.mockito.Mockito.when;
 @ExtendWith(MockitoExtension.class)
 public class JobMaintenanceServiceImplTest extends BaseBatch2Test {

+	@RegisterExtension
+	LogbackCaptureTestExtension myLogCapture = new LogbackCaptureTestExtension((Logger) JobMaintenanceServiceImpl.ourLog, Level.WARN);
 	@Mock
 	IJobCompletionHandler<TestJobParameters> myCompletionHandler;
 	@Mock
@@ -115,6 +123,26 @@ public class JobMaintenanceServiceImplTest extends BaseBatch2Test {
 		verify(myJobPersistence, times(1)).updateInstance(any(), any());
 	}

+	@Test
+	public void testInProgress_CalculateProgress_JobDefinitionMissing() {
+		List<WorkChunk> chunks = List.of(
+			JobCoordinatorImplTest.createWorkChunk(STEP_1, null).setStatus(WorkChunkStatusEnum.COMPLETED),
+			JobCoordinatorImplTest.createWorkChunk(STEP_2, null).setStatus(WorkChunkStatusEnum.QUEUED)
+		);
+		JobInstance instance = createInstance();
+		when(myJobPersistence.fetchInstances(anyInt(), eq(0))).thenReturn(List.of(instance));
+
+		mySvc.runMaintenancePass();
+
+		String expectedLogMessage = String.format("Job definition %s for instance %s is currently unavailable", JOB_DEFINITION_ID, instance.getInstanceId());
+		List<ILoggingEvent> matchingLogEvents = myLogCapture.filterLoggingEventsWithMessageEqualTo(expectedLogMessage);
+		assertEquals(1, matchingLogEvents.size());
+		verify(myJobPersistence, never()).updateInstance(any(), any());
+	}
+
 	@Test
 	public void testInProgress_CalculateProgress_FirstStepComplete() {
 		List<WorkChunk> chunks = Arrays.asList(
@@ -267,7 +295,7 @@ public class JobMaintenanceServiceImplTest extends BaseBatch2Test {

 		// Verify
-		verify(myJobPersistence, times(2)).updateInstance(eq(INSTANCE_ID), any());
+		verify(myJobPersistence, times(1)).updateInstance(eq(INSTANCE_ID), any());
 		assertEquals(1.0, instance.getProgress());
 		assertEquals(StatusEnum.COMPLETED, instance.getStatus());
@@ -314,7 +342,7 @@ public class JobMaintenanceServiceImplTest extends BaseBatch2Test {
 		assertEquals(parseTime("2022-02-12T14:10:00-04:00"), instance.getEndTime());
-		// twice - once to move to FAILED, and once to purge the chunks
-		verify(myJobPersistence, times(2)).updateInstance(eq(INSTANCE_ID), any());
+		// once - to move to FAILED; the chunk purge is a separate persistence call
+		verify(myJobPersistence, times(1)).updateInstance(eq(INSTANCE_ID), any());
 		verify(myJobPersistence, times(1)).deleteChunksAndMarkInstanceAsChunksPurged(eq(INSTANCE_ID));
 		verifyNoMoreInteractions(myJobPersistence);
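The new test asserts on a captured warning via HAPI's LogbackCaptureTestExtension. The same capture-and-assert pattern can be sketched with only the JDK's java.util.logging; this is an illustrative stand-in, not the extension's implementation:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

// Minimal sketch of the log-capture testing pattern, using java.util.logging
// instead of Logback. All names here are hypothetical.
public class LogCaptureSketch {

	/** Collects every record published to the logger it is attached to. */
	static class CapturingHandler extends Handler {
		final List<LogRecord> records = new ArrayList<>();
		@Override public void publish(LogRecord theRecord) { records.add(theRecord); }
		@Override public void flush() {}
		@Override public void close() {}
	}

	/** Runs theWork while capturing WARNING-or-higher messages on theLogger. */
	static List<String> capturedWarnings(Logger theLogger, Runnable theWork) {
		CapturingHandler handler = new CapturingHandler();
		theLogger.addHandler(handler);
		try {
			theWork.run();
		} finally {
			theLogger.removeHandler(handler); // always detach, mirroring an extension's teardown
		}
		List<String> messages = new ArrayList<>();
		for (LogRecord logRecord : handler.records) {
			if (logRecord.getLevel().intValue() >= Level.WARNING.intValue()) {
				messages.add(logRecord.getMessage());
			}
		}
		return messages;
	}

	public static void main(String[] args) {
		Logger logger = Logger.getLogger("maintenance-sketch");
		List<String> warnings = capturedWarnings(logger,
			() -> logger.warning("Job definition JOB for instance 123 is currently unavailable"));
		System.out.println(warnings.size()); // 1
		System.out.println(warnings.get(0));
	}
}
```

The design point is the same as in the test above: attach a collecting appender/handler before the code under test runs, detach it afterward, then assert on the collected events rather than parsing console output.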


@ -138,15 +138,13 @@ public class QuestionnaireOperationsProvider {
} }
/** /**
* Implements the <a href= * Implements a $package operation following the <a href=
* "https://build.fhir.org/ig/HL7/davinci-dtr/OperationDefinition-questionnaire-package.html">$questionnaire-package</a> * "https://build.fhir.org/ig/HL7/crmi-ig/branches/master/packaging.html">CRMI IG</a>.
* operation found in the
* <a href="https://build.fhir.org/ig/HL7/davinci-dtr/index.html">Da Vinci Documents Templates and Rules (DTR) IG</a>.
* *
* @param theId The id of the Questionnaire. * @param theId The id of the Questionnaire.
* @param theCanonical The canonical identifier for the questionnaire (optionally version-specific). * @param theCanonical The canonical identifier for the questionnaire (optionally version-specific).
* @param theRequestDetails The details (such as tenant) of this request. Usually * @param theRequestDetails The details (such as tenant) of this request. Usually
* autopopulated HAPI. * autopopulated by HAPI.
* @return A Bundle containing the Questionnaire and all related Library, CodeSystem and ValueSet resources * @return A Bundle containing the Questionnaire and all related Library, CodeSystem and ValueSet resources
*/ */
@Operation(name = ProviderConstants.CR_OPERATION_PACKAGE, idempotent = true, type = Questionnaire.class) @Operation(name = ProviderConstants.CR_OPERATION_PACKAGE, idempotent = true, type = Questionnaire.class)


@ -184,15 +184,14 @@ public class QuestionnaireOperationsProvider {
} }
/** /**
* Implements the <a href= * Implements a $package operation following the <a href=
* "https://build.fhir.org/ig/HL7/davinci-dtr/OperationDefinition-questionnaire-package.html">$questionnaire-package</a> * "https://build.fhir.org/ig/HL7/crmi-ig/branches/master/packaging.html">CRMI IG</a>.
* operation found in the
* <a href="https://build.fhir.org/ig/HL7/davinci-dtr/index.html">Da Vinci Documents Templates and Rules (DTR) IG</a>.
* *
* @param theId The id of the Questionnaire. * @param theId The id of the Questionnaire.
* @param theCanonical The canonical identifier for the questionnaire (optionally version-specific). * @param theCanonical The canonical identifier for the questionnaire (optionally version-specific).
* @Param theIsPut A boolean value to determine if the Bundle returned uses PUT or POST request methods. Defaults to false.
* @param theRequestDetails The details (such as tenant) of this request. Usually * @param theRequestDetails The details (such as tenant) of this request. Usually
* autopopulated HAPI. * autopopulated by HAPI.
* @return A Bundle containing the Questionnaire and all related Library, CodeSystem and ValueSet resources * @return A Bundle containing the Questionnaire and all related Library, CodeSystem and ValueSet resources
*/ */
@Operation(name = ProviderConstants.CR_OPERATION_PACKAGE, idempotent = true, type = Questionnaire.class) @Operation(name = ProviderConstants.CR_OPERATION_PACKAGE, idempotent = true, type = Questionnaire.class)


@@ -30,14 +30,14 @@ import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
 import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
 import ca.uhn.fhir.jpa.api.model.DeleteConflictList;
-import ca.uhn.fhir.jpa.api.model.DeleteMethodOutcome;
 import ca.uhn.fhir.jpa.api.svc.IIdHelperService;
 import ca.uhn.fhir.jpa.dao.tx.HapiTransactionService;
 import ca.uhn.fhir.jpa.delete.DeleteConflictUtil;
 import ca.uhn.fhir.jpa.model.dao.JpaPid;
-import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
 import ca.uhn.fhir.mdm.dao.IMdmLinkDao;
+import ca.uhn.fhir.mdm.interceptor.MdmStorageInterceptor;
 import ca.uhn.fhir.rest.api.server.RequestDetails;
+import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
 import ca.uhn.fhir.rest.api.server.storage.TransactionDetails;
 import ca.uhn.fhir.rest.server.provider.ProviderConstants;
 import ca.uhn.fhir.util.StopWatch;
@@ -108,28 +108,42 @@ public class MdmClearStep implements IJobStepWorker<MdmClearJobParameters, Resou
 			return null;
 		}

-		ourLog.info("Starting mdm clear work chunk with {} resources - Instance[{}] Chunk[{}]", persistentIds.size(), myInstanceId, myChunkId);
+		// avoid double deletion of mdm links
+		MdmStorageInterceptor.setLinksDeletedBeforehand();
+
+		try {
+			performWork(persistentIds);
+		} finally {
+			MdmStorageInterceptor.resetLinksDeletedBeforehand();
+		}
+
+		return null;
+	}
+
+	private void performWork(List<JpaPid> thePersistentIds) {
+		ourLog.info("Starting mdm clear work chunk with {} resources - Instance[{}] Chunk[{}]", thePersistentIds.size(), myInstanceId, myChunkId);
 		StopWatch sw = new StopWatch();

-		myMdmLinkSvc.deleteLinksWithAnyReferenceToPids(persistentIds);
+		myMdmLinkSvc.deleteLinksWithAnyReferenceToPids(thePersistentIds);
+		ourLog.trace("Deleted {} mdm links in {}", thePersistentIds.size(), StopWatch.formatMillis(sw.getMillis()));

 		// We know the list is not empty, and that all resource types are the same, so just use the first one
 		String resourceName = myData.getResourceType(0);
-		IFhirResourceDao dao = myDaoRegistry.getResourceDao(resourceName);
+		IFhirResourceDao<?> dao = myDaoRegistry.getResourceDao(resourceName);

 		DeleteConflictList conflicts = new DeleteConflictList();
-		dao.deletePidList(ProviderConstants.OPERATION_MDM_CLEAR, persistentIds, conflicts, myRequestDetails);
+		dao.deletePidList(ProviderConstants.OPERATION_MDM_CLEAR, thePersistentIds, conflicts, myRequestDetails);
 		DeleteConflictUtil.validateDeleteConflictsEmptyOrThrowException(myFhirContext, conflicts);
+		ourLog.trace("Deleted {} golden resources in {}", thePersistentIds.size(), StopWatch.formatMillis(sw.getMillis()));

-		dao.expunge(persistentIds, myRequestDetails);
+		dao.expunge(thePersistentIds, myRequestDetails);

-		ourLog.info("Finished removing {} golden resources in {} - {}/sec - Instance[{}] Chunk[{}]", persistentIds.size(), sw, sw.formatThroughput(persistentIds.size(), TimeUnit.SECONDS), myInstanceId, myChunkId);
+		ourLog.info("Finished removing {} golden resources in {} - {}/sec - Instance[{}] Chunk[{}]", thePersistentIds.size(), sw, sw.formatThroughput(thePersistentIds.size(), TimeUnit.SECONDS), myInstanceId, myChunkId);

 		if (ourClearCompletionCallbackForUnitTest != null) {
 			ourClearCompletionCallbackForUnitTest.run();
 		}
-
-		return null;
 	}
 }
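The setLinksDeletedBeforehand/resetLinksDeletedBeforehand pair in the diff above suggests a thread-local flag that the MDM storage interceptor consults so it skips re-deleting links the clear step already removed. A minimal sketch of that guard pattern follows; only the set/reset names mirror the diff, everything else is hypothetical:

```java
// Sketch of a thread-local "links already deleted" guard, wrapped in
// try/finally so the flag never leaks past the work it protects.
public class LinksDeletedGuardSketch {

	// false by default; true only between set and reset on the current thread
	private static final ThreadLocal<Boolean> ourLinksDeletedBeforehand =
		ThreadLocal.withInitial(() -> Boolean.FALSE);

	public static boolean isLinksDeletedBeforehand() {
		return ourLinksDeletedBeforehand.get();
	}

	public static void setLinksDeletedBeforehand() {
		ourLinksDeletedBeforehand.set(Boolean.TRUE);
	}

	public static void resetLinksDeletedBeforehand() {
		ourLinksDeletedBeforehand.remove();
	}

	/** Hypothetical interceptor-side check: skip cleanup the step already did. */
	static String onResourceDeleted() {
		return isLinksDeletedBeforehand() ? "skip link cleanup" : "delete links";
	}

	public static void main(String[] args) {
		System.out.println(onResourceDeleted()); // delete links
		setLinksDeletedBeforehand();
		try {
			System.out.println(onResourceDeleted()); // skip link cleanup
		} finally {
			resetLinksDeletedBeforehand();
		}
		System.out.println(onResourceDeleted()); // delete links
	}
}
```

Using a ThreadLocal rather than a plain static flag keeps the guard scoped to the worker thread running the chunk, so concurrent chunks on other threads still get normal link cleanup.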

(Some files were not shown because too many files have changed in this diff.)