GM1 Mergeback (#5203)

* version bump

* Bump to core release 6.0.22 (#5028)

* Bump to core release 6.0.16

* Bump to core version 6.0.20

* Fix errors thrown as a result of VersionSpecificWorkerContextWrapper

* Bump to core 6.0.22

* Resolve 5126: HFJ_RES_VER_PROV might cause migration error on DB that automatically indexes the primary key (#5127)

* dropped old index FK_RESVERPROV_RES_PID on RES_PID column before adding IDX_RESVERPROV_RES_PID

* added changelog

* changed to valid version number

* changed to valid version number, need to be ordered by version number...

* 5123 - Use DEFAULT partition for server-based requests if none specified (#5124)

5123 - Use DEFAULT partition for server-based requests if none specified

* consent remove all suppresses next link in bundle (#5119)

* added FIXME with source of issue

* added FIXME with root cause

* added FIXME with root cause

* Providing solution to the issue and removing fixmes.

* Providing changelog

* auto-formatting.

* Adding new test.

* Adding a new test for standard paging

* let's try this and see if it works...?

* fix tests

* cleanup to trigger a new run

* fixing tests

---------

Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* 5117 MDM Score for No Match Fields Should Not Be Included in Total Score (#5118)

* fix, test, changelog

* fix, test, changelog

---------

Co-authored-by: justindar <justin.dar@smilecdr.com>

* Rename file to force IT mode

* _source search parameter needs to support modifiers (#5095)

_source search parameter needs to support modifiers - added support for :contains, :missing, :above modifiers

* Fix HFQL docs (#5151)

* Expunge operation on codesystem may throw 500 internal error with precondition fail message. (#5156)

* Initial failing test.

* Solution with changelog.

* fixing format.

* Addressing comment from code review.

* fixing failing test.

---------

Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* documentation update (#5154)

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* Fix hsql jdbc driver deps (#5168)

Avoid non-included classes in jdbc driver dependencies.

* $delete-expunge over 10k resources will now delete all resources (#5144)

* First commit with very rough fix and unit test.

* Refinements to ResourceIdListStep and Batch2DaoSvcImpl.  Make LoadIdsStepTest pass.   Enhance Batch2DaoSvcImplTest.

* Spotless

* Fix checkstyle errors.

* Fix test failures.

* Minor refactoring.  New unit test.  Finalize changelist.

* Spotless fix.

* Delete now useless code from unit test.

* Delete more useless code.

* Test pre-commit hook

* More spotless fixes.

* Address most code review feedback.

* Remove use of pageSize parameter and see if this breaks the pipeline.

* Remove use of pageSize parameter and see if this breaks the pipeline.

* Fix the noUrl case by passing an unlimited Pageable instead.  Effectively stop using page size for most databases.

* Deprecate the old method and have it call the new one by default.

* updating documentation (#5170)

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* _source search parameter modifiers for Subscription matching (#5159)

* _source search parameter modifiers for Subscription matching - test, implementation and changelog

* Removal of meta tags during updates does not trigger subscription (#5181)

* Initial failing test.

* adding solution;
fixing documentation;

* spotless apply

* adding changelog

* modifying current test

---------

Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* Issue 5173: gateway $everything doesn't return all patients (#5174)

* Failing test

* Also set offset and count in base DAO override

* Changelog

* Fix for specific case where count has been set in parameters

* spotless

* Improve checks

---------

Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>

* Do not 500 and continue IG ingestion when different IGs try to save different ValueSets with colliding FHIR IDs (#5175)

* First commit with failing unit test and small tweaks.

* Swallow resource version exceptions from colliding ValueSet OIDs and log a descriptive error instead.  Add more detailed unit testing.

* Tweaks to logic and update the changelog.  Reverse all changes to TermReadSvcImpl.

* Revert PackageResourceParsingSvc to release branch baseline.

* Accept code reviewer suggestion to change changelog description.

Co-authored-by: michaelabuckley <michaelabuckley@gmail.com>

---------

Co-authored-by: michaelabuckley <michaelabuckley@gmail.com>

* Fix link

* Remove target slf4j version

* don't use new API for a bit (#5191)

* Return DropIdGeneratorTask from the Builder to permit clients to mutate the DropIdGeneratorTask (#5193)

* Dqm performance bug update and provider loader fix (#5180)

* update tests, move properties, update operation loader

* update wip

* remove test

* fixing tests, adding config

* update config and provider loader

* fix bundles

* fix cache settings on tests

* version bump and change log

* version bump

* fix formatting

* CVE-2022-45868

* wip cve change

* cve h2 add back in

---------

Co-authored-by: justin.mckelvy <justin.mckelvy@smilecdr.com>

* bulkExportReuse with POST and GET (#5161)

* string manipulation

* Code to ensure bulkExportReuse works with POST and GET requests

* Added formatting changes

* Fixed tests that were not working

* Formatting

* Code clean up

* fixing test failures

* fixing test failures

* Removed arrOfParams to now utilize ObjectMapper

* Removing stack trace and adding an exception

* Fixed test issue

* formatting

* formatting

* Resolving code review comments

* Reduce size of subscription max results (#5194)

* Reduce MAX_SUBSCRIPTION_RESULTS to 10000

* Add changelog

* 5037 golden resource remains when target resource deleted (#5038)

* draft test and fix

* remove unused fields

* remove unused fields

* remove unused fields

* draft test + solution for possible match case

* combine sql statement + more error checking

* add test case for possible duplicate

* add config for autodeleting grs

* refactoring, adding support for mongo, docs

* refactoring + fixing mongo queries

* add changelogs

* fix both way link removal

* clean up test comments

* rename method

* remove unnecessary bean

* merge master/resolve conflicts

* mvn spotless

* address comment

* changes to avoid version bumping

* spotless

* change error code

---------

Co-authored-by: justindar <justin.dar@smilecdr.com>

* don't use new API for a bit (#5190)

* licenses

* wip

* Fix API usage

* wip

* Version bump

---------

Co-authored-by: dotasek <david.otasek@smilecdr.com>
Co-authored-by: TynerGjs <132295567+TynerGjs@users.noreply.github.com>
Co-authored-by: Steve Corbett <137920358+steve-corbett-smilecdr@users.noreply.github.com>
Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: justindar <justin.dar@smilecdr.com>
Co-authored-by: volodymyr-korzh <132366313+volodymyr-korzh@users.noreply.github.com>
Co-authored-by: Nathan Doef <n.doef@protonmail.com>
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
Co-authored-by: michaelabuckley <michaelabuckley@gmail.com>
Co-authored-by: Luke deGruchy <luke.degruchy@smilecdr.com>
Co-authored-by: jmarchionatto <60409882+jmarchionatto@users.noreply.github.com>
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
Co-authored-by: Justin McKelvy <60718638+Capt-Mac@users.noreply.github.com>
Co-authored-by: justin.mckelvy <justin.mckelvy@smilecdr.com>
Co-authored-by: LalithE <132382565+LalithE@users.noreply.github.com>
This commit is contained in:
Tadgh 2023-08-15 12:32:10 -07:00 committed by GitHub
parent f2087d2ccc
commit a10856e091
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
203 changed files with 3874 additions and 2273 deletions

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -48,7 +48,16 @@ public enum UriParamQualifierEnum {
 	 * Value <code>:below</code>
 	 * </p>
 	 */
-	BELOW(":below");
+	BELOW(":below"),
+
+	/**
+	 * The contains modifier allows clients to indicate that a supplied URI input should be matched
+	 * as a case-insensitive and combining-character insensitive match anywhere in the target URI.
+	 * <p>
+	 * Value <code>:contains</code>
+	 * </p>
+	 */
+	CONTAINS(":contains");
 
 	private static final Map<String, UriParamQualifierEnum> KEY_TO_VALUE;

@@ -40,6 +40,8 @@ import java.net.URI;
 import java.net.URISyntaxException;
 import java.net.URL;
 import java.net.URLDecoder;
+import java.nio.file.Path;
+import java.nio.file.Paths;
 import java.util.ArrayList;
 import java.util.Collection;
 import java.util.HashMap;
@@ -602,6 +604,39 @@ public class UrlUtil {
 		return parameters;
 	}
 
+	/**
+	 * Creates a list of sub-URI candidates for search with the :above modifier.
+	 * Example input: http://[host]/[pathPart1]/[pathPart2]
+	 * Example output: http://[host], http://[host]/[pathPart1], http://[host]/[pathPart1]/[pathPart2]
+	 *
+	 * @param theUri String URI parameter
+	 * @return List of URI candidates
+	 */
+	public static List<String> getAboveUriCandidates(String theUri) {
+		try {
+			URI uri = new URI(theUri);
+			if (uri.getScheme() == null || uri.getHost() == null) {
+				throwInvalidRequestExceptionForNotValidUri(theUri, null);
+			}
+		} catch (URISyntaxException theCause) {
+			throwInvalidRequestExceptionForNotValidUri(theUri, theCause);
+		}
+		List<String> candidates = new ArrayList<>();
+		Path path = Paths.get(theUri);
+		candidates.add(path.toString().replace(":/", "://"));
+		while (path.getParent() != null && path.getParent().toString().contains("/")) {
+			candidates.add(path.getParent().toString().replace(":/", "://"));
+			path = path.getParent();
+		}
+		return candidates;
+	}
+
+	private static void throwInvalidRequestExceptionForNotValidUri(String theUri, Exception theCause) {
+		throw new InvalidRequestException(
+				Msg.code(2419) + String.format("Provided URI is not valid: %s", theUri), theCause);
+	}
+
 	public static class UrlParts {
 		private String myParams;
 		private String myResourceId;
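The `:above` candidate expansion added in this change can be exercised outside of HAPI FHIR. Below is a hypothetical standalone sketch (class and method names are invented for illustration, and it uses plain string operations instead of java.nio.Path) of the same segment-stripping idea:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the ":above" sub-URI expansion: strip one path segment
// at a time until only scheme://host remains. Not the HAPI FHIR implementation.
public class AboveCandidatesSketch {

    public static List<String> candidates(String uri) {
        List<String> out = new ArrayList<>();
        // Index of the '/' that ends the host part, e.g. position 11 in "http://host/a/b";
        // -1 when the URI has no path at all.
        int hostEnd = uri.indexOf('/', uri.indexOf("://") + 3);
        String current = uri;
        out.add(current);
        while (hostEnd >= 0) {
            int lastSlash = current.lastIndexOf('/');
            if (lastSlash < hostEnd) {
                break; // already down to scheme://host
            }
            current = current.substring(0, lastSlash);
            out.add(current);
        }
        return out;
    }

    public static void main(String[] args) {
        // Longest-to-shortest list of ancestors, same set of candidates
        // as the getAboveUriCandidates example in the javadoc above
        System.out.println(candidates("http://host/pathPart1/pathPart2"));
        // -> [http://host/pathPart1/pathPart2, http://host/pathPart1, http://host]
    }
}
```

Each candidate is then usable as an exact-match term, which is what makes `:above` implementable as a simple `IN`-style search over the generated list.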

@@ -4,7 +4,7 @@
 	<modelVersion>4.0.0</modelVersion>
 	<groupId>ca.uhn.hapi.fhir</groupId>
 	<artifactId>hapi-fhir-bom</artifactId>
-	<version>6.9.2-SNAPSHOT</version>
+	<version>6.9.3-SNAPSHOT</version>
 	<packaging>pom</packaging>
 
 	<name>HAPI FHIR BOM</name>
@@ -12,7 +12,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -4,7 +4,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir-cli</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -4,7 +4,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -4,7 +4,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -0,0 +1,7 @@
+---
+type: fix
+issue: 5037
+jira: SMILE-6370
+title: "Previously, when the last source resource with a `MATCH` link was deleted, the golden resource
+remained in the database, leaving it orphaned. This has now been fixed such that when there are no more
+`MATCH` links left associated with a golden resource, the golden resource will automatically be deleted."

@@ -0,0 +1,4 @@
+---
+type: add
+issue: 5095
+title: "Added support for :above, :below, :contains and :missing _source search parameter modifiers."

@@ -0,0 +1,5 @@
+---
+type: fix
+issue: 5150
+title: "When running a $delete-expunge with over 10,000 resources, only the first 10,000 resources were deleted.
+This is now fixed."

@@ -0,0 +1,5 @@
+---
+type: fix
+issue: 5155
+title: "Previously, requesting an $expunge operation on CodeSystem resources while CS batch deletion is underway would return HTTP 500.
+This has been fixed to return HTTP 412 (precondition failed)."

@@ -0,0 +1,4 @@
+---
+type: fix
+issue: 5157
+title: "Previously, the bulk export reuse functionality did not operate correctly when dealing with POST and GET requests. This fix ensures that similar POST and GET export requests will be reused."

@@ -0,0 +1,4 @@
+---
+type: add
+issue: 5158
+title: "Added support for Subscription matching of ':above', ':below', ':contains' and ':missing' '_source' search parameter modifiers."

@@ -0,0 +1,4 @@
+---
+type: fix
+issue: 5167
+title: "Fixed a dependency in the HSQL JDBC driver referencing a non-bundled class (javax.ServletOutputStream)"

@@ -0,0 +1,4 @@
+---
+type: fix
+issue: 5173
+title: "Fix gateway `$everything` operation to respect server configured default and maximum page sizes."

@@ -0,0 +1,3 @@
+type: fix
+issue: 5179
+title: "Added evaluation setting for hapi-fhir storage-cr module operations from outside. Updated provider loading from hapi-fhir instead of external server for caregaps and submitdata providers. Updated testing suite to depend on restful server for new provider loader."

@@ -0,0 +1,6 @@
+---
+type: fix
+issue: 5182
+jira: SMILE-6857
+title: "Previously, removing tags in a resource update with proper headers and versioning flag would not trigger a
+new subscription. This has been fixed."

@@ -0,0 +1,4 @@
+---
+type: fix
+issue: 5183
+title: "The latest US Core IG includes two ValueSets with different contents, but the same FHIR Id and OID via two different included IGs (i.e. `2.16.840.1.113762.1.4.1010.9` via us.cdc.phinvads and us.nlm.vsac). Ingesting these duplicates in US Core failed with a 500 error. This has been resolved by logging the error and allowing the rest of the ingestion to proceed."

@@ -0,0 +1,4 @@
+---
+type: fix
+issue: 5195
+title: "MAX_SUBSCRIPTION_RESULTS was set to an arbitrarily high 50000. This resulted in failures in hapi-fhir-jpaserver-starter. Other constants, such as SearchParamRegistryImpl.MAX_MANAGED_PARAM_COUNT, have a limit of 10000 as well. This aligns MAX_SUBSCRIPTION_RESULTS with that value."

@@ -14,8 +14,8 @@ A simple example query is shown below:
 ```sql
 SELECT
-	name.family as family,
-	name.given as given,
+	name[0].family as family,
+	name[0].given[0] as given,
 	birthDate,
 	identifier.where(system='http://hl7.org/fhir/sid/us-ssn').value as SSN
 FROM

@@ -22,10 +22,7 @@ Below are some simplifying principles HAPI MDM follows to reduce complexity and
 1. The only source resources in the system that do not have a MATCH link are those that have the 'NO-MDM' tag or those that have POSSIBLE_MATCH links pending review.
-1. The HAPI MDM rules define a single identifier system that holds the external enterprise id ("EID"). If a source resource has an external EID, then the Golden Resource it links to always has the same EID. If a source resource has no EID when it arrives, a unique UUID will be assigned as that source resource's EID.
-1. A Golden Resource can have both an internal EID (auto-created by HAPI), and an external EID (provided by an
-external system).
+1. The HAPI MDM rules define a single identifier system that holds the external enterprise id ("EID"). If a source resource has an external EID, then the Golden Resource it links to always has the same EID.
 1. Two different Golden Resources cannot have the same EID.
@@ -85,7 +82,7 @@ possible that hundreds of John Doe's could be linked to the same Golden Resource
 When a new source resource is compared with all other resources of the same type in the repository, there are four possible outcomes:
 
-* CASE 1: No MATCH and no POSSIBLE_MATCH outcomes -> a new Golden Resource is created and linked to that source resource as MATCH. If the incoming resource has an EID, it is copied to the Golden Resource. Otherwise a new UUID is generated and used as the internal EID.
+* CASE 1: No MATCH and no POSSIBLE_MATCH outcomes -> a new Golden Resource is created and linked to that source resource as MATCH. If the incoming resource has an EID, it is copied to the Golden Resource.
 * CASE 2: All of the MATCH source resources are already linked to the same Golden Resource -> a new Link is created between the new source resource and that Golden Resource and is set to MATCH.
@@ -93,6 +90,17 @@ When a new source resource is compared with all other resources of the same type
 * CASE 4: Only POSSIBLE_MATCH outcomes -> In this case, new POSSIBLE_MATCH links are created and await manual reassignment to either NO_MATCH or MATCH.
 
+### MDM and Resource Deletion
+
+By default, when the last source resource in a `MATCH` relationship with a golden resource is deleted, the associated golden resource is permanently (hard) deleted. This prevents orphaned golden resources from remaining in the database. Note that this will also delete the respective MDM link history. Here are several scenarios and their associated behaviour; we will define SR as a source resource and GR as a golden resource:
+
+* There is a 1 to 1 `MATCH` relationship between SR/1 and GR/1 -> when SR/1 is deleted, GR/1 is also deleted.
+* GR/1 has a `MATCH` link with SR/1, and a `POSSIBLE_MATCH` link with SR/2 -> when SR/1 is deleted, all links are deleted and GR/1 is deleted. Additionally, SR/2 is re-submitted for matching, meaning a new GR could be created or it could match with another GR.
+* GR/1 has a `MATCH` link with SR/1, a `POSSIBLE_MATCH` link with SR/2, and a `POSSIBLE_DUPLICATE` with GR/2. Additionally, GR/2 has a `MATCH` with SR/3 and a `POSSIBLE_MATCH` with SR/2 -> when SR/1 is deleted, all links associated with GR/1, including the `POSSIBLE_DUPLICATE` link, are deleted. SR/2 maintains its `POSSIBLE_MATCH` relation with GR/2. Finally, GR/1 is deleted.
+
+This behaviour can be changed from the default of hard deleting to soft deleting by setting [setAutoExpungeGoldenResources(boolean)](/hapi-fhir/apidocs/hapi-fhir-server-mdm/ca/uhn/fhir/mdm/rules/config/MdmSettings.html#setAutoExpungeGoldenResources(boolean)) to false. Soft deleting the golden resource means the golden resource will continue to persist in the database, and the MDM link history for the affected link(s) will still be accessible, which may be useful for auditing.
+
 # HAPI MDM Technical Details
 
 When MDM is enabled, the HAPI FHIR JPA Server does the following things on startup:

@@ -1,8 +1,9 @@
 # MDM Enterprise Identifiers
 
-An Enterprise Identifier (EID) is a unique identifier that can be attached to source resources. Each implementation is expected to use exactly one EID system for incoming resources, defined in the MDM Rules file. If a source resource with a valid EID is submitted, that EID will be copied over to the Golden Resource that was matched. In the case that the incoming source resource had no EID assigned, an internal EID will be created for it. There are thus two classes of EID:
-* Internal EIDs, created by HAPI-MDM, and
-* External EIDs, provided by the submitted resources.
+An Enterprise Identifier (EID) is a unique identifier that can be attached to source resources.
+Each implementation is expected to use exactly one EID system for incoming resources,
+defined in the MDM Rules file.
+
+If a source resource with a valid EID is submitted, that EID will be copied over to the Golden Resource that was matched.
 
 ## MDM EID Settings

@@ -111,9 +111,26 @@ Here is a description of how each section of this document is configured.
 ### candidateSearchParams
 
-These define fields which must have at least one exact match before two resources are considered for matching. This is like a list of "pre-searches" that find potential candidates for matches, to avoid the expensive operation of running a match score calculation on all resources in the system. E.g. you may only wish to consider matching two Patients if they either share at least one identifier in common or have the same birthday or the same phone number. The HAPI FHIR server executes each of these searches separately and then takes the union of the results, so you can think of these as `OR` criteria that cast a wide net for potential candidates. In some MDM systems, these "pre-searches" are called "blocking" searches (since they identify "blocks" of candidates that will be searched for matches).
+These define one or more fields which must have a match before two resources are considered for matching.
+This is like a list of "pre-searches" that find potential candidates for matches,
+to avoid the expensive operation of running a match score calculation on all resources in the system.
+`candidateSearchParameters` are capable of making exact searches and phonetic searches
+(see the list of [phonetic search parameters](https://smilecdr.com/docs/fhir_repository/search_parameter_phonetic.html)).
+E.g. you may only wish to consider matching two Patients if they either share at least one identifier in
+common or have the same birthday or the same phone number. The HAPI FHIR server executes each of these searches
+separately and then takes the union of the results, so you can think of these as `OR` criteria that
+cast a wide net for potential candidates. In some MDM systems, these "pre-searches" are called "blocking"
+searches (since they identify "blocks" of candidates that will be searched for matches).
 
-If a list of searchParams is specified in a given candidateSearchParams item, then these search parameters are treated as `AND` parameters. In the following candidateSearchParams definition, hapi-fhir will extract given name, family name and identifiers from the incoming Patient and perform two separate searches, first for all Patient resources that have the same given `AND` the same family name as the incoming Patient, and second for all Patient resources that share at least one identifier as the incoming Patient. Note that if the incoming Patient was missing any of these searchParam values, then that search would be skipped. E.g. if the incoming Patient had a given name but no family name, then only a search for matching identifiers would be performed.
+If a list of searchParams is specified in a given candidateSearchParams item,
+then these search parameters are treated as `AND` parameters.
+In the following candidateSearchParams definition, hapi-fhir will extract given name,
+family name and identifiers from the incoming Patient and perform two separate searches,
+first for all Patient resources that have the same given `AND` the same family name as
+the incoming Patient, and second for all Patient resources that share at least one
+identifier as the incoming Patient. Note that if the incoming Patient was missing any of these searchParam values,
+then that search would be skipped. E.g. if the incoming Patient had a given name but no family name,
+then only a search for matching identifiers would be performed.
 
 ```json
 {

@@ -11,7 +11,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -4,7 +4,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

View File

@@ -36,7 +36,12 @@ import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService;
 import ca.uhn.fhir.jpa.entity.Batch2JobInstanceEntity;
 import ca.uhn.fhir.jpa.entity.Batch2WorkChunkEntity;
 import ca.uhn.fhir.model.api.PagingIterator;
+import ca.uhn.fhir.util.Batch2JobDefinitionConstants;
 import ca.uhn.fhir.util.Logs;
+import com.fasterxml.jackson.core.JsonParser;
+import com.fasterxml.jackson.databind.JsonNode;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
 import org.apache.commons.collections4.ListUtils;
 import org.apache.commons.lang3.Validate;
 import org.slf4j.Logger;
@@ -220,6 +225,11 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
     List<Batch2JobInstanceEntity> instanceEntities;

     if (statuses != null && !statuses.isEmpty()) {
+        if (definitionId.equals(Batch2JobDefinitionConstants.BULK_EXPORT)) {
+            if (originalRequestUrlTruncation(params) != null) {
+                params = originalRequestUrlTruncation(params);
+            }
+        }
         instanceEntities = myJobInstanceRepository.findInstancesByJobIdParamsAndStatus(
                 definitionId, params, statuses, pageable);
     } else {
@@ -228,6 +238,31 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
     return toInstanceList(instanceEntities);
 }

+private String originalRequestUrlTruncation(String theParams) {
+    try {
+        ObjectMapper mapper = new ObjectMapper();
+        mapper.configure(JsonParser.Feature.ALLOW_UNQUOTED_FIELD_NAMES, true);
+        mapper.configure(JsonParser.Feature.ALLOW_SINGLE_QUOTES, true);
+        JsonNode rootNode = mapper.readTree(theParams);
+        String originalUrl = "originalRequestUrl";
+        if (rootNode instanceof ObjectNode) {
+            ObjectNode objectNode = (ObjectNode) rootNode;
+            if (objectNode.has(originalUrl)) {
+                String url = objectNode.get(originalUrl).asText();
+                if (url.contains("?")) {
+                    objectNode.put(originalUrl, url.split("\\?")[0]);
+                }
+            }
+            return mapper.writeValueAsString(objectNode);
+        }
+    } catch (Exception e) {
+        ourLog.info("Error Truncating Original Request Url", e);
+    }
+    return null;
+}
+
 @Override
 @Transactional(propagation = Propagation.REQUIRES_NEW)
 public List<JobInstance> fetchInstances(int thePageSize, int thePageIndex) {
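The core of `originalRequestUrlTruncation` is dropping the query string from the stored `originalRequestUrl` before matching bulk-export job instances. A minimal sketch of just that string operation, with a hypothetical class and helper name (not part of HAPI FHIR):

```java
public class UrlTruncationSketch {
    // Mirrors the url.split("\\?")[0] logic above: keep everything before the first '?'.
    static String truncate(String url) {
        return url.contains("?") ? url.split("\\?")[0] : url;
    }

    public static void main(String[] args) {
        // prints "http://example.org/fhir/$export"
        System.out.println(truncate("http://example.org/fhir/$export?_type=Patient"));
    }
}
```

Applied to the params JSON, only the `originalRequestUrl` field is rewritten; all other fields pass through `writeValueAsString` unchanged.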


@@ -19,16 +19,21 @@
  */
 package ca.uhn.fhir.jpa.config;

+import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
+import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
 import ca.uhn.fhir.jpa.api.svc.IBatch2DaoSvc;
 import ca.uhn.fhir.jpa.api.svc.IDeleteExpungeSvc;
 import ca.uhn.fhir.jpa.api.svc.IIdHelperService;
 import ca.uhn.fhir.jpa.dao.IFulltextSearchSvc;
 import ca.uhn.fhir.jpa.dao.data.IResourceLinkDao;
+import ca.uhn.fhir.jpa.dao.data.IResourceTableDao;
 import ca.uhn.fhir.jpa.dao.expunge.ResourceTableFKProvider;
+import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService;
 import ca.uhn.fhir.jpa.delete.batch2.DeleteExpungeSqlBuilder;
 import ca.uhn.fhir.jpa.delete.batch2.DeleteExpungeSvcImpl;
 import ca.uhn.fhir.jpa.reindex.Batch2DaoSvcImpl;
+import ca.uhn.fhir.jpa.searchparam.MatchUrlService;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.context.annotation.Bean;
@@ -37,8 +42,20 @@ import javax.persistence.EntityManager;
 public class Batch2SupportConfig {

     @Bean
-    public IBatch2DaoSvc batch2DaoSvc() {
-        return new Batch2DaoSvcImpl();
+    public IBatch2DaoSvc batch2DaoSvc(
+            IResourceTableDao theResourceTableDao,
+            MatchUrlService theMatchUrlService,
+            DaoRegistry theDaoRegistry,
+            FhirContext theFhirContext,
+            IHapiTransactionService theTransactionService,
+            JpaStorageSettings theJpaStorageSettings) {
+        return new Batch2DaoSvcImpl(
+                theResourceTableDao,
+                theMatchUrlService,
+                theDaoRegistry,
+                theFhirContext,
+                theTransactionService,
+                theJpaStorageSettings);
     }

     @Bean


@@ -166,6 +166,7 @@ import javax.xml.stream.events.XMLEvent;
 import static java.util.Objects.isNull;
 import static java.util.Objects.nonNull;
+import static org.apache.commons.collections4.CollectionUtils.isEqualCollection;
 import static org.apache.commons.lang3.StringUtils.isBlank;
 import static org.apache.commons.lang3.StringUtils.isNotBlank;
 import static org.apache.commons.lang3.StringUtils.left;
@@ -302,7 +303,7 @@ public abstract class BaseHapiFhirDao<T extends IBaseResource> extends BaseStora
         }
     }

-    private void extractTagsHapi(
+    private void extractHapiTags(
             TransactionDetails theTransactionDetails,
             IResource theResource,
             ResourceTable theEntity,
@@ -359,7 +360,7 @@ public abstract class BaseHapiFhirDao<T extends IBaseResource> extends BaseStora
         }
     }

-    private void extractTagsRi(
+    private void extractRiTags(
             TransactionDetails theTransactionDetails,
             IAnyResource theResource,
             ResourceTable theEntity,
@@ -416,6 +417,25 @@ public abstract class BaseHapiFhirDao<T extends IBaseResource> extends BaseStora
         }
     }

+    private void extractProfileTags(
+            TransactionDetails theTransactionDetails,
+            IBaseResource theResource,
+            ResourceTable theEntity,
+            Set<ResourceTag> theAllTags) {
+        RuntimeResourceDefinition def = myContext.getResourceDefinition(theResource);
+        if (!def.isStandardType()) {
+            String profile = def.getResourceProfile("");
+            if (isNotBlank(profile)) {
+                TagDefinition profileDef = getTagOrNull(
+                        theTransactionDetails, TagTypeEnum.PROFILE, NS_JPA_PROFILE, profile, null, null, null);
+                ResourceTag tag = theEntity.addTag(profileDef);
+                theAllTags.add(tag);
+                theEntity.setHasTags(true);
+            }
+        }
+    }
+
     private Set<ResourceTag> getAllTagDefinitions(ResourceTable theEntity) {
         HashSet<ResourceTag> retVal = Sets.newHashSet();
         if (theEntity.isHasTags()) {
@@ -845,39 +865,36 @@ public abstract class BaseHapiFhirDao<T extends IBaseResource> extends BaseStora
             RequestDetails theRequest,
             IBaseResource theResource,
             ResourceTable theEntity) {
-        Set<ResourceTag> allDefs = new HashSet<>();
-        Set<ResourceTag> allTagsOld = getAllTagDefinitions(theEntity);
+        Set<ResourceTag> allResourceTagsFromTheResource = new HashSet<>();
+        Set<ResourceTag> allOriginalResourceTagsFromTheEntity = getAllTagDefinitions(theEntity);

         if (theResource instanceof IResource) {
-            extractTagsHapi(theTransactionDetails, (IResource) theResource, theEntity, allDefs);
+            extractHapiTags(theTransactionDetails, (IResource) theResource, theEntity, allResourceTagsFromTheResource);
         } else {
-            extractTagsRi(theTransactionDetails, (IAnyResource) theResource, theEntity, allDefs);
+            extractRiTags(theTransactionDetails, (IAnyResource) theResource, theEntity, allResourceTagsFromTheResource);
         }

-        RuntimeResourceDefinition def = myContext.getResourceDefinition(theResource);
-        if (!def.isStandardType()) {
-            String profile = def.getResourceProfile("");
-            if (isNotBlank(profile)) {
-                TagDefinition profileDef = getTagOrNull(
-                        theTransactionDetails, TagTypeEnum.PROFILE, NS_JPA_PROFILE, profile, null, null, null);
-                ResourceTag tag = theEntity.addTag(profileDef);
-                allDefs.add(tag);
-                theEntity.setHasTags(true);
-            }
-        }
+        extractProfileTags(theTransactionDetails, theResource, theEntity, allResourceTagsFromTheResource);

+        // the extract[Hapi|Ri|Profile]Tags methods above will have populated the allResourceTagsFromTheResource Set
+        // AND
+        // added all tags from theResource.meta.tags to theEntity.meta.tags. the next steps are to:
+        // 1- remove duplicates;
+        // 2- remove tags from theEntity that are not present in theResource if header HEADER_META_SNAPSHOT_MODE
+        // is present in the request;
+        //
+        Set<ResourceTag> allResourceTagsNewAndOldFromTheEntity = getAllTagDefinitions(theEntity);
+        Set<TagDefinition> allTagDefinitionsPresent = new HashSet<>();

-        Set<ResourceTag> allTagsNew = getAllTagDefinitions(theEntity);
-        Set<TagDefinition> allDefsPresent = new HashSet<>();
-        allTagsNew.forEach(tag -> {
+        allResourceTagsNewAndOldFromTheEntity.forEach(tag -> {

             // Don't keep duplicate tags
-            if (!allDefsPresent.add(tag.getTag())) {
+            if (!allTagDefinitionsPresent.add(tag.getTag())) {
                 theEntity.getTags().remove(tag);
             }

             // Drop any tags that have been removed
-            if (!allDefs.contains(tag)) {
+            if (!allResourceTagsFromTheResource.contains(tag)) {
                 if (shouldDroppedTagBeRemovedOnUpdate(theRequest, tag)) {
                     theEntity.getTags().remove(tag);
                 } else if (HapiExtensions.EXT_SUBSCRIPTION_MATCHING_STRATEGY.equals(
@@ -887,21 +904,33 @@ public abstract class BaseHapiFhirDao<T extends IBaseResource> extends BaseStora
             }
         });

-        // Update the resource to contain the old tags
-        allTagsOld.forEach(tag -> {
-            IBaseCoding iBaseCoding = theResource
-                    .getMeta()
-                    .addTag()
-                    .setCode(tag.getTag().getCode())
-                    .setSystem(tag.getTag().getSystem())
-                    .setVersion(tag.getTag().getVersion());
-            if (tag.getTag().getUserSelected() != null) {
-                iBaseCoding.setUserSelected(tag.getTag().getUserSelected());
+        // at this point, theEntity.meta.tags will be up to date:
+        // 1- it was stripped from tags that needed removing;
+        // 2- it has new tags from a resource update through theResource;
+        // 3- it has tags from the previous version;
+        //
+        // Since tags are merged on updates, we add tags from theEntity that theResource does not have
+        Set<ResourceTag> allUpdatedResourceTagsNewAndOldMinusRemovalsFromTheEntity = getAllTagDefinitions(theEntity);
+
+        allUpdatedResourceTagsNewAndOldMinusRemovalsFromTheEntity.forEach(aResourcetag -> {
+            if (!allResourceTagsFromTheResource.contains(aResourcetag)) {
+                IBaseCoding iBaseCoding = theResource
+                        .getMeta()
+                        .addTag()
+                        .setCode(aResourcetag.getTag().getCode())
+                        .setSystem(aResourcetag.getTag().getSystem())
+                        .setVersion(aResourcetag.getTag().getVersion());
+                allResourceTagsFromTheResource.add(aResourcetag);
+                if (aResourcetag.getTag().getUserSelected() != null) {
+                    iBaseCoding.setUserSelected(aResourcetag.getTag().getUserSelected());
+                }
             }
         });

-        theEntity.setHasTags(!allTagsNew.isEmpty());
-        return !allTagsOld.equals(allTagsNew);
+        theEntity.setHasTags(!allUpdatedResourceTagsNewAndOldMinusRemovalsFromTheEntity.isEmpty());
+        return !isEqualCollection(allOriginalResourceTagsFromTheEntity, allResourceTagsFromTheResource);
     }

@@ -947,7 +976,7 @@ public abstract class BaseHapiFhirDao<T extends IBaseResource> extends BaseStora
      * The default implementation removes any profile declarations, but leaves tags and security labels in place. Subclasses may choose to override and change this behaviour.
      * </p>
      * <p>
-     * See <a href="http://hl7.org/fhir/resource.html#tag-updates">Updates to Tags, Profiles, and Security Labels</a> for a description of the logic that the default behaviour folows.
+     * See <a href="http://hl7.org/fhir/resource.html#tag-updates">Updates to Tags, Profiles, and Security Labels</a> for a description of the logic that the default behaviour follows.
      * </p>
      *
      * @param theTag The tag
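The reworked tag handling in this file reduces to set operations: duplicates are dropped, removed tags are stripped (unless merge semantics keep them), and tags already on the stored entity that the incoming resource lacks are copied back onto the resource. A simplified model with plain strings standing in for `ResourceTag` (class and method names hypothetical):

```java
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.Set;

public class TagMergeSketch {
    // Merge semantics on update: the entity keeps its existing tags and
    // gains any new tags carried by the incoming resource; Set membership
    // takes care of deduplication.
    static Set<String> merge(Set<String> entityTags, Set<String> resourceTags) {
        Set<String> merged = new LinkedHashSet<>(entityTags);
        merged.addAll(resourceTags);
        return merged;
    }

    public static void main(String[] args) {
        Set<String> entity = new LinkedHashSet<>(Arrays.asList("old-tag", "shared"));
        Set<String> resource = new LinkedHashSet<>(Arrays.asList("shared", "new-tag"));
        System.out.println(merge(entity, resource));
    }
}
```

The real method additionally honors `HEADER_META_SNAPSHOT_MODE`, which switches from merge to snapshot semantics and removes entity tags absent from the resource.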


@@ -107,7 +107,6 @@ import ca.uhn.fhir.rest.server.exceptions.UnprocessableEntityException;
 import ca.uhn.fhir.rest.server.provider.ProviderConstants;
 import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
 import ca.uhn.fhir.rest.server.util.CompositeInterceptorBroadcaster;
-import ca.uhn.fhir.util.ObjectUtil;
 import ca.uhn.fhir.util.ReflectionUtil;
 import ca.uhn.fhir.util.StopWatch;
 import ca.uhn.fhir.util.UrlUtil;
@@ -143,7 +142,6 @@ import java.util.ArrayList;
 import java.util.Collection;
 import java.util.Date;
 import java.util.HashSet;
-import java.util.Iterator;
 import java.util.List;
 import java.util.Objects;
 import java.util.Optional;
@@ -1118,9 +1116,9 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     for (TagDefinition nextDef : tags) {
         for (BaseTag next : new ArrayList<BaseTag>(theEntity.getTags())) {
-            if (ObjectUtil.equals(next.getTag().getTagType(), nextDef.getTagType())
-                    && ObjectUtil.equals(next.getTag().getSystem(), nextDef.getSystem())
-                    && ObjectUtil.equals(next.getTag().getCode(), nextDef.getCode())) {
+            if (Objects.equals(next.getTag().getTagType(), nextDef.getTagType())
+                    && Objects.equals(next.getTag().getSystem(), nextDef.getSystem())
+                    && Objects.equals(next.getTag().getCode(), nextDef.getCode())) {
                 myEntityManager.remove(next);
                 theEntity.getTags().remove(next);
             }
@@ -1236,10 +1234,9 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     IBundleProvider retVal = myTransactionService
             .withRequest(theRequestDetails)
             .withRequestPartitionId(requestPartitionId)
-            .execute(() -> {
-                return myPersistedJpaBundleProviderFactory.history(
-                        theRequestDetails, myResourceName, null, theSince, theUntil, theOffset, requestPartitionId);
-            });
+            .execute(() -> myPersistedJpaBundleProviderFactory.history(
+                    theRequestDetails, myResourceName, null, theSince, theUntil, theOffset, requestPartitionId));

     ourLog.debug("Processed history on {} in {}ms", myResourceName, w.getMillisAndRestart());
     return retVal;
 }
@@ -1503,7 +1500,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     JpaPid jpaPid = (JpaPid) thePid;
     Optional<ResourceTable> entity = myResourceTableDao.findById(jpaPid.getId());
-    if (!entity.isPresent()) {
+    if (entity.isEmpty()) {
         throw new ResourceNotFoundException(Msg.code(975) + "No resource found with PID " + jpaPid);
     }
     if (isDeleted(entity.get()) && !theDeletedOk) {
@@ -1554,7 +1551,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     T retVal = myJpaStorageResourceParser.toResource(myResourceType, entity, null, false);

-    if (theDeletedOk == false) {
+    if (!theDeletedOk) {
         if (isDeleted(entity)) {
             throw createResourceGoneException(entity);
         }
@@ -1588,7 +1585,6 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
             .execute(() -> readEntity(theId, true, theRequest, requestPartitionId));
 }

-@SuppressWarnings("unchecked")
 @Override
 public ReindexOutcome reindex(
         IResourcePersistentId thePid,
@@ -1657,7 +1653,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
         CURRENTLY_REINDEXING.put(theResource, Boolean.TRUE);
     }

-    ResourceTable resourceTable = updateEntity(
+    updateEntity(
             null, theResource, theEntity, theEntity.getDeleted(), true, false, transactionDetails, true, false);
     if (theResource != null) {
         CURRENTLY_REINDEXING.put(theResource, null);
@@ -1743,7 +1739,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     } else {
         if (readPartitions.contains(null)) {
             List<Integer> readPartitionsWithoutNull =
-                    readPartitions.stream().filter(t -> t != null).collect(Collectors.toList());
+                    readPartitions.stream().filter(Objects::nonNull).collect(Collectors.toList());
             entity = myResourceTableDao
                     .readByPartitionIdsOrNull(readPartitionsWithoutNull, pid.getId())
                     .orElse(null);
@@ -1771,7 +1767,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     }

     if (theId.hasVersionIdPart()) {
-        if (theId.isVersionIdPartValidLong() == false) {
+        if (!theId.isVersionIdPartValidLong()) {
             throw new ResourceNotFoundException(Msg.code(978)
                     + getContext()
                             .getLocalizer()
@@ -1884,9 +1880,9 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     }

     for (BaseTag next : new ArrayList<>(entity.getTags())) {
-        if (ObjectUtil.equals(next.getTag().getTagType(), theTagType)
-                && ObjectUtil.equals(next.getTag().getSystem(), theScheme)
-                && ObjectUtil.equals(next.getTag().getCode(), theTerm)) {
+        if (Objects.equals(next.getTag().getTagType(), theTagType)
+                && Objects.equals(next.getTag().getSystem(), theScheme)
+                && Objects.equals(next.getTag().getCode(), theTerm)) {
             myEntityManager.remove(next);
             entity.getTags().remove(next);
         }
@@ -1937,7 +1933,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     translateListSearchParams(theParams);

-    notifySearchInterceptors(theParams, theRequest);
+    setOffsetAndCount(theParams, theRequest);

     CacheControlDirective cacheControlDirective = new CacheControlDirective();
     if (theRequest != null) {
@@ -1965,11 +1961,9 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
 }

 private void translateListSearchParams(SearchParameterMap theParams) {
-    Iterator<String> keyIterator = theParams.keySet().iterator();

     // Translate _list=42 to _has=List:item:_id=42
-    while (keyIterator.hasNext()) {
-        String key = keyIterator.next();
+    for (String key : theParams.keySet()) {
         if (Constants.PARAM_LIST.equals((key))) {
             List<List<IQueryParameterType>> andOrValues = theParams.get(key);
             theParams.remove(key);
@@ -1990,7 +1984,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     }
 }

-private void notifySearchInterceptors(SearchParameterMap theParams, RequestDetails theRequest) {
+protected void setOffsetAndCount(SearchParameterMap theParams, RequestDetails theRequest) {
     if (theRequest != null) {
         if (theRequest.isSubRequest()) {
@@ -2000,7 +1994,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
                     1,
                     Integer.MAX_VALUE,
                     max,
-                    "Maximum search result count in transaction ust be a positive integer");
+                    "Maximum search result count in transaction must be a positive integer");
             theParams.setLoadSynchronousUpTo(getStorageSettings().getMaximumSearchResultCountInTransaction());
         }
     }
@@ -2051,7 +2045,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
         theParams.setLoadSynchronousUpTo(myStorageSettings.getInternalSynchronousSearchSize());
     }

-    ISearchBuilder builder =
+    ISearchBuilder<?> builder =
             mySearchBuilderFactory.newSearchBuilder(this, getResourceName(), getResourceType());

     List<JpaPid> ids = new ArrayList<>();
@@ -2208,9 +2202,8 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
         RequestDetails theRequest,
         TransactionDetails theTransactionDetails,
         RequestPartitionId theRequestPartitionId) {
-    T resource = theResource;
-
-    preProcessResourceForStorage(resource);
+    preProcessResourceForStorage(theResource);
     preProcessResourceForStorage(theResource, theRequest, theTransactionDetails, thePerformIndexing);

     ResourceTable entity = null;
@@ -2235,8 +2228,8 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
         entity = myEntityManager.find(ResourceTable.class, pid.getId());
         resourceId = entity.getIdDt();
         if (myFhirContext.getVersion().getVersion().isEqualOrNewerThan(FhirVersionEnum.R4)
-                && resource.getIdElement().getIdPart() != null) {
-            if (!Objects.equals(resource.getIdElement().getIdPart(), resourceId.getIdPart())) {
+                && theResource.getIdElement().getIdPart() != null) {
+            if (!Objects.equals(theResource.getIdElement().getIdPart(), resourceId.getIdPart())) {
                 String msg = getContext()
                         .getLocalizer()
                         .getMessageSanitized(
@@ -2257,7 +2250,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     }

     DaoMethodOutcome outcome = doCreateForPostOrPut(
             theRequest,
-            resource,
+            theResource,
             theMatchUrl,
             false,
             thePerformIndexing,
@@ -2303,7 +2296,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     if (create) {
         return doCreateForPostOrPut(
                 theRequest,
-                resource,
+                theResource,
                 null,
                 false,
                 thePerformIndexing,
@@ -2320,7 +2313,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
             theMatchUrl,
             thePerformIndexing,
             theForceUpdateVersion,
-            resource,
+            theResource,
             entity,
             update,
             theTransactionDetails);
@@ -2413,7 +2406,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     entity.setDeleted(null);
     boolean isUpdatingCurrent = resourceId.hasVersionIdPart()
             && Long.parseLong(resourceId.getVersionIdPart()) == currentEntity.getVersion();
-    IBasePersistedResource savedEntity = updateHistoryEntity(
+    IBasePersistedResource<?> savedEntity = updateHistoryEntity(
             theRequest, theResource, currentEntity, entity, resourceId, theTransactionDetails, isUpdatingCurrent);
     DaoMethodOutcome outcome = toMethodOutcome(
             theRequest, savedEntity, theResource, null, RestOperationTypeEnum.UPDATE)
@@ -2437,7 +2430,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
     TransactionDetails transactionDetails = new TransactionDetails();

     if (theMode == ValidationModeEnum.DELETE) {
-        if (theId == null || theId.hasIdPart() == false) {
+        if (theId == null || !theId.hasIdPart()) {
             throw new InvalidRequestException(
                     Msg.code(991) + "No ID supplied. ID is required when validating with mode=DELETE");
         }
@@ -2570,7 +2563,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
                 Msg.code(997) + "Resource has an ID - ID must not be populated for a FHIR create");
         }
     } else if (myMode == ValidationModeEnum.UPDATE) {
-        if (hasId == false) {
+        if (!hasId) {
             throw new UnprocessableEntityException(
                     Msg.code(998) + "Resource has no ID - ID must be populated for a FHIR update");
         }


@@ -39,6 +39,8 @@ import ca.uhn.fhir.rest.param.TokenParam;
 import org.hl7.fhir.instance.model.api.IBaseResource;
 import org.hl7.fhir.instance.model.api.IIdType;
 import org.hl7.fhir.instance.model.api.IPrimitiveType;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.transaction.annotation.Propagation;
 import org.springframework.transaction.annotation.Transactional;
@@ -50,6 +52,8 @@ import javax.servlet.http.HttpServletRequest;
 public class JpaResourceDaoPatient<T extends IBaseResource> extends BaseHapiFhirResourceDao<T>
         implements IFhirResourceDaoPatient<T> {

+    private static final Logger ourLog = LoggerFactory.getLogger(JpaResourceDaoPatient.class);
+
     @Autowired
     private IRequestPartitionHelperSvc myPartitionHelperSvc;
@@ -94,7 +98,7 @@ public class JpaResourceDaoPatient<T extends IBaseResource> extends BaseHapiFhir
     if (theRequest.getParameters().containsKey("_mdm")) {
         String[] paramVal = theRequest.getParameters().get("_mdm");
         if (Arrays.asList(paramVal).contains("true")) {
-            theIds.getValuesAsQueryTokens().stream().forEach(param -> param.setMdmExpand(true));
+            theIds.getValuesAsQueryTokens().forEach(param -> param.setMdmExpand(true));
         }
     }
     paramMap.add("_id", theIds);
@@ -106,6 +110,9 @@ public class JpaResourceDaoPatient<T extends IBaseResource> extends BaseHapiFhir
     RequestPartitionId requestPartitionId = myPartitionHelperSvc.determineReadPartitionForRequestForSearchType(
             theRequest, getResourceName(), paramMap, null);

+    adjustCount(theRequest, paramMap);
+
     return mySearchCoordinatorSvc.registerSearch(
             this,
             paramMap,
@@ -115,6 +122,27 @@
             requestPartitionId);
 }

+private void adjustCount(RequestDetails theRequest, SearchParameterMap theParamMap) {
+    if (theRequest.getServer() == null) {
+        return;
+    }
+
+    if (theParamMap.getCount() == null && theRequest.getServer().getDefaultPageSize() != null) {
+        theParamMap.setCount(theRequest.getServer().getDefaultPageSize());
+        return;
+    }
+
+    Integer maxPageSize = theRequest.getServer().getMaximumPageSize();
+    if (maxPageSize != null && theParamMap.getCount() > maxPageSize) {
+        ourLog.info(
+                "Reducing {} from {} to {} which is the maximum allowable page size.",
+                Constants.PARAM_COUNT,
+                theParamMap.getCount(),
+                maxPageSize);
+        theParamMap.setCount(maxPageSize);
+    }
+}
+
 @Override
 @Transactional(propagation = Propagation.SUPPORTS)
 public IBundleProvider patientInstanceEverything(
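The `adjustCount` helper above applies two rules: fall back to the server's default page size when no `_count` was requested, and clamp a requested `_count` to the server's maximum. That decision logic, isolated into a hypothetical standalone sketch (the real method mutates the `SearchParameterMap` and logs the reduction):

```java
public class PageSizeClampSketch {
    // requestedCount: the _count the client asked for (null if absent).
    // defaultPageSize / maxPageSize: hypothetical stand-ins for the
    // server-configured values; either may be null meaning "not configured".
    static Integer clamp(Integer requestedCount, Integer defaultPageSize, Integer maxPageSize) {
        if (requestedCount == null) {
            return defaultPageSize; // no _count supplied: use the server default
        }
        if (maxPageSize != null && requestedCount > maxPageSize) {
            return maxPageSize; // reduce to the maximum allowable page size
        }
        return requestedCount;
    }

    public static void main(String[] args) {
        System.out.println(clamp(null, 20, 500));   // default applies
        System.out.println(clamp(10_000, 20, 500)); // clamped to max
    }
}
```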


@@ -82,6 +82,15 @@ public interface IMdmLinkJpaRepository
     List<MdmPidTuple> expandPidsBySourcePidAndMatchResult(
             @Param("sourcePid") Long theSourcePid, @Param("matchResult") MdmMatchResultEnum theMdmMatchResultEnum);

+    @Query("SELECT ml " + "FROM MdmLink ml "
+            + "INNER JOIN MdmLink ml2 "
+            + "on ml.myGoldenResourcePid=ml2.myGoldenResourcePid "
+            + "WHERE ml2.mySourcePid=:sourcePid "
+            + "AND ml2.myMatchResult!=:matchResult")
+    List<MdmLink> findLinksAssociatedWithGoldenResourceOfSourceResourceExcludingMatchResult(
+            @Param("sourcePid") Long theSourcePid,
+            @Param("matchResult") MdmMatchResultEnum theMdmMatchResultEnumToExclude);
+
     @Query(
             "SELECT ml.myGoldenResourcePid as goldenPid, ml.mySourcePid as sourcePid FROM MdmLink ml WHERE ml.myGoldenResourcePid = :goldenPid and ml.myMatchResult = :matchResult")
     List<MdmPidTuple> expandPidsByGoldenResourcePidAndMatchResult(
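The new JPQL query self-joins `MdmLink` on the golden resource PID: it finds the golden resources the given source is linked to (through links whose match result is not the excluded one) and returns every link attached to those golden resources. A simplified in-memory model of that join, using a hypothetical `Link` record and plain strings for match results:

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class MdmLinkJoinSketch {
    // Hypothetical in-memory stand-in for an MdmLink row.
    record Link(long goldenPid, long sourcePid, String matchResult) {}

    static List<Link> linksForGoldenOfSource(List<Link> links, long sourcePid, String excludedResult) {
        // Golden resources reached from the source via links whose match
        // result is NOT the excluded one (e.g. NO_MATCH) -- the ml2 side.
        Set<Long> goldenPids = links.stream()
                .filter(l -> l.sourcePid() == sourcePid && !l.matchResult().equals(excludedResult))
                .map(Link::goldenPid)
                .collect(Collectors.toSet());
        // All links pointing at those golden resources -- the ml side.
        return links.stream()
                .filter(l -> goldenPids.contains(l.goldenPid()))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Link> links = List.of(
                new Link(1, 10, "MATCH"),
                new Link(1, 11, "POSSIBLE_MATCH"),
                new Link(2, 10, "NO_MATCH"));
        // Source 10 reaches golden 1 via its MATCH link (its NO_MATCH link to
        // golden 2 is excluded), so both links on golden 1 are returned.
        System.out.println(linksForGoldenOfSource(links, 10, "NO_MATCH").size());
    }
}
```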


@@ -131,6 +131,12 @@ public class MdmLinkDaoJpaImpl implements IMdmLinkDao<JpaPid, MdmLink> {
                 .collect(Collectors.toList());
     }

+    @Override
+    public List<MdmLink> findLinksAssociatedWithGoldenResourceOfSourceResourceExcludingNoMatch(JpaPid theSourcePid) {
+        return myMdmLinkDao.findLinksAssociatedWithGoldenResourceOfSourceResourceExcludingMatchResult(
+                (theSourcePid).getId(), MdmMatchResultEnum.NO_MATCH);
+    }
+
     @Override
     public List<MdmPidTuple<JpaPid>> expandPidsByGoldenResourcePidAndMatchResult(
             JpaPid theSourcePid, MdmMatchResultEnum theMdmMatchResultEnum) {


@ -44,6 +44,7 @@ import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
import ca.uhn.fhir.rest.param.StringParam; import ca.uhn.fhir.rest.param.StringParam;
import ca.uhn.fhir.rest.param.TokenParam; import ca.uhn.fhir.rest.param.TokenParam;
import ca.uhn.fhir.rest.param.UriParam; import ca.uhn.fhir.rest.param.UriParam;
import ca.uhn.fhir.rest.server.exceptions.ResourceVersionConflictException;
import ca.uhn.fhir.util.FhirTerser; import ca.uhn.fhir.util.FhirTerser;
import ca.uhn.fhir.util.SearchParameterUtil; import ca.uhn.fhir.util.SearchParameterUtil;
import com.google.common.annotations.VisibleForTesting; import com.google.common.annotations.VisibleForTesting;
@@ -53,6 +54,7 @@ import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.instance.model.api.IPrimitiveType;
import org.hl7.fhir.r4.model.Identifier;
import org.hl7.fhir.r4.model.MetadataResource;
import org.hl7.fhir.utilities.json.model.JsonObject;
import org.hl7.fhir.utilities.npm.IPackageCacheManager;
import org.hl7.fhir.utilities.npm.NpmPackage;
@@ -342,7 +344,8 @@ public class PackageInstallerSvcImpl implements IPackageInstallerSvc {
/**
* ============================= Utility methods ===============================
*/
@VisibleForTesting
void create(
IBaseResource theResource,
PackageInstallationSpec theInstallationSpec,
PackageInstallOutcomeJson theOutcome) {
@@ -365,8 +368,30 @@ public class PackageInstallerSvcImpl implements IPackageInstallerSvc {
String newIdPart = "npm-" + id.getIdPart();
id.setParts(id.getBaseUrl(), id.getResourceType(), newIdPart, id.getVersionIdPart());
}
try {
updateResource(dao, theResource);
ourLog.info("Created resource with existing id");
} catch (ResourceVersionConflictException exception) {
final Optional<IBaseResource> optResource = readResourceById(dao, id);
final String existingResourceUrlOrNull = optResource
.filter(MetadataResource.class::isInstance)
.map(MetadataResource.class::cast)
.map(MetadataResource::getUrl)
.orElse(null);
final String newResourceUrlOrNull = (theResource instanceof MetadataResource)
? ((MetadataResource) theResource).getUrl()
: null;
ourLog.error(
"Version conflict error: This is possibly due to a collision between ValueSets from different IGs that are coincidentally using the same resource ID: [{}] and new resource URL: [{}], with the existing resource having URL: [{}]. Ignoring this update and continuing: the first IG wins.",
id.getIdPart(),
newResourceUrlOrNull,
existingResourceUrlOrNull,
exception);
}
}
} else {
if (theInstallationSpec.isReloadExisting()) {
@@ -394,6 +419,18 @@ public class PackageInstallerSvcImpl implements IPackageInstallerSvc {
}
}
private Optional<IBaseResource> readResourceById(IFhirResourceDao dao, IIdType id) {
try {
return Optional.ofNullable(dao.read(id.toUnqualifiedVersionless(), newSystemRequestDetails()));
} catch (Exception exception) {
// ignore because we're running this query to help build the log
ourLog.warn("Exception when trying to read resource with ID: {}, message: {}", id, exception.getMessage());
}
return Optional.empty();
}
private IBundleProvider searchResource(IFhirResourceDao theDao, SearchParameterMap theMap) {
return theDao.search(theMap, newSystemRequestDetails());
}
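The try/catch added above implements a first-writer-wins policy for ID collisions during package installation: attempt the update, and on a version conflict log both URLs and carry on. A standalone sketch of that pattern (hypothetical `Store` interface and names, not the HAPI FHIR API):

```java
public class FirstWriterWins {
    static class VersionConflictException extends RuntimeException {}

    // Hypothetical storage abstraction, standing in for the resource DAO
    interface Store {
        void update(String theId, String theUrl);
        String readUrl(String theId); // diagnostic only; may return null
    }

    // Returns true if the update was applied, false if an existing resource won
    static boolean install(Store theStore, String theId, String theNewUrl) {
        try {
            theStore.update(theId, theNewUrl);
            return true;
        } catch (VersionConflictException e) {
            String existingUrl = null;
            try {
                existingUrl = theStore.readUrl(theId); // best effort, log enrichment only
            } catch (RuntimeException ignored) {
                // swallow: the read exists purely to build a better log line
            }
            System.err.printf(
                    "ID collision on [%s]: new URL [%s], existing URL [%s]; first IG wins%n",
                    theId, theNewUrl, existingUrl);
            return false;
        }
    }

    public static void main(String[] args) {
        Store alwaysConflicts = new Store() {
            public void update(String theId, String theUrl) { throw new VersionConflictException(); }
            public String readUrl(String theId) { return "http://example.org/existing"; }
        };
        System.out.println(install(alwaysConflicts, "npm-1", "http://example.org/new")); // prints false
    }
}
```

Note the inner read is deliberately swallowed on failure, mirroring `readResourceById` below: it only exists to enrich the log, so it must never turn a recoverable conflict into a new error.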


@@ -21,7 +21,9 @@ package ca.uhn.fhir.jpa.reindex;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.RuntimeResourceDefinition;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.api.pid.EmptyResourcePidList;
@@ -39,93 +41,121 @@ import ca.uhn.fhir.rest.api.SortOrderEnum;
import ca.uhn.fhir.rest.api.SortSpec;
import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
import ca.uhn.fhir.rest.api.server.storage.IResourcePersistentId;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.util.DateRangeUtil;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Slice;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.stream.Collectors;
import javax.annotation.Nonnull;
import javax.annotation.Nullable;
import static org.apache.commons.collections4.CollectionUtils.isNotEmpty;
public class Batch2DaoSvcImpl implements IBatch2DaoSvc {
private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(Batch2DaoSvcImpl.class);
private final IResourceTableDao myResourceTableDao;
private final MatchUrlService myMatchUrlService;
private final DaoRegistry myDaoRegistry;
private final FhirContext myFhirContext;
private final IHapiTransactionService myTransactionService;
private final JpaStorageSettings myJpaStorageSettings;
@Override
public boolean isAllResourceTypeSupported() {
return true;
}
public Batch2DaoSvcImpl(
IResourceTableDao theResourceTableDao,
MatchUrlService theMatchUrlService,
DaoRegistry theDaoRegistry,
FhirContext theFhirContext,
IHapiTransactionService theTransactionService,
JpaStorageSettings theJpaStorageSettings) {
myResourceTableDao = theResourceTableDao;
myMatchUrlService = theMatchUrlService;
myDaoRegistry = theDaoRegistry;
myFhirContext = theFhirContext;
myTransactionService = theTransactionService;
myJpaStorageSettings = theJpaStorageSettings;
}
@Override
public IResourcePidList fetchResourceIdsPage(
Date theStart, Date theEnd, @Nullable RequestPartitionId theRequestPartitionId, @Nullable String theUrl) {
return myTransactionService
.withSystemRequest()
.withRequestPartitionId(theRequestPartitionId)
.execute(() -> {
if (theUrl == null) {
return fetchResourceIdsPageNoUrl(theStart, theEnd, theRequestPartitionId);
} else {
return fetchResourceIdsPageWithUrl(theEnd, theUrl, theRequestPartitionId);
}
});
}
@Nonnull
private HomogeneousResourcePidList fetchResourceIdsPageWithUrl(
Date theEnd, @Nonnull String theUrl, @Nullable RequestPartitionId theRequestPartitionId) {
if (!theUrl.contains("?")) {
throw new InternalErrorException(Msg.code(2422) + "this should never happen: URL is missing a '?'");
}
final Integer internalSynchronousSearchSize = myJpaStorageSettings.getInternalSynchronousSearchSize();
if (internalSynchronousSearchSize == null || internalSynchronousSearchSize <= 0) {
throw new InternalErrorException(Msg.code(2423)
+ "this should never happen: internalSynchronousSearchSize is null or less than or equal to 0");
}
List<IResourcePersistentId> currentIds = fetchResourceIdsPageWithUrl(0, theUrl, theRequestPartitionId);
ourLog.debug("FIRST currentIds: {}", currentIds.size());
final List<IResourcePersistentId> allIds = new ArrayList<>(currentIds);
while (internalSynchronousSearchSize < currentIds.size()) {
// Ensure the offset is set to the last ID in the cumulative List, otherwise, we'll be stuck in an infinite
// loop here:
currentIds = fetchResourceIdsPageWithUrl(allIds.size(), theUrl, theRequestPartitionId);
ourLog.debug("NEXT currentIds: {}", currentIds.size());
allIds.addAll(currentIds);
}
final String resourceType = theUrl.substring(0, theUrl.indexOf('?'));
return new HomogeneousResourcePidList(resourceType, allIds, theEnd, theRequestPartitionId);
}
private List<IResourcePersistentId> fetchResourceIdsPageWithUrl(
int theOffset, String theUrl, RequestPartitionId theRequestPartitionId) {
String resourceType = theUrl.substring(0, theUrl.indexOf('?'));
RuntimeResourceDefinition def = myFhirContext.getResourceDefinition(resourceType);
SearchParameterMap searchParamMap = myMatchUrlService.translateMatchUrl(theUrl, def);
searchParamMap.setSort(new SortSpec(Constants.PARAM_ID, SortOrderEnum.ASC));
searchParamMap.setOffset(theOffset);
searchParamMap.setLoadSynchronousUpTo(myJpaStorageSettings.getInternalSynchronousSearchSize() + 1);
IFhirResourceDao<?> dao = myDaoRegistry.getResourceDao(resourceType);
SystemRequestDetails request = new SystemRequestDetails();
request.setRequestPartitionId(theRequestPartitionId);
return dao.searchForIds(searchParamMap, request);
}
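The two methods above page through all matching IDs by re-running the search with an offset equal to the cumulative result size, fetching `internalSynchronousSearchSize + 1` rows each time so an over-full batch signals another iteration. A generic sketch of that drain-by-offset pattern (hypothetical fetcher, not the HAPI DAO API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiFunction;

public class OffsetPager {
    // Drains all ids by repeatedly fetching thePageSize + 1 rows at the current
    // offset; a batch larger than thePageSize means another page may exist.
    // theFetcher: (offset, limit) -> ids, a stand-in for dao.searchForIds(...)
    static List<Integer> fetchAll(BiFunction<Integer, Integer, List<Integer>> theFetcher, int thePageSize) {
        List<Integer> current = theFetcher.apply(0, thePageSize + 1);
        List<Integer> all = new ArrayList<>(current);
        while (current.size() > thePageSize) {
            // Offset by the cumulative count, otherwise we would refetch page one forever
            current = theFetcher.apply(all.size(), thePageSize + 1);
            all.addAll(current);
        }
        return all;
    }

    public static void main(String[] args) {
        List<Integer> source = new ArrayList<>();
        for (int i = 0; i < 25; i++) {
            source.add(i);
        }
        BiFunction<Integer, Integer, List<Integer>> fetcher = (offset, limit) ->
                source.subList(Math.min(offset, source.size()), Math.min(offset + limit, source.size()));
        System.out.println(fetchAll(fetcher, 10).size()); // prints 25
    }
}
```

The stable `_id` ascending sort in the real code plays the same role as the deterministic list order here: without a stable order, offset paging can skip or repeat rows between fetches.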
@Nonnull
private IResourcePidList fetchResourceIdsPageNoUrl(
Date theStart, Date theEnd, RequestPartitionId theRequestPartitionId) {
final Pageable page = Pageable.unpaged();
Slice<Object[]> slice;
if (theRequestPartitionId == null || theRequestPartitionId.isAllPartitions()) {
slice = myResourceTableDao.findIdsTypesAndUpdateTimesOfResourcesWithinUpdatedRangeOrderedFromOldest(


@@ -117,6 +117,7 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Optional;
import java.util.Set;
import java.util.function.Supplier;
import java.util.stream.Collectors;
@@ -1854,14 +1855,23 @@ public class QueryStack {
throw new InvalidRequestException(Msg.code(1216) + msg);
}
SourcePredicateBuilder join = createOrReusePredicateBuilder(
PredicateBuilderTypeEnum.SOURCE,
theSourceJoinColumn,
Constants.PARAM_SOURCE,
() -> mySqlBuilder.addSourcePredicateBuilder(theSourceJoinColumn))
.getResult();
List<Condition> orPredicates = new ArrayList<>();
// :missing=true modifier processing requires "LEFT JOIN" with HFJ_RESOURCE table to return correct results
// if both sourceUri and requestId are not populated for the resource
Optional<? extends IQueryParameterType> isMissingSourceOptional = theList.stream()
.filter(nextParameter -> nextParameter.getMissing() != null && nextParameter.getMissing())
.findFirst();
if (isMissingSourceOptional.isPresent()) {
SourcePredicateBuilder join =
getSourcePredicateBuilder(theSourceJoinColumn, SelectQuery.JoinType.LEFT_OUTER);
orPredicates.add(join.createPredicateMissingSourceUri());
return toOrPredicate(orPredicates);
}
// for all other cases we use "INNER JOIN" to match search parameters
SourcePredicateBuilder join = getSourcePredicateBuilder(theSourceJoinColumn, SelectQuery.JoinType.INNER);
for (IQueryParameterType nextParameter : theList) {
SourceParam sourceParameter = new SourceParam(nextParameter.getValueAsQueryToken(myFhirContext));
String sourceUri = sourceParameter.getSourceUri();
@@ -1870,7 +1880,8 @@ public class QueryStack {
orPredicates.add(toAndPredicate(
join.createPredicateSourceUri(sourceUri), join.createPredicateRequestId(requestId)));
} else if (isNotBlank(sourceUri)) {
orPredicates.add(
join.createPredicateSourceUriWithModifiers(nextParameter, myStorageSettings, sourceUri));
} else if (isNotBlank(requestId)) {
orPredicates.add(join.createPredicateRequestId(requestId));
}
@@ -1879,6 +1890,16 @@ public class QueryStack {
return toOrPredicate(orPredicates);
}
private SourcePredicateBuilder getSourcePredicateBuilder(
@Nullable DbColumn theSourceJoinColumn, SelectQuery.JoinType theJoinType) {
return createOrReusePredicateBuilder(
PredicateBuilderTypeEnum.SOURCE,
theSourceJoinColumn,
Constants.PARAM_SOURCE,
() -> mySqlBuilder.addSourcePredicateBuilder(theSourceJoinColumn, theJoinType))
.getResult();
}
public Condition createPredicateString(
@Nullable DbColumn theSourceJoinColumn,
String theResourceName,


@@ -19,13 +19,29 @@
*/
package ca.uhn.fhir.jpa.search.builder.predicate;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
import ca.uhn.fhir.jpa.search.builder.sql.SearchQueryBuilder; import ca.uhn.fhir.jpa.search.builder.sql.SearchQueryBuilder;
import ca.uhn.fhir.jpa.util.QueryParameterUtils;
import ca.uhn.fhir.model.api.IQueryParameterType;
import ca.uhn.fhir.rest.param.UriParam;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.rest.server.exceptions.MethodNotAllowedException;
import ca.uhn.fhir.util.StringUtil;
import ca.uhn.fhir.util.UrlUtil;
import com.healthmarketscience.sqlbuilder.BinaryCondition;
import com.healthmarketscience.sqlbuilder.Condition;
import com.healthmarketscience.sqlbuilder.FunctionCall;
import com.healthmarketscience.sqlbuilder.UnaryCondition;
import com.healthmarketscience.sqlbuilder.dbspec.basic.DbColumn;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.List;
import static ca.uhn.fhir.jpa.search.builder.predicate.StringPredicateBuilder.createLeftAndRightMatchLikeExpression;
import static ca.uhn.fhir.jpa.search.builder.predicate.StringPredicateBuilder.createLeftMatchLikeExpression;
public class SourcePredicateBuilder extends BaseJoiningPredicateBuilder {
private static final Logger ourLog = LoggerFactory.getLogger(SourcePredicateBuilder.class);
@@ -53,6 +69,57 @@ public class SourcePredicateBuilder extends BaseJoiningPredicateBuilder {
return BinaryCondition.equalTo(myColumnSourceUri, generatePlaceholder(theSourceUri));
}
public Condition createPredicateMissingSourceUri() {
return UnaryCondition.isNull(myColumnSourceUri);
}
public Condition createPredicateSourceUriWithModifiers(
IQueryParameterType theQueryParameter, JpaStorageSettings theStorageSetting, String theSourceUri) {
if (theQueryParameter.getMissing() != null && !theQueryParameter.getMissing()) {
return UnaryCondition.isNotNull(myColumnSourceUri);
} else if (theQueryParameter instanceof UriParam && theQueryParameter.getQueryParameterQualifier() != null) {
UriParam uriParam = (UriParam) theQueryParameter;
switch (uriParam.getQualifier()) {
case ABOVE:
return createPredicateSourceAbove(theSourceUri);
case BELOW:
return createPredicateSourceBelow(theSourceUri);
case CONTAINS:
return createPredicateSourceContains(theStorageSetting, theSourceUri);
default:
throw new InvalidRequestException(Msg.code(2418)
+ String.format(
"Unsupported qualifier specified, qualifier=%s",
theQueryParameter.getQueryParameterQualifier()));
}
} else {
return createPredicateSourceUri(theSourceUri);
}
}
private Condition createPredicateSourceAbove(String theSourceUri) {
List<String> aboveUriCandidates = UrlUtil.getAboveUriCandidates(theSourceUri);
List<String> aboveUriPlaceholders = generatePlaceholders(aboveUriCandidates);
return QueryParameterUtils.toEqualToOrInPredicate(myColumnSourceUri, aboveUriPlaceholders);
}
private Condition createPredicateSourceBelow(String theSourceUri) {
String belowLikeExpression = createLeftMatchLikeExpression(theSourceUri);
return BinaryCondition.like(myColumnSourceUri, generatePlaceholder(belowLikeExpression));
}
private Condition createPredicateSourceContains(JpaStorageSettings theStorageSetting, String theSourceUri) {
if (theStorageSetting.isAllowContainsSearches()) {
FunctionCall upperFunction = new FunctionCall("UPPER");
upperFunction.addCustomParams(myColumnSourceUri);
String normalizedString = StringUtil.normalizeStringForSearchIndexing(theSourceUri);
String containsLikeExpression = createLeftAndRightMatchLikeExpression(normalizedString);
return BinaryCondition.like(upperFunction, generatePlaceholder(containsLikeExpression));
} else {
throw new MethodNotAllowedException(Msg.code(2417) + ":contains modifier is disabled on this server");
}
}
public Condition createPredicateRequestId(String theRequestId) {
return BinaryCondition.equalTo(myColumnRequestId, generatePlaceholder(theRequestId));
}
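`createPredicateSourceAbove` above relies on `UrlUtil.getAboveUriCandidates` to expand one URI into the set of its ancestors, which are then bound as placeholders in an equal-to-or-IN predicate. A rough sketch of how such candidate generation can work (my own implementation under assumed semantics, successively stripping path segments down to the authority, and not necessarily HAPI's exact behavior):

```java
import java.util.ArrayList;
import java.util.List;

public class AboveUriCandidates {
    // Builds the list of URIs "above" the given one by stripping trailing
    // path segments until only scheme://authority remains.
    static List<String> candidates(String theUri) {
        List<String> result = new ArrayList<>();
        String current = theUri;
        result.add(current);
        int schemeEnd = current.indexOf("://") + 3; // start of the authority part
        while (current.lastIndexOf('/') > schemeEnd) {
            current = current.substring(0, current.lastIndexOf('/'));
            result.add(current);
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(candidates("http://example.com/a/b"));
        // [http://example.com/a/b, http://example.com/a, http://example.com]
    }
}
```

Precomputing the finite candidate set lets `:above` stay an exact-match (indexable) comparison, whereas `:below` and `:contains` above fall back to `LIKE` expressions.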


@@ -288,9 +288,10 @@ public class SearchQueryBuilder {
/**
* Add and return a predicate builder (or a root query if no root query exists yet) for selecting on a <code>_source</code> search parameter
*/
public SourcePredicateBuilder addSourcePredicateBuilder(
@Nullable DbColumn theSourceJoinColumn, SelectQuery.JoinType theJoinType) {
SourcePredicateBuilder retVal = mySqlBuilderFactory.newSourcePredicateBuilder(this);
addTable(retVal, theSourceJoinColumn, theJoinType);
return retVal;
}


@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -25,6 +25,7 @@ import ca.uhn.fhir.jpa.model.entity.ResourceTable;
import ca.uhn.fhir.jpa.model.search.StorageProcessingMessage;
import ca.uhn.fhir.jpa.search.CompositeSearchParameterTestCases;
import ca.uhn.fhir.jpa.search.QuantitySearchParameterTestCases;
import ca.uhn.fhir.jpa.search.BaseSourceSearchParameterTestCases;
import ca.uhn.fhir.jpa.search.reindex.IResourceReindexingSvc;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.jpa.sp.ISearchParamPresenceSvc;
@@ -88,7 +89,6 @@ import org.junit.jupiter.params.provider.EnumSource;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.junit.jupiter.MockitoExtension;
import org.springframework.aop.support.Pointcuts;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.test.annotation.DirtiesContext;
@@ -113,7 +113,6 @@ import java.util.Date;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.function.Consumer;
import java.util.stream.Collectors;
import static ca.uhn.fhir.jpa.model.util.UcumServiceUtil.UCUM_CODESYSTEM_URL;
@@ -124,10 +123,6 @@ import static org.hamcrest.Matchers.containsInAnyOrder;
import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.empty;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.equalToCompressingWhiteSpace;
import static org.hamcrest.Matchers.equalToIgnoringWhiteSpace;
import static org.hamcrest.Matchers.greaterThan;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.hasItem;
import static org.hamcrest.Matchers.hasItems;
import static org.hamcrest.Matchers.hasSize;
@@ -1565,7 +1560,7 @@ public class FhirResourceDaoR4SearchWithElasticSearchIT extends BaseJpaTest impl
public void tagSourceSearch() {
String id = myTestDataBuilder.createObservation(List.of(
myTestDataBuilder.withObservationCode("http://example.com/", "theCode"),
myTestDataBuilder.withSource("http://example.com/theSource"))).getIdPart();
myCaptureQueriesListener.clear();
List<String> allIds = myTestDaoSearch.searchForIds("/Observation?_source=http://example.com/theSource");
@@ -2365,6 +2360,18 @@
}
}
@Nested
class SourceSearchParameterTestCases extends BaseSourceSearchParameterTestCases {
SourceSearchParameterTestCases() {
super(myTestDataBuilder.getTestDataBuilderSupport(), myTestDaoSearch, myStorageSettings);
}
@Override
protected boolean isRequestIdSupported() {
return false;
}
}
/**
* Disallow context dirtying for nested classes
*/


@@ -3,7 +3,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>
@@ -32,6 +32,12 @@
<version>${project.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.tngtech.archunit</groupId>
<artifactId>archunit-junit5</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<profiles>


@@ -20,6 +20,7 @@
package ca.uhn.fhir.jpa.fql.jdbc;
import ca.uhn.fhir.jpa.fql.executor.IHfqlExecutionResult;
import ca.uhn.fhir.jpa.fql.util.HfqlConstants;
import ca.uhn.fhir.rest.client.impl.HttpBasicAuthInterceptor;
import ca.uhn.fhir.util.IoUtil;
import org.apache.commons.csv.CSVFormat;
@@ -27,10 +28,14 @@ import org.apache.commons.lang3.Validate;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
import org.hl7.fhir.r4.model.CodeType;
import org.hl7.fhir.r4.model.IntegerType;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.StringType;
import java.sql.SQLException;
import java.util.concurrent.TimeUnit;
import javax.annotation.Nonnull;
import static ca.uhn.fhir.jpa.fql.util.HfqlConstants.DEFAULT_FETCH_SIZE;
import static org.apache.commons.lang3.ObjectUtils.defaultIfNull;
@@ -67,6 +72,18 @@ public class HfqlRestClient {
myClient = httpClientBuilder.build();
}
@Nonnull
public static Parameters newQueryRequestParameters(String sql, Integer limit, int fetchSize) {
Parameters input = new Parameters();
input.addParameter(HfqlConstants.PARAM_ACTION, new CodeType(HfqlConstants.PARAM_ACTION_SEARCH));
input.addParameter(HfqlConstants.PARAM_QUERY, new StringType(sql));
if (limit != null) {
input.addParameter(HfqlConstants.PARAM_LIMIT, new IntegerType(limit));
}
input.addParameter(HfqlConstants.PARAM_FETCH_SIZE, new IntegerType(fetchSize));
return input;
}
public IHfqlExecutionResult execute(
Parameters theRequestParameters, boolean theSupportsContinuations, Integer theFetchSize)
throws SQLException {


@@ -20,7 +20,6 @@
package ca.uhn.fhir.jpa.fql.jdbc;
import ca.uhn.fhir.jpa.fql.executor.IHfqlExecutionResult;
import ca.uhn.fhir.jpa.fql.provider.HfqlRestProvider;
import ca.uhn.fhir.jpa.fql.util.HfqlConstants;
import org.hl7.fhir.r4.model.Parameters;
@@ -43,8 +42,8 @@ class JdbcStatement implements Statement {
}
@Override
public ResultSet executeQuery(String theSqlText) throws SQLException {
execute(theSqlText);
return getResultSet();
}
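The renamed `executeQuery` above keeps the standard JDBC shape: it is a thin wrapper over `execute` plus `getResultSet`. A minimal sketch of that delegation (simplified stand-in types; this does not implement the full `java.sql.Statement` API):

```java
import java.util.List;

public class MiniStatement {
    private List<String> myResultSet;

    // Runs the statement and stores its result for later retrieval.
    // Returning true signals "there is a result set", as in java.sql.Statement
    public boolean execute(String theSqlText) {
        myResultSet = List.of("row for: " + theSqlText);
        return true;
    }

    public List<String> getResultSet() {
        return myResultSet;
    }

    // executeQuery is just execute followed by getResultSet
    public List<String> executeQuery(String theSqlText) {
        execute(theSqlText);
        return getResultSet();
    }

    public static void main(String[] args) {
        System.out.println(new MiniStatement().executeQuery("SELECT name FROM Patient"));
    }
}
```

Keeping all statement execution on one `execute` path means the paging and fetch-size logic further down only has to exist once.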
@@ -122,7 +121,7 @@ class JdbcStatement implements Statement {
int fetchSize = myFetchSize;
Parameters input = HfqlRestClient.newQueryRequestParameters(sql, limit, fetchSize);
IHfqlExecutionResult result = myConnection.getClient().execute(input, true, getFetchSize());
myResultSet = new JdbcResultSet(result, this);


@@ -32,15 +32,10 @@ import ca.uhn.fhir.util.ValidateUtil;
import ca.uhn.fhir.util.VersionUtil;
import org.apache.commons.csv.CSVPrinter;
import org.hl7.fhir.instance.model.api.IPrimitiveType;
import org.hl7.fhir.r4.model.CodeType;
import org.hl7.fhir.r4.model.IntegerType;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.StringType;
import org.springframework.beans.factory.annotation.Autowired;
import java.io.IOException;
import java.io.OutputStreamWriter;
import javax.annotation.Nonnull;
import javax.annotation.Nullable;
import javax.servlet.ServletOutputStream;
import javax.servlet.http.HttpServletResponse;
@@ -124,6 +119,7 @@ public class HfqlRestProvider {
String continuation = toStringValue(theContinuation);
ValidateUtil.isTrueOrThrowInvalidRequest(
theOffset != null && theOffset.hasValue(), "No offset supplied");
@SuppressWarnings("java:S2259") // Sonar doesn't understand the above
int startingOffset = theOffset.getValue();
String statement = DatatypeUtil.toStringValue(theStatement);
@@ -147,6 +143,9 @@ public class HfqlRestProvider {
streamResponseCsv(theServletResponse, fetchSize, outcome, true, outcome.getStatement());
break;
}
+default:
+//noinspection DataFlowIssue
+ValidateUtil.isTrueOrThrowInvalidRequest(false, "Unrecognized action: %s", action);
}
}
@@ -186,41 +185,29 @@ public class HfqlRestProvider {
theServletResponse.setContentType(CT_TEXT_CSV + CHARSET_UTF8_CTSUFFIX);
try (ServletOutputStream outputStream = theServletResponse.getOutputStream()) {
Appendable out = new OutputStreamWriter(outputStream);
-CSVPrinter csvWriter = new CSVPrinter(out, CSV_FORMAT);
+try (CSVPrinter csvWriter = new CSVPrinter(out, CSV_FORMAT)) {
csvWriter.printRecords();
// Protocol version
csvWriter.printRecord(HfqlConstants.PROTOCOL_VERSION, "HAPI FHIR " + VersionUtil.getVersion());
// Search ID, Limit, Parsed FQL Statement
String searchId = theResult.getSearchId();
String parsedFqlStatement = "";
if (theInitialPage && theStatement != null) {
parsedFqlStatement = JsonUtil.serialize(theStatement, false);
}
csvWriter.printRecord(searchId, theResult.getLimit(), parsedFqlStatement);
// Print the rows
int recordCount = 0;
while (recordCount++ < theFetchSize && theResult.hasNext()) {
IHfqlExecutionResult.Row nextRow = theResult.getNextRow();
csvWriter.print(nextRow.getRowOffset());
csvWriter.printRecord(nextRow.getRowValues());
}
-csvWriter.close(true);
+csvWriter.flush();
+}
}
}

-@Nonnull
-public static Parameters newQueryRequestParameters(String sql, Integer limit, int fetchSize) {
-Parameters input = new Parameters();
-input.addParameter(HfqlConstants.PARAM_ACTION, new CodeType(HfqlConstants.PARAM_ACTION_SEARCH));
-input.addParameter(HfqlConstants.PARAM_QUERY, new StringType(sql));
-if (limit != null) {
-input.addParameter(HfqlConstants.PARAM_LIMIT, new IntegerType(limit));
-}
-input.addParameter(HfqlConstants.PARAM_FETCH_SIZE, new IntegerType(fetchSize));
-return input;
-}
}
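The CSV wire format streamed above frames its payload as: one record carrying the protocol version plus a server description, one record carrying the search ID / limit / serialized statement, then one record per result row prefixed with its row offset. That framing can be parsed with nothing but the standard library. This is an illustrative sketch under simplifying assumptions (the class and method names are invented here, and real responses are written by commons-csv, so quoting and escaping are ignored), not the real HfqlRestClient:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the HFQL CSV response framing -- not HAPI code.
class HfqlCsvFrameSketch {
	final String protocolVersion; // record 1, column 1
	final String searchId; // record 2, column 1
	final List<String[]> rows = new ArrayList<>(); // row offset + row values

	private HfqlCsvFrameSketch(String theProtocolVersion, String theSearchId) {
		protocolVersion = theProtocolVersion;
		searchId = theSearchId;
	}

	static HfqlCsvFrameSketch parse(String theCsvBody) {
		String[] records = theCsvBody.split("\r?\n");
		// Record 1: protocol version plus a server description
		String protocolVersion = records[0].split(",", -1)[0];
		// Record 2: search ID, limit, and (on the first page) the serialized statement
		String searchId = records[1].split(",", -1)[0];
		HfqlCsvFrameSketch parsed = new HfqlCsvFrameSketch(protocolVersion, searchId);
		// Every following record is a row offset followed by the row values
		for (int i = 2; i < records.length; i++) {
			parsed.rows.add(records[i].split(",", -1));
		}
		return parsed;
	}
}
```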


@@ -0,0 +1,27 @@
package ca.uhn.fhir.jpa.fql.jdbc;

import com.tngtech.archunit.core.domain.JavaClasses;
import com.tngtech.archunit.core.importer.ImportOption;
import com.tngtech.archunit.junit.AnalyzeClasses;
import com.tngtech.archunit.junit.ArchTest;
import com.tngtech.archunit.lang.syntax.ArchRuleDefinition;

@AnalyzeClasses(
packages = "ca.uhn.fhir.jpa.fql..",
importOptions = {ImportOption.DoNotIncludeTests.class})
public class ArchitectureTest {

/**
 * This project has a "provided" dependency on javax.servlet, but the packaged jdbc driver doesn't bundle it.
 */
@ArchTest
void verifyNoDepsOnProvidedServlet(JavaClasses theJavaClasses) {
ArchRuleDefinition.noClasses().that().resideInAPackage("ca.uhn.fhir.jpa.fql.jdbc")
.should().transitivelyDependOnClassesThat().resideInAPackage("javax.servlet")
.check(theJavaClasses);
}
}


@@ -3,7 +3,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
-<version>6.9.2-SNAPSHOT</version>
+<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
-<version>6.9.2-SNAPSHOT</version>
+<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -1,3 +1,22 @@
+/*-
+ * #%L
+ * HAPI FHIR JPA Server - Master Data Management
+ * %%
+ * Copyright (C) 2014 - 2023 Smile CDR, Inc.
+ * %%
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ * #L%
+ */
package ca.uhn.fhir.jpa.mdm.models;

import ca.uhn.fhir.mdm.model.MdmTransactionContext;


@@ -82,7 +82,6 @@ public class GoldenResourceSearchSvcImpl implements IGoldenResourceSearchSvc {
DateRangeParam chunkDateRange =
DateRangeUtil.narrowDateRange(searchParamMap.getLastUpdated(), theStart, theEnd);
searchParamMap.setLastUpdated(chunkDateRange);
-searchParamMap.setCount(thePageSize); // request this many pids
searchParamMap.add(
"_tag", new TokenParam(MdmConstants.SYSTEM_GOLDEN_RECORD_STATUS, MdmConstants.CODE_GOLDEN_RECORD));


@@ -11,6 +11,7 @@ import ca.uhn.fhir.mdm.api.MdmMatchResultEnum;
import ca.uhn.fhir.mdm.interceptor.IMdmStorageInterceptor;
import ca.uhn.fhir.model.primitive.IdDt;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
+import ca.uhn.fhir.rest.server.exceptions.PreconditionFailedException;
import org.hl7.fhir.r4.model.Patient;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;

@@ -62,7 +63,7 @@ public class MdmExpungeTest extends BaseMdmR4Test {
try {
myPatientDao.expunge(myTargetId.toVersionless(), expungeOptions, null);
fail();
-} catch (InternalErrorException e) {
+} catch (PreconditionFailedException e) {
assertThat(e.getMessage(), containsString("ViolationException"));
assertThat(e.getMessage(), containsString("FK_EMPI_LINK_TARGET"));
}


@@ -9,12 +9,21 @@ import ca.uhn.fhir.jpa.mdm.helper.MdmHelperConfig;
import ca.uhn.fhir.jpa.mdm.helper.MdmHelperR4;
import ca.uhn.fhir.jpa.model.dao.JpaPid;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
+import ca.uhn.fhir.mdm.api.IMdmLinkCreateSvc;
+import ca.uhn.fhir.mdm.api.IMdmLinkUpdaterSvc;
+import ca.uhn.fhir.mdm.api.MdmMatchResultEnum;
import ca.uhn.fhir.mdm.model.CanonicalEID;
+import ca.uhn.fhir.mdm.model.MdmCreateOrUpdateParams;
+import ca.uhn.fhir.mdm.model.MdmTransactionContext;
import ca.uhn.fhir.mdm.rules.config.MdmSettings;
+import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.server.IBundleProvider;
+import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
import ca.uhn.fhir.rest.param.ReferenceParam;
import ca.uhn.fhir.rest.server.TransactionLogMessages;
import ca.uhn.fhir.rest.server.exceptions.ForbiddenOperationException;
+import ca.uhn.fhir.rest.server.exceptions.ResourceGoneException;
+import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
import org.hl7.fhir.instance.model.api.IAnyResource;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;

@@ -46,6 +55,7 @@ import static org.hamcrest.Matchers.hasSize;
import static org.hamcrest.Matchers.nullValue;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
+import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.junit.jupiter.api.Assertions.fail;
import static org.slf4j.LoggerFactory.getLogger;

@@ -59,6 +69,10 @@ public class MdmStorageInterceptorIT extends BaseMdmR4Test {
public MdmHelperR4 myMdmHelper;
@Autowired
private IIdHelperService<JpaPid> myIdHelperService;
+@Autowired
+private IMdmLinkUpdaterSvc myMdmLinkUpdaterSvc;
+@Autowired
+private IMdmLinkCreateSvc myMdmCreateSvc;
@Override @Override
@@ -93,6 +107,194 @@ public class MdmStorageInterceptorIT extends BaseMdmR4Test {
assertLinkCount(0);
}
@Test
public void testGoldenResourceDeleted_whenOnlyMatchedResourceDeleted() throws InterruptedException {
// Given
Patient paulPatient = buildPaulPatient();
myMdmHelper.createWithLatch(paulPatient);
assertLinkCount(1);
Patient goldenPatient = getOnlyGoldenPatient();
// When
myPatientDao.delete(paulPatient.getIdElement());
// Then
List<IBaseResource> resources = myPatientDao.search(new SearchParameterMap(), SystemRequestDetails.forAllPartitions()).getAllResources();
assertTrue(resources.isEmpty());
assertLinkCount(0);
try {
myPatientDao.read(goldenPatient.getIdElement().toVersionless());
fail();
} catch (ResourceNotFoundException e) {
assertEquals(Constants.STATUS_HTTP_404_NOT_FOUND, e.getStatusCode());
}
}
@Test
public void testGoldenResourceDeleted_andNewGoldenCreated_whenOnlyMatchDeletedButPossibleMatchExists() throws InterruptedException {
// Given
Patient paulPatient = buildPaulPatient();
paulPatient.setActive(true);
myMdmHelper.createWithLatch(paulPatient);
Patient paulPatientPossibleMatch = buildPaulPatient();
paulPatientPossibleMatch.getNameFirstRep().setFamily("DifferentName");
myMdmHelper.createWithLatch(paulPatientPossibleMatch);
assertLinksMatchResult(MdmMatchResultEnum.MATCH, MdmMatchResultEnum.POSSIBLE_MATCH);
// When
myPatientDao.delete(paulPatient.getIdElement());
// Then
List<IBaseResource> resources = myPatientDao.search(new SearchParameterMap(), SystemRequestDetails.forAllPartitions()).getAllResources();
assertEquals(2, resources.size());
assertLinksMatchResult(MdmMatchResultEnum.MATCH);
}
@Test
public void testGoldenResourceDeleted_andNewGoldenCreated_whenOnlyMatchDeletedButMultiplePossibleMatchesExist() throws InterruptedException {
// Given
Patient paulPatient = buildPaulPatient();
paulPatient.setActive(true);
myMdmHelper.createWithLatch(paulPatient);
Patient paulPatientPossibleMatch = buildPaulPatient();
paulPatientPossibleMatch.setActive(true);
paulPatientPossibleMatch.getNameFirstRep().setFamily("DifferentName");
myMdmHelper.createWithLatch(paulPatientPossibleMatch);
Patient paulPatientPossibleMatch2 = buildPaulPatient();
paulPatientPossibleMatch2.setActive(true);
paulPatientPossibleMatch2.getNameFirstRep().setFamily("AnotherPerson");
myMdmHelper.createWithLatch(paulPatientPossibleMatch2);
assertLinksMatchResult(MdmMatchResultEnum.MATCH, MdmMatchResultEnum.POSSIBLE_MATCH, MdmMatchResultEnum.POSSIBLE_MATCH);
// When
myPatientDao.delete(paulPatient.getIdElement());
// Then
List<IBaseResource> resources = myPatientDao.search(new SearchParameterMap(), SystemRequestDetails.forAllPartitions()).getAllResources();
assertEquals(3, resources.size());
assertLinksMatchResult(MdmMatchResultEnum.MATCH, MdmMatchResultEnum.POSSIBLE_MATCH);
}
@Test
public void testDeleteSourceResource_whereGoldenResourceIsPossibleDuplicate() throws InterruptedException {
// Given
Patient paulPatient = buildPaulPatient();
paulPatient.setActive(true);
myMdmHelper.createWithLatch(paulPatient);
Patient paulPatientPossibleMatch = buildPaulPatient();
paulPatientPossibleMatch.setActive(true);
paulPatientPossibleMatch.getNameFirstRep().setFamily("DifferentName");
myMdmHelper.createWithLatch(paulPatientPossibleMatch);
MdmCreateOrUpdateParams params = new MdmCreateOrUpdateParams();
params.setMdmContext(getPatientUpdateLinkContext());
params.setGoldenResource(getOnlyGoldenPatient());
params.setSourceResource(paulPatientPossibleMatch);
params.setMatchResult(MdmMatchResultEnum.NO_MATCH);
myMdmLinkUpdaterSvc.updateLink(params);
Patient paulPatientPossibleMatch2 = buildPaulPatient();
paulPatientPossibleMatch2.setActive(true);
paulPatientPossibleMatch2.getNameFirstRep().setFamily("AnotherPerson");
myMdmHelper.createWithLatch(paulPatientPossibleMatch2);
assertLinkCount(6);
// When
myPatientDao.delete(paulPatient.getIdElement());
// Then
/* Paul 1 MATCH to GR1 --> DELETED
Paul 2 NO_MATCH to GR1 --> DELETED
Paul 2 MATCH to GR2 --> KEPT
Paul 3 POSSIBLE_MATCH to GR1 --> DELETED
Paul 3 POSSIBLE_MATCH to GR2 --> KEPT
GR1 POSSIBLE_DUPLICATE GR2 --> DELETED */
List<IBaseResource> resources = myPatientDao.search(new SearchParameterMap(), SystemRequestDetails.forAllPartitions()).getAllResources();
assertEquals(3, resources.size());
assertLinksMatchResult(MdmMatchResultEnum.MATCH, MdmMatchResultEnum.POSSIBLE_MATCH);
}
@Test
public void testDeleteSourceResource_withNoMatchLink_whereGoldenResourceIsPossibleDuplicate() throws InterruptedException {
// Given
Patient paulPatient = buildPaulPatient();
paulPatient.setActive(true);
myMdmHelper.createWithLatch(paulPatient);
Patient paulPatientPossibleMatch = buildPaulPatient();
paulPatientPossibleMatch.setActive(true);
paulPatientPossibleMatch.getNameFirstRep().setFamily("DifferentName");
myMdmHelper.createWithLatch(paulPatientPossibleMatch);
MdmCreateOrUpdateParams params = new MdmCreateOrUpdateParams();
params.setGoldenResource(getOnlyGoldenPatient());
params.setSourceResource(paulPatientPossibleMatch);
params.setMdmContext(getPatientUpdateLinkContext());
params.setMatchResult(MdmMatchResultEnum.NO_MATCH);
myMdmLinkUpdaterSvc.updateLink(params);
Patient paulPatientPossibleMatch2 = buildPaulPatient();
paulPatientPossibleMatch2.setActive(true);
paulPatientPossibleMatch2.getNameFirstRep().setFamily("AnotherPerson");
myMdmHelper.createWithLatch(paulPatientPossibleMatch2);
assertLinkCount(6);
// When
myPatientDao.delete(paulPatientPossibleMatch.getIdElement());
// Then
/* Paul 1 MATCH to GR1 --> DELETED
Paul 2 NO_MATCH to GR1 --> DELETED
Paul 2 MATCH to GR2 --> KEPT
Paul 3 POSSIBLE_MATCH to GR1 --> DELETED
Paul 3 POSSIBLE_MATCH to GR2 --> KEPT
GR1 POSSIBLE_DUPLICATE GR2 --> DELETED */
List<IBaseResource> resources = myPatientDao.search(new SearchParameterMap(), SystemRequestDetails.forAllPartitions()).getAllResources();
assertEquals(3, resources.size());
assertLinksMatchResult(MdmMatchResultEnum.MATCH, MdmMatchResultEnum.POSSIBLE_MATCH);
}
@Test
public void testGoldenResourceKept_whenAutoDeleteDisabled() throws InterruptedException {
// Given
myMdmSettings.setAutoExpungeGoldenResources(false);
Patient paulPatient = buildPaulPatient();
myMdmHelper.createWithLatch(paulPatient);
assertLinkCount(1);
Patient goldenPatient = getOnlyGoldenPatient();
// When
myPatientDao.delete(paulPatient.getIdElement());
// Then
try {
myPatientDao.read(goldenPatient.getIdElement().toVersionless());
fail();
} catch (ResourceGoneException e) {
assertLinkCount(0);
} finally {
myMdmSettings.setAutoExpungeGoldenResources(true);
}
}
private MdmTransactionContext getPatientUpdateLinkContext() {
MdmTransactionContext ctx = new MdmTransactionContext();
ctx.setRestOperation(MdmTransactionContext.OperationType.UPDATE_LINK);
ctx.setResourceType("Patient");
return ctx;
}
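The deletion behavior these new tests assert can be modeled in a few lines. This is a toy model, not HAPI code: the `Link` record, `deleteSource`, and the result strings are all invented here, and it reproduces only the rule spelled out in the Given/When/Then comments above (links owned by a deleted source are removed, and a golden resource left without any MATCH link is deleted together with every link that still references it, including POSSIBLE_DUPLICATE links between goldens):

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

// Toy model of the MDM link cleanup asserted by the tests above -- not HAPI code.
class MdmDeleteSketch {
	record Link(String source, String golden, String result) {}

	static List<Link> deleteSource(List<Link> theLinks, String theDeleted) {
		// Links owned by the deleted source resource go away first
		List<Link> remaining = theLinks.stream()
				.filter(l -> !l.source().equals(theDeleted))
				.collect(Collectors.toList());
		// A golden resource with no surviving MATCH link is deleted as well...
		Set<String> deadGoldens = theLinks.stream()
				.map(Link::golden)
				.filter(g -> remaining.stream()
						.noneMatch(l -> l.golden().equals(g) && l.result().equals("MATCH")))
				.collect(Collectors.toSet());
		// ...taking with it every link that references it on either end,
		// including POSSIBLE_DUPLICATE links between two golden resources
		return remaining.stream()
				.filter(l -> !deadGoldens.contains(l.golden()) && !deadGoldens.contains(l.source()))
				.collect(Collectors.toList());
	}
}
```

Feeding in the six links from the Then-comment and deleting Paul 1 leaves only the two links pointing at GR2, matching the test's assertions.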
@Test
public void testCreatePatientWithMdmTagForbidden() throws InterruptedException {
//Creating a golden resource with the MDM-MANAGED tag should fail


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
-<version>6.9.2-SNAPSHOT</version>
+<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -83,7 +83,7 @@ public class JpaConstants {
 * Header name for the "X-Meta-Snapshot-Mode" header, which
 * specifies that properties in meta (tags, profiles, security labels)
 * should be treated as a snapshot, meaning that these things will
- * be removed if they are nt explicitly included in updates
+ * be removed if they are not explicitly included in updates
 */
public static final String HEADER_META_SNAPSHOT_MODE = "X-Meta-Snapshot-Mode";
/**


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
-<version>6.9.2-SNAPSHOT</version>
+<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -51,6 +51,7 @@ import ca.uhn.fhir.rest.server.util.ISearchParamRegistry;
import ca.uhn.fhir.util.MetaUtil;
import ca.uhn.fhir.util.UrlUtil;
import com.google.common.collect.Sets;
+import org.apache.commons.lang3.StringUtils;
import org.apache.commons.lang3.Validate;
import org.hl7.fhir.dstu3.model.Location;
import org.hl7.fhir.instance.model.api.IAnyResource;
@@ -356,7 +357,7 @@ public class InMemoryResourceMatcher {
SourceParam resourceSource = new SourceParam(MetaUtil.getSource(myFhirContext, theResource.getMeta()));
boolean matches = true;
if (paramSource.getSourceUri() != null) {
-matches = paramSource.getSourceUri().equals(resourceSource.getSourceUri());
+matches = matchSourceWithModifiers(theSourceParam, paramSource, resourceSource.getSourceUri());
}
if (paramSource.getRequestId() != null) {
matches &= paramSource.getRequestId().equals(resourceSource.getRequestId());

@@ -364,6 +365,33 @@
return matches;
}
private boolean matchSourceWithModifiers(
IQueryParameterType parameterType, SourceParam paramSource, String theSourceUri) {
// process :missing modifier
if (parameterType.getMissing() != null) {
return parameterType.getMissing() == StringUtils.isBlank(theSourceUri);
}
// process :above, :below, :contains modifiers
if (parameterType instanceof UriParam && ((UriParam) parameterType).getQualifier() != null) {
UriParam uriParam = ((UriParam) parameterType);
switch (uriParam.getQualifier()) {
case ABOVE:
return UrlUtil.getAboveUriCandidates(paramSource.getSourceUri()).stream()
.anyMatch(candidate -> candidate.equals(theSourceUri));
case BELOW:
return theSourceUri.startsWith(paramSource.getSourceUri());
case CONTAINS:
return StringUtils.containsIgnoreCase(theSourceUri, paramSource.getSourceUri());
default:
// Unsupported modifier specified - no match
return false;
}
} else {
// no modifiers specified - use equals operator
return paramSource.getSourceUri().equals(theSourceUri);
}
}
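Stripped of the HAPI helpers (UriParam, UrlUtil.getAboveUriCandidates), the three modifier semantics added above reduce to plain string checks. A standalone sketch, with all names invented here and `:above` approximated by walking the query URI's path ancestors, which is effectively what the getAboveUriCandidates call compares against:

```java
// Standalone sketch of the _source modifier semantics -- not HAPI code.
class SourceModifierSketch {
	// :above matches when the stored source URI equals the query URI or one
	// of its path ancestors (http://a/b/c -> http://a/b -> http://a)
	static boolean matchesAbove(String theStoredSourceUri, String theQueryUri) {
		String candidate = theQueryUri;
		while (true) {
			if (candidate.equals(theStoredSourceUri)) {
				return true;
			}
			int lastSlash = candidate.lastIndexOf('/');
			if (lastSlash <= "https://".length()) {
				return false; // don't strip past the scheme/authority
			}
			candidate = candidate.substring(0, lastSlash);
		}
	}

	// :below matches when the stored source URI starts with the query URI
	static boolean matchesBelow(String theStoredSourceUri, String theQueryUri) {
		return theStoredSourceUri.startsWith(theQueryUri);
	}

	// :contains is a case-insensitive substring check
	static boolean matchesContains(String theStoredSourceUri, String theQueryUri) {
		return theStoredSourceUri.toLowerCase().contains(theQueryUri.toLowerCase());
	}
}
```

The behavior lines up with the `@CsvSource` cases in the parameterized test elsewhere in this diff, e.g. a stored source of `http://host.com/v1/v2` matches `_source:above=http://host.com/v1/v2/v3` but not `_source:above=http://host.com/v1`.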
private boolean matchTagsOrSecurityAndOr(
List<List<IQueryParameterType>> theAndOrParams, IBaseResource theResource, boolean theTag) {
if (theResource == null) {


@@ -6,6 +6,7 @@ import ca.uhn.fhir.context.support.IValidationSupport;
import ca.uhn.fhir.jpa.model.config.PartitionSettings;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamDate;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamToken;
+import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamUri;
import ca.uhn.fhir.jpa.model.entity.StorageSettings;
import ca.uhn.fhir.jpa.searchparam.MatchUrlService;
import ca.uhn.fhir.jpa.searchparam.extractor.ResourceIndexedSearchParams;

@@ -26,6 +27,8 @@ import org.hl7.fhir.r5.model.Observation;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.CsvSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.mock.mockito.MockBean;
import org.springframework.context.annotation.Bean;

@@ -37,6 +40,8 @@ import java.time.Duration;
import java.time.Instant;
import java.util.Date;

+import static org.hamcrest.CoreMatchers.is;
+import static org.hamcrest.MatcherAssert.assertThat;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;

@@ -141,6 +146,33 @@ public class InMemoryResourceMatcherR5Test {
}
}
@ParameterizedTest
@CsvSource({
"http://host.com/v1/v2, _source:contains=HOST.com/v1, true",
"http://host.com/v1/v2, _source:contains=http://host.com/v1/v2, true",
"http://host.com/v1/v2, _source:contains=anotherHost.com, false",
"http://host.com/v1/v2, _source:above=http://host.com/v1/v2/v3, true",
"http://host.com/v1/v2, _source:above=http://host.com/v1/v2, true",
"http://host.com, _source:above=http://host.com/v1/v2, true",
"http://host.com/v1/v2, _source:above=http://host.com/v1, false",
"http://host.com/v1/v2, _source:below=http://host.com/v1, true",
"http://host.com/v1/v2, _source:below=http://host.com/v1/v2, true",
"http://host.com/v1/v2, _source:below=http://host.com/v1/v2/v3, false",
" , _source:missing=true, true",
"http://host.com/v1/v2, _source:missing=true, false",
"http://host.com/v1/v2, _source:missing=false, true",
" , _source:missing=false, false"
})
public void testMatch_sourceWithModifiers_matchesSuccessfully(String theSourceValue, String theSearchCriteria, boolean theShouldMatch) {
myObservation.getMeta().setSource(theSourceValue);
ResourceIndexedSearchParams searchParams = new ResourceIndexedSearchParams();
searchParams.myUriParams.add(extractSourceUriParam(myObservation));
InMemoryMatchResult resultInsidePeriod = myInMemoryResourceMatcher.match(theSearchCriteria, myObservation, searchParams, newRequest());
assertThat(resultInsidePeriod.matched(), is(theShouldMatch));
}
@Test
public void testUnsupportedChained() {
InMemoryMatchResult result = myInMemoryResourceMatcher.match("encounter.class=FOO", myObservation, mySearchParams, newRequest());

@@ -393,6 +425,11 @@
return new ResourceIndexedSearchParamToken(new PartitionSettings(), "Observation", "code", coding.getSystem(), coding.getCode());
}
private ResourceIndexedSearchParamUri extractSourceUriParam(Observation theObservation) {
String source = theObservation.getMeta().getSource();
return new ResourceIndexedSearchParamUri(new PartitionSettings(), "Observation", "_source", source);
}
@Configuration
public static class SpringConfig {
@Bean


@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
-<version>6.9.2-SNAPSHOT</version>
+<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
-<version>6.9.2-SNAPSHOT</version>
+<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
-<version>6.9.2-SNAPSHOT</version>
+<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
-<version>6.9.2-SNAPSHOT</version>
+<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@@ -4,20 +4,29 @@ import ca.uhn.fhir.batch2.api.IJobCoordinator;
import ca.uhn.fhir.batch2.model.JobInstance;
import ca.uhn.fhir.batch2.model.JobInstanceStartRequest;
import ca.uhn.fhir.batch2.model.StatusEnum;
+import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
import ca.uhn.fhir.jpa.api.model.BulkExportJobResults;
import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse;
+import ca.uhn.fhir.jpa.model.util.JpaConstants;
import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
import ca.uhn.fhir.rest.api.Constants;
+import ca.uhn.fhir.rest.api.RequestTypeEnum;
+import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
import ca.uhn.fhir.rest.api.server.bulk.BulkExportJobParameters;
+import ca.uhn.fhir.rest.client.apache.ResourceEntity;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
+import ca.uhn.fhir.test.utilities.HttpClientExtension;
import ca.uhn.fhir.util.Batch2JobDefinitionConstants;
import ca.uhn.fhir.util.JsonUtil;
import com.google.common.collect.Sets;
import org.apache.commons.io.LineIterator;
+import org.apache.http.client.methods.CloseableHttpResponse;
+import org.apache.http.client.methods.HttpGet;
+import org.apache.http.client.methods.HttpPost;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.Basic;
@ -34,19 +43,23 @@ import org.hl7.fhir.r4.model.Location;
import org.hl7.fhir.r4.model.MedicationAdministration; import org.hl7.fhir.r4.model.MedicationAdministration;
import org.hl7.fhir.r4.model.Observation; import org.hl7.fhir.r4.model.Observation;
import org.hl7.fhir.r4.model.Organization; import org.hl7.fhir.r4.model.Organization;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.Patient; import org.hl7.fhir.r4.model.Patient;
import org.hl7.fhir.r4.model.Practitioner; import org.hl7.fhir.r4.model.Practitioner;
import org.hl7.fhir.r4.model.Provenance; import org.hl7.fhir.r4.model.Provenance;
import org.hl7.fhir.r4.model.QuestionnaireResponse; import org.hl7.fhir.r4.model.QuestionnaireResponse;
import org.hl7.fhir.r4.model.Reference; import org.hl7.fhir.r4.model.Reference;
import org.hl7.fhir.r4.model.ServiceRequest; import org.hl7.fhir.r4.model.ServiceRequest;
import org.hl7.fhir.r4.model.StringType;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.MethodOrderer;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestMethodOrder;
import org.junit.jupiter.api.extension.RegisterExtension;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.MethodSource;
import org.mockito.Spy;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
@@ -63,7 +76,6 @@ import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import static ca.uhn.fhir.jpa.dao.r4.FhirResourceDaoR4TagsInlineTest.createSearchParameterForInlineSecurity;
@@ -98,6 +110,12 @@ public class BulkDataExportTest extends BaseResourceProviderR4Test {
myStorageSettings.setJobFastTrackingEnabled(false);
}
@Spy
private final FhirContext myCtx = FhirContext.forR4Cached();
@RegisterExtension
private final HttpClientExtension mySender = new HttpClientExtension();
@Test
public void testGroupBulkExportWithTypeFilter() {
// Create some resources
@@ -573,6 +591,54 @@ public class BulkDataExportTest extends BaseResourceProviderR4Test {
verifyBulkExportResults(options, List.of("Observation/C", "Group/B"), List.of("Patient/A"));
}
/**
 * This interceptor is needed so that similar GET and POST export requests return the same job ID.
 * The test testBulkExportReuse_withGetAndPost_expectSameJobIds() verifies this behaviour.
 */
private class BulkExportReuseInterceptor{
@Hook(Pointcut.STORAGE_INITIATE_BULK_EXPORT)
public void initiateBulkExport(RequestDetails theRequestDetails, BulkExportJobParameters theBulkExportOptions){
if(theRequestDetails.getRequestType().equals(RequestTypeEnum.GET)) {
theBulkExportOptions.getPatientIds();
}
}
}
@Test
public void testBulkExportReuse_withGetAndPost_expectSameJobIds() throws IOException {
Patient patient = new Patient();
patient.setId("P1");
patient.setActive(true);
myClient.update().resource(patient).execute();
BulkExportReuseInterceptor newInterceptor = new BulkExportReuseInterceptor();
myInterceptorRegistry.registerInterceptor(newInterceptor);
Parameters input = new Parameters();
input.addParameter(JpaConstants.PARAM_EXPORT_OUTPUT_FORMAT, new StringType(Constants.CT_FHIR_NDJSON));
input.addParameter(JpaConstants.PARAM_EXPORT_TYPE, new StringType("Patient"));
HttpPost post = new HttpPost(myServer.getBaseUrl() + "/" + JpaConstants.OPERATION_EXPORT);
post.addHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC);
post.setEntity(new ResourceEntity(myCtx, input));
HttpGet get = new HttpGet(myServer.getBaseUrl() + "/" + JpaConstants.OPERATION_EXPORT + "?_outputFormat=application%2Ffhir%2Bndjson&_type=Patient");
get.addHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC);
try(CloseableHttpResponse postResponse = mySender.execute(post)){
ourLog.info("Response: {}",postResponse);
assertEquals(202, postResponse.getStatusLine().getStatusCode());
assertEquals("Accepted", postResponse.getStatusLine().getReasonPhrase());
try(CloseableHttpResponse getResponse = mySender.execute(get)){
ourLog.info("Get Response: {}", getResponse);
assertEquals(202, getResponse.getStatusLine().getStatusCode());
assertEquals("Accepted", getResponse.getStatusLine().getReasonPhrase());
assertEquals(postResponse.getFirstHeader(Constants.HEADER_CONTENT_LOCATION).getValue(), getResponse.getFirstHeader(Constants.HEADER_CONTENT_LOCATION).getValue());
}
}
myInterceptorRegistry.unregisterInterceptor(newInterceptor);
}
@Test
public void testPatientBulkExportWithReferenceToAuthor_ShouldShowUp() {
myStorageSettings.setIndexMissingFields(JpaStorageSettings.IndexEnabledEnum.ENABLED);


@@ -8,23 +8,28 @@ import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.server.IBundleProvider;
import ca.uhn.fhir.rest.param.StringParam;
import ca.uhn.fhir.rest.param.TokenAndListParam;
import ca.uhn.fhir.rest.param.TokenOrListParam;
import ca.uhn.fhir.rest.param.TokenParam;
import ca.uhn.fhir.rest.param.UriParam;
import ca.uhn.fhir.rest.param.UriParamQualifierEnum;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.rest.server.exceptions.MethodNotAllowedException;
import ca.uhn.fhir.rest.server.exceptions.ResourceVersionConflictException;
import org.apache.commons.text.RandomStringGenerator;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.IdType;
import org.hl7.fhir.r4.model.Observation;
import org.hl7.fhir.r4.model.Patient;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;
import static ca.uhn.fhir.rest.api.Constants.PARAM_SOURCE;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.containsInAnyOrder;
import static org.hamcrest.Matchers.matchesPattern;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.fail;
import static org.mockito.Mockito.when;
@SuppressWarnings({"Duplicates"})
@@ -64,107 +69,6 @@ public class FhirResourceDaoR4SourceTest extends BaseJpaR4Test {
assertThat(toUnqualifiedVersionlessIdValues(result), containsInAnyOrder(pt0id.getValue()));
pt0 = (Patient) result.getResources(0, 1).get(0);
assertEquals("urn:source:0#a_request_id", pt0.getMeta().getSource());
// Search by request ID
params = new SearchParameterMap();
params.setLoadSynchronous(true);
params.add(Constants.PARAM_SOURCE, new TokenParam("#a_request_id"));
result = myPatientDao.search(params);
assertThat(toUnqualifiedVersionlessIdValues(result), containsInAnyOrder(pt0id.getValue(), pt1id.getValue()));
// Search by source URI and request ID
params = new SearchParameterMap();
params.setLoadSynchronous(true);
params.add(Constants.PARAM_SOURCE, new TokenParam("urn:source:0#a_request_id"));
result = myPatientDao.search(params);
assertThat(toUnqualifiedVersionlessIdValues(result), containsInAnyOrder(pt0id.getValue()));
}
@Test
public void testSearchSource_whenSameSourceForMultipleResourceTypes_willMatchSearchResourceTypeOnly(){
String sourceUrn = "urn:source:0";
String requestId = "a_request_id";
when(mySrd.getRequestId()).thenReturn(requestId);
Patient patient = new Patient();
patient.getMeta().setSource(sourceUrn);
patient.setActive(true);
IIdType ptId = myPatientDao.create(patient, mySrd).getId().toUnqualifiedVersionless();
Observation observation = new Observation();
observation.setStatus(Observation.ObservationStatus.FINAL);
observation.getMeta().setSource(sourceUrn);
myObservationDao.create(observation, mySrd).getId().toUnqualifiedVersionless();
SearchParameterMap params = new SearchParameterMap();
params.setLoadSynchronous(true);
params.add(Constants.PARAM_SOURCE, new TokenParam("urn:source:0"));
IBundleProvider result = myPatientDao.search(params);
assertThat(toUnqualifiedVersionlessIdValues(result), containsInAnyOrder(ptId.getValue()));
}
@Test
public void testSearchWithOr() {
String requestId = "a_request_id";
when(mySrd.getRequestId()).thenReturn(requestId);
Patient pt0 = new Patient();
pt0.getMeta().setSource("urn:source:0");
pt0.setActive(true);
IIdType pt0id = myPatientDao.create(pt0, mySrd).getId().toUnqualifiedVersionless();
Patient pt1 = new Patient();
pt1.getMeta().setSource("urn:source:1");
pt1.setActive(true);
IIdType pt1id = myPatientDao.create(pt1, mySrd).getId().toUnqualifiedVersionless();
Patient pt2 = new Patient();
pt2.getMeta().setSource("urn:source:2");
pt2.setActive(true);
myPatientDao.create(pt2, mySrd).getId().toUnqualifiedVersionless();
// Search
SearchParameterMap params = new SearchParameterMap();
params.setLoadSynchronous(true);
params.add(Constants.PARAM_SOURCE, new TokenOrListParam()
.addOr(new TokenParam("urn:source:0"))
.addOr(new TokenParam("urn:source:1")));
IBundleProvider result = myPatientDao.search(params);
assertThat(toUnqualifiedVersionlessIdValues(result), containsInAnyOrder(pt0id.getValue(), pt1id.getValue()));
}
@Test
public void testSearchWithAnd() {
String requestId = "a_request_id";
when(mySrd.getRequestId()).thenReturn(requestId);
Patient pt0 = new Patient();
pt0.getMeta().setSource("urn:source:0");
pt0.setActive(true);
IIdType pt0id = myPatientDao.create(pt0, mySrd).getId().toUnqualifiedVersionless();
Patient pt1 = new Patient();
pt1.getMeta().setSource("urn:source:1");
pt1.setActive(true);
IIdType pt1id = myPatientDao.create(pt1, mySrd).getId().toUnqualifiedVersionless();
Patient pt2 = new Patient();
pt2.getMeta().setSource("urn:source:2");
pt2.setActive(true);
myPatientDao.create(pt2, mySrd).getId().toUnqualifiedVersionless();
// Search
SearchParameterMap params = new SearchParameterMap();
params.setLoadSynchronous(true);
params.add(Constants.PARAM_SOURCE, new TokenAndListParam()
.addAnd(new TokenParam("urn:source:0"), new TokenParam("@a_request_id")));
IBundleProvider result = myPatientDao.search(params);
assertThat(toUnqualifiedVersionlessIdValues(result), containsInAnyOrder(pt0id.getValue()));
}
@Test
@@ -270,6 +174,21 @@ public class FhirResourceDaoR4SourceTest extends BaseJpaR4Test {
}
@Test
public void testSearchSource_withContainsModifierAndContainsSearchesDisabled_throwsException() {
myStorageSettings.setAllowContainsSearches(false);
UriParam uriParam = new UriParam("some-source").setQualifier(UriParamQualifierEnum.CONTAINS);
try {
SearchParameterMap searchParameter = SearchParameterMap.newSynchronous();
searchParameter.add(Constants.PARAM_SOURCE, uriParam);
myPatientDao.search(searchParameter);
fail();
} catch (MethodNotAllowedException e) {
assertEquals(Msg.code(2417) + ":contains modifier is disabled on this server", e.getMessage());
}
}
public static void assertConflictException(String theResourceType, ResourceVersionConflictException e) {
assertThat(e.getMessage(), matchesPattern(
"Unable to delete [a-zA-Z]+/[0-9]+ because at least one resource has a reference to this resource. First reference found was resource " + theResourceType + "/[0-9]+ in path [a-zA-Z]+.[a-zA-Z]+"));


@@ -6,6 +6,7 @@ import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.dao.TestDaoSearch;
import ca.uhn.fhir.jpa.search.CompositeSearchParameterTestCases;
import ca.uhn.fhir.jpa.search.QuantitySearchParameterTestCases;
import ca.uhn.fhir.jpa.search.BaseSourceSearchParameterTestCases;
import ca.uhn.fhir.jpa.searchparam.MatchUrlService;
import ca.uhn.fhir.jpa.test.BaseJpaTest;
import ca.uhn.fhir.jpa.test.config.TestHSearchAddInConfig;
@@ -517,4 +518,16 @@ public class FhirResourceDaoR4StandardQueriesNoFTTest extends BaseJpaTest {
return false;
}
}
@Nested
class SourceSearchParameterTestCases extends BaseSourceSearchParameterTestCases {
SourceSearchParameterTestCases() {
super(myDataBuilder, myTestDaoSearch, myStorageSettings);
}
@Override
protected boolean isRequestIdSupported() {
return true;
}
}
}


@@ -39,7 +39,7 @@ public class FhirResourceDaoR4UpdateTagSnapshotTest extends BaseJpaR4Test {
myPatientDao.update(p, mySrd);
p = myPatientDao.read(new IdType("A"), mySrd);
assertEquals("2", p.getIdElement().getVersionIdPart());
assertEquals(true, p.getActive());
assertEquals(1, p.getMeta().getTag().size());
}
@@ -84,7 +84,7 @@ public class FhirResourceDaoR4UpdateTagSnapshotTest extends BaseJpaR4Test {
myPatientDao.update(p, mySrd);
p = myPatientDao.read(new IdType("A"), mySrd);
assertEquals("2", p.getIdElement().getVersionIdPart());
assertEquals(true, p.getActive());
assertEquals(1, p.getMeta().getTag().size());
assertEquals("urn:foo", p.getMeta().getTag().get(0).getSystem());
@@ -132,7 +132,27 @@ public class FhirResourceDaoR4UpdateTagSnapshotTest extends BaseJpaR4Test {
p = myPatientDao.read(new IdType("A"), mySrd);
assertEquals(true, p.getActive());
assertEquals(0, p.getMeta().getTag().size());
assertEquals("2", p.getIdElement().getVersionIdPart());
}
@Test
public void testUpdateResource_withNewTags_willCreateNewResourceVersion() {
Patient p = new Patient();
p.setId("A");
p.setActive(true);
myPatientDao.update(p, mySrd);
p = new Patient();
p.setId("A");
p.getMeta().addTag("urn:foo", "bar", "baz");
p.setActive(true);
myPatientDao.update(p, mySrd);
p = myPatientDao.read(new IdType("A"), mySrd);
assertEquals(true, p.getActive());
assertEquals(1, p.getMeta().getTag().size());
assertEquals("2", p.getIdElement().getVersionIdPart());
}


@@ -0,0 +1,247 @@
package ca.uhn.fhir.jpa.packages;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.FhirVersionEnum;
import ca.uhn.fhir.jpa.dao.data.ITermValueSetDao;
import ca.uhn.fhir.jpa.entity.TermValueSet;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
import ca.uhn.fhir.model.primitive.IdDt;
import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.r4.model.CodeSystem;
import org.hl7.fhir.r4.model.NamingSystem;
import org.hl7.fhir.r4.model.ValueSet;
import org.hl7.fhir.utilities.npm.NpmPackage;
import org.hl7.fhir.utilities.npm.PackageGenerator;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import javax.annotation.Nonnull;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.stream.Collectors;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;
public class PackageInstallerSvcImplCreateTest extends BaseJpaR4Test {
private static final String PACKAGE_ID_1 = "package1";
private static final String PACKAGE_VERSION = "1.0";
private static final String VALUE_SET_OID_FIRST = "2.16.840.1.113762.1.4.1010.9";
private static final String VALUE_SET_OID_SECOND = "2.16.840.1.113762.1.4.1010.10";
private static final String IG_FIRST = "first.ig.gov";
private static final String IG_SECOND = "second.ig.gov";
private static final String FIRST_IG_URL_FIRST_OID = String.format("http://%s/fhir/ValueSet/%s", IG_FIRST, VALUE_SET_OID_FIRST);
private static final String SECOND_IG_URL_FIRST_OID = String.format("http://%s/fhir/ValueSet/%s", IG_SECOND, VALUE_SET_OID_FIRST);
private static final String SECOND_IG_URL_SECOND_OID = String.format("http://%s/fhir/ValueSet/%s", IG_SECOND, VALUE_SET_OID_SECOND);
private static final FhirContext ourCtx = FhirContext.forR4Cached();
private static final CodeSystem CODE_SYSTEM = createCodeSystem();
private static final NpmPackage PACKAGE = createPackage();
private static final SystemRequestDetails REQUEST_DETAILS = new SystemRequestDetails();
@Autowired
private ITermValueSetDao myTermValueSetDao;
@Autowired
private PackageInstallerSvcImpl mySvc;
@Test
void createNamingSystem() throws IOException {
final NamingSystem namingSystem = new NamingSystem();
namingSystem.getUniqueId().add(new NamingSystem.NamingSystemUniqueIdComponent().setValue("123"));
create(namingSystem);
assertEquals(1, myNamingSystemDao.search(SearchParameterMap.newSynchronous(), REQUEST_DETAILS).getAllResources().size());
}
@Test
void createWithNoExistingResourcesNoIdOnValueSet() throws IOException {
final String version1 = "abc";
final String copyright1 = "first";
createValueSetAndCallCreate(VALUE_SET_OID_FIRST, null, version1, FIRST_IG_URL_FIRST_OID, copyright1);
final ValueSet actualValueSet1 = getFirstValueSet();
assertEquals("ValueSet/" + VALUE_SET_OID_FIRST, actualValueSet1.getIdElement().toUnqualifiedVersionless().getValue());
assertEquals(FIRST_IG_URL_FIRST_OID, actualValueSet1.getUrl());
assertEquals(version1, actualValueSet1.getVersion());
assertEquals(copyright1, actualValueSet1.getCopyright());
}
@Test
void createWithNoExistingResourcesIdOnValueSet() throws IOException {
final String version1 = "abc";
final String copyright1 = "first";
createValueSetAndCallCreate(VALUE_SET_OID_FIRST, null, version1, FIRST_IG_URL_FIRST_OID, copyright1);
createValueSetAndCallCreate(VALUE_SET_OID_FIRST, "43", version1, SECOND_IG_URL_FIRST_OID, copyright1);
final TermValueSet termValueSet = getFirstTermValueSet();
assertEquals(FIRST_IG_URL_FIRST_OID, termValueSet.getUrl());
final ValueSet actualValueSet1 = getFirstValueSet();
assertEquals("ValueSet/" + VALUE_SET_OID_FIRST, actualValueSet1.getIdElement().toUnqualifiedVersionless().getValue());
assertEquals(FIRST_IG_URL_FIRST_OID, actualValueSet1.getUrl());
assertEquals(version1, actualValueSet1.getVersion());
assertEquals(copyright1, actualValueSet1.getCopyright());
}
@Test
void createValueSetThenUpdateSameUrl() throws IOException {
final String version1 = "abc";
final String version2 = "def";
final String copyright1 = "first";
final String copyright2 = "second";
createValueSetAndCallCreate(VALUE_SET_OID_FIRST, null, version1, FIRST_IG_URL_FIRST_OID, copyright1);
createValueSetAndCallCreate(VALUE_SET_OID_FIRST, "43", version2, FIRST_IG_URL_FIRST_OID, copyright2);
final ValueSet actualValueSet1 = getFirstValueSet();
assertEquals("ValueSet/" + VALUE_SET_OID_FIRST, actualValueSet1.getIdElement().toUnqualifiedVersionless().getValue());
assertEquals(FIRST_IG_URL_FIRST_OID, actualValueSet1.getUrl());
assertEquals(version2, actualValueSet1.getVersion());
assertEquals(copyright2, actualValueSet1.getCopyright());
}
@Test
void createTwoDifferentValueSets() throws IOException {
final String version1 = "abc";
final String version2 = "def";
final String copyright1 = "first";
final String copyright2 = "second";
createValueSetAndCallCreate(VALUE_SET_OID_FIRST, null, version1, FIRST_IG_URL_FIRST_OID, copyright1);
createValueSetAndCallCreate(VALUE_SET_OID_SECOND, "43", version2, SECOND_IG_URL_SECOND_OID, copyright2);
final List<TermValueSet> all2 = myTermValueSetDao.findAll();
assertEquals(2, all2.size());
final TermValueSet termValueSet1 = all2.get(0);
final TermValueSet termValueSet2 = all2.get(1);
assertEquals(FIRST_IG_URL_FIRST_OID, termValueSet1.getUrl());
assertEquals(SECOND_IG_URL_SECOND_OID, termValueSet2.getUrl());
final List<ValueSet> allValueSets = getAllValueSets();
assertEquals(2, allValueSets.size());
final ValueSet actualValueSet1 = allValueSets.get(0);
assertEquals("ValueSet/" + VALUE_SET_OID_FIRST, actualValueSet1.getIdElement().toUnqualifiedVersionless().getValue());
assertEquals(FIRST_IG_URL_FIRST_OID, actualValueSet1.getUrl());
assertEquals(version1, actualValueSet1.getVersion());
assertEquals(copyright1, actualValueSet1.getCopyright());
final ValueSet actualValueSet2 = allValueSets.get(1);
assertEquals("ValueSet/" + VALUE_SET_OID_SECOND, actualValueSet2.getIdElement().toUnqualifiedVersionless().getValue());
assertEquals(SECOND_IG_URL_SECOND_OID, actualValueSet2.getUrl());
assertEquals(version2, actualValueSet2.getVersion());
assertEquals(copyright2, actualValueSet2.getCopyright());
}
@Nonnull
private List<ValueSet> getAllValueSets() {
final List<IBaseResource> allResources = myValueSetDao.search(SearchParameterMap.newSynchronous(), REQUEST_DETAILS).getAllResources();
assertFalse(allResources.isEmpty());
assertTrue(allResources.get(0) instanceof ValueSet);
return allResources.stream()
.map(ValueSet.class::cast)
.toList();
}
@Nonnull
private ValueSet getFirstValueSet() {
final List<IBaseResource> allResources = myValueSetDao.search(SearchParameterMap.newSynchronous(), REQUEST_DETAILS).getAllResources();
assertEquals(1, allResources.size());
final IBaseResource resource1 = allResources.get(0);
assertTrue(resource1 instanceof ValueSet);
return (ValueSet) resource1;
}
@Nonnull
private TermValueSet getFirstTermValueSet() {
final List<TermValueSet> all2 = myTermValueSetDao.findAll();
assertEquals(1, all2.size());
return all2.get(0);
}
private void createValueSetAndCallCreate(String theOid, String theResourceVersion, String theValueSetVersion, String theUrl, String theCopyright) throws IOException {
create(createValueSet(theOid, theResourceVersion, theValueSetVersion, theUrl, theCopyright));
}
@Nonnull
private static ValueSet createValueSet(String theOid, String theResourceVersion, String theValueSetVersion, String theUrl, String theCopyright) {
final ValueSet valueSetFromFirstIg = new ValueSet();
valueSetFromFirstIg.setUrl(theUrl);
valueSetFromFirstIg.setId(new IdDt(null, "ValueSet", theOid, theResourceVersion));
valueSetFromFirstIg.setVersion(theValueSetVersion);
valueSetFromFirstIg.setCopyright(theCopyright);
return valueSetFromFirstIg;
}
private void create(IBaseResource theResource) throws IOException {
mySvc.create(theResource, createInstallationSpec(packageToBytes()), new PackageInstallOutcomeJson());
}
@Nonnull
private static CodeSystem createCodeSystem() {
final CodeSystem cs = new CodeSystem();
cs.setId("CodeSystem/mycs");
cs.setUrl("http://my-code-system");
cs.setContent(CodeSystem.CodeSystemContentMode.COMPLETE);
return cs;
}
@Nonnull
private static NpmPackage createPackage() {
PackageGenerator manifestGenerator = new PackageGenerator();
manifestGenerator.name(PackageInstallerSvcImplCreateTest.PACKAGE_ID_1);
manifestGenerator.version(PACKAGE_VERSION);
manifestGenerator.description("a package");
manifestGenerator.fhirVersions(List.of(FhirVersionEnum.R4.getFhirVersionString()));
NpmPackage pkg = NpmPackage.empty(manifestGenerator);
String csString = ourCtx.newJsonParser().encodeResourceToString(CODE_SYSTEM);
pkg.addFile("package", "cs.json", csString.getBytes(StandardCharsets.UTF_8), "CodeSystem");
return pkg;
}
@Nonnull
private static PackageInstallationSpec createInstallationSpec(byte[] thePackageContents) {
final PackageInstallationSpec spec = new PackageInstallationSpec();
spec.setName(PACKAGE_ID_1);
spec.setVersion(PACKAGE_VERSION);
spec.setInstallMode(PackageInstallationSpec.InstallModeEnum.STORE_AND_INSTALL);
spec.setPackageContents(thePackageContents);
return spec;
}
@Nonnull
private static byte[] packageToBytes() throws IOException {
ByteArrayOutputStream stream = new ByteArrayOutputStream();
PackageInstallerSvcImplCreateTest.PACKAGE.save(stream);
return stream.toByteArray();
}
}


@@ -99,6 +99,8 @@ public class ExpungeR4Test extends BaseResourceProviderR4Test {
private ISearchResultDao mySearchResultDao;
@Autowired
private ThreadSafeResourceDeleterSvc myThreadSafeResourceDeleterSvc;
@Autowired
private ExpungeService myExpungeService;
@AfterEach
public void afterDisableExpunge() {
@@ -216,25 +218,27 @@ }
}
public void createStandardCodeSystemWithOneVersion() {
CodeSystem codeSystem1 = new CodeSystem();
codeSystem1.setUrl(URL_MY_CODE_SYSTEM);
codeSystem1.setName("CS1-V1");
codeSystem1.setVersion("1");
codeSystem1.setContent(CodeSystem.CodeSystemContentMode.COMPLETE);
codeSystem1
.addConcept().setCode("C").setDisplay("Code C").addDesignation(
new CodeSystem.ConceptDefinitionDesignationComponent().setLanguage("en").setValue("CodeCDesignation")).addProperty(
new CodeSystem.ConceptPropertyComponent().setCode("CodeCProperty").setValue(new StringType("CodeCPropertyValue"))
)
.addConcept(new CodeSystem.ConceptDefinitionComponent().setCode("CA").setDisplay("Code CA")
.addConcept(new CodeSystem.ConceptDefinitionComponent().setCode("CAA").setDisplay("Code CAA"))
)
.addConcept(new CodeSystem.ConceptDefinitionComponent().setCode("CB").setDisplay("Code CB"));
codeSystem1
.addConcept().setCode("D").setDisplay("Code D");
myOneVersionCodeSystemId = myCodeSystemDao.create(codeSystem1).getId();
}
public void createStandardCodeSystemWithTwoVersions(){
CodeSystem cs2v1 = new CodeSystem();
cs2v1.setUrl(URL_MY_CODE_SYSTEM_2);
cs2v1.setVersion("1");
@@ -250,6 +254,11 @@ public class ExpungeR4Test extends BaseResourceProviderR4Test {
myTwoVersionCodeSystemIdV2 = myCodeSystemDao.create(cs2v2).getId();
}
public void createStandardCodeSystems() {
createStandardCodeSystemWithOneVersion();
createStandardCodeSystemWithTwoVersions();
}
private IFhirResourceDao<?> getDao(IIdType theId) {
IFhirResourceDao<?> dao;
switch (theId.getResourceType()) {
@@ -674,10 +683,6 @@ public class ExpungeR4Test extends BaseResourceProviderR4Test {
assertExpunged(myDeletedPatientId.withVersion("2"));
}
@Autowired
private ExpungeService myExpungeService;
@Test
public void testExpungeDeletedWhereResourceInSearchResults() {
createStandardPatients();
@@ -904,25 +909,26 @@ }
}
@Test
public void testExpungeCodeSystem_whenCsIsBeingBatchDeleted_willGracefullyHandleConstraintViolationException() {
//set up
createStandardCodeSystemWithOneVersion();
myCodeSystemDao.deleteByUrl("CodeSystem?url=" + URL_MY_CODE_SYSTEM, null);
myTerminologyDeferredStorageSvc.saveDeferred();
try {
// execute
myCodeSystemDao.expunge(new ExpungeOptions()
.setExpungeDeletedResources(true)
.setExpungeOldVersions(true), null);
fail();
} catch (PreconditionFailedException preconditionFailedException) {
// verify
assertThat(preconditionFailedException.getMessage(), startsWith(
"HAPI-2415: The resource could not be expunged. It is likely due to unfinished asynchronous deletions, please try again later"));
}
myBatch2JobHelper.awaitAllJobsOfJobDefinitionIdToComplete(TERM_CODE_SYSTEM_DELETE_JOB_NAME);
}
@ParameterizedTest


@@ -32,10 +32,8 @@ import java.util.List;
import java.util.stream.Collectors;
import static ca.uhn.fhir.jpa.model.util.JpaConstants.DEFAULT_PARTITION_NAME;
import static org.awaitility.Awaitility.await;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.hasSize;
import static org.hamcrest.Matchers.in;
import static org.hamcrest.Matchers.isA;
import static org.junit.jupiter.api.Assertions.assertEquals;
@@ -108,7 +106,7 @@ public class MultitenantBatchOperationR4Test extends BaseMultitenantResourceProv
String jobId = BatchHelperR4.jobIdFromBatch2Parameters(response);
myBatch2JobHelper.awaitJobCompletion(jobId);
assertThat(interceptor.requestPartitionIds, hasSize(4));
RequestPartitionId partitionId = interceptor.requestPartitionIds.get(0);
assertEquals(TENANT_B_ID, partitionId.getFirstPartitionIdOrNull());
assertEquals(TENANT_B, partitionId.getFirstPartitionNameOrNull());

View File

@@ -1,15 +1,14 @@
 package ca.uhn.fhir.jpa.provider.r4;

 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
-import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
 import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
 import ca.uhn.fhir.parser.StrictErrorHandler;
 import ca.uhn.fhir.rest.api.Constants;
 import ca.uhn.fhir.rest.api.EncodingEnum;
 import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
+import ca.uhn.fhir.util.BundleUtil;
 import com.google.common.base.Charsets;
 import org.apache.commons.io.IOUtils;
-import org.apache.http.client.ClientProtocolException;
 import org.apache.http.client.methods.CloseableHttpResponse;
 import org.apache.http.client.methods.HttpGet;
 import org.hl7.fhir.r4.model.Bundle;
@@ -29,6 +28,7 @@ import org.junit.jupiter.api.Test;
 import java.io.IOException;
 import java.util.ArrayList;
+import java.util.List;
 import java.util.Set;
 import java.util.TreeSet;
@@ -203,7 +203,7 @@ public class PatientEverythingR4Test extends BaseResourceProviderR4Test {
 		assertNull(bundle.getLink("next"));

-		Set<String> actual = new TreeSet<String>();
+		Set<String> actual = new TreeSet<>();
 		for (BundleEntryComponent nextEntry : bundle.getEntry()) {
 			actual.add(nextEntry.getResource().getIdElement().toUnqualifiedVersionless().getValue());
 		}
@@ -233,7 +233,7 @@ public class PatientEverythingR4Test extends BaseResourceProviderR4Test {
 		assertNotNull(bundle.getLink("next").getUrl());
 		assertThat(bundle.getLink("next").getUrl(), containsString("_format=json"));
-		bundle = fetchBundle(bundle.getLink("next").getUrl(), EncodingEnum.JSON);
+		fetchBundle(bundle.getLink("next").getUrl(), EncodingEnum.JSON);
 	}

 	/**
@@ -252,7 +252,7 @@ public class PatientEverythingR4Test extends BaseResourceProviderR4Test {
 		assertNotNull(bundle.getLink("next").getUrl());
 		ourLog.info("Next link: {}", bundle.getLink("next").getUrl());
 		assertThat(bundle.getLink("next").getUrl(), containsString("_format=xml"));
-		bundle = fetchBundle(bundle.getLink("next").getUrl(), EncodingEnum.XML);
+		fetchBundle(bundle.getLink("next").getUrl(), EncodingEnum.XML);
 	}

 	@Test
@@ -275,7 +275,37 @@ public class PatientEverythingR4Test extends BaseResourceProviderR4Test {
 		} while (bundle.getLink("next") != null);
 	}

+	/**
+	 * Built to reproduce <a href="https://gitlab.com/simpatico.ai/cdr/-/issues/4940">this issue</a>
+	 */
+	@Test
+	public void testEverythingRespectsServerDefaultPageSize() throws IOException {
+		// setup
+		for (int i = 0; i < 25; i++) {
+			Patient patient = new Patient();
+			patient.addName().setFamily("lastn").addGiven("name");
+			myPatientDao.create(patient, new SystemRequestDetails()).getId().toUnqualifiedVersionless();
+		}
+
+		// must be larger than myStorageSettings.getSearchPreFetchThresholds()[0] for issue to show up
+		int originalPagingProviderPageSize = myPagingProvider.getDefaultPageSize();
+		myPagingProvider.setDefaultPageSize(50);
+
+		// execute
+		Bundle bundle;
+		try {
+			bundle = fetchBundle(myServerBase + "/Patient/$everything?_format=json", EncodingEnum.JSON);
+		} finally {
+			// restore
+			myPagingProvider.setDefaultPageSize(originalPagingProviderPageSize);
+		}
+
+		// validate
+		List<Patient> bundlePatients = BundleUtil.toListOfResourcesOfType(myFhirContext, bundle, Patient.class);
+		assertEquals(myServer.getDefaultPageSize(), bundlePatients.size());
+	}
+
-	private Bundle fetchBundle(String theUrl, EncodingEnum theEncoding) throws IOException, ClientProtocolException {
+	private Bundle fetchBundle(String theUrl, EncodingEnum theEncoding) throws IOException {
 		Bundle bundle;
 		HttpGet get = new HttpGet(theUrl);
 		CloseableHttpResponse resp = ourHttpClient.execute(get);

View File

@@ -0,0 +1,144 @@
package ca.uhn.fhir.jpa.reindex;

import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.jpa.api.pid.IResourcePidList;
import ca.uhn.fhir.jpa.api.svc.IBatch2DaoSvc;
import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService;
import ca.uhn.fhir.jpa.searchparam.MatchUrlService;
import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
import ca.uhn.fhir.model.primitive.IdDt;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import org.hl7.fhir.instance.model.api.IIdType;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.ValueSource;
import org.springframework.beans.factory.annotation.Autowired;

import javax.annotation.Nonnull;
import java.time.LocalDate;
import java.time.Month;
import java.time.ZoneId;
import java.util.Date;
import java.util.List;
import java.util.stream.IntStream;

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;
import static org.mockito.Mockito.spy;
import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;

class Batch2DaoSvcImplTest extends BaseJpaR4Test {

	private static final Date PREVIOUS_MILLENNIUM = toDate(LocalDate.of(1999, Month.DECEMBER, 31));
	private static final Date TOMORROW = toDate(LocalDate.now().plusDays(1));
	private static final String URL_PATIENT_EXPUNGE_TRUE = "Patient?_expunge=true";
	private static final String PATIENT = "Patient";
	private static final int INTERNAL_SYNCHRONOUS_SEARCH_SIZE = 10;

	@Autowired
	private JpaStorageSettings myJpaStorageSettings;
	@Autowired
	private MatchUrlService myMatchUrlService;
	@Autowired
	private IHapiTransactionService myIHapiTransactionService;

	private DaoRegistry mySpiedDaoRegistry;

	private IBatch2DaoSvc mySubject;

	@BeforeEach
	void beforeEach() {
		myJpaStorageSettings.setInternalSynchronousSearchSize(INTERNAL_SYNCHRONOUS_SEARCH_SIZE);

		mySpiedDaoRegistry = spy(myDaoRegistry);

		mySubject = new Batch2DaoSvcImpl(myResourceTableDao, myMatchUrlService, mySpiedDaoRegistry, myFhirContext, myIHapiTransactionService, myJpaStorageSettings);
	}

	// TODO: LD this test won't work with the nonUrl variant yet: error: No existing transaction found for transaction marked with propagation 'mandatory'
	@Test
	void fetchResourcesByUrlEmptyUrl() {
		final InternalErrorException exception = assertThrows(InternalErrorException.class, () -> mySubject.fetchResourceIdsPage(PREVIOUS_MILLENNIUM, TOMORROW, 800, RequestPartitionId.defaultPartition(), ""));

		assertEquals("HAPI-2422: this should never happen: URL is missing a '?'", exception.getMessage());
	}

	@Test
	void fetchResourcesByUrlSingleQuestionMark() {
		final InternalErrorException exception = assertThrows(InternalErrorException.class, () -> mySubject.fetchResourceIdsPage(PREVIOUS_MILLENNIUM, TOMORROW, 800, RequestPartitionId.defaultPartition(), "?"));

		assertEquals("HAPI-2223: theResourceName must not be blank", exception.getMessage());
	}

	@Test
	void fetchResourcesByUrlNonsensicalResource() {
		final InternalErrorException exception = assertThrows(InternalErrorException.class, () -> mySubject.fetchResourceIdsPage(PREVIOUS_MILLENNIUM, TOMORROW, 800, RequestPartitionId.defaultPartition(), "Banana?_expunge=true"));

		assertEquals("HAPI-2223: HAPI-1684: Unknown resource name \"Banana\" (this name is not known in FHIR version \"R4\")", exception.getMessage());
	}

	@ParameterizedTest
	@ValueSource(ints = {0, 9, 10, 11, 21, 22, 23, 45})
	void fetchResourcesByUrl(int expectedNumResults) {
		final List<IIdType> patientIds = IntStream.range(0, expectedNumResults)
			.mapToObj(num -> createPatient())
			.toList();

		final IResourcePidList resourcePidList = mySubject.fetchResourceIdsPage(PREVIOUS_MILLENNIUM, TOMORROW, 800, RequestPartitionId.defaultPartition(), URL_PATIENT_EXPUNGE_TRUE);

		final List<? extends IIdType> actualPatientIds =
			resourcePidList.getTypedResourcePids()
				.stream()
				.map(typePid -> new IdDt(typePid.resourceType, (Long) typePid.id.getId()))
				.toList();
		assertIdsEqual(patientIds, actualPatientIds);

		verify(mySpiedDaoRegistry, times(getExpectedNumOfInvocations(expectedNumResults))).getResourceDao(PATIENT);
	}

	@ParameterizedTest
	@ValueSource(ints = {0, 9, 10, 11, 21, 22, 23, 45})
	void fetchResourcesNoUrl(int expectedNumResults) {
		final int pageSizeWellBelowThreshold = 2;
		final List<IIdType> patientIds = IntStream.range(0, expectedNumResults)
			.mapToObj(num -> createPatient())
			.toList();

		final IResourcePidList resourcePidList = mySubject.fetchResourceIdsPage(PREVIOUS_MILLENNIUM, TOMORROW, pageSizeWellBelowThreshold, RequestPartitionId.defaultPartition(), null);

		final List<? extends IIdType> actualPatientIds =
			resourcePidList.getTypedResourcePids()
				.stream()
				.map(typePid -> new IdDt(typePid.resourceType, (Long) typePid.id.getId()))
				.toList();
		assertIdsEqual(patientIds, actualPatientIds);
	}

	private int getExpectedNumOfInvocations(int expectedNumResults) {
		final int maxResultsPerQuery = INTERNAL_SYNCHRONOUS_SEARCH_SIZE + 1;
		final int division = expectedNumResults / maxResultsPerQuery;
		return division + 1;
	}

	private static void assertIdsEqual(List<IIdType> expectedResourceIds, List<? extends IIdType> actualResourceIds) {
		assertEquals(expectedResourceIds.size(), actualResourceIds.size());

		for (int index = 0; index < expectedResourceIds.size(); index++) {
			final IIdType expectedIdType = expectedResourceIds.get(index);
			final IIdType actualIdType = actualResourceIds.get(index);

			assertEquals(expectedIdType.getResourceType(), actualIdType.getResourceType());
			assertEquals(expectedIdType.getIdPartAsLong(), actualIdType.getIdPartAsLong());
		}
	}

	@Nonnull
	private static Date toDate(LocalDate theLocalDate) {
		return Date.from(theLocalDate.atStartOfDay(ZoneId.systemDefault()).toInstant());
	}
}
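The invocation count asserted through `getExpectedNumOfInvocations(...)` follows from the service fetching IDs in chunks of `internalSynchronousSearchSize + 1`, with one final lookup for the last (possibly short) page. A standalone sketch of that arithmetic (illustrative class and method names, not HAPI code):

```java
// Illustrative sketch of the paging arithmetic behind getExpectedNumOfInvocations():
// the DAO is resolved once per full chunk of (internalSynchronousSearchSize + 1)
// results, plus once for the final page.
public class InvocationCountSketch {

	static int expectedInvocations(int numResults, int internalSynchronousSearchSize) {
		int maxResultsPerQuery = internalSynchronousSearchSize + 1;
		return numResults / maxResultsPerQuery + 1;
	}

	public static void main(String[] args) {
		// Mirrors the @ValueSource ints used by fetchResourcesByUrl with the test's page size of 10
		int[] counts = {0, 9, 10, 11, 21, 22, 23, 45};
		int[] expected = {1, 1, 1, 2, 2, 3, 3, 5};
		for (int i = 0; i < counts.length; i++) {
			if (expectedInvocations(counts[i], 10) != expected[i]) {
				throw new AssertionError("mismatch at " + counts[i]);
			}
		}
		System.out.println("ok");
	}
}
```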

View File

@@ -3,16 +3,13 @@ package ca.uhn.fhir.jpa.reindex;
 import ca.uhn.fhir.jpa.api.pid.IResourcePidList;
 import ca.uhn.fhir.jpa.api.pid.TypedResourcePid;
 import ca.uhn.fhir.jpa.api.svc.IBatch2DaoSvc;
-import ca.uhn.fhir.jpa.model.dao.JpaPid;
 import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
-import org.hl7.fhir.r4.model.DateType;
-import org.hl7.fhir.r4.model.InstantType;
+import org.hl7.fhir.instance.model.api.IIdType;
 import org.junit.jupiter.api.MethodOrderer;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.TestMethodOrder;
 import org.springframework.beans.factory.annotation.Autowired;

-import java.util.ArrayList;
 import java.util.Date;
 import java.util.List;
@@ -111,26 +108,26 @@ public class ResourceReindexSvcImplTest extends BaseJpaR4Test {
 		// Setup

-		createPatient(withActiveFalse());
+		final Long patientId0 = createPatient(withActiveFalse()).getIdPartAsLong();
 		sleepUntilTimeChanges();

 		// Start of resources within range
 		Date start = new Date();
 		sleepUntilTimeChanges();
-		Long id0 = createPatient(withActiveFalse()).getIdPartAsLong();
+		Long patientId1 = createPatient(withActiveFalse()).getIdPartAsLong();
 		createObservation(withObservationCode("http://foo", "bar"));
 		createObservation(withObservationCode("http://foo", "bar"));
 		sleepUntilTimeChanges();

 		Date beforeLastInRange = new Date();
 		sleepUntilTimeChanges();
-		Long id1 = createPatient(withActiveFalse()).getIdPartAsLong();
+		Long patientId2 = createPatient(withActiveFalse()).getIdPartAsLong();
 		sleepUntilTimeChanges();
 		Date end = new Date();
 		sleepUntilTimeChanges();
 		// End of resources within range

 		createObservation(withObservationCode("http://foo", "bar"));
-		createPatient(withActiveFalse());
+		final Long patientId3 = createPatient(withActiveFalse()).getIdPartAsLong();
 		sleepUntilTimeChanges();

 		// Execute
@@ -140,13 +137,17 @@ public class ResourceReindexSvcImplTest extends BaseJpaR4Test {
 		// Verify

-		assertEquals(2, page.size());
+		assertEquals(4, page.size());
 		List<TypedResourcePid> typedResourcePids = page.getTypedResourcePids();
-		assertThat(page.getTypedResourcePids(), contains(new TypedResourcePid("Patient", id0), new TypedResourcePid("Patient", id1)));
+		assertThat(page.getTypedResourcePids(),
+			contains(new TypedResourcePid("Patient", patientId0),
+				new TypedResourcePid("Patient", patientId1),
+				new TypedResourcePid("Patient", patientId2),
+				new TypedResourcePid("Patient", patientId3)));
 		assertTrue(page.getLastDate().after(beforeLastInRange));
-		assertTrue(page.getLastDate().before(end));
-		assertEquals(3, myCaptureQueriesListener.logSelectQueries().size());
+		assertTrue(page.getLastDate().before(end) || page.getLastDate().equals(end));
+		assertEquals(1, myCaptureQueriesListener.logSelectQueries().size());
 		assertEquals(0, myCaptureQueriesListener.countInsertQueries());
 		assertEquals(0, myCaptureQueriesListener.countUpdateQueries());
 		assertEquals(0, myCaptureQueriesListener.countDeleteQueries());

View File

@@ -1,8 +1,6 @@
 package ca.uhn.fhir.jpa.subscription.message;

 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
-import ca.uhn.fhir.jpa.interceptor.CascadingDeleteInterceptor;
-import ca.uhn.fhir.jpa.model.entity.NormalizedQuantitySearchLevel;
 import ca.uhn.fhir.jpa.subscription.BaseSubscriptionsR4Test;
 import ca.uhn.fhir.jpa.subscription.channel.api.ChannelConsumerSettings;
 import ca.uhn.fhir.jpa.subscription.channel.api.IChannelReceiver;
@@ -10,6 +8,9 @@ import ca.uhn.fhir.jpa.subscription.channel.subscription.SubscriptionChannelFact
 import ca.uhn.fhir.jpa.subscription.model.ResourceModifiedJsonMessage;
 import ca.uhn.fhir.jpa.subscription.model.ResourceModifiedMessage;
 import ca.uhn.fhir.jpa.test.util.StoppableSubscriptionDeliveringRestHookSubscriber;
+import ca.uhn.fhir.rest.client.api.Header;
+import ca.uhn.fhir.rest.client.api.IGenericClient;
+import ca.uhn.fhir.rest.client.interceptor.AdditionalRequestHeadersInterceptor;
 import org.hl7.fhir.instance.model.api.IBaseResource;
 import org.hl7.fhir.instance.model.api.IIdType;
 import org.hl7.fhir.r4.model.Coding;
@@ -18,7 +19,6 @@ import org.hl7.fhir.r4.model.Patient;
 import org.hl7.fhir.r4.model.Subscription;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
-import org.junit.jupiter.api.Test;
 import org.junit.jupiter.params.ParameterizedTest;
 import org.junit.jupiter.params.provider.Arguments;
 import org.junit.jupiter.params.provider.MethodSource;
@@ -30,11 +30,15 @@ import java.util.List;
 import java.util.stream.Collectors;
 import java.util.stream.Stream;

+import static ca.uhn.fhir.jpa.model.util.JpaConstants.HEADER_META_SNAPSHOT_MODE;
+import static java.util.Arrays.asList;
+import static java.util.Collections.emptyList;
 import static org.hamcrest.MatcherAssert.assertThat;
-import static org.hamcrest.Matchers.equalTo;
+import static org.hamcrest.Matchers.containsInAnyOrder;
+import static org.hamcrest.Matchers.hasSize;
 import static org.hamcrest.Matchers.instanceOf;
 import static org.hamcrest.Matchers.is;
-import static org.junit.jupiter.api.Assertions.assertTrue;
+import static org.hamcrest.core.IsEqual.equalTo;

 /**
  * Test the rest-hook subscriptions
@@ -117,58 +121,98 @@ public class MessageSubscriptionR4Test extends BaseSubscriptionsR4Test {
 		assertThat(receivedObs.getMeta().getSource(), is(equalTo(theExpectedSourceValue)));
 	}

-	@Test
-	public void testUpdateResourceRetainCorrectMetaTagsThroughDelivery() throws Exception {
+	private static Stream<Arguments> metaTagsSource() {
+		List<Header> snapshotModeHeader = asList(new Header(HEADER_META_SNAPSHOT_MODE, "TAG"));
+		return Stream.of(
+			Arguments.of(asList("tag-1","tag-2"), asList("tag-3"), asList("tag-1","tag-2","tag-3"), emptyList()),
+			Arguments.of(asList("tag-1","tag-2"), asList("tag-1","tag-2","tag-3"), asList("tag-1","tag-2","tag-3"), emptyList()),
+			Arguments.of(emptyList(), asList("tag-1","tag-2"), asList("tag-1","tag-2"), emptyList()),
+			// Arguments.of(asList("tag-1","tag-2"), emptyList(), asList("tag-1","tag-2"), emptyList()), // will not trigger an update since tags are merged
+			Arguments.of(asList("tag-1","tag-2"), emptyList(), emptyList(), snapshotModeHeader),
+			Arguments.of(asList("tag-1","tag-2"), asList("tag-3"), asList("tag-3"), snapshotModeHeader),
+			Arguments.of(asList("tag-1","tag-2","tag-3"), asList("tag-1","tag-2"), asList("tag-1","tag-2"), snapshotModeHeader),
+			Arguments.of(asList("tag-1","tag-2","tag-3"), asList("tag-2","tag-3"), asList("tag-2","tag-3"), snapshotModeHeader),
+			Arguments.of(asList("tag-1","tag-2","tag-3"), asList("tag-1","tag-3"), asList("tag-1","tag-3"), snapshotModeHeader)
+		);
+	}
+
+	@ParameterizedTest
+	@MethodSource("metaTagsSource")
+	public void testUpdateResource_withHeaderSnapshotMode_willRetainCorrectMetaTagsThroughDelivery(List<String> theTagsForCreate, List<String> theTagsForUpdate, List<String> theExpectedTags, List<Header> theHeaders) throws Exception {
 		myStorageSettings.setTagStorageMode(JpaStorageSettings.TagStorageModeEnum.NON_VERSIONED);
 		createSubscriptionWithCriteria("[Patient]");
 		waitForActivatedSubscriptionCount(1);

-		// Create Patient with two meta tags
 		Patient patient = new Patient();
 		patient.setActive(true);
-		patient.getMeta().addTag().setSystem("http://www.example.com/tags").setCode("tag-1");
-		patient.getMeta().addTag().setSystem("http://www.example.com/tags").setCode("tag-2");
+		patient.getMeta().setTag(toSimpleCodingList(theTagsForCreate));

 		IIdType id = myClient.create().resource(patient).execute().getId();

-		// Should see 1 subscription notification for CREATE
 		waitForQueueToDrain();

-		// Should receive two meta tags
-		IBaseResource resource = fetchSingleResourceFromSubscriptionTerminalEndpoint();
-		assertThat(resource, instanceOf(Patient.class));
-		Patient receivedPatient = (Patient) resource;
-		assertThat(receivedPatient.getMeta().getTag().size(), is(equalTo(2)));
+		Patient receivedPatient = fetchSingleResourceFromSubscriptionTerminalEndpoint();
+		assertThat(receivedPatient.getMeta().getTag(), hasSize(theTagsForCreate.size()));

-		// Update the previous Patient and add one more tag
 		patient = new Patient();
 		patient.setId(id);
 		patient.setActive(true);
-		patient.getMeta().getTag().add(new Coding().setSystem("http://www.example.com/tags").setCode("tag-3"));
+		patient.getMeta().setTag(toSimpleCodingList(theTagsForUpdate));
+
+		maybeAddHeaderInterceptor(myClient, theHeaders);

 		myClient.update().resource(patient).execute();

 		waitForQueueToDrain();

-		// Should receive all three meta tags
-		List<String> expected = List.of("tag-1", "tag-2", "tag-3");
-		resource = fetchSingleResourceFromSubscriptionTerminalEndpoint();
-		receivedPatient = (Patient) resource;
-		List<Coding> receivedTagList = receivedPatient.getMeta().getTag();
+		receivedPatient = fetchSingleResourceFromSubscriptionTerminalEndpoint();

 		ourLog.info(getFhirContext().newJsonParser().setPrettyPrint(true).encodeResourceToString(receivedPatient));
-		assertThat(receivedTagList.size(), is(equalTo(3)));
-		List<String> actual = receivedTagList.stream().map(t -> t.getCode()).sorted().collect(Collectors.toList());
-		assertTrue(expected.equals(actual));
+
+		List<String> receivedTagList = toSimpleTagList(receivedPatient.getMeta().getTag());
+		assertThat(receivedTagList, containsInAnyOrder(theExpectedTags.toArray()));
 	}

-	private IBaseResource fetchSingleResourceFromSubscriptionTerminalEndpoint() {
+	private void maybeAddHeaderInterceptor(IGenericClient theClient, List<Header> theHeaders) {
+		if (theHeaders.isEmpty()) {
+			return;
+		}
+		AdditionalRequestHeadersInterceptor additionalRequestHeadersInterceptor = new AdditionalRequestHeadersInterceptor();
+		theHeaders.forEach(aHeader ->
+			additionalRequestHeadersInterceptor.addHeaderValue(aHeader.getName(), aHeader.getValue()));
+		theClient.registerInterceptor(additionalRequestHeadersInterceptor);
+	}
+
+	private List<Coding> toSimpleCodingList(List<String> theTags) {
+		return theTags.stream().map(theString -> new Coding().setCode(theString)).collect(Collectors.toList());
+	}
+
+	private List<String> toSimpleTagList(List<Coding> theTags) {
+		return theTags.stream().map(t -> t.getCode()).collect(Collectors.toList());
+	}
+
+	private static Coding toSimpleCode(String theCode) {
+		return new Coding().setCode(theCode);
+	}
+
+	private <T> T fetchSingleResourceFromSubscriptionTerminalEndpoint() {
 		assertThat(handler.getMessages().size(), is(equalTo(1)));
 		ResourceModifiedJsonMessage resourceModifiedJsonMessage = handler.getMessages().get(0);
 		ResourceModifiedMessage payload = resourceModifiedJsonMessage.getPayload();
 		String payloadString = payload.getPayloadString();
 		IBaseResource resource = myFhirContext.newJsonParser().parseResource(payloadString);
 		handler.clearMessages();
-		return resource;
+		return (T) resource;
 	}
 }
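The parameterized cases in `metaTagsSource()` encode two behaviors: a plain update merges the update's tags with whatever is already stored (set union), while sending the `HEADER_META_SNAPSHOT_MODE` header with value `TAG` makes the update's tag list authoritative. A minimal plain-Java sketch of that expectation (illustrative, not HAPI API):

```java
// Sketch of the tag-retention rule exercised by metaTagsSource():
// plain update -> union of stored and updated tags; snapshot mode -> update wins.
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class TagMergeSketch {

	static Set<String> deliveredTags(List<String> storedTags, List<String> updateTags, boolean snapshotMode) {
		if (snapshotMode) {
			// snapshot mode: the update's tag list replaces what was stored
			return new LinkedHashSet<>(updateTags);
		}
		// default mode: tags are merged, so nothing is ever removed by an update
		Set<String> merged = new LinkedHashSet<>(storedTags);
		merged.addAll(updateTags);
		return merged;
	}

	public static void main(String[] args) {
		// Mirrors the first and the snapshot-mode rows of metaTagsSource()
		if (!deliveredTags(List.of("tag-1", "tag-2"), List.of("tag-3"), false)
				.equals(Set.of("tag-1", "tag-2", "tag-3"))) throw new AssertionError();
		if (!deliveredTags(List.of("tag-1", "tag-2"), List.of("tag-3"), true)
				.equals(Set.of("tag-3"))) throw new AssertionError();
		if (!deliveredTags(List.of("tag-1", "tag-2"), List.of(), true)
				.equals(Set.of())) throw new AssertionError();
		System.out.println("ok");
	}
}
```

This also explains the commented-out row in `metaTagsSource()`: with merge semantics, updating with an empty tag list changes nothing, so no delivery is triggered.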

View File

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

View File

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

View File

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.9.2-SNAPSHOT</version>
+		<version>6.9.3-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

View File

@@ -72,7 +72,7 @@ public class TestDaoSearch {
 		}
 	}

-	@Autowired
+	@Autowired(required = false)
 	private IFulltextSearchSvc myFulltextSearchSvc;

 	final FhirContext myFhirCtx;

View File

@@ -0,0 +1,281 @@
/*-
* #%L
* HAPI FHIR JPA Server Test Utilities
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.jpa.search;
import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
import ca.uhn.fhir.jpa.dao.TestDaoSearch;
import ca.uhn.fhir.test.utilities.ITestDataBuilder;
import org.hl7.fhir.instance.model.api.IIdType;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.condition.EnabledIf;
/**
* Test cases for _source search parameter.
*/
public abstract class BaseSourceSearchParameterTestCases implements ITestDataBuilder.WithSupport {
final ITestDataBuilder.Support myTestDataBuilder;
final TestDaoSearch myTestDaoSearch;
final JpaStorageSettings myStorageSettings;
protected BaseSourceSearchParameterTestCases(
ITestDataBuilder.Support theTestDataBuilder,
TestDaoSearch theTestDaoSearch,
JpaStorageSettings theStorageSettings) {
myTestDataBuilder = theTestDataBuilder;
myTestDaoSearch = theTestDaoSearch;
myStorageSettings = theStorageSettings;
}
/**
* Enable if requestId within _source Search Parameter is supported
* Example: _source={sourceURI}#{requestId}
*/
protected abstract boolean isRequestIdSupported();
@Override
public Support getTestDataBuilderSupport() {
return myTestDataBuilder;
}
@AfterEach
public final void after() {
myTestDataBuilder.setRequestId(null);
myStorageSettings.setStoreMetaSourceInformation(new JpaStorageSettings().getStoreMetaSourceInformation());
}
@BeforeEach
public void before() {
myStorageSettings.setStoreMetaSourceInformation(
JpaStorageSettings.StoreMetaSourceInformationEnum.SOURCE_URI_AND_REQUEST_ID);
}
@Test
public void testSearch_withSource_returnsCorrectBundle() {
IIdType pt0id = createPatient(withSource("http://host/0"), withActiveTrue());
IIdType pt1id = createPatient(withSource("http://host/1"), withActiveTrue());
myTestDaoSearch.assertSearchFinds("search by source URI finds", "Patient?_source=http://host/0", pt0id);
myTestDaoSearch.assertSearchNotFound("search by source URI not found", "Patient?_source=http://host/0", pt1id);
}
@EnabledIf("isRequestIdSupported")
@Test
public void testSearch_withRequestIdAndSource_returnsCorrectBundle() {
myTestDataBuilder.setRequestId("a_request_id");
IIdType pt0id = createPatient(withSource("http://host/0"), withActiveTrue());
IIdType pt1id = createPatient(withSource("http://host/1"), withActiveTrue());
myTestDataBuilder.setRequestId("b_request_id");
IIdType pt2id = createPatient(withSource("http://host/1"), withActiveTrue());
myTestDaoSearch.assertSearchFinds("search by requestId finds", "Patient?_source=#a_request_id", pt0id, pt1id);
myTestDaoSearch.assertSearchNotFound("search by requestId not found", "Patient?_source=#a_request_id", pt2id);
myTestDaoSearch.assertSearchFinds(
"search by source URI and requestId finds", "Patient?_source=http://host/0#a_request_id", pt0id);
myTestDaoSearch.assertSearchNotFound(
"search by source URI and requestId not found",
"Patient?_source=http://host/0#a_request_id",
pt1id,
pt2id);
}
@Test
public void testSearchSource_whenSameSourceForMultipleResourceTypes_willMatchSearchResourceTypeOnly() {
String sourceUrn = "http://host/0";
myTestDataBuilder.setRequestId("a_request_id");
IIdType pt0id = createPatient(withSource(sourceUrn), withActiveTrue());
IIdType ob0id = createObservation(withSource(sourceUrn), withStatus("final"));
myTestDaoSearch.assertSearchFinds(
"search source URI for Patient finds", "Patient?_source=http://host/0", pt0id);
myTestDaoSearch.assertSearchNotFound(
"search source URI for Patient - Observation not found", "Patient?_source=http://host/0", ob0id);
}
@Test
public void testSearchSource_withOrJoinedParameter_returnsUnionResultBundle() {
myTestDataBuilder.setRequestId("a_request_id");
IIdType pt0id = createPatient(withSource("http://host/0"), withActiveTrue());
IIdType pt1id = createPatient(withSource("http://host/1"), withActiveTrue());
createPatient(withSource("http://host/2"), withActiveTrue());
myTestDaoSearch.assertSearchFinds(
"search source URI with union", "Patient?_source=http://host/0,http://host/1", pt0id, pt1id);
}
@EnabledIf("isRequestIdSupported")
@Test
public void testSearch_withSourceAndRequestId_returnsIntersectionResultBundle() {
myTestDataBuilder.setRequestId("a_request_id");
IIdType pt0id = createPatient(withSource("http://host/0"), withActiveTrue());
myTestDataBuilder.setRequestId("b_request_id");
IIdType pt1id = createPatient(withSource("http://host/0"), withActiveTrue());
IIdType pt2id = createPatient(withSource("http://host/1"), withActiveTrue());
myTestDaoSearch.assertSearchFinds(
"search for source URI and requestId intersection finds",
"Patient?_source=http://host/0&_source=#a_request_id",
pt0id);
myTestDaoSearch.assertSearchNotFound(
"search for source URI and requestId intersection not found",
"Patient?_source=http://host/0&_source=#a_request_id",
pt1id,
pt2id);
}
@Test
public void testSearchSource_withContainsModifier_returnsCorrectBundle() {
myStorageSettings.setAllowContainsSearches(true);
IIdType p1Id = createPatient(withSource("http://some-source"), withActiveTrue(), withFamily("Family"));
IIdType p2Id = createPatient(withSource("http://some-source/v1/321"), withActiveTrue());
IIdType p3Id = createPatient(withSource("http://another-source/v1"), withActiveTrue(), withFamily("Family"));
myTestDaoSearch.assertSearchFinds(
"search matches both sources (same case search)", "Patient?_source:contains=some-source", p1Id, p2Id);
myTestDaoSearch.assertSearchFinds(
"search matches both sources (case insensitive search)",
"Patient?_source:contains=Some-Source",
p1Id,
p2Id);
myTestDaoSearch.assertSearchFinds(
"search matches all sources (union search)",
"Patient?_source:contains=Another-Source,some-source",
p1Id,
p2Id,
p3Id);
myTestDaoSearch.assertSearchFinds(
"search matches one source (intersection with family SearchParameter)",
"Patient?_source:contains=Another-Source,some-source&family=Family,YourFamily",
p3Id);
myTestDaoSearch.assertSearchNotFound(
"search returns empty bundle (contains with missing=true)",
"Patient?_source:contains=Another-Source,some-source&_source:missing=true",
p1Id,
p2Id,
p3Id);
}
@Test
public void testSearchSource_withMissingModifierFalse_returnsNonEmptySources() {
IIdType p1Id = createPatient(withSource("http://some-source/v1"), withActiveTrue());
createPatient(withActiveTrue());
myTestDaoSearch.assertSearchFinds("search matches non-empty source", "Patient?_source:missing=false", p1Id);
}
@Test
public void testSearchSource_withMissingModifierTrue_returnsEmptySources() {
createPatient(withSource("http://some-source/v1"), withActiveTrue(), withFamily("Family"));
IIdType p2Id = createPatient(withActiveTrue(), withFamily("Family"));
myTestDaoSearch.assertSearchFinds("search matches empty source", "Patient?_source:missing=true", p2Id);
myTestDaoSearch.assertSearchFinds(
"search matches empty source with family parameter intersection",
"Patient?_source:missing=true&family=Family",
p2Id);
}
@Test
public void testSearchSource_withAboveModifier_returnsSourcesAbove() {
IIdType p1Id = createPatient(withSource("http://some-source/v1/123"), withActiveTrue());
IIdType p2Id = createPatient(withSource("http://some-source/v1/321"), withActiveTrue());
IIdType p3Id = createPatient(withSource("http://some-source/v1/321/v2"), withActiveTrue());
IIdType p4Id = createPatient(withSource("http://another-source"), withActiveTrue());
myTestDaoSearch.assertSearchFinds(
"search matches all sources above",
"Patient?_source:above=http://some-source/v1/321/v2/456",
p2Id,
p3Id);
myTestDaoSearch.assertSearchFinds(
"search matches all sources above", "Patient?_source:above=http://some-source/v1/321/v2", p2Id, p3Id);
myTestDaoSearch.assertSearchFinds(
"search matches source above", "Patient?_source:above=http://some-source/v1/321", p2Id);
myTestDaoSearch.assertSearchNotFound(
"search does not match when source is not above",
"Patient?_source:above=http://some-source/fhir/v5/789",
p1Id,
p2Id,
p3Id,
p4Id);
myTestDaoSearch.assertSearchNotFound(
"search does not match when source is not above",
"Patient?_source:above=http://some-source",
p1Id,
p2Id,
p3Id,
p4Id);
myTestDaoSearch.assertSearchFinds(
"search above with union also matches another source",
"Patient?_source:above=http://another-source,http://some-source/v1/321/v2",
p2Id,
p3Id,
p4Id);
}
@Test
public void testSearchSource_withBelowModifier_returnsSourcesBelow() {
IIdType p1Id = createPatient(withSource("http://some-source/v1/123"), withActiveTrue());
IIdType p2Id = createPatient(withSource("http://some-source/v1"), withActiveTrue());
IIdType p3Id = createPatient(withSource("http://some-source"), withActiveTrue());
IIdType p4Id = createPatient(withSource("http://another-source"), withActiveTrue());
myTestDaoSearch.assertSearchFinds(
"search matches all sources below", "Patient?_source:below=http://some-source", p1Id, p2Id, p3Id);
myTestDaoSearch.assertSearchFinds(
"search below with union",
"Patient?_source:below=http://some-source/v1,http://another-source",
p1Id,
p2Id,
p4Id);
myTestDaoSearch.assertSearchFinds(
"search below with intersection",
"Patient?_source:below=http://some-source/v1&_source:below=http://some-source/v1/123",
p1Id);
myTestDaoSearch.assertSearchNotFound(
"search below with intersection not matches",
"Patient?_source:below=http://some-source/v1&_source:below=http://some-source/v1/123",
p2Id,
p3Id,
p4Id);
}
}
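Several of the tests above pass a `_source` value of the form `sourceUri#requestId` (e.g. `http://host/0#a_request_id`), where the fragment carries the request ID. A minimal, self-contained sketch of that split, using a hypothetical `SourceParam` helper rather than HAPI's actual parsing code:

```java
// Hypothetical illustration of splitting a _source value of the form
// "sourceUri#requestId", as exercised by the tests above. Not HAPI's parser.
public final class SourceParam {
	final String sourceUri;
	final String requestId; // null when no '#' fragment is present

	SourceParam(String theValue) {
		int hash = theValue.indexOf('#');
		if (hash >= 0) {
			sourceUri = theValue.substring(0, hash);
			requestId = theValue.substring(hash + 1);
		} else {
			sourceUri = theValue;
			requestId = null;
		}
	}

	public static void main(String[] args) {
		SourceParam p = new SourceParam("http://host/0#a_request_id");
		System.out.println(p.sourceUri);  // http://host/0
		System.out.println(p.requestId);  // a_request_id
	}
}
```

Note that `_source=#a_request_id` (empty URI, fragment only) is also a valid form in these tests, which is why the request-ID half is taken from the fragment alone.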

View File

@@ -6,6 +6,7 @@ import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.dao.TestDaoSearch;
import ca.uhn.fhir.jpa.search.CompositeSearchParameterTestCases;
import ca.uhn.fhir.jpa.search.QuantitySearchParameterTestCases;
import ca.uhn.fhir.jpa.search.BaseSourceSearchParameterTestCases;
import ca.uhn.fhir.jpa.test.BaseJpaTest;
import ca.uhn.fhir.jpa.test.config.TestR4Config;
import ca.uhn.fhir.storage.test.BaseDateSearchDaoTests;
@@ -91,4 +92,16 @@ public class FhirResourceDaoR4StandardQueriesLuceneTest extends BaseJpaTest {
}
}
@Nested
class SourceSearchParameterTestCases extends BaseSourceSearchParameterTestCases {
SourceSearchParameterTestCases() {
super(myDataBuilder, myTestDaoSearch, myStorageSettings);
}
@Override
protected boolean isRequestIdSupported() {
return false;
}
}
}

View File

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir</artifactId>
-<version>6.9.2-SNAPSHOT</version>
+<version>6.9.3-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
-<version>6.9.2-SNAPSHOT</version>
+<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
-<version>6.9.2-SNAPSHOT</version>
+<version>6.9.3-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

View File

@@ -19,6 +19,7 @@
*/
package ca.uhn.fhir.mdm.api;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.mdm.rules.json.MdmRulesJson;
import java.util.stream.Collectors;
@@ -61,4 +62,14 @@ public interface IMdmSettings {
boolean getSearchAllPartitionForMatch();
void setSearchAllPartitionForMatch(boolean theSearchAllPartitionForMatch);
// TODO: on next bump, make this method non-default
default boolean isAutoExpungeGoldenResources() {
return false;
}
// TODO: on next bump, make this method non-default
default void setAutoExpungeGoldenResources(boolean theShouldAutoExpunge) {
throw new UnsupportedOperationException(Msg.code(2427));
}
}

View File

@@ -53,6 +53,11 @@ public interface IMdmLinkDao<P extends IResourcePersistentId, M extends IMdmLink
List<MdmPidTuple<P>> expandPidsByGoldenResourcePidAndMatchResult(
P theSourcePid, MdmMatchResultEnum theMdmMatchResultEnum);
// TODO: on next bump, make this method non-default
default List<M> findLinksAssociatedWithGoldenResourceOfSourceResourceExcludingNoMatch(P theSourcePid) {
throw new UnsupportedOperationException(Msg.code(2428));
}
List<P> findPidByResourceNameAndThreshold(String theResourceName, Date theHighThreshold, Pageable thePageable);
List<P> findPidByResourceNameAndThresholdAndPartitionId(

View File

@@ -23,26 +23,52 @@ import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.api.model.DeleteConflictList;
import ca.uhn.fhir.jpa.api.svc.IDeleteExpungeSvc;
import ca.uhn.fhir.jpa.api.svc.IIdHelperService;
import ca.uhn.fhir.jpa.api.svc.IMdmClearHelperSvc;
import ca.uhn.fhir.jpa.dao.expunge.IExpungeEverythingService;
import ca.uhn.fhir.mdm.api.IMdmLink;
import ca.uhn.fhir.mdm.api.IMdmLinkUpdaterSvc;
import ca.uhn.fhir.mdm.api.IMdmSettings;
import ca.uhn.fhir.mdm.api.IMdmSubmitSvc;
import ca.uhn.fhir.mdm.api.MdmConstants;
import ca.uhn.fhir.mdm.api.MdmMatchResultEnum;
import ca.uhn.fhir.mdm.dao.IMdmLinkDao;
import ca.uhn.fhir.mdm.model.CanonicalEID;
import ca.uhn.fhir.mdm.model.MdmCreateOrUpdateParams;
import ca.uhn.fhir.mdm.model.MdmTransactionContext;
import ca.uhn.fhir.mdm.svc.MdmLinkDeleteSvc;
import ca.uhn.fhir.mdm.util.EIDHelper;
import ca.uhn.fhir.mdm.util.MdmResourceUtil;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
import ca.uhn.fhir.rest.api.server.storage.IResourcePersistentId;
import ca.uhn.fhir.rest.api.server.storage.TransactionDetails;
import ca.uhn.fhir.rest.server.TransactionLogMessages;
import ca.uhn.fhir.rest.server.exceptions.ForbiddenOperationException;
import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
import org.hl7.fhir.instance.model.api.IAnyResource;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
import static ca.uhn.fhir.mdm.api.MdmMatchResultEnum.MATCH;
import static ca.uhn.fhir.mdm.api.MdmMatchResultEnum.NO_MATCH;
import static ca.uhn.fhir.mdm.api.MdmMatchResultEnum.POSSIBLE_MATCH;
@Service
public class MdmStorageInterceptor implements IMdmStorageInterceptor {
@@ -68,6 +94,21 @@ public class MdmStorageInterceptor implements IMdmStorageInterceptor {
@Autowired
private IMdmSettings myMdmSettings;
@Autowired
private IIdHelperService myIdHelperSvc;
@Autowired
private IMdmLinkDao myMdmLinkDao;
@Autowired
private IMdmSubmitSvc myMdmSubmitSvc;
@Autowired
private DaoRegistry myDaoRegistry;
@Autowired
private IMdmLinkUpdaterSvc mdmLinkUpdaterSvc;
@Hook(Pointcut.STORAGE_PRESTORAGE_RESOURCE_CREATED)
public void blockManualResourceManipulationOnCreate(
IBaseResource theBaseResource,
@@ -147,6 +188,9 @@
}
}
@Autowired
private IMdmClearHelperSvc<? extends IResourcePersistentId<?>> myIMdmClearHelperSvc;
@Hook(Pointcut.STORAGE_PRESTORAGE_RESOURCE_DELETED)
public void deleteMdmLinks(RequestDetails theRequest, IBaseResource theResource) {
if (ourLinksDeletedBeforehand.get()) {
@@ -154,10 +198,111 @@
}
if (myMdmSettings.isSupportedMdmType(myFhirContext.getResourceType(theResource))) {
IIdType sourceId = theResource.getIdElement().toVersionless();
IResourcePersistentId sourcePid =
myIdHelperSvc.getPidOrThrowException(RequestPartitionId.allPartitions(), sourceId);
List<IMdmLink> allLinks =
myMdmLinkDao.findLinksAssociatedWithGoldenResourceOfSourceResourceExcludingNoMatch(sourcePid);
Map<MdmMatchResultEnum, List<IMdmLink>> linksByMatchResult =
allLinks.stream().collect(Collectors.groupingBy(IMdmLink::getMatchResult));
List<IMdmLink> matches = linksByMatchResult.getOrDefault(MATCH, new ArrayList<>());
List<IMdmLink> possibleMatches = linksByMatchResult.getOrDefault(POSSIBLE_MATCH, new ArrayList<>());
if (isDeletingLastMatchedSourceResouce(sourcePid, matches)) {
// We are attempting to delete the only source resource left linked to the golden resource
// In this case, we should automatically delete the golden resource to prevent orphaning
IFhirResourceDao<?> dao = myDaoRegistry.getResourceDao(theResource);
IResourcePersistentId goldenPid = extractGoldenPid(theResource, matches.get(0));
cleanUpPossibleMatches(possibleMatches, dao, goldenPid);
IAnyResource goldenResource = (IAnyResource) dao.readByPid(goldenPid);
myMdmLinkDeleteSvc.deleteWithAnyReferenceTo(goldenResource);
deleteGoldenResource(goldenPid, sourceId, dao, theRequest);
}
myMdmLinkDeleteSvc.deleteWithAnyReferenceTo(theResource);
}
}
private void deleteGoldenResource(
IResourcePersistentId goldenPid,
IIdType theSourceId,
IFhirResourceDao<?> theDao,
RequestDetails theRequest) {
setLinksDeletedBeforehand();
if (myMdmSettings.isAutoExpungeGoldenResources()) {
int numDeleted = deleteExpungeGoldenResource(goldenPid);
if (numDeleted > 0) {
ourLog.info("Removed {} golden resource(s) with references to {}", numDeleted, theSourceId);
}
} else {
String url = theRequest == null ? "" : theRequest.getCompleteUrl();
theDao.deletePidList(
url,
Collections.singleton(goldenPid),
new DeleteConflictList(),
theRequest,
new TransactionDetails());
}
resetLinksDeletedBeforehand();
}
/**
* Clean up possible matches associated with the golden resource when they are the only links left,
* since they are no longer "real" matches.
* The possible-match source resources are resubmitted for matching.
*/
private void cleanUpPossibleMatches(
List<IMdmLink> possibleMatches, IFhirResourceDao<?> theDao, IResourcePersistentId theGoldenPid) {
IAnyResource goldenResource = (IAnyResource) theDao.readByPid(theGoldenPid);
for (IMdmLink possibleMatch : possibleMatches) {
if (possibleMatch.getGoldenResourcePersistenceId().equals(theGoldenPid)) {
IBaseResource sourceResource = theDao.readByPid(possibleMatch.getSourcePersistenceId());
MdmCreateOrUpdateParams params = new MdmCreateOrUpdateParams();
params.setGoldenResource(goldenResource);
params.setSourceResource((IAnyResource) sourceResource);
params.setMatchResult(NO_MATCH);
MdmTransactionContext mdmContext =
createMdmContext(MdmTransactionContext.OperationType.UPDATE_LINK, sourceResource.fhirType());
params.setMdmContext(mdmContext);
mdmLinkUpdaterSvc.updateLink(params);
}
}
}
private IResourcePersistentId extractGoldenPid(IBaseResource theResource, IMdmLink theMdmLink) {
IResourcePersistentId goldenPid = theMdmLink.getGoldenResourcePersistenceId();
goldenPid = myIdHelperSvc.newPidFromStringIdAndResourceName(goldenPid.toString(), theResource.fhirType());
return goldenPid;
}
private boolean isDeletingLastMatchedSourceResouce(IResourcePersistentId theSourcePid, List<IMdmLink> theMatches) {
return theMatches.size() == 1
&& theMatches.get(0).getSourcePersistenceId().equals(theSourcePid);
}
private MdmTransactionContext createMdmContext(
MdmTransactionContext.OperationType theOperation, String theResourceType) {
TransactionLogMessages transactionLogMessages = TransactionLogMessages.createNew();
MdmTransactionContext retVal = new MdmTransactionContext(transactionLogMessages, theOperation);
retVal.setResourceType(theResourceType);
return retVal;
}
private int deleteExpungeGoldenResource(IResourcePersistentId theGoldenPid) {
IDeleteExpungeSvc deleteExpungeSvc = myIMdmClearHelperSvc.getDeleteExpungeSvc();
return deleteExpungeSvc.deleteExpunge(new ArrayList<>(Collections.singleton(theGoldenPid)), false, null);
}
private void forbidIfModifyingExternalEidOnTarget(IBaseResource theNewResource, IBaseResource theOldResource) {
List<CanonicalEID> newExternalEids = Collections.emptyList();
List<CanonicalEID> oldExternalEids = Collections.emptyList();

View File

@@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR - Master Data Management
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.mdm.model;
import ca.uhn.fhir.mdm.api.MdmMatchResultEnum;

View File

@@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR - Master Data Management
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.mdm.model;
import ca.uhn.fhir.rest.api.server.RequestDetails;

View File

@@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR - Master Data Management
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.mdm.model;
import ca.uhn.fhir.rest.api.server.RequestDetails;

View File

@@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR - Master Data Management
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.mdm.model.mdmevents;
import ca.uhn.fhir.model.api.IModelJson;

View File

@@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR - Master Data Management
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.mdm.model.mdmevents;
import ca.uhn.fhir.model.api.IModelJson;

View File

@@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR - Master Data Management
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.mdm.model.mdmevents;
import ca.uhn.fhir.model.api.IModelJson;

View File

@@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR - Master Data Management
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.mdm.model.mdmevents;
import ca.uhn.fhir.model.api.IModelJson;

View File

@@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR - Master Data Management
* %%
* Copyright (C) 2014 - 2023 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.mdm.model.mdmevents;
import ca.uhn.fhir.model.api.IModelJson;

Some files were not shown because too many files have changed in this diff.