Bulk export reducer step fix mergeback (#4606)
* One more fix for #4467
* Enabling massIngestionMode causes incomplete resource deletion (#4476)
* Adding initial test.
* Adding fix and subsequent test.
* Adding changelog.
---------
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
* Provide the capability to request that the name of the subscription matching channel be unqualified (#4464)
* Adding initial test.
* Adding initial solution implementation.
* Adding change log and code clean up.
* Addressing comments from 1st code review.
---------
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
* Change visibility of migration method (#4471)
* Change migration visibility
* Add empty migration method for 640
---------
Co-authored-by: nathaniel.doef <nathaniel.doef@smilecdr.com>
* Fix subscription validation not to validate partition ID when invoked from an update pointcut (#4484)
* First commit: Make SubscriptionValidatingInterceptor aware of which Pointcut is being called. In validatePermissions(), skip determinePartition() if the Pointcut is STORAGE_PRESTORAGE_RESOURCE_UPDATED. Fix resulting compile errors in various unit tests.
* Fix/enhance unit tests. Mark methods as deprecated instead of deleting them. Add proper error code. Complete changelog.
* Remove erroneous TODOs and tweak the validation logic.
* Enhance unit tests and fix changelog.
* Reindex batch job fails when processing deleted resources. (#4482)
* Adding changelog.
* Providing solution and adding changelog.
* Adding new test.
---------
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
* Cleaning up checkstyle files (#4470)
* Cleaning up checkstyle files
* One more fix for #4467 (#4469)
* Added exclusions for files at the base project level so checkstyle doesn't error out
* Duplicate error code from merge
* Changing lifecycle goal for all module checkstyle checks
* Moving checkstyle to the base pom file, changing execution phase on the base check, cleaning dependencies, resolving duplicate error code
* WIP
* Trying to figure out why the pipeline cannot copy files
* Removing modules that don't actually need to be built.
* I messed up the version
---------
Co-authored-by: James Agnew <jamesagnew@gmail.com>
* Bump core to 5.6.881 (#4496)
* Bump core to 5.6.881-SNAPSHOT
* Work on fixing tests
* Work on fixing tests 2
* Bump to core release
---------
Co-authored-by: dotasek <david.otasek@smilecdr.com>
* Issue 4486: MDM inconsistent possible-match score values (#4487)
* Extract method for readability
* Always save normalized score values in POSSIBLE_MATCH links.
* Avoid setting properties to null values. Adjust test.
* Simplify fix
* Fix test. Add RangeTestHelper.
---------
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
* Revert "cleaning up checkstyle files (#4470)". This reverts commit efae3b5d5f.
* Core version fix
* Loosen rules for id helper
* License
* Fix batch2 reduction step (#4499)
* Fix bug where FINALIZE jobs are not cancellable
* Moved reduction step to message handler
* Moving reduction step to queue
* Adding changelog
* Cleaning up
* Review fixes
* Review fix
---------
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-mbp.home>
* Scheduled batch 2 bulk export job and binary delete (#4492)
* First commit: Scheduled batch 2 bulk export job and binary delete, an incomplete mock-based unit test, and a mess of TODOs and code that needs to be deleted.
* Refine solution and add a concrete unit test, but still work to do.
* Comment out code in cancelAndPurgeAllJobs() and see if it breaks the pipeline.
* Unit tests complete. New Msg code for the new IJobPersistence.fetchInstances() method. Clean up TODOs and add others.
* Finish final touches on implementation.
* Add changelog.
* Various cleanup.
* Code review feedback.
* Small tweak to changelog.
* Last code review tweak.
* Address more code review comments.
* Reverse changes to consider work chunks. Add a constant for write-to-binary.
* Change bulk import test for valueUri type (#4503)
* Change tests
* Suggested test change
* CVE resolutions (#4513)
* Bump Postgres for CVE
* Bump Jetty
* Version bump
* Remove comments
* Revert bump
* Add check in scanner (#4518)
* 4516: Create hapi-fhir-cli command to clear stale lock entries (#4517)
* Initial implementation
* Better tests
* Add changelog and docs
* Forgotten files
* Code review comments
* Fix checkstyle
* Unable to expunge CodeSystem (#4507)
* Changes for GL-3943
* Changes for GL-3943
---------
Co-authored-by: isaacwen <isaac.wen@smilecdr.com>
* New line
* Update to documentation regarding narrative generation; providing changelog (#4521)
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
* Changed what score is set for MDM links that created a new golden resource (#4514)
* Changed what score is set for MDM links that created a new golden resource
* Fix test
---------
Co-authored-by: Long Ma <long@smilecdr.com>
* REVERT: change to operationoutcome.html
* Trying to fix BulkDataExportTest testGroupBulkExportNotInGroup_DoesNotShowUp (#4527)
* Trying to fix BulkDataExportTest testGroupBulkExportNotInGroup_DoesNotShowUp
* Added changelog
---------
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-mbp.home>
* Fix build (#4530)
* Making narrative_generation.md reference an HTML snippet (#4531)
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
* Fixed the issue of the meta.source field being inconsistently populated in subscription messages for different requests (#4524)
* Fix + test
* Minor fix
* Addressing suggestion
* Minor changes
* 4441: rel_6_4 bad references creation bug (#4519)
* Adding a test
* Fail when reference enforcement on type, enforcement on write, and autocreate are all true
* Update to code
* Removing a line
* Cleanup
* Removing check on URN
* Changing just to trigger a build
* Adding a comment to the pom
* Updating test for better information
---------
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-mbp.home>
* Fixed channel import NullPointerException from a null header (#4534)
* retryCount 0 on null header + test + changelog
* Suggested changes
* Revert "fixed the issue of meta.source field inconsistently populated in subscription messages for different requests (#4524)" (#4535). This reverts commit 53252b8d15.
* Better error handling for when channel type is not supported (#4538)
Co-authored-by: kylejule <kyle.jule@smilecdr.com>
* Avoid logging message payloads that contain sensitive data (#4537). Don't log payloads; they may contain sensitive data.
* Bulk export bug with many resources and low max file size (#4506)
* Failing test
* Fix + changelog
* Tweak
* Add method to IJobPersistence to use a Stream
* Tweak
* Tweak
* Decrease test time
* Clean up
* Code review comments
* Version bump
* Increase timeout limit to match BulkExportUseCaseTest
* Shorten test
* Maintenance pass
* Add logging
* Revert "add logging". This reverts commit b0453fd953.
* Revert "maintenance pass". This reverts commit bbc7418d51.
* Test
* Trying to fix BulkDataExportTest testGroupBulkExportNotInGroup_DoesNotShowUp
* Shorten tests
* Logging
* Move test location
* Fixes a regression caused by my change in hapi-fhir
* Timeout
* Revert "fixes a regression caused by my change in hapi-fhir". This reverts commit 4b58013149.
* Testing
* Revert "testing". This reverts commit aafc95c2f3.
---------
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-mbp.home>
* Bump version
* License updates
* Downgrade dep
* Updating version to: 6.4.1 post release.
* Add javadocs and sources to our serviceloaders
* Reset version
* Change parent
* Remove bumped version
* License fixes, new parent
* Updating version to: 6.4.1 post release.
* Fix bad creation of VersionEnum
* Improve performance on bulk export
* Add changelog
* Start working on cleaned-up reducer
* Clean up batch calls
* Work on issues
* Build fixes
* TypedBundleProvider getAllResources override (#4552)
* TypedBundleProvider getAllResources override
* Added a test to the immunization tests that validates pagination does not result in a null pointer if over the default queryCount
* Moved the HapiFhirDal test to its own test class
* Removed unused imports
* Update to use JpaStorageSettings
* Adding changelog for issue 4551
* Fix changelog spacing
* Changelog type to fix
---------
Co-authored-by: justin.mckelvy <justin.mckelvy@smilecdr.com>
Co-authored-by: Jonathan Percival <jonathan.i.percival@gmail.com>
* Add backport info
* Upgrade core to 5.6.97, make adjustments in hapi-fhir, and ensure that all tests pass (#4579)
* First commit: Create a new branch from the release branch with changes from James' branch. This probably won't compile as the work is incomplete.
* Second round of changes from integrating James' branch.
* Mark most test failures with TODOs.
* Add whitespace
* Add changes to FhirPathR4 to set FHIRPathEngine to non-strict FHIRPath evaluation.
* Fix CreatePackageCommandTest to assert null instead of empty string. Comments on tests that fail due to the double-quote encoding problem.
* Downgrade to core 5.6.97.
* Fix another test and remove TODOs.
* Fix changelog.
* Clean up some erroneous changes and TODOs.
---------
Co-authored-by: Tadgh <garygrantgraham@gmail.com>
* Fix up DAL test
* Address leftover code review feedback from the upgrade to core 5.6.97. (#4585)
* Exclude pinned core deps
* Force pin structs
* Add model changes to IBaseCoding and related changes (#4587)
* Add IBaseCoding changes and tinder changes
* Fix up tag definition
* Converter addition
* Fix unit test and add changelog.
* Add Jira issue to changelog.
---------
Co-authored-by: Tadgh <garygrantgraham@gmail.com>
* Fix changelog
* Tidy metadata
* Test fixing
* Test fixes
* Work on progress
* Add changelog
* Build tweak
* Fixes
* Disable intermittently failing tests. (#4593)
* Rename tests to IT
* Disable more intermittently failing tests (#4595)
* Disable more intermittently failing tests.
* Disable another intermittently failing test.
* ITify
* Disable yet another intermittently failing test. (#4596)
* Disable
* Disables
* Fixes
* Disables
* Fix compile
* Test fixes
* Updating version to: 6.4.2 post release.
* Bump to 6.4.2-SNAPSHOT
* Fix compile
* Fix version
* Address review comments
* Review comments
* Version bump
* Compile fix
* Test fixes
* Compile fixes
* One more compile fix
* Test fixes
* Version bump
* Resolve FIXME
---------
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
Co-authored-by: Nathan Doef <n.doef@protonmail.com>
Co-authored-by: nathaniel.doef <nathaniel.doef@smilecdr.com>
Co-authored-by: Luke deGruchy <luke.degruchy@smilecdr.com>
Co-authored-by: Mark Iantorno <markiantorno@gmail.com>
Co-authored-by: dotasek <dotasek.dev@gmail.com>
Co-authored-by: dotasek <david.otasek@smilecdr.com>
Co-authored-by: jmarchionatto <60409882+jmarchionatto@users.noreply.github.com>
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
Co-authored-by: Tadgh <garygrantgraham@gmail.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-mbp.home>
Co-authored-by: samguntersmilecdr <123124187+samguntersmilecdr@users.noreply.github.com>
Co-authored-by: Isaac Wen <76772867+isaacwen@users.noreply.github.com>
Co-authored-by: isaacwen <isaac.wen@smilecdr.com>
Co-authored-by: longma1 <32119004+longma1@users.noreply.github.com>
Co-authored-by: Long Ma <long@smilecdr.com>
Co-authored-by: Qingyixia <106992634+Qingyixia@users.noreply.github.com>
Co-authored-by: KGJ-software <39975592+KGJ-software@users.noreply.github.com>
Co-authored-by: kylejule <kyle.jule@smilecdr.com>
Co-authored-by: michaelabuckley <michaelabuckley@gmail.com>
Co-authored-by: Justin McKelvy <60718638+Capt-Mac@users.noreply.github.com>
Co-authored-by: justin.mckelvy <justin.mckelvy@smilecdr.com>
Co-authored-by: Jonathan Percival <jonathan.i.percival@gmail.com>
This commit is contained in:
parent 2083b7ca74
commit ae1d249d99
@@ -5,7 +5,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-fhir</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
@@ -133,6 +133,15 @@ public class DefaultProfileValidationSupport implements IValidationSupport {
         structureDefinitionResources.add("/org/hl7/fhir/r4/model/profile/profiles-others.xml");
         structureDefinitionResources.add("/org/hl7/fhir/r4/model/extension/extension-definitions.xml");
         break;
+      case R4B:
+        terminologyResources.add("/org/hl7/fhir/r4b/model/valueset/valuesets.xml");
+        terminologyResources.add("/org/hl7/fhir/r4b/model/valueset/v2-tables.xml");
+        terminologyResources.add("/org/hl7/fhir/r4b/model/valueset/v3-codesystems.xml");
+        structureDefinitionResources.add("/org/hl7/fhir/r4b/model/profile/profiles-resources.xml");
+        structureDefinitionResources.add("/org/hl7/fhir/r4b/model/profile/profiles-types.xml");
+        structureDefinitionResources.add("/org/hl7/fhir/r4b/model/profile/profiles-others.xml");
+        structureDefinitionResources.add("/org/hl7/fhir/r4b/model/extension/extension-definitions.xml");
+        break;
       case R5:
         structureDefinitionResources.add("/org/hl7/fhir/r5/model/profile/profiles-resources.xml");
         structureDefinitionResources.add("/org/hl7/fhir/r5/model/profile/profiles-types.xml");
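
With the R4B case added above, a default validation support built for an R4B context now loads that version's own conformance bundles. A minimal usage sketch, assuming a HAPI FHIR 6.x classpath (FhirContext.forR4B() and this constructor exist in that line; the snippet is illustrative and not part of the commit):

    import ca.uhn.fhir.context.FhirContext;
    import ca.uhn.fhir.context.support.DefaultProfileValidationSupport;
    import org.hl7.fhir.instance.model.api.IBaseResource;

    public class R4bValidationSupportDemo {
        public static void main(String[] args) {
            FhirContext ctx = FhirContext.forR4B();
            DefaultProfileValidationSupport support = new DefaultProfileValidationSupport(ctx);
            // With the new R4B case, an R4B context serves its own profile and terminology bundles.
            IBaseResource patient =
                    support.fetchStructureDefinition("http://hl7.org/fhir/StructureDefinition/Patient");
            System.out.println(patient != null); // true once the R4B bundles are loaded
        }
    }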
@@ -221,15 +230,41 @@ public class DefaultProfileValidationSupport implements IValidationSupport {
     @Override
     public IBaseResource fetchStructureDefinition(String theUrl) {
         String url = theUrl;
-        if (url.startsWith(URL_PREFIX_STRUCTURE_DEFINITION)) {
-            // no change
-        } else if (url.indexOf('/') == -1) {
-            url = URL_PREFIX_STRUCTURE_DEFINITION + url;
-        } else if (StringUtils.countMatches(url, '/') == 1) {
-            url = URL_PREFIX_STRUCTURE_DEFINITION_BASE + url;
+        if (!url.startsWith(URL_PREFIX_STRUCTURE_DEFINITION)) {
+            if (url.indexOf('/') == -1) {
+                url = URL_PREFIX_STRUCTURE_DEFINITION + url;
+            } else if (StringUtils.countMatches(url, '/') == 1) {
+                url = URL_PREFIX_STRUCTURE_DEFINITION_BASE + url;
+            }
         }
         Map<String, IBaseResource> structureDefinitionMap = provideStructureDefinitionMap();
-        return structureDefinitionMap.get(url);
+        IBaseResource retVal = structureDefinitionMap.get(url);
+        if (retVal == null) {
+            if (url.startsWith(URL_PREFIX_STRUCTURE_DEFINITION)) {
+                /*
+                 * A few built-in R4 SearchParameters have the wrong casing for primitive
+                 * search parameters eg "value.as(String)" when it should be
+                 * "value.as(string)". This lets us be a bit lenient about this.
+                 */
+                if (myCtx.getVersion().getVersion() == FhirVersionEnum.R4 || myCtx.getVersion().getVersion() == FhirVersionEnum.R4B || myCtx.getVersion().getVersion() == FhirVersionEnum.R5) {
+                    String end = url.substring(URL_PREFIX_STRUCTURE_DEFINITION.length());
+                    if (Character.isUpperCase(end.charAt(0))) {
+                        String newEnd = Character.toLowerCase(end.charAt(0)) + end.substring(1);
+                        String alternateUrl = URL_PREFIX_STRUCTURE_DEFINITION + newEnd;
+                        retVal = structureDefinitionMap.get(alternateUrl);
+                        if (retVal != null) {
+                            retVal = myCtx.newTerser().clone(retVal);
+                            myCtx.newTerser().setElement(retVal, "type", end);
+                        }
+                    }
+                }
+            }
+        }
+        return retVal;
     }
 
     @Override
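
The retry logic above reduces to a small standalone sketch (names here are illustrative; only the map-lookup fallback is shown, not the terser-based clone that rewrites the "type" element):

    import java.util.HashMap;
    import java.util.Map;

    public class CaseLenientLookupSketch {
        static final String PREFIX = "http://hl7.org/fhir/StructureDefinition/";

        // Retry a failed StructureDefinition lookup with the first letter of the
        // trailing segment lower-cased, e.g. ".../String" -> ".../string".
        static String fetch(Map<String, String> definitions, String url) {
            String retVal = definitions.get(url);
            if (retVal == null && url.startsWith(PREFIX)) {
                String end = url.substring(PREFIX.length());
                if (!end.isEmpty() && Character.isUpperCase(end.charAt(0))) {
                    String alternate = PREFIX + Character.toLowerCase(end.charAt(0)) + end.substring(1);
                    retVal = definitions.get(alternate);
                }
            }
            return retVal;
        }

        public static void main(String[] args) {
            Map<String, String> defs = new HashMap<>();
            defs.put(PREFIX + "string", "string profile");
            // A built-in SearchParameter with "value.as(String)" yields the wrongly-cased URL:
            System.out.println(fetch(defs, PREFIX + "String")); // falls back to "string profile"
        }
    }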
@@ -59,6 +59,8 @@ public class Tag extends BaseElement implements IElement, IBaseCoding {
     private String myLabel;
     private String myScheme;
     private String myTerm;
+    private String myVersion;
+    private boolean myUserSelected;
 
     public Tag() {
     }
@@ -124,6 +126,16 @@ public class Tag extends BaseElement implements IElement, IBaseCoding {
             return false;
         } else if (!myTerm.equals(other.myTerm))
             return false;
+
+        if (myVersion == null) {
+            if (other.getVersion() != null)
+                return false;
+        } else if (!myVersion.equals(other.getVersion()))
+            return false;
+
+        if (myUserSelected != other.getUserSelected())
+            return false;
+
         return true;
     }
 
@@ -133,6 +145,8 @@ public class Tag extends BaseElement implements IElement, IBaseCoding {
         int result = 1;
         result = prime * result + ((myScheme == null) ? 0 : myScheme.hashCode());
         result = prime * result + ((myTerm == null) ? 0 : myTerm.hashCode());
+        result = prime * result + ((myVersion == null) ? 0 : myVersion.hashCode());
+        result = prime * result + Boolean.hashCode(myUserSelected);
         return result;
     }
 
@@ -174,6 +188,8 @@ public class Tag extends BaseElement implements IElement, IBaseCoding {
         b.append("Scheme", myScheme);
         b.append("Term", myTerm);
         b.append("Label", myLabel);
+        b.append("Version", myVersion);
+        b.append("UserSelected", myUserSelected);
         return b.toString();
     }
 
@@ -210,4 +226,22 @@ public class Tag extends BaseElement implements IElement, IBaseCoding {
         return this;
     }
+
+    @Override
+    public String getVersion() { return myVersion; }
+
+    @Override
+    public IBaseCoding setVersion(String theVersion) {
+        myVersion = theVersion;
+        return this;
+    }
+
+    @Override
+    public boolean getUserSelected() { return myUserSelected; }
+
+    @Override
+    public IBaseCoding setUserSelected(boolean theUserSelected) {
+        myUserSelected = theUserSelected;
+        return this;
+    }
+
 }
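
Taken together, the Tag changes fold the two new fields into a tag's identity. A quick illustrative check, assuming this is HAPI's ca.uhn.fhir.model.api.Tag (not code from the commit):

    import ca.uhn.fhir.model.api.Tag;

    public class TagIdentityDemo {
        public static void main(String[] args) {
            Tag a = new Tag().setScheme("http://example.org").setTerm("code");
            Tag b = new Tag().setScheme("http://example.org").setTerm("code");
            System.out.println(a.equals(b));  // true: same scheme and term
            b.setVersion("2");
            System.out.println(a.equals(b));  // false after this commit: version now participates
            System.out.println(a.hashCode() == b.hashCode()); // false here: version feeds the hash
        }
    }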
@@ -115,6 +115,7 @@ public enum VersionEnum {
     V6_3_0,
     V6_4_0,
     V6_4_1,
+    V6_4_2,
     V6_5_0,
     V6_6_0
     ;
@@ -28,10 +28,19 @@ public interface IBaseCoding extends IBase {
 
     String getSystem();
 
+    boolean getUserSelected();
+
+    String getVersion();
+
     IBaseCoding setCode(String theTerm);
 
     IBaseCoding setDisplay(String theLabel);
 
     IBaseCoding setSystem(String theScheme);
 
+    IBaseCoding setVersion(String theVersion);
+
+    IBaseCoding setUserSelected(boolean theUserSelected);
+
 }
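
Because the accessors land on IBaseCoding itself, version-independent code can read and write these fields without casting to a concrete model class. A hedged sketch (any IBaseCoding implementation would do; Tag is used for concreteness, and the system URL is made up):

    import ca.uhn.fhir.model.api.Tag;
    import org.hl7.fhir.instance.model.api.IBaseCoding;

    public class CodingVersionDemo {
        // Works for any coding-like object: model Coding classes and Tag alike.
        static void stamp(IBaseCoding theCoding) {
            theCoding.setSystem("http://example.org/tags"); // illustrative system URL
            theCoding.setCode("sample");
            theCoding.setVersion("1");
            theCoding.setUserSelected(true);
        }

        public static void main(String[] args) {
            IBaseCoding tag = new Tag();
            stamp(tag);
            System.out.println(tag.getVersion() + " userSelected=" + tag.getUserSelected());
        }
    }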
@@ -27,7 +27,7 @@ public class TagTest {
     @Test
     public void testHashCode() {
         Tag tag1 = new Tag().setScheme("scheme").setTerm("term").setLabel("label");
-        assertEquals(1920714536, tag1.hashCode());
+        assertEquals(-1029266947, tag1.hashCode());
     }
 
     @Test
@@ -4,14 +4,14 @@
 <modelVersion>4.0.0</modelVersion>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir-bom</artifactId>
-<version>6.5.3-SNAPSHOT</version>
+<version>6.5.4-SNAPSHOT</version>
 <packaging>pom</packaging>
 <name>HAPI FHIR BOM</name>
 
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -237,6 +237,11 @@
   <artifactId>hapi-fhir-validation-resources-r4</artifactId>
   <version>${project.version}</version>
 </dependency>
+<dependency>
+  <groupId>${project.groupId}</groupId>
+  <artifactId>hapi-fhir-validation-resources-r4b</artifactId>
+  <version>${project.version}</version>
+</dependency>
 <dependency>
   <groupId>${project.groupId}</groupId>
   <artifactId>hapi-fhir-validation-resources-r5</artifactId>

@@ -5,7 +5,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-fhir</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../pom.xml</relativePath>
 </parent>

@@ -4,7 +4,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
@@ -103,7 +103,7 @@ public class CreatePackageCommandTest extends BaseTest {
 {
   "name" : "com.example.ig",
   "version" : "1.0.1",
-  "description" : "",
+  "description" : null,
   "fhirVersions" : ["4.0.1"],
   "dependencies" : {
     "hl7.fhir.core" : "4.0.1",

@@ -158,7 +158,7 @@ public class CreatePackageCommandTest extends BaseTest {
 {
   "name" : "com.example.ig",
   "version" : "1.0.1",
-  "description" : "",
+  "description" : null,
   "fhirVersions" : ["4.0.1"]
 }
 """;
@@ -6,7 +6,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-fhir-cli</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../pom.xml</relativePath>
 </parent>

@@ -6,7 +6,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../../hapi-deployable-pom</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-fhir</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../pom.xml</relativePath>
 </parent>

@@ -4,7 +4,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -4,7 +4,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
@@ -273,9 +273,8 @@ public class VersionCanonicalizer {
     retVal.setSystem(coding.getSystem());
     retVal.setDisplay(coding.getDisplay());
     retVal.setVersion(coding.getVersion());
-    if (coding.getUserSelected() != null) {
-        retVal.setUserSelected(coding.getUserSelected());
-    }
+    retVal.setUserSelected(!coding.getUserSelectedElement().isEmpty() && coding.getUserSelected());
     return retVal;
 }
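
The rewritten line checks element emptiness before touching the value, so an absent userSelected maps to primitive false instead of risking a null unboxing. The short-circuit idea in isolation (a model-free sketch, not code from the commit):

    public class UserSelectedNullSafetyDemo {
        public static void main(String[] args) {
            Boolean boxedUserSelected = null; // stands in for an element that was never populated
            boolean isEmpty = boxedUserSelected == null;
            // Emptiness is checked first, so unboxing never happens for absent values:
            boolean userSelected = !isEmpty && boxedUserSelected;
            System.out.println(userSelected); // false, no NullPointerException
        }
    }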
@@ -5,7 +5,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-fhir</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../pom.xml</relativePath>
 </parent>

@@ -212,6 +212,11 @@
   <artifactId>hapi-fhir-validation-resources-r4</artifactId>
   <version>${project.version}</version>
 </dependency>
+<dependency>
+  <groupId>ca.uhn.hapi.fhir</groupId>
+  <artifactId>hapi-fhir-validation-resources-r4b</artifactId>
+  <version>${project.version}</version>
+</dependency>
 <dependency>
   <groupId>ca.uhn.hapi.fhir</groupId>

@@ -5,7 +5,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
@@ -0,0 +1 @@
+This release bumps the org.hl7.fhir core dependency up to 5.6.97, and modifies IBaseCoding

@@ -0,0 +1,3 @@
+---
+release-date: "2023-02-24"
+codename: "Vishwa"

@@ -1,4 +1,5 @@
 ---
 type: fix
 issue: 4551
+backport: 6.4.1
 title: "The HapifhirDal search method required use of TypedBundleProvider.getallresources to avoid null pointer issue on searches that are greater than the default querycount"

@@ -0,0 +1,6 @@
+---
+type: perf
+issue: 4569
+backport: 6.4.1
+title: "A race condition in the Bulk Export module sometimes resulted in bulk export jobs producing completion
+  reports that did not contain all generated output files. This has been corrected."

@@ -0,0 +1,7 @@
+---
+type: perf
+issue: 4569
+backport: 6.4.1
+title: "An inefficient query in the JPA Bulk Export module was optimized. This query caused exports for resources
+  containing tags/security labels/profiles to perform a number of redundant database lookups, so this type of
+  export should be much faster now."

@@ -0,0 +1,5 @@
+---
+type: fix
+issue: 4582
+backport: 6.4.1
+title: "Update IBaseCoding, Tag, and tinder Child to support new userSelected and version fields for resource tags."

@@ -0,0 +1,6 @@
+---
+type: fix
+issue: 4582
+jira: SMILE-4688
+backport: 6.4.1
+title: "Upgrade dependency on core to 5.6.97 including hapi-fhir code enhancements and unit test fixes."
@@ -11,7 +11,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -4,7 +4,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -5,7 +5,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
 <modelVersion>4.0.0</modelVersion>
@@ -5,7 +5,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -132,6 +132,11 @@
   <artifactId>hapi-fhir-validation-resources-r4</artifactId>
   <version>${project.version}</version>
 </dependency>
+<dependency>
+  <groupId>ca.uhn.hapi.fhir</groupId>
+  <artifactId>hapi-fhir-validation-resources-r4b</artifactId>
+  <version>${project.version}</version>
+</dependency>
 <dependency>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-fhir-validation-resources-r5</artifactId>
@@ -47,6 +47,7 @@ import org.springframework.transaction.PlatformTransactionManager;
 import org.springframework.transaction.TransactionDefinition;
 import org.springframework.transaction.annotation.Propagation;
 import org.springframework.transaction.annotation.Transactional;
+import org.springframework.transaction.support.TransactionSynchronizationManager;
 import org.springframework.transaction.support.TransactionTemplate;
 
 import javax.annotation.Nonnull;

@@ -61,6 +62,7 @@ import java.util.function.Consumer;
 import java.util.stream.Collectors;
 import java.util.stream.Stream;
 
+import static ca.uhn.fhir.jpa.entity.Batch2WorkChunkEntity.ERROR_MSG_MAX_LENGTH;
 import static org.apache.commons.lang3.StringUtils.isBlank;
 
 public class JpaJobPersistenceImpl implements IJobPersistence {

@@ -235,7 +237,8 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
     @Override
     @Transactional(propagation = Propagation.REQUIRES_NEW)
     public void markWorkChunkAsErroredAndIncrementErrorCount(String theChunkId, String theErrorMessage) {
-        myWorkChunkRepository.updateChunkStatusAndIncrementErrorCountForEndError(theChunkId, new Date(), theErrorMessage, StatusEnum.ERRORED);
+        String errorMessage = truncateErrorMessage(theErrorMessage);
+        myWorkChunkRepository.updateChunkStatusAndIncrementErrorCountForEndError(theChunkId, new Date(), errorMessage, StatusEnum.ERRORED);
     }
 
     @Override
@@ -251,28 +254,39 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
     @Transactional(propagation = Propagation.REQUIRES_NEW)
     public void markWorkChunkAsFailed(String theChunkId, String theErrorMessage) {
         ourLog.info("Marking chunk {} as failed with message: {}", theChunkId, theErrorMessage);
-        String errorMessage;
-        if (theErrorMessage.length() > Batch2WorkChunkEntity.ERROR_MSG_MAX_LENGTH) {
-            ourLog.warn("Truncating error message that is too long to store in database: {}", theErrorMessage);
-            errorMessage = theErrorMessage.substring(0, Batch2WorkChunkEntity.ERROR_MSG_MAX_LENGTH);
-        } else {
-            errorMessage = theErrorMessage;
-        }
+        String errorMessage = truncateErrorMessage(theErrorMessage);
         myWorkChunkRepository.updateChunkStatusAndIncrementErrorCountForEndError(theChunkId, new Date(), errorMessage, StatusEnum.FAILED);
     }
 
-    @Override
-    @Transactional(propagation = Propagation.REQUIRES_NEW)
-    public void markWorkChunkAsCompletedAndClearData(String theChunkId, int theRecordsProcessed) {
-        myWorkChunkRepository.updateChunkStatusAndClearDataForEndSuccess(theChunkId, new Date(), theRecordsProcessed, StatusEnum.COMPLETED);
+    @Nonnull
+    private static String truncateErrorMessage(String theErrorMessage) {
+        String errorMessage;
+        if (theErrorMessage != null && theErrorMessage.length() > ERROR_MSG_MAX_LENGTH) {
+            ourLog.warn("Truncating error message that is too long to store in database: {}", theErrorMessage);
+            errorMessage = theErrorMessage.substring(0, ERROR_MSG_MAX_LENGTH);
+        } else {
+            errorMessage = theErrorMessage;
+        }
+        return errorMessage;
     }
 
     @Override
     @Transactional(propagation = Propagation.REQUIRES_NEW)
-    public void markWorkChunksWithStatusAndWipeData(String theInstanceId, List<String> theChunkIds, StatusEnum theStatus, String theErrorMsg) {
+    public void markWorkChunkAsCompletedAndClearData(String theInstanceId, String theChunkId, int theRecordsProcessed) {
+        StatusEnum newStatus = StatusEnum.COMPLETED;
+        ourLog.debug("Marking chunk {} for instance {} to status {}", theChunkId, theInstanceId, newStatus);
+        myWorkChunkRepository.updateChunkStatusAndClearDataForEndSuccess(theChunkId, new Date(), theRecordsProcessed, newStatus);
+    }
+
+    @Override
+    public void markWorkChunksWithStatusAndWipeData(String theInstanceId, List<String> theChunkIds, StatusEnum theStatus, String theErrorMessage) {
+        assert TransactionSynchronizationManager.isActualTransactionActive();
+
+        ourLog.debug("Marking all chunks for instance {} to status {}", theInstanceId, theStatus);
+        String errorMessage = truncateErrorMessage(theErrorMessage);
         List<List<String>> listOfListOfIds = ListUtils.partition(theChunkIds, 100);
         for (List<String> idList : listOfListOfIds) {
-            myWorkChunkRepository.updateAllChunksForInstanceStatusClearDataAndSetError(idList, new Date(), theStatus, theErrorMsg);
+            myWorkChunkRepository.updateAllChunksForInstanceStatusClearDataAndSetError(idList, new Date(), theStatus, errorMessage);
         }
     }
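
All of the work-chunk error paths now funnel through one null-safe helper before the message reaches the ERROR_MSG column. The invariant as a standalone sketch (the 500-character limit is an assumed stand-in for Batch2WorkChunkEntity.ERROR_MSG_MAX_LENGTH, not the real constant):

    public class TruncateErrorMessageDemo {
        private static final int ERROR_MSG_MAX_LENGTH = 500; // assumed width, for illustration

        static String truncateErrorMessage(String theErrorMessage) {
            if (theErrorMessage != null && theErrorMessage.length() > ERROR_MSG_MAX_LENGTH) {
                return theErrorMessage.substring(0, ERROR_MSG_MAX_LENGTH);
            }
            return theErrorMessage;
        }

        public static void main(String[] args) {
            System.out.println(truncateErrorMessage("x".repeat(600)).length()); // 500: fits the column
            System.out.println(truncateErrorMessage(null)); // null passes through safely
        }
    }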
@@ -285,6 +299,13 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
     @Override
     @Transactional(propagation = Propagation.REQUIRES_NEW)
     public boolean canAdvanceInstanceToNextStep(String theInstanceId, String theCurrentStepId) {
+        Optional<Batch2JobInstanceEntity> instance = myJobInstanceRepository.findById(theInstanceId);
+        if (!instance.isPresent()) {
+            return false;
+        }
+        if (instance.get().getStatus().isEnded()) {
+            return false;
+        }
         List<StatusEnum> statusesForStep = myWorkChunkRepository.getDistinctStatusesForStep(theInstanceId, theCurrentStepId);
         ourLog.debug("Checking whether gated job can advanced to next step. [instanceId={}, currentStepId={}, statusesForStep={}]", theInstanceId, theCurrentStepId, statusesForStep);
         return statusesForStep.stream().noneMatch(StatusEnum::isIncomplete) && statusesForStep.stream().anyMatch(status -> status == StatusEnum.COMPLETED);
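
The guard added above closes a gap: a finished or vanished instance can no longer advance just because its chunk statuses look complete. The rule reduces to a small sketch (the Status enum below is a simplified stand-in for batch2's StatusEnum, not the real type):

    import java.util.List;

    public class GatedStepAdvanceSketch {
        enum Status {
            QUEUED, IN_PROGRESS, ERRORED, COMPLETED, FAILED;
            boolean isIncomplete() { return this == QUEUED || this == IN_PROGRESS || this == ERRORED; }
        }

        static boolean canAdvance(boolean instancePresent, boolean instanceEnded, List<Status> chunkStatuses) {
            if (!instancePresent || instanceEnded) {
                return false; // the early-outs added in this commit
            }
            return chunkStatuses.stream().noneMatch(Status::isIncomplete)
                && chunkStatuses.stream().anyMatch(s -> s == Status.COMPLETED);
        }

        public static void main(String[] args) {
            System.out.println(canAdvance(true, false, List.of(Status.COMPLETED, Status.COMPLETED))); // true
            System.out.println(canAdvance(true, true, List.of(Status.COMPLETED)));                    // false
            System.out.println(canAdvance(true, false, List.of(Status.COMPLETED, Status.IN_PROGRESS))); // false
        }
    }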
@@ -314,6 +335,11 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
         return myTxTemplate.execute(tx -> myWorkChunkRepository.fetchAllChunkIdsForStepWithStatus(theInstanceId, theStepId, theStatusEnum));
     }
 
+    @Override
+    public void updateInstanceUpdateTime(String theInstanceId) {
+        myJobInstanceRepository.updateInstanceUpdateTime(theInstanceId, new Date());
+    }
+
     private void fetchChunksForStep(String theInstanceId, String theStepId, int thePageSize, int thePageIndex, Consumer<WorkChunk> theConsumer) {
         myTxTemplate.executeWithoutResult(tx -> {
             List<Batch2WorkChunkEntity> chunks = myWorkChunkRepository.fetchChunksForStep(PageRequest.of(thePageIndex, thePageSize), theInstanceId, theStepId);

@@ -380,6 +406,7 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
         instanceEntity.setReport(theInstance.getReport());
 
         myJobInstanceRepository.save(instanceEntity);
+
         return recordsChangedByStatusUpdate > 0;
     }

@@ -393,8 +420,9 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
 
     @Override
     @Transactional(propagation = Propagation.REQUIRES_NEW)
-    public void deleteChunks(String theInstanceId) {
+    public void deleteChunksAndMarkInstanceAsChunksPurged(String theInstanceId) {
         ourLog.info("Deleting all chunks for instance ID: {}", theInstanceId);
+        myJobInstanceRepository.updateWorkChunksPurgedTrue(theInstanceId);
         myWorkChunkRepository.deleteAllForInstance(theInstanceId);
     }
@@ -35,6 +35,7 @@ import ca.uhn.fhir.jpa.model.sched.ScheduledJobDefinition;
 import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
 import ca.uhn.fhir.util.Batch2JobDefinitionConstants;
 import ca.uhn.fhir.util.JsonUtil;
+import com.google.common.annotations.VisibleForTesting;
 import org.apache.commons.lang3.StringUtils;
 import org.apache.commons.lang3.time.DateUtils;
 import org.hl7.fhir.instance.model.api.IBaseBinary;

@@ -54,6 +55,7 @@ import javax.annotation.PostConstruct;
 import java.time.LocalDateTime;
 import java.time.ZoneId;
 import java.util.Date;
+import java.util.Iterator;
 import java.util.List;
 import java.util.Map;
 import java.util.Optional;
@@ -24,6 +24,7 @@ import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.jpa.packages.loader.PackageLoaderSvc;
 import ca.uhn.fhir.jpa.packages.loader.PackageResourceParsingSvc;
 import org.hl7.fhir.utilities.npm.PackageClient;
+import org.hl7.fhir.utilities.npm.PackageServer;
 import org.springframework.context.annotation.Bean;
 import org.springframework.context.annotation.Configuration;

@@ -34,8 +35,8 @@ public class PackageLoaderConfig {
     public PackageLoaderSvc packageLoaderSvc() {
         PackageLoaderSvc svc = new PackageLoaderSvc();
         svc.getPackageServers().clear();
-        svc.getPackageServers().add(PackageClient.PRIMARY_SERVER);
-        svc.getPackageServers().add(PackageClient.SECONDARY_SERVER);
+        svc.getPackageServers().add(PackageServer.primaryServer());
+        svc.getPackageServers().add(PackageServer.secondaryServer());
         return svc;
     }
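
Package servers are now typed PackageServer objects rather than bare URL strings, matching the core 5.6.97 API. A hedged usage sketch (only primaryServer()/secondaryServer() appear in the diff; the string constructor and getUrl() accessor are assumptions about the org.hl7.fhir.utilities.npm API, and the mirror URL is made up):

    import org.hl7.fhir.utilities.npm.PackageServer;

    public class PackageServerDemo {
        public static void main(String[] args) {
            PackageServer primary = PackageServer.primaryServer(); // the core-defined default registry
            PackageServer mirror = new PackageServer("https://packages.example.org"); // hypothetical mirror
            System.out.println(primary.getUrl());
            System.out.println(mirror.getUrl());
        }
    }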
@@ -159,7 +159,7 @@ public class JpaResourceDaoCodeSystem<T extends IBaseResource> extends BaseHapiF
     String codeSystemUrl;
     if (theCodeSystemId != null) {
         IBaseResource codeSystem = read(theCodeSystemId, theRequestDetails);
-        codeSystemUrl = CommonCodeSystemsTerminologyService.getCodeSystemUrl(codeSystem);
+        codeSystemUrl = CommonCodeSystemsTerminologyService.getCodeSystemUrl(myFhirContext, codeSystem);
     } else if (isNotBlank(toStringValue(theCodeSystemUrl))) {
         codeSystemUrl = toStringValue(theCodeSystemUrl);
     } else {

@@ -176,8 +176,8 @@ public class JpaResourceDaoValueSet<T extends IBaseResource> extends BaseHapiFhi
     String valueSetIdentifier;
     if (theValueSetId != null) {
         IBaseResource valueSet = read(theValueSetId, theRequestDetails);
-        StringBuilder valueSetIdentifierBuilder = new StringBuilder(CommonCodeSystemsTerminologyService.getValueSetUrl(valueSet));
-        String valueSetVersion = CommonCodeSystemsTerminologyService.getValueSetVersion(valueSet);
+        StringBuilder valueSetIdentifierBuilder = new StringBuilder(CommonCodeSystemsTerminologyService.getValueSetUrl(myFhirContext, valueSet));
+        String valueSetVersion = CommonCodeSystemsTerminologyService.getValueSetVersion(myFhirContext, valueSet);
         if (valueSetVersion != null) {
             valueSetIdentifierBuilder.append("|").append(valueSetVersion);
         }
@@ -38,13 +38,17 @@ public interface IBatch2JobInstanceRepository extends JpaRepository<Batch2JobIns
     @Query("UPDATE Batch2JobInstanceEntity e SET e.myStatus = :status WHERE e.myId = :id and e.myStatus <> :status")
     int updateInstanceStatus(@Param("id") String theInstanceId, @Param("status") StatusEnum theStatus);
 
+    @Modifying
+    @Query("UPDATE Batch2JobInstanceEntity e SET e.myUpdateTime = :updated WHERE e.myId = :id")
+    int updateInstanceUpdateTime(@Param("id") String theInstanceId, @Param("updated") Date theUpdated);
+
     @Modifying
     @Query("UPDATE Batch2JobInstanceEntity e SET e.myCancelled = :cancelled WHERE e.myId = :id")
     int updateInstanceCancelled(@Param("id") String theInstanceId, @Param("cancelled") boolean theCancelled);
 
     @Modifying
-    @Query("UPDATE Batch2JobInstanceEntity e SET e.myCurrentGatedStepId = :currentGatedStepId WHERE e.myId = :id")
-    void updateInstanceCurrentGatedStepId(@Param("id") String theInstanceId, @Param("currentGatedStepId") String theCurrentGatedStepId);
+    @Query("UPDATE Batch2JobInstanceEntity e SET e.myWorkChunksPurged = true WHERE e.myId = :id")
+    int updateWorkChunksPurgedTrue(@Param("id") String theInstanceId);
 
     @Query("SELECT b from Batch2JobInstanceEntity b WHERE b.myDefinitionId = :defId AND b.myParamsJson = :params AND b.myStatus IN( :stats )")
     List<Batch2JobInstanceEntity> findInstancesByJobIdParamsAndStatus(
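
A minimal sketch of how the new timestamp query is driven (in the commit the call lives in JpaJobPersistenceImpl.updateInstanceUpdateTime; the wrapper class and the repository's package are assumptions for illustration). Note that Spring Data @Modifying JPQL updates must run inside a transaction:

    import java.util.Date;
    import ca.uhn.fhir.jpa.dao.data.IBatch2JobInstanceRepository; // assumed package
    import org.springframework.transaction.annotation.Transactional;

    public class JobInstanceTouchSketch {
        private final IBatch2JobInstanceRepository myJobInstanceRepository;

        public JobInstanceTouchSketch(IBatch2JobInstanceRepository theRepository) {
            myJobInstanceRepository = theRepository;
        }

        @Transactional
        public void updateInstanceUpdateTime(String theInstanceId) {
            // The query returns the number of rows touched; 0 means no such instance id.
            int updated = myJobInstanceRepository.updateInstanceUpdateTime(theInstanceId, new Date());
            if (updated == 0) {
                throw new IllegalStateException("Unknown job instance: " + theInstanceId); // illustrative handling
            }
        }
    }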
@@ -144,8 +144,6 @@ public class HapiFhirJpaMigrationTasks extends BaseMigrationTasks<VersionEnum> {
         .online(true)
         .withColumns("SEARCH_PID")
         .onlyAppliesToPlatforms(NON_AUTOMATIC_FK_INDEX_PLATFORMS);
-        ;
-
     }
 
     private void init620() {
@@ -60,6 +60,7 @@ import org.hl7.fhir.instance.model.api.IIdType;
 import org.hl7.fhir.instance.model.api.IPrimitiveType;
 import org.hl7.fhir.utilities.npm.BasePackageCacheManager;
 import org.hl7.fhir.utilities.npm.NpmPackage;
+import org.hl7.fhir.utilities.npm.PackageServer;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;

@@ -129,9 +130,9 @@ public class JpaPackageCache extends BasePackageCacheManager implements IHapiPac
     private IBinaryStorageSvc myBinaryStorageSvc;
 
     @Override
-    public void addPackageServer(@Nonnull String theUrl) {
+    public void addPackageServer(@Nonnull PackageServer thePackageServer) {
         assert myPackageLoaderSvc != null;
-        myPackageLoaderSvc.addPackageServer(theUrl);
+        myPackageLoaderSvc.addPackageServer(thePackageServer);
     }
 
     @Override

@@ -150,7 +151,7 @@ public class JpaPackageCache extends BasePackageCacheManager implements IHapiPac
     }
 
     @Override
-    public List<String> getPackageServers() {
+    public List<PackageServer> getPackageServers() {
         return myPackageLoaderSvc.getPackageServers();
     }
 
@@ -116,7 +116,7 @@ class JpaJobPersistenceImplTest {
     String jobId = "jobid";
 
     // test
-    mySvc.deleteChunks(jobId);
+    mySvc.deleteChunksAndMarkInstanceAsChunksPurged(jobId);
 
     // verify
     verify(myWorkChunkRepository)
@@ -6,7 +6,7 @@
 <parent>
   <groupId>ca.uhn.hapi.fhir</groupId>
   <artifactId>hapi-deployable-pom</artifactId>
-  <version>6.5.3-SNAPSHOT</version>
+  <version>6.5.4-SNAPSHOT</version>
   <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>

@@ -16,6 +16,8 @@ import org.hl7.fhir.r4.model.Bundle;
 import org.hl7.fhir.r4.model.IdType;
 import org.hl7.fhir.r4.model.Meta;
 import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Disabled;
+import org.junit.jupiter.api.RepeatedTest;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.extension.ExtendWith;
 import org.springframework.beans.factory.annotation.Autowired;
@ -3,7 +3,7 @@
|
||||||
<parent>
|
<parent>
|
||||||
<groupId>ca.uhn.hapi.fhir</groupId>
|
<groupId>ca.uhn.hapi.fhir</groupId>
|
||||||
<artifactId>hapi-deployable-pom</artifactId>
|
<artifactId>hapi-deployable-pom</artifactId>
|
||||||
<version>6.5.3-SNAPSHOT</version>
|
<version>6.5.4-SNAPSHOT</version>
|
||||||
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
|
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
|
||||||
</parent>
|
</parent>
|
||||||
|
|
||||||
|
|
|
@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.5.3-SNAPSHOT</version>
+<version>6.5.4-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
 
@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.5.3-SNAPSHOT</version>
+<version>6.5.4-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
 
@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.5.3-SNAPSHOT</version>
+<version>6.5.4-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
 
@@ -92,6 +92,11 @@
 <artifactId>hapi-fhir-validation-resources-r4</artifactId>
 <version>${project.version}</version>
 </dependency>
+<dependency>
+<groupId>ca.uhn.hapi.fhir</groupId>
+<artifactId>hapi-fhir-validation-resources-r4b</artifactId>
+<version>${project.version}</version>
+</dependency>
 <dependency>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir-validation-resources-r5</artifactId>
 
@@ -87,6 +87,7 @@ import static ca.uhn.fhir.rest.api.RestSearchParameterTypeEnum.DATE;
 import static ca.uhn.fhir.rest.api.RestSearchParameterTypeEnum.REFERENCE;
 import static org.apache.commons.lang3.StringUtils.isBlank;
 import static org.apache.commons.lang3.StringUtils.isNotBlank;
+import static org.apache.commons.lang3.StringUtils.startsWith;
 import static org.apache.commons.lang3.StringUtils.trim;
 
 public abstract class BaseSearchParamExtractor implements ISearchParamExtractor {
@@ -829,6 +830,7 @@ public abstract class BaseSearchParamExtractor implements ISearchParamExtractor
 try {
 allValues = allValuesFunc.get();
 } catch (Exception e) {
+e.printStackTrace();
 String msg = getContext().getLocalizer().getMessage(BaseSearchParamExtractor.class, "failedToExtractPaths", nextPath, e.toString());
 throw new InternalErrorException(Msg.code(504) + msg, e);
 }
@@ -1294,7 +1296,30 @@ public abstract class BaseSearchParamExtractor implements ISearchParamExtractor
 
 }
 
-private <T> SearchParamSet<T> extractSearchParams(IBaseResource theResource, IExtractor<T> theExtractor, RestSearchParameterTypeEnum theSearchParamType, boolean theWantLocalReferences) {
+/**
+* Ignore any of the Resource-level search params. This is kind of awkward, but here is why
+* we do it:
+* <p>
+* The ReadOnlySearchParamCache supplies these params, and they have paths associated with
+* them. E.g. HAPI's SearchParamRegistryImpl will know about the _id search parameter and
+* assigns it the path "Resource.id". All of these parameters have indexing code paths in the
+* server that don't rely on the existence of the SearchParameter. For example, we have a
+* dedicated column on ResourceTable that handles the _id parameter.
+* <p>
+* Until 6.2.0 the FhirPath evaluator didn't actually resolve any values for these paths
+* that started with Resource instead of the actual resource name, so it never actually
+* made a difference that these parameters existed because they'd never actually result
+* in any index rows. In 6.4.0 that bug was fixed in the core FhirPath engine. We don't
+* want that fix to result in pointless index rows for things like _id and _tag, so we
+* ignore them here.
+* <p>
+* Note that you can still create a search parameter that includes a path like
+* "meta.tag" if you really need to create an SP that actually does index _tag. This
+* is needed if you want to search for tags in <code>INLINE</code> tag storage mode.
+* This is the only way you could actually specify a FhirPath expression for those
+* prior to 6.2.0 so this isn't a breaking change.
+*/
+<T> SearchParamSet<T> extractSearchParams(IBaseResource theResource, IExtractor<T> theExtractor, RestSearchParameterTypeEnum theSearchParamType, boolean theWantLocalReferences) {
 SearchParamSet<T> retVal = new SearchParamSet<>();
 
 Collection<RuntimeSearchParam> searchParams = getSearchParams(theResource);
@@ -1306,6 +1331,11 @@ public abstract class BaseSearchParamExtractor implements ISearchParamExtractor
 continue;
 }
 
+// See the method javadoc for an explanation of this
+if (startsWith(nextSpDef.getPath(), "Resource.")) {
+continue;
+}
+
 extractSearchParam(nextSpDef, theResource, theExtractor, retVal, theWantLocalReferences);
 }
 return retVal;
 
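The javadoc above leaves room for explicitly indexing tag/security metadata under INLINE tag storage by defining a SearchParameter with a concrete path. A hedged sketch of such a resource; the id, url, and base here are illustrative, not taken from the codebase (the BulkDataExportTest hunk further down relies on a similar helper, createSearchParameterForInlineSecurity, for _security):

    import org.hl7.fhir.r4.model.Enumerations;
    import org.hl7.fhir.r4.model.SearchParameter;

    // Illustrative only: an explicit SP so _security-style searches keep working
    // when tags/security labels are stored inline on the resource body
    static SearchParameter buildInlineSecuritySp() {
        SearchParameter sp = new SearchParameter();
        sp.setId("SearchParameter/patient-security");                       // hypothetical id
        sp.setUrl("http://example.org/SearchParameter/patient-security");   // hypothetical url
        sp.setCode("_security");
        sp.addBase("Patient");
        sp.setType(Enumerations.SearchParamType.TOKEN);
        sp.setStatus(Enumerations.PublicationStatus.ACTIVE);
        sp.setExpression("Patient.meta.security"); // concrete path, not "Resource.meta.security"
        return sp;
    }
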
@@ -32,6 +32,7 @@ import org.hl7.fhir.exceptions.PathEngineException;
 import org.hl7.fhir.instance.model.api.IBase;
 import org.hl7.fhir.r4.context.IWorkerContext;
 import org.hl7.fhir.r4.hapi.ctx.HapiWorkerContext;
+import org.hl7.fhir.r4.utils.FHIRPathEngine;
 import org.hl7.fhir.r4.model.Base;
 import org.hl7.fhir.r4.model.ExpressionNode;
 import org.hl7.fhir.r4.model.IdType;
@@ -133,7 +134,7 @@ public class SearchParamExtractorR4 extends BaseSearchParamExtractor implements
 
 
 @Override
-public Base resolveReference(Object theAppContext, String theUrl) throws FHIRException {
+public Base resolveReference(Object theAppContext, String theUrl, Base theRefContext) throws FHIRException {
 
 /*
 * When we're doing resolution within the SearchParamExtractor, if we want
 
@@ -95,6 +95,8 @@ public class ReadOnlySearchParamCache {
 IBaseBundle allSearchParameterBundle = null;
 if (theFhirContext.getVersion().getVersion() == FhirVersionEnum.R4) {
 allSearchParameterBundle = (IBaseBundle) theFhirContext.newJsonParser().parseResource(ClasspathUtil.loadResourceAsStream("org/hl7/fhir/r4/model/sp/search-parameters.json"));
+} else if (theFhirContext.getVersion().getVersion() == FhirVersionEnum.R4B) {
+allSearchParameterBundle = (IBaseBundle) theFhirContext.newXmlParser().parseResource(ClasspathUtil.loadResourceAsStream("org/hl7/fhir/r4b/model/sp/search-parameters.xml"));
 } else if (theFhirContext.getVersion().getVersion() == FhirVersionEnum.R5) {
 allSearchParameterBundle = (IBaseBundle) theFhirContext.newXmlParser().parseResource(ClasspathUtil.loadResourceAsStream("org/hl7/fhir/r5/model/sp/search-parameters.xml"));
 }
 
@@ -79,6 +79,7 @@ public class SearchParameterCanonicalizer {
 retVal = canonicalizeSearchParameterDstu3((org.hl7.fhir.dstu3.model.SearchParameter) theSearchParameter);
 break;
 case R4:
+case R4B:
 case R5:
 retVal = canonicalizeSearchParameterR4Plus(theSearchParameter);
 break;
 
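Across the last few hunks the pattern is the same: R4B is slotted in beside R4 and R5 wherever the code branches on FhirVersionEnum. A sketch of how a caller exercises the new branch, assuming the standard FhirContext factory method for R4B is available in this HAPI version:

    import ca.uhn.fhir.context.FhirContext;
    import ca.uhn.fhir.context.FhirVersionEnum;

    class R4bDispatchSketch {
        public static void main(String[] args) {
            // Assumption: FhirContext.forR4B() exists in this HAPI FHIR release
            FhirContext ctx = FhirContext.forR4B();
            // This is the check the cache and canonicalizer hunks above branch on
            System.out.println(ctx.getVersion().getVersion() == FhirVersionEnum.R4B); // true
        }
    }
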
@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.5.3-SNAPSHOT</version>
+<version>6.5.4-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
 
@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.5.3-SNAPSHOT</version>
+<version>6.5.4-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
 
@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.5.3-SNAPSHOT</version>
+<version>6.5.4-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
 
@@ -15,6 +15,9 @@ import org.eclipse.jetty.servlet.ServletHandler;
 import org.eclipse.jetty.servlet.ServletHolder;
 import org.hl7.fhir.instance.model.api.IBaseBinary;
 import org.hl7.fhir.instance.model.api.IBaseResource;
+import org.hl7.fhir.r4.model.Patient;
+import org.hl7.fhir.utilities.npm.IPackageCacheManager;
+import org.hl7.fhir.utilities.npm.PackageServer;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
@@ -66,7 +69,7 @@ public class IgInstallerDstu3Test extends BaseJpaDstu3Test {
 
 myPort = JettyUtil.getPortForStartedServer(myServer);
 jpaPackageCache.getPackageServers().clear();
-jpaPackageCache.addPackageServer("http://localhost:" + myPort);
+jpaPackageCache.addPackageServer(new PackageServer("http://localhost:" + myPort));
 
 myFakeNpmServlet.getResponses().clear();
 }
 
@@ -15,6 +15,7 @@ import org.hl7.fhir.dstu3.model.Condition;
 import org.hl7.fhir.dstu3.model.OperationOutcome;
 import org.hl7.fhir.dstu3.model.StructureDefinition;
 import org.hl7.fhir.dstu3.model.ValueSet;
+import org.hl7.fhir.utilities.npm.PackageServer;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
@@ -61,7 +62,7 @@ public class NpmDstu3Test extends BaseJpaDstu3Test {
 
 int port = JettyUtil.getPortForStartedServer(myServer);
 jpaPackageCache.getPackageServers().clear();
-jpaPackageCache.addPackageServer("http://localhost:" + port);
+jpaPackageCache.addPackageServer(new PackageServer("http://localhost:" + port));
 
 myResponses.clear();
 }
 
@@ -18,6 +18,7 @@
 <logger name="org.eclipse" level="error"/>
 <logger name="ca.uhn.fhir.rest.client" level="info"/>
 <logger name="ca.uhn.fhir.jpa.dao" level="info"/>
+<logger name="ca.uhn.fhir.rest.client.interceptor.LoggingInterceptor" level="debug"/>
 
 <!-- set to debug to enable term expansion logs -->
 
@@ -6,7 +6,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.5.3-SNAPSHOT</version>
+<version>6.5.4-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
 
@@ -33,6 +33,7 @@ import ca.uhn.test.concurrency.PointcutLatch;
 import com.fasterxml.jackson.annotation.JsonProperty;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Disabled;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.params.ParameterizedTest;
 import org.junit.jupiter.params.provider.ValueSource;
 
@@ -41,6 +41,7 @@ import static org.hamcrest.Matchers.empty;
 import static org.hamcrest.Matchers.hasSize;
 import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.junit.jupiter.api.Assertions.assertFalse;
+import static org.junit.jupiter.api.Assertions.assertNotEquals;
 import static org.junit.jupiter.api.Assertions.assertNotNull;
 import static org.junit.jupiter.api.Assertions.assertNull;
 import static org.junit.jupiter.api.Assertions.assertTrue;
@@ -55,6 +56,7 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 public static final int JOB_DEF_VER = 1;
 public static final int SEQUENCE_NUMBER = 1;
 public static final String CHUNK_DATA = "{\"key\":\"value\"}";
+public static final String INSTANCE_ID = "instance-id";
 
 @Autowired
 private IJobPersistence mySvc;
@@ -102,7 +104,7 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 
 // Execute
 
-mySvc.deleteChunks(instanceId);
+mySvc.deleteChunksAndMarkInstanceAsChunksPurged(instanceId);
 
 // Verify
 
@@ -216,19 +218,6 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 .collect(Collectors.toUnmodifiableSet()));
 }
 
-/**
-* Returns a set of statuses, and whether they should be successfully picked up and started by a consumer.
-* @return
-*/
-public static List<Arguments> provideStatuses() {
-return List.of(
-Arguments.of(StatusEnum.QUEUED, true),
-Arguments.of(StatusEnum.IN_PROGRESS, true),
-Arguments.of(StatusEnum.ERRORED, true),
-Arguments.of(StatusEnum.FAILED, false),
-Arguments.of(StatusEnum.COMPLETED, false)
-);
-}
 @ParameterizedTest
 @MethodSource("provideStatuses")
 public void testStartChunkOnlyWorksOnValidChunks(StatusEnum theStatus, boolean theShouldBeStartedByConsumer) {
@@ -236,7 +225,7 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 JobInstance instance = createInstance();
 String instanceId = mySvc.storeNewInstance(instance);
 storeWorkChunk(JOB_DEFINITION_ID, TARGET_STEP_ID, instanceId, 0, CHUNK_DATA);
-BatchWorkChunk batchWorkChunk = new BatchWorkChunk(JOB_DEFINITION_ID, JOB_DEF_VER, TARGET_STEP_ID, instanceId,0, CHUNK_DATA);
+BatchWorkChunk batchWorkChunk = new BatchWorkChunk(JOB_DEFINITION_ID, JOB_DEF_VER, TARGET_STEP_ID, instanceId, 0, CHUNK_DATA);
 String chunkId = mySvc.storeWorkChunk(batchWorkChunk);
 Optional<Batch2WorkChunkEntity> byId = myWorkChunkRepository.findById(chunkId);
 Batch2WorkChunkEntity entity = byId.get();
@@ -335,6 +324,24 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 empty());
 }
 
+@Test
+public void testUpdateTime() {
+// Setup
+JobInstance instance = createInstance();
+String instanceId = mySvc.storeNewInstance(instance);
+
+Date updateTime = runInTransaction(() -> new Date(myJobInstanceRepository.findById(instanceId).orElseThrow().getUpdateTime().getTime()));
+
+sleepUntilTimeChanges();
+
+// Test
+runInTransaction(() -> mySvc.updateInstanceUpdateTime(instanceId));
+
+// Verify
+Date updateTime2 = runInTransaction(() -> new Date(myJobInstanceRepository.findById(instanceId).orElseThrow().getUpdateTime().getTime()));
+assertNotEquals(updateTime, updateTime2);
+}
+
 @Test
 public void testFetchUnknownWork() {
 assertFalse(myWorkChunkRepository.findById("FOO").isPresent());
@@ -393,7 +400,7 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 
 sleepUntilTimeChanges();
 
-mySvc.markWorkChunkAsCompletedAndClearData(chunkId, 50);
+mySvc.markWorkChunkAsCompletedAndClearData(INSTANCE_ID, chunkId, 50);
 runInTransaction(() -> {
 Batch2WorkChunkEntity entity = myWorkChunkRepository.findById(chunkId).orElseThrow(IllegalArgumentException::new);
 assertEquals(StatusEnum.COMPLETED, entity.getStatus());
@@ -427,13 +434,14 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 assertEquals(1, chunks.size());
 assertEquals(5, chunks.get(0).getErrorCount());
 }
 
 @Test
 public void testGatedAdvancementByStatus() {
 // Setup
 JobInstance instance = createInstance();
 String instanceId = mySvc.storeNewInstance(instance);
 String chunkId = storeWorkChunk(DEF_CHUNK_ID, STEP_CHUNK_ID, instanceId, SEQUENCE_NUMBER, null);
-mySvc.markWorkChunkAsCompletedAndClearData(chunkId, 0);
+mySvc.markWorkChunkAsCompletedAndClearData(INSTANCE_ID, chunkId, 0);
 
 boolean canAdvance = mySvc.canAdvanceInstanceToNextStep(instanceId, STEP_CHUNK_ID);
 assertTrue(canAdvance);
@@ -445,18 +453,18 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 assertFalse(canAdvance);
 
 //Toggle it to complete
-mySvc.markWorkChunkAsCompletedAndClearData(newChunkId, 0);
+mySvc.markWorkChunkAsCompletedAndClearData(INSTANCE_ID, newChunkId, 0);
 canAdvance = mySvc.canAdvanceInstanceToNextStep(instanceId, STEP_CHUNK_ID);
 assertTrue(canAdvance);
 
 //Create a new chunk and set it in progress.
-String newerChunkId= storeWorkChunk(DEF_CHUNK_ID, STEP_CHUNK_ID, instanceId, SEQUENCE_NUMBER, null);
+String newerChunkId = storeWorkChunk(DEF_CHUNK_ID, STEP_CHUNK_ID, instanceId, SEQUENCE_NUMBER, null);
 mySvc.fetchWorkChunkSetStartTimeAndMarkInProgress(newerChunkId);
 canAdvance = mySvc.canAdvanceInstanceToNextStep(instanceId, STEP_CHUNK_ID);
 assertFalse(canAdvance);
 
 //Toggle IN_PROGRESS to complete
-mySvc.markWorkChunkAsCompletedAndClearData(newerChunkId, 0);
+mySvc.markWorkChunkAsCompletedAndClearData(INSTANCE_ID, newerChunkId, 0);
 canAdvance = mySvc.canAdvanceInstanceToNextStep(instanceId, STEP_CHUNK_ID);
 assertTrue(canAdvance);
 }
@@ -609,7 +617,7 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 chunkIds.add(id);
 }
 
-mySvc.markWorkChunksWithStatusAndWipeData(instance.getInstanceId(), chunkIds, StatusEnum.COMPLETED, null);
+runInTransaction(() -> mySvc.markWorkChunksWithStatusAndWipeData(instance.getInstanceId(), chunkIds, StatusEnum.COMPLETED, null));
 
 Iterator<WorkChunk> reducedChunks = mySvc.fetchAllWorkChunksIterator(instanceId, true);
 
@@ -631,7 +639,6 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 return instance;
 }
 
-
 @Nonnull
 private String storeJobInstanceAndUpdateWithEndTime(StatusEnum theStatus, int minutes) {
 final JobInstance jobInstance = new JobInstance();
@@ -656,4 +663,17 @@ public class JpaJobPersistenceImplTest extends BaseJpaR4Test {
 
 return id;
 }
+
+/**
+* Returns a set of statuses, and whether they should be successfully picked up and started by a consumer.
+*/
+public static List<Arguments> provideStatuses() {
+return List.of(
+Arguments.of(StatusEnum.QUEUED, true),
+Arguments.of(StatusEnum.IN_PROGRESS, true),
+Arguments.of(StatusEnum.ERRORED, true),
+Arguments.of(StatusEnum.FAILED, false),
+Arguments.of(StatusEnum.COMPLETED, false)
+);
+}
 }
 
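The new testUpdateTime above guards its before/after comparison with sleepUntilTimeChanges(), since two writes inside the same clock tick would produce equal timestamps. That helper is not shown in this diff; a minimal hypothetical sketch of what such a guard can look like (HAPI's actual implementation may differ):

    // Hypothetical sketch only. Busy-waits (with a short sleep) until the
    // millisecond clock advances, so a subsequent update is guaranteed a
    // strictly later timestamp than anything written before the call.
    private static void sleepUntilTimeChanges() {
        long start = System.currentTimeMillis();
        while (System.currentTimeMillis() == start) {
            try {
                Thread.sleep(1);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                throw new RuntimeException(e);
            }
        }
    }
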
@@ -0,0 +1,222 @@
+package ca.uhn.fhir.jpa.bulk;
+
+import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
+import ca.uhn.fhir.jpa.api.model.BulkExportJobResults;
+import ca.uhn.fhir.jpa.api.svc.IBatch2JobRunner;
+import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse;
+import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
+import ca.uhn.fhir.jpa.util.BulkExportUtils;
+import ca.uhn.fhir.rest.api.Constants;
+import ca.uhn.fhir.rest.api.server.bulk.BulkDataExportOptions;
+import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
+import ca.uhn.fhir.util.JsonUtil;
+import com.google.common.collect.Sets;
+import org.hl7.fhir.r4.model.Binary;
+import org.hl7.fhir.r4.model.Enumerations;
+import org.hl7.fhir.r4.model.Group;
+import org.hl7.fhir.r4.model.IdType;
+import org.hl7.fhir.r4.model.Patient;
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.Disabled;
+import org.junit.jupiter.api.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import org.springframework.beans.factory.annotation.Autowired;
+
+import java.io.BufferedReader;
+import java.io.StringReader;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.BlockingQueue;
+import java.util.concurrent.ExecutionException;
+import java.util.concurrent.ExecutorService;
+import java.util.concurrent.Future;
+import java.util.concurrent.LinkedBlockingQueue;
+import java.util.concurrent.ThreadPoolExecutor;
+import java.util.concurrent.TimeUnit;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.emptyOrNullString;
+import static org.hamcrest.Matchers.hasItem;
+import static org.hamcrest.Matchers.not;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertNotNull;
+import static org.junit.jupiter.api.Assertions.assertTrue;
+import static org.junit.jupiter.api.Assertions.fail;
+
+/**
+* A test to poke at our job framework and induce errors.
+*/
+public class BulkDataErrorAbuseTest extends BaseResourceProviderR4Test {
+private static final Logger ourLog = LoggerFactory.getLogger(BulkDataErrorAbuseTest.class);
+
+@Autowired
+private IBatch2JobRunner myJobRunner;
+
+@AfterEach
+void afterEach() {
+myStorageSettings.setIndexMissingFields(JpaStorageSettings.IndexEnabledEnum.DISABLED);
+}
+
+@Test
+public void testGroupBulkExportNotInGroup_DoesNotShowUp() throws InterruptedException, ExecutionException {
+duAbuseTest(100);
+}
+
+/**
+* This test is disabled because it never actually exits. Run it if you want to ensure
+* that changes to the Bulk Export Batch2 task haven't affected our ability to successfully
+* run endless parallel jobs. If you run it for a few minutes, and it never stops on its own,
+* you are good.
+* <p>
+* The enabled test above, {@link #testGroupBulkExportNotInGroup_DoesNotShowUp()}, does
+* run with the build and runs 100 jobs.
+*/
+@Test
+@Disabled
+public void testNonStopAbuseBatch2BulkExportStressTest() throws InterruptedException, ExecutionException {
+duAbuseTest(Integer.MAX_VALUE);
+}
+
+private void duAbuseTest(int taskExecutions) throws InterruptedException, ExecutionException {
+// Create some resources
+Patient patient = new Patient();
+patient.setId("PING1");
+patient.setGender(Enumerations.AdministrativeGender.FEMALE);
+patient.setActive(true);
+myClient.update().resource(patient).execute();
+
+patient = new Patient();
+patient.setId("PING2");
+patient.setGender(Enumerations.AdministrativeGender.MALE);
+patient.setActive(true);
+myClient.update().resource(patient).execute();
+
+patient = new Patient();
+patient.setId("PNING3");
+patient.setGender(Enumerations.AdministrativeGender.MALE);
+patient.setActive(true);
+myClient.update().resource(patient).execute();
+
+Group group = new Group();
+group.setId("Group/G2");
+group.setActive(true);
+group.addMember().getEntity().setReference("Patient/PING1");
+group.addMember().getEntity().setReference("Patient/PING2");
+myClient.update().resource(group).execute();
+
+// set the export options
+BulkDataExportOptions options = new BulkDataExportOptions();
+options.setResourceTypes(Sets.newHashSet("Patient"));
+options.setGroupId(new IdType("Group", "G2"));
+options.setFilters(new HashSet<>());
+options.setExportStyle(BulkDataExportOptions.ExportStyle.GROUP);
+options.setOutputFormat(Constants.CT_FHIR_NDJSON);
+
+BlockingQueue<Runnable> workQueue = new LinkedBlockingQueue<>();
+ExecutorService executorService = new ThreadPoolExecutor(10, 10,
+0L, TimeUnit.MILLISECONDS,
+workQueue);
+
+ourLog.info("Starting task creation");
+
+List<Future<Boolean>> futures = new ArrayList<>();
+for (int i = 0; i < taskExecutions; i++) {
+futures.add(executorService.submit(() -> {
+String instanceId = null;
+try {
+instanceId = startJob(options);
+
+// Run a scheduled pass to build the export
+myBatch2JobHelper.awaitJobCompletion(instanceId, 60);
+
+verifyBulkExportResults(instanceId, List.of("Patient/PING1", "Patient/PING2"), Collections.singletonList("Patient/PNING3"));
+
+return true;
+} catch (Throwable theError) {
+ourLog.error("Caught an error during processing instance {}", instanceId, theError);
+throw new InternalErrorException("Caught an error during processing instance " + instanceId, theError);
+}
+}));
+
+// Don't let the list of futures grow so big we run out of memory
+if (futures.size() > 200) {
+while (futures.size() > 100) {
+// This should always return true, but it'll throw an exception if we failed
+assertTrue(futures.remove(0).get());
+}
+}
+}
+
+ourLog.info("Done creating tasks, waiting for task completion");
+
+for (var next : futures) {
+// This should always return true, but it'll throw an exception if we failed
+assertTrue(next.get());
+}
+
+ourLog.info("Finished task execution");
+}
+
+
+private void verifyBulkExportResults(String theInstanceId, List<String> theContainedList, List<String> theExcludedList) {
+// Iterate over the files
+String report = myJobRunner.getJobInfo(theInstanceId).getReport();
+ourLog.debug("Export job {} report: {}", theInstanceId, report);
+if (!theContainedList.isEmpty()) {
+assertThat("report for instance " + theInstanceId + " is empty", report, not(emptyOrNullString()));
+}
+BulkExportJobResults results = JsonUtil.deserialize(report, BulkExportJobResults.class);
+
+Set<String> foundIds = new HashSet<>();
+for (Map.Entry<String, List<String>> file : results.getResourceTypeToBinaryIds().entrySet()) {
+String resourceType = file.getKey();
+List<String> binaryIds = file.getValue();
+for (var nextBinaryId : binaryIds) {
+
+Binary binary = myBinaryDao.read(new IdType(nextBinaryId), mySrd);
+assertEquals(Constants.CT_FHIR_NDJSON, binary.getContentType());
+
+String nextNdJsonFileContent = new String(binary.getContent(), Constants.CHARSET_UTF8);
+ourLog.trace("Export job {} file {} contents: {}", theInstanceId, nextBinaryId, nextNdJsonFileContent);
+
+List<String> lines = new BufferedReader(new StringReader(nextNdJsonFileContent))
+.lines().toList();
+ourLog.debug("Export job {} file {} line-count: {}", theInstanceId, nextBinaryId, lines.size());
+
+lines.stream()
+.map(line -> myFhirContext.newJsonParser().parseResource(line))
+.map(r -> r.getIdElement().toUnqualifiedVersionless())
+.forEach(nextId -> {
+if (!resourceType.equals(nextId.getResourceType())) {
+fail("Found resource of type " + nextId.getResourceType() + " in file for type " + resourceType);
+} else {
+if (!foundIds.add(nextId.getValue())) {
+fail("Found duplicate ID: " + nextId.getValue());
+}
+}
+});
+}
+}
+
+ourLog.debug("Export job {} exported resources {}", theInstanceId, foundIds);
+
+for (String containedString : theContainedList) {
+assertThat("export has expected ids", foundIds, hasItem(containedString));
+}
+for (String excludedString : theExcludedList) {
+assertThat("export doesn't have expected ids", foundIds, not(hasItem(excludedString)));
+}
+}
+
+private String startJob(BulkDataExportOptions theOptions) {
+Batch2JobStartResponse startResponse = myJobRunner.startNewJob(BulkExportUtils.createBulkExportJobParametersFromExportOptions(theOptions));
+assertNotNull(startResponse);
+return startResponse.getInstanceId();
+}
+
+}
 
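The submit loop in the new test above caps memory by draining the oldest futures once 200 are outstanding. The same back-pressure idea, reduced to a self-contained sketch in plain JDK code (class and names here are illustrative, not from the codebase):

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Future;
    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.concurrent.ThreadPoolExecutor;
    import java.util.concurrent.TimeUnit;

    class BoundedSubmitSketch {
        public static void main(String[] args) throws Exception {
            ExecutorService pool = new ThreadPoolExecutor(10, 10, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<>());
            List<Future<Boolean>> futures = new ArrayList<>();
            for (int i = 0; i < 1_000; i++) {
                futures.add(pool.submit(() -> true)); // stand-in for one export job
                // Back-pressure: once too many results are pending, block on the
                // oldest ones so the list (and any failure) can't pile up unboundedly
                if (futures.size() > 200) {
                    while (futures.size() > 100) {
                        if (!futures.remove(0).get()) {
                            throw new IllegalStateException("job failed");
                        }
                    }
                }
            }
            for (Future<Boolean> next : futures) {
                next.get(); // surfaces any remaining failure as ExecutionException
            }
            pool.shutdown();
        }
    }
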
@@ -18,7 +18,6 @@ import org.hl7.fhir.r4.model.*;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.MethodOrderer;
-import org.junit.jupiter.api.Order;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.TestMethodOrder;
 import org.junit.jupiter.params.ParameterizedTest;
@@ -38,12 +37,14 @@ import java.util.Set;
 import java.util.concurrent.TimeUnit;
 import java.util.stream.Stream;
 
+import static ca.uhn.fhir.jpa.dao.r4.FhirResourceDaoR4TagsInlineTest.createSearchParameterForInlineSecurity;
 import static org.apache.commons.lang3.StringUtils.isNotBlank;
 import static org.awaitility.Awaitility.await;
 import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.hasItem;
 import static org.hamcrest.Matchers.not;
 import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertFalse;
 import static org.junit.jupiter.api.Assertions.assertNotNull;
 import static org.junit.jupiter.api.Assertions.fail;
 
@@ -57,6 +58,7 @@ public class BulkDataExportTest extends BaseResourceProviderR4Test {
 @AfterEach
 void afterEach() {
 myStorageSettings.setIndexMissingFields(JpaStorageSettings.IndexEnabledEnum.DISABLED);
+myStorageSettings.setTagStorageMode(new JpaStorageSettings().getTagStorageMode());
 }
 
 @BeforeEach
@@ -96,8 +98,45 @@ public class BulkDataExportTest extends BaseResourceProviderR4Test {
 verifyBulkExportResults(options, Collections.singletonList("Patient/PF"), Collections.singletonList("Patient/PM"));
 }
 
+
+@Test
+public void testGroupBulkExportWithTypeFilter_OnTags_InlineTagMode() {
+myStorageSettings.setTagStorageMode(JpaStorageSettings.TagStorageModeEnum.INLINE);
+mySearchParameterDao.update(createSearchParameterForInlineSecurity(), mySrd);
+mySearchParamRegistry.forceRefresh();
+
+// Create some resources
+Patient patient = new Patient();
+patient.setId("PF");
+patient.getMeta().addSecurity("http://security", "val0", null);
+patient.setActive(true);
+myClient.update().resource(patient).execute();
+
+patient = new Patient();
+patient.setId("PM");
+patient.getMeta().addSecurity("http://security", "val1", null);
+patient.setActive(true);
+myClient.update().resource(patient).execute();
+
+Group group = new Group();
+group.setId("Group/G");
+group.setActive(true);
+group.addMember().getEntity().setReference("Patient/PF");
+group.addMember().getEntity().setReference("Patient/PM");
+myClient.update().resource(group).execute();
+
+// set the export options
+BulkDataExportOptions options = new BulkDataExportOptions();
+options.setResourceTypes(Sets.newHashSet("Patient"));
+options.setGroupId(new IdType("Group", "G"));
+options.setFilters(Sets.newHashSet("Patient?_security=http://security|val1"));
+options.setExportStyle(BulkDataExportOptions.ExportStyle.GROUP);
+options.setOutputFormat(Constants.CT_FHIR_NDJSON);
+verifyBulkExportResults(options, Collections.singletonList("Patient/PM"), Collections.singletonList("Patient/PF"));
+}
+
+
 @Test
-@Order(0)
 public void testGroupBulkExportNotInGroup_DoesNotShowUp() {
 // Create some resources
 Patient patient = new Patient();
@@ -652,7 +691,7 @@ public class BulkDataExportTest extends BaseResourceProviderR4Test {
 Batch2JobStartResponse startResponse = myJobRunner.startNewJob(BulkExportUtils.createBulkExportJobParametersFromExportOptions(theOptions));
 
 assertNotNull(startResponse);
-assertEquals(false, startResponse.isUsesCachedResult());
+assertFalse(startResponse.isUsesCachedResult());
 
 // Run a scheduled pass to build the export
 myBatch2JobHelper.awaitJobCompletion(startResponse.getInstanceId(), 120);
@@ -701,10 +740,10 @@ public class BulkDataExportTest extends BaseResourceProviderR4Test {
 }
 
 for (String containedString : theContainedList) {
-assertThat(foundIds, hasItem(containedString));
+assertThat("Didn't find expected ID " + containedString + " in IDS: " + foundIds, foundIds, hasItem(containedString));
 }
 for (String excludedString : theExcludedList) {
-assertThat(foundIds, not(hasItem(excludedString)));
+assertThat("Didn't want unexpected ID " + excludedString + " in IDS: " + foundIds, foundIds, not(hasItem(excludedString)));
 }
 }
 
@@ -1,5 +1,6 @@
 package ca.uhn.fhir.jpa.bulk;
 
+import ca.uhn.fhir.batch2.api.IJobMaintenanceService;
 import ca.uhn.fhir.batch2.api.IJobPersistence;
 import ca.uhn.fhir.batch2.model.JobInstance;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
@@ -8,6 +9,10 @@ import ca.uhn.fhir.jpa.api.svc.IBatch2JobRunner;
 import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse;
 import ca.uhn.fhir.jpa.bulk.export.model.BulkExportJobStatusEnum;
 import ca.uhn.fhir.jpa.bulk.export.model.BulkExportResponseJson;
+import ca.uhn.fhir.jpa.dao.data.IBatch2JobInstanceRepository;
+import ca.uhn.fhir.jpa.dao.data.IBatch2WorkChunkRepository;
+import ca.uhn.fhir.jpa.entity.Batch2JobInstanceEntity;
+import ca.uhn.fhir.jpa.entity.Batch2WorkChunkEntity;
 import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
 import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
 import ca.uhn.fhir.jpa.util.BulkExportUtils;
@@ -16,6 +21,7 @@ import ca.uhn.fhir.rest.api.Constants;
 import ca.uhn.fhir.rest.api.server.RequestDetails;
 import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
 import ca.uhn.fhir.rest.api.server.bulk.BulkDataExportOptions;
+import ca.uhn.fhir.util.BundleBuilder;
 import ca.uhn.fhir.util.JsonUtil;
 import ca.uhn.fhir.util.SearchParameterUtil;
 import com.google.common.collect.Sets;
@@ -33,22 +39,25 @@ import org.hl7.fhir.r4.model.Encounter;
 import org.hl7.fhir.r4.model.Enumerations;
 import org.hl7.fhir.r4.model.Group;
 import org.hl7.fhir.r4.model.IdType;
+import org.hl7.fhir.r4.model.InstantType;
 import org.hl7.fhir.r4.model.Observation;
 import org.hl7.fhir.r4.model.Patient;
 import org.hl7.fhir.r4.model.Reference;
-import org.jetbrains.annotations.NotNull;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Disabled;
 import org.junit.jupiter.api.Nested;
 import org.junit.jupiter.api.Test;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
 
+import javax.annotation.Nonnull;
 import java.io.IOException;
 import java.nio.charset.StandardCharsets;
 import java.util.ArrayList;
 import java.util.Collections;
+import java.util.Date;
 import java.util.HashMap;
 import java.util.HashSet;
 import java.util.List;
@@ -82,6 +91,12 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 
 @Autowired
 private IJobPersistence myJobPersistence;
+@Autowired
+private IJobMaintenanceService myJobMaintenanceService;
+@Autowired
+private IBatch2JobInstanceRepository myJobInstanceRepository;
+@Autowired
+private IBatch2WorkChunkRepository myWorkChunkRepository;
 
 @BeforeEach
 public void beforeEach() {
@@ -114,6 +129,7 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 
 myBatch2JobHelper.awaitJobCompletion(secondJobId);
 }
+
 @Test
 public void testPollingLocationContainsAllRequiredAttributesUponCompletion() throws IOException {
 
@@ -146,7 +162,7 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 }
 }
 
-@NotNull
+@Nonnull
 private String getJobIdFromPollingLocation(String pollingLocation) {
 return pollingLocation.substring(pollingLocation.indexOf("_jobId=") + 7);
 }
@@ -369,6 +385,8 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 options.setExportStyle(BulkDataExportOptions.ExportStyle.SYSTEM);
 options.setOutputFormat(Constants.CT_FHIR_NDJSON);
 
+myCaptureQueriesListener.clear();
+
 Batch2JobStartResponse startResponse = myJobRunner.startNewJob(BulkExportUtils.createBulkExportJobParametersFromExportOptions(options));
 
 assertNotNull(startResponse);
@@ -378,6 +396,23 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 // Run a scheduled pass to build the export
 myBatch2JobHelper.awaitJobCompletion(startResponse.getInstanceId());
 
+String queries = myCaptureQueriesListener
+.getUpdateQueries()
+.stream()
+.filter(t -> t.getSql(false, false).toUpperCase().contains(" BT2_JOB_INSTANCE "))
+.map(t -> new InstantType(new Date(t.getQueryTimestamp())) + " - " + t.getSql(true, false))
+.collect(Collectors.joining("\n * "));
+ourLog.info("Update queries:\n * " + queries);
+
+runInTransaction(() -> {
+String entities = myJobInstanceRepository
+.findAll()
+.stream()
+.map(t -> t.toString())
+.collect(Collectors.joining("\n * "));
+ourLog.info("Entities:\n * " + entities);
+});
+
 final Optional<JobInstance> optJobInstance = myJobPersistence.fetchInstance(jobId);
 
 assertNotNull(optJobInstance);
@@ -502,25 +537,28 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 @Test
 public void testVeryLargeGroup() {
 
+BundleBuilder bb = new BundleBuilder(myFhirContext);
+
 Group group = new Group();
 group.setId("Group/G");
 group.setActive(true);
+bb.addTransactionUpdateEntry(group);
+
 for (int i = 0; i < 600; i++) {
 Patient patient = new Patient();
 patient.setId("PING-" + i);
 patient.setGender(Enumerations.AdministrativeGender.FEMALE);
 patient.setActive(true);
-myClient.update().resource(patient).execute();
+bb.addTransactionUpdateEntry(patient);
 group.addMember().getEntity().setReference("Patient/PING-" + i);
 
 Observation obs = new Observation();
 obs.setId("obs-" + i);
 obs.setSubject(new Reference("Patient/PING-" + i));
-myClient.update().resource(obs).execute();
+bb.addTransactionUpdateEntry(obs);
 }
 
-myClient.update().resource(group).execute();
+myClient.transaction().withBundle(bb.getBundle()).execute();
 
 HashSet<String> resourceTypes = Sets.newHashSet("Group", "Patient", "Observation");
 BulkExportJobResults bulkExportJobResults = startGroupBulkExportJobAndAwaitCompletion(resourceTypes, new HashSet<>(), "G");
@@ -567,7 +605,6 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 
 @Test
 public void testDifferentTypesDoNotUseCachedResults() {
-
 Patient patient = new Patient();
 patient.setId("PING1");
 patient.setGender(Enumerations.AdministrativeGender.FEMALE);
@@ -601,6 +638,13 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 Map<String, List<IBaseResource>> secondMap = convertJobResultsToResources(altBulkExportResults);
 assertThat(secondMap.get("Patient"), hasSize(1));
 assertThat(secondMap.get("Coverage"), hasSize(1));
+
+runInTransaction(() -> {
+List<Batch2JobInstanceEntity> instances = myJobInstanceRepository.findAll();
+ourLog.info("Job instance states:\n * {}", instances.stream().map(Object::toString).collect(Collectors.joining("\n * ")));
+List<Batch2WorkChunkEntity> workChunks = myWorkChunkRepository.findAll();
+ourLog.info("Work chunks instance states:\n * {}", workChunks.stream().map(Object::toString).collect(Collectors.joining("\n * ")));
+});
 }
 
 
@@ -1014,7 +1058,7 @@ public class BulkExportUseCaseTest extends BaseResourceProviderR4Test {
 }
 
 @Nested
-public class WithClientIdStrategyEnumANY {
+public class WithClientIdStrategyEnumANYTest {
 
 @BeforeEach
 void setUp() {
 
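The testVeryLargeGroup change above replaces roughly 1,200 individual REST updates with a single transaction bundle, which cuts the setup to one HTTP round trip. The BundleBuilder pattern in isolation, as a minimal sketch using only the calls that appear in the diff (the wrapper class and method are illustrative):

    import ca.uhn.fhir.context.FhirContext;
    import ca.uhn.fhir.rest.client.api.IGenericClient;
    import ca.uhn.fhir.util.BundleBuilder;
    import org.hl7.fhir.r4.model.Patient;

    class TransactionBatchSketch {
        static void uploadPatients(IGenericClient theClient) {
            BundleBuilder bb = new BundleBuilder(FhirContext.forR4());
            for (int i = 0; i < 600; i++) {
                Patient patient = new Patient();
                patient.setId("PING-" + i);
                patient.setActive(true);
                bb.addTransactionUpdateEntry(patient); // queues a PUT entry; nothing is sent yet
            }
            // One HTTP round trip commits every queued entry atomically
            theClient.transaction().withBundle(bb.getBundle()).execute();
        }
    }
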
@@ -3,12 +3,14 @@ package ca.uhn.fhir.jpa.bulk;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Disabled;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
 
-public class BulkExportUseCaseTestAnyMode extends BulkExportUseCaseTest {
-private static final Logger ourLog = LoggerFactory.getLogger(BulkExportUseCaseTestAnyMode.class);
+@Disabled
+public class BulkExportUseCaseTestAnyModeIT extends BulkExportUseCaseTest {
+private static final Logger ourLog = LoggerFactory.getLogger(BulkExportUseCaseTestAnyModeIT.class);
 
 
 @BeforeEach
@@ -0,0 +1,104 @@
+package ca.uhn.fhir.jpa.bulk.export;
+
+import ca.uhn.fhir.batch2.api.IJobDataSink;
+import ca.uhn.fhir.batch2.api.StepExecutionDetails;
+import ca.uhn.fhir.batch2.jobs.export.ExpandResourcesStep;
+import ca.uhn.fhir.batch2.jobs.export.models.BulkExportJobParameters;
+import ca.uhn.fhir.batch2.jobs.export.models.ExpandedResourcesList;
+import ca.uhn.fhir.batch2.jobs.export.models.ResourceIdList;
+import ca.uhn.fhir.batch2.jobs.models.BatchResourceId;
+import ca.uhn.fhir.batch2.model.JobInstance;
+import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
+import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
+import org.hl7.fhir.r4.model.Patient;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.CsvSource;
+import org.mockito.ArgumentCaptor;
+import org.mockito.Captor;
+import org.mockito.Mock;
+import org.springframework.beans.factory.annotation.Autowired;
+
+import java.util.List;
+import java.util.stream.IntStream;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.containsString;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.mockito.Mockito.times;
+import static org.mockito.Mockito.verify;
+
+public class ExpandResourcesStepJpaTest extends BaseJpaR4Test {
+
+	@Autowired
+	private ExpandResourcesStep myExpandResourcesStep;
+
+	@Mock
+	private IJobDataSink<ExpandedResourcesList> mySink;
+	@Captor
+	private ArgumentCaptor<ExpandedResourcesList> myWorkChunkCaptor;
+
+	@Override
+	public void afterCleanupDao() {
+		super.afterCleanupDao();
+
+		myStorageSettings.setTagStorageMode(new JpaStorageSettings().getTagStorageMode());
+	}
+
+	/**
+	 * Make sure we load inline tags efficiently when generating bulk export
+	 */
+	@ParameterizedTest
+	@CsvSource({"INLINE,2", "NON_VERSIONED,3", "VERSIONED,3"})
+	public void testBulkExportExpandResourcesStep(JpaStorageSettings.TagStorageModeEnum theTagStorageMode, int theExpectedSelectQueries) {
+		// Setup
+
+		myStorageSettings.setTagStorageMode(theTagStorageMode);
+
+		int count = 10;
+		List<Long> ids = IntStream.range(0, count)
+			.boxed()
+			.map(t -> {
+				Patient p = new Patient();
+				p.getMeta().addTag().setSystem("http://static").setCode("tag");
+				p.getMeta().addTag().setSystem("http://dynamic").setCode("tag" + t);
+				return myPatientDao.create(p, mySrd).getId().getIdPartAsLong();
+			}).toList();
+		assertEquals(count, ids.size());
+
+		ResourceIdList resourceList = new ResourceIdList();
+		resourceList.setResourceType("Patient");
+		resourceList.setIds(ids.stream().map(t -> new BatchResourceId().setResourceType("Patient").setId(Long.toString(t))).toList());
+
+		BulkExportJobParameters params = new BulkExportJobParameters();
+		JobInstance jobInstance = new JobInstance();
+		String chunkId = "ABC";
+
+		StepExecutionDetails<BulkExportJobParameters, ResourceIdList> details = new StepExecutionDetails<>(params, resourceList, jobInstance, chunkId);
+
+		// Test
+
+		myCaptureQueriesListener.clear();
+		myExpandResourcesStep.run(details, mySink);
+
+		// Verify
+
+		verify(mySink, times(1)).accept(myWorkChunkCaptor.capture());
+		ExpandedResourcesList expandedResourceList = myWorkChunkCaptor.getValue();
+		assertEquals(10, expandedResourceList.getStringifiedResources().size());
+		assertThat(expandedResourceList.getStringifiedResources().get(0), containsString("{\"system\":\"http://static\",\"code\":\"tag\"}"));
+		assertThat(expandedResourceList.getStringifiedResources().get(0), containsString("{\"system\":\"http://dynamic\",\"code\":\"tag0\"}"));
+		assertThat(expandedResourceList.getStringifiedResources().get(1), containsString("{\"system\":\"http://static\",\"code\":\"tag\"}"));
+		assertThat(expandedResourceList.getStringifiedResources().get(1), containsString("{\"system\":\"http://dynamic\",\"code\":\"tag1\"}"));
+
+		// Verify query counts
+		assertEquals(theExpectedSelectQueries, myCaptureQueriesListener.countSelectQueries());
+		assertEquals(0, myCaptureQueriesListener.countInsertQueries());
+		assertEquals(0, myCaptureQueriesListener.countUpdateQueries());
+		assertEquals(0, myCaptureQueriesListener.countDeleteQueries());
+		assertEquals(2, myCaptureQueriesListener.countCommits());
+		assertEquals(0, myCaptureQueriesListener.countRollbacks());
+
+	}
+
+}
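A note on reading the new ExpandResourcesStepJpaTest above: each @CsvSource row becomes one test invocation, pairing a tag storage mode with the number of SELECT statements the expand step is expected to issue for the ten-patient chunk. The INLINE row expects one fewer SELECT, which is consistent with inline tags living in the stored resource body rather than in separate tag tables:

// Row-to-invocation mapping for the @CsvSource above:
//   "INLINE,2"        -> testBulkExportExpandResourcesStep(INLINE, 2)
//   "NON_VERSIONED,3" -> testBulkExportExpandResourcesStep(NON_VERSIONED, 3)
//   "VERSIONED,3"     -> testBulkExportExpandResourcesStep(VERSIONED, 3)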
@@ -37,6 +37,7 @@ import org.hl7.fhir.r4.model.DecimalType;
 import org.hl7.fhir.r4.model.Encounter;
 import org.hl7.fhir.r4.model.Enumerations;
 import org.hl7.fhir.r4.model.IdType;
+import org.hl7.fhir.r4.model.InstantType;
 import org.hl7.fhir.r4.model.Observation;
 import org.hl7.fhir.r4.model.Organization;
 import org.hl7.fhir.r4.model.Patient;
@@ -87,6 +88,32 @@ public class FhirResourceDaoR4CreateTest extends BaseJpaR4Test {
 		myStorageSettings.setInlineResourceTextBelowSize(new JpaStorageSettings().getInlineResourceTextBelowSize());
 	}

+	@Test
+	public void testCreateDoesntIndexForMetaSearchTags() {
+		Observation obs = new Observation();
+		obs.setId("A");
+		obs.addNote().setText("A non indexed value");
+		obs.getMeta().setLastUpdatedElement(InstantType.now());
+		obs.getMeta().addTag().setSystem("http://foo").setCode("blah");
+		obs.getMeta().addTag().setSystem("http://foo").setCode("blah2");
+		obs.getMeta().addSecurity().setSystem("http://foo").setCode("blah");
+		obs.getMeta().addSecurity().setSystem("http://foo").setCode("blah2");
+		obs.getMeta().addProfile("http://blah");
+		obs.getMeta().addProfile("http://blah2");
+		obs.getMeta().setSource("http://foo#bar");
+		myObservationDao.update(obs, new SystemRequestDetails());
+
+		runInTransaction(()->{
+			logAllTokenIndexes();
+			logAllStringIndexes();
+			assertEquals(0, myResourceIndexedSearchParamStringDao.count());
+			assertEquals(0, myResourceIndexedSearchParamTokenDao.count());
+			assertEquals(0, myResourceIndexedSearchParamUriDao.count());
+		});
+
+	}
+
 	@Test
 	public void testCreateLinkCreatesAppropriatePaths() {

@@ -17,8 +17,13 @@ import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
+import org.springframework.beans.factory.annotation.Autowired;

+import java.util.ArrayList;
+import java.util.List;
 import java.util.UUID;
+import java.util.stream.Collectors;
+import java.util.stream.IntStream;

 import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.containsString;
@@ -109,7 +114,8 @@ public class FhirResourceDaoR4SearchSqlTest extends BaseJpaR4Test {
 		boolean reindexParamCache = myStorageSettings.isMarkResourcesForReindexingUponSearchParameterChange();
 		myStorageSettings.setMarkResourcesForReindexingUponSearchParameterChange(false);

-		SearchParameter searchParameter = FhirResourceDaoR4TagsTest.createSearchParamForInlineResourceProfile();
+		// SearchParameter searchParameter = FhirResourceDaoR4TagsTest.createSearchParamForInlineResourceProfile();
+		SearchParameter searchParameter = FhirResourceDaoR4TagsInlineTest.createSearchParameterForInlineProfile();
 		ourLog.debug("SearchParam:\n{}", myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(searchParameter));
 		mySearchParameterDao.update(searchParameter, mySrd);
 		mySearchParamRegistry.forceRefresh();

@@ -0,0 +1,236 @@
+package ca.uhn.fhir.jpa.dao.r4;
+
+import ca.uhn.fhir.context.FhirContext;
+import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
+import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
+import ca.uhn.fhir.rest.gclient.TokenClientParam;
+import org.hl7.fhir.r4.model.Bundle;
+import org.hl7.fhir.r4.model.Enumerations;
+import org.hl7.fhir.r4.model.IdType;
+import org.hl7.fhir.r4.model.Patient;
+import org.hl7.fhir.r4.model.SearchParameter;
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.Test;
+
+import javax.annotation.Nonnull;
+
+import static ca.uhn.fhir.jpa.dao.r4.FhirResourceDaoR4TagsTest.toProfiles;
+import static ca.uhn.fhir.jpa.dao.r4.FhirResourceDaoR4TagsTest.toSecurityLabels;
+import static ca.uhn.fhir.jpa.dao.r4.FhirResourceDaoR4TagsTest.toTags;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.contains;
+import static org.hamcrest.Matchers.containsInAnyOrder;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+
+@SuppressWarnings({"Duplicates"})
+public class FhirResourceDaoR4TagsInlineTest extends BaseResourceProviderR4Test {
+
+	private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(FhirResourceDaoR4TagsInlineTest.class);
+
+	@Override
+	@AfterEach
+	public final void after() throws Exception {
+		super.after();
+		myStorageSettings.setTagStorageMode(JpaStorageSettings.DEFAULT_TAG_STORAGE_MODE);
+	}
+
+	@Test
+	public void testInlineTags_StoreAndRetrieve() {
+		myStorageSettings.setTagStorageMode(JpaStorageSettings.TagStorageModeEnum.INLINE);
+
+		// Store a first version
+		Patient patient = new Patient();
+		patient.setId("Patient/A");
+		patient.getMeta().addProfile("http://profile1");
+		patient.getMeta().addTag("http://tag1", "vtag1", "dtag1");
+		patient.getMeta().addSecurity("http://sec1", "vsec1", "dsec1");
+		patient.setActive(true);
+		myPatientDao.update(patient, mySrd);
+
+		runInTransaction(() -> {
+			assertEquals(0, myResourceTagDao.count());
+			assertEquals(0, myResourceHistoryTagDao.count());
+			assertEquals(0, myTagDefinitionDao.count());
+		});
+
+		// Read it back
+		patient = myPatientDao.read(new IdType("Patient/A/_history/1"), mySrd);
+		assertThat(toProfiles(patient).toString(), toProfiles(patient), contains("http://profile1"));
+		assertThat(toTags(patient).toString(), toTags(patient), contains("http://tag1|vtag1|dtag1"));
+		assertThat(toSecurityLabels(patient).toString(), toSecurityLabels(patient), contains("http://sec1|vsec1|dsec1"));
+
+		// Store a second version
+		patient = new Patient();
+		patient.setId("Patient/A");
+		patient.getMeta().addProfile("http://profile2");
+		patient.getMeta().addTag("http://tag2", "vtag2", "dtag2");
+		patient.getMeta().addSecurity("http://sec2", "vsec2", "dsec2");
+		patient.setActive(true);
+		myPatientDao.update(patient, mySrd);
+
+		runInTransaction(() -> {
+			assertEquals(0, myResourceTagDao.count());
+			assertEquals(0, myResourceHistoryTagDao.count());
+			assertEquals(0, myTagDefinitionDao.count());
+		});
+
+		// First version should have only the initial tags
+		patient = myPatientDao.read(new IdType("Patient/A/_history/1"), mySrd);
+		assertThat(toProfiles(patient).toString(), toProfiles(patient), contains("http://profile1"));
+		assertThat(toTags(patient).toString(), toTags(patient), contains("http://tag1|vtag1|dtag1"));
+		assertThat(toSecurityLabels(patient).toString(), toSecurityLabels(patient), contains("http://sec1|vsec1|dsec1"));
+
+		// Second version should have the new set of tags
+		// TODO: We could copy these forward like we do for non-inline mode. Perhaps in the future.
+		patient = myPatientDao.read(new IdType("Patient/A/_history/2"), mySrd);
+		assertThat(toProfiles(patient).toString(), toProfiles(patient), contains("http://profile2"));
+		assertThat(toTags(patient).toString(), toTags(patient), containsInAnyOrder("http://tag2|vtag2|dtag2"));
+		assertThat(toSecurityLabels(patient).toString(), toSecurityLabels(patient), containsInAnyOrder("http://sec2|vsec2|dsec2"));
+
+	}
+
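Context for the assertions above: with TagStorageModeEnum.INLINE, tags, security labels, and profiles are serialized into the stored resource body itself, so the tables behind myResourceTagDao, myResourceHistoryTagDao, and myTagDefinitionDao stay empty, and each version carries exactly the tags it was written with. A minimal configuration sketch (assuming the JpaStorageSettings bean from the server's JPA configuration is passed in):

// Sketch: enable inline tag storage on a HAPI FHIR JPA server.
void enableInlineTags(JpaStorageSettings theSettings) {
	theSettings.setTagStorageMode(JpaStorageSettings.TagStorageModeEnum.INLINE);
	// After this, meta.tag / meta.security / meta.profile are stored inline and
	// are versioned with the resource, so they are not carried forward on update.
}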
+	@Test
+	public void testInlineTags_Search_Tag() {
+		myStorageSettings.setTagStorageMode(JpaStorageSettings.TagStorageModeEnum.INLINE);
+
+		SearchParameter searchParameter = createSearchParameterForInlineTag();
+		ourLog.debug("SearchParam:\n{}", myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(searchParameter));
+		mySearchParameterDao.update(searchParameter, mySrd);
+		mySearchParamRegistry.forceRefresh();
+
+		createPatientsForInlineSearchTests();
+
+		logAllTokenIndexes();
+
+		// Perform a search
+		Bundle outcome = myClient.search().forResource("Patient").where(new TokenClientParam("_tag").exactly().systemAndCode("http://tag1", "vtag1")).returnBundle(Bundle.class).execute();
+		assertThat(toUnqualifiedVersionlessIdValues(outcome), containsInAnyOrder("Patient/A", "Patient/B"));
+
+		validatePatientSearchResultsForInlineTags(outcome);
+	}
+
+	@Test
+	public void testInlineTags_Search_Profile() {
+		myStorageSettings.setTagStorageMode(JpaStorageSettings.TagStorageModeEnum.INLINE);
+
+		SearchParameter searchParameter = createSearchParameterForInlineProfile();
+		ourLog.debug("SearchParam:\n{}", myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(searchParameter));
+		mySearchParameterDao.update(searchParameter, mySrd);
+		mySearchParamRegistry.forceRefresh();
+
+		createPatientsForInlineSearchTests();
+
+		logAllTokenIndexes();
+
+		// Perform a search
+		Bundle outcome = myClient.search().forResource("Patient").where(new TokenClientParam("_profile").exactly().code("http://profile1")).returnBundle(Bundle.class).execute();
+		assertThat(toUnqualifiedVersionlessIdValues(outcome), containsInAnyOrder("Patient/A", "Patient/B"));
+		validatePatientSearchResultsForInlineTags(outcome);
+	}
+
+	@Test
+	public void testInlineTags_Search_Security() {
+		myStorageSettings.setTagStorageMode(JpaStorageSettings.TagStorageModeEnum.INLINE);
+
+		SearchParameter searchParameter = createSearchParameterForInlineSecurity();
+		ourLog.debug("SearchParam:\n{}", myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(searchParameter));
+		mySearchParameterDao.update(searchParameter, mySrd);
+		mySearchParamRegistry.forceRefresh();
+
+		createPatientsForInlineSearchTests();
+
+		logAllTokenIndexes();
+
+		// Perform a search
+		Bundle outcome = myClient.search().forResource("Patient").where(new TokenClientParam("_security").exactly().systemAndCode("http://sec1", "vsec1")).returnBundle(Bundle.class).execute();
+		assertThat(toUnqualifiedVersionlessIdValues(outcome), containsInAnyOrder("Patient/A", "Patient/B"));
+
+		validatePatientSearchResultsForInlineTags(outcome);
+	}
+
+	private void validatePatientSearchResultsForInlineTags(Bundle outcome) {
+		Patient patient;
+		patient = (Patient) outcome.getEntry().get(0).getResource();
+		assertThat(toProfiles(patient).toString(), toProfiles(patient), contains("http://profile1"));
+		assertThat(toTags(patient).toString(), toTags(patient), contains("http://tag1|vtag1|dtag1"));
+		assertThat(toSecurityLabels(patient).toString(), toSecurityLabels(patient), contains("http://sec1|vsec1|dsec1"));
+		patient = (Patient) outcome.getEntry().get(1).getResource();
+		assertThat(toProfiles(patient).toString(), toProfiles(patient), contains("http://profile1"));
+		assertThat(toTags(patient).toString(), toTags(patient), contains("http://tag1|vtag1|dtag1"));
+		assertThat(toSecurityLabels(patient).toString(), toSecurityLabels(patient), contains("http://sec1|vsec1|dsec1"));
+	}
+
+	private void createPatientsForInlineSearchTests() {
+		Patient patient = new Patient();
+		patient.setId("Patient/A");
+		patient.getMeta().addProfile("http://profile1");
+		patient.getMeta().addTag("http://tag1", "vtag1", "dtag1");
+		patient.getMeta().addSecurity("http://sec1", "vsec1", "dsec1");
+		patient.setActive(true);
+		myPatientDao.update(patient, mySrd);
+
+		patient = new Patient();
+		patient.setId("Patient/B");
+		patient.getMeta().addProfile("http://profile1");
+		patient.getMeta().addTag("http://tag1", "vtag1", "dtag1");
+		patient.getMeta().addSecurity("http://sec1", "vsec1", "dsec1");
+		patient.setActive(true);
+		myPatientDao.update(patient, mySrd);
+
+		patient = new Patient();
+		patient.setId("Patient/NO");
+		patient.getMeta().addProfile("http://profile99");
+		patient.getMeta().addTag("http://tag99", "vtag99", "dtag99");
+		patient.getMeta().addSecurity("http://sec99", "vsec99", "dsec99");
+		patient.setActive(true);
+		myPatientDao.update(patient, mySrd);
+	}
+
+	@Nonnull
+	public static SearchParameter createSearchParameterForInlineTag() {
+		SearchParameter searchParameter = new SearchParameter();
+		searchParameter.setId("SearchParameter/resource-tag");
+		for (String next : FhirContext.forR4Cached().getResourceTypes().stream().sorted().toList()) {
+			searchParameter.addBase(next);
+		}
+		searchParameter.setStatus(Enumerations.PublicationStatus.ACTIVE);
+		searchParameter.setType(Enumerations.SearchParamType.TOKEN);
+		searchParameter.setCode("_tag");
+		searchParameter.setName("Tag");
+		searchParameter.setExpression("meta.tag");
+		return searchParameter;
+	}
+
+	@Nonnull
+	public static SearchParameter createSearchParameterForInlineSecurity() {
+		SearchParameter searchParameter = new SearchParameter();
+		searchParameter.setId("SearchParameter/resource-security");
+		for (String next : FhirContext.forR4Cached().getResourceTypes().stream().sorted().toList()) {
+			searchParameter.addBase(next);
+		}
+		searchParameter.setStatus(Enumerations.PublicationStatus.ACTIVE);
+		searchParameter.setType(Enumerations.SearchParamType.TOKEN);
+		searchParameter.setCode("_security");
+		searchParameter.setName("Security");
+		searchParameter.setExpression("meta.security");
+		return searchParameter;
+	}
+
+	@Nonnull
+	public static SearchParameter createSearchParameterForInlineProfile() {
+		SearchParameter searchParameter = new SearchParameter();
+		searchParameter.setId("SearchParameter/resource-profile");
+		for (String next : FhirContext.forR4Cached().getResourceTypes().stream().sorted().toList()) {
+			searchParameter.addBase(next);
+		}
+		searchParameter.setStatus(Enumerations.PublicationStatus.ACTIVE);
+		searchParameter.setType(Enumerations.SearchParamType.URI);
+		searchParameter.setCode("_profile");
+		searchParameter.setName("Profile");
+		searchParameter.setExpression("meta.profile");
+		return searchParameter;
+	}
+
+}
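Worth noting for anyone reusing the helpers above: in INLINE mode the built-in _tag/_profile/_security indexing does not apply, so each search test first registers the matching SearchParameter and forces a registry refresh before querying. A condensed usage sketch, using the names from the test class above:

// Sketch: register the inline-tag SearchParameter, then search on it.
SearchParameter sp = FhirResourceDaoR4TagsInlineTest.createSearchParameterForInlineTag();
mySearchParameterDao.update(sp, mySrd); // store the SearchParameter
mySearchParamRegistry.forceRefresh();   // activate it without waiting for the cache
Bundle outcome = myClient.search()
	.forResource("Patient")
	.where(new TokenClientParam("_tag").exactly().systemAndCode("http://tag1", "vtag1"))
	.returnBundle(Bundle.class)
	.execute();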
@@ -595,32 +595,32 @@ public class FhirResourceDaoR4TagsTest extends BaseResourceProviderR4Test {
 	}

 	@Nonnull
-	private List<String> toTags(Patient patient) {
+	static List<String> toTags(Patient patient) {
 		return toTags(patient.getMeta());
 	}

 	@Nonnull
-	private List<String> toSecurityLabels(Patient patient) {
+	static List<String> toSecurityLabels(Patient patient) {
 		return toSecurityLabels(patient.getMeta());
 	}

 	@Nonnull
-	private List<String> toProfiles(Patient patient) {
+	static List<String> toProfiles(Patient patient) {
 		return toProfiles(patient.getMeta());
 	}

 	@Nonnull
-	private static List<String> toTags(Meta meta) {
+	static List<String> toTags(Meta meta) {
 		return meta.getTag().stream().map(t -> t.getSystem() + "|" + t.getCode() + "|" + t.getDisplay()).collect(Collectors.toList());
 	}

 	@Nonnull
-	private static List<String> toSecurityLabels(Meta meta) {
+	static List<String> toSecurityLabels(Meta meta) {
 		return meta.getSecurity().stream().map(t -> t.getSystem() + "|" + t.getCode() + "|" + t.getDisplay()).collect(Collectors.toList());
 	}

 	@Nonnull
-	private static List<String> toProfiles(Meta meta) {
+	static List<String> toProfiles(Meta meta) {
 		return meta.getProfile().stream().map(t -> t.getValue()).collect(Collectors.toList());
 	}

@@ -32,6 +32,7 @@ import ca.uhn.fhir.rest.api.MethodOutcome;
 import ca.uhn.fhir.rest.api.SortOrderEnum;
 import ca.uhn.fhir.rest.api.SortSpec;
 import ca.uhn.fhir.rest.api.server.IBundleProvider;
+import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
 import ca.uhn.fhir.rest.api.server.storage.IResourcePersistentId;
 import ca.uhn.fhir.rest.param.DateParam;
 import ca.uhn.fhir.rest.param.DateRangeParam;
@@ -49,6 +50,7 @@ import ca.uhn.fhir.rest.server.exceptions.ResourceGoneException;
 import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
 import ca.uhn.fhir.rest.server.exceptions.ResourceVersionConflictException;
 import ca.uhn.fhir.rest.server.exceptions.UnprocessableEntityException;
+import ca.uhn.fhir.util.BundleBuilder;
 import com.google.common.base.Charsets;
 import com.google.common.collect.Lists;
 import org.apache.commons.io.IOUtils;
@@ -62,6 +64,7 @@ import org.hibernate.search.mapper.orm.session.SearchSession;
 import org.hl7.fhir.instance.model.api.IAnyResource;
 import org.hl7.fhir.instance.model.api.IBaseResource;
 import org.hl7.fhir.instance.model.api.IIdType;
+import org.hl7.fhir.r4.model.Address;
 import org.hl7.fhir.r4.model.Age;
 import org.hl7.fhir.r4.model.Attachment;
 import org.hl7.fhir.r4.model.Bundle;
@@ -99,6 +102,7 @@ import org.hl7.fhir.r4.model.OperationOutcome.IssueType;
 import org.hl7.fhir.r4.model.Organization;
 import org.hl7.fhir.r4.model.Patient;
 import org.hl7.fhir.r4.model.Period;
+import org.hl7.fhir.r4.model.Practitioner;
 import org.hl7.fhir.r4.model.Provenance;
 import org.hl7.fhir.r4.model.Quantity;
 import org.hl7.fhir.r4.model.Quantity.QuantityComparator;
@@ -334,7 +338,6 @@ public class FhirResourceDaoR4Test extends BaseJpaR4Test {
 		assertEquals("#", observation.getSubject().getReference());
 	}

-
 	@Tag("intermittent")
 	// @Test
 	public void testTermConceptReindexingDoesntDuplicateData() {

@@ -41,6 +41,7 @@ import org.hl7.fhir.r4.model.Reference;
 import org.hl7.fhir.r4.model.SearchParameter;
 import org.hl7.fhir.r4.model.StructureDefinition;
 import org.hl7.fhir.utilities.npm.NpmPackage;
+import org.hl7.fhir.utilities.npm.PackageServer;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Disabled;
@@ -116,7 +117,7 @@ public class NpmR4Test extends BaseJpaR4Test {

 		int port = JettyUtil.getPortForStartedServer(myServer);
 		jpaPackageCache.getPackageServers().clear();
-		jpaPackageCache.addPackageServer("http://localhost:" + port);
+		jpaPackageCache.addPackageServer(new PackageServer("http://localhost:" + port));

 		myFakeNpmServlet.responses.clear();
 	}
@@ -138,7 +139,7 @@ public class NpmR4Test extends BaseJpaR4Test {
 	public void testInstallUsCore() {
 		JpaPackageCache jpaPackageCache = ProxyUtil.getSingletonTarget(myPackageCacheManager, JpaPackageCache.class);
 		jpaPackageCache.getPackageServers().clear();
-		jpaPackageCache.addPackageServer("https://packages.fhir.org");
+		jpaPackageCache.addPackageServer(new PackageServer("https://packages.fhir.org"));

 		PackageInstallationSpec spec = new PackageInstallationSpec()
 			.setName("hl7.fhir.us.core")

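The NpmR4Test edits above appear to track an upstream API change that arrived with the core version bump in this mergeback: addPackageServer() now takes a PackageServer object rather than a bare URL string. Before/after sketch:

// Before: package servers were registered as plain URL strings
jpaPackageCache.addPackageServer("https://packages.fhir.org");
// After: the URL is wrapped in an org.hl7.fhir.utilities.npm.PackageServer
jpaPackageCache.addPackageServer(new PackageServer("https://packages.fhir.org"));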
@@ -10,6 +10,7 @@ import org.eclipse.jetty.servlet.ServletHandler;
 import org.eclipse.jetty.servlet.ServletHolder;
 import org.hl7.fhir.instance.model.api.IBaseResource;
 import org.hl7.fhir.utilities.npm.NpmPackage;
+import org.hl7.fhir.utilities.npm.PackageServer;
 import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
@@ -55,7 +56,7 @@ public class PackageLoaderSvcIT {

 		int port = JettyUtil.getPortForStartedServer(myServer);
 		myPackageLoaderSvc.getPackageServers().clear();
-		myPackageLoaderSvc.addPackageServer("http://localhost:" + port);
+		myPackageLoaderSvc.addPackageServer(new PackageServer("http://localhost:" + port));

 		myFakeNpmServlet.getResponses().clear();
 	}

@@ -765,10 +765,12 @@ public class ResourceProviderR4Test extends BaseResourceProviderR4Test {
 		patient.getText().setDivAsString("<div xmlns=\"http://www.w3.org/1999/xhtml\">hello</div>");

 		patient = (Patient) myClient.create().resource(patient).execute().getResource();
+		ourLog.info("Patient: {}", myFhirContext.newJsonParser().encodeResourceToString(patient));
 		assertEquals(1, patient.getIdElement().getVersionIdPartAsLong());

 		// Read Patient
 		patient = (Patient) myClient.read().resource("Patient").withId(patient.getIdElement()).execute();
+		ourLog.info("Patient: {}", myFhirContext.newJsonParser().encodeResourceToString(patient));
 		assertEquals(1, patient.getIdElement().getVersionIdPartAsLong());

 		// Update Patient with no changes

@@ -284,6 +284,8 @@ public class GiantTransactionPerfTest {

 		mySystemDao.transaction(requestDetails, input);

+		ourLog.info("Merges:\n * " + myEntityManager.myMergeCount.stream().map(t->t.toString()).collect(Collectors.joining("\n * ")));
+
 		assertThat(myEntityManager.myPersistCount.stream().map(t -> t.getClass().getSimpleName()).collect(Collectors.toList()), Matchers.contains("ResourceTable"));
 		assertThat(myEntityManager.myMergeCount.stream().map(t -> t.getClass().getSimpleName()).collect(Collectors.toList()), Matchers.containsInAnyOrder("ResourceTable", "ResourceIndexedSearchParamToken", "ResourceIndexedSearchParamToken"));
 		assertEquals(1, myEntityManager.myFlushCount);

@@ -7,6 +7,7 @@ import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
 import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
 import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc;
 import ca.uhn.fhir.jpa.subscription.match.matcher.matching.SubscriptionMatchingStrategy;
+import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
 import ca.uhn.fhir.jpa.subscription.match.matcher.matching.SubscriptionStrategyEvaluator;
 import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionCanonicalizer;
 import ca.uhn.fhir.jpa.subscription.submit.interceptor.SubscriptionValidatingInterceptor;

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -123,11 +123,12 @@ public class Batch2JobHelper {
 		return myJobCoordinator.getInstance(theBatchJobId);
 	}

-	private boolean checkStatusWithMaintenancePass(String theBatchJobId, StatusEnum... theExpectedStatuses) {
+	private boolean checkStatusWithMaintenancePass(String theBatchJobId, StatusEnum... theExpectedStatuses) throws InterruptedException {
 		if (hasStatus(theBatchJobId, theExpectedStatuses)) {
 			return true;
 		}
 		myJobMaintenanceService.runMaintenancePass();
+		Thread.sleep(1000);
 		return hasStatus(theBatchJobId, theExpectedStatuses);
 	}

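The Thread.sleep(1000) above gives the maintenance pass time to advance the job before the status is re-checked; a fixed sleep is simple but costs a full second on every miss. If this helper still proves flaky, one hedged alternative would be deadline-based polling, for example with Awaitility (a sketch, assuming the library is on the test classpath):

// Sketch: poll for the expected status instead of sleeping a fixed interval.
myJobMaintenanceService.runMaintenancePass();
org.awaitility.Awaitility.await()
	.atMost(java.time.Duration.ofSeconds(10))
	.until(() -> hasStatus(theBatchJobId, theExpectedStatuses));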
@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -7,7 +7,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -7,7 +7,7 @@
 	<parent>
 		<artifactId>hapi-fhir-serviceloaders</artifactId>
 		<groupId>ca.uhn.hapi.fhir</groupId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -7,7 +7,7 @@
 	<parent>
 		<artifactId>hapi-fhir-serviceloaders</artifactId>
 		<groupId>ca.uhn.hapi.fhir</groupId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>
@@ -20,7 +20,7 @@
 	<dependency>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir-caching-api</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 	</dependency>
 	<dependency>
 		<groupId>com.github.ben-manes.caffeine</groupId>

@@ -7,7 +7,7 @@
 	<parent>
 		<artifactId>hapi-fhir-serviceloaders</artifactId>
 		<groupId>ca.uhn.hapi.fhir</groupId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -7,7 +7,7 @@
 	<parent>
 		<artifactId>hapi-fhir</artifactId>
 		<groupId>ca.uhn.hapi.fhir</groupId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../../pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<artifactId>hapi-deployable-pom</artifactId>
 		<groupId>ca.uhn.hapi.fhir</groupId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir-spring-boot-samples</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 	</parent>

 	<artifactId>hapi-fhir-spring-boot-sample-client-apache</artifactId>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir-spring-boot-samples</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 	</parent>

 	<artifactId>hapi-fhir-spring-boot-sample-client-okhttp</artifactId>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir-spring-boot-samples</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 	</parent>

 	<artifactId>hapi-fhir-spring-boot-sample-server-jersey</artifactId>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir-spring-boot</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 	</parent>

 	<artifactId>hapi-fhir-spring-boot-samples</artifactId>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-fhir</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>

@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>ca.uhn.hapi.fhir</groupId>
 		<artifactId>hapi-deployable-pom</artifactId>
-		<version>6.5.3-SNAPSHOT</version>
+		<version>6.5.4-SNAPSHOT</version>
 		<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 	</parent>
 	<modelVersion>4.0.0</modelVersion>

@@ -57,13 +57,13 @@ public class BulkExportCreateReportStep implements IReductionStepWorker<BulkExpo
 		results.setOriginalRequestUrl(requestUrl);

 		if (myResourceToBinaryIds != null) {
-			ourLog.info("Bulk Export Report creation step");
+			ourLog.info("Bulk Export Report creation step for instance: {}", theStepExecutionDetails.getInstance().getInstanceId());

 			results.setResourceTypeToBinaryIds(myResourceToBinaryIds);

 			myResourceToBinaryIds = null;
 		} else {
-			String msg = "Export complete, but no data to generate report.";
+			String msg = "Export complete, but no data to generate report for job instance: " + theStepExecutionDetails.getInstance().getInstanceId();
 			ourLog.warn(msg);

 			results.setReportMsg(msg);

@@ -28,14 +28,20 @@ import ca.uhn.fhir.batch2.api.StepExecutionDetails;
 import ca.uhn.fhir.batch2.jobs.export.models.BulkExportJobParameters;
 import ca.uhn.fhir.batch2.jobs.export.models.ExpandedResourcesList;
 import ca.uhn.fhir.batch2.jobs.export.models.ResourceIdList;
-import ca.uhn.fhir.batch2.jobs.models.BatchResourceId;
 import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
 import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
+import ca.uhn.fhir.jpa.api.model.PersistentIdToForcedIdMap;
 import ca.uhn.fhir.jpa.api.svc.IIdHelperService;
 import ca.uhn.fhir.jpa.bulk.export.api.IBulkExportProcessor;
+import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService;
 import ca.uhn.fhir.jpa.model.entity.StorageSettings;
+import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
 import ca.uhn.fhir.parser.IParser;
+import ca.uhn.fhir.rest.api.server.IBundleProvider;
+import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
+import ca.uhn.fhir.rest.api.server.storage.IResourcePersistentId;
+import ca.uhn.fhir.rest.param.TokenOrListParam;
 import ca.uhn.fhir.rest.server.interceptor.ResponseTerminologyTranslationSvc;
 import com.google.common.collect.ArrayListMultimap;
 import com.google.common.collect.ListMultimap;
@@ -47,7 +53,11 @@ import org.springframework.context.ApplicationContext;
 import javax.annotation.Nonnull;
 import java.util.ArrayList;
 import java.util.List;
+import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Collectors;

+import static ca.uhn.fhir.rest.api.Constants.PARAM_ID;
 import static org.slf4j.LoggerFactory.getLogger;

 public class ExpandResourcesStep implements IJobStepWorker<BulkExportJobParameters, ResourceIdList, ExpandedResourcesList> {

@@ -71,6 +81,9 @@ public class ExpandResourcesStep implements IJobStepWorker<BulkExportJobParamete
 	@Autowired
 	private IIdHelperService myIdHelperService;

+	@Autowired
+	private IHapiTransactionService myTransactionService;
+
 	private volatile ResponseTerminologyTranslationSvc myResponseTerminologyTranslationSvc;

 	@Nonnull
@@ -102,7 +115,7 @@ public class ExpandResourcesStep implements IJobStepWorker<BulkExportJobParamete
 			terminologyTranslationSvc.processResourcesForTerminologyTranslation(allResources);
 		}

-		// encode them
+		// encode them - Key is resource type, Value is a collection of serialized resources of that type
 		ListMultimap<String, String> resources = encodeToString(allResources, jobParameters);

 		// set to datasink

@@ -125,12 +138,46 @@ public class ExpandResourcesStep implements IJobStepWorker<BulkExportJobParamete
 	}

 	private List<IBaseResource> fetchAllResources(ResourceIdList theIds) {
-		List<IBaseResource> resources = new ArrayList<>();
-
-		for (BatchResourceId batchResourceId : theIds.getIds()) {
-			IFhirResourceDao<?> dao = myDaoRegistry.getResourceDao(batchResourceId.getResourceType());
-			// This should be a query, but we have PIDs, and we don't have a _pid search param. TODO GGG, figure out how to make this search by pid.
-			resources.add(dao.readByPid(myIdHelperService.newPidFromStringIdAndResourceName(batchResourceId.getId(), batchResourceId.getResourceType())));
-		}
+		ArrayListMultimap<String, String> typeToIds = ArrayListMultimap.create();
+		theIds.getIds().forEach(t -> typeToIds.put(t.getResourceType(), t.getId()));
+
+		List<IBaseResource> resources = new ArrayList<>(theIds.getIds().size());
+
+		for (String resourceType : typeToIds.keySet()) {
+
+			IFhirResourceDao<?> dao = myDaoRegistry.getResourceDao(resourceType);
+			List<String> allIds = typeToIds.get(resourceType);
+			while (!allIds.isEmpty()) {
+
+				// Load in batches in order to avoid having too many PIDs go into a
+				// single SQL statement at once
+				int batchSize = Math.min(500, allIds.size());
+
+				Set<IResourcePersistentId> nextBatchOfPids =
+					allIds
+						.subList(0, batchSize)
+						.stream()
+						.map(t -> myIdHelperService.newPidFromStringIdAndResourceName(t, resourceType))
+						.collect(Collectors.toSet());
+				allIds = allIds.subList(batchSize, allIds.size());
+
+				PersistentIdToForcedIdMap nextBatchOfResourceIds = myTransactionService
+					.withRequest(null)
+					.execute(() -> myIdHelperService.translatePidsToForcedIds(nextBatchOfPids));
+
+				TokenOrListParam idListParam = new TokenOrListParam();
+				for (IResourcePersistentId nextPid : nextBatchOfPids) {
+					Optional<String> resourceId = nextBatchOfResourceIds.get(nextPid);
+					idListParam.add(resourceId.orElse(nextPid.getId().toString()));
+				}
+
+				SearchParameterMap spMap = SearchParameterMap
+					.newSynchronous()
+					.add(PARAM_ID, idListParam);
+				IBundleProvider outcome = dao.search(spMap, new SystemRequestDetails());
+				resources.addAll(outcome.getAllResources());
+			}
+		}

 		return resources;

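This hunk is the heart of the bulk-export fix: instead of one readByPid() call (and one SELECT) per exported resource, fetchAllResources() now groups IDs by resource type and loads them with synchronous _id searches in batches of up to 500, so each SQL IN-list stays bounded. Reduced to its essentials, the chunking pattern looks like this (fetchPage is a hypothetical stand-in for the dao.search() call above):

// Sketch of the bounded-batch fetch pattern, generic over the page loader.
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

static <T> List<T> fetchInBatches(List<String> theAllIds, int theMaxBatch, Function<List<String>, List<T>> fetchPage) {
	List<T> out = new ArrayList<>();
	List<String> remaining = theAllIds;
	while (!remaining.isEmpty()) {
		int batchSize = Math.min(theMaxBatch, remaining.size()); // bound the SQL IN-list
		out.addAll(fetchPage.apply(remaining.subList(0, batchSize)));
		remaining = remaining.subList(batchSize, remaining.size()); // advance the window
	}
	return out;
}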
@@ -12,12 +12,17 @@ import ca.uhn.fhir.batch2.model.JobInstance;
 import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
 import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
+import ca.uhn.fhir.jpa.api.model.PersistentIdToForcedIdMap;
 import ca.uhn.fhir.jpa.api.svc.IIdHelperService;
 import ca.uhn.fhir.jpa.bulk.export.api.IBulkExportProcessor;
+import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService;
+import ca.uhn.fhir.jpa.dao.tx.NonTransactionalHapiTransactionService;
 import ca.uhn.fhir.jpa.model.dao.JpaPid;
 import ca.uhn.fhir.jpa.model.entity.StorageSettings;
 import ca.uhn.fhir.rest.api.server.bulk.BulkDataExportOptions;
 import ca.uhn.fhir.rest.api.server.storage.BaseResourcePersistentId;
+import ca.uhn.fhir.rest.api.server.storage.IResourcePersistentId;
+import ca.uhn.fhir.rest.server.SimpleBundleProvider;
 import ca.uhn.fhir.rest.server.interceptor.ResponseTerminologyTranslationSvc;
 import org.hl7.fhir.instance.model.api.IBaseResource;
 import org.hl7.fhir.r4.model.Patient;
@@ -32,6 +37,10 @@ import org.mockito.junit.jupiter.MockitoExtension;
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.Date;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;

 import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.junit.jupiter.api.Assertions.assertFalse;
@@ -62,6 +71,9 @@ public class ExpandResourcesStepTest {
 	@Spy
 	private StorageSettings myStorageSettings = new StorageSettings();

+	@Spy
+	private IHapiTransactionService myTransactionService = new NonTransactionalHapiTransactionService();
+
 	@InjectMocks
 	private ExpandResourcesStep mySecondStep;

@@ -122,9 +134,17 @@ public class ExpandResourcesStepTest {
 			createParameters(),
 			instance
 		);
-		ArrayList<IBaseResource> clone = new ArrayList<>(resources);
-		when(patientDao.readByPid(any(BaseResourcePersistentId.class))).thenAnswer(i -> clone.remove(0));
+		when(patientDao.search(any(), any())).thenReturn(new SimpleBundleProvider(resources));
 		when(myIdHelperService.newPidFromStringIdAndResourceName(anyString(), anyString())).thenReturn(JpaPid.fromId(1L));
+		when(myIdHelperService.translatePidsToForcedIds(any())).thenAnswer(t->{
+			Set<IResourcePersistentId<?>> inputSet = t.getArgument(0, Set.class);
+			Map<IResourcePersistentId<?>, Optional<String>> map = new HashMap<>();
+			for (var next : inputSet) {
+				map.put(next, Optional.empty());
+			}
+			return new PersistentIdToForcedIdMap<>(map);
+		});

 		// test
 		RunOutcome outcome = mySecondStep.run(input, sink);
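One easy-to-miss detail in the unit-test changes above: because the step now routes the PID-to-forced-ID translation through myTransactionService, the Mockito test supplies a NonTransactionalHapiTransactionService as a @Spy so the withRequest(null).execute(...) chain runs inline without a real transaction manager, while translatePidsToForcedIds() is stubbed to map every PID to Optional.empty(). The same no-op-service idiom, in miniature:

// Sketch: let transaction-wrapped code run inline in a plain Mockito test.
@Spy
private IHapiTransactionService myTransactionService = new NonTransactionalHapiTransactionService();
// Production code calling myTransactionService.withRequest(null).execute(callback)
// now simply invokes the callback on the calling thread.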
(Some files were not shown because too many files have changed in this diff.)