From 568c6d20db9cf738110356f2e7d045e2a338ac6c Mon Sep 17 00:00:00 2001
From: volodymyr-korzh <132366313+volodymyr-korzh@users.noreply.github.com>
Date: Wed, 14 Aug 2024 15:03:35 -0600
Subject: [PATCH 1/3] rel_7_4 Merge-back (#6212)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

* use SearchParameter validator in package installer (#6112)
* Ensure ' ' is treated as '+' in timezones with offsets. (#6115)
* Use lockless mode when adding index on Azure Sql server (#6100)
* Use lockless mode when adding index on Azure Sql server
  Use try-catch for Online add-index on Sql Server.
  This avoids having to map out the entire matrix of Sql Server product names and ONLINE index support.
  Warnings in docs, and cleanups
* make consent service not call willSeeResource on children if parent resource is AUTHORIZED or REJECT (#6127)
* fix hfj search migration task (#6143)
* fix migration task
* changelog
* changelog
* code review
* spotless
---------
Co-authored-by: jdar
* Enhance migration for MSSQL to change the collation for HFJ_RESOURCE.FHIR_ID to case sensitive (#6135)
* MSSQL: Migrate HFJ_RESOURCE.FHIR_ID to new collation: SQL_Latin1_General_CP1_CS_AS
* Spotless.
* Enhance test. Fix case in ResourceSearchView to defend against future migration to case insensitive collation.
* Remove TODOs. Add comment to ResourceSearchView explaining why all columns are uppercase. Changelog.
* Update hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6146-mssql-hfj-resource-fhir-id-colllation.yaml
  Code reviewer suggestion
  Co-authored-by: Michael Buckley
* Code review fixes: Make changes conditional on the collation including _CI_, otherwise, leave it alone.
---------
Co-authored-by: Michael Buckley
* Common API for FHIR Data Access (#6141)
* Add initial interface for common FHIR API
* Fix formatting
* Update javadocs
* Address code review comments
* Add path value to _id search parameter and other missing search param… (#6128)
* Add path value to _id search parameter and other missing search parameters to IAnyResource.
* Adjust tests and remove now unnecessary addition of meta parameters which are now provided by IAnyResource
* Revert unneeded change
* _security param is not token but uri
* Add tests for new defined resource-level standard parameters
* Adjust test
---------
Co-authored-by: juan.marchionatto
* update to online (#6157)
* SEARCH_UUID should be non-null (#6165)
  Avoid using constants in migrations because it creates false history.
* Handle 400 and 404 codes returned by remote terminology operation. (#6151)
* Handle 400 and 404 codes returned by remote terminology operation.
* Some simplification
* Adjust changelog
* Add a comment to explain alternate solution which can be reused.
* fix concepts with no display element for $apply-codesystem-delta-add and $apply-codesystem-delta-remove (#6164)
* allow transaction with update conditional urls (#6155)
* Revert "Add path value to _id search parameter and other missing search param…" (#6171)
  This reverts commit 2275eba1a0f379b6ccb2d1b467fa22bd9febcf54.
* 7 2 2 mb (#6160)
* Enhance RuleBuilder code to support multiple instances (#5852)
* Overhaul bulk export permissions.
* Overhaul bulk export permissions.
* Small tweak to rule builder.
* Cleanup validation.
* Cleanup validation.
* Code review feedback.
* Postgres terminology service hard coded column names migration (#5866)
* updating parent pids column name
* updating name of the fullTextField Search
* updating name of the fullTextField Search
* fixing typo.
* failing test.
* - Moving FullTextField annotation from getter method and adding it to the newly added VC property of the entity;
  - reverting the name of the FullTextField entity to its previous name of 'myParentPids';
  - reverting the name of the lucene index to search on in the terminology service.
  - updating the changelog;
* making spotless happy
---------
Co-authored-by: peartree
* 5879 back porting fix for issue 5877 (attempting to update a tokenparam with a value greater than 200 characters raises an sqlexception) to release rel_7_2 (#5881)
* initial failing test.
* solution
* adding changelog
* spotless
* moving changelog from 7_4_0 to 7_2_0 and deleting 7_4_0 folder.
---------
Co-authored-by: peartree
* Expose BaseRequestPartitionHelperSvc validateAndNormalize methods (#5811)
* Expose BaseRequestPartitionHelperSvc validate and normalize methods
* Compilation errors
* change mock test to jpa test
* change mock test to jpa test
* validateAndNormalizePartitionIds
* validateAndNormalizePartitionNames
* validateAndNormalizePartitionIds validation + bug fix
* validateAndNormalizePartitionNames validation
* fix test
* version bump
* Ensure a non-numeric FHIR ID doesn't result in a NumberFormatException when processing survivorship rules (#5883)
* Add failing test as well as commented out potential solution.
* Fix for NumberFormatException.
* Add conditional test for survivorship rules.
* Spotless.
* Add changelog.
* Code review feedback.
* updating documentation (#5889)
* Ensure temp file ends with "." and then suffix. (#5894)
* bugfix to https://github.com/hapifhir/hapi-fhir-jpaserver-starter/issues/675 (#5892)
  Co-authored-by: Jens Kristian Villadsen
* Enhance mdm interceptor (#5899)
* Add MDM Transaction Context for further downstream processing giving interceptors a better chance of figuring out what happened.
* Added javadoc
* Changelog
* spotless
---------
Co-authored-by: Jens Kristian Villadsen
* Fix BaseHapiFhirResourceDao $meta method to use HapiTransactionService instead of @Transaction (#5896)
* Try making ResourceTable.myTags EAGER instead of LAZY and see if it breaks anything.
* Try making ResourceTable.myTags EAGER instead of LAZY and see if it breaks anything.
* Ensure BaseHapiFhirResourceDao#metaGetOperation uses HapiTransactionService instead of @Transactional in order to resolve megascale $meta bug.
* Add changelog.
* Update hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_0/5898-ld-megascale-meta-operation-fails-hapi-0389.yaml
  Commit code reviewer suggestion.
  Co-authored-by: Tadgh
---------
Co-authored-by: Tadgh
* Fix query chained on sort bug where we over-filter results (#5903)
* Failing test.
* Ensure test cleanup doesn't fail by deleting Patients before Practitioners.
* Implement fix.
* Spotless.
* Clean up unit test and add changelog. Fix unit test.
* Fix changelog file.
* Apply suggestions from code review
  Apply code review suggestions.
  Co-authored-by: Michael Buckley
* Spotless
---------
Co-authored-by: Michael Buckley
* cve fix (#5906)
  Co-authored-by: Long Ma
* Fixing issues with postgres LOB migration. (#5895)
* Fixing issues with postgres LOB migration.
* addressing code review comments for audit/transaction logs.
* test and implementation for BinaryStorageEntity migration post code review.
* test and implementation for BinaryStorageEntity migration post code review.
* test and implementation for TermConcept migration post code review.
* applying spotless
* test and implementation for TermConceptProperty migration post code review.
* test and implementation for TermValueSetConcept migration post code review.
* fixing migration version
* fixing migration task
* changelog
* fixing changelog
* Minor renames
* addressing comments and suggestions from second code review.
* passing tests
* fixing more tests
---------
Co-authored-by: peartree
Co-authored-by: Tadgh
* 6051 bulk export security errors (#5915)
* Enhance RuleBuilder code to support multiple instances (#5852)
* Overhaul bulk export permissions.
* Overhaul bulk export permissions.
* Small tweak to rule builder.
* Cleanup validation.
* Cleanup validation.
* Code review feedback.
* Postgres terminology service hard coded column names migration (#5866)
* updating parent pids column name
* updating name of the fullTextField Search
* updating name of the fullTextField Search
* fixing typo.
* failing test.
* - Moving FullTextField annotation from getter method and adding it to the newly added VC property of the entity;
  - reverting the name of the FullTextField entity to its previous name of 'myParentPids';
  - reverting the name of the lucene index to search on in the terminology service.
  - updating the changelog;
* making spotless happy
---------
Co-authored-by: peartree
* 5879 back porting fix for issue 5877 (attempting to update a tokenparam with a value greater than 200 characters raises an sqlexception) to release rel_7_2 (#5881)
* initial failing test.
* solution
* adding changelog
* spotless
* moving changelog from 7_4_0 to 7_2_0 and deleting 7_4_0 folder.
---------
Co-authored-by: peartree
* Expose BaseRequestPartitionHelperSvc validateAndNormalize methods (#5811)
* Expose BaseRequestPartitionHelperSvc validate and normalize methods
* Compilation errors
* change mock test to jpa test
* change mock test to jpa test
* validateAndNormalizePartitionIds
* validateAndNormalizePartitionNames
* validateAndNormalizePartitionIds validation + bug fix
* validateAndNormalizePartitionNames validation
* fix test
* version bump
* Ensure a non-numeric FHIR ID doesn't result in a NumberFormatException when processing survivorship rules (#5883)
* Add failing test as well as commented out potential solution.
* Fix for NumberFormatException.
* Add conditional test for survivorship rules.
* Spotless.
* Add changelog.
* Code review feedback.
* updating documentation (#5889)
* Ensure temp file ends with "." and then suffix. (#5894)
* bugfix to https://github.com/hapifhir/hapi-fhir-jpaserver-starter/issues/675 (#5892)
  Co-authored-by: Jens Kristian Villadsen
* Enhance mdm interceptor (#5899)
* Add MDM Transaction Context for further downstream processing giving interceptors a better chance of figuring out what happened.
* Added javadoc
* Changelog
* spotless
---------
Co-authored-by: Jens Kristian Villadsen
* Fix BaseHapiFhirResourceDao $meta method to use HapiTransactionService instead of @Transaction (#5896)
* Try making ResourceTable.myTags EAGER instead of LAZY and see if it breaks anything.
* Try making ResourceTable.myTags EAGER instead of LAZY and see if it breaks anything.
* Ensure BaseHapiFhirResourceDao#metaGetOperation uses HapiTransactionService instead of @Transactional in order to resolve megascale $meta bug.
* Add changelog.
* Update hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_0/5898-ld-megascale-meta-operation-fails-hapi-0389.yaml
  Commit code reviewer suggestion.
  Co-authored-by: Tadgh
---------
Co-authored-by: Tadgh
* Fix query chained on sort bug where we over-filter results (#5903)
* Failing test.
* Ensure test cleanup doesn't fail by deleting Patients before Practitioners.
* Implement fix.
* Spotless.
* Clean up unit test and add changelog. Fix unit test.
* Fix changelog file.
* Apply suggestions from code review
  Apply code review suggestions.
  Co-authored-by: Michael Buckley
* Spotless
---------
Co-authored-by: Michael Buckley
* cve fix (#5906)
  Co-authored-by: Long Ma
* Fixing issues with postgres LOB migration. (#5895)
* Fixing issues with postgres LOB migration.
* addressing code review comments for audit/transaction logs.
* test and implementation for BinaryStorageEntity migration post code review.
* test and implementation for BinaryStorageEntity migration post code review.
* test and implementation for TermConcept migration post code review.
* applying spotless
* test and implementation for TermConceptProperty migration post code review.
* test and implementation for TermValueSetConcept migration post code review.
* fixing migration version
* fixing migration task
* changelog
* fixing changelog
* Minor renames
* addressing comments and suggestions from second code review.
* passing tests
* fixing more tests
---------
Co-authored-by: peartree
Co-authored-by: Tadgh
* refactor bulk export rule, add concept of appliestoallpatients, fix tests
* spotless
* Changelog, tests
* more tests
* refactor style checks
---------
Co-authored-by: Luke deGruchy
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: peartree
Co-authored-by: Nathan Doef
Co-authored-by: TipzCM
Co-authored-by: dotasek
Co-authored-by: Jens Kristian Villadsen
Co-authored-by: Michael Buckley
Co-authored-by: longma1 <32119004+longma1@users.noreply.github.com>
Co-authored-by: Long Ma
* Convert a few nulls to aggressive denies
* Change chain sort syntax for MS SQL (#5917)
* Change sort type on chains
* Change sort type on chains
* Test for MS SQL
* Comments
* Version bump
* Updating version to: 7.2.1 post release.
* Fix queries with chained sort with Lucene by checking supported SortSpecs (#5958)
* First commit with very rough solution.
* Solidify solutions for both requirements. Add new tests. Enhance others.
* Spotless.
* Add new chained sort spec algorithm. Add new Msg.codes. Finalize tests. Update docs. Add changelog.
* pom remove the snapshot
* Updating version to: 7.2.2 post release.
* cherry-picked pr 6051
* changelog fix
* cherry-picked 6027
* docs and changelog
* merge fix for issue with infinite cache refresh loop
* Use lockless mode when adding index on Azure Sql server (#6100) (#6129)
* Use lockless mode when adding index on Azure Sql server
  Use try-catch for Online add-index on Sql Server.
  This avoids having to map out the entire matrix of Sql Server product names and ONLINE index support.
  Warnings in docs, and cleanups
* added fix for 6133
* failing Test
* Add fix
* spotless
* Remove useless file
* Fix cleaner
* cleanup
* Remove dead class
* Changelog
* test description
* Add test. Fix broken logic.
* fix quantity search parameter test to pass
* reverted test testDirectPathWholeResourceNotIndexedWorks in FhirResourceDaoR4SearchWithElasticSearchIT
* spotless
* cleanup mistake during merge
* added missing imports
* fix more mergeback oopsies
* bump to 7.3.13-snapshot
---------
Co-authored-by: Luke deGruchy
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: peartree
Co-authored-by: Nathan Doef
Co-authored-by: TipzCM
Co-authored-by: dotasek
Co-authored-by: Jens Kristian Villadsen
Co-authored-by: Tadgh
Co-authored-by: Michael Buckley
Co-authored-by: Long Ma
Co-authored-by: markiantorno
* Patient validate operation with remote terminology service enabled returns 400 bad request (#6124)
* Patient $validate operation with Remote Terminology Service enabled returns 400 Bad Request - failing test
* Patient $validate operation with Remote Terminology Service enabled returns 400 Bad Request - implementation
* - Changing method accessibility from default to public to allow method overwriting. (#6172)
  Co-authored-by: peartree
* applying Taha Attari's fix on branch merging to rel_7_4 (#6177)
  Co-authored-by: peartree
* Automated Migration Testing (HAPI-FHIR) V7_4_0 (#6170)
* Automated Migration Testing (HAPI-FHIR) - updated test migration scripts for 7_4_0
* Automated Migration Testing (HAPI-FHIR) - updated test migration scripts for 7_2_0
* To provide the target resource partitionId and partitionDate in the resourceLink (#6149)
* initial POC.
* addressing comments from first code review
* Adding tests
* adding changelog and spotless
* fixing tests
* spotless
---------
Co-authored-by: peartree
* applying patch (#6190)
  Co-authored-by: peartree
* cve for 08 release (#6197)
  Co-authored-by: Long Ma
* Search param path missing for _id param (#6175)
* Add path to _id search param and definitions for _lastUpdated, _tag, _profile and _security
* Add tests and changelog
* Increase snapshot version
* Irrelevant change to force new build
---------
Co-authored-by: juan.marchionatto
* Reverting to core fhir-test-cases 1.1.14; (#6194)
  re-enabling FhirPatchCoreTest
  Co-authored-by: peartree
* Fix $reindex job with custom partition interceptor based on resource type. Update reindex job to always run with urls. (#6185)
* Refactor logic to bring together partition related logic in batch2 jobs using IJobPartitionProvider. Update logic such that reindex job without urls will attempt to create urls for all supported resource types.
* Small changes and fix of pipeline error.
* Small change to enable mdm-submit to use PartitionedUrl in the job parameters
* Revert logback change. Fix dependency version generating errors in the pipeline.
* Spotless fix. Add test dependency back without version.
* Upgrade test dependency to another version
* Add javadoc for PartitionedUrl. Other small fixes and refactoring in tests.
* Spotless fix.
* Change to JobParameters to fix some of the tests.
* Small changes for code review in test
* Address code review comments.
* Revert change from bad merge.
* Address remaining code review comments
* 6188 subscription not marked as a cross partition subscription matches operation on resources in other partitions (#6191)
* initial failing test
* WIP
* fixing/adding tests
* added changelog
* spotless
* fixing tests
* Cleaning up tests
* addressing comments from first code review.
* no-op to get pipelines going
---------
Co-authored-by: peartree
* Resolve 6173 - Log unhandled Exceptions in RestfulServer (#6176) (#6205)
* 6173 - Log unhandled Exceptions in RestfulServer.
* Use placeholder for failed streams.
* Starting test for server handling.
* Got test working.
* Fixed use of synchronized keyword.
* Applied mvn spotless.
---------
Co-authored-by: Kevin Dougan <72025369+KevinDougan@users.noreply.github.com>
Co-authored-by: Michael Buckley
* Partition aware transactions (#6167)
* Partition aware transactions
* Address review comments
* Test fixes
* Remove dead issue field
* Test fixes
---------
Co-authored-by: Tadgh
* Add license header
* rel_7_4 mergeback
---------
Co-authored-by: Emre Dincturk <74370953+mrdnctrk@users.noreply.github.com>
Co-authored-by: Luke deGruchy
Co-authored-by: Michael Buckley
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: jdar
Co-authored-by: Luke deGruchy
Co-authored-by: JP
Co-authored-by: jmarchionatto <60409882+jmarchionatto@users.noreply.github.com>
Co-authored-by: juan.marchionatto
Co-authored-by: TipzCM
Co-authored-by: Martha Mitran
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: longma1 <32119004+longma1@users.noreply.github.com>
Co-authored-by: peartree
Co-authored-by: Nathan Doef
Co-authored-by: dotasek
Co-authored-by: Jens Kristian Villadsen
Co-authored-by: Tadgh
Co-authored-by: Long Ma
Co-authored-by: markiantorno
Co-authored-by: Martha Mitran
Co-authored-by: Kevin Dougan <72025369+KevinDougan@users.noreply.github.com>
Co-authored-by: James Agnew
---
 NOTICE.txt | 1 +
 .../context/support/IValidationSupport.java | 3 +-
 .../interceptor/model/RequestPartitionId.java | 31 +++
 .../ca/uhn/fhir/repository/Repository.java | 19 ++
 .../java/ca/uhn/fhir/util/FhirTerser.java | 36 ++-
 .../ca/uhn/fhir/util/SubscriptionUtil.java | 2 +-
 .../java/ca/uhn/fhir/util/VersionEnum.java | 2 +
 .../fhir/instance/model/api/IAnyResource.java | 92 ++++++-
.../ca/uhn/fhir/i18n/hapi-messages.properties | 4 + .../model/RequestPartitionIdTest.java | 44 ++++ .../uhn/hapi/fhir/changelog/7_2_1/upgrade.md | 0 .../hapi/fhir/changelog/7_2_1/version.yaml | 3 + .../uhn/hapi/fhir/changelog/7_2_2/upgrade.md | 0 .../hapi/fhir/changelog/7_2_2/version.yaml | 3 + ...fy-parse-copy-contained-duplicate-bug.yaml | 5 + ...ucene-chain-sort-silently-not-working.yaml | 6 + ...l-index-search-with-filter-parameters.yaml | 1 + ...ssue-with-cache-refresh-infinite-loop.yaml | 1 + ...-failed-text-and-content-search-in-r5.yaml | 1 + .../changelog/7_4_0/6046-text-contains.yaml | 1 + ...-service-enabled-returned-bad-request.yaml | 5 + ...ch-parameters-missing-in-ianyresource.yaml | 6 + ...g-search-not-rely-on-hibernate-search.yaml | 6 + .../7_4_0/6134-filesystem-regression.yaml | 6 + ...source-partition-info-in-resourcelink.yaml | 5 + ...ueset-with-remote-terminology-systems.yaml | 6 + ...sactional-update-with-conditional-url.yaml | 11 + ...dx-idxcmbtqknu-hashc-to-be-concurrent.yaml | 6 + ...r-remove-fail-without-concept-display.yaml | 8 + ...e-transactions-better-partition-aware.yaml | 7 + ...index-job-no-longer-runs-without-urls.yaml | 7 + ...e-type-based-partitioning-interceptor.yaml | 6 + ...ss-partition-subs-match-all-resources.yaml | 6 + .../{reindex => batch2}/Batch2DaoSvcImpl.java | 12 +- .../uhn/fhir/jpa/batch2/JpaBatch2Config.java | 9 - .../jpa/batch2/JpaJobPartitionProvider.java | 36 +-- .../fhir/jpa/config/Batch2SupportConfig.java | 2 +- .../fhir/jpa/dao/BaseHapiFhirResourceDao.java | 35 +-- .../fhir/jpa/dao/TransactionProcessor.java | 38 +-- .../fhir/jpa/dao/data/IResourceTableDao.java | 9 +- .../data/custom/IResourceTableDaoImpl.java | 29 ++- .../fhir/jpa/dao/index/IdHelperService.java | 32 ++- .../search/ExtendedHSearchSearchBuilder.java | 20 +- .../tasks/HapiFhirJpaMigrationTasks.java | 25 +- .../jpa/model/cross/JpaResourceLookup.java | 15 +- .../provider/TerminologyUploaderProvider.java | 3 + 
.../batch2/JpaJobPartitionProviderTest.java | 76 ------ ...esourceDaoR4SearchWithElasticSearchIT.java | 26 ++ .../jpa/mdm/svc/MdmControllerSvcImpl.java | 7 +- .../ca/uhn/fhir/jpa/model/dao/JpaPid.java | 11 + .../jpa/model/entity/BasePartitionable.java | 11 +- .../entity/PartitionablePartitionId.java | 6 + .../fhir/jpa/model/entity/ResourceLink.java | 138 +++++++++-- .../model/ReadPartitionIdRequestDetails.java | 18 ++ .../fhir/jpa/searchparam/MatchUrlService.java | 57 ++--- .../SearchParamExtractorService.java | 106 ++++----- .../ResourceIndexedSearchParamsTest.java | 23 +- .../FhirContextSearchParamRegistryTest.java | 38 +++ .../SubscriptionMatchingSubscriber.java | 6 +- .../SubscriptionValidatingInterceptor.java | 2 +- .../SubscriptionTriggeringSvcImpl.java | 2 +- .../subscription/util/SubscriptionUtil.java | 2 +- .../jpa/topic/SubscriptionTopicConfig.java | 12 +- .../SubscriptionCanonicalizerTest.java | 192 +++++++++++---- .../module/CanonicalSubscriptionTest.java | 77 +++--- .../SubscriptionMatchingSubscriberTest.java | 114 +-------- ...ceDaoDstu3SearchCustomSearchParamTest.java | 31 +-- .../dao/dstu3/FhirResourceDaoDstu3Test.java | 105 +++++++- .../jpa/searchparam/MatchUrlServiceTest.java | 16 +- .../fhir/jpa/batch2/Batch2DaoSvcImplTest.java | 225 ++++++++++++++++++ .../jpa/batch2/Batch2JobMaintenanceIT.java | 3 +- .../jpa/batch2/JpaJobPersistenceImplTest.java | 2 +- .../FilesystemBinaryStorageSvcImplTest.java | 28 +++ .../jpa/dao/BaseHapiFhirResourceDaoTest.java | 35 +-- .../jpa/dao/index/IdHelperServiceTest.java | 16 +- .../jpa/dao/r4/BasePartitioningR4Test.java | 14 +- .../r4/FhirResourceDaoR4QueryCountTest.java | 18 +- .../jpa/dao/r4/FhirResourceDaoR4Test.java | 166 ++++++++++++- .../PartitionedStrictTransactionR4Test.java | 211 ++++++++++++++++ ...itioningNonNullDefaultPartitionR4Test.java | 6 +- .../jpa/dao/r4/PartitioningSqlR4Test.java | 37 ++- .../uhn/fhir/jpa/dao/tx/ReindexStepTest.java | 3 +- ...esourceTypePartitionInterceptorR4Test.java | 88 +++++++ 
...rtitionedSubscriptionTriggeringR4Test.java | 90 +++++-- .../r4/MultitenantBatchOperationR4Test.java | 49 ++-- .../r4/TerminologyUploaderProviderR4Test.java | 189 ++++++++++++--- .../jpa/reindex/Batch2DaoSvcImplTest.java | 127 ---------- .../job => reindex}/ReindexJobTest.java | 7 +- .../ReindexJobWithPartitioningTest.java | 102 +++++--- .../job => reindex}/ReindexTestHelper.java | 2 +- .../stresstest/GiantTransactionPerfTest.java | 1 + .../jpa/term/TerminologySvcDeltaR4Test.java | 34 +-- .../r5/FhirSystemDaoTransactionR5Test.java | 4 +- .../provider/r5/ResourceProviderR5Test.java | 3 +- ...anceReindexServiceImplNarrativeR5Test.java | 26 +- .../InstanceReindexServiceImplR5Test.java | 2 +- .../HapiEmbeddedDatabasesExtension.java | 13 +- .../QuantitySearchParameterTestCases.java | 6 +- .../ca/uhn/fhir/jpa/test/BaseJpaR4Test.java | 2 +- .../ca/uhn/fhir/jpa/test/BaseJpaTest.java | 2 + .../ca/uhn/fhir/jpa/test/Batch2JobHelper.java | 2 +- .../releases/V7_2_0/data/H2_EMBEDDED.sql | 92 ++++++- .../releases/V7_2_0/data/MSSQL_2012.sql | 92 ++++++- .../releases/V7_2_0/data/ORACLE_12C.sql | 92 ++++++- .../releases/V7_2_0/data/POSTGRES_9_4.sql | 96 +++++++- .../releases/V7_4_0/data/H2_EMBEDDED.sql | 53 +++++ .../releases/V7_4_0/data/MSSQL_2012.sql | 53 +++++ .../releases/V7_4_0/data/ORACLE_12C.sql | 53 +++++ .../releases/V7_4_0/data/POSTGRES_9_4.sql | 53 +++++ .../jpa/embedded/HapiSchemaMigrationTest.java | 6 +- .../expunge/DeleteExpungeJobParameters.java | 4 +- .../DeleteExpungeJobParametersValidator.java | 5 +- .../DeleteExpungeJobSubmitterImpl.java | 22 +- .../batch2/jobs/reindex/ReindexAppCtx.java | 6 +- .../jobs/reindex/ReindexJobParameters.java | 4 +- .../ReindexJobParametersValidator.java | 15 +- .../batch2/jobs/reindex/ReindexProvider.java | 17 +- .../jobs/reindex/ReindexProviderTest.java | 101 ++++---- .../test/IJobPartitionProviderTest.java | 114 +++++++++ .../batch2/api/IJobPartitionProvider.java | 20 +- .../fhir/batch2/config/BaseBatch2Config.java | 11 +- 
.../DefaultJobPartitionProvider.java | 107 +++++++++ .../SimpleJobPartitionProvider.java | 45 ---- .../jobs/parameters/IUrlListValidator.java | 3 - .../jobs/parameters/PartitionedUrl.java | 52 +++- ....java => PartitionedUrlJobParameters.java} | 57 ++--- .../jobs/parameters/UrlListValidator.java | 17 +- .../jobs/step/GenerateRangeChunksStep.java | 64 +---- .../fhir/batch2/jobs/step/LoadIdsStep.java | 4 +- .../batch2/jobs/step/ResourceIdListStep.java | 22 +- .../SimpleJobPartitionProviderTest.java | 43 ---- .../step/GenerateRangeChunksStepTest.java | 110 +++------ .../batch2/jobs/step/LoadIdsStepTest.java | 8 +- .../jobs/step/ResourceIdListStepTest.java | 8 +- .../batch2/clear/MdmClearJobParameters.java | 4 +- .../batch2/submit/MdmSubmitJobParameters.java | 4 +- .../fhir/jpa/binary/api/StoredDetails.java | 23 +- .../jpa/dao/BaseTransactionProcessor.java | 100 +++++++- .../jpa/dao/tx/HapiTransactionService.java | 20 +- .../jpa/dao/tx/IHapiTransactionService.java | 14 ++ .../registry/SubscriptionCanonicalizer.java | 45 ++-- .../model/CanonicalSubscription.java | 2 +- .../ca/uhn/fhir/util/FhirTerserR4Test.java | 23 ++ ...teTerminologyServiceValidationSupport.java | 182 +++++++------- .../support/ValidationSupportUtils.java | 133 +++++++++++ .../VersionSpecificWorkerContextWrapper.java | 41 ++-- .../hapi/validation/ILookupCodeTest.java | 17 +- .../IRemoteTerminologyLookupCodeTest.java | 57 +++++ .../support/ValidationSupportUtilsTest.java | 185 ++++++++++++++ ...rsionSpecificWorkerContextWrapperTest.java | 28 +++ .../FhirInstanceValidatorDstu3Test.java | 3 +- .../RemoteTerminologyLookupCodeDstu3Test.java | 13 +- .../FhirInstanceValidatorR4Test.java | 60 ++++- .../RemoteTerminologyLookupCodeR4Test.java | 13 +- ...inologyServiceValidationSupportR4Test.java | 111 +++++++-- .../FhirInstanceValidatorR4BTest.java | 25 +- .../FhirInstanceValidatorR5Test.java | 3 +- pom.xml | 2 +- 158 files changed, 4241 insertions(+), 1510 deletions(-) create mode 100644 
hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_1/upgrade.md create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_1/version.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_2/upgrade.md create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_2/version.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/4837-fix-stringify-parse-copy-contained-duplicate-bug.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/5960-lucene-chain-sort-silently-not-working.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6122-validate-operation-with-remote-terminology-service-enabled-returned-bad-request.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6123-search-parameters-missing-in-ianyresource.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6133-make-everything-search-not-rely-on-hibernate-search.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6134-filesystem-regression.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6148-provide-target-resource-partition-info-in-resourcelink.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6150-validate-returns-404-valueset-with-remote-terminology-systems.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6153-allow-transactional-update-with-conditional-url.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6156-updated-migration-on-idx-idxcmbtqknu-hashc-to-be-concurrent.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6159-code-system-delta-add-or-remove-fail-without-concept-display.yaml 
create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6163-make-transactions-better-partition-aware.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6179-reindex-job-no-longer-runs-without-urls.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6179-reindex-with-custom-resource-type-based-partitioning-interceptor.yaml create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6188-non-cross-partition-subs-match-all-resources.yaml rename hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/{reindex => batch2}/Batch2DaoSvcImpl.java (94%) delete mode 100644 hapi-fhir-jpaserver-base/src/test/java/ca/uhn/fhir/jpa/batch2/JpaJobPartitionProviderTest.java create mode 100644 hapi-fhir-jpaserver-searchparam/src/test/java/ca/uhn/fhir/jpa/searchparam/registry/FhirContextSearchParamRegistryTest.java create mode 100644 hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2DaoSvcImplTest.java create mode 100644 hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitionedStrictTransactionR4Test.java create mode 100644 hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/interceptor/ResourceTypePartitionInterceptorR4Test.java delete mode 100644 hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/Batch2DaoSvcImplTest.java rename hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/{delete/job => reindex}/ReindexJobTest.java (99%) rename hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/{delete/job => reindex}/ReindexJobWithPartitioningTest.java (52%) rename hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/{delete/job => reindex}/ReindexTestHelper.java (99%) create mode 100644 hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/H2_EMBEDDED.sql create mode 100644 
hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/MSSQL_2012.sql create mode 100644 hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/ORACLE_12C.sql create mode 100644 hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/POSTGRES_9_4.sql create mode 100644 hapi-fhir-storage-batch2-test-utilities/src/main/java/ca/uhn/hapi/fhir/batch2/test/IJobPartitionProviderTest.java create mode 100644 hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/coordinator/DefaultJobPartitionProvider.java delete mode 100644 hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/coordinator/SimpleJobPartitionProvider.java rename hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/{JobParameters.java => PartitionedUrlJobParameters.java} (57%) delete mode 100644 hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/coordinator/SimpleJobPartitionProviderTest.java create mode 100644 hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/support/ValidationSupportUtils.java create mode 100644 hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/IRemoteTerminologyLookupCodeTest.java create mode 100644 hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/support/ValidationSupportUtilsTest.java diff --git a/NOTICE.txt b/NOTICE.txt index 2fbff9e7e6f..35f38a015ed 100644 --- a/NOTICE.txt +++ b/NOTICE.txt @@ -10,3 +10,4 @@ distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. 
+ diff --git a/hapi-fhir-base/src/main/java/ca/uhn/fhir/context/support/IValidationSupport.java b/hapi-fhir-base/src/main/java/ca/uhn/fhir/context/support/IValidationSupport.java index 3624d4146b8..d0990842fa2 100644 --- a/hapi-fhir-base/src/main/java/ca/uhn/fhir/context/support/IValidationSupport.java +++ b/hapi-fhir-base/src/main/java/ca/uhn/fhir/context/support/IValidationSupport.java @@ -1071,8 +1071,9 @@ public interface IValidationSupport { } } - public void setErrorMessage(String theErrorMessage) { + public LookupCodeResult setErrorMessage(String theErrorMessage) { myErrorMessage = theErrorMessage; + return this; } public String getErrorMessage() { diff --git a/hapi-fhir-base/src/main/java/ca/uhn/fhir/interceptor/model/RequestPartitionId.java b/hapi-fhir-base/src/main/java/ca/uhn/fhir/interceptor/model/RequestPartitionId.java index e4f0b5f1201..cfb538997ac 100644 --- a/hapi-fhir-base/src/main/java/ca/uhn/fhir/interceptor/model/RequestPartitionId.java +++ b/hapi-fhir-base/src/main/java/ca/uhn/fhir/interceptor/model/RequestPartitionId.java @@ -40,6 +40,7 @@ import java.util.Collections; import java.util.List; import java.util.Objects; import java.util.stream.Collectors; +import java.util.stream.Stream; import static org.apache.commons.lang3.ObjectUtils.defaultIfNull; @@ -98,6 +99,28 @@ public class RequestPartitionId implements IModelJson { myAllPartitions = true; } + /** + * Creates a new RequestPartitionId which includes all partition IDs from + * this {@link RequestPartitionId} but also includes all IDs from the given + * {@link RequestPartitionId}. Any duplicates are only included once, and + * partition names and dates are ignored and not returned. This {@link RequestPartitionId} + * and {@literal theOther} are not modified. 
+ * + * @since 7.4.0 + */ + public RequestPartitionId mergeIds(RequestPartitionId theOther) { + if (isAllPartitions() || theOther.isAllPartitions()) { + return RequestPartitionId.allPartitions(); + } + + List thisPartitionIds = getPartitionIds(); + List otherPartitionIds = theOther.getPartitionIds(); + List newPartitionIds = Stream.concat(thisPartitionIds.stream(), otherPartitionIds.stream()) + .distinct() + .collect(Collectors.toList()); + return RequestPartitionId.fromPartitionIds(newPartitionIds); + } + public static RequestPartitionId fromJson(String theJson) throws JsonProcessingException { return ourObjectMapper.readValue(theJson, RequestPartitionId.class); } @@ -332,6 +355,14 @@ public class RequestPartitionId implements IModelJson { return new RequestPartitionId(thePartitionNames, thePartitionIds, thePartitionDate); } + public static boolean isDefaultPartition(@Nullable RequestPartitionId thePartitionId) { + if (thePartitionId == null) { + return false; + } + + return thePartitionId.isDefaultPartition(); + } + /** * Create a string representation suitable for use as a cache key. Null aware. *

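The `mergeIds` method added in the hunk above unions the partition IDs of two `RequestPartitionId` objects, deduplicates them, and short-circuits to all-partitions when either side is all-partitions. The following is a minimal standalone sketch of that merge logic; a plain `List<Integer>` stands in for `RequestPartitionId`, and `null` models the all-partitions case — both simplifications are mine, not HAPI API:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class MergeIdsSketch {

    /**
     * Unions two partition-ID lists, keeping first-seen order and dropping
     * duplicates. A null list models "all partitions": if either input is
     * null, the result is null, mirroring the allPartitions() short-circuit
     * in the patch.
     */
    static List<Integer> mergePartitionIds(List<Integer> self, List<Integer> other) {
        if (self == null || other == null) {
            return null; // all partitions wins
        }
        return Stream.concat(self.stream(), other.stream())
                .distinct()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Mirrors testMergeIds in the patch: [1,2,3] + [1,2,4] -> [1, 2, 3, 4]
        System.out.println(mergePartitionIds(List.of(1, 2, 3), List.of(1, 2, 4)));
    }
}
```

Note that, as the javadoc in the patch states, partition names and dates are dropped by the real `mergeIds`; this sketch only models the ID union.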
diff --git a/hapi-fhir-base/src/main/java/ca/uhn/fhir/repository/Repository.java b/hapi-fhir-base/src/main/java/ca/uhn/fhir/repository/Repository.java index a95de2b450e..e859a9ae569 100644 --- a/hapi-fhir-base/src/main/java/ca/uhn/fhir/repository/Repository.java +++ b/hapi-fhir-base/src/main/java/ca/uhn/fhir/repository/Repository.java @@ -1,3 +1,22 @@ +/*- + * #%L + * HAPI FHIR - Core Library + * %% + * Copyright (C) 2014 - 2024 Smile CDR, Inc. + * %% + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * #L% + */ package ca.uhn.fhir.repository; import ca.uhn.fhir.context.FhirContext; diff --git a/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/FhirTerser.java b/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/FhirTerser.java index 7d1dcef782f..d65c39397c3 100644 --- a/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/FhirTerser.java +++ b/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/FhirTerser.java @@ -101,7 +101,7 @@ public class FhirTerser { return newList; } - private ExtensionDt createEmptyExtensionDt(IBaseExtension theBaseExtension, String theUrl) { + private ExtensionDt createEmptyExtensionDt(IBaseExtension theBaseExtension, String theUrl) { return createEmptyExtensionDt(theBaseExtension, false, theUrl); } @@ -122,13 +122,13 @@ public class FhirTerser { return theSupportsUndeclaredExtensions.addUndeclaredExtension(theIsModifier, theUrl); } - private IBaseExtension createEmptyExtension(IBaseHasExtensions theBaseHasExtensions, String theUrl) { - return (IBaseExtension) theBaseHasExtensions.addExtension().setUrl(theUrl); + private IBaseExtension createEmptyExtension(IBaseHasExtensions theBaseHasExtensions, String theUrl) { + return (IBaseExtension) theBaseHasExtensions.addExtension().setUrl(theUrl); } - private IBaseExtension createEmptyModifierExtension( + private IBaseExtension createEmptyModifierExtension( IBaseHasModifierExtensions theBaseHasModifierExtensions, String theUrl) { - return (IBaseExtension) + return (IBaseExtension) theBaseHasModifierExtensions.addModifierExtension().setUrl(theUrl); } @@ -407,7 +407,7 @@ public class FhirTerser { public String getSinglePrimitiveValueOrNull(IBase theTarget, String thePath) { return getSingleValue(theTarget, thePath, IPrimitiveType.class) - .map(t -> t.getValueAsString()) + .map(IPrimitiveType::getValueAsString) .orElse(null); } @@ -487,7 +487,7 @@ public class FhirTerser { } else { // DSTU3+ final String extensionUrlForLambda = extensionUrl; - List extensions = Collections.emptyList(); + List> extensions = 
Collections.emptyList(); if (theCurrentObj instanceof IBaseHasExtensions) { extensions = ((IBaseHasExtensions) theCurrentObj) .getExtension().stream() @@ -505,7 +505,7 @@ public class FhirTerser { } } - for (IBaseExtension next : extensions) { + for (IBaseExtension next : extensions) { if (theWantedClass.isAssignableFrom(next.getClass())) { retVal.add((T) next); } @@ -581,7 +581,7 @@ public class FhirTerser { } else { // DSTU3+ final String extensionUrlForLambda = extensionUrl; - List extensions = Collections.emptyList(); + List> extensions = Collections.emptyList(); if (theCurrentObj instanceof IBaseHasModifierExtensions) { extensions = ((IBaseHasModifierExtensions) theCurrentObj) @@ -602,7 +602,7 @@ public class FhirTerser { } } - for (IBaseExtension next : extensions) { + for (IBaseExtension next : extensions) { if (theWantedClass.isAssignableFrom(next.getClass())) { retVal.add((T) next); } @@ -1203,7 +1203,6 @@ public class FhirTerser { public void visit(IBase theElement, IModelVisitor2 theVisitor) { BaseRuntimeElementDefinition def = myContext.getElementDefinition(theElement.getClass()); if (def instanceof BaseRuntimeElementCompositeDefinition) { - BaseRuntimeElementCompositeDefinition defComposite = (BaseRuntimeElementCompositeDefinition) def; visit(theElement, null, def, theVisitor, new ArrayList<>(), new ArrayList<>(), new ArrayList<>()); } else if (theElement instanceof IBaseExtension) { theVisitor.acceptUndeclaredExtension( @@ -1562,7 +1561,7 @@ public class FhirTerser { throw new DataFormatException(Msg.code(1796) + "Invalid path " + thePath + ": Element of type " + def.getName() + " has no child named " + nextPart + ". 
Valid names: " + def.getChildrenAndExtension().stream() - .map(t -> t.getElementName()) + .map(BaseRuntimeChildDefinition::getElementName) .sorted() .collect(Collectors.joining(", "))); } @@ -1817,7 +1816,18 @@ public class FhirTerser { if (getResourceToIdMap() == null) { return null; } - return getResourceToIdMap().get(theNext); + + var idFromMap = getResourceToIdMap().get(theNext); + if (idFromMap != null) { + return idFromMap; + } else if (theNext.getIdElement().getIdPart() != null) { + return getResourceToIdMap().values().stream() + .filter(id -> theNext.getIdElement().getIdPart().equals(id.getIdPart())) + .findAny() + .orElse(null); + } else { + return null; + } } private List getOrCreateResourceList() { diff --git a/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/SubscriptionUtil.java b/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/SubscriptionUtil.java index 283b01c684c..b47d138b248 100644 --- a/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/SubscriptionUtil.java +++ b/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/SubscriptionUtil.java @@ -65,7 +65,7 @@ public class SubscriptionUtil { populatePrimitiveValue(theContext, theSubscription, "status", theStatus); } - public static boolean isCrossPartition(IBaseResource theSubscription) { + public static boolean isDefinedAsCrossPartitionSubcription(IBaseResource theSubscription) { if (theSubscription instanceof IBaseHasExtensions) { IBaseExtension extension = ExtensionUtil.getExtensionByUrl( theSubscription, HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION); diff --git a/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/VersionEnum.java b/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/VersionEnum.java index 7559f15ed29..805be566f10 100644 --- a/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/VersionEnum.java +++ b/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/VersionEnum.java @@ -154,6 +154,8 @@ public enum VersionEnum { V7_1_0, V7_2_0, + V7_2_1, + V7_2_2, V7_3_0, V7_4_0, diff --git 
a/hapi-fhir-base/src/main/java/org/hl7/fhir/instance/model/api/IAnyResource.java b/hapi-fhir-base/src/main/java/org/hl7/fhir/instance/model/api/IAnyResource.java index 2a7e5214327..d9e5f62f148 100644 --- a/hapi-fhir-base/src/main/java/org/hl7/fhir/instance/model/api/IAnyResource.java +++ b/hapi-fhir-base/src/main/java/org/hl7/fhir/instance/model/api/IAnyResource.java @@ -20,29 +20,115 @@ package org.hl7.fhir.instance.model.api; import ca.uhn.fhir.model.api.annotation.SearchParamDefinition; +import ca.uhn.fhir.rest.gclient.DateClientParam; import ca.uhn.fhir.rest.gclient.TokenClientParam; +import ca.uhn.fhir.rest.gclient.UriClientParam; /** * An IBaseResource that has a FHIR version of DSTU3 or higher */ public interface IAnyResource extends IBaseResource { + String SP_RES_ID = "_id"; /** * Search parameter constant for _id */ - @SearchParamDefinition(name = "_id", path = "", description = "The ID of the resource", type = "token") - String SP_RES_ID = "_id"; + @SearchParamDefinition( + name = SP_RES_ID, + path = "Resource.id", + description = "The ID of the resource", + type = "token") /** * Fluent Client search parameter constant for _id *

* <p>
* Description: <b>the _id of a resource</b><br>
* Type: <b>string</b><br>
- * Path: <b>Resource._id</b><br>
+ * Path: <b>Resource.id</b><br>
* </p>

*/ TokenClientParam RES_ID = new TokenClientParam(IAnyResource.SP_RES_ID); + String SP_RES_LAST_UPDATED = "_lastUpdated"; + /** + * Search parameter constant for _lastUpdated + */ + @SearchParamDefinition( + name = SP_RES_LAST_UPDATED, + path = "Resource.meta.lastUpdated", + description = "The last updated date of the resource", + type = "date") + + /** + * Fluent Client search parameter constant for _lastUpdated + *

+ * <p>
+ * Description: <b>The last updated date of a resource</b><br>
+ * Type: <b>date</b><br>
+ * Path: <b>Resource.meta.lastUpdated</b><br>
+ * </p>

+ */ + DateClientParam RES_LAST_UPDATED = new DateClientParam(IAnyResource.SP_RES_LAST_UPDATED); + + String SP_RES_TAG = "_tag"; + /** + * Search parameter constant for _tag + */ + @SearchParamDefinition( + name = SP_RES_TAG, + path = "Resource.meta.tag", + description = "The tag of the resource", + type = "token") + + /** + * Fluent Client search parameter constant for _tag + *

+ * <p>
+ * Description: <b>The tag of a resource</b><br>
+ * Type: <b>token</b><br>
+ * Path: <b>Resource.meta.tag</b><br>
+ * </p>

+ */ + TokenClientParam RES_TAG = new TokenClientParam(IAnyResource.SP_RES_TAG); + + String SP_RES_PROFILE = "_profile"; + /** + * Search parameter constant for _profile + */ + @SearchParamDefinition( + name = SP_RES_PROFILE, + path = "Resource.meta.profile", + description = "The profile of the resource", + type = "uri") + + /** + * Fluent Client search parameter constant for _profile + *

+ * <p>
+ * Description: <b>The profile of a resource</b><br>
+ * Type: <b>uri</b><br>
+ * Path: <b>Resource.meta.profile</b><br>
+ * </p>

+ */ + UriClientParam RES_PROFILE = new UriClientParam(IAnyResource.SP_RES_PROFILE); + + String SP_RES_SECURITY = "_security"; + /** + * Search parameter constant for _security + */ + @SearchParamDefinition( + name = SP_RES_SECURITY, + path = "Resource.meta.security", + description = "The security of the resource", + type = "token") + + /** + * Fluent Client search parameter constant for _security + *

+ * <p>
+ * Description: <b>The security of a resource</b><br>
+ * Type: <b>token</b><br>
+ * Path: <b>Resource.meta.security</b><br>
+ * </p>

+ */ + TokenClientParam RES_SECURITY = new TokenClientParam(IAnyResource.SP_RES_SECURITY); + String getId(); IIdType getIdElement(); diff --git a/hapi-fhir-base/src/main/resources/ca/uhn/fhir/i18n/hapi-messages.properties b/hapi-fhir-base/src/main/resources/ca/uhn/fhir/i18n/hapi-messages.properties index 686f1452d6d..fbd3091918a 100644 --- a/hapi-fhir-base/src/main/resources/ca/uhn/fhir/i18n/hapi-messages.properties +++ b/hapi-fhir-base/src/main/resources/ca/uhn/fhir/i18n/hapi-messages.properties @@ -6,6 +6,9 @@ org.hl7.fhir.common.hapi.validation.support.CommonCodeSystemsTerminologyService. org.hl7.fhir.common.hapi.validation.support.CommonCodeSystemsTerminologyService.mismatchCodeSystem=Inappropriate CodeSystem URL "{0}" for ValueSet: {1} org.hl7.fhir.common.hapi.validation.support.CommonCodeSystemsTerminologyService.codeNotFoundInValueSet=Code "{0}" is not in valueset: {1} +org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport.unknownCodeInSystem=Unknown code "{0}#{1}". The Remote Terminology server {2} returned {3} +org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport.unknownCodeInValueSet=Unknown code "{0}#{1}" for ValueSet with URL "{2}". The Remote Terminology server {3} returned {4} + ca.uhn.fhir.jpa.term.TermReadSvcImpl.expansionRefersToUnknownCs=Unknown CodeSystem URI "{0}" referenced from ValueSet ca.uhn.fhir.jpa.term.TermReadSvcImpl.valueSetNotYetExpanded=ValueSet "{0}" has not yet been pre-expanded. Performing in-memory expansion without parameters. Current status: {1} | {2} ca.uhn.fhir.jpa.term.TermReadSvcImpl.valueSetNotYetExpanded_OffsetNotAllowed=ValueSet expansion can not combine "offset" with "ValueSet.compose.exclude" unless the ValueSet has been pre-expanded. ValueSet "{0}" must be pre-expanded for this operation to work. 
@@ -91,6 +94,7 @@ ca.uhn.fhir.jpa.dao.BaseStorageDao.inlineMatchNotSupported=Inline match URLs are ca.uhn.fhir.jpa.dao.BaseStorageDao.transactionOperationWithMultipleMatchFailure=Failed to {0} resource with match URL "{1}" because this search matched {2} resources ca.uhn.fhir.jpa.dao.BaseStorageDao.deleteByUrlThresholdExceeded=Failed to DELETE resources with match URL "{0}" because the resolved number of resources: {1} exceeds the threshold of {2} ca.uhn.fhir.jpa.dao.BaseStorageDao.transactionOperationWithIdNotMatchFailure=Failed to {0} resource with match URL "{1}" because the matching resource does not match the provided ID +ca.uhn.fhir.jpa.dao.BaseTransactionProcessor.multiplePartitionAccesses=Can not process transaction with {0} entries: Entries require access to multiple/conflicting partitions ca.uhn.fhir.jpa.dao.BaseHapiFhirDao.transactionOperationFailedNoId=Failed to {0} resource in transaction because no ID was provided ca.uhn.fhir.jpa.dao.BaseHapiFhirDao.transactionOperationFailedUnknownId=Failed to {0} resource in transaction because no resource could be found with ID {1} ca.uhn.fhir.jpa.dao.BaseHapiFhirDao.uniqueIndexConflictFailure=Can not create resource of type {0} as it would create a duplicate unique index matching query: {1} (existing index belongs to {2}, new unique index created by {3}) diff --git a/hapi-fhir-base/src/test/java/ca/uhn/fhir/interceptor/model/RequestPartitionIdTest.java b/hapi-fhir-base/src/test/java/ca/uhn/fhir/interceptor/model/RequestPartitionIdTest.java index 51c837d4f38..8d6e934ba73 100644 --- a/hapi-fhir-base/src/test/java/ca/uhn/fhir/interceptor/model/RequestPartitionIdTest.java +++ b/hapi-fhir-base/src/test/java/ca/uhn/fhir/interceptor/model/RequestPartitionIdTest.java @@ -41,6 +41,50 @@ public class RequestPartitionIdTest { assertFalse(RequestPartitionId.forPartitionIdsAndNames(null, Lists.newArrayList(1, 2), null).isDefaultPartition()); } + @Test + public void testMergeIds() { + RequestPartitionId input0 = 
RequestPartitionId.fromPartitionIds(1, 2, 3); + RequestPartitionId input1 = RequestPartitionId.fromPartitionIds(1, 2, 4); + + RequestPartitionId actual = input0.mergeIds(input1); + RequestPartitionId expected = RequestPartitionId.fromPartitionIds(1, 2, 3, 4); + assertEquals(expected, actual); + + } + + @Test + public void testMergeIds_ThisAllPartitions() { + RequestPartitionId input0 = RequestPartitionId.allPartitions(); + RequestPartitionId input1 = RequestPartitionId.fromPartitionIds(1, 2, 4); + + RequestPartitionId actual = input0.mergeIds(input1); + RequestPartitionId expected = RequestPartitionId.allPartitions(); + assertEquals(expected, actual); + + } + + @Test + public void testMergeIds_OtherAllPartitions() { + RequestPartitionId input0 = RequestPartitionId.fromPartitionIds(1, 2, 3); + RequestPartitionId input1 = RequestPartitionId.allPartitions(); + + RequestPartitionId actual = input0.mergeIds(input1); + RequestPartitionId expected = RequestPartitionId.allPartitions(); + assertEquals(expected, actual); + + } + + @Test + public void testMergeIds_IncludesDefault() { + RequestPartitionId input0 = RequestPartitionId.fromPartitionIds(1, 2, 3); + RequestPartitionId input1 = RequestPartitionId.defaultPartition(); + + RequestPartitionId actual = input0.mergeIds(input1); + RequestPartitionId expected = RequestPartitionId.fromPartitionIds(1, 2, 3, null); + assertEquals(expected, actual); + + } + @Test public void testSerDeserSer() throws JsonProcessingException { { diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_1/upgrade.md b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_1/upgrade.md new file mode 100644 index 00000000000..e69de29bb2d diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_1/version.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_1/version.yaml new file mode 100644 index 00000000000..9bed9362fed --- /dev/null +++ 
b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_1/version.yaml @@ -0,0 +1,3 @@ +--- +release-date: "2024-05-30" +codename: "Borealis" diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_2/upgrade.md b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_2/upgrade.md new file mode 100644 index 00000000000..e69de29bb2d diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_2/version.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_2/version.yaml new file mode 100644 index 00000000000..9d69a732c39 --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_2_2/version.yaml @@ -0,0 +1,3 @@ +--- +release-date: "2024-07-19" +codename: "Borealis" diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/4837-fix-stringify-parse-copy-contained-duplicate-bug.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/4837-fix-stringify-parse-copy-contained-duplicate-bug.yaml new file mode 100644 index 00000000000..e0bb4aec0ca --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/4837-fix-stringify-parse-copy-contained-duplicate-bug.yaml @@ -0,0 +1,5 @@ +--- +type: fix +issue: 4837 +title: "In the case where a resource was serialized, deserialized, copied and reserialized it resulted in duplication of + contained resources. This has been corrected." 
diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/5960-lucene-chain-sort-silently-not-working.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/5960-lucene-chain-sort-silently-not-working.yaml new file mode 100644 index 00000000000..5564df71c85 --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/5960-lucene-chain-sort-silently-not-working.yaml @@ -0,0 +1,6 @@ +--- +type: fix +issue: 5960 +backport: 7.2.1 +title: "Previously, queries with chained would fail to sort correctly with lucene and full text searches enabled. + This has been fixed." diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6024-full-index-search-with-filter-parameters.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6024-full-index-search-with-filter-parameters.yaml index d62a971afac..6af4104ae8e 100644 --- a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6024-full-index-search-with-filter-parameters.yaml +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6024-full-index-search-with-filter-parameters.yaml @@ -1,6 +1,7 @@ --- type: fix issue: 6024 +backport: 7.2.2 title: "Fixed a bug in search where requesting a count with HSearch indexing and FilterParameter enabled and using the _filter parameter would result in inaccurate results being returned. 
diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6044-fix-issue-with-cache-refresh-infinite-loop.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6044-fix-issue-with-cache-refresh-infinite-loop.yaml index a883f4d37ae..70d4f3df266 100644 --- a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6044-fix-issue-with-cache-refresh-infinite-loop.yaml +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6044-fix-issue-with-cache-refresh-infinite-loop.yaml @@ -1,6 +1,7 @@ --- type: fix issue: 6044 +backport: 7.2.2 title: "Fixed an issue where doing a cache refresh with advanced Hibernate Search enabled would result in an infinite loop of cache refresh -> search for StructureDefinition -> cache refresh, etc diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6046-failed-text-and-content-search-in-r5.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6046-failed-text-and-content-search-in-r5.yaml index 13dfe63ecde..8b3d60c7e8a 100644 --- a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6046-failed-text-and-content-search-in-r5.yaml +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6046-failed-text-and-content-search-in-r5.yaml @@ -1,4 +1,5 @@ --- type: fix issue: 6046 +backport: 7.2.2 title: "Previously, using `_text` and `_content` searches in Hibernate Search in R5 was not supported. This issue has been fixed." 
diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6046-text-contains.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6046-text-contains.yaml index 1dbf3931cec..3a53993e880 100644 --- a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6046-text-contains.yaml +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6046-text-contains.yaml @@ -1,5 +1,6 @@ --- type: add issue: 6046 +backport: 7.2.2 title: "Added support for `:contains` parameter qualifier on the `_text` and `_content` Search Parameters. When using Hibernate Search, this will cause the search to perform an substring match on the provided value. Documentation can be found [here](/hapi-fhir/docs/server_jpa/elastic.html#performing-fulltext-search-in-luceneelasticsearch)." diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6122-validate-operation-with-remote-terminology-service-enabled-returned-bad-request.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6122-validate-operation-with-remote-terminology-service-enabled-returned-bad-request.yaml new file mode 100644 index 00000000000..df736310acb --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6122-validate-operation-with-remote-terminology-service-enabled-returned-bad-request.yaml @@ -0,0 +1,5 @@ +--- +type: fix +issue: 6122 +title: "Previously, executing the '$validate' operation on a resource instance could result in an HTTP 400 Bad Request +instead of an HTTP 200 OK response with a list of validation issues. This has been fixed." 
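The 6046 changelog entry above adds a `:contains` qualifier to the `_text` and `_content` full-text search parameters, which attaches to the parameter name in the search URL and requests substring matching. A small client-side sketch of how such a query string is assembled (the `buildQuery` helper is illustrative, not a HAPI API):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class FullTextQuerySketch {

    /**
     * Builds a FHIR search query string such as
     * "Observation?_content:contains=blood", URL-encoding the value.
     * A null qualifier yields a plain parameter with default matching.
     */
    static String buildQuery(String resourceType, String param, String qualifier, String value) {
        String name = (qualifier == null) ? param : param + ":" + qualifier;
        return resourceType + "?" + name + "=" + URLEncoder.encode(value, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Substring match on the full-text index, per the 6046 entry
        System.out.println(buildQuery("Observation", "_content", "contains", "blood pressure"));
        // Observation?_content:contains=blood+pressure
    }
}
```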
diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6123-search-parameters-missing-in-ianyresource.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6123-search-parameters-missing-in-ianyresource.yaml new file mode 100644 index 00000000000..158a1aee33c --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6123-search-parameters-missing-in-ianyresource.yaml @@ -0,0 +1,6 @@ +--- +type: fix +issue: 6123 +title: "`IAnyResource` `_id` search parameter was missing `path` property value, which resulted in extractor not + working when standard search parameters were instantiated from defined context. This has been fixed, and also + `_LastUpdated`, `_tag`, `_profile`, and `_security` parameter definitions were added to the class." diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6133-make-everything-search-not-rely-on-hibernate-search.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6133-make-everything-search-not-rely-on-hibernate-search.yaml new file mode 100644 index 00000000000..7c3dbfe1338 --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6133-make-everything-search-not-rely-on-hibernate-search.yaml @@ -0,0 +1,6 @@ +--- +type: fix +issue: 6083 +backport: 7.2.2 +title: "A bug with $everything operation was discovered when trying to search using hibernate search, this change makes +all $everything operation rely on database search until hibernate search fully supports the operation." 
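The 6123 entry above corresponds to the `IAnyResource` diff earlier in this patch, which gives `_id` an explicit path and defines the other resource-level standard parameters. A sketch collecting the parameter-name-to-FHIRPath mapping exactly as the diff assigns it (the map itself is my own illustration, not a HAPI structure):

```java
import java.util.Map;

public class StandardSearchParamsSketch {

    /**
     * Resource-level standard search parameters defined in the patch,
     * mapped to the FHIRPath expressions the @SearchParamDefinition
     * annotations assign them.
     */
    static final Map<String, String> STANDARD_PARAMS = Map.of(
            "_id", "Resource.id",
            "_lastUpdated", "Resource.meta.lastUpdated",
            "_tag", "Resource.meta.tag",
            "_profile", "Resource.meta.profile",
            "_security", "Resource.meta.security");

    public static void main(String[] args) {
        // The missing path on _id was the root cause of the extractor bug
        System.out.println(STANDARD_PARAMS.get("_id")); // Resource.id
    }
}
```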
diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6134-filesystem-regression.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6134-filesystem-regression.yaml new file mode 100644 index 00000000000..cc28d857c21 --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6134-filesystem-regression.yaml @@ -0,0 +1,6 @@ +--- +type: fix +issue: 6134 +backport: 7.2.2 +title: "Fixed a regression in 7.2.0 which caused systems using `FILESYSTEM` binary storage mode to be unable to read metadata documents +that had been previously stored on disk." diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6148-provide-target-resource-partition-info-in-resourcelink.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6148-provide-target-resource-partition-info-in-resourcelink.yaml new file mode 100644 index 00000000000..9dca360dc4e --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6148-provide-target-resource-partition-info-in-resourcelink.yaml @@ -0,0 +1,5 @@ +--- +type: add +issue: 6148 +jira: SMILE-8613 +title: "Added the target resource partitionId and partitionDate to the resourceLink table." 
diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6150-validate-returns-404-valueset-with-remote-terminology-systems.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6150-validate-returns-404-valueset-with-remote-terminology-systems.yaml new file mode 100644 index 00000000000..ee2b7883248 --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6150-validate-returns-404-valueset-with-remote-terminology-systems.yaml @@ -0,0 +1,6 @@ +--- +type: fix +issue: 6150 +title: "Previously, the resource $validate operation would return a 404 when the associated profile uses a ValueSet +that has multiple includes referencing Remote Terminology CodeSystem resources. +This has been fixed to return a 200 with issues instead." diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6153-allow-transactional-update-with-conditional-url.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6153-allow-transactional-update-with-conditional-url.yaml new file mode 100644 index 00000000000..5d8c0ed1bb6 --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6153-allow-transactional-update-with-conditional-url.yaml @@ -0,0 +1,11 @@ +--- +type: fix +issue: 6153 +title: "Previously, if you created a resource with some conditional url, + but then submitted a transaction bundle that + a) updated the resource to not match the condition anymore and + b) create a resource with the (same) condition + a unique index violation would result. + + This has been fixed. 
+" diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6156-updated-migration-on-idx-idxcmbtqknu-hashc-to-be-concurrent.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6156-updated-migration-on-idx-idxcmbtqknu-hashc-to-be-concurrent.yaml new file mode 100644 index 00000000000..1015eb7a180 --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6156-updated-migration-on-idx-idxcmbtqknu-hashc-to-be-concurrent.yaml @@ -0,0 +1,6 @@ +--- +type: fix +issue: 6156 +title: "Index IDX_IDXCMBTOKNU_HASHC on table HFJ_IDX_CMB_TOK_NU's migration + is now marked as online (concurrent). +" diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6159-code-system-delta-add-or-remove-fail-without-concept-display.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6159-code-system-delta-add-or-remove-fail-without-concept-display.yaml new file mode 100644 index 00000000000..6356d7a86f7 --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6159-code-system-delta-add-or-remove-fail-without-concept-display.yaml @@ -0,0 +1,8 @@ +--- +type: fix +issue: 6159 +jira: SMILE-8604 +title: "Previously, `$apply-codesystem-delta-add` and `$apply-codesystem-delta-remove` operations were failing +with a 500 Server Error when invoked with a CodeSystem Resource payload that had a concept without a +`display` element. This has now been fixed so that concepts without display field is accepted, as `display` +element is not required." 
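The 6159 fix above means a concept submitted to `$apply-codesystem-delta-add` or `$apply-codesystem-delta-remove` is accepted even when its `display` element is absent. A standalone model of that relaxed validation rule (the `isValidConcept` helper is illustrative, not HAPI's internal validator):

```java
public class ConceptValidationSketch {

    /**
     * Post-fix validation for a delta concept: only a missing or blank code
     * is rejected. The display argument is deliberately never checked,
     * reflecting that the display element is not required.
     */
    static boolean isValidConcept(String code, String display) {
        return code != null && !code.isBlank();
    }

    public static void main(String[] args) {
        System.out.println(isValidConcept("CHEM", null));   // true: display not required
        System.out.println(isValidConcept("", "Chemistry")); // false: code is required
    }
}
```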
diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6163-make-transactions-better-partition-aware.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6163-make-transactions-better-partition-aware.yaml new file mode 100644 index 00000000000..2813c6391ae --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6163-make-transactions-better-partition-aware.yaml @@ -0,0 +1,7 @@ +--- +type: fix +jira: SMILE-8652 +title: "When JPA servers are configured to always require a new database + transaction when switching partitions, the server will now correctly + identify the correct partition for FHIR transaction operations, and + fail the operation if multiple partitions would be required." diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6179-reindex-job-no-longer-runs-without-urls.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6179-reindex-job-no-longer-runs-without-urls.yaml new file mode 100644 index 00000000000..1f06a3be9f8 --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6179-reindex-job-no-longer-runs-without-urls.yaml @@ -0,0 +1,7 @@ + +--- +type: change +issue: 6179 +title: "The $reindex operation could potentially initiate a reindex job without any urls provided in the parameters. +We now internally generate a list of urls out of all the supported resource types and attempt to reindex +found resources of each type separately. As a result, each reindex (batch2) job chunk will be always associated with a url." 
\ No newline at end of file diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6179-reindex-with-custom-resource-type-based-partitioning-interceptor.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6179-reindex-with-custom-resource-type-based-partitioning-interceptor.yaml new file mode 100644 index 00000000000..a275e086340 --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6179-reindex-with-custom-resource-type-based-partitioning-interceptor.yaml @@ -0,0 +1,6 @@ +--- +type: fix +issue: 6179 +title: "Previously, the $reindex operation would fail when using a custom partitioning interceptor which decides the partition +based on the resource type in the request. This has been fixed, such that we avoid retrieving the resource type from +the request; instead, we use the urls provided as parameters to the operation to determine the partitions." diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6188-non-cross-partition-subs-match-all-resources.yaml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6188-non-cross-partition-subs-match-all-resources.yaml new file mode 100644 index 00000000000..1ebe811c35a --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_4_0/6188-non-cross-partition-subs-match-all-resources.yaml @@ -0,0 +1,6 @@ +--- +type: fix +issue: 6188 +jira: SMILE-8759 +title: "Previously, a Subscription not marked as a cross-partition subscription could listen to incoming resources from +other partitions. This issue has been fixed." 
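The 6188 entry above implies a partition guard during subscription matching. The following is a hypothetical, stripped-down sketch of that rule — the names `SubscriptionPartitionGuard` and `canMatch` are invented for illustration and do not correspond to the actual HAPI FHIR matching classes:

```java
import java.util.Objects;

// Hypothetical sketch: a subscription may only match a resource from another
// partition when it is explicitly marked cross-partition; otherwise the
// subscription's partition and the resource's partition must be the same.
public class SubscriptionPartitionGuard {

    public static boolean canMatch(boolean crossPartitionEnabled,
                                   Integer subscriptionPartitionId,
                                   Integer resourcePartitionId) {
        if (crossPartitionEnabled) {
            return true; // cross-partition subscriptions see all partitions
        }
        return Objects.equals(subscriptionPartitionId, resourcePartitionId);
    }

    public static void main(String[] args) {
        // non-cross-partition subscription no longer sees other partitions
        System.out.println(canMatch(false, 1, 2)); // prints false
    }
}
```

The fix described in the changelog amounts to enforcing the non-cross-partition branch of this check, which previously was not applied.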
diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/reindex/Batch2DaoSvcImpl.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/batch2/Batch2DaoSvcImpl.java similarity index 94% rename from hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/reindex/Batch2DaoSvcImpl.java rename to hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/batch2/Batch2DaoSvcImpl.java index bf20c8cfc38..f6a67c30579 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/reindex/Batch2DaoSvcImpl.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/batch2/Batch2DaoSvcImpl.java @@ -17,7 +17,7 @@ * limitations under the License. * #L% */ -package ca.uhn.fhir.jpa.reindex; +package ca.uhn.fhir.jpa.batch2; import ca.uhn.fhir.context.FhirContext; import ca.uhn.fhir.context.RuntimeResourceDefinition; @@ -41,8 +41,10 @@ import ca.uhn.fhir.rest.api.SortSpec; import ca.uhn.fhir.rest.api.server.SystemRequestDetails; import ca.uhn.fhir.rest.server.exceptions.InternalErrorException; import ca.uhn.fhir.util.DateRangeUtil; +import ca.uhn.fhir.util.Logs; import jakarta.annotation.Nonnull; import jakarta.annotation.Nullable; +import org.apache.commons.lang3.StringUtils; import org.apache.commons.lang3.Validate; import java.util.Date; @@ -50,7 +52,7 @@ import java.util.function.Supplier; import java.util.stream.Stream; public class Batch2DaoSvcImpl implements IBatch2DaoSvc { - private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(Batch2DaoSvcImpl.class); + private static final org.slf4j.Logger ourLog = Logs.getBatchTroubleshootingLog(); private final IResourceTableDao myResourceTableDao; @@ -83,7 +85,7 @@ public class Batch2DaoSvcImpl implements IBatch2DaoSvc { @Override public IResourcePidStream fetchResourceIdStream( Date theStart, Date theEnd, RequestPartitionId theRequestPartitionId, String theUrl) { - if (theUrl == null) { + if (StringUtils.isBlank(theUrl)) { return makeStreamResult( theRequestPartitionId, () -> 
streamResourceIdsNoUrl(theStart, theEnd, theRequestPartitionId)); } else { @@ -127,6 +129,10 @@ public class Batch2DaoSvcImpl implements IBatch2DaoSvc { return new TypedResourceStream(theRequestPartitionId, streamTemplate); } + /** + * At the moment there is no use-case for this method. + * This can be cleaned up at a later point in time if there is no use for it. + */ @Nonnull private Stream streamResourceIdsNoUrl( Date theStart, Date theEnd, RequestPartitionId theRequestPartitionId) { diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/batch2/JpaBatch2Config.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/batch2/JpaBatch2Config.java index d8a97947837..f73a88570f3 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/batch2/JpaBatch2Config.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/batch2/JpaBatch2Config.java @@ -19,7 +19,6 @@ */ package ca.uhn.fhir.jpa.batch2; -import ca.uhn.fhir.batch2.api.IJobPartitionProvider; import ca.uhn.fhir.batch2.api.IJobPersistence; import ca.uhn.fhir.batch2.config.BaseBatch2Config; import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster; @@ -28,8 +27,6 @@ import ca.uhn.fhir.jpa.dao.data.IBatch2JobInstanceRepository; import ca.uhn.fhir.jpa.dao.data.IBatch2WorkChunkMetadataViewRepository; import ca.uhn.fhir.jpa.dao.data.IBatch2WorkChunkRepository; import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService; -import ca.uhn.fhir.jpa.partition.IPartitionLookupSvc; -import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc; import jakarta.persistence.EntityManager; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; @@ -55,10 +52,4 @@ public class JpaBatch2Config extends BaseBatch2Config { theEntityManager, theInterceptorBroadcaster); } - - @Bean - public IJobPartitionProvider jobPartitionProvider( - IRequestPartitionHelperSvc theRequestPartitionHelperSvc, IPartitionLookupSvc thePartitionLookupSvc) { - return new 
JpaJobPartitionProvider(theRequestPartitionHelperSvc, thePartitionLookupSvc); - } } diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/batch2/JpaJobPartitionProvider.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/batch2/JpaJobPartitionProvider.java index 7ef82704ee2..5edca1289a2 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/batch2/JpaJobPartitionProvider.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/batch2/JpaJobPartitionProvider.java @@ -19,45 +19,47 @@ */ package ca.uhn.fhir.jpa.batch2; -import ca.uhn.fhir.batch2.api.IJobPartitionProvider; +import ca.uhn.fhir.batch2.coordinator.DefaultJobPartitionProvider; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; +import ca.uhn.fhir.context.FhirContext; import ca.uhn.fhir.interceptor.model.RequestPartitionId; import ca.uhn.fhir.jpa.entity.PartitionEntity; import ca.uhn.fhir.jpa.partition.IPartitionLookupSvc; import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc; -import ca.uhn.fhir.rest.api.server.RequestDetails; +import ca.uhn.fhir.jpa.searchparam.MatchUrlService; import java.util.List; import java.util.stream.Collectors; /** * The default JPA implementation, which uses {@link IRequestPartitionHelperSvc} and {@link IPartitionLookupSvc} - * to compute the partition to run a batch2 job. + * to compute the {@link PartitionedUrl} list to run a batch2 job. * The latter will be used to handle cases when the job is configured to run against all partitions * (bulk system operation) and will return the actual list with all the configured partitions. 
*/ -public class JpaJobPartitionProvider implements IJobPartitionProvider { - protected final IRequestPartitionHelperSvc myRequestPartitionHelperSvc; +@Deprecated +public class JpaJobPartitionProvider extends DefaultJobPartitionProvider { private final IPartitionLookupSvc myPartitionLookupSvc; public JpaJobPartitionProvider( IRequestPartitionHelperSvc theRequestPartitionHelperSvc, IPartitionLookupSvc thePartitionLookupSvc) { - myRequestPartitionHelperSvc = theRequestPartitionHelperSvc; + super(theRequestPartitionHelperSvc); + myPartitionLookupSvc = thePartitionLookupSvc; + } + + public JpaJobPartitionProvider( + FhirContext theFhirContext, + IRequestPartitionHelperSvc theRequestPartitionHelperSvc, + MatchUrlService theMatchUrlService, + IPartitionLookupSvc thePartitionLookupSvc) { + super(theFhirContext, theRequestPartitionHelperSvc, theMatchUrlService); myPartitionLookupSvc = thePartitionLookupSvc; } @Override - public List getPartitions(RequestDetails theRequestDetails, String theOperation) { - RequestPartitionId partitionId = myRequestPartitionHelperSvc.determineReadPartitionForRequestForServerOperation( - theRequestDetails, theOperation); - if (!partitionId.isAllPartitions()) { - return List.of(partitionId); - } - // handle (bulk) system operations that are typically configured with RequestPartitionId.allPartitions() - // populate the actual list of all partitions - List partitionIdList = myPartitionLookupSvc.listPartitions().stream() + public List getAllPartitions() { + return myPartitionLookupSvc.listPartitions().stream() .map(PartitionEntity::toRequestPartitionId) .collect(Collectors.toList()); - partitionIdList.add(RequestPartitionId.defaultPartition()); - return partitionIdList; } } diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/config/Batch2SupportConfig.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/config/Batch2SupportConfig.java index 95171836324..83f308ba0db 100644 --- 
a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/config/Batch2SupportConfig.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/config/Batch2SupportConfig.java @@ -25,6 +25,7 @@ import ca.uhn.fhir.jpa.api.dao.DaoRegistry; import ca.uhn.fhir.jpa.api.svc.IBatch2DaoSvc; import ca.uhn.fhir.jpa.api.svc.IDeleteExpungeSvc; import ca.uhn.fhir.jpa.api.svc.IIdHelperService; +import ca.uhn.fhir.jpa.batch2.Batch2DaoSvcImpl; import ca.uhn.fhir.jpa.dao.IFulltextSearchSvc; import ca.uhn.fhir.jpa.dao.data.IResourceLinkDao; import ca.uhn.fhir.jpa.dao.data.IResourceTableDao; @@ -32,7 +33,6 @@ import ca.uhn.fhir.jpa.dao.expunge.ResourceTableFKProvider; import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService; import ca.uhn.fhir.jpa.delete.batch2.DeleteExpungeSqlBuilder; import ca.uhn.fhir.jpa.delete.batch2.DeleteExpungeSvcImpl; -import ca.uhn.fhir.jpa.reindex.Batch2DaoSvcImpl; import ca.uhn.fhir.jpa.searchparam.MatchUrlService; import jakarta.persistence.EntityManager; import org.springframework.beans.factory.annotation.Autowired; diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/BaseHapiFhirResourceDao.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/BaseHapiFhirResourceDao.java index d8e4042ce7d..f6a04aa33c5 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/BaseHapiFhirResourceDao.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/BaseHapiFhirResourceDao.java @@ -20,7 +20,7 @@ package ca.uhn.fhir.jpa.dao; import ca.uhn.fhir.batch2.api.IJobCoordinator; -import ca.uhn.fhir.batch2.jobs.parameters.UrlPartitioner; +import ca.uhn.fhir.batch2.api.IJobPartitionProvider; import ca.uhn.fhir.batch2.jobs.reindex.ReindexAppCtx; import ca.uhn.fhir.batch2.jobs.reindex.ReindexJobParameters; import ca.uhn.fhir.batch2.model.JobInstanceStartRequest; @@ -103,7 +103,6 @@ import ca.uhn.fhir.rest.server.exceptions.PreconditionFailedException; import 
ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException; import ca.uhn.fhir.rest.server.exceptions.ResourceVersionConflictException; import ca.uhn.fhir.rest.server.exceptions.UnprocessableEntityException; -import ca.uhn.fhir.rest.server.provider.ProviderConstants; import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails; import ca.uhn.fhir.rest.server.util.CompositeInterceptorBroadcaster; import ca.uhn.fhir.util.ReflectionUtil; @@ -193,6 +192,9 @@ public abstract class BaseHapiFhirResourceDao extends B @Autowired private IRequestPartitionHelperSvc myRequestPartitionHelperService; + @Autowired + private IJobPartitionProvider myJobPartitionProvider; + @Autowired private MatchUrlService myMatchUrlService; @@ -214,9 +216,6 @@ public abstract class BaseHapiFhirResourceDao extends B private TransactionTemplate myTxTemplate; - @Autowired - private UrlPartitioner myUrlPartitioner; - @Autowired private ResourceSearchUrlSvc myResourceSearchUrlSvc; @@ -1306,14 +1305,12 @@ public abstract class BaseHapiFhirResourceDao extends B ReindexJobParameters params = new ReindexJobParameters(); + List urls = List.of(); if (!isCommonSearchParam(theBase)) { - addAllResourcesTypesToReindex(theBase, theRequestDetails, params); + urls = theBase.stream().map(t -> t + "?").collect(Collectors.toList()); } - RequestPartitionId requestPartition = - myRequestPartitionHelperService.determineReadPartitionForRequestForServerOperation( - theRequestDetails, ProviderConstants.OPERATION_REINDEX); - params.setRequestPartitionId(requestPartition); + myJobPartitionProvider.getPartitionedUrls(theRequestDetails, urls).forEach(params::addPartitionedUrl); JobInstanceStartRequest request = new JobInstanceStartRequest(); request.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX); @@ -1334,14 +1331,6 @@ public abstract class BaseHapiFhirResourceDao extends B return Boolean.parseBoolean(shouldSkip.toString()); } - private void addAllResourcesTypesToReindex( - List theBase, RequestDetails theRequestDetails, 
ReindexJobParameters params) { - theBase.stream() - .map(t -> t + "?") - .map(url -> myUrlPartitioner.partitionUrl(url, theRequestDetails)) - .forEach(params::addPartitionedUrl); - } - private boolean isCommonSearchParam(List theBase) { // If the base contains the special resource "Resource", this is a common SP that applies to all resources return theBase.stream().map(String::toLowerCase).anyMatch(BASE_RESOURCE_NAME::equals); @@ -2457,11 +2446,13 @@ public abstract class BaseHapiFhirResourceDao extends B RestOperationTypeEnum theOperationType, TransactionDetails theTransactionDetails) { - // we stored a resource searchUrl at creation time to prevent resource duplication. Let's remove the entry on - // the - // first update but guard against unnecessary trips to the database on subsequent ones. + /* + * We stored a resource searchUrl at creation time to prevent resource duplication. + * We'll clear any currently existing urls from the db, otherwise we could hit + * duplicate index violations if we try to add another (after this create/update) + */ ResourceTable entity = (ResourceTable) theEntity; - if (entity.isSearchUrlPresent() && thePerformIndexing) { + if (entity.isSearchUrlPresent()) { myResourceSearchUrlSvc.deleteByResId( (Long) theEntity.getPersistentId().getId()); entity.setSearchUrlPresent(false); diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/TransactionProcessor.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/TransactionProcessor.java index 93d81a1f82c..86d9f727906 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/TransactionProcessor.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/TransactionProcessor.java @@ -27,7 +27,6 @@ import ca.uhn.fhir.jpa.api.dao.IFhirSystemDao; import ca.uhn.fhir.jpa.api.model.DaoMethodOutcome; import ca.uhn.fhir.jpa.api.svc.IIdHelperService; import ca.uhn.fhir.jpa.config.HapiFhirHibernateJpaDialect; -import 
ca.uhn.fhir.jpa.model.config.PartitionSettings; import ca.uhn.fhir.jpa.model.dao.JpaPid; import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamToken; import ca.uhn.fhir.jpa.model.entity.StorageSettings; @@ -97,9 +96,6 @@ public class TransactionProcessor extends BaseTransactionProcessor { @Autowired private IIdHelperService myIdHelperService; - @Autowired - private PartitionSettings myPartitionSettings; - @Autowired private JpaStorageSettings myStorageSettings; @@ -150,14 +146,9 @@ public class TransactionProcessor extends BaseTransactionProcessor { List theEntries, StopWatch theTransactionStopWatch) { - ITransactionProcessorVersionAdapter versionAdapter = getVersionAdapter(); - RequestPartitionId requestPartitionId = null; - if (!myPartitionSettings.isPartitioningEnabled()) { - requestPartitionId = RequestPartitionId.allPartitions(); - } else { - // If all entries in the transaction point to the exact same partition, we'll try and do a pre-fetch - requestPartitionId = getSinglePartitionForAllEntriesOrNull(theRequest, theEntries, versionAdapter); - } + ITransactionProcessorVersionAdapter versionAdapter = getVersionAdapter(); + RequestPartitionId requestPartitionId = + super.determineRequestPartitionIdForWriteEntries(theRequest, theEntries); if (requestPartitionId != null) { preFetch(theTransactionDetails, theEntries, versionAdapter, requestPartitionId); @@ -472,24 +463,6 @@ public class TransactionProcessor extends BaseTransactionProcessor { } } - private RequestPartitionId getSinglePartitionForAllEntriesOrNull( - RequestDetails theRequest, List theEntries, ITransactionProcessorVersionAdapter versionAdapter) { - RequestPartitionId retVal = null; - Set requestPartitionIdsForAllEntries = new HashSet<>(); - for (IBase nextEntry : theEntries) { - IBaseResource resource = versionAdapter.getResource(nextEntry); - if (resource != null) { - RequestPartitionId requestPartition = myRequestPartitionSvc.determineCreatePartitionForRequest( - theRequest, resource, 
myFhirContext.getResourceType(resource)); - requestPartitionIdsForAllEntries.add(requestPartition); - } - } - if (requestPartitionIdsForAllEntries.size() == 1) { - retVal = requestPartitionIdsForAllEntries.iterator().next(); - } - return retVal; - } - /** * Given a token parameter, build the query predicate based on its hash. Uses system and value if both are available, otherwise just value. * If neither are available, it returns null. @@ -570,11 +543,6 @@ public class TransactionProcessor extends BaseTransactionProcessor { } } - @VisibleForTesting - public void setPartitionSettingsForUnitTest(PartitionSettings thePartitionSettings) { - myPartitionSettings = thePartitionSettings; - } - @VisibleForTesting public void setIdHelperServiceForUnitTest(IIdHelperService theIdHelperService) { myIdHelperService = theIdHelperService; diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/data/IResourceTableDao.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/data/IResourceTableDao.java index 21f233891e5..a081668cdae 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/data/IResourceTableDao.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/data/IResourceTableDao.java @@ -135,7 +135,8 @@ public interface IResourceTableDao * This method returns a Collection where each row is an element in the collection. Each element in the collection * is an object array, where the order matters (the array represents columns returned by the query). Be careful if you change this query in any way. 
*/ - @Query("SELECT t.myResourceType, t.myId, t.myDeleted FROM ResourceTable t WHERE t.myId IN (:pid)") + @Query( + "SELECT t.myResourceType, t.myId, t.myDeleted, t.myPartitionIdValue, t.myPartitionDateValue FROM ResourceTable t WHERE t.myId IN (:pid)") Collection findLookupFieldsByResourcePid(@Param("pid") List thePids); /** @@ -143,7 +144,7 @@ public interface IResourceTableDao * is an object array, where the order matters (the array represents columns returned by the query). Be careful if you change this query in any way. */ @Query( - "SELECT t.myResourceType, t.myId, t.myDeleted FROM ResourceTable t WHERE t.myId IN (:pid) AND t.myPartitionIdValue IN :partition_id") + "SELECT t.myResourceType, t.myId, t.myDeleted, t.myPartitionIdValue, t.myPartitionDateValue FROM ResourceTable t WHERE t.myId IN (:pid) AND t.myPartitionIdValue IN :partition_id") Collection findLookupFieldsByResourcePidInPartitionIds( @Param("pid") List thePids, @Param("partition_id") Collection thePartitionId); @@ -152,7 +153,7 @@ public interface IResourceTableDao * is an object array, where the order matters (the array represents columns returned by the query). Be careful if you change this query in any way. */ @Query( - "SELECT t.myResourceType, t.myId, t.myDeleted FROM ResourceTable t WHERE t.myId IN (:pid) AND (t.myPartitionIdValue IS NULL OR t.myPartitionIdValue IN :partition_id)") + "SELECT t.myResourceType, t.myId, t.myDeleted, t.myPartitionIdValue, t.myPartitionDateValue FROM ResourceTable t WHERE t.myId IN (:pid) AND (t.myPartitionIdValue IS NULL OR t.myPartitionIdValue IN :partition_id)") Collection findLookupFieldsByResourcePidInPartitionIdsOrNullPartition( @Param("pid") List thePids, @Param("partition_id") Collection thePartitionId); @@ -161,7 +162,7 @@ public interface IResourceTableDao * is an object array, where the order matters (the array represents columns returned by the query). Be careful if you change this query in any way. 
*/ @Query( - "SELECT t.myResourceType, t.myId, t.myDeleted FROM ResourceTable t WHERE t.myId IN (:pid) AND t.myPartitionIdValue IS NULL") + "SELECT t.myResourceType, t.myId, t.myDeleted, t.myPartitionIdValue, t.myPartitionDateValue FROM ResourceTable t WHERE t.myId IN (:pid) AND t.myPartitionIdValue IS NULL") Collection findLookupFieldsByResourcePidInPartitionNull(@Param("pid") List thePids); @Query("SELECT t.myVersion FROM ResourceTable t WHERE t.myId = :pid") diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/data/custom/IResourceTableDaoImpl.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/data/custom/IResourceTableDaoImpl.java index d05aac4b708..cd7e6d17cb7 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/data/custom/IResourceTableDaoImpl.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/data/custom/IResourceTableDaoImpl.java @@ -56,9 +56,10 @@ public class IResourceTableDaoImpl implements IForcedIdQueries { @Override public Collection findAndResolveByForcedIdWithNoType( String theResourceType, Collection theForcedIds, boolean theExcludeDeleted) { - String query = "SELECT t.myResourceType, t.myId, t.myFhirId, t.myDeleted " - + "FROM ResourceTable t " - + "WHERE t.myResourceType = :resource_type AND t.myFhirId IN ( :forced_id )"; + String query = + "SELECT t.myResourceType, t.myId, t.myFhirId, t.myDeleted, t.myPartitionIdValue, t.myPartitionDateValue " + + "FROM ResourceTable t " + + "WHERE t.myResourceType = :resource_type AND t.myFhirId IN ( :forced_id )"; if (theExcludeDeleted) { query += " AND t.myDeleted IS NULL"; @@ -82,9 +83,10 @@ public class IResourceTableDaoImpl implements IForcedIdQueries { Collection theForcedIds, Collection thePartitionId, boolean theExcludeDeleted) { - String query = "SELECT t.myResourceType, t.myId, t.myFhirId, t.myDeleted " - + "FROM ResourceTable t " - + "WHERE t.myResourceType = :resource_type AND t.myFhirId IN ( :forced_id ) AND 
t.myPartitionIdValue IN ( :partition_id )"; + String query = + "SELECT t.myResourceType, t.myId, t.myFhirId, t.myDeleted, t.myPartitionIdValue, t.myPartitionDateValue " + + "FROM ResourceTable t " + + "WHERE t.myResourceType = :resource_type AND t.myFhirId IN ( :forced_id ) AND t.myPartitionIdValue IN ( :partition_id )"; if (theExcludeDeleted) { query += " AND t.myDeleted IS NULL"; @@ -106,9 +108,11 @@ public class IResourceTableDaoImpl implements IForcedIdQueries { @Override public Collection findAndResolveByForcedIdWithNoTypeInPartitionNull( String theResourceType, Collection theForcedIds, boolean theExcludeDeleted) { - String query = "SELECT t.myResourceType, t.myId, t.myFhirId, t.myDeleted " - + "FROM ResourceTable t " - + "WHERE t.myResourceType = :resource_type AND t.myFhirId IN ( :forced_id ) AND t.myPartitionIdValue IS NULL"; + // we fetch myPartitionIdValue and myPartitionDateValue for resultSet processing consistency + String query = + "SELECT t.myResourceType, t.myId, t.myFhirId, t.myDeleted, t.myPartitionIdValue, t.myPartitionDateValue " + + "FROM ResourceTable t " + + "WHERE t.myResourceType = :resource_type AND t.myFhirId IN ( :forced_id ) AND t.myPartitionIdValue IS NULL"; if (theExcludeDeleted) { query += " AND t.myDeleted IS NULL"; @@ -132,9 +136,10 @@ public class IResourceTableDaoImpl implements IForcedIdQueries { Collection theForcedIds, List thePartitionIdsWithoutDefault, boolean theExcludeDeleted) { - String query = "SELECT t.myResourceType, t.myId, t.myFhirId, t.myDeleted " - + "FROM ResourceTable t " - + "WHERE t.myResourceType = :resource_type AND t.myFhirId IN ( :forced_id ) AND (t.myPartitionIdValue IS NULL OR t.myPartitionIdValue IN ( :partition_id ))"; + String query = + "SELECT t.myResourceType, t.myId, t.myFhirId, t.myDeleted, t.myPartitionIdValue, t.myPartitionDateValue " + + "FROM ResourceTable t " + + "WHERE t.myResourceType = :resource_type AND t.myFhirId IN ( :forced_id ) AND (t.myPartitionIdValue IS NULL OR t.myPartitionIdValue 
IN ( :partition_id ))"; if (theExcludeDeleted) { query += " AND t.myDeleted IS NULL"; diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/index/IdHelperService.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/index/IdHelperService.java index ba16be6115f..9e81e87f598 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/index/IdHelperService.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/index/IdHelperService.java @@ -30,6 +30,7 @@ import ca.uhn.fhir.jpa.model.config.PartitionSettings; import ca.uhn.fhir.jpa.model.cross.IResourceLookup; import ca.uhn.fhir.jpa.model.cross.JpaResourceLookup; import ca.uhn.fhir.jpa.model.dao.JpaPid; +import ca.uhn.fhir.jpa.model.entity.PartitionablePartitionId; import ca.uhn.fhir.jpa.model.entity.ResourceTable; import ca.uhn.fhir.jpa.search.builder.SearchBuilder; import ca.uhn.fhir.jpa.util.MemoryCacheService; @@ -59,12 +60,11 @@ import org.hl7.fhir.instance.model.api.IAnyResource; import org.hl7.fhir.instance.model.api.IBaseResource; import org.hl7.fhir.instance.model.api.IIdType; import org.hl7.fhir.r4.model.IdType; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Service; import org.springframework.transaction.support.TransactionSynchronizationManager; +import java.time.LocalDate; import java.util.ArrayList; import java.util.Collection; import java.util.Collections; @@ -100,7 +100,6 @@ import static org.apache.commons.lang3.StringUtils.isNotBlank; */ @Service public class IdHelperService implements IIdHelperService { - private static final Logger ourLog = LoggerFactory.getLogger(IdHelperService.class); public static final Predicate[] EMPTY_PREDICATE_ARRAY = new Predicate[0]; public static final String RESOURCE_PID = "RESOURCE_PID"; @@ -523,7 +522,7 @@ public class IdHelperService implements IIdHelperService { if 
(myStorageSettings.getResourceClientIdStrategy() != JpaStorageSettings.ClientIdStrategyEnum.ANY) { List pids = theId.stream() .filter(t -> isValidPid(t)) - .map(t -> t.getIdPartAsLong()) + .map(IIdType::getIdPartAsLong) .collect(Collectors.toList()); if (!pids.isEmpty()) { resolvePids(requestPartitionId, pids, retVal); @@ -578,8 +577,14 @@ public class IdHelperService implements IIdHelperService { Long resourcePid = (Long) next[1]; String forcedId = (String) next[2]; Date deletedAt = (Date) next[3]; + Integer partitionId = (Integer) next[4]; + LocalDate partitionDate = (LocalDate) next[5]; - JpaResourceLookup lookup = new JpaResourceLookup(resourceType, resourcePid, deletedAt); + JpaResourceLookup lookup = new JpaResourceLookup( + resourceType, + resourcePid, + deletedAt, + PartitionablePartitionId.with(partitionId, partitionDate)); retVal.computeIfAbsent(forcedId, id -> new ArrayList<>()).add(lookup); if (!myStorageSettings.isDeleteEnabled()) { @@ -638,7 +643,11 @@ public class IdHelperService implements IIdHelperService { } } lookup.stream() - .map(t -> new JpaResourceLookup((String) t[0], (Long) t[1], (Date) t[2])) + .map(t -> new JpaResourceLookup( + (String) t[0], + (Long) t[1], + (Date) t[2], + PartitionablePartitionId.with((Integer) t[3], (LocalDate) t[4]))) .forEach(t -> { String id = t.getPersistentId().toString(); if (!theTargets.containsKey(id)) { @@ -683,9 +692,8 @@ public class IdHelperService implements IIdHelperService { MemoryCacheService.CacheEnum.PID_TO_FORCED_ID, nextResourcePid, Optional.empty()); } Map> convertRetVal = new HashMap<>(); - retVal.forEach((k, v) -> { - convertRetVal.put(JpaPid.fromId(k), v); - }); + retVal.forEach((k, v) -> convertRetVal.put(JpaPid.fromId(k), v)); + return new PersistentIdToForcedIdMap<>(convertRetVal); } @@ -716,7 +724,8 @@ public class IdHelperService implements IIdHelperService { } if (!myStorageSettings.isDeleteEnabled()) { - JpaResourceLookup lookup = new JpaResourceLookup(theResourceType, theJpaPid.getId(), 
theDeletedAt); + JpaResourceLookup lookup = new JpaResourceLookup( + theResourceType, theJpaPid.getId(), theDeletedAt, theJpaPid.getPartitionablePartitionId()); String nextKey = theJpaPid.toString(); myMemoryCacheService.putAfterCommit(MemoryCacheService.CacheEnum.RESOURCE_LOOKUP, nextKey, lookup); } @@ -744,8 +753,7 @@ public class IdHelperService implements IIdHelperService { @Nonnull public List getPidsOrThrowException( @Nonnull RequestPartitionId theRequestPartitionId, List theIds) { - List resourcePersistentIds = resolveResourcePersistentIdsWithCache(theRequestPartitionId, theIds); - return resourcePersistentIds; + return resolveResourcePersistentIdsWithCache(theRequestPartitionId, theIds); } @Override diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/search/ExtendedHSearchSearchBuilder.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/search/ExtendedHSearchSearchBuilder.java index 5b9634b6a80..dc5cc4d70b2 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/search/ExtendedHSearchSearchBuilder.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/search/ExtendedHSearchSearchBuilder.java @@ -59,7 +59,7 @@ public class ExtendedHSearchSearchBuilder { /** * These params have complicated semantics, or are best resolved at the JPA layer for now. 
*/ - public static final Set ourUnsafeSearchParmeters = Sets.newHashSet("_id", "_meta"); + public static final Set ourUnsafeSearchParmeters = Sets.newHashSet("_id", "_meta", "_count"); /** * Determine if ExtendedHibernateSearchBuilder can support this parameter @@ -67,20 +67,22 @@ public class ExtendedHSearchSearchBuilder { * @param theActiveParamsForResourceType active search parameters for the desired resource type * @return whether or not this search parameter is supported in hibernate */ - public boolean supportsSearchParameter(String theParamName, ResourceSearchParams theActiveParamsForResourceType) { + public boolean illegalForHibernateSearch(String theParamName, ResourceSearchParams theActiveParamsForResourceType) { if (theActiveParamsForResourceType == null) { - return false; + return true; } if (ourUnsafeSearchParmeters.contains(theParamName)) { - return false; + return true; } if (!theActiveParamsForResourceType.containsParamName(theParamName)) { - return false; + return true; } - return true; + return false; } /** + * By default, do not use Hibernate Search. + * If a Search Parameter is supported by hibernate search, * Are any of the queries supported by our indexing? * - * If not, do not use hibernate, because the results will @@ -88,12 +90,12 @@ public class ExtendedHSearchSearchBuilder { */ public boolean canUseHibernateSearch( String theResourceType, SearchParameterMap myParams, ISearchParamRegistry theSearchParamRegistry) { - boolean canUseHibernate = true; + boolean canUseHibernate = false; ResourceSearchParams resourceActiveSearchParams = theSearchParamRegistry.getActiveSearchParams(theResourceType); for (String paramName : myParams.keySet()) { // is this parameter supported? - if (!supportsSearchParameter(paramName, resourceActiveSearchParams)) { + if (illegalForHibernateSearch(paramName, resourceActiveSearchParams)) { canUseHibernate = false; } else { // are the parameter values supported? 
@@ -218,7 +220,7 @@ public class ExtendedHSearchSearchBuilder { ArrayList paramNames = compileParamNames(searchParameterMap); ResourceSearchParams activeSearchParams = searchParamRegistry.getActiveSearchParams(resourceType); for (String nextParam : paramNames) { - if (!supportsSearchParameter(nextParam, activeSearchParams)) { + if (illegalForHibernateSearch(nextParam, activeSearchParams)) { // ignore magic params handled in JPA continue; } diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/migrate/tasks/HapiFhirJpaMigrationTasks.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/migrate/tasks/HapiFhirJpaMigrationTasks.java index 7579a609f69..6f0bb2b3c3d 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/migrate/tasks/HapiFhirJpaMigrationTasks.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/migrate/tasks/HapiFhirJpaMigrationTasks.java @@ -414,6 +414,7 @@ public class HapiFhirJpaMigrationTasks extends BaseMigrationTasks { version.onTable("HFJ_IDX_CMB_TOK_NU") .addIndex("20240625.10", "IDX_IDXCMBTOKNU_HASHC") .unique(false) + .online(true) .withColumns("HASH_COMPLETE", "RES_ID", "PARTITION_ID"); version.onTable("HFJ_IDX_CMP_STRING_UNIQ") .addColumn("20240625.20", "HASH_COMPLETE") @@ -470,10 +471,26 @@ public class HapiFhirJpaMigrationTasks extends BaseMigrationTasks { } } - version.onTable(Search.HFJ_SEARCH) - .modifyColumn("20240722.1", Search.SEARCH_UUID) - .nonNullable() - .withType(ColumnTypeEnum.STRING, 48); + { + // Add target resource partition id/date columns to resource link + Builder.BuilderWithTableName resourceLinkTable = version.onTable("HFJ_RES_LINK"); + + resourceLinkTable + .addColumn("20240718.10", "TARGET_RES_PARTITION_ID") + .nullable() + .type(ColumnTypeEnum.INT); + resourceLinkTable + .addColumn("20240718.20", "TARGET_RES_PARTITION_DATE") + .nullable() + .type(ColumnTypeEnum.DATE_ONLY); + } + + { + version.onTable(Search.HFJ_SEARCH) + .modifyColumn("20240722.1", Search.SEARCH_UUID) + 
.nonNullable() + .withType(ColumnTypeEnum.STRING, 48); + } { final Builder.BuilderWithTableName hfjResource = version.onTable("HFJ_RESOURCE"); diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/model/cross/JpaResourceLookup.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/model/cross/JpaResourceLookup.java index 65393d22be5..891617aa007 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/model/cross/JpaResourceLookup.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/model/cross/JpaResourceLookup.java @@ -20,18 +20,26 @@ package ca.uhn.fhir.jpa.model.cross; import ca.uhn.fhir.jpa.model.dao.JpaPid; +import ca.uhn.fhir.jpa.model.entity.PartitionablePartitionId; import java.util.Date; public class JpaResourceLookup implements IResourceLookup { + private final String myResourceType; private final Long myResourcePid; private final Date myDeletedAt; + private final PartitionablePartitionId myPartitionablePartitionId; - public JpaResourceLookup(String theResourceType, Long theResourcePid, Date theDeletedAt) { + public JpaResourceLookup( + String theResourceType, + Long theResourcePid, + Date theDeletedAt, + PartitionablePartitionId thePartitionablePartitionId) { myResourceType = theResourceType; myResourcePid = theResourcePid; myDeletedAt = theDeletedAt; + myPartitionablePartitionId = thePartitionablePartitionId; } @Override @@ -46,6 +54,9 @@ public class JpaResourceLookup implements IResourceLookup { @Override public JpaPid getPersistentId() { - return JpaPid.fromId(myResourcePid); + JpaPid jpaPid = JpaPid.fromId(myResourcePid); + jpaPid.setPartitionablePartitionId(myPartitionablePartitionId); + + return jpaPid; } } diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/provider/TerminologyUploaderProvider.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/provider/TerminologyUploaderProvider.java index eda484f7cbe..1365917162b 100644 --- 
a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/provider/TerminologyUploaderProvider.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/provider/TerminologyUploaderProvider.java @@ -475,6 +475,9 @@ public class TerminologyUploaderProvider extends BaseJpaProvider { } private static String csvEscape(String theValue) { + if (theValue == null) { + return ""; + } return '"' + theValue.replace("\"", "\"\"").replace("\n", "\\n").replace("\r", "") + '"'; } } diff --git a/hapi-fhir-jpaserver-base/src/test/java/ca/uhn/fhir/jpa/batch2/JpaJobPartitionProviderTest.java b/hapi-fhir-jpaserver-base/src/test/java/ca/uhn/fhir/jpa/batch2/JpaJobPartitionProviderTest.java deleted file mode 100644 index 8540511959c..00000000000 --- a/hapi-fhir-jpaserver-base/src/test/java/ca/uhn/fhir/jpa/batch2/JpaJobPartitionProviderTest.java +++ /dev/null @@ -1,76 +0,0 @@ -package ca.uhn.fhir.jpa.batch2; - -import ca.uhn.fhir.interceptor.model.RequestPartitionId; -import ca.uhn.fhir.jpa.entity.PartitionEntity; -import ca.uhn.fhir.jpa.partition.IPartitionLookupSvc; -import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc; -import ca.uhn.fhir.rest.api.server.SystemRequestDetails; -import ca.uhn.fhir.rest.server.provider.ProviderConstants; -import org.assertj.core.api.Assertions; -import org.junit.jupiter.api.Test; -import org.junit.jupiter.api.extension.ExtendWith; -import org.mockito.ArgumentMatchers; -import org.mockito.InjectMocks; -import org.mockito.Mock; -import org.mockito.junit.jupiter.MockitoExtension; - -import java.util.ArrayList; -import java.util.List; - -import static org.mockito.Mockito.mock; -import static org.mockito.Mockito.when; - -@ExtendWith(MockitoExtension.class) -public class JpaJobPartitionProviderTest { - @Mock - private IRequestPartitionHelperSvc myRequestPartitionHelperSvc; - @Mock - private IPartitionLookupSvc myPartitionLookupSvc; - @InjectMocks - private JpaJobPartitionProvider myJobPartitionProvider; - - @Test - public void 
getPartitions_requestSpecificPartition_returnsPartition() { - // setup - SystemRequestDetails requestDetails = new SystemRequestDetails(); - String operation = ProviderConstants.OPERATION_EXPORT; - - RequestPartitionId partitionId = RequestPartitionId.fromPartitionId(1); - when(myRequestPartitionHelperSvc.determineReadPartitionForRequestForServerOperation(ArgumentMatchers.eq(requestDetails), ArgumentMatchers.eq(operation))).thenReturn(partitionId); - - // test - List partitionIds = myJobPartitionProvider.getPartitions(requestDetails, operation); - - // verify - Assertions.assertThat(partitionIds).hasSize(1); - Assertions.assertThat(partitionIds).containsExactlyInAnyOrder(partitionId); - } - - @Test - public void getPartitions_requestAllPartitions_returnsListOfAllSpecificPartitions() { - // setup - SystemRequestDetails requestDetails = new SystemRequestDetails(); - String operation = ProviderConstants.OPERATION_EXPORT; - - when(myRequestPartitionHelperSvc.determineReadPartitionForRequestForServerOperation(ArgumentMatchers.eq(requestDetails), ArgumentMatchers.eq(operation))) - .thenReturn( RequestPartitionId.allPartitions()); - List partitionIds = List.of(RequestPartitionId.fromPartitionIds(1), RequestPartitionId.fromPartitionIds(2)); - - List partitionEntities = new ArrayList<>(); - partitionIds.forEach(partitionId -> { - PartitionEntity entity = mock(PartitionEntity.class); - when(entity.toRequestPartitionId()).thenReturn(partitionId); - partitionEntities.add(entity); - }); - when(myPartitionLookupSvc.listPartitions()).thenReturn(partitionEntities); - List expectedPartitionIds = new ArrayList<>(partitionIds); - expectedPartitionIds.add(RequestPartitionId.defaultPartition()); - - // test - List actualPartitionIds = myJobPartitionProvider.getPartitions(requestDetails, operation); - - // verify - Assertions.assertThat(actualPartitionIds).hasSize(expectedPartitionIds.size()); - 
Assertions.assertThat(actualPartitionIds).containsExactlyInAnyOrder(expectedPartitionIds.toArray(new RequestPartitionId[0])); - } -} \ No newline at end of file diff --git a/hapi-fhir-jpaserver-elastic-test-utilities/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4SearchWithElasticSearchIT.java b/hapi-fhir-jpaserver-elastic-test-utilities/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4SearchWithElasticSearchIT.java index 3ee1dd89aaf..1770bcd910c 100644 --- a/hapi-fhir-jpaserver-elastic-test-utilities/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4SearchWithElasticSearchIT.java +++ b/hapi-fhir-jpaserver-elastic-test-utilities/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4SearchWithElasticSearchIT.java @@ -23,6 +23,7 @@ import ca.uhn.fhir.jpa.model.dao.JpaPid; import ca.uhn.fhir.jpa.model.entity.NormalizedQuantitySearchLevel; import ca.uhn.fhir.jpa.model.entity.ResourceTable; import ca.uhn.fhir.jpa.model.search.StorageProcessingMessage; +import ca.uhn.fhir.jpa.rp.r4.PatientResourceProvider; import ca.uhn.fhir.jpa.search.BaseSourceSearchParameterTestCases; import ca.uhn.fhir.jpa.search.CompositeSearchParameterTestCases; import ca.uhn.fhir.jpa.search.QuantitySearchParameterTestCases; @@ -73,6 +74,7 @@ import org.hl7.fhir.r4.model.DateTimeType; import org.hl7.fhir.r4.model.DecimalType; import org.hl7.fhir.r4.model.DiagnosticReport; import org.hl7.fhir.r4.model.Encounter; +import org.hl7.fhir.r4.model.IdType; import org.hl7.fhir.r4.model.Identifier; import org.hl7.fhir.r4.model.Meta; import org.hl7.fhir.r4.model.Narrative; @@ -101,6 +103,7 @@ import org.mockito.Mockito; import org.mockito.junit.jupiter.MockitoExtension; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.beans.factory.annotation.Qualifier; +import org.springframework.mock.web.MockHttpServletRequest; import org.springframework.test.annotation.DirtiesContext; import org.springframework.test.context.ContextConfiguration; import 
org.springframework.test.context.ContextHierarchy; @@ -118,6 +121,7 @@ import java.io.IOException; import java.net.URLEncoder; import java.util.ArrayList; import java.util.Collection; +import java.util.Collections; import java.util.Date; import java.util.HashSet; import java.util.List; @@ -126,11 +130,13 @@ import java.util.stream.Collectors; import static ca.uhn.fhir.jpa.model.util.UcumServiceUtil.UCUM_CODESYSTEM_URL; import static ca.uhn.fhir.rest.api.Constants.CHARSET_UTF8; +import static ca.uhn.fhir.rest.api.Constants.HEADER_CACHE_CONTROL; import static org.assertj.core.api.Assertions.assertThat; import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertNotNull; import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.mockito.Mockito.when; @ExtendWith(SpringExtension.class) @ExtendWith(MockitoExtension.class) @@ -229,6 +235,8 @@ public class FhirResourceDaoR4SearchWithElasticSearchIT extends BaseJpaTest impl @Mock private IHSearchEventListener mySearchEventListener; @Autowired + private PatientResourceProvider myPatientRpR4; + @Autowired private ElasticsearchSvcImpl myElasticsearchSvc; @@ -954,6 +962,24 @@ public class FhirResourceDaoR4SearchWithElasticSearchIT extends BaseJpaTest impl myStorageSettings.setStoreResourceInHSearchIndex(defaultConfig.isStoreResourceInHSearchIndex()); } + @Test + public void testEverythingType() { + Patient p = new Patient(); + p.setId("my-patient"); + myPatientDao.update(p); + IBundleProvider iBundleProvider = myPatientRpR4.patientTypeEverything(new MockHttpServletRequest(), null, null, null, null, null, null, null, null, null, null, mySrd); + assertEquals(iBundleProvider.getAllResources().size(), 1); + } + + @Test + public void testEverythingInstance() { + Patient p = new Patient(); + p.setId("my-patient"); + myPatientDao.update(p); + IBundleProvider iBundleProvider = 
myPatientRpR4.patientInstanceEverything(new MockHttpServletRequest(), new IdType("Patient/my-patient"), null, null, null, null, null, null, null, null, null, mySrd); + assertEquals(iBundleProvider.getAllResources().size(), 1); + } + @Test public void testExpandWithIsAInExternalValueSet() { createExternalCsAndLocalVs(); diff --git a/hapi-fhir-jpaserver-mdm/src/main/java/ca/uhn/fhir/jpa/mdm/svc/MdmControllerSvcImpl.java b/hapi-fhir-jpaserver-mdm/src/main/java/ca/uhn/fhir/jpa/mdm/svc/MdmControllerSvcImpl.java index 0ccaf493fc9..45501abc0a2 100644 --- a/hapi-fhir-jpaserver-mdm/src/main/java/ca/uhn/fhir/jpa/mdm/svc/MdmControllerSvcImpl.java +++ b/hapi-fhir-jpaserver-mdm/src/main/java/ca/uhn/fhir/jpa/mdm/svc/MdmControllerSvcImpl.java @@ -20,6 +20,7 @@ package ca.uhn.fhir.jpa.mdm.svc; import ca.uhn.fhir.batch2.api.IJobCoordinator; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; import ca.uhn.fhir.batch2.model.JobInstanceStartRequest; import ca.uhn.fhir.context.FhirContext; import ca.uhn.fhir.interceptor.api.HookParams; @@ -360,9 +361,9 @@ public class MdmControllerSvcImpl implements IMdmControllerSvc { if (hasBatchSize) { params.setBatchSize(theBatchSize.getValue().intValue()); } - params.setRequestPartitionId(RequestPartitionId.allPartitions()); - - theUrls.forEach(params::addUrl); + RequestPartitionId partitionId = RequestPartitionId.allPartitions(); + theUrls.forEach( + url -> params.addPartitionedUrl(new PartitionedUrl().setUrl(url).setRequestPartitionId(partitionId))); JobInstanceStartRequest request = new JobInstanceStartRequest(); request.setParameters(params); diff --git a/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/dao/JpaPid.java b/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/dao/JpaPid.java index 2954dda4ab9..c2f21c82894 100644 --- a/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/dao/JpaPid.java +++ b/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/dao/JpaPid.java @@ -19,6 +19,7 @@ */ 
package ca.uhn.fhir.jpa.model.dao; +import ca.uhn.fhir.jpa.model.entity.PartitionablePartitionId; import ca.uhn.fhir.rest.api.server.storage.BaseResourcePersistentId; import java.util.ArrayList; @@ -34,6 +35,7 @@ import java.util.Set; */ public class JpaPid extends BaseResourcePersistentId { private final Long myId; + private PartitionablePartitionId myPartitionablePartitionId; private JpaPid(Long theId) { super(null); @@ -55,6 +57,15 @@ public class JpaPid extends BaseResourcePersistentId { myId = theId; } + public PartitionablePartitionId getPartitionablePartitionId() { + return myPartitionablePartitionId; + } + + public JpaPid setPartitionablePartitionId(PartitionablePartitionId thePartitionablePartitionId) { + myPartitionablePartitionId = thePartitionablePartitionId; + return this; + } + public static List toLongList(Collection thePids) { List retVal = new ArrayList<>(thePids.size()); for (JpaPid next : thePids) { diff --git a/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/BasePartitionable.java b/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/BasePartitionable.java index 533650e3e5b..8a2c2393b68 100644 --- a/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/BasePartitionable.java +++ b/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/BasePartitionable.java @@ -25,6 +25,7 @@ import jakarta.persistence.Embedded; import jakarta.persistence.MappedSuperclass; import java.io.Serializable; +import java.time.LocalDate; /** * This is the base class for entities with partitioning that does NOT include Hibernate Envers logging. 
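The `JpaPid` hunk above adds a partition id field with a fluent setter that returns `this`, which is what lets `JpaResourceLookup.getPersistentId()` build and tag the pid in one expression. A minimal sketch of that pattern (simplified, hypothetical types — a bare `Integer` stands in for `PartitionablePartitionId`):

```java
// Sketch of the fluent-setter pattern added to JpaPid.
// "Pid" and the Integer partition id are stand-ins for illustration.
class Pid {
	private final long myId;
	private Integer myPartitionId; // stands in for PartitionablePartitionId

	private Pid(long theId) {
		myId = theId;
	}

	public static Pid fromId(long theId) {
		return new Pid(theId);
	}

	public long getId() {
		return myId;
	}

	public Integer getPartitionId() {
		return myPartitionId;
	}

	// Returning `this` enables chaining: fromId(...).setPartitionId(...)
	public Pid setPartitionId(Integer thePartitionId) {
		myPartitionId = thePartitionId;
		return this;
	}
}
```

A lookup can then return `Pid.fromId(123L).setPartitionId(1)` without a temporary variable, mirroring the rewritten `getPersistentId()`.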
@@ -44,6 +45,13 @@ public abstract class BasePartitionable implements Serializable { @Column(name = PartitionablePartitionId.PARTITION_ID, insertable = false, updatable = false, nullable = true) private Integer myPartitionIdValue; + /** + * This is here to support queries only, do not set this field directly + */ + @SuppressWarnings("unused") + @Column(name = PartitionablePartitionId.PARTITION_DATE, insertable = false, updatable = false, nullable = true) + private LocalDate myPartitionDateValue; + @Nullable public PartitionablePartitionId getPartitionId() { return myPartitionId; @@ -57,6 +65,7 @@ public abstract class BasePartitionable implements Serializable { public String toString() { return "BasePartitionable{" + "myPartitionId=" + myPartitionId + ", myPartitionIdValue=" - + myPartitionIdValue + '}'; + + myPartitionIdValue + ", myPartitionDateValue=" + + myPartitionDateValue + '}'; } } diff --git a/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/PartitionablePartitionId.java b/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/PartitionablePartitionId.java index 01001d7e0fe..90609d9acaf 100644 --- a/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/PartitionablePartitionId.java +++ b/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/PartitionablePartitionId.java @@ -34,6 +34,7 @@ import java.time.LocalDate; public class PartitionablePartitionId implements Cloneable { static final String PARTITION_ID = "PARTITION_ID"; + static final String PARTITION_DATE = "PARTITION_DATE"; @Column(name = PARTITION_ID, nullable = true, insertable = true, updatable = false) private Integer myPartitionId; @@ -132,4 +133,9 @@ public class PartitionablePartitionId implements Cloneable { } return new PartitionablePartitionId(partitionId, theRequestPartitionId.getPartitionDate()); } + + public static PartitionablePartitionId with( + @Nullable Integer thePartitionId, @Nullable LocalDate thePartitionDate) { + 
return new PartitionablePartitionId(thePartitionId, thePartitionDate); + } } diff --git a/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/ResourceLink.java b/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/ResourceLink.java index 7c47c07025d..545d5bd7603 100644 --- a/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/ResourceLink.java +++ b/hapi-fhir-jpaserver-model/src/main/java/ca/uhn/fhir/jpa/model/entity/ResourceLink.java @@ -19,8 +19,9 @@ */ package ca.uhn.fhir.jpa.model.entity; -import jakarta.annotation.Nullable; +import jakarta.persistence.AttributeOverride; import jakarta.persistence.Column; +import jakarta.persistence.Embedded; import jakarta.persistence.Entity; import jakarta.persistence.FetchType; import jakarta.persistence.ForeignKey; @@ -119,6 +120,11 @@ public class ResourceLink extends BaseResourceIndex { @Transient private transient String myTargetResourceId; + @Embedded + @AttributeOverride(name = "myPartitionId", column = @Column(name = "TARGET_RES_PARTITION_ID")) + @AttributeOverride(name = "myPartitionDate", column = @Column(name = "TARGET_RES_PARTITION_DATE")) + private PartitionablePartitionId myTargetResourcePartitionId; + /** * Constructor */ @@ -188,6 +194,7 @@ public class ResourceLink extends BaseResourceIndex { myTargetResourceType = source.getTargetResourceType(); myTargetResourceVersion = source.getTargetResourceVersion(); myTargetResourceUrl = source.getTargetResourceUrl(); + myTargetResourcePartitionId = source.getTargetResourcePartitionId(); } public String getSourcePath() { @@ -270,6 +277,15 @@ public class ResourceLink extends BaseResourceIndex { myId = theId; } + public PartitionablePartitionId getTargetResourcePartitionId() { + return myTargetResourcePartitionId; + } + + public ResourceLink setTargetResourcePartitionId(PartitionablePartitionId theTargetResourcePartitionId) { + myTargetResourcePartitionId = theTargetResourcePartitionId; + return this; + } + @Override 
public void clearHashes() { // nothing right now @@ -363,23 +379,113 @@ public class ResourceLink extends BaseResourceIndex { return retVal; } - /** - * @param theTargetResourceVersion This should only be populated if the reference actually had a version - */ public static ResourceLink forLocalReference( - String theSourcePath, - ResourceTable theSourceResource, - String theTargetResourceType, - Long theTargetResourcePid, - String theTargetResourceId, - Date theUpdated, - @Nullable Long theTargetResourceVersion) { + ResourceLinkForLocalReferenceParams theResourceLinkForLocalReferenceParams) { + ResourceLink retVal = new ResourceLink(); - retVal.setSourcePath(theSourcePath); - retVal.setSourceResource(theSourceResource); - retVal.setTargetResource(theTargetResourceType, theTargetResourcePid, theTargetResourceId); - retVal.setTargetResourceVersion(theTargetResourceVersion); - retVal.setUpdated(theUpdated); + retVal.setSourcePath(theResourceLinkForLocalReferenceParams.getSourcePath()); + retVal.setSourceResource(theResourceLinkForLocalReferenceParams.getSourceResource()); + retVal.setTargetResource( + theResourceLinkForLocalReferenceParams.getTargetResourceType(), + theResourceLinkForLocalReferenceParams.getTargetResourcePid(), + theResourceLinkForLocalReferenceParams.getTargetResourceId()); + + retVal.setTargetResourcePartitionId( + theResourceLinkForLocalReferenceParams.getTargetResourcePartitionablePartitionId()); + retVal.setTargetResourceVersion(theResourceLinkForLocalReferenceParams.getTargetResourceVersion()); + retVal.setUpdated(theResourceLinkForLocalReferenceParams.getUpdated()); + return retVal; } + + public static class ResourceLinkForLocalReferenceParams { + private String mySourcePath; + private ResourceTable mySourceResource; + private String myTargetResourceType; + private Long myTargetResourcePid; + private String myTargetResourceId; + private Date myUpdated; + private Long myTargetResourceVersion; + private PartitionablePartitionId 
myTargetResourcePartitionablePartitionId; + + public static ResourceLinkForLocalReferenceParams instance() { + return new ResourceLinkForLocalReferenceParams(); + } + + public String getSourcePath() { + return mySourcePath; + } + + public ResourceLinkForLocalReferenceParams setSourcePath(String theSourcePath) { + mySourcePath = theSourcePath; + return this; + } + + public ResourceTable getSourceResource() { + return mySourceResource; + } + + public ResourceLinkForLocalReferenceParams setSourceResource(ResourceTable theSourceResource) { + mySourceResource = theSourceResource; + return this; + } + + public String getTargetResourceType() { + return myTargetResourceType; + } + + public ResourceLinkForLocalReferenceParams setTargetResourceType(String theTargetResourceType) { + myTargetResourceType = theTargetResourceType; + return this; + } + + public Long getTargetResourcePid() { + return myTargetResourcePid; + } + + public ResourceLinkForLocalReferenceParams setTargetResourcePid(Long theTargetResourcePid) { + myTargetResourcePid = theTargetResourcePid; + return this; + } + + public String getTargetResourceId() { + return myTargetResourceId; + } + + public ResourceLinkForLocalReferenceParams setTargetResourceId(String theTargetResourceId) { + myTargetResourceId = theTargetResourceId; + return this; + } + + public Date getUpdated() { + return myUpdated; + } + + public ResourceLinkForLocalReferenceParams setUpdated(Date theUpdated) { + myUpdated = theUpdated; + return this; + } + + public Long getTargetResourceVersion() { + return myTargetResourceVersion; + } + + /** + * @param theTargetResourceVersion This should only be populated if the reference actually had a version + */ + public ResourceLinkForLocalReferenceParams setTargetResourceVersion(Long theTargetResourceVersion) { + myTargetResourceVersion = theTargetResourceVersion; + return this; + } + + public PartitionablePartitionId getTargetResourcePartitionablePartitionId() { + return 
myTargetResourcePartitionablePartitionId; + } + + public ResourceLinkForLocalReferenceParams setTargetResourcePartitionablePartitionId( + PartitionablePartitionId theTargetResourcePartitionablePartitionId) { + myTargetResourcePartitionablePartitionId = theTargetResourcePartitionablePartitionId; + return this; + } + } } diff --git a/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/interceptor/model/ReadPartitionIdRequestDetails.java b/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/interceptor/model/ReadPartitionIdRequestDetails.java index 36218fd388f..5748dac592d 100644 --- a/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/interceptor/model/ReadPartitionIdRequestDetails.java +++ b/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/interceptor/model/ReadPartitionIdRequestDetails.java @@ -113,6 +113,24 @@ public class ReadPartitionIdRequestDetails extends PartitionIdRequestDetails { null, RestOperationTypeEnum.EXTENDED_OPERATION_SERVER, null, null, null, null, theOperationName); } + /** + * @since 7.4.0 + */ + public static ReadPartitionIdRequestDetails forDelete(@Nonnull String theResourceType, @Nonnull IIdType theId) { + RestOperationTypeEnum op = RestOperationTypeEnum.DELETE; + return new ReadPartitionIdRequestDetails( + theResourceType, op, theId.withResourceType(theResourceType), null, null, null, null); + } + + /** + * @since 7.4.0 + */ + public static ReadPartitionIdRequestDetails forPatch(String theResourceType, IIdType theId) { + RestOperationTypeEnum op = RestOperationTypeEnum.PATCH; + return new ReadPartitionIdRequestDetails( + theResourceType, op, theId.withResourceType(theResourceType), null, null, null, null); + } + public static ReadPartitionIdRequestDetails forRead( String theResourceType, @Nonnull IIdType theId, boolean theIsVread) { RestOperationTypeEnum op = theIsVread ? 
RestOperationTypeEnum.VREAD : RestOperationTypeEnum.READ; diff --git a/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/jpa/searchparam/MatchUrlService.java b/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/jpa/searchparam/MatchUrlService.java index 49f48f74433..3b0676acf68 100644 --- a/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/jpa/searchparam/MatchUrlService.java +++ b/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/jpa/searchparam/MatchUrlService.java @@ -95,7 +95,7 @@ public class MatchUrlService { } if (Constants.PARAM_LASTUPDATED.equals(nextParamName)) { - if (paramList != null && paramList.size() > 0) { + if (!paramList.isEmpty()) { if (paramList.size() > 2) { throw new InvalidRequestException(Msg.code(484) + "Failed to parse match URL[" + theMatchUrl + "] - Can not have more than 2 " + Constants.PARAM_LASTUPDATED @@ -111,9 +111,7 @@ public class MatchUrlService { myFhirContext, RestSearchParameterTypeEnum.HAS, nextParamName, paramList); paramMap.add(nextParamName, param); } else if (Constants.PARAM_COUNT.equals(nextParamName)) { - if (paramList != null - && paramList.size() > 0 - && paramList.get(0).size() > 0) { + if (!paramList.isEmpty() && !paramList.get(0).isEmpty()) { String intString = paramList.get(0).get(0); try { paramMap.setCount(Integer.parseInt(intString)); @@ -123,16 +121,14 @@ public class MatchUrlService { } } } else if (Constants.PARAM_SEARCH_TOTAL_MODE.equals(nextParamName)) { - if (paramList != null - && !paramList.isEmpty() - && !paramList.get(0).isEmpty()) { + if (!paramList.isEmpty() && !paramList.get(0).isEmpty()) { String totalModeEnumStr = paramList.get(0).get(0); SearchTotalModeEnum searchTotalMode = SearchTotalModeEnum.fromCode(totalModeEnumStr); if (searchTotalMode == null) { // We had an oops here supporting the UPPER CASE enum instead of the FHIR code for _total. // Keep supporting it in case someone is using it. 
try { - paramMap.setSearchTotalMode(SearchTotalModeEnum.valueOf(totalModeEnumStr)); + searchTotalMode = SearchTotalModeEnum.valueOf(totalModeEnumStr); } catch (IllegalArgumentException e) { throw new InvalidRequestException(Msg.code(2078) + "Invalid " + Constants.PARAM_SEARCH_TOTAL_MODE + " value: " + totalModeEnumStr); @@ -141,9 +137,7 @@ public class MatchUrlService { paramMap.setSearchTotalMode(searchTotalMode); } } else if (Constants.PARAM_OFFSET.equals(nextParamName)) { - if (paramList != null - && paramList.size() > 0 - && paramList.get(0).size() > 0) { + if (!paramList.isEmpty() && !paramList.get(0).isEmpty()) { String intString = paramList.get(0).get(0); try { paramMap.setOffset(Integer.parseInt(intString)); @@ -238,40 +232,27 @@ public class MatchUrlService { return getResourceSearch(theUrl, null); } - public abstract static class Flag { - - /** - * Constructor - */ - Flag() { - // nothing - } - - abstract void process( - String theParamName, List theValues, SearchParameterMap theMapToPopulate); + public interface Flag { + void process(String theParamName, List theValues, SearchParameterMap theMapToPopulate); } /** * Indicates that the parser should process _include and _revinclude (by default these are not handled) */ public static Flag processIncludes() { - return new Flag() { - - @Override - void process(String theParamName, List theValues, SearchParameterMap theMapToPopulate) { - if (Constants.PARAM_INCLUDE.equals(theParamName)) { - for (QualifiedParamList nextQualifiedList : theValues) { - for (String nextValue : nextQualifiedList) { - theMapToPopulate.addInclude(new Include( - nextValue, ParameterUtil.isIncludeIterate(nextQualifiedList.getQualifier()))); - } + return (theParamName, theValues, theMapToPopulate) -> { + if (Constants.PARAM_INCLUDE.equals(theParamName)) { + for (QualifiedParamList nextQualifiedList : theValues) { + for (String nextValue : nextQualifiedList) { + theMapToPopulate.addInclude(new Include( + nextValue, 
ParameterUtil.isIncludeIterate(nextQualifiedList.getQualifier()))); } - } else if (Constants.PARAM_REVINCLUDE.equals(theParamName)) { - for (QualifiedParamList nextQualifiedList : theValues) { - for (String nextValue : nextQualifiedList) { - theMapToPopulate.addRevInclude(new Include( - nextValue, ParameterUtil.isIncludeIterate(nextQualifiedList.getQualifier()))); - } + } + } else if (Constants.PARAM_REVINCLUDE.equals(theParamName)) { + for (QualifiedParamList nextQualifiedList : theValues) { + for (String nextValue : nextQualifiedList) { + theMapToPopulate.addRevInclude(new Include( + nextValue, ParameterUtil.isIncludeIterate(nextQualifiedList.getQualifier()))); } } } diff --git a/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/jpa/searchparam/extractor/SearchParamExtractorService.java b/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/jpa/searchparam/extractor/SearchParamExtractorService.java index 51c2d79fea9..354996c66f1 100644 --- a/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/jpa/searchparam/extractor/SearchParamExtractorService.java +++ b/hapi-fhir-jpaserver-searchparam/src/main/java/ca/uhn/fhir/jpa/searchparam/extractor/SearchParamExtractorService.java @@ -37,6 +37,7 @@ import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboStringUnique; import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboTokenNonUnique; import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamString; import ca.uhn.fhir.jpa.model.entity.ResourceLink; +import ca.uhn.fhir.jpa.model.entity.ResourceLink.ResourceLinkForLocalReferenceParams; import ca.uhn.fhir.jpa.model.entity.ResourceTable; import ca.uhn.fhir.jpa.model.entity.SearchParamPresentEntity; import ca.uhn.fhir.jpa.model.entity.StorageSettings; @@ -71,6 +72,8 @@ import java.util.Optional; import java.util.Set; import java.util.stream.Collectors; +import static ca.uhn.fhir.jpa.model.config.PartitionSettings.CrossPartitionReferenceMode.ALLOWED_UNQUALIFIED; +import static 
ca.uhn.fhir.jpa.model.entity.ResourceLink.forLocalReference; import static org.apache.commons.lang3.StringUtils.isBlank; import static org.apache.commons.lang3.StringUtils.isNotBlank; @@ -105,26 +108,6 @@ public class SearchParamExtractorService { mySearchParamExtractor = theSearchParamExtractor; } - public void extractFromResource( - RequestPartitionId theRequestPartitionId, - RequestDetails theRequestDetails, - ResourceIndexedSearchParams theParams, - ResourceTable theEntity, - IBaseResource theResource, - TransactionDetails theTransactionDetails, - boolean theFailOnInvalidReference) { - extractFromResource( - theRequestPartitionId, - theRequestDetails, - theParams, - ResourceIndexedSearchParams.withSets(), - theEntity, - theResource, - theTransactionDetails, - theFailOnInvalidReference, - ISearchParamExtractor.ALL_PARAMS); - } - /** * This method is responsible for scanning a resource for all of the search parameter instances. * I.e. for all search parameters defined for @@ -702,14 +685,18 @@ public class SearchParamExtractorService { * need to resolve it again */ myResourceLinkResolver.validateTypeOrThrowException(type); - resourceLink = ResourceLink.forLocalReference( - thePathAndRef.getPath(), - theEntity, - typeString, - resolvedTargetId.getId(), - targetId, - transactionDate, - targetVersionId); + + ResourceLinkForLocalReferenceParams params = ResourceLinkForLocalReferenceParams.instance() + .setSourcePath(thePathAndRef.getPath()) + .setSourceResource(theEntity) + .setTargetResourceType(typeString) + .setTargetResourcePid(resolvedTargetId.getId()) + .setTargetResourceId(targetId) + .setUpdated(transactionDate) + .setTargetResourceVersion(targetVersionId) + .setTargetResourcePartitionablePartitionId(resolvedTargetId.getPartitionablePartitionId()); + + resourceLink = forLocalReference(params); } else if (theFailOnInvalidReference) { @@ -748,6 +735,7 @@ public class SearchParamExtractorService { } else { // Cache the outcome in the current transaction in case 
there are more references JpaPid persistentId = JpaPid.fromId(resourceLink.getTargetResourcePid()); + persistentId.setPartitionablePartitionId(resourceLink.getTargetResourcePartitionId()); theTransactionDetails.addResolvedResourceId(referenceElement, persistentId); } @@ -757,11 +745,15 @@ public class SearchParamExtractorService { * Just assume the reference is valid. This is used for in-memory matching since there * is no expectation of a database in this situation */ - ResourceTable target; - target = new ResourceTable(); - target.setResourceType(typeString); - resourceLink = ResourceLink.forLocalReference( - thePathAndRef.getPath(), theEntity, typeString, null, targetId, transactionDate, targetVersionId); + ResourceLinkForLocalReferenceParams params = ResourceLinkForLocalReferenceParams.instance() + .setSourcePath(thePathAndRef.getPath()) + .setSourceResource(theEntity) + .setTargetResourceType(typeString) + .setTargetResourceId(targetId) + .setUpdated(transactionDate) + .setTargetResourceVersion(targetVersionId); + + resourceLink = forLocalReference(params); } theNewParams.myLinks.add(resourceLink); @@ -912,19 +904,24 @@ public class SearchParamExtractorService { RequestDetails theRequest, TransactionDetails theTransactionDetails) { JpaPid resolvedResourceId = (JpaPid) theTransactionDetails.getResolvedResourceId(theNextId); + if (resolvedResourceId != null) { String targetResourceType = theNextId.getResourceType(); Long targetResourcePid = resolvedResourceId.getId(); String targetResourceIdPart = theNextId.getIdPart(); Long targetVersion = theNextId.getVersionIdPartAsLong(); - return ResourceLink.forLocalReference( - thePathAndRef.getPath(), - theEntity, - targetResourceType, - targetResourcePid, - targetResourceIdPart, - theUpdateTime, - targetVersion); + + ResourceLinkForLocalReferenceParams params = ResourceLinkForLocalReferenceParams.instance() + .setSourcePath(thePathAndRef.getPath()) + .setSourceResource(theEntity) + 
.setTargetResourceType(targetResourceType) + .setTargetResourcePid(targetResourcePid) + .setTargetResourceId(targetResourceIdPart) + .setUpdated(theUpdateTime) + .setTargetResourceVersion(targetVersion) + .setTargetResourcePartitionablePartitionId(resolvedResourceId.getPartitionablePartitionId()); + + return ResourceLink.forLocalReference(params); } /* @@ -936,8 +933,7 @@ public class SearchParamExtractorService { IResourceLookup targetResource; if (myPartitionSettings.isPartitioningEnabled()) { - if (myPartitionSettings.getAllowReferencesAcrossPartitions() - == PartitionSettings.CrossPartitionReferenceMode.ALLOWED_UNQUALIFIED) { + if (myPartitionSettings.getAllowReferencesAcrossPartitions() == ALLOWED_UNQUALIFIED) { // Interceptor: Pointcut.JPA_CROSS_PARTITION_REFERENCE_DETECTED if (CompositeInterceptorBroadcaster.hasHooks( @@ -981,21 +977,25 @@ public class SearchParamExtractorService { Long targetResourcePid = targetResource.getPersistentId().getId(); String targetResourceIdPart = theNextId.getIdPart(); Long targetVersion = theNextId.getVersionIdPartAsLong(); - return ResourceLink.forLocalReference( - thePathAndRef.getPath(), - theEntity, - targetResourceType, - targetResourcePid, - targetResourceIdPart, - theUpdateTime, - targetVersion); + + ResourceLinkForLocalReferenceParams params = ResourceLinkForLocalReferenceParams.instance() + .setSourcePath(thePathAndRef.getPath()) + .setSourceResource(theEntity) + .setTargetResourceType(targetResourceType) + .setTargetResourcePid(targetResourcePid) + .setTargetResourceId(targetResourceIdPart) + .setUpdated(theUpdateTime) + .setTargetResourceVersion(targetVersion) + .setTargetResourcePartitionablePartitionId( + targetResource.getPersistentId().getPartitionablePartitionId()); + + return forLocalReference(params); } private RequestPartitionId determineResolverPartitionId(@Nonnull RequestPartitionId theRequestPartitionId) { RequestPartitionId targetRequestPartitionId = theRequestPartitionId; if 
(myPartitionSettings.isPartitioningEnabled() - && myPartitionSettings.getAllowReferencesAcrossPartitions() - == PartitionSettings.CrossPartitionReferenceMode.ALLOWED_UNQUALIFIED) { + && myPartitionSettings.getAllowReferencesAcrossPartitions() == ALLOWED_UNQUALIFIED) { targetRequestPartitionId = RequestPartitionId.allPartitions(); } return targetRequestPartitionId; diff --git a/hapi-fhir-jpaserver-searchparam/src/test/java/ca/uhn/fhir/jpa/searchparam/extractor/ResourceIndexedSearchParamsTest.java b/hapi-fhir-jpaserver-searchparam/src/test/java/ca/uhn/fhir/jpa/searchparam/extractor/ResourceIndexedSearchParamsTest.java index a1ad93d1f2d..f1f61388f5e 100644 --- a/hapi-fhir-jpaserver-searchparam/src/test/java/ca/uhn/fhir/jpa/searchparam/extractor/ResourceIndexedSearchParamsTest.java +++ b/hapi-fhir-jpaserver-searchparam/src/test/java/ca/uhn/fhir/jpa/searchparam/extractor/ResourceIndexedSearchParamsTest.java @@ -37,7 +37,7 @@ public class ResourceIndexedSearchParamsTest { @Test public void matchResourceLinksStringCompareToLong() { - ResourceLink link = ResourceLink.forLocalReference("organization", mySource, "Organization", 123L, LONG_ID, new Date(), null); + ResourceLink link = getResourceLinkForLocalReference(LONG_ID); myParams.getResourceLinks().add(link); ReferenceParam referenceParam = getReferenceParam(STRING_ID); @@ -47,7 +47,7 @@ public class ResourceIndexedSearchParamsTest { @Test public void matchResourceLinksStringCompareToString() { - ResourceLink link = ResourceLink.forLocalReference("organization", mySource, "Organization", 123L, STRING_ID, new Date(), null); + ResourceLink link = getResourceLinkForLocalReference(STRING_ID); myParams.getResourceLinks().add(link); ReferenceParam referenceParam = getReferenceParam(STRING_ID); @@ -57,7 +57,7 @@ public class ResourceIndexedSearchParamsTest { @Test public void matchResourceLinksLongCompareToString() { - ResourceLink link = ResourceLink.forLocalReference("organization", mySource, "Organization", 123L, STRING_ID, 
new Date(), null); + ResourceLink link = getResourceLinkForLocalReference(STRING_ID); myParams.getResourceLinks().add(link); ReferenceParam referenceParam = getReferenceParam(LONG_ID); @@ -67,7 +67,7 @@ public class ResourceIndexedSearchParamsTest { @Test public void matchResourceLinksLongCompareToLong() { - ResourceLink link = ResourceLink.forLocalReference("organization", mySource, "Organization", 123L, LONG_ID, new Date(), null); + ResourceLink link = getResourceLinkForLocalReference(LONG_ID); myParams.getResourceLinks().add(link); ReferenceParam referenceParam = getReferenceParam(LONG_ID); @@ -75,6 +75,21 @@ public class ResourceIndexedSearchParamsTest { assertTrue(result); } + private ResourceLink getResourceLinkForLocalReference(String theTargetResourceId){ + + ResourceLink.ResourceLinkForLocalReferenceParams params = ResourceLink.ResourceLinkForLocalReferenceParams + .instance() + .setSourcePath("organization") + .setSourceResource(mySource) + .setTargetResourceType("Organization") + .setTargetResourcePid(123L) + .setTargetResourceId(theTargetResourceId) + .setUpdated(new Date()); + + return ResourceLink.forLocalReference(params); + + } + private ReferenceParam getReferenceParam(String theId) { ReferenceParam retVal = new ReferenceParam(); retVal.setValue(theId); diff --git a/hapi-fhir-jpaserver-searchparam/src/test/java/ca/uhn/fhir/jpa/searchparam/registry/FhirContextSearchParamRegistryTest.java b/hapi-fhir-jpaserver-searchparam/src/test/java/ca/uhn/fhir/jpa/searchparam/registry/FhirContextSearchParamRegistryTest.java new file mode 100644 index 00000000000..d51b344ccd0 --- /dev/null +++ b/hapi-fhir-jpaserver-searchparam/src/test/java/ca/uhn/fhir/jpa/searchparam/registry/FhirContextSearchParamRegistryTest.java @@ -0,0 +1,38 @@ +package ca.uhn.fhir.jpa.searchparam.registry; + +import ca.uhn.fhir.context.FhirContext; +import ca.uhn.fhir.context.RuntimeSearchParam; +import ca.uhn.fhir.rest.server.util.FhirContextSearchParamRegistry; +import 
org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.CsvSource; + +import static org.assertj.core.api.Assertions.assertThat; +import static org.hl7.fhir.instance.model.api.IAnyResource.SP_RES_ID; +import static org.hl7.fhir.instance.model.api.IAnyResource.SP_RES_LAST_UPDATED; +import static org.hl7.fhir.instance.model.api.IAnyResource.SP_RES_PROFILE; +import static org.hl7.fhir.instance.model.api.IAnyResource.SP_RES_SECURITY; +import static org.hl7.fhir.instance.model.api.IAnyResource.SP_RES_TAG; + +class FhirContextSearchParamRegistryTest { + + private static final FhirContext ourFhirContext = FhirContext.forR4(); + + FhirContextSearchParamRegistry mySearchParamRegistry = new FhirContextSearchParamRegistry(ourFhirContext); + + @ParameterizedTest + @CsvSource({ + SP_RES_ID + ", Resource.id", + SP_RES_LAST_UPDATED + ", Resource.meta.lastUpdated", + SP_RES_TAG + ", Resource.meta.tag", + SP_RES_PROFILE + ", Resource.meta.profile", + SP_RES_SECURITY + ", Resource.meta.security" + }) + void testResourceLevelSearchParamsAreRegistered(String theSearchParamName, String theSearchParamPath) { + RuntimeSearchParam sp = mySearchParamRegistry.getActiveSearchParam("Patient", theSearchParamName); + + assertThat(sp) + .as("path is null for search parameter: '%s'", theSearchParamName) + .isNotNull().extracting("path").isEqualTo(theSearchParamPath); + } + +} diff --git a/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/match/matcher/subscriber/SubscriptionMatchingSubscriber.java b/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/match/matcher/subscriber/SubscriptionMatchingSubscriber.java index fef9581af10..83f49aca4f7 100644 --- a/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/match/matcher/subscriber/SubscriptionMatchingSubscriber.java +++ 
b/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/match/matcher/subscriber/SubscriptionMatchingSubscriber.java @@ -151,15 +151,17 @@ public class SubscriptionMatchingSubscriber implements MessageHandler { */ private boolean processSubscription( ResourceModifiedMessage theMsg, IIdType theResourceId, ActiveSubscription theActiveSubscription) { - // skip if the partitions don't match + CanonicalSubscription subscription = theActiveSubscription.getSubscription(); + if (subscription != null && theMsg.getPartitionId() != null && theMsg.getPartitionId().hasPartitionIds() - && !subscription.getCrossPartitionEnabled() + && !subscription.isCrossPartitionEnabled() && !theMsg.getPartitionId().hasPartitionId(subscription.getRequestPartitionId())) { return false; } + String nextSubscriptionId = theActiveSubscription.getId(); if (isNotBlank(theMsg.getSubscriptionId())) { diff --git a/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/submit/interceptor/SubscriptionValidatingInterceptor.java b/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/submit/interceptor/SubscriptionValidatingInterceptor.java index 359ca463aa8..0e274b5fa84 100644 --- a/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/submit/interceptor/SubscriptionValidatingInterceptor.java +++ b/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/submit/interceptor/SubscriptionValidatingInterceptor.java @@ -241,7 +241,7 @@ public class SubscriptionValidatingInterceptor { RequestPartitionId theRequestPartitionId, Pointcut thePointcut) { // If the subscription has the cross partition tag - if (SubscriptionUtil.isCrossPartition(theSubscription) + if (SubscriptionUtil.isDefinedAsCrossPartitionSubcription(theSubscription) && !(theRequestDetails instanceof SystemRequestDetails)) { if (!mySubscriptionSettings.isCrossPartitionSubscriptionEnabled()) { throw new UnprocessableEntityException( diff --git 
a/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/triggering/SubscriptionTriggeringSvcImpl.java b/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/triggering/SubscriptionTriggeringSvcImpl.java index 5ffb68dd9fc..2a712297a4d 100644 --- a/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/triggering/SubscriptionTriggeringSvcImpl.java +++ b/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/triggering/SubscriptionTriggeringSvcImpl.java @@ -151,7 +151,7 @@ public class SubscriptionTriggeringSvcImpl implements ISubscriptionTriggeringSvc if (theSubscriptionId != null) { IFhirResourceDao subscriptionDao = myDaoRegistry.getSubscriptionDao(); IBaseResource subscription = subscriptionDao.read(theSubscriptionId, theRequestDetails); - if (mySubscriptionCanonicalizer.canonicalize(subscription).getCrossPartitionEnabled()) { + if (mySubscriptionCanonicalizer.canonicalize(subscription).isCrossPartitionEnabled()) { requestPartitionId = RequestPartitionId.allPartitions(); } else { // Otherwise, trust the partition passed in via tenant/interceptor. 
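The `processSubscription` change above reworks the partition guard: a matched message is skipped when it carries concrete partition ids, the subscription is not cross-partition enabled, and the subscription's own partition is not among the message's partitions. A minimal self-contained sketch of that predicate (types here are illustrative stand-ins; the real code works with `RequestPartitionId` and `CanonicalSubscription`):

```java
import java.util.Set;

// Simplified stand-in for the partition check in
// SubscriptionMatchingSubscriber.processSubscription. An empty set of
// message partition ids means the message carries no partition info,
// in which case it is never skipped on partition grounds.
public class PartitionSkipCheck {

    public static boolean shouldSkip(
            Set<Integer> messagePartitionIds,
            boolean subscriptionIsCrossPartition,
            Integer subscriptionPartitionId) {
        return !messagePartitionIds.isEmpty()
                && !subscriptionIsCrossPartition
                && !messagePartitionIds.contains(subscriptionPartitionId);
    }
}
```

Under this sketch, `shouldSkip(Set.of(1, 2), false, 3)` is the only skipping case: matching partitions, cross-partition subscriptions, and partition-less messages all proceed to matching.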
diff --git a/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/util/SubscriptionUtil.java b/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/util/SubscriptionUtil.java index 59a39655061..147fe098b30 100644 --- a/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/util/SubscriptionUtil.java +++ b/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/subscription/util/SubscriptionUtil.java @@ -34,7 +34,7 @@ public class SubscriptionUtil { RequestPartitionId requestPartitionId = new PartitionablePartitionId(theSubscription.getRequestPartitionId(), null).toPartitionId(); - if (theSubscription.getCrossPartitionEnabled()) { + if (theSubscription.isCrossPartitionEnabled()) { requestPartitionId = RequestPartitionId.allPartitions(); } diff --git a/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/topic/SubscriptionTopicConfig.java b/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/topic/SubscriptionTopicConfig.java index 3b5225246c0..eb14b9af406 100644 --- a/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/topic/SubscriptionTopicConfig.java +++ b/hapi-fhir-jpaserver-subscription/src/main/java/ca/uhn/fhir/jpa/topic/SubscriptionTopicConfig.java @@ -34,7 +34,7 @@ import org.springframework.context.annotation.Lazy; @Import(SubscriptionConfig.class) public class SubscriptionTopicConfig { @Bean - SubscriptionTopicMatchingSubscriber subscriptionTopicMatchingSubscriber( + public SubscriptionTopicMatchingSubscriber subscriptionTopicMatchingSubscriber( FhirContext theFhirContext, MemoryCacheService memoryCacheService) { switch (theFhirContext.getVersion().getVersion()) { case R5: @@ -47,19 +47,19 @@ public class SubscriptionTopicConfig { @Bean @Lazy - SubscriptionTopicRegistry subscriptionTopicRegistry() { + public SubscriptionTopicRegistry subscriptionTopicRegistry() { return new SubscriptionTopicRegistry(); } @Bean @Lazy - SubscriptionTopicSupport 
subscriptionTopicSupport( + public SubscriptionTopicSupport subscriptionTopicSupport( FhirContext theFhirContext, DaoRegistry theDaoRegistry, SearchParamMatcher theSearchParamMatcher) { return new SubscriptionTopicSupport(theFhirContext, theDaoRegistry, theSearchParamMatcher); } @Bean - SubscriptionTopicLoader subscriptionTopicLoader(FhirContext theFhirContext) { + public SubscriptionTopicLoader subscriptionTopicLoader(FhirContext theFhirContext) { switch (theFhirContext.getVersion().getVersion()) { case R5: case R4B: @@ -70,7 +70,7 @@ public class SubscriptionTopicConfig { } @Bean - SubscriptionTopicRegisteringSubscriber subscriptionTopicRegisteringSubscriber(FhirContext theFhirContext) { + public SubscriptionTopicRegisteringSubscriber subscriptionTopicRegisteringSubscriber(FhirContext theFhirContext) { switch (theFhirContext.getVersion().getVersion()) { case R5: case R4B: @@ -81,7 +81,7 @@ public class SubscriptionTopicConfig { } @Bean - SubscriptionTopicValidatingInterceptor subscriptionTopicValidatingInterceptor( + public SubscriptionTopicValidatingInterceptor subscriptionTopicValidatingInterceptor( FhirContext theFhirContext, SubscriptionQueryValidator theSubscriptionQueryValidator) { switch (theFhirContext.getVersion().getVersion()) { case R5: diff --git a/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/match/registry/SubscriptionCanonicalizerTest.java b/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/match/registry/SubscriptionCanonicalizerTest.java index 4100a79d448..803087b4066 100644 --- a/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/match/registry/SubscriptionCanonicalizerTest.java +++ b/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/match/registry/SubscriptionCanonicalizerTest.java @@ -1,43 +1,37 @@ package ca.uhn.fhir.jpa.subscription.match.registry; import ca.uhn.fhir.context.FhirContext; +import 
ca.uhn.fhir.interceptor.model.RequestPartitionId; +import ca.uhn.fhir.jpa.model.config.SubscriptionSettings; import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscription; import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscriptionChannelType; import ca.uhn.fhir.jpa.subscription.model.CanonicalTopicSubscriptionFilter; -import ca.uhn.fhir.jpa.model.config.SubscriptionSettings; import ca.uhn.fhir.model.api.ExtensionDt; -import ca.uhn.fhir.model.api.IFhirVersion; -import ca.uhn.fhir.model.dstu2.FhirDstu2; import ca.uhn.fhir.model.primitive.BooleanDt; import ca.uhn.fhir.subscription.SubscriptionConstants; import ca.uhn.fhir.subscription.SubscriptionTestDataHelper; +import ca.uhn.fhir.util.HapiExtensions; import jakarta.annotation.Nonnull; -import org.hl7.fhir.dstu3.hapi.ctx.FhirDstu3; -import org.hl7.fhir.instance.model.api.IBaseResource; -import org.hl7.fhir.r4.hapi.ctx.FhirR4; import org.hl7.fhir.r4.model.BooleanType; import org.hl7.fhir.r4.model.Extension; import org.hl7.fhir.r4.model.Subscription; -import org.hl7.fhir.r4b.hapi.ctx.FhirR4B; -import org.hl7.fhir.r5.hapi.ctx.FhirR5; import org.hl7.fhir.r5.model.Coding; import org.hl7.fhir.r5.model.Enumerations; import org.junit.jupiter.api.Test; import org.junit.jupiter.params.ParameterizedTest; -import org.junit.jupiter.params.provider.Arguments; import org.junit.jupiter.params.provider.MethodSource; import org.junit.jupiter.params.provider.ValueSource; import java.util.stream.Stream; import static ca.uhn.fhir.rest.api.Constants.CT_FHIR_JSON_NEW; +import static ca.uhn.fhir.rest.api.Constants.RESOURCE_PARTITION_ID; import static ca.uhn.fhir.util.HapiExtensions.EX_SEND_DELETE_MESSAGES; import static org.assertj.core.api.AssertionsForInterfaceTypes.assertThat; import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertNull; import static org.junit.jupiter.api.Assertions.assertTrue; -import static 
org.junit.jupiter.params.provider.Arguments.arguments; class SubscriptionCanonicalizerTest { @@ -172,44 +166,158 @@ class SubscriptionCanonicalizerTest { verifyChannelParameters(canonical, thePayloadContent); } - - private static Stream crossPartitionParams() { - return Stream.of( - arguments(true, FhirContext.forDstu2Cached()), - arguments(false, FhirContext.forDstu2Cached()), - arguments(true, FhirContext.forDstu3Cached()), - arguments(false, FhirContext.forDstu3Cached()), - arguments(true, FhirContext.forR4Cached()), - arguments(false, FhirContext.forR4Cached()), - arguments(true, FhirContext.forR4BCached()), - arguments(false, FhirContext.forR4BCached()), - arguments(true, FhirContext.forR5Cached()), - arguments(false, FhirContext.forR5Cached()) - ); + private static Stream crossPartitionParams() { + return Stream.of(null, RequestPartitionId.fromPartitionId(1), RequestPartitionId.defaultPartition()) ; } + @ParameterizedTest @MethodSource("crossPartitionParams") - void testCrossPartition(boolean theCrossPartitionSubscriptionEnabled, FhirContext theFhirContext) { - final IFhirVersion version = theFhirContext.getVersion(); - IBaseResource subscription = null; - if (version instanceof FhirDstu2){ - subscription = new ca.uhn.fhir.model.dstu2.resource.Subscription(); - } else if (version instanceof FhirDstu3){ - subscription = new org.hl7.fhir.dstu3.model.Subscription(); - } else if (version instanceof FhirR4){ - subscription = new Subscription(); - } else if (version instanceof FhirR4B){ - subscription = new org.hl7.fhir.r4b.model.Subscription(); - } else if (version instanceof FhirR5){ - subscription = new org.hl7.fhir.r5.model.Subscription(); + void testSubscriptionCrossPartitionEnableProperty_forDstu2WithExtensionAndPartitions(RequestPartitionId theRequestPartitionId) { + final SubscriptionSettings subscriptionSettings = new SubscriptionSettings(); + subscriptionSettings.setCrossPartitionSubscriptionEnabled(true); + final SubscriptionCanonicalizer 
subscriptionCanonicalizer = new SubscriptionCanonicalizer(FhirContext.forDstu2(), subscriptionSettings); + + ca.uhn.fhir.model.dstu2.resource.Subscription subscriptionWithoutExtension = new ca.uhn.fhir.model.dstu2.resource.Subscription(); + subscriptionWithoutExtension.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + + final CanonicalSubscription canonicalSubscriptionWithoutExtension = subscriptionCanonicalizer.canonicalize(subscriptionWithoutExtension); + + assertThat(canonicalSubscriptionWithoutExtension.isCrossPartitionEnabled()).isFalse(); + } + + @ParameterizedTest + @MethodSource("crossPartitionParams") + void testSubscriptionCrossPartitionEnableProperty_forDstu3WithExtensionAndPartitions(RequestPartitionId theRequestPartitionId) { + final SubscriptionSettings subscriptionSettings = new SubscriptionSettings(); + subscriptionSettings.setCrossPartitionSubscriptionEnabled(true); + final SubscriptionCanonicalizer subscriptionCanonicalizer = new SubscriptionCanonicalizer(FhirContext.forDstu3(), subscriptionSettings); + + org.hl7.fhir.dstu3.model.Subscription subscriptionWithoutExtension = new org.hl7.fhir.dstu3.model.Subscription(); + subscriptionWithoutExtension.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + + org.hl7.fhir.dstu3.model.Subscription subscriptionWithExtensionCrossPartitionTrue = new org.hl7.fhir.dstu3.model.Subscription(); + subscriptionWithExtensionCrossPartitionTrue.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + subscriptionWithExtensionCrossPartitionTrue.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.dstu3.model.BooleanType().setValue(true)); + + org.hl7.fhir.dstu3.model.Subscription subscriptionWithExtensionCrossPartitionFalse = new org.hl7.fhir.dstu3.model.Subscription(); + subscriptionWithExtensionCrossPartitionFalse.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + 
subscriptionWithExtensionCrossPartitionFalse.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.dstu3.model.BooleanType().setValue(false)); + + final CanonicalSubscription canonicalSubscriptionWithoutExtension = subscriptionCanonicalizer.canonicalize(subscriptionWithoutExtension); + final CanonicalSubscription canonicalSubscriptionWithExtensionCrossPartitionTrue = subscriptionCanonicalizer.canonicalize(subscriptionWithExtensionCrossPartitionTrue); + final CanonicalSubscription canonicalSubscriptionWithExtensionCrossPartitionFalse = subscriptionCanonicalizer.canonicalize(subscriptionWithExtensionCrossPartitionFalse); + + if(RequestPartitionId.isDefaultPartition(theRequestPartitionId)){ + assertThat(canonicalSubscriptionWithoutExtension.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionTrue.isCrossPartitionEnabled()).isTrue(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionFalse.isCrossPartitionEnabled()).isFalse(); + } else { + assertThat(canonicalSubscriptionWithoutExtension.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionTrue.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionFalse.isCrossPartitionEnabled()).isFalse(); + } + } + + @ParameterizedTest + @MethodSource("crossPartitionParams") + void testSubscriptionCrossPartitionEnableProperty_forR4WithExtensionAndPartitions(RequestPartitionId theRequestPartitionId) { + final SubscriptionSettings subscriptionSettings = new SubscriptionSettings(); + subscriptionSettings.setCrossPartitionSubscriptionEnabled(true); + + final SubscriptionCanonicalizer subscriptionCanonicalizer = new SubscriptionCanonicalizer(FhirContext.forR4Cached(), subscriptionSettings); + + Subscription subscriptionWithoutExtension = new Subscription(); + subscriptionWithoutExtension.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + + Subscription 
subscriptionWithExtensionCrossPartitionTrue = new Subscription(); + subscriptionWithExtensionCrossPartitionTrue.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + subscriptionWithExtensionCrossPartitionTrue.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.r4.model.BooleanType().setValue(true)); + + Subscription subscriptionWithExtensionCrossPartitionFalse = new Subscription(); + subscriptionWithExtensionCrossPartitionFalse.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + subscriptionWithExtensionCrossPartitionFalse.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.r4.model.BooleanType().setValue(false)); + + + final CanonicalSubscription canonicalSubscriptionWithoutExtension = subscriptionCanonicalizer.canonicalize(subscriptionWithoutExtension); + final CanonicalSubscription canonicalSubscriptionWithExtensionCrossPartitionTrue = subscriptionCanonicalizer.canonicalize(subscriptionWithExtensionCrossPartitionTrue); + final CanonicalSubscription canonicalSubscriptionWithExtensionCrossPartitionFalse = subscriptionCanonicalizer.canonicalize(subscriptionWithExtensionCrossPartitionFalse); + + if(RequestPartitionId.isDefaultPartition(theRequestPartitionId)){ + assertThat(canonicalSubscriptionWithoutExtension.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionTrue.isCrossPartitionEnabled()).isTrue(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionFalse.isCrossPartitionEnabled()).isFalse(); + } else { + assertThat(canonicalSubscriptionWithoutExtension.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionTrue.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionFalse.isCrossPartitionEnabled()).isFalse(); } - final SubscriptionSettings subscriptionSettings = new SubscriptionSettings(); - 
subscriptionSettings.setCrossPartitionSubscriptionEnabled(theCrossPartitionSubscriptionEnabled); - final SubscriptionCanonicalizer subscriptionCanonicalizer = new SubscriptionCanonicalizer(theFhirContext, subscriptionSettings); - final CanonicalSubscription canonicalSubscription = subscriptionCanonicalizer.canonicalize(subscription); + } - assertEquals(theCrossPartitionSubscriptionEnabled, canonicalSubscription.getCrossPartitionEnabled()); + @ParameterizedTest + @MethodSource("crossPartitionParams") + void testSubscriptionCrossPartitionEnableProperty_forR4BWithExtensionAndPartitions(RequestPartitionId theRequestPartitionId) { + final SubscriptionSettings subscriptionSettings = new SubscriptionSettings(); + subscriptionSettings.setCrossPartitionSubscriptionEnabled(true); + final SubscriptionCanonicalizer subscriptionCanonicalizer = new SubscriptionCanonicalizer(FhirContext.forR4BCached(), subscriptionSettings); + + org.hl7.fhir.r4b.model.Subscription subscriptionWithoutExtension = new org.hl7.fhir.r4b.model.Subscription(); + subscriptionWithoutExtension.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + + org.hl7.fhir.r4b.model.Subscription subscriptionWithExtensionCrossPartitionTrue = new org.hl7.fhir.r4b.model.Subscription(); + subscriptionWithExtensionCrossPartitionTrue.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + subscriptionWithExtensionCrossPartitionTrue.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.r4b.model.BooleanType().setValue(true)); + + org.hl7.fhir.r4b.model.Subscription subscriptionWithExtensionCrossPartitionFalse = new org.hl7.fhir.r4b.model.Subscription(); + subscriptionWithExtensionCrossPartitionFalse.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + subscriptionWithExtensionCrossPartitionFalse.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.r4b.model.BooleanType().setValue(false)); + + final CanonicalSubscription 
canonicalSubscriptionWithoutExtension = subscriptionCanonicalizer.canonicalize(subscriptionWithoutExtension); + final CanonicalSubscription canonicalSubscriptionWithExtensionCrossPartitionTrue = subscriptionCanonicalizer.canonicalize(subscriptionWithExtensionCrossPartitionTrue); + final CanonicalSubscription canonicalSubscriptionWithExtensionCrossPartitionFalse = subscriptionCanonicalizer.canonicalize(subscriptionWithExtensionCrossPartitionFalse); + + if(RequestPartitionId.isDefaultPartition(theRequestPartitionId)){ + assertThat(canonicalSubscriptionWithoutExtension.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionTrue.isCrossPartitionEnabled()).isTrue(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionFalse.isCrossPartitionEnabled()).isFalse(); + } else { + assertThat(canonicalSubscriptionWithoutExtension.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionTrue.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionFalse.isCrossPartitionEnabled()).isFalse(); + } + } + + @ParameterizedTest + @MethodSource("crossPartitionParams") + void testSubscriptionCrossPartitionEnableProperty_forR5WithExtensionAndPartitions(RequestPartitionId theRequestPartitionId) { + final SubscriptionSettings subscriptionSettings = new SubscriptionSettings(); + subscriptionSettings.setCrossPartitionSubscriptionEnabled(true); + final SubscriptionCanonicalizer subscriptionCanonicalizer = new SubscriptionCanonicalizer(FhirContext.forR5Cached(), subscriptionSettings); + + org.hl7.fhir.r5.model.Subscription subscriptionWithoutExtension = new org.hl7.fhir.r5.model.Subscription(); + subscriptionWithoutExtension.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + + org.hl7.fhir.r5.model.Subscription subscriptionWithExtensionCrossPartitionTrue = new org.hl7.fhir.r5.model.Subscription(); + 
subscriptionWithExtensionCrossPartitionTrue.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + subscriptionWithExtensionCrossPartitionTrue.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.r5.model.BooleanType().setValue(true)); + + org.hl7.fhir.r5.model.Subscription subscriptionWithExtensionCrossPartitionFalse = new org.hl7.fhir.r5.model.Subscription(); + subscriptionWithExtensionCrossPartitionFalse.setUserData(RESOURCE_PARTITION_ID, theRequestPartitionId); + subscriptionWithExtensionCrossPartitionFalse.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.r5.model.BooleanType().setValue(false)); + + final CanonicalSubscription canonicalSubscriptionWithoutExtension = subscriptionCanonicalizer.canonicalize(subscriptionWithoutExtension); + final CanonicalSubscription canonicalSubscriptionWithExtensionCrossPartitionTrue = subscriptionCanonicalizer.canonicalize(subscriptionWithExtensionCrossPartitionTrue); + final CanonicalSubscription canonicalSubscriptionWithExtensionCrossPartitionFalse = subscriptionCanonicalizer.canonicalize(subscriptionWithExtensionCrossPartitionFalse); + + if(RequestPartitionId.isDefaultPartition(theRequestPartitionId)){ + assertThat(canonicalSubscriptionWithoutExtension.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionTrue.isCrossPartitionEnabled()).isTrue(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionFalse.isCrossPartitionEnabled()).isFalse(); + } else { + assertThat(canonicalSubscriptionWithoutExtension.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionTrue.isCrossPartitionEnabled()).isFalse(); + assertThat(canonicalSubscriptionWithExtensionCrossPartitionFalse.isCrossPartitionEnabled()).isFalse(); + } } private org.hl7.fhir.r4b.model.Subscription buildR4BSubscription(String thePayloadContent) { diff --git 
a/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/module/CanonicalSubscriptionTest.java b/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/module/CanonicalSubscriptionTest.java index 8b2afd6695a..d0f2b63c6c4 100644 --- a/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/module/CanonicalSubscriptionTest.java +++ b/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/module/CanonicalSubscriptionTest.java @@ -1,22 +1,23 @@ package ca.uhn.fhir.jpa.subscription.module; import ca.uhn.fhir.context.FhirContext; -import ca.uhn.fhir.jpa.model.entity.StorageSettings; +import ca.uhn.fhir.interceptor.model.RequestPartitionId; +import ca.uhn.fhir.jpa.model.config.SubscriptionSettings; import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionCanonicalizer; import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscription; import ca.uhn.fhir.jpa.subscription.model.ResourceDeliveryJsonMessage; import ca.uhn.fhir.jpa.subscription.model.ResourceDeliveryMessage; -import ca.uhn.fhir.jpa.model.config.SubscriptionSettings; import ca.uhn.fhir.util.HapiExtensions; import com.fasterxml.jackson.core.JsonProcessingException; import com.fasterxml.jackson.databind.ObjectMapper; import jakarta.annotation.Nonnull; import org.assertj.core.util.Lists; import org.hl7.fhir.r4.model.BooleanType; -import org.hl7.fhir.r4.model.StringType; import org.hl7.fhir.r4.model.Subscription; import org.junit.jupiter.api.Test; import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.MethodSource; import org.junit.jupiter.params.provider.ValueSource; import org.slf4j.Logger; import org.slf4j.LoggerFactory; @@ -24,11 +25,13 @@ import org.slf4j.LoggerFactory; import java.io.IOException; import java.util.HashMap; import java.util.List; +import java.util.stream.Stream; +import static 
ca.uhn.fhir.rest.api.Constants.RESOURCE_PARTITION_ID; import static org.assertj.core.api.Assertions.assertThat; import static org.junit.jupiter.api.Assertions.assertEquals; -import static org.junit.jupiter.api.Assertions.assertNull; import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertNull; import static org.junit.jupiter.api.Assertions.assertTrue; public class CanonicalSubscriptionTest { @@ -71,7 +74,7 @@ public class CanonicalSubscriptionTest { } @Test - public void testCanonicalSubscriptionRetainsMetaTags() throws IOException { + public void testCanonicalSubscriptionRetainsMetaTags() { SubscriptionCanonicalizer canonicalizer = new SubscriptionCanonicalizer(FhirContext.forR4(), new SubscriptionSettings()); CanonicalSubscription sub1 = canonicalizer.canonicalize(makeMdmSubscription()); assertThat(sub1.getTags()).containsKey(TAG_SYSTEM); @@ -86,40 +89,48 @@ public class CanonicalSubscriptionTest { assertTrue(sub1.equals(sub2)); } - @Test - public void testSerializeMultiPartitionSubscription(){ - SubscriptionCanonicalizer canonicalizer = new SubscriptionCanonicalizer(FhirContext.forR4(), new SubscriptionSettings()); + @ParameterizedTest + @ValueSource(booleans = {true, false}) + public void testSubscriptionCrossPartitionEnabled_basedOnGlobalFlagAndExtensionFalse(boolean theIsCrossPartitionEnabled){ + final SubscriptionSettings subscriptionSettings = buildSubscriptionSettings(theIsCrossPartitionEnabled); + SubscriptionCanonicalizer canonicalizer = new SubscriptionCanonicalizer(FhirContext.forR4(), subscriptionSettings); + + Subscription subscriptionWithExtensionSetToBooleanFalse = makeEmailSubscription(); + subscriptionWithExtensionSetToBooleanFalse.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new BooleanType().setValue(false)); + + CanonicalSubscription canonicalSubscriptionExtensionSetToBooleanFalse = canonicalizer.canonicalize(subscriptionWithExtensionSetToBooleanFalse); + + 
assertEquals(canonicalSubscriptionExtensionSetToBooleanFalse.isCrossPartitionEnabled(), false); + } + + static Stream requestPartitionIds() { + return Stream.of( + Arguments.of(null, false), + Arguments.of(RequestPartitionId.allPartitions(), false), + Arguments.of(RequestPartitionId.fromPartitionIds(1), false), + Arguments.of(RequestPartitionId.defaultPartition(), true) + ); + } + + @ParameterizedTest + @MethodSource("requestPartitionIds") + public void testSubscriptionCrossPartitionEnabled_basedOnGlobalFlagAndExtensionAndPartitionId(RequestPartitionId theRequestPartitionId, boolean theExpectedIsCrossPartitionEnabled){ + final boolean globalIsCrossPartitionEnabled = true; // not strictly required, but makes the test's intent explicit. + SubscriptionCanonicalizer canonicalizer = new SubscriptionCanonicalizer(FhirContext.forR4(), buildSubscriptionSettings(globalIsCrossPartitionEnabled)); + Subscription subscription = makeEmailSubscription(); + subscription.setUserData(RESOURCE_PARTITION_ID,theRequestPartitionId); subscription.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new BooleanType().setValue(true)); - CanonicalSubscription canonicalSubscription = canonicalizer.canonicalize(subscription); - assertEquals(canonicalSubscription.getCrossPartitionEnabled(), true); - } - - @ParameterizedTest - @ValueSource(booleans = {true, false}) - public void testSerializeIncorrectMultiPartitionSubscription(boolean theIsCrossPartitionEnabled){ - final SubscriptionSettings subscriptionSettings = buildSubscriptionSettings(theIsCrossPartitionEnabled); - SubscriptionCanonicalizer canonicalizer = new SubscriptionCanonicalizer(FhirContext.forR4(), subscriptionSettings); - Subscription subscription = makeEmailSubscription(); - subscription.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new StringType().setValue("false")); - CanonicalSubscription canonicalSubscription = canonicalizer.canonicalize(subscription); - -
assertEquals(canonicalSubscription.getCrossPartitionEnabled(), theIsCrossPartitionEnabled); - } - - @ParameterizedTest - @ValueSource(booleans = {true, false}) - public void testSerializeNonMultiPartitionSubscription(boolean theIsCrossPartitionEnabled){ - final SubscriptionSettings subscriptionSettings = buildSubscriptionSettings(theIsCrossPartitionEnabled); - SubscriptionCanonicalizer canonicalizer = new SubscriptionCanonicalizer(FhirContext.forR4(), subscriptionSettings); - Subscription subscription = makeEmailSubscription(); - subscription.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new BooleanType().setValue(false)); CanonicalSubscription canonicalSubscription = canonicalizer.canonicalize(subscription); System.out.print(canonicalSubscription); - assertEquals(canonicalSubscription.getCrossPartitionEnabled(), theIsCrossPartitionEnabled); + // For a Subscription to be a cross-partition subscription, 3 things are required: + // - the Subscription must be created on the default partition + // - the Subscription must have the extension EXTENSION_SUBSCRIPTION_CROSS_PARTITION set to true + // - the global flag CrossPartitionSubscriptionEnabled must be true + assertThat(canonicalSubscription.isCrossPartitionEnabled()).isEqualTo(theExpectedIsCrossPartitionEnabled); } @Test @@ -130,7 +141,7 @@ public class CanonicalSubscriptionTest { CanonicalSubscription payload = resourceDeliveryMessage.getPayload().getSubscription(); - assertFalse(payload.getCrossPartitionEnabled()); + assertFalse(payload.isCrossPartitionEnabled()); } private Subscription makeEmailSubscription() { diff --git a/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/module/subscriber/SubscriptionMatchingSubscriberTest.java b/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/module/subscriber/SubscriptionMatchingSubscriberTest.java index 19393f29976..5f47b22ceaf 100644 ---
a/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/module/subscriber/SubscriptionMatchingSubscriberTest.java +++ b/hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/module/subscriber/SubscriptionMatchingSubscriberTest.java @@ -4,8 +4,8 @@ import ca.uhn.fhir.interceptor.api.HookParams; import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster; import ca.uhn.fhir.interceptor.api.Pointcut; import ca.uhn.fhir.interceptor.model.RequestPartitionId; -import ca.uhn.fhir.jpa.api.config.JpaStorageSettings; import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao; +import ca.uhn.fhir.jpa.model.config.SubscriptionSettings; import ca.uhn.fhir.jpa.subscription.match.matcher.subscriber.SubscriptionCriteriaParser; import ca.uhn.fhir.jpa.subscription.match.matcher.subscriber.SubscriptionMatchDeliverer; import ca.uhn.fhir.jpa.subscription.match.matcher.subscriber.SubscriptionMatchingSubscriber; @@ -14,7 +14,6 @@ import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionRegistry; import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscription; import ca.uhn.fhir.jpa.subscription.model.ResourceModifiedMessage; import ca.uhn.fhir.jpa.subscription.module.standalone.BaseBlockingQueueSubscribableChannelDstu3Test; -import ca.uhn.fhir.jpa.model.config.SubscriptionSettings; import ca.uhn.fhir.model.primitive.IdDt; import ca.uhn.fhir.rest.api.Constants; import ca.uhn.fhir.rest.server.messaging.BaseResourceModifiedMessage; @@ -192,35 +191,16 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri assertEquals(Constants.CT_FHIR_XML_NEW, ourContentTypes.get(0)); } - @Test - public void testSubscriptionAndResourceOnTheSamePartitionMatch() throws InterruptedException { + @ParameterizedTest + @ValueSource(ints = {0,1}) + public void testSubscriptionAndResourceOnTheSamePartitionMatch(int thePartitionId) throws InterruptedException { myPartitionSettings.setPartitioningEnabled(true); String payload = "application/fhir+json"; 
String code = "1000000050"; String criteria = "Observation?code=SNOMED-CT|" + code + "&_format=xml"; - RequestPartitionId requestPartitionId = RequestPartitionId.fromPartitionId(0); - Subscription subscription = makeActiveSubscription(criteria, payload, ourListenerServerBase); - mockSubscriptionRead(requestPartitionId, subscription); - sendSubscription(subscription, requestPartitionId, true); - - ourObservationListener.setExpectedCount(1); - mySubscriptionResourceMatched.setExpectedCount(1); - sendObservation(code, "SNOMED-CT", requestPartitionId); - mySubscriptionResourceMatched.awaitExpected(); - ourObservationListener.awaitExpected(); - } - - @Test - public void testSubscriptionAndResourceOnTheSamePartitionMatchPart2() throws InterruptedException { - myPartitionSettings.setPartitioningEnabled(true); - String payload = "application/fhir+json"; - - String code = "1000000050"; - String criteria = "Observation?code=SNOMED-CT|" + code + "&_format=xml"; - - RequestPartitionId requestPartitionId = RequestPartitionId.fromPartitionId(1); + RequestPartitionId requestPartitionId = RequestPartitionId.fromPartitionId(thePartitionId); Subscription subscription = makeActiveSubscription(criteria, payload, ourListenerServerBase); mockSubscriptionRead(requestPartitionId, subscription); sendSubscription(subscription, requestPartitionId, true); @@ -234,7 +214,7 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri @ParameterizedTest @ValueSource(booleans = {true, false}) - public void testSubscriptionAndResourceOnDiffPartitionNotMatch(boolean theIsCrossPartitionEnabled) throws InterruptedException { + public void testSubscriptionCrossPartitionMatching_whenSubscriptionAndResourceOnDiffPartition_withGlobalFlagCrossPartitionSubscriptionEnable(boolean theIsCrossPartitionEnabled) throws InterruptedException { mySubscriptionSettings.setCrossPartitionSubscriptionEnabled(theIsCrossPartitionEnabled); myPartitionSettings.setPartitioningEnabled(true); String 
payload = "application/fhir+json"; @@ -242,8 +222,9 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri String code = "1000000050"; String criteria = "Observation?code=SNOMED-CT|" + code + "&_format=xml"; - RequestPartitionId requestPartitionId = RequestPartitionId.fromPartitionId(1); + RequestPartitionId requestPartitionId = RequestPartitionId.defaultPartition(); Subscription subscription = makeActiveSubscription(criteria, payload, ourListenerServerBase); + subscription.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.dstu3.model.BooleanType().setValue(true)); mockSubscriptionRead(requestPartitionId, subscription); sendSubscription(subscription, requestPartitionId, true); @@ -255,80 +236,8 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri } } - @ParameterizedTest - @ValueSource(booleans = {true, false}) - public void testSubscriptionAndResourceOnDiffPartitionNotMatchPart2(boolean theIsCrossPartitionEnabled) throws InterruptedException { - mySubscriptionSettings.setCrossPartitionSubscriptionEnabled(theIsCrossPartitionEnabled); - myPartitionSettings.setPartitioningEnabled(true); - String payload = "application/fhir+json"; - - String code = "1000000050"; - String criteria = "Observation?code=SNOMED-CT|" + code + "&_format=xml"; - - RequestPartitionId requestPartitionId = RequestPartitionId.fromPartitionId(0); - Subscription subscription = makeActiveSubscription(criteria, payload, ourListenerServerBase); - mockSubscriptionRead(requestPartitionId, subscription); - sendSubscription(subscription, requestPartitionId, true); - - final ThrowsInterrupted throwsInterrupted = () -> sendObservation(code, "SNOMED-CT", RequestPartitionId.fromPartitionId(1)); - - if (theIsCrossPartitionEnabled) { - runWithinLatchLogicExpectSuccess(throwsInterrupted); - } else { - runWithLatchLogicExpectFailure(throwsInterrupted); - } - } - - @ParameterizedTest - @ValueSource(booleans = {true, 
false}) - public void testSubscriptionOnDefaultPartitionAndResourceOnDiffPartitionNotMatch(boolean theIsCrossPartitionEnabled) throws InterruptedException { - mySubscriptionSettings.setCrossPartitionSubscriptionEnabled(theIsCrossPartitionEnabled); - myPartitionSettings.setPartitioningEnabled(true); - String payload = "application/fhir+json"; - - String code = "1000000050"; - String criteria = "Observation?code=SNOMED-CT|" + code + "&_format=xml"; - - RequestPartitionId requestPartitionId = RequestPartitionId.defaultPartition(); - Subscription subscription = makeActiveSubscription(criteria, payload, ourListenerServerBase); - mockSubscriptionRead(requestPartitionId, subscription); - sendSubscription(subscription, requestPartitionId, true); - - final ThrowsInterrupted throwsInterrupted = () -> sendObservation(code, "SNOMED-CT", RequestPartitionId.fromPartitionId(1)); - - if (theIsCrossPartitionEnabled) { - runWithinLatchLogicExpectSuccess(throwsInterrupted); - } else { - runWithLatchLogicExpectFailure(throwsInterrupted); - } - } - - @ParameterizedTest - @ValueSource(booleans = {true, false}) - public void testSubscriptionOnAPartitionAndResourceOnDefaultPartitionNotMatch(boolean theIsCrossPartitionEnabled) throws InterruptedException { - mySubscriptionSettings.setCrossPartitionSubscriptionEnabled(theIsCrossPartitionEnabled); - myPartitionSettings.setPartitioningEnabled(true); - String payload = "application/fhir+json"; - - String code = "1000000050"; - String criteria = "Observation?code=SNOMED-CT|" + code + "&_format=xml"; - - RequestPartitionId requestPartitionId = RequestPartitionId.fromPartitionId(1); - Subscription subscription = makeActiveSubscription(criteria, payload, ourListenerServerBase); - mockSubscriptionRead(requestPartitionId, subscription); - sendSubscription(subscription, requestPartitionId, true); - - final ThrowsInterrupted throwsInterrupted = () -> sendObservation(code, "SNOMED-CT", RequestPartitionId.defaultPartition()); - - if 
(theIsCrossPartitionEnabled) { - runWithinLatchLogicExpectSuccess(throwsInterrupted); - } else { - runWithLatchLogicExpectFailure(throwsInterrupted); - } - } - @Test - public void testSubscriptionOnOnePartitionMatchResourceOnMultiplePartitions() throws InterruptedException { + public void testSubscriptionOnOnePartition_whenResourceCreatedOnMultiplePartitions_matchesOnlyResourceCreatedOnSamePartition() throws InterruptedException { myPartitionSettings.setPartitioningEnabled(true); String payload = "application/fhir+json"; @@ -350,7 +259,7 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri @ParameterizedTest @ValueSource(booleans = {true, false}) - public void testSubscriptionOnOnePartitionDoNotMatchResourceOnMultiplePartitions(boolean theIsCrossPartitionEnabled) throws InterruptedException { + public void testSubscriptionCrossPartitionMatching_whenSubscriptionAndResourceOnDiffPartition_withGlobalFlagCrossPartitionSubscriptionEnable2(boolean theIsCrossPartitionEnabled) throws InterruptedException { mySubscriptionSettings.setCrossPartitionSubscriptionEnabled(theIsCrossPartitionEnabled); myPartitionSettings.setPartitioningEnabled(true); String payload = "application/fhir+json"; @@ -358,8 +267,9 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri String code = "1000000050"; String criteria = "Observation?code=SNOMED-CT|" + code + "&_format=xml"; - RequestPartitionId requestPartitionId = RequestPartitionId.fromPartitionId(1); + RequestPartitionId requestPartitionId = RequestPartitionId.defaultPartition(); Subscription subscription = makeActiveSubscription(criteria, payload, ourListenerServerBase); + subscription.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.dstu3.model.BooleanType().setValue(true)); mockSubscriptionRead(requestPartitionId, subscription); sendSubscription(subscription, requestPartitionId, true); diff --git 
a/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/dao/dstu3/FhirResourceDaoDstu3SearchCustomSearchParamTest.java b/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/dao/dstu3/FhirResourceDaoDstu3SearchCustomSearchParamTest.java index b5a4e35f883..b396da2fce0 100644 --- a/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/dao/dstu3/FhirResourceDaoDstu3SearchCustomSearchParamTest.java +++ b/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/dao/dstu3/FhirResourceDaoDstu3SearchCustomSearchParamTest.java @@ -1,6 +1,5 @@ package ca.uhn.fhir.jpa.dao.dstu3; -import static org.junit.jupiter.api.Assertions.assertNotNull; import ca.uhn.fhir.i18n.Msg; import ca.uhn.fhir.jpa.api.config.JpaStorageSettings; import ca.uhn.fhir.jpa.api.model.DaoMethodOutcome; @@ -54,8 +53,10 @@ import java.util.List; import java.util.stream.Collectors; import static org.assertj.core.api.Assertions.assertThat; -import static org.junit.jupiter.api.Assertions.fail; import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.fail; public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu3Test { private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(FhirResourceDaoDstu3SearchCustomSearchParamTest.class); @@ -192,7 +193,7 @@ public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu } @Test - public void testCustomReferenceParameter() throws Exception { + public void testCustomReferenceParameter() { SearchParameter sp = new SearchParameter(); sp.addBase("Patient"); sp.setCode("myDoctor"); @@ -238,7 +239,7 @@ public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu Patient p1 = new Patient(); p1.setActive(true); 
p1.addExtension().setUrl("http://acme.org/eyecolour").addExtension().setUrl("http://foo").setValue(new StringType("VAL")); - IIdType p1id = myPatientDao.create(p1).getId().toUnqualifiedVersionless(); + myPatientDao.create(p1).getId().toUnqualifiedVersionless(); } @@ -253,7 +254,7 @@ public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu attendingSp.setXpathUsage(org.hl7.fhir.dstu3.model.SearchParameter.XPathUsageType.NORMAL); attendingSp.setStatus(org.hl7.fhir.dstu3.model.Enumerations.PublicationStatus.ACTIVE); attendingSp.getTarget().add(new CodeType("Practitioner")); - IIdType spId = mySearchParameterDao.create(attendingSp, mySrd).getId().toUnqualifiedVersionless(); + mySearchParameterDao.create(attendingSp, mySrd).getId().toUnqualifiedVersionless(); mySearchParamRegistry.forceRefresh(); @@ -417,7 +418,7 @@ public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu Patient p2 = new Patient(); p2.addName().setFamily("P2"); p2.addExtension().setUrl("http://acme.org/sibling").setValue(new Reference(p1id)); - IIdType p2id = myPatientDao.create(p2).getId().toUnqualifiedVersionless(); + myPatientDao.create(p2).getId().toUnqualifiedVersionless(); SearchParameterMap map; IBundleProvider results; @@ -571,7 +572,7 @@ public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu Patient p2 = new Patient(); p2.setActive(true); p2.addExtension().setUrl("http://acme.org/eyecolour").setValue(new CodeType("green")); - IIdType p2id = myPatientDao.create(p2).getId().toUnqualifiedVersionless(); + myPatientDao.create(p2).getId().toUnqualifiedVersionless(); // Try with custom gender SP SearchParameterMap map = new SearchParameterMap(); @@ -889,7 +890,7 @@ public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu .setUrl("http://acme.org/bar") .setValue(new Reference(aptId.getValue())); - IIdType p2id = myPatientDao.create(patient).getId().toUnqualifiedVersionless(); + 
myPatientDao.create(patient).getId().toUnqualifiedVersionless(); SearchParameterMap map; IBundleProvider results; @@ -1035,7 +1036,7 @@ public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu Patient pat2 = new Patient(); pat.setGender(AdministrativeGender.FEMALE); - IIdType patId2 = myPatientDao.create(pat2, mySrd).getId().toUnqualifiedVersionless(); + myPatientDao.create(pat2, mySrd).getId().toUnqualifiedVersionless(); SearchParameterMap map; IBundleProvider results; @@ -1069,7 +1070,7 @@ public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu myPatientDao.search(map).size(); fail(""); } catch (InvalidRequestException e) { - assertEquals(Msg.code(1223) + "Unknown search parameter \"foo\" for resource type \"Patient\". Valid search parameters for this search are: [_id, _lastUpdated, active, address, address-city, address-country, address-postalcode, address-state, address-use, animal-breed, animal-species, birthdate, death-date, deceased, email, family, gender, general-practitioner, given, identifier, language, link, name, organization, phone, phonetic, telecom]", e.getMessage()); + assertEquals(Msg.code(1223) + "Unknown search parameter \"foo\" for resource type \"Patient\". 
Valid search parameters for this search are: [_id, _lastUpdated, _profile, _security, _tag, active, address, address-city, address-country, address-postalcode, address-state, address-use, animal-breed, animal-species, birthdate, death-date, deceased, email, family, gender, general-practitioner, given, identifier, language, link, name, organization, phone, phonetic, telecom]", e.getMessage()); } } @@ -1094,7 +1095,7 @@ public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu Patient pat2 = new Patient(); pat.setGender(AdministrativeGender.FEMALE); - IIdType patId2 = myPatientDao.create(pat2, mySrd).getId().toUnqualifiedVersionless(); + myPatientDao.create(pat2, mySrd).getId().toUnqualifiedVersionless(); SearchParameterMap map; IBundleProvider results; @@ -1107,7 +1108,7 @@ public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu myPatientDao.search(map).size(); fail(""); } catch (InvalidRequestException e) { - assertEquals(Msg.code(1223) + "Unknown search parameter \"foo\" for resource type \"Patient\". Valid search parameters for this search are: [_id, _lastUpdated, active, address, address-city, address-country, address-postalcode, address-state, address-use, animal-breed, animal-species, birthdate, death-date, deceased, email, family, gender, general-practitioner, given, identifier, language, link, name, organization, phone, phonetic, telecom]", e.getMessage()); + assertEquals(Msg.code(1223) + "Unknown search parameter \"foo\" for resource type \"Patient\". 
Valid search parameters for this search are: [_id, _lastUpdated, _profile, _security, _tag, active, address, address-city, address-country, address-postalcode, address-state, address-use, animal-breed, animal-species, birthdate, death-date, deceased, email, family, gender, general-practitioner, given, identifier, language, link, name, organization, phone, phonetic, telecom]", e.getMessage()); } // Try with normal gender SP @@ -1148,10 +1149,10 @@ public class FhirResourceDaoDstu3SearchCustomSearchParamTest extends BaseJpaDstu .findAll() .stream() .filter(t -> t.getParamName().equals("medicationadministration-ingredient-medication")) - .collect(Collectors.toList()); - ourLog.info("Tokens:\n * {}", tokens.stream().map(t -> t.toString()).collect(Collectors.joining("\n * "))); + .toList(); + ourLog.info("Tokens:\n * {}", tokens.stream().map(ResourceIndexedSearchParamToken::toString).collect(Collectors.joining("\n * "))); assertEquals(1, tokens.size(), tokens.toString()); - assertEquals(false, tokens.get(0).isMissing()); + assertFalse(tokens.get(0).isMissing()); }); diff --git a/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/dao/dstu3/FhirResourceDaoDstu3Test.java b/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/dao/dstu3/FhirResourceDaoDstu3Test.java index f1a4171d565..f193c55af7c 100644 --- a/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/dao/dstu3/FhirResourceDaoDstu3Test.java +++ b/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/dao/dstu3/FhirResourceDaoDstu3Test.java @@ -1,5 +1,8 @@ package ca.uhn.fhir.jpa.dao.dstu3; +import static org.hl7.fhir.instance.model.api.IAnyResource.SP_RES_PROFILE; +import static org.hl7.fhir.instance.model.api.IAnyResource.SP_RES_SECURITY; +import static org.hl7.fhir.instance.model.api.IAnyResource.SP_RES_TAG; import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertNotNull; import static 
org.junit.jupiter.api.Assertions.assertNull; @@ -33,6 +36,7 @@ import ca.uhn.fhir.rest.api.MethodOutcome; import ca.uhn.fhir.rest.api.SortOrderEnum; import ca.uhn.fhir.rest.api.SortSpec; import ca.uhn.fhir.rest.api.server.IBundleProvider; +import ca.uhn.fhir.rest.api.server.SystemRequestDetails; import ca.uhn.fhir.rest.api.server.storage.IResourcePersistentId; import ca.uhn.fhir.rest.param.DateParam; import ca.uhn.fhir.rest.param.DateRangeParam; @@ -42,6 +46,7 @@ import ca.uhn.fhir.rest.param.ReferenceParam; import ca.uhn.fhir.rest.param.StringParam; import ca.uhn.fhir.rest.param.TokenOrListParam; import ca.uhn.fhir.rest.param.TokenParam; +import ca.uhn.fhir.rest.param.UriParam; import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException; import ca.uhn.fhir.rest.server.exceptions.ResourceGoneException; import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException; @@ -111,6 +116,7 @@ import java.util.Collections; import java.util.Comparator; import java.util.Date; import java.util.List; +import java.util.Objects; import static org.apache.commons.lang3.StringUtils.defaultString; import static org.assertj.core.api.Assertions.assertThat; @@ -183,7 +189,7 @@ public class FhirResourceDaoDstu3Test extends BaseJpaDstu3Test { SearchParameterMap map = new SearchParameterMap(); map.setLoadSynchronous(true); map.add("_tag", new TokenParam(methodName, methodName)); - assertEquals(1, myOrganizationDao.search(map).size().intValue()); + assertEquals(1, Objects.requireNonNull(myOrganizationDao.search(map).size()).intValue()); myOrganizationDao.delete(orgId, mySrd); @@ -2082,7 +2088,102 @@ public class FhirResourceDaoDstu3Test extends BaseJpaDstu3Test { found = toList(myPatientDao.search(new SearchParameterMap(Patient.SP_BIRTHDATE + "AAAA", new DateParam(ParamPrefixEnum.GREATERTHAN, "2000-01-01")).setLoadSynchronous(true))); assertThat(found).isEmpty(); } catch (InvalidRequestException e) { - assertEquals(Msg.code(1223) + "Unknown search parameter \"birthdateAAAA\" for 
resource type \"Patient\". Valid search parameters for this search are: [_id, _lastUpdated, active, address, address-city, address-country, address-postalcode, address-state, address-use, animal-breed, animal-species, birthdate, death-date, deceased, email, family, gender, general-practitioner, given, identifier, language, link, name, organization, phone, phonetic, telecom]", e.getMessage()); + assertEquals(Msg.code(1223) + "Unknown search parameter \"birthdateAAAA\" for resource type \"Patient\". Valid search parameters for this search are: [_id, _lastUpdated, _profile, _security, _tag, active, address, address-city, address-country, address-postalcode, address-state, address-use, animal-breed, animal-species, birthdate, death-date, deceased, email, family, gender, general-practitioner, given, identifier, language, link, name, organization, phone, phonetic, telecom]", e.getMessage()); + } + } + + @Test + public void testPersistSearchParamTag() { + Coding TEST_PARAM_VALUE_CODING_1 = new Coding("test-system", "test-code-1", null); + Coding TEST_PARAM_VALUE_CODING_2 = new Coding("test-system", "test-code-2", null); + + List found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_TAG, new TokenParam(TEST_PARAM_VALUE_CODING_1)).setLoadSynchronous(true), new SystemRequestDetails())); + int initialSizeCoding1 = found.size(); + + found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_TAG, new TokenParam(TEST_PARAM_VALUE_CODING_2)).setLoadSynchronous(true), new SystemRequestDetails())); + int initialSizeCoding2 = found.size(); + + Patient patient = new Patient(); + patient.addIdentifier().setSystem("urn:system").setValue("001"); + patient.getMeta().addTag(TEST_PARAM_VALUE_CODING_1); + + myPatientDao.create(patient, mySrd); + + found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_TAG, new TokenParam(TEST_PARAM_VALUE_CODING_1)).setLoadSynchronous(true), new SystemRequestDetails())); + assertThat(found).hasSize(1 + initialSizeCoding1); + + found 
= toList(myPatientDao.search(new SearchParameterMap(SP_RES_TAG, new TokenParam(TEST_PARAM_VALUE_CODING_2)).setLoadSynchronous(true), new SystemRequestDetails())); + assertThat(found).hasSize(initialSizeCoding2); + + // If this throws an exception, that would be an acceptable outcome as well. + try { + found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_TAG + "AAAA", new TokenParam(TEST_PARAM_VALUE_CODING_1)).setLoadSynchronous(true), new SystemRequestDetails())); + assertThat(found).isEmpty(); + } catch (InvalidRequestException e) { + assertEquals(Msg.code(1223) + "Unknown search parameter \"" + SP_RES_TAG+"AAAA" + "\" for resource type \"Patient\". Valid search parameters for this search are: [_id, _lastUpdated, _profile, _security, _tag, active, address, address-city, address-country, address-postalcode, address-state, address-use, animal-breed, animal-species, birthdate, death-date, deceased, email, family, gender, general-practitioner, given, identifier, language, link, name, organization, phone, phonetic, telecom]", e.getMessage()); + } + } + + @Test + public void testPersistSearchParamProfile() { + String profileA = "profile-AAA"; + String profileB = "profile-BBB"; + List found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_PROFILE, new UriParam(profileA)).setLoadSynchronous(true), new SystemRequestDetails())); + int initialSizeProfileA = found.size(); + + found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_PROFILE, new UriParam(profileB)).setLoadSynchronous(true), new SystemRequestDetails())); + int initialSizeProfileB = found.size(); + + Patient patient = new Patient(); + patient.addIdentifier().setSystem("urn:system").setValue("001"); + patient.getMeta().addProfile(profileA); + + myPatientDao.create(patient, mySrd); + + found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_PROFILE, new UriParam(profileA)).setLoadSynchronous(true), new SystemRequestDetails())); + assertThat(found).hasSize(1 +
initialSizeProfileA); + + found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_PROFILE, new UriParam(profileB)).setLoadSynchronous(true), new SystemRequestDetails())); + assertThat(found).hasSize(initialSizeProfileB); + + // If this throws an exception, that would be an acceptable outcome as well. + try { + found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_PROFILE + "AAAA", new UriParam(profileA)).setLoadSynchronous(true), new SystemRequestDetails())); + assertThat(found).isEmpty(); + } catch (InvalidRequestException e) { + assertEquals(Msg.code(1223) + "Unknown search parameter \"" + SP_RES_PROFILE+"AAAA" + "\" for resource type \"Patient\". Valid search parameters for this search are: [_id, _lastUpdated, _profile, _security, _tag, active, address, address-city, address-country, address-postalcode, address-state, address-use, animal-breed, animal-species, birthdate, death-date, deceased, email, family, gender, general-practitioner, given, identifier, language, link, name, organization, phone, phonetic, telecom]", e.getMessage()); + } + } + + @Test + public void testPersistSearchParamSecurity() { + Coding TEST_PARAM_VALUE_CODING_1 = new Coding("test-system", "test-code-1", null); + Coding TEST_PARAM_VALUE_CODING_2 = new Coding("test-system", "test-code-2", null); + + List found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_SECURITY, new TokenParam(TEST_PARAM_VALUE_CODING_1)).setLoadSynchronous(true), new SystemRequestDetails())); + int initialSizeSecurityA = found.size(); + + found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_SECURITY, new TokenParam(TEST_PARAM_VALUE_CODING_2)).setLoadSynchronous(true), new SystemRequestDetails())); + int initialSizeSecurityB = found.size(); + + Patient patient = new Patient(); + patient.addIdentifier().setSystem("urn:system").setValue("001"); + patient.getMeta().addSecurity(TEST_PARAM_VALUE_CODING_1); + + myPatientDao.create(patient, mySrd); + + found =
toList(myPatientDao.search(new SearchParameterMap(SP_RES_SECURITY, new TokenParam(TEST_PARAM_VALUE_CODING_1)).setLoadSynchronous(true), new SystemRequestDetails())); + assertThat(found).hasSize(1 + initialSizeSecurityA); + + found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_SECURITY, new TokenParam(TEST_PARAM_VALUE_CODING_2)).setLoadSynchronous(true), new SystemRequestDetails())); + assertThat(found).hasSize(initialSizeSecurityB); + + // If this throws an exception, that would be an acceptable outcome as well. + try { + found = toList(myPatientDao.search(new SearchParameterMap(SP_RES_SECURITY + "AAAA", new TokenParam(TEST_PARAM_VALUE_CODING_1)).setLoadSynchronous(true), new SystemRequestDetails())); + assertThat(found).isEmpty(); + } catch (InvalidRequestException e) { + assertEquals(Msg.code(1223) + "Unknown search parameter \"" + SP_RES_SECURITY+"AAAA" + "\" for resource type \"Patient\". Valid search parameters for this search are: [_id, _lastUpdated, _profile, _security, _tag, active, address, address-city, address-country, address-postalcode, address-state, address-use, animal-breed, animal-species, birthdate, death-date, deceased, email, family, gender, general-practitioner, given, identifier, language, link, name, organization, phone, phonetic, telecom]", e.getMessage()); } } diff --git a/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/searchparam/MatchUrlServiceTest.java b/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/searchparam/MatchUrlServiceTest.java index 27c478df65c..f05aa52be17 100644 --- a/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/searchparam/MatchUrlServiceTest.java +++ b/hapi-fhir-jpaserver-test-dstu3/src/test/java/ca/uhn/fhir/jpa/searchparam/MatchUrlServiceTest.java @@ -1,7 +1,5 @@ package ca.uhn.fhir.jpa.searchparam; -import static org.junit.jupiter.api.Assertions.assertEquals; -import static org.junit.jupiter.api.Assertions.assertNotNull; import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.RuntimeResourceDefinition; import ca.uhn.fhir.i18n.Msg; @@ -22,10 +20,10 @@ import org.springframework.test.context.junit.jupiter.SpringExtension; import org.springframework.transaction.PlatformTransactionManager; import static org.assertj.core.api.Assertions.assertThat; -import static org.junit.jupiter.api.Assertions.fail; import static org.assertj.core.api.Assertions.within; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertNotNull; import static org.junit.jupiter.api.Assertions.fail; - import static org.mockito.ArgumentMatchers.any; import static org.mockito.ArgumentMatchers.eq; import static org.mockito.Mockito.mock; @@ -104,19 +102,19 @@ public class MatchUrlServiceTest extends BaseJpaTest { @Test void testTotal_fromStandardLowerCase() { - // given - // when + // given + // when var map = myMatchUrlService.translateMatchUrl("Patient?family=smith&_total=none", ourCtx.getResourceDefinition("Patient")); - // then - assertEquals(SearchTotalModeEnum.NONE, map.getSearchTotalMode()); + // then + assertEquals(SearchTotalModeEnum.NONE, map.getSearchTotalMode()); } @Test void testTotal_fromUpperCase() { // given // when - var map = myMatchUrlService.translateMatchUrl("Patient?family=smith&_total=none", ourCtx.getResourceDefinition("Patient")); + var map = myMatchUrlService.translateMatchUrl("Patient?family=smith&_total=NONE", ourCtx.getResourceDefinition("Patient")); // then assertEquals(SearchTotalModeEnum.NONE, map.getSearchTotalMode()); diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2DaoSvcImplTest.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2DaoSvcImplTest.java new file mode 100644 index 00000000000..d34e1c030d4 --- /dev/null +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2DaoSvcImplTest.java @@ -0,0 +1,225 @@ +package ca.uhn.fhir.jpa.batch2; + +import 
ca.uhn.fhir.interceptor.model.RequestPartitionId; +import ca.uhn.fhir.jpa.api.pid.IResourcePidStream; +import ca.uhn.fhir.jpa.api.pid.TypedResourcePid; +import ca.uhn.fhir.jpa.api.svc.IBatch2DaoSvc; +import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService; +import ca.uhn.fhir.jpa.searchparam.MatchUrlService; +import ca.uhn.fhir.jpa.test.BaseJpaR4Test; +import ca.uhn.fhir.model.primitive.IdDt; +import ca.uhn.fhir.parser.DataFormatException; +import ca.uhn.fhir.rest.server.exceptions.InternalErrorException; +import jakarta.annotation.Nonnull; +import org.hl7.fhir.instance.model.api.IIdType; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.NullSource; +import org.junit.jupiter.params.provider.ValueSource; +import org.springframework.beans.factory.annotation.Autowired; + +import java.time.LocalDate; +import java.time.Month; +import java.time.ZoneId; +import java.util.Date; +import java.util.List; +import java.util.stream.IntStream; +import java.util.stream.Stream; + +import static org.assertj.core.api.Assertions.assertThat; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertThrows; + +class Batch2DaoSvcImplTest extends BaseJpaR4Test { + private static final Date PREVIOUS_MILLENNIUM = toDate(LocalDate.of(1999, Month.DECEMBER, 31)); + private static final Date TOMORROW = toDate(LocalDate.now().plusDays(1)); + + @Autowired + private MatchUrlService myMatchUrlService; + @Autowired + private IHapiTransactionService myIHapiTransactionService; + private IBatch2DaoSvc mySvc; + + @BeforeEach + void beforeEach() { + mySvc = new Batch2DaoSvcImpl(myResourceTableDao, myMatchUrlService, myDaoRegistry, myFhirContext, myIHapiTransactionService); + } + + @Test + void fetchResourceIds_ByUrlInvalidUrl() { + IResourcePidStream stream = mySvc.fetchResourceIdStream(PREVIOUS_MILLENNIUM, TOMORROW, null,
"Patient"); + final InternalErrorException exception = assertThrows(InternalErrorException.class, () -> stream.visitStream(Stream::toList)); + + assertEquals("HAPI-2422: this should never happen: URL is missing a '?'", exception.getMessage()); + } + + @Test + void fetchResourceIds_ByUrlSingleQuestionMark() { + IResourcePidStream stream = mySvc.fetchResourceIdStream(PREVIOUS_MILLENNIUM, TOMORROW, null, "?"); + final IllegalArgumentException exception = assertThrows(IllegalArgumentException.class, () -> stream.visitStream(Stream::toList)); + + assertEquals("theResourceName must not be blank", exception.getMessage()); + } + + @Test + void fetchResourceIds_ByUrlNonsensicalResource() { + IResourcePidStream stream = mySvc.fetchResourceIdStream(PREVIOUS_MILLENNIUM, TOMORROW, null, "Banana?_expunge=true"); + final DataFormatException exception = assertThrows(DataFormatException.class, () -> stream.visitStream(Stream::toList)); + + assertEquals("HAPI-1684: Unknown resource name \"Banana\" (this name is not known in FHIR version \"R4\")", exception.getMessage()); + } + + @ParameterizedTest + @ValueSource(ints = {0, 9, 10, 11, 21, 22, 23, 45}) + void fetchResourceIds_ByUrl(int expectedNumResults) { + final List patientIds = IntStream.range(0, expectedNumResults) + .mapToObj(num -> createPatient()) + .toList(); + + final IResourcePidStream resourcePidList = mySvc.fetchResourceIdStream(PREVIOUS_MILLENNIUM, TOMORROW, RequestPartitionId.defaultPartition(), "Patient?_expunge=true"); + + final List actualPatientIds = + resourcePidList.visitStream(s-> s.map(typePid -> new IdDt(typePid.resourceType, (Long) typePid.id.getId())) + .toList()); + assertIdsEqual(patientIds, actualPatientIds); + } + + @Test + public void fetchResourceIds_ByUrl_WithData() { + // Setup + createPatient(withActiveFalse()).getIdPartAsLong(); + sleepUntilTimeChange(); + + // Start of resources within range + Date start = new Date(); + sleepUntilTimeChange(); + Long patientId1 = 
createPatient(withActiveFalse()).getIdPartAsLong(); + createObservation(withObservationCode("http://foo", "bar")); + createObservation(withObservationCode("http://foo", "bar")); + sleepUntilTimeChange(); + Long patientId2 = createPatient(withActiveFalse()).getIdPartAsLong(); + sleepUntilTimeChange(); + Date end = new Date(); + // End of resources within range + + createObservation(withObservationCode("http://foo", "bar")); + createPatient(withActiveFalse()).getIdPartAsLong(); + sleepUntilTimeChange(); + + // Execute + myCaptureQueriesListener.clear(); + IResourcePidStream queryStream = mySvc.fetchResourceIdStream(start, end, null, "Patient?active=false"); + + // Verify + List typedResourcePids = queryStream.visitStream(Stream::toList); + + assertThat(typedResourcePids) + .hasSize(2) + .containsExactly( + new TypedResourcePid("Patient", patientId1), + new TypedResourcePid("Patient", patientId2)); + + assertThat(myCaptureQueriesListener.logSelectQueries()).hasSize(1); + assertEquals(0, myCaptureQueriesListener.countInsertQueries()); + assertEquals(0, myCaptureQueriesListener.countUpdateQueries()); + assertEquals(0, myCaptureQueriesListener.countDeleteQueries()); + assertEquals(1, myCaptureQueriesListener.getCommitCount()); + assertEquals(0, myCaptureQueriesListener.getRollbackCount()); + } + + @ParameterizedTest + @ValueSource(ints = {0, 9, 10, 11, 21, 22, 23, 45}) + void fetchResourceIds_NoUrl(int expectedNumResults) { + final List patientIds = IntStream.range(0, expectedNumResults) + .mapToObj(num -> createPatient()) + .toList(); + + // at the moment there is no production use-case for fetching without a URL + // reindex will always provide URLs as well (see https://github.com/hapifhir/hapi-fhir/issues/6179) + final IResourcePidStream resourcePidList = mySvc.fetchResourceIdStream(PREVIOUS_MILLENNIUM, TOMORROW, RequestPartitionId.defaultPartition(), null); + + final List actualPatientIds = + resourcePidList.visitStream(s-> s.map(typePid -> new IdDt(typePid.resourceType, (Long)
typePid.id.getId())) + .toList()); + assertIdsEqual(patientIds, actualPatientIds); + } + + private static void assertIdsEqual(List expectedResourceIds, List actualResourceIds) { + assertThat(actualResourceIds).hasSize(expectedResourceIds.size()); + + for (int index = 0; index < expectedResourceIds.size(); index++) { + final IIdType expectedIdType = expectedResourceIds.get(index); + final IIdType actualIdType = actualResourceIds.get(index); + + assertEquals(expectedIdType.getResourceType(), actualIdType.getResourceType()); + assertEquals(expectedIdType.getIdPartAsLong(), actualIdType.getIdPartAsLong()); + } + } + + @Nonnull + private static Date toDate(LocalDate theLocalDate) { + return Date.from(theLocalDate.atStartOfDay(ZoneId.systemDefault()).toInstant()); + } + + @ParameterizedTest + @NullSource + @ValueSource(strings = {"", " "}) + public void fetchResourceIds_NoUrl_WithData(String theMissingUrl) { + // Setup + createPatient(withActiveFalse()); + sleepUntilTimeChange(); + + Date start = new Date(); + Long id0 = createPatient(withActiveFalse()).getIdPartAsLong(); + sleepUntilTimeChange(); + Long id1 = createPatient(withActiveFalse()).getIdPartAsLong(); + sleepUntilTimeChange(); + Long id2 = createObservation(withObservationCode("http://foo", "bar")).getIdPartAsLong(); + sleepUntilTimeChange(); + + Date end = new Date(); + sleepUntilTimeChange(); + createPatient(withActiveFalse()); + + // Execute + myCaptureQueriesListener.clear(); + IResourcePidStream queryStream = mySvc.fetchResourceIdStream(start, end, null, theMissingUrl); + + // Verify + List typedPids = queryStream.visitStream(Stream::toList); + assertThat(typedPids) + .hasSize(3) + .containsExactly( + new TypedResourcePid("Patient", id0), + new TypedResourcePid("Patient", id1), + new TypedResourcePid("Observation", id2)); + + assertThat(myCaptureQueriesListener.logSelectQueries()).hasSize(1); + assertEquals(0, myCaptureQueriesListener.countInsertQueries()); + assertEquals(0, 
myCaptureQueriesListener.countUpdateQueries()); + assertEquals(0, myCaptureQueriesListener.countDeleteQueries()); + assertEquals(1, myCaptureQueriesListener.getCommitCount()); + assertEquals(0, myCaptureQueriesListener.getRollbackCount()); + } + + @ParameterizedTest + @NullSource + @ValueSource(strings = {"", " "}) + public void fetchResourceIds_NoUrl_NoData(String theMissingUrl) { + // Execute + myCaptureQueriesListener.clear(); + IResourcePidStream queryStream = mySvc.fetchResourceIdStream(PREVIOUS_MILLENNIUM, TOMORROW, null, theMissingUrl); + + // Verify + List typedPids = queryStream.visitStream(Stream::toList); + + assertThat(typedPids).isEmpty(); + assertThat(myCaptureQueriesListener.logSelectQueries()).hasSize(1); + assertEquals(0, myCaptureQueriesListener.countInsertQueries()); + assertEquals(0, myCaptureQueriesListener.countUpdateQueries()); + assertEquals(0, myCaptureQueriesListener.countDeleteQueries()); + assertEquals(1, myCaptureQueriesListener.getCommitCount()); + assertEquals(0, myCaptureQueriesListener.getRollbackCount()); + } +} diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2JobMaintenanceIT.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2JobMaintenanceIT.java index 63ea2feaebb..4fd4de0fdc7 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2JobMaintenanceIT.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2JobMaintenanceIT.java @@ -14,7 +14,6 @@ import ca.uhn.fhir.batch2.model.JobDefinition; import ca.uhn.fhir.batch2.model.JobInstance; import ca.uhn.fhir.batch2.model.JobInstanceStartRequest; import ca.uhn.fhir.batch2.model.JobWorkNotificationJsonMessage; -import ca.uhn.fhir.batch2.model.StatusEnum; import ca.uhn.fhir.jpa.subscription.channel.api.ChannelConsumerSettings; import ca.uhn.fhir.jpa.subscription.channel.api.IChannelFactory; import ca.uhn.fhir.jpa.subscription.channel.impl.LinkedBlockingChannel; @@ -53,7 +52,7 
@@ import static org.assertj.core.api.Assertions.assertThat; * {@link ca.uhn.fhir.batch2.maintenance.JobInstanceProcessor#cleanupInstance()} * For chunks: - * {@link ca.uhn.fhir.jpa.batch2.JpaJobPersistenceImpl#onWorkChunkCreate} + * {@link JpaJobPersistenceImpl#onWorkChunkCreate} * {@link JpaJobPersistenceImpl#onWorkChunkDequeue(String)} * Chunk execution {@link ca.uhn.fhir.batch2.coordinator.StepExecutor#executeStep} */ diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/JpaJobPersistenceImplTest.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/JpaJobPersistenceImplTest.java index 00b280a1e21..3c1721d9a78 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/JpaJobPersistenceImplTest.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/JpaJobPersistenceImplTest.java @@ -1,6 +1,5 @@ package ca.uhn.fhir.jpa.batch2; -import static org.junit.jupiter.api.Assertions.assertFalse; import ca.uhn.fhir.batch2.api.IJobMaintenanceService; import ca.uhn.fhir.batch2.api.IJobPersistence; import ca.uhn.fhir.batch2.api.JobOperationResultJson; @@ -68,6 +67,7 @@ import java.util.stream.Collectors; import static org.assertj.core.api.Assertions.assertThat; import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertNotEquals; import static org.junit.jupiter.api.Assertions.assertNotNull; import static org.junit.jupiter.api.Assertions.assertNull; diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/binstore/FilesystemBinaryStorageSvcImplTest.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/binstore/FilesystemBinaryStorageSvcImplTest.java index c7df9de4b94..7a3dfb00071 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/binstore/FilesystemBinaryStorageSvcImplTest.java +++ 
b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/binstore/FilesystemBinaryStorageSvcImplTest.java @@ -9,6 +9,9 @@ import ca.uhn.fhir.jpa.binary.api.StoredDetails; import ca.uhn.fhir.rest.server.exceptions.PayloadTooLargeException; import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException; import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails; +import com.fasterxml.jackson.annotation.JsonInclude; +import com.fasterxml.jackson.databind.ObjectMapper; +import com.fasterxml.jackson.databind.SerializationFeature; import org.apache.commons.io.FileUtils; import org.hl7.fhir.instance.model.api.IIdType; import org.hl7.fhir.r4.model.IdType; @@ -24,6 +27,7 @@ import java.io.File; import java.io.IOException; import static org.assertj.core.api.Assertions.assertThat; +import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.fail; public class FilesystemBinaryStorageSvcImplTest { @@ -46,6 +50,30 @@ public class FilesystemBinaryStorageSvcImplTest { FileUtils.deleteDirectory(myPath); } + /** + * See https://github.com/hapifhir/hapi-fhir/pull/6134 + */ + @Test + public void testStoreAndRetrievePostMigration() throws IOException { + String blobId = "some-blob-id"; + String oldDescriptor = "{\n" + + " \"blobId\" : \"" + blobId + "\",\n" + + " \"bytes\" : 80926,\n" + + " \"contentType\" : \"application/fhir+json\",\n" + + " \"hash\" : \"f57596cefbee4c48c8493a2a57ef5f70c52a2c5afa0e48f57cfbf4f219eb0a38\",\n" + + " \"published\" : \"2024-07-20T00:12:28.187+05:30\"\n" + + "}"; + + ObjectMapper myJsonSerializer; + myJsonSerializer = new ObjectMapper(); + myJsonSerializer.setSerializationInclusion(JsonInclude.Include.NON_NULL); + myJsonSerializer.enable(SerializationFeature.INDENT_OUTPUT); + + StoredDetails storedDetails = myJsonSerializer.readValue(oldDescriptor, StoredDetails.class); + assertEquals(blobId, storedDetails.getBinaryContentId()); + + } + @Test public void testStoreAndRetrieve() throws IOException
{ IIdType id = new IdType("Patient/123"); diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/BaseHapiFhirResourceDaoTest.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/BaseHapiFhirResourceDaoTest.java index 16eb37aa33a..02fbafee38a 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/BaseHapiFhirResourceDaoTest.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/BaseHapiFhirResourceDaoTest.java @@ -1,11 +1,8 @@ package ca.uhn.fhir.jpa.dao; -import static org.assertj.core.api.Assertions.assertThatThrownBy; -import static org.junit.jupiter.api.Assertions.assertEquals; -import static org.junit.jupiter.api.Assertions.assertNotNull; import ca.uhn.fhir.batch2.api.IJobCoordinator; +import ca.uhn.fhir.batch2.api.IJobPartitionProvider; import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; -import ca.uhn.fhir.batch2.jobs.parameters.UrlPartitioner; import ca.uhn.fhir.batch2.jobs.reindex.ReindexJobParameters; import ca.uhn.fhir.batch2.model.JobInstanceStartRequest; import ca.uhn.fhir.context.FhirContext; @@ -68,9 +65,10 @@ import java.util.stream.Collectors; import java.util.stream.Stream; import static org.assertj.core.api.Assertions.assertThat; +import static org.assertj.core.api.Assertions.assertThatThrownBy; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertNotNull; import static org.junit.jupiter.api.Assertions.fail; -import static org.junit.jupiter.api.Assertions.fail; - import static org.mockito.ArgumentMatchers.any; import static org.mockito.ArgumentMatchers.anyLong; import static org.mockito.ArgumentMatchers.isNotNull; @@ -87,6 +85,9 @@ class BaseHapiFhirResourceDaoTest { @Mock private IRequestPartitionHelperSvc myRequestPartitionHelperSvc; + @Mock + private IJobPartitionProvider myJobPartitionProvider; + @Mock private IIdHelperService myIdHelperService; @@ -102,9 +103,6 @@ class BaseHapiFhirResourceDaoTest { 
@Mock private IJpaStorageResourceParser myJpaStorageResourceParser; - @Mock - private UrlPartitioner myUrlPartitioner; - @Mock private ApplicationContext myApplicationContext; @@ -267,12 +265,12 @@ class BaseHapiFhirResourceDaoTest { public void requestReindexForRelatedResources_withValidBase_includesUrlsInJobParameters() { when(myStorageSettings.isMarkResourcesForReindexingUponSearchParameterChange()).thenReturn(true); - List base = Lists.newArrayList("Patient", "Group"); + RequestPartitionId partitionId = RequestPartitionId.fromPartitionId(1); + List base = Lists.newArrayList("Patient", "Group", "Practitioner"); - when(myUrlPartitioner.partitionUrl(any(), any())).thenAnswer(i -> { - PartitionedUrl partitionedUrl = new PartitionedUrl(); - partitionedUrl.setUrl(i.getArgument(0)); - return partitionedUrl; + when(myJobPartitionProvider.getPartitionedUrls(any(), any())).thenAnswer(i -> { + List urls = i.getArgument(1); + return urls.stream().map(url -> new PartitionedUrl().setUrl(url).setRequestPartitionId(partitionId)).collect(Collectors.toList()); }); mySvc.requestReindexForRelatedResources(false, base, new ServletRequestDetails()); @@ -285,9 +283,12 @@ class BaseHapiFhirResourceDaoTest { assertNotNull(actualRequest.getParameters()); ReindexJobParameters actualParameters = actualRequest.getParameters(ReindexJobParameters.class); - assertThat(actualParameters.getPartitionedUrls()).hasSize(2); - assertEquals("Patient?", actualParameters.getPartitionedUrls().get(0).getUrl()); - assertEquals("Group?", actualParameters.getPartitionedUrls().get(1).getUrl()); + assertThat(actualParameters.getPartitionedUrls()).hasSize(base.size()); + for (int i = 0; i < base.size(); i++) { + PartitionedUrl partitionedUrl = actualParameters.getPartitionedUrls().get(i); + assertEquals(base.get(i) + "?", partitionedUrl.getUrl()); + assertEquals(partitionId, partitionedUrl.getRequestPartitionId()); + } } @Test diff --git 
a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/index/IdHelperServiceTest.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/index/IdHelperServiceTest.java index c2d763d091d..fa0870d4e67 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/index/IdHelperServiceTest.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/index/IdHelperServiceTest.java @@ -1,7 +1,5 @@ package ca.uhn.fhir.jpa.dao.index; -import static org.junit.jupiter.api.Assertions.assertEquals; -import static org.junit.jupiter.api.Assertions.assertFalse; import ca.uhn.fhir.interceptor.model.RequestPartitionId; import ca.uhn.fhir.jpa.api.config.JpaStorageSettings; import ca.uhn.fhir.jpa.dao.data.IResourceTableDao; @@ -29,6 +27,8 @@ import java.util.function.Function; import java.util.stream.Collectors; import static org.assertj.core.api.Assertions.assertThat; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertFalse; import static org.mockito.ArgumentMatchers.any; import static org.mockito.ArgumentMatchers.anyBoolean; import static org.mockito.Mockito.when; @@ -87,13 +87,17 @@ public class IdHelperServiceTest { "Patient", 123l, "RED", - new Date() + new Date(), + null, + null }; Object[] blueView = new Object[] { "Patient", 456l, "BLUE", - new Date() + new Date(), + null, + null }; // when @@ -155,11 +159,13 @@ public class IdHelperServiceTest { String resourceType = "Patient"; String resourceForcedId = "AAA"; - Object[] forcedIdView = new Object[4]; + Object[] forcedIdView = new Object[6]; forcedIdView[0] = resourceType; forcedIdView[1] = 1L; forcedIdView[2] = resourceForcedId; forcedIdView[3] = null; + forcedIdView[4] = null; + forcedIdView[5] = null; Collection testForcedIdViews = new ArrayList<>(); testForcedIdViews.add(forcedIdView); diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/BasePartitioningR4Test.java 
b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/BasePartitioningR4Test.java index 1507e9487ed..0773223b9f0 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/BasePartitioningR4Test.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/BasePartitioningR4Test.java @@ -55,7 +55,7 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest { @AfterEach public void after() { - myPartitionInterceptor.assertNoRemainingIds(); + assertNoRemainingPartitionIds(); myPartitionSettings.setIncludePartitionInSearchHashes(new PartitionSettings().isIncludePartitionInSearchHashes()); myPartitionSettings.setPartitioningEnabled(new PartitionSettings().isPartitioningEnabled()); @@ -70,6 +70,10 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest { myStorageSettings.setMatchUrlCacheEnabled(new JpaStorageSettings().getMatchUrlCache()); } + protected void assertNoRemainingPartitionIds() { + myPartitionInterceptor.assertNoRemainingIds(); + } + @Override @BeforeEach public void before() throws Exception { @@ -89,7 +93,8 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest { myPartitionId4 = 4; myPartitionInterceptor = new MyReadWriteInterceptor(); - mySrdInterceptorService.registerInterceptor(myPartitionInterceptor); + + registerPartitionInterceptor(); myPartitionConfigSvc.createPartition(new PartitionEntity().setId(myPartitionId).setName(PARTITION_1), null); myPartitionConfigSvc.createPartition(new PartitionEntity().setId(myPartitionId2).setName(PARTITION_2), null); @@ -106,6 +111,11 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest { for (int i = 1; i <= 4; i++) { myPartitionConfigSvc.getPartitionById(i); } + + } + + protected void registerPartitionInterceptor() { + mySrdInterceptorService.registerInterceptor(myPartitionInterceptor); } @Override diff --git 
a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4QueryCountTest.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4QueryCountTest.java index 9d6150deb28..52735147cd9 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4QueryCountTest.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4QueryCountTest.java @@ -18,7 +18,7 @@ import ca.uhn.fhir.jpa.api.model.DeleteMethodOutcome; import ca.uhn.fhir.jpa.api.model.ExpungeOptions; import ca.uhn.fhir.jpa.api.model.HistoryCountModeEnum; import ca.uhn.fhir.jpa.dao.data.ISearchParamPresentDao; -import ca.uhn.fhir.jpa.delete.job.ReindexTestHelper; +import ca.uhn.fhir.jpa.reindex.ReindexTestHelper; import ca.uhn.fhir.jpa.entity.TermValueSet; import ca.uhn.fhir.jpa.entity.TermValueSetPreExpansionStatusEnum; import ca.uhn.fhir.jpa.interceptor.ForceOffsetSearchModeInterceptor; @@ -595,11 +595,11 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test fail(myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(e.getOperationOutcome())); } myCaptureQueriesListener.logSelectQueriesForCurrentThread(); - assertEquals(8, myCaptureQueriesListener.getSelectQueriesForCurrentThread().size()); + assertEquals(10, myCaptureQueriesListener.getSelectQueriesForCurrentThread().size()); assertEquals(0, myCaptureQueriesListener.getUpdateQueriesForCurrentThread().size()); assertEquals(0, myCaptureQueriesListener.getInsertQueriesForCurrentThread().size()); assertEquals(0, myCaptureQueriesListener.getDeleteQueriesForCurrentThread().size()); - assertEquals(6, myCaptureQueriesListener.getCommitCount()); + assertEquals(8, myCaptureQueriesListener.getCommitCount()); // Validate again (should rely only on caches) myCaptureQueriesListener.clear(); @@ -2315,7 +2315,7 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test 
assertEquals(4, myCaptureQueriesListener.countInsertQueries()); myCaptureQueriesListener.logUpdateQueries(); assertEquals(8, myCaptureQueriesListener.countUpdateQueries()); - assertEquals(0, myCaptureQueriesListener.countDeleteQueries()); + assertEquals(4, myCaptureQueriesListener.countDeleteQueries()); /* * Third time with mass ingestion mode enabled @@ -3104,7 +3104,7 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test assertEquals(3, myCaptureQueriesListener.countSelectQueriesForCurrentThread()); assertEquals(6, myCaptureQueriesListener.countInsertQueriesForCurrentThread()); assertEquals(1, myCaptureQueriesListener.countUpdateQueriesForCurrentThread()); - assertEquals(0, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); + assertEquals(1, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); assertEquals(1, myCaptureQueriesListener.countCommits()); assertEquals(0, myCaptureQueriesListener.countRollbacks()); @@ -3496,8 +3496,8 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test assertEquals(6, myCaptureQueriesListener.countSelectQueriesForCurrentThread()); myCaptureQueriesListener.logInsertQueries(); assertEquals(4, myCaptureQueriesListener.countInsertQueriesForCurrentThread()); - assertEquals(6, myCaptureQueriesListener.countUpdateQueriesForCurrentThread()); - assertEquals(0, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); + assertEquals(7, myCaptureQueriesListener.countUpdateQueriesForCurrentThread()); + assertEquals(2, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); ourLog.info(myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(outcome)); IdType patientId = new IdType(outcome.getEntry().get(1).getResponse().getLocation()); @@ -3579,8 +3579,8 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test myCaptureQueriesListener.logSelectQueriesForCurrentThread(); assertEquals(6, 
myCaptureQueriesListener.countSelectQueriesForCurrentThread()); assertEquals(2, myCaptureQueriesListener.countInsertQueriesForCurrentThread()); - assertEquals(5, myCaptureQueriesListener.countUpdateQueriesForCurrentThread()); - assertEquals(0, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); + assertEquals(6, myCaptureQueriesListener.countUpdateQueriesForCurrentThread()); + assertEquals(2, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); } diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4Test.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4Test.java index 9a07abb0cdc..c220b635433 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4Test.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4Test.java @@ -1,12 +1,10 @@ package ca.uhn.fhir.jpa.dao.r4; -import static org.junit.jupiter.api.Assertions.assertNotNull; -import static org.junit.jupiter.api.Assertions.assertNull; -import static org.junit.jupiter.api.Assertions.assertFalse; import ca.uhn.fhir.i18n.Msg; import ca.uhn.fhir.interceptor.model.RequestPartitionId; import ca.uhn.fhir.jpa.api.config.JpaStorageSettings; import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao; +import ca.uhn.fhir.jpa.api.dao.IFhirSystemDao; import ca.uhn.fhir.jpa.api.model.HistoryCountModeEnum; import ca.uhn.fhir.jpa.api.pid.StreamTemplate; import ca.uhn.fhir.jpa.dao.BaseHapiFhirDao; @@ -33,11 +31,13 @@ import ca.uhn.fhir.model.api.Include; import ca.uhn.fhir.model.api.ResourceMetadataKeyEnum; import ca.uhn.fhir.model.valueset.BundleEntrySearchModeEnum; import ca.uhn.fhir.model.valueset.BundleEntryTransactionMethodEnum; +import ca.uhn.fhir.parser.IParser; import ca.uhn.fhir.rest.api.Constants; import ca.uhn.fhir.rest.api.MethodOutcome; import ca.uhn.fhir.rest.api.SortOrderEnum; import ca.uhn.fhir.rest.api.SortSpec; import 
ca.uhn.fhir.rest.api.server.IBundleProvider; +import ca.uhn.fhir.rest.api.server.RequestDetails; import ca.uhn.fhir.rest.api.server.SystemRequestDetails; import ca.uhn.fhir.rest.api.server.storage.IResourcePersistentId; import ca.uhn.fhir.rest.param.DateParam; @@ -114,8 +114,10 @@ import org.hl7.fhir.r4.model.Reference; import org.hl7.fhir.r4.model.SimpleQuantity; import org.hl7.fhir.r4.model.StringType; import org.hl7.fhir.r4.model.StructureDefinition; +import org.hl7.fhir.r4.model.Task; import org.hl7.fhir.r4.model.Timing; import org.hl7.fhir.r4.model.UriType; +import org.intellij.lang.annotations.Language; import org.junit.jupiter.api.AfterEach; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Disabled; @@ -142,6 +144,7 @@ import java.util.concurrent.ExecutionException; import java.util.concurrent.ExecutorService; import java.util.concurrent.Executors; import java.util.concurrent.Future; +import java.util.function.Function; import java.util.stream.Collectors; import java.util.stream.IntStream; @@ -150,9 +153,12 @@ import static ca.uhn.fhir.rest.api.Constants.PARAM_HAS; import static org.apache.commons.lang3.StringUtils.countMatches; import static org.apache.commons.lang3.StringUtils.defaultString; import static org.assertj.core.api.Assertions.assertThat; -import static org.junit.jupiter.api.Assertions.fail; import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.assertNull; import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.junit.jupiter.api.Assertions.fail; @SuppressWarnings({"unchecked", "deprecation", "Duplicates"}) @@ -4316,6 +4322,158 @@ public class FhirResourceDaoR4Test extends BaseJpaR4Test { assertEquals(ids, createdIds); } + @Test + public void 
bundle1CreatesResourceByCondition_bundle2UpdatesExistingResourceToNotMatchConditionThenCreatesBySameCondition_shouldPass() { + // setup + IParser parser = myFhirContext.newJsonParser(); + String idToReplace = "/Task/11852"; // in bundle 2 + String identifierSystem = "https://tempuri.org"; + Bundle bundle1; + Bundle bundle2; + { + @Language("JSON") + String bundleStr = """ + { + "resourceType": "Bundle", + "type": "transaction", + "entry": [ + { + "fullUrl": "urn:uuid:1fee7dea-c2a8-47b1-80a9-a681457cc44f", + "resource": { + "resourceType": "Task", + "identifier": [ + { + "system": "https://tempuri.org", + "value": "t1" + } + ] + }, + "request": { + "method": "POST", + "url": "/Task", + "ifNoneExist": "identifier=https://tempuri.org|t1" + } + } + ] + } + """; + bundle1 = parser.parseResource(Bundle.class, bundleStr); + } + { + @Language("JSON") + String bundleStr = """ + { + "resourceType": "Bundle", + "type": "transaction", + "entry": [ + { + "fullUrl": "urn:uuid:1fee7dea-c2a8-47b1-80a9-a681457cc44f", + "resource": { + "resourceType": "Task", + "identifier": [ + { + "system": "https://tempuri.org", + "value": "t1" + } + ] + }, + "request": { + "method": "POST", + "url": "/Task", + "ifNoneExist": "identifier=https://tempuri.org|t1" + } + }, + { + "fullUrl": "http://localhost:8000/Task/11852", + "resource": { + "resourceType": "Task", + "identifier": [ + { + "system": "https://tempuri.org", + "value": "t2" + } + ] + }, + "request": { + "method": "PUT", + "url": "/Task/11852" + } + } + ] + } + """; + bundle2 = parser.parseResource(Bundle.class, bundleStr); + } + + IFhirSystemDao systemDao = myDaoRegistry.getSystemDao(); + RequestDetails reqDets = new SystemRequestDetails(); + + Bundle createdBundle; + String id; + { + // create bundle1 + createdBundle = systemDao.transaction(reqDets, bundle1); + assertNotNull(createdBundle); + assertFalse(createdBundle.getEntry().isEmpty()); + Optional entry = createdBundle.getEntry() + .stream().filter(e -> e.getResponse() != null && 
e.getResponse().getStatus().contains("201 Created")) + .findFirst(); + assertTrue(entry.isPresent()); + String idAndVersion = entry.get().getResponse().getLocation(); + + if (idAndVersion.contains("/_history")) { + IIdType idt = new IdType(idAndVersion); + id = idt.toVersionless().getValue(); + } else { + id = idAndVersion; + } + + // verify task creation + Task task = getTaskForId(id, identifierSystem); + assertTrue(task.getIdentifier().stream().anyMatch(i -> i.getSystem().equals(identifierSystem) && i.getValue().equals("t1"))); + } + + // update second bundle to use the already-saved Task's id + Optional<BundleEntryComponent> entryComponent = bundle2.getEntry().stream() + .filter(e -> e.getRequest().getUrl().equals(idToReplace)) + .findFirst(); + assertTrue(entryComponent.isPresent()); + BundleEntryComponent entry = entryComponent.get(); + entry.getRequest().setUrl(id); + entry.setFullUrl(entry.getFullUrl().replace(idToReplace, id)); + + { + // post second bundle first time + createdBundle = systemDao.transaction(reqDets, bundle2); + assertNotNull(createdBundle); + // check that the Task is recognized, but not recreated + assertFalse(createdBundle.getEntry().stream().anyMatch(e -> e.getResponse() != null && e.getResponse().getStatus().contains("201"))); + // but the Task should have been updated + // (changing it to not match the identifier anymore) + assertTrue(createdBundle.getEntry().stream().anyMatch(e -> e.getResponse() != null && e.getResponse().getLocation().equals(id + "/_history/2"))); + + // verify task update + Task task = getTaskForId(id, identifierSystem); + assertTrue(task.getIdentifier().stream().anyMatch(i -> i.getSystem().equals(identifierSystem) && i.getValue().equals("t2"))); + + // post again; should succeed (not throw) + createdBundle = systemDao.transaction(reqDets, bundle2); + assertNotNull(createdBundle); + // should have created the second task + assertTrue(createdBundle.getEntry().stream() + .anyMatch(e -> e.getResponse() != null &&
e.getResponse().getStatus().contains("201 Created"))); + } + } + + private Task getTaskForId(String theId, String theIdentifier) { + Task task = myTaskDao.read(new IdType(theId), new SystemRequestDetails()); + assertNotNull(task); + assertFalse(task.getIdentifier().isEmpty()); + assertTrue(task.getIdentifier().stream().anyMatch(i -> i.getSystem().equals(theIdentifier))); + + return task; + } + public static void assertConflictException(String theResourceType, ResourceVersionConflictException e) { assertThat(e.getMessage()).matches(Msg.code(550) + Msg.code(515) + "Unable to delete [a-zA-Z]+/[0-9]+ because at least one resource has a reference to this resource. First reference found was resource " + theResourceType + "/[0-9]+ in path [a-zA-Z]+.[a-zA-Z]+"); diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitionedStrictTransactionR4Test.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitionedStrictTransactionR4Test.java new file mode 100644 index 00000000000..6c8f9f09d8d --- /dev/null +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitionedStrictTransactionR4Test.java @@ -0,0 +1,211 @@ +package ca.uhn.fhir.jpa.dao.r4; + +import ca.uhn.fhir.interceptor.api.Hook; +import ca.uhn.fhir.interceptor.api.Pointcut; +import ca.uhn.fhir.interceptor.model.ReadPartitionIdRequestDetails; +import ca.uhn.fhir.interceptor.model.RequestPartitionId; +import ca.uhn.fhir.jpa.dao.tx.HapiTransactionService; +import ca.uhn.fhir.rest.api.Constants; +import ca.uhn.fhir.rest.server.exceptions.InternalErrorException; +import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException; +import ca.uhn.fhir.rest.server.exceptions.ResourceGoneException; +import ca.uhn.fhir.util.BundleBuilder; +import jakarta.annotation.Nonnull; +import org.hl7.fhir.instance.model.api.IBaseResource; +import org.hl7.fhir.instance.model.api.IIdType; +import org.hl7.fhir.r4.model.Bundle; +import org.hl7.fhir.r4.model.CodeType; +import 
org.hl7.fhir.r4.model.IdType; +import org.hl7.fhir.r4.model.Observation; +import org.hl7.fhir.r4.model.Parameters; +import org.hl7.fhir.r4.model.Patient; +import org.hl7.fhir.r4.model.StringType; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.CsvSource; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.transaction.annotation.Propagation; + +import static org.assertj.core.api.Assertions.assertThat; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.junit.jupiter.api.Assertions.assertTrue; + +public class PartitionedStrictTransactionR4Test extends BasePartitioningR4Test { + + @Autowired + private HapiTransactionService myTransactionService; + + @Override + public void before() throws Exception { + super.before(); + myTransactionService.setTransactionPropagationWhenChangingPartitions(Propagation.REQUIRES_NEW); + } + + @Override + public void after() { + super.after(); + myTransactionService.setTransactionPropagationWhenChangingPartitions(HapiTransactionService.DEFAULT_TRANSACTION_PROPAGATION_WHEN_CHANGING_PARTITIONS); + myInterceptorRegistry.unregisterInterceptorsIf(t -> t instanceof MyPartitionSelectorInterceptor); + } + + /** + * We manually register {@link MyPartitionSelectorInterceptor} for this test class + * as the partition interceptor + */ + @Override + protected void registerPartitionInterceptor() { + myInterceptorRegistry.registerInterceptor(new MyPartitionSelectorInterceptor()); + } + + @Override + protected void assertNoRemainingPartitionIds() { + // We don't use the superclass to manage partition IDs + } + + + @ParameterizedTest + @CsvSource({ + "batch , 2", + "transaction , 1", + }) + public void testSinglePartitionCreate(String theBundleType, int theExpectedCommitCount) { + 
BundleBuilder bb = new BundleBuilder(myFhirContext); + bb.addTransactionCreateEntry(newPatient()); + bb.addTransactionCreateEntry(newPatient()); + bb.setType(theBundleType); + Bundle input = bb.getBundleTyped(); + + // Test + myCaptureQueriesListener.clear(); + Bundle output = mySystemDao.transaction(mySrd, input); + + // Verify + assertEquals(theExpectedCommitCount, myCaptureQueriesListener.countCommits()); + assertEquals(0, myCaptureQueriesListener.countRollbacks()); + IdType id = new IdType(output.getEntry().get(0).getResponse().getLocation()); + Patient actualPatient = myPatientDao.read(id, mySrd); + RequestPartitionId actualPartitionId = (RequestPartitionId) actualPatient.getUserData(Constants.RESOURCE_PARTITION_ID); + assertThat(actualPartitionId.getPartitionIds()).containsExactly(myPartitionId); + } + + + @Test + public void testSinglePartitionDelete() { + createPatient(withId("A"), withActiveTrue()); + + BundleBuilder bb = new BundleBuilder(myFhirContext); + bb.addTransactionDeleteEntry(new IdType("Patient/A")); + Bundle input = bb.getBundleTyped(); + + // Test + myCaptureQueriesListener.clear(); + Bundle output = mySystemDao.transaction(mySrd, input); + + // Verify + assertEquals(1, myCaptureQueriesListener.countCommits()); + assertEquals(0, myCaptureQueriesListener.countRollbacks()); + IdType id = new IdType(output.getEntry().get(0).getResponse().getLocation()); + assertEquals("2", id.getVersionIdPart()); + + assertThrows(ResourceGoneException.class, () -> myPatientDao.read(id.toUnqualifiedVersionless(), mySrd)); + } + + @Test + public void testSinglePartitionPatch() { + IIdType id = createPatient(withId("A"), withActiveTrue()); + assertTrue(myPatientDao.read(id.toUnqualifiedVersionless(), mySrd).getActive()); + + Parameters patch = new Parameters(); + Parameters.ParametersParameterComponent operation = patch.addParameter(); + operation.setName("operation"); + operation + .addPart() + .setName("type") + .setValue(new CodeType("replace")); + operation + 
.addPart() + .setName("path") + .setValue(new StringType("Patient.active")); + operation + .addPart() + .setName("name") + .setValue(new CodeType("false")); + + BundleBuilder bb = new BundleBuilder(myFhirContext); + bb.addTransactionFhirPatchEntry(new IdType("Patient/A"), patch); + Bundle input = bb.getBundleTyped(); + + + // Test + myCaptureQueriesListener.clear(); + Bundle output = mySystemDao.transaction(mySrd, input); + + // Verify + assertEquals(1, myCaptureQueriesListener.countCommits()); + assertEquals(0, myCaptureQueriesListener.countRollbacks()); + id = new IdType(output.getEntry().get(0).getResponse().getLocation()); + assertEquals("2", id.getVersionIdPart()); + + assertFalse(myPatientDao.read(id.toUnqualifiedVersionless(), mySrd).getActive()); + } + + @Test + public void testMultipleNonMatchingPartitions() { + BundleBuilder bb = new BundleBuilder(myFhirContext); + bb.addTransactionCreateEntry(newPatient()); + bb.addTransactionCreateEntry(newObservation()); + Bundle input = bb.getBundleTyped(); + + // Test + var e = assertThrows(InvalidRequestException.class, () -> mySystemDao.transaction(mySrd, input)); + assertThat(e.getMessage()).contains("HAPI-2541: Can not process transaction with 2 entries: Entries require access to multiple/conflicting partitions"); + + } + + private static @Nonnull Patient newPatient() { + Patient patient = new Patient(); + patient.setActive(true); + return patient; + } + + private static @Nonnull Observation newObservation() { + Observation observation = new Observation(); + observation.setStatus(Observation.ObservationStatus.FINAL); + return observation; + } + + + public class MyPartitionSelectorInterceptor { + + @Hook(Pointcut.STORAGE_PARTITION_IDENTIFY_CREATE) + public RequestPartitionId selectPartitionCreate(IBaseResource theResource) { + String resourceType = myFhirContext.getResourceType(theResource); + return selectPartition(resourceType); + } + + @Hook(Pointcut.STORAGE_PARTITION_IDENTIFY_READ) + public RequestPartitionId 
selectPartitionRead(ReadPartitionIdRequestDetails theDetails) { + return selectPartition(theDetails.getResourceType()); + } + + @Nonnull + private RequestPartitionId selectPartition(String theResourceType) { + switch (theResourceType) { + case "Patient": + return RequestPartitionId.fromPartitionId(myPartitionId); + case "Observation": + return RequestPartitionId.fromPartitionId(myPartitionId2); + case "SearchParameter": + case "Organization": + return RequestPartitionId.defaultPartition(); + default: + throw new InternalErrorException("Don't know how to handle resource type: " + theResourceType); + } + } + + } + + +} diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitioningNonNullDefaultPartitionR4Test.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitioningNonNullDefaultPartitionR4Test.java index 21428684168..66857177717 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitioningNonNullDefaultPartitionR4Test.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitioningNonNullDefaultPartitionR4Test.java @@ -47,8 +47,7 @@ public class PartitioningNonNullDefaultPartitionR4Test extends BasePartitioningR addCreateDefaultPartition(); // we need two read partition accesses for when the creation of the SP triggers a reindex of Patient addReadDefaultPartition(); // one for search param validation - addReadDefaultPartition(); // one to rewrite the resource url - addReadDefaultPartition(); // and one for the job request itself + addReadDefaultPartition(); // and one for the reindex job SearchParameter sp = new SearchParameter(); sp.addBase("Patient"); sp.setStatus(Enumerations.PublicationStatus.ACTIVE); @@ -80,8 +79,7 @@ public class PartitioningNonNullDefaultPartitionR4Test extends BasePartitioningR addCreateDefaultPartition(); // we need two read partition accesses for when the creation of the SP triggers a reindex of Patient addReadDefaultPartition(); // 
one for search param validation - addReadDefaultPartition(); // one to rewrite the resource url - addReadDefaultPartition(); // and one for the job request itself + addReadDefaultPartition(); // and one for the reindex job SearchParameter sp = new SearchParameter(); sp.setId("SearchParameter/A"); sp.addBase("Patient"); diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitioningSqlR4Test.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitioningSqlR4Test.java index 69b55c35813..9eddb6767e8 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitioningSqlR4Test.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/PartitioningSqlR4Test.java @@ -3,6 +3,7 @@ package ca.uhn.fhir.jpa.dao.r4; import ca.uhn.fhir.batch2.api.IJobCoordinator; import ca.uhn.fhir.batch2.jobs.expunge.DeleteExpungeAppCtx; import ca.uhn.fhir.batch2.jobs.expunge.DeleteExpungeJobParameters; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; import ca.uhn.fhir.batch2.model.JobInstance; import ca.uhn.fhir.batch2.model.JobInstanceStartRequest; import ca.uhn.fhir.context.RuntimeResourceDefinition; @@ -90,13 +91,11 @@ import java.util.stream.Collectors; import static ca.uhn.fhir.util.TestUtil.sleepAtLeast; import static org.apache.commons.lang3.StringUtils.countMatches; import static org.assertj.core.api.Assertions.assertThat; -import static org.junit.jupiter.api.Assertions.fail; import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertNotEquals; import static org.junit.jupiter.api.Assertions.assertNotNull; import static org.junit.jupiter.api.Assertions.assertNull; import static org.junit.jupiter.api.Assertions.fail; - import static org.mockito.ArgumentMatchers.eq; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.times; @@ -173,7 +172,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test { // 
Look up the referenced subject/patient String sql = selectQueries.get(0).getSql(true, false).toLowerCase(); assertThat(sql).contains(" from hfj_resource "); - assertEquals(0, StringUtils.countMatches(selectQueries.get(0).getSql(true, false).toLowerCase(), "partition")); + assertEquals(2, StringUtils.countMatches(selectQueries.get(0).getSql(true, false).toLowerCase(), "partition")); runInTransaction(() -> { List resLinks = myResourceLinkDao.findAll(); @@ -181,6 +180,8 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test { assertEquals(2, resLinks.size()); assertEquals(obsId.getIdPartAsLong(), resLinks.get(0).getSourceResourcePid()); assertEquals(patientId.getIdPartAsLong(), resLinks.get(0).getTargetResourcePid()); + assertEquals(myPartitionId, resLinks.get(0).getTargetResourcePartitionId().getPartitionId()); + assertLocalDateFromDbMatches(myPartitionDate, resLinks.get(0).getTargetResourcePartitionId().getPartitionDate()); }); } @@ -465,6 +466,8 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test { assertEquals(1, resourceLinks.size()); assertEquals(myPartitionId, resourceLinks.get(0).getPartitionId().getPartitionId().intValue()); assertLocalDateFromDbMatches(myPartitionDate, resourceLinks.get(0).getPartitionId().getPartitionDate()); + assertEquals(myPartitionId, resourceLinks.get(0).getTargetResourcePartitionId().getPartitionId().intValue()); + assertLocalDateFromDbMatches(myPartitionDate, resourceLinks.get(0).getTargetResourcePartitionId().getPartitionDate()); // HFJ_RES_PARAM_PRESENT List presents = mySearchParamPresentDao.findAllForResource(resourceTable); @@ -658,6 +661,8 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test { addCreatePartition(myPartitionId, myPartitionDate); addCreatePartition(myPartitionId, myPartitionDate); addCreatePartition(myPartitionId, myPartitionDate); + addCreatePartition(myPartitionId, myPartitionDate); + addCreatePartition(myPartitionId, myPartitionDate); Bundle input = new Bundle(); 
input.setType(Bundle.BundleType.TRANSACTION); @@ -713,8 +718,10 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test { assertEquals(1, myObservationDao.search(SearchParameterMap.newSynchronous(), mySrd).size()); DeleteExpungeJobParameters jobParameters = new DeleteExpungeJobParameters(); - jobParameters.addUrl("Patient?_id=" + p1.getIdPart() + "," + p2.getIdPart()); - jobParameters.setRequestPartitionId(RequestPartitionId.fromPartitionId(myPartitionId)); + PartitionedUrl partitionedUrl = new PartitionedUrl() + .setUrl("Patient?_id=" + p1.getIdPart() + "," + p2.getIdPart()) + .setRequestPartitionId(RequestPartitionId.fromPartitionId(myPartitionId)); + jobParameters.addPartitionedUrl(partitionedUrl); jobParameters.setCascade(true); JobInstanceStartRequest startRequest = new JobInstanceStartRequest(); @@ -1375,7 +1382,8 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test { // Only the read columns should be used, no criteria use partition assertThat(searchSql).as(searchSql).contains("PARTITION_ID IN ('1')"); - assertThat(StringUtils.countMatches(searchSql, "PARTITION_ID")).as(searchSql).isEqualTo(1); + assertThat(StringUtils.countMatches(searchSql, "PARTITION_ID,")).as(searchSql).isEqualTo(1); + assertThat(StringUtils.countMatches(searchSql, "PARTITION_DATE")).as(searchSql).isEqualTo(1); } // Read in null Partition @@ -1428,7 +1436,8 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test { String searchSql = myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(0).getSql(true, false).toUpperCase(); ourLog.info("Search SQL:\n{}", searchSql); assertThat(searchSql).as(searchSql).contains("PARTITION_ID IN ('1')"); - assertThat(StringUtils.countMatches(searchSql, "PARTITION_ID")).as(searchSql).isEqualTo(1); + assertThat(StringUtils.countMatches(searchSql, "PARTITION_ID,")).as(searchSql).isEqualTo(1); + assertThat(StringUtils.countMatches(searchSql, "PARTITION_DATE")).as(searchSql).isEqualTo(1); // Second SQL performs 
the search searchSql = myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(1).getSql(true, false).toUpperCase(); @@ -2086,9 +2095,11 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test { myCaptureQueriesListener.logSelectQueriesForCurrentThread(1); assertThat(outcome.getResources(0, 1)).hasSize(1); - String searchSql = myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(0).getSql(true, true); + String searchSql = myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(0).getSql(true, false); ourLog.info("Search SQL:\n{}", searchSql); - assertThat(StringUtils.countMatches(searchSql, "PARTITION_ID")).as(searchSql).isEqualTo(1); + assertThat(searchSql).as(searchSql).contains("PARTITION_ID in ('1')"); + assertThat(StringUtils.countMatches(searchSql, "PARTITION_ID,")).as(searchSql).isEqualTo(1); + assertThat(StringUtils.countMatches(searchSql, "PARTITION_DATE")).as(searchSql).isEqualTo(1); } @@ -2875,7 +2886,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test { ourLog.info("About to start transaction"); - for (int i = 0; i < 40; i++) { + for (int i = 0; i < 60; i++) { addCreatePartition(1, null); } @@ -2908,7 +2919,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test { assertEquals(4, myCaptureQueriesListener.countInsertQueriesForCurrentThread()); myCaptureQueriesListener.logUpdateQueriesForCurrentThread(); assertEquals(8, myCaptureQueriesListener.countUpdateQueriesForCurrentThread()); - assertEquals(0, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); + assertEquals(4, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); /* * Third time with mass ingestion mode enabled @@ -2987,8 +2998,8 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test { assertEquals(26, myCaptureQueriesListener.countSelectQueriesForCurrentThread()); assertEquals(0, myCaptureQueriesListener.countInsertQueriesForCurrentThread()); - assertEquals(0, 
myCaptureQueriesListener.countUpdateQueriesForCurrentThread()); - assertEquals(0, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); + assertEquals(326, myCaptureQueriesListener.countUpdateQueriesForCurrentThread()); + assertEquals(326, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); assertEquals(1, myCaptureQueriesListener.countCommits()); assertEquals(0, myCaptureQueriesListener.countRollbacks()); diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/tx/ReindexStepTest.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/tx/ReindexStepTest.java index bc178d513a4..69ed1e1507c 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/tx/ReindexStepTest.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/tx/ReindexStepTest.java @@ -4,6 +4,7 @@ import static org.junit.jupiter.api.Assertions.assertEquals; import ca.uhn.fhir.batch2.api.IJobDataSink; import ca.uhn.fhir.batch2.api.VoidModel; import ca.uhn.fhir.batch2.jobs.chunk.ResourceIdListWorkChunkJson; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; import ca.uhn.fhir.batch2.jobs.reindex.ReindexJobParameters; import ca.uhn.fhir.batch2.jobs.reindex.ReindexStep; import ca.uhn.fhir.interceptor.model.RequestPartitionId; @@ -45,7 +46,7 @@ public class ReindexStepTest { RequestPartitionId partitionId = RequestPartitionId.fromPartitionId(expectedPartitionId); ResourceIdListWorkChunkJson data = new ResourceIdListWorkChunkJson(List.of(), partitionId); ReindexJobParameters reindexJobParameters = new ReindexJobParameters(); - reindexJobParameters.setRequestPartitionId(partitionId); + reindexJobParameters.addPartitionedUrl(new PartitionedUrl().setRequestPartitionId(partitionId)); when(myHapiTransactionService.withRequest(any())).thenCallRealMethod(); when(myHapiTransactionService.buildExecutionBuilder(any())).thenCallRealMethod(); diff --git 
a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/interceptor/ResourceTypePartitionInterceptorR4Test.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/interceptor/ResourceTypePartitionInterceptorR4Test.java new file mode 100644 index 00000000000..1771bdaec84 --- /dev/null +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/interceptor/ResourceTypePartitionInterceptorR4Test.java @@ -0,0 +1,88 @@ +package ca.uhn.fhir.jpa.interceptor; + +import ca.uhn.fhir.batch2.model.StatusEnum; +import ca.uhn.fhir.interceptor.api.Hook; +import ca.uhn.fhir.interceptor.api.Pointcut; +import ca.uhn.fhir.interceptor.model.ReadPartitionIdRequestDetails; +import ca.uhn.fhir.interceptor.model.RequestPartitionId; +import ca.uhn.fhir.jpa.entity.PartitionEntity; +import ca.uhn.fhir.jpa.model.config.PartitionSettings; +import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test; +import ca.uhn.fhir.rest.server.provider.ProviderConstants; +import jakarta.annotation.Nonnull; +import org.hl7.fhir.instance.model.api.IBaseResource; +import org.hl7.fhir.instance.model.api.IIdType; +import org.hl7.fhir.r4.model.Parameters; +import org.hl7.fhir.r4.model.StringType; +import org.junit.jupiter.api.AfterEach; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.CsvSource; + +import static org.assertj.core.api.Assertions.assertThat; + +public class ResourceTypePartitionInterceptorR4Test extends BaseResourceProviderR4Test { + private final MyPartitionSelectorInterceptor myPartitionInterceptor = new MyPartitionSelectorInterceptor(); + + @Override + @BeforeEach + public void before() { + myPartitionSettings.setPartitioningEnabled(true); + myPartitionSettings.setAllowReferencesAcrossPartitions(PartitionSettings.CrossPartitionReferenceMode.ALLOWED_UNQUALIFIED); + myInterceptorRegistry.registerInterceptor(myPartitionInterceptor); + + myPartitionConfigSvc.createPartition(new 
PartitionEntity().setId(1).setName("PART-1"), null); + myPartitionConfigSvc.createPartition(new PartitionEntity().setId(2).setName("PART-2"), null); + myPartitionConfigSvc.createPartition(new PartitionEntity().setId(3).setName("PART-3"), null); + } + + @AfterEach + public void after() { + myPartitionSettings.setPartitioningEnabled(new PartitionSettings().isPartitioningEnabled()); + myPartitionSettings.setAllowReferencesAcrossPartitions(new PartitionSettings().getAllowReferencesAcrossPartitions()); + myInterceptorRegistry.unregisterInterceptor(myPartitionInterceptor); + } + + @ParameterizedTest + @CsvSource(value = {"Patient?, 1", "Observation?, 1", ",3"}) + public void reindex_withUrl_completesSuccessfully(String theUrl, int theExpectedIndexedResourceCount) { + IIdType patientId = createPatient(withGiven("John")); + createObservation(withSubject(patientId)); + createEncounter(); + + Parameters input = new Parameters(); + input.setParameter(ProviderConstants.OPERATION_REINDEX_PARAM_URL, theUrl); + Parameters response = myClient + .operation() + .onServer() + .named(ProviderConstants.OPERATION_REINDEX) + .withParameters(input) + .execute(); + + String jobId = ((StringType)response.getParameterValue(ProviderConstants.OPERATION_REINDEX_RESPONSE_JOB_ID)).getValue(); + myBatch2JobHelper.awaitJobHasStatus(jobId, StatusEnum.COMPLETED); + assertThat(myBatch2JobHelper.getCombinedRecordsProcessed(jobId)).isEqualTo(theExpectedIndexedResourceCount); + } + + public class MyPartitionSelectorInterceptor { + @Hook(Pointcut.STORAGE_PARTITION_IDENTIFY_CREATE) + public RequestPartitionId selectPartitionCreate(IBaseResource theResource) { + return selectPartition(myFhirContext.getResourceType(theResource)); + } + + @Hook(Pointcut.STORAGE_PARTITION_IDENTIFY_READ) + public RequestPartitionId selectPartitionRead(ReadPartitionIdRequestDetails theDetails) { + return selectPartition(theDetails.getResourceType()); + } + + @Nonnull + private static RequestPartitionId selectPartition(String 
resourceType) { + return switch (resourceType) { + case "Patient" -> RequestPartitionId.fromPartitionId(1); + case "Observation" -> RequestPartitionId.fromPartitionId(2); + default -> RequestPartitionId.fromPartitionId(3); + }; + } + + } +} diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/partition/PartitionedSubscriptionTriggeringR4Test.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/partition/PartitionedSubscriptionTriggeringR4Test.java index 16b3a938b87..e345d711db2 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/partition/PartitionedSubscriptionTriggeringR4Test.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/partition/PartitionedSubscriptionTriggeringR4Test.java @@ -11,9 +11,9 @@ import ca.uhn.fhir.jpa.api.model.ExpungeOptions; import ca.uhn.fhir.jpa.dao.r4.BasePartitioningR4Test; import ca.uhn.fhir.jpa.entity.PartitionEntity; import ca.uhn.fhir.jpa.model.config.PartitionSettings; +import ca.uhn.fhir.jpa.model.config.SubscriptionSettings; import ca.uhn.fhir.jpa.subscription.BaseSubscriptionsR4Test; import ca.uhn.fhir.jpa.subscription.resthook.RestHookTestR4Test; -import ca.uhn.fhir.jpa.model.config.SubscriptionSettings; import ca.uhn.fhir.jpa.subscription.triggering.ISubscriptionTriggeringSvc; import ca.uhn.fhir.jpa.subscription.triggering.SubscriptionTriggeringSvcImpl; import ca.uhn.fhir.jpa.test.util.StoppableSubscriptionDeliveringRestHookSubscriber; @@ -26,7 +26,12 @@ import jakarta.servlet.ServletException; import org.awaitility.core.ConditionTimeoutException; import org.hl7.fhir.instance.model.api.IIdType; import org.hl7.fhir.instance.model.api.IPrimitiveType; -import org.hl7.fhir.r4.model.*; +import org.hl7.fhir.r4.model.BooleanType; +import org.hl7.fhir.r4.model.Observation; +import org.hl7.fhir.r4.model.Parameters; +import org.hl7.fhir.r4.model.Patient; +import org.hl7.fhir.r4.model.Resource; +import org.hl7.fhir.r4.model.Subscription; import 
org.junit.jupiter.api.AfterEach; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; @@ -43,9 +48,9 @@ import java.util.List; import java.util.stream.Stream; import static org.assertj.core.api.Assertions.assertThat; -import static org.junit.jupiter.api.Assertions.fail; import static org.awaitility.Awaitility.await; import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.fail; public class PartitionedSubscriptionTriggeringR4Test extends BaseSubscriptionsR4Test { private static final Logger ourLog = LoggerFactory.getLogger(RestHookTestR4Test.class); @@ -60,6 +65,7 @@ public class PartitionedSubscriptionTriggeringR4Test extends BaseSubscriptionsR4 public static final RequestPartitionId REQ_PART_1 = RequestPartitionId.fromPartitionNames(PARTITION_1); static final String PARTITION_2 = "PART-2"; public static final RequestPartitionId REQ_PART_2 = RequestPartitionId.fromPartitionNames(PARTITION_2); + public static final RequestPartitionId REQ_PART_DEFAULT = RequestPartitionId.defaultPartition(); protected MyReadWriteInterceptor myPartitionInterceptor; protected LocalDate myPartitionDate; @@ -127,7 +133,7 @@ public class PartitionedSubscriptionTriggeringR4Test extends BaseSubscriptionsR4 String criteria1 = "Observation?code=SNOMED-CT|" + code + "&_format=xml"; Subscription subscription = newSubscription(criteria1, payload); - assertEquals(mySrdInterceptorService.getAllRegisteredInterceptors().size(), 1); + assertThat(mySrdInterceptorService.getAllRegisteredInterceptors()).hasSize(1); myDaoRegistry.getResourceDao("Subscription").create(subscription, mySrd); @@ -150,13 +156,13 @@ public class PartitionedSubscriptionTriggeringR4Test extends BaseSubscriptionsR4 mySubscriptionSettings.setCrossPartitionSubscriptionEnabled(theIsCrossPartitionEnabled); String payload = "application/fhir+json"; - String code = "1000000050"; String criteria1 = "Patient?active=true"; Subscription subscription = 
newSubscription(criteria1, payload); + subscription.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.r4.model.BooleanType().setValue(true)); - assertEquals(mySrdInterceptorService.getAllRegisteredInterceptors().size(), 1); + assertThat(mySrdInterceptorService.getAllRegisteredInterceptors()).hasSize(1); - myDaoRegistry.getResourceDao("Subscription").create(subscription, mySrd); + myDaoRegistry.getResourceDao("Subscription").create(subscription, new SystemRequestDetails().setRequestPartitionId(RequestPartitionId.defaultPartition())); waitForActivatedSubscriptionCount(1); @@ -184,10 +190,8 @@ public class PartitionedSubscriptionTriggeringR4Test extends BaseSubscriptionsR4 } } - @ParameterizedTest - @ValueSource(booleans = {true, false}) - public void testManualTriggeredSubscriptionDoesNotCheckOutsideOfPartition(boolean theIsCrossPartitionEnabled) throws Exception { - mySubscriptionSettings.setCrossPartitionSubscriptionEnabled(theIsCrossPartitionEnabled); + @Test + public void testManualTriggeredSubscriptionDoesNotMatchOnAllPartitions() throws Exception { String payload = "application/fhir+json"; String code = "1000000050"; String criteria1 = "Observation?code=SNOMED-CT|" + code + "&_format=xml"; @@ -201,8 +205,9 @@ public class PartitionedSubscriptionTriggeringR4Test extends BaseSubscriptionsR4 myPartitionInterceptor.setRequestPartitionId(REQ_PART_1); IIdType observationIdPartitionOne = observation.create(createBaseObservation(code, "SNOMED-CT"), mySrd).getId(); - //Given: We create a subscrioption on Partition 1 - IIdType subscriptionId = myDaoRegistry.getResourceDao("Subscription").create(newSubscription(criteria1, payload), mySrd).getId(); + //Given: We create a subscription on partition 1 + Subscription subscription = newSubscription(criteria1, payload); + IIdType subscriptionId = myDaoRegistry.getResourceDao("Subscription").create(subscription, mySrd).getId(); waitForActivatedSubscriptionCount(1); ArrayList> searchUrlList = new 
ArrayList<>(); @@ -213,14 +218,49 @@ public class PartitionedSubscriptionTriggeringR4Test extends BaseSubscriptionsR4 waitForQueueToDrain(); List resourceUpdates = BaseSubscriptionsR4Test.ourObservationProvider.getResourceUpdates(); - if (theIsCrossPartitionEnabled) { - assertEquals(2, resourceUpdates.size()); - assertEquals(Stream.of(observationIdPartitionOne, observationIdPartitionTwo).map(Object::toString).sorted().toList(), - resourceUpdates.stream().map(Resource::getId).sorted().toList()); - } else { - assertEquals(1, resourceUpdates.size()); - assertEquals(observationIdPartitionOne.toString(), resourceUpdates.get(0).getId()); - } + + assertEquals(1, resourceUpdates.size()); + assertEquals(observationIdPartitionOne.toString(), resourceUpdates.get(0).getId()); + + String responseValue = resultParameters.getParameter().get(0).getValue().primitiveValue(); + assertThat(responseValue).contains("Subscription triggering job submitted as JOB ID"); + } + + @Test + public void testManualTriggeredCrossPartitionedSubscriptionDoesMatchOnAllPartitions() throws Exception { + mySubscriptionSettings.setCrossPartitionSubscriptionEnabled(true); + String payload = "application/fhir+json"; + String code = "1000000050"; + String criteria1 = "Observation?code=SNOMED-CT|" + code + "&_format=xml"; + + //Given: We store a resource in partition 2 + myPartitionInterceptor.setRequestPartitionId(REQ_PART_2); + final IFhirResourceDao observation = myDaoRegistry.getResourceDao("Observation"); + IIdType observationIdPartitionTwo = observation.create(createBaseObservation(code, "SNOMED-CT"), mySrd).getId(); + + //Given: We store a similar resource in partition 1 + myPartitionInterceptor.setRequestPartitionId(REQ_PART_1); + IIdType observationIdPartitionOne = observation.create(createBaseObservation(code, "SNOMED-CT"), mySrd).getId(); + + //Given: We create a subscription on the default partition + myPartitionInterceptor.setRequestPartitionId(REQ_PART_DEFAULT); + Subscription subscription =
newSubscription(criteria1, payload); + subscription.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new org.hl7.fhir.r4.model.BooleanType().setValue(true)); + IIdType subscriptionId = myDaoRegistry.getResourceDao("Subscription").create(subscription, mySrd).getId(); + waitForActivatedSubscriptionCount(1); + + ArrayList> searchUrlList = new ArrayList<>(); + searchUrlList.add(new StringDt("Observation?")); + + Parameters resultParameters = (Parameters) mySubscriptionTriggeringSvc.triggerSubscription(null, searchUrlList, subscriptionId, mySrd); + mySubscriptionTriggeringSvc.runDeliveryPass(); + + waitForQueueToDrain(); + List resourceUpdates = BaseSubscriptionsR4Test.ourObservationProvider.getResourceUpdates(); + + assertEquals(2, resourceUpdates.size()); + assertEquals(Stream.of(observationIdPartitionOne, observationIdPartitionTwo).map(Object::toString).sorted().toList(), + resourceUpdates.stream().map(Resource::getId).sorted().toList()); String responseValue = resultParameters.getParameter().get(0).getValue().primitiveValue(); assertThat(responseValue).contains("Subscription triggering job submitted as JOB ID"); @@ -240,17 +280,17 @@ public class PartitionedSubscriptionTriggeringR4Test extends BaseSubscriptionsR4 myPartitionInterceptor.setRequestPartitionId(REQ_PART_1); myDaoRegistry.getResourceDao("Observation").create(createBaseObservation(code, "SNOMED-CT"), mySrd).getId(); - //Given: We create a subscription on Partition 1 + //Given: We create a subscription on default partition Subscription theResource = newSubscription(criteria1, payload); theResource.addExtension(HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION, new BooleanType(Boolean.TRUE)); - myPartitionInterceptor.setRequestPartitionId(RequestPartitionId.defaultPartition()); + myPartitionInterceptor.setRequestPartitionId(REQ_PART_DEFAULT); IIdType subscriptionId = myDaoRegistry.getResourceDao("Subscription").create(theResource, mySrd).getId(); waitForActivatedSubscriptionCount(1); 
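The cross-partition test above asserts that both observations were delivered by comparing the expected and actual ID lists after sorting both sides (the `Stream...sorted().toList()` pattern), so delivery order does not matter. A minimal stdlib-only sketch of that comparison pattern, with hypothetical ID values and no HAPI dependencies:

```java
import java.util.List;

public class SortedIdComparison {
    // Order-insensitive comparison of delivered resource IDs, mirroring the
    // sorted().toList() pattern used in the test's assertEquals call.
    static boolean sameIds(List<String> expected, List<String> actual) {
        return expected.stream().sorted().toList()
                .equals(actual.stream().sorted().toList());
    }

    public static void main(String[] args) {
        // Hypothetical IDs standing in for observationIdPartitionOne/Two.
        List<String> expected = List.of("Observation/2", "Observation/1");
        List<String> delivered = List.of("Observation/1", "Observation/2");
        System.out.println(sameIds(expected, delivered)); // true: order does not matter
    }
}
```

Note that `Stream.toList()` requires Java 16 or later; on older JDKs the equivalent is `collect(Collectors.toList())`.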
ArrayList> searchUrlList = new ArrayList<>(); searchUrlList.add(new StringDt("Observation?")); - myPartitionInterceptor.setRequestPartitionId(RequestPartitionId.defaultPartition()); + myPartitionInterceptor.setRequestPartitionId(REQ_PART_DEFAULT); mySubscriptionTriggeringSvc.triggerSubscription(null, searchUrlList, subscriptionId, mySrd); mySubscriptionTriggeringSvc.runDeliveryPass(); @@ -273,7 +313,7 @@ public class PartitionedSubscriptionTriggeringR4Test extends BaseSubscriptionsR4 // Create the subscription now DaoMethodOutcome subscriptionOutcome = myDaoRegistry.getResourceDao("Subscription").create(newSubscription(criteria1, payload), mySrd); - assertEquals(mySrdInterceptorService.getAllRegisteredInterceptors().size(), 1); + assertThat(mySrdInterceptorService.getAllRegisteredInterceptors()).hasSize(1); Subscription subscription = (Subscription) subscriptionOutcome.getResource(); diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/provider/r4/MultitenantBatchOperationR4Test.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/provider/r4/MultitenantBatchOperationR4Test.java index cdfc4296613..b1301470eab 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/provider/r4/MultitenantBatchOperationR4Test.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/provider/r4/MultitenantBatchOperationR4Test.java @@ -7,8 +7,9 @@ import ca.uhn.fhir.interceptor.api.IPointcut; import ca.uhn.fhir.interceptor.api.Pointcut; import ca.uhn.fhir.interceptor.model.RequestPartitionId; import ca.uhn.fhir.jpa.api.config.JpaStorageSettings; -import ca.uhn.fhir.jpa.delete.job.ReindexTestHelper; import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamToken; +import ca.uhn.fhir.jpa.model.entity.ResourceTable; +import ca.uhn.fhir.jpa.reindex.ReindexTestHelper; import ca.uhn.fhir.rest.api.CacheControlDirective; import ca.uhn.fhir.rest.api.server.RequestDetails; import ca.uhn.fhir.rest.server.provider.ProviderConstants; @@ 
-74,10 +75,10 @@ public class MultitenantBatchOperationR4Test extends BaseMultitenantResourceProv public void testDeleteExpungeOperation() { // Create patients - IIdType idAT = createPatient(withTenant(TENANT_A), withActiveTrue()); - IIdType idAF = createPatient(withTenant(TENANT_A), withActiveFalse()); - IIdType idBT = createPatient(withTenant(TENANT_B), withActiveTrue()); - IIdType idBF = createPatient(withTenant(TENANT_B), withActiveFalse()); + createPatient(withTenant(TENANT_A), withActiveTrue()); + createPatient(withTenant(TENANT_A), withActiveFalse()); + createPatient(withTenant(TENANT_B), withActiveTrue()); + createPatient(withTenant(TENANT_B), withActiveFalse()); // validate setup assertEquals(2, getAllPatientsInTenant(TENANT_A).getTotal()); @@ -103,7 +104,7 @@ public class MultitenantBatchOperationR4Test extends BaseMultitenantResourceProv String jobId = BatchHelperR4.jobIdFromBatch2Parameters(response); myBatch2JobHelper.awaitJobCompletion(jobId); - assertThat(interceptor.requestPartitionIds).hasSize(4); + assertThat(interceptor.requestPartitionIds).hasSize(3); RequestPartitionId partitionId = interceptor.requestPartitionIds.get(0); assertEquals(TENANT_B_ID, partitionId.getFirstPartitionIdOrNull()); assertEquals(TENANT_B, partitionId.getFirstPartitionNameOrNull()); @@ -127,20 +128,20 @@ public class MultitenantBatchOperationR4Test extends BaseMultitenantResourceProv IIdType obsFinalA = doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension()); myTenantClientInterceptor.setTenantId(TENANT_B); - IIdType obsFinalB = doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension()); + doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension()); myTenantClientInterceptor.setTenantId(DEFAULT_PARTITION_NAME); - IIdType obsFinalD = doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension()); + doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension()); 
reindexTestHelper.createAlleleSearchParameter(); // The searchparam value is on the observation, but it hasn't been indexed yet myTenantClientInterceptor.setTenantId(TENANT_A); - assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).hasSize(0); + assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).isEmpty(); myTenantClientInterceptor.setTenantId(TENANT_B); - assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).hasSize(0); + assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).isEmpty(); myTenantClientInterceptor.setTenantId(DEFAULT_PARTITION_NAME); - assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).hasSize(0); + assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).isEmpty(); // setup Parameters input = new Parameters(); @@ -163,13 +164,13 @@ public class MultitenantBatchOperationR4Test extends BaseMultitenantResourceProv // validate - runInTransaction(()->{ + runInTransaction(() -> { long indexedSps = myResourceIndexedSearchParamTokenDao .findAll() .stream() .filter(t->t.getParamName().equals("alleleName")) .count(); - assertEquals(1, indexedSps, ()->"Token indexes:\n * " + myResourceIndexedSearchParamTokenDao.findAll().stream().filter(t->t.getParamName().equals("alleleName")).map(ResourceIndexedSearchParamToken::toString).collect(Collectors.joining("\n * "))); + assertEquals(1, indexedSps, () -> "Token indexes:\n * " + myResourceIndexedSearchParamTokenDao.findAll().stream().filter(t->t.getParamName().equals("alleleName")).map(ResourceIndexedSearchParamToken::toString).collect(Collectors.joining("\n * "))); }); List alleleObservationIds = reindexTestHelper.getAlleleObservationIds(myClient); @@ -178,9 +179,9 @@ public class MultitenantBatchOperationR4Test extends BaseMultitenantResourceProv assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).hasSize(1); assertEquals(obsFinalA.getIdPart(), alleleObservationIds.get(0)); myTenantClientInterceptor.setTenantId(TENANT_B); - 
assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).hasSize(0); + assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).isEmpty(); myTenantClientInterceptor.setTenantId(DEFAULT_PARTITION_NAME); - assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).hasSize(0); + assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).isEmpty(); // Reindex default partition myTenantClientInterceptor.setTenantId(DEFAULT_PARTITION_NAME); @@ -198,13 +199,13 @@ public class MultitenantBatchOperationR4Test extends BaseMultitenantResourceProv ourLog.info("Search params: {}", mySearchParamRegistry.getActiveSearchParams("Observation").getSearchParamNames()); logAllTokenIndexes(); - runInTransaction(()->{ + runInTransaction(() -> { long indexedSps = myResourceIndexedSearchParamTokenDao .findAll() .stream() - .filter(t->t.getParamName().equals("alleleName")) + .filter(t -> t.getParamName().equals("alleleName")) .count(); - assertEquals(3, indexedSps, ()->"Resources:\n * " + myResourceTableDao.findAll().stream().map(t->t.toString()).collect(Collectors.joining("\n * "))); + assertEquals(2, indexedSps, () -> "Resources:\n * " + myResourceTableDao.findAll().stream().map(ResourceTable::toString).collect(Collectors.joining("\n * "))); }); myTenantClientInterceptor.setTenantId(DEFAULT_PARTITION_NAME); @@ -216,20 +217,20 @@ public class MultitenantBatchOperationR4Test extends BaseMultitenantResourceProv ReindexTestHelper reindexTestHelper = new ReindexTestHelper(myFhirContext, myDaoRegistry, mySearchParamRegistry); myTenantClientInterceptor.setTenantId(TENANT_A); IIdType obsFinalA = doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension(Observation.ObservationStatus.FINAL)); - IIdType obsCancelledA = doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension(Observation.ObservationStatus.CANCELLED)); + doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension(Observation.ObservationStatus.CANCELLED)); 
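The reindex assertions above verify how many token indexes were created for the new `alleleName` search parameter by filtering the DAO's rows on the parameter name and counting (and the diff swaps `hasSize(0)` for the more idiomatic `isEmpty()`). A stdlib-only sketch of that filter-and-count check, using a hypothetical list of parameter names in place of `myResourceIndexedSearchParamTokenDao.findAll()`:

```java
import java.util.List;

public class TokenIndexCount {
    public static void main(String[] args) {
        // Hypothetical stand-in for the indexed-token rows: one entry per
        // ResourceIndexedSearchParamToken, reduced to its parameter name.
        List<String> paramNames = List.of("code", "alleleName", "status");

        // Mirrors: findAll().stream().filter(t -> t.getParamName().equals("alleleName")).count()
        long alleleIndexes = paramNames.stream()
                .filter(name -> name.equals("alleleName"))
                .count();

        System.out.println(alleleIndexes); // 1

        // Mirrors the isEmpty() assertion style preferred by the diff.
        List<String> noIds = List.of();
        System.out.println(noIds.isEmpty()); // true
    }
}
```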
myTenantClientInterceptor.setTenantId(TENANT_B); - IIdType obsFinalB = doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension(Observation.ObservationStatus.FINAL)); - IIdType obsCancelledB = doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension(Observation.ObservationStatus.CANCELLED)); + doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension(Observation.ObservationStatus.FINAL)); + doCreateResource(reindexTestHelper.buildObservationWithAlleleExtension(Observation.ObservationStatus.CANCELLED)); reindexTestHelper.createAlleleSearchParameter(); ourLog.info("Search params: {}", mySearchParamRegistry.getActiveSearchParams("Observation").getSearchParamNames()); // The searchparam value is on the observation, but it hasn't been indexed yet myTenantClientInterceptor.setTenantId(TENANT_A); - assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).hasSize(0); + assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).isEmpty(); myTenantClientInterceptor.setTenantId(TENANT_B); - assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).hasSize(0); + assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).isEmpty(); // setup Parameters input = new Parameters(); @@ -259,7 +260,7 @@ public class MultitenantBatchOperationR4Test extends BaseMultitenantResourceProv assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).hasSize(1); assertEquals(obsFinalA.getIdPart(), alleleObservationIds.get(0)); myTenantClientInterceptor.setTenantId(TENANT_B); - assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).hasSize(0); + assertThat(reindexTestHelper.getAlleleObservationIds(myClient)).isEmpty(); } private Bundle getAllPatientsInTenant(String theTenantId) { diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/provider/r4/TerminologyUploaderProviderR4Test.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/provider/r4/TerminologyUploaderProviderR4Test.java index 
3deec41ad94..99180105bde 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/provider/r4/TerminologyUploaderProviderR4Test.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/provider/r4/TerminologyUploaderProviderR4Test.java @@ -347,7 +347,7 @@ public class TerminologyUploaderProviderR4Test extends BaseResourceProviderR4Tes "\"reference\": \"CodeSystem/" ); - assertHierarchyContains( + assertHierarchyContainsExactly( "CHEM seq=0", " HB seq=0", " NEUT seq=1", @@ -387,7 +387,7 @@ public class TerminologyUploaderProviderR4Test extends BaseResourceProviderR4Tes "\"reference\": \"CodeSystem/" ); - assertHierarchyContains( + assertHierarchyContainsExactly( "CHEM seq=0", " HB seq=0", " NEUT seq=1", @@ -457,7 +457,7 @@ public class TerminologyUploaderProviderR4Test extends BaseResourceProviderR4Tes "\"reference\": \"CodeSystem/" ); - assertHierarchyContains( + assertHierarchyContainsExactly( "1111222233 seq=0", " 1111222234 seq=0" ); @@ -527,13 +527,45 @@ public class TerminologyUploaderProviderR4Test extends BaseResourceProviderR4Tes "\"reference\": \"CodeSystem/" ); - assertHierarchyContains( + assertHierarchyContainsExactly( "CHEM seq=0", " HB seq=0", " HBA seq=0" ); } + @Test + public void testApplyDeltaAdd_UsingCodeSystem_NoDisplaySetOnConcepts() throws IOException { + + CodeSystem codeSystem = new CodeSystem(); + codeSystem.setUrl("http://foo/cs"); + // setting codes are enough, no need to call setDisplay etc + codeSystem.addConcept().setCode("Code1"); + codeSystem.addConcept().setCode("Code2"); + + LoggingInterceptor interceptor = new LoggingInterceptor(true); + myClient.registerInterceptor(interceptor); + Parameters outcome = myClient + .operation() + .onType(CodeSystem.class) + .named(JpaConstants.OPERATION_APPLY_CODESYSTEM_DELTA_ADD) + .withParameter(Parameters.class, TerminologyUploaderProvider.PARAM_SYSTEM, new UriType("http://foo/cs")) + .andParameter(TerminologyUploaderProvider.PARAM_CODESYSTEM, codeSystem) + 
.prettyPrint() + .execute(); + myClient.unregisterInterceptor(interceptor); + + String encoded = myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(outcome); + ourLog.info(encoded); + assertThat(encoded).contains("\"valueInteger\": 2"); + + // assert other codes remain, and HB and NEUT is removed + assertHierarchyContainsExactly( + "Code1 seq=0", + "Code2 seq=0" + ); + } + @Test public void testApplyDeltaAdd_MissingSystem() throws IOException { String conceptsCsv = loadResource("/custom_term/concepts.csv"); @@ -582,30 +614,12 @@ public class TerminologyUploaderProviderR4Test extends BaseResourceProviderR4Tes } @Test - public void testApplyDeltaRemove() throws IOException { - String conceptsCsv = loadResource("/custom_term/concepts.csv"); - Attachment conceptsAttachment = new Attachment() - .setData(conceptsCsv.getBytes(Charsets.UTF_8)) - .setContentType("text/csv") - .setUrl("file:/foo/concepts.csv"); - String hierarchyCsv = loadResource("/custom_term/hierarchy.csv"); - Attachment hierarchyAttachment = new Attachment() - .setData(hierarchyCsv.getBytes(Charsets.UTF_8)) - .setContentType("text/csv") - .setUrl("file:/foo/hierarchy.csv"); + public void testApplyDeltaRemove_UsingCsvFiles_RemoveAllCodes() throws IOException { // Add the codes - myClient - .operation() - .onType(CodeSystem.class) - .named(JpaConstants.OPERATION_APPLY_CODESYSTEM_DELTA_ADD) - .withParameter(Parameters.class, TerminologyUploaderProvider.PARAM_SYSTEM, new UriType("http://foo/cs")) - .andParameter(TerminologyUploaderProvider.PARAM_FILE, conceptsAttachment) - .andParameter(TerminologyUploaderProvider.PARAM_FILE, hierarchyAttachment) - .prettyPrint() - .execute(); + applyDeltaAddCustomTermCodes(); - // And remove them + // And remove all of them using the same set of csv files LoggingInterceptor interceptor = new LoggingInterceptor(true); myClient.registerInterceptor(interceptor); Parameters outcome = myClient @@ -613,8 +627,8 @@ public class 
TerminologyUploaderProviderR4Test extends BaseResourceProviderR4Tes .onType(CodeSystem.class) .named(JpaConstants.OPERATION_APPLY_CODESYSTEM_DELTA_REMOVE) .withParameter(Parameters.class, TerminologyUploaderProvider.PARAM_SYSTEM, new UriType("http://foo/cs")) - .andParameter(TerminologyUploaderProvider.PARAM_FILE, conceptsAttachment) - .andParameter(TerminologyUploaderProvider.PARAM_FILE, hierarchyAttachment) + .andParameter(TerminologyUploaderProvider.PARAM_FILE, getCustomTermConceptsAttachment()) + .andParameter(TerminologyUploaderProvider.PARAM_FILE, getCustomTermHierarchyAttachment()) .prettyPrint() .execute(); myClient.unregisterInterceptor(interceptor); @@ -622,8 +636,129 @@ public class TerminologyUploaderProviderR4Test extends BaseResourceProviderR4Tes String encoded = myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(outcome); ourLog.info(encoded); assertThat(encoded).contains("\"valueInteger\": 5"); + + + // providing no arguments, since there should be no code left + assertHierarchyContainsExactly(); } + @Test + public void testApplyDeltaRemove_UsingConceptsCsvFileOnly() throws IOException { + + //add some concepts + applyDeltaAddCustomTermCodes(); + + // And remove 2 of them, providing values for DISPLAY is not necessary + String conceptsToRemoveCsvData = """ + CODE,DISPLAY + HB, + NEUT, + """; + + Attachment conceptsAttachment = createCsvAttachment(conceptsToRemoveCsvData, "file:/concepts.csv"); + + LoggingInterceptor interceptor = new LoggingInterceptor(true); + myClient.registerInterceptor(interceptor); + Parameters outcome = myClient + .operation() + .onType(CodeSystem.class) + .named(JpaConstants.OPERATION_APPLY_CODESYSTEM_DELTA_REMOVE) + .withParameter(Parameters.class, TerminologyUploaderProvider.PARAM_SYSTEM, new UriType("http://foo/cs")) + // submitting concepts is enough (no need to submit hierarchy) + .andParameter(TerminologyUploaderProvider.PARAM_FILE, conceptsAttachment) + .prettyPrint() + .execute(); + 
myClient.unregisterInterceptor(interceptor); + + String encoded = myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(outcome); + ourLog.info(encoded); + assertThat(encoded).contains("\"valueInteger\": 2"); + + // assert other codes remain, and HB and NEUT is removed + assertHierarchyContainsExactly( + "CHEM seq=0", + "MICRO seq=0", + " C&S seq=0" + ); + } + + @Test + public void testApplyDeltaRemove_UsingCodeSystemPayload() throws IOException { + + // add some custom codes + applyDeltaAddCustomTermCodes(); + + + // remove 2 of them using CodeSystemPayload + CodeSystem codeSystem = new CodeSystem(); + codeSystem.setUrl("http://foo/cs"); + // setting codes are enough for remove, no need to call setDisplay etc + codeSystem.addConcept().setCode("HB"); + codeSystem.addConcept().setCode("NEUT"); + + LoggingInterceptor interceptor = new LoggingInterceptor(true); + myClient.registerInterceptor(interceptor); + Parameters outcome = myClient + .operation() + .onType(CodeSystem.class) + .named(JpaConstants.OPERATION_APPLY_CODESYSTEM_DELTA_REMOVE) + .withParameter(Parameters.class, TerminologyUploaderProvider.PARAM_SYSTEM, new UriType("http://foo/cs")) + .andParameter(TerminologyUploaderProvider.PARAM_CODESYSTEM, codeSystem) + .prettyPrint() + .execute(); + myClient.unregisterInterceptor(interceptor); + + String encoded = myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(outcome); + ourLog.info(encoded); + assertThat(encoded).contains("\"valueInteger\": 2"); + + // assert other codes remain, and HB and NEUT is removed + assertHierarchyContainsExactly( + "CHEM seq=0", + "MICRO seq=0", + " C&S seq=0" + ); + } + + + private Attachment createCsvAttachment(String theData, String theUrl) { + return new Attachment() + .setData(theData.getBytes(Charsets.UTF_8)) + .setContentType("text/csv") + .setUrl(theUrl); + } + + private Attachment getCustomTermConceptsAttachment() throws IOException { + String conceptsCsv = 
loadResource("/custom_term/concepts.csv"); + return createCsvAttachment(conceptsCsv, "file:/foo/concepts.csv"); + } + + private Attachment getCustomTermHierarchyAttachment() throws IOException { + String hierarchyCsv = loadResource("/custom_term/hierarchy.csv"); + return createCsvAttachment(hierarchyCsv, "file:/foo/hierarchy.csv"); + } + + private void applyDeltaAddCustomTermCodes() throws IOException { + myClient + .operation() + .onType(CodeSystem.class) + .named(JpaConstants.OPERATION_APPLY_CODESYSTEM_DELTA_ADD) + .withParameter(Parameters.class, TerminologyUploaderProvider.PARAM_SYSTEM, new UriType("http://foo/cs")) + .andParameter(TerminologyUploaderProvider.PARAM_FILE, getCustomTermConceptsAttachment()) + .andParameter(TerminologyUploaderProvider.PARAM_FILE, getCustomTermHierarchyAttachment()) + .prettyPrint() + .execute(); + + assertHierarchyContainsExactly( + "CHEM seq=0", + " HB seq=0", + " NEUT seq=1", + "MICRO seq=0", + " C&S seq=0" + ); + + + } private static void addFile(ZipOutputStream theZos, String theFileName) throws IOException { theZos.putNextEntry(new ZipEntry(theFileName)); diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/Batch2DaoSvcImplTest.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/Batch2DaoSvcImplTest.java deleted file mode 100644 index 6b54a00e5b0..00000000000 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/Batch2DaoSvcImplTest.java +++ /dev/null @@ -1,127 +0,0 @@ -package ca.uhn.fhir.jpa.reindex; - -import static org.junit.jupiter.api.Assertions.assertEquals; -import ca.uhn.fhir.interceptor.model.RequestPartitionId; -import ca.uhn.fhir.jpa.api.pid.IResourcePidStream; -import ca.uhn.fhir.jpa.api.svc.IBatch2DaoSvc; -import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService; -import ca.uhn.fhir.jpa.searchparam.MatchUrlService; -import ca.uhn.fhir.jpa.test.BaseJpaR4Test; -import ca.uhn.fhir.model.primitive.IdDt; -import ca.uhn.fhir.parser.DataFormatException; 
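The deleted `Batch2DaoSvcImplTest` below relies heavily on JUnit's `assertThrows` to capture an expected exception and then assert on its message. A stdlib-only sketch of that capture pattern (the helper name and messages are hypothetical, not HAPI or JUnit API):

```java
public class ExpectException {
    // Minimal stand-in for JUnit's assertThrows: run the action, return the
    // exception if it is of the expected type, otherwise fail.
    static <T extends Throwable> T expectThrows(Class<T> type, Runnable action) {
        try {
            action.run();
        } catch (Throwable t) {
            if (type.isInstance(t)) {
                return type.cast(t);
            }
            throw new AssertionError("Unexpected exception type: " + t, t);
        }
        throw new AssertionError("Expected " + type.getSimpleName() + " but nothing was thrown");
    }

    public static void main(String[] args) {
        // Mirrors the deleted test's pattern of asserting on the message text.
        IllegalArgumentException ex = expectThrows(IllegalArgumentException.class,
                () -> { throw new IllegalArgumentException("theResourceName must not be blank"); });
        System.out.println(ex.getMessage());
    }
}
```

Returning the caught exception (rather than just swallowing it) is what lets the caller make a follow-up `assertEquals` on the message, as the deleted tests did.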
-import ca.uhn.fhir.rest.server.exceptions.InternalErrorException; -import org.hl7.fhir.instance.model.api.IIdType; -import org.junit.jupiter.api.BeforeEach; -import org.junit.jupiter.api.Test; -import org.junit.jupiter.params.ParameterizedTest; -import org.junit.jupiter.params.provider.ValueSource; -import org.springframework.beans.factory.annotation.Autowired; - -import jakarta.annotation.Nonnull; -import java.time.LocalDate; -import java.time.Month; -import java.time.ZoneId; -import java.util.Date; -import java.util.List; -import java.util.stream.IntStream; -import java.util.stream.Stream; - -import static org.assertj.core.api.Assertions.assertThat; -import static org.junit.jupiter.api.Assertions.assertThrows; - -class Batch2DaoSvcImplTest extends BaseJpaR4Test { - - private static final Date PREVIOUS_MILLENNIUM = toDate(LocalDate.of(1999, Month.DECEMBER, 31)); - private static final Date TOMORROW = toDate(LocalDate.now().plusDays(1)); - private static final String URL_PATIENT_EXPUNGE_TRUE = "Patient?_expunge=true"; - private static final String PATIENT = "Patient"; - - @Autowired - private MatchUrlService myMatchUrlService; - @Autowired - private IHapiTransactionService myIHapiTransactionService ; - - private IBatch2DaoSvc mySubject; - - - @BeforeEach - void beforeEach() { - - mySubject = new Batch2DaoSvcImpl(myResourceTableDao, myMatchUrlService, myDaoRegistry, myFhirContext, myIHapiTransactionService); - } - - // TODO: LD this test won't work with the nonUrl variant yet: error: No existing transaction found for transaction marked with propagation 'mandatory' - - @Test - void fetchResourcesByUrlEmptyUrl() { - final InternalErrorException exception = - assertThrows( - InternalErrorException.class, - () -> mySubject.fetchResourceIdStream(PREVIOUS_MILLENNIUM, TOMORROW, RequestPartitionId.defaultPartition(), "") - .visitStream(Stream::toList)); - - assertEquals("HAPI-2422: this should never happen: URL is missing a '?'", exception.getMessage()); - } - - @Test - 
void fetchResourcesByUrlSingleQuestionMark() { - final IllegalArgumentException exception = assertThrows(IllegalArgumentException.class, () -> mySubject.fetchResourceIdStream(PREVIOUS_MILLENNIUM, TOMORROW, RequestPartitionId.defaultPartition(), "?").visitStream(Stream::toList)); - - assertEquals("theResourceName must not be blank", exception.getMessage()); - } - - @Test - void fetchResourcesByUrlNonsensicalResource() { - final DataFormatException exception = assertThrows(DataFormatException.class, () -> mySubject.fetchResourceIdStream(PREVIOUS_MILLENNIUM, TOMORROW, RequestPartitionId.defaultPartition(), "Banana?_expunge=true").visitStream(Stream::toList)); - - assertEquals("HAPI-1684: Unknown resource name \"Banana\" (this name is not known in FHIR version \"R4\")", exception.getMessage()); - } - - @ParameterizedTest - @ValueSource(ints = {0, 9, 10, 11, 21, 22, 23, 45}) - void fetchResourcesByUrl(int expectedNumResults) { - final List patientIds = IntStream.range(0, expectedNumResults) - .mapToObj(num -> createPatient()) - .toList(); - - final IResourcePidStream resourcePidList = mySubject.fetchResourceIdStream(PREVIOUS_MILLENNIUM, TOMORROW, RequestPartitionId.defaultPartition(), URL_PATIENT_EXPUNGE_TRUE); - - final List actualPatientIds = - resourcePidList.visitStream(s-> s.map(typePid -> new IdDt(typePid.resourceType, (Long) typePid.id.getId())) - .toList()); - assertIdsEqual(patientIds, actualPatientIds); - } - - @ParameterizedTest - @ValueSource(ints = {0, 9, 10, 11, 21, 22, 23, 45}) - void fetchResourcesNoUrl(int expectedNumResults) { - final int pageSizeWellBelowThreshold = 2; - final List patientIds = IntStream.range(0, expectedNumResults) - .mapToObj(num -> createPatient()) - .toList(); - - final IResourcePidStream resourcePidList = mySubject.fetchResourceIdStream(PREVIOUS_MILLENNIUM, TOMORROW, RequestPartitionId.defaultPartition(), null); - - final List actualPatientIds = - resourcePidList.visitStream(s-> s.map(typePid -> new IdDt(typePid.resourceType, 
(Long) typePid.id.getId())) - .toList()); - assertIdsEqual(patientIds, actualPatientIds); - } - - private static void assertIdsEqual(List expectedResourceIds, List actualResourceIds) { - assertThat(actualResourceIds).hasSize(expectedResourceIds.size()); - - for (int index = 0; index < expectedResourceIds.size(); index++) { - final IIdType expectedIdType = expectedResourceIds.get(index); - final IIdType actualIdType = actualResourceIds.get(index); - - assertEquals(expectedIdType.getResourceType(), actualIdType.getResourceType()); - assertEquals(expectedIdType.getIdPartAsLong(), actualIdType.getIdPartAsLong()); - } - } - - @Nonnull - private static Date toDate(LocalDate theLocalDate) { - return Date.from(theLocalDate.atStartOfDay(ZoneId.systemDefault()).toInstant()); - } -} diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/delete/job/ReindexJobTest.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/ReindexJobTest.java similarity index 99% rename from hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/delete/job/ReindexJobTest.java rename to hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/ReindexJobTest.java index 1936483e496..fa9d93fb4ab 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/delete/job/ReindexJobTest.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/ReindexJobTest.java @@ -1,6 +1,5 @@ -package ca.uhn.fhir.jpa.delete.job; +package ca.uhn.fhir.jpa.reindex; -import static org.junit.jupiter.api.Assertions.assertTrue; import ca.uhn.fhir.batch2.api.IJobCoordinator; import ca.uhn.fhir.batch2.api.IJobPersistence; import ca.uhn.fhir.batch2.jobs.reindex.ReindexAppCtx; @@ -41,17 +40,17 @@ import java.util.List; import java.util.stream.Stream; import static org.assertj.core.api.Assertions.assertThat; -import static org.junit.jupiter.api.Assertions.fail; import static org.junit.jupiter.api.Assertions.assertEquals; import static 
org.junit.jupiter.api.Assertions.assertNotNull; import static org.junit.jupiter.api.Assertions.assertNull; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.junit.jupiter.api.Assertions.fail; @SuppressWarnings("SqlDialectInspection") public class ReindexJobTest extends BaseJpaR4Test { @Autowired private IJobCoordinator myJobCoordinator; - @Autowired private IJobPersistence myJobPersistence; diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/delete/job/ReindexJobWithPartitioningTest.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/ReindexJobWithPartitioningTest.java similarity index 52% rename from hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/delete/job/ReindexJobWithPartitioningTest.java rename to hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/ReindexJobWithPartitioningTest.java index d2641ba753b..8a643006439 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/delete/job/ReindexJobWithPartitioningTest.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/ReindexJobWithPartitioningTest.java @@ -1,7 +1,8 @@ -package ca.uhn.fhir.jpa.delete.job; +package ca.uhn.fhir.jpa.reindex; import ca.uhn.fhir.batch2.api.IJobCoordinator; -import ca.uhn.fhir.batch2.jobs.parameters.JobParameters; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlJobParameters; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; import ca.uhn.fhir.batch2.jobs.reindex.ReindexAppCtx; import ca.uhn.fhir.batch2.model.JobInstance; import ca.uhn.fhir.batch2.model.JobInstanceStartRequest; @@ -11,7 +12,6 @@ import ca.uhn.fhir.jpa.entity.PartitionEntity; import ca.uhn.fhir.jpa.model.config.PartitionSettings; import ca.uhn.fhir.jpa.test.BaseJpaR4Test; import ca.uhn.fhir.rest.api.server.SystemRequestDetails; -import ca.uhn.fhir.rest.server.interceptor.partition.RequestTenantPartitionInterceptor; import org.hl7.fhir.r4.model.Observation; import 
org.hl7.fhir.r4.model.Patient; import org.junit.jupiter.api.AfterEach; @@ -28,15 +28,13 @@ import java.util.stream.Stream; import static org.assertj.core.api.Assertions.assertThat; @TestInstance(TestInstance.Lifecycle.PER_CLASS) public class ReindexJobWithPartitioningTest extends BaseJpaR4Test { + @Autowired private IJobCoordinator myJobCoordinator; - private final RequestTenantPartitionInterceptor myPartitionInterceptor = new RequestTenantPartitionInterceptor(); @BeforeEach public void before() { - myInterceptorRegistry.registerInterceptor(myPartitionInterceptor); myPartitionSettings.setPartitioningEnabled(true); - myPartitionConfigSvc.createPartition(new PartitionEntity().setId(1).setName("TestPartition1"), null); myPartitionConfigSvc.createPartition(new PartitionEntity().setId(2).setName("TestPartition2"), null); @@ -61,47 +59,77 @@ public class ReindexJobWithPartitioningTest extends BaseJpaR4Test { @AfterEach public void after() { - myInterceptorRegistry.unregisterInterceptor(myPartitionInterceptor); myPartitionSettings.setPartitioningEnabled(new PartitionSettings().isPartitioningEnabled()); } public static Stream getReindexParameters() { - List twoPartitions = List.of(RequestPartitionId.fromPartitionId(1), RequestPartitionId.fromPartitionId(2)); - List partition1 = List.of(RequestPartitionId.fromPartitionId(1)); - List allPartitions = List.of(RequestPartitionId.allPartitions()); + RequestPartitionId partition1 = RequestPartitionId.fromPartitionId(1); + RequestPartitionId partition2 = RequestPartitionId.fromPartitionId(2); + RequestPartitionId allPartitions = RequestPartitionId.allPartitions(); return Stream.of( - // includes all resources from all partitions - partition 1, partition 2 and default partition - Arguments.of(List.of(), List.of(), false, 6), - // includes all Observations - Arguments.of(List.of("Observation?"), twoPartitions, false, 3), - // includes all Observations - Arguments.of(List.of("Observation?"), allPartitions, false, 3), - 
Arguments.of(List.of("Observation?"), List.of(), false, 0), - // includes Observations in partition 1 - Arguments.of(List.of("Observation?"), partition1, true, 2), - // includes all Patients from all partitions - partition 1, partition 2 and default partition - Arguments.of(List.of("Patient?"), allPartitions, false, 3), - // includes Patients and Observations in partitions 1 and 2 - Arguments.of(List.of("Observation?", "Patient?"), twoPartitions, false, 5), - // includes Observations from partition 1 and Patients from partition 2 - Arguments.of(List.of("Observation?", "Patient?"), twoPartitions, true, 3), - // includes final Observations and Patients from partitions 1 and 2 - Arguments.of(List.of("Observation?status=final", "Patient?"), twoPartitions, false, 4), - // includes final Observations from partition 1 and Patients from partition 2 - Arguments.of(List.of("Observation?status=final", "Patient?"), twoPartitions, true, 2), - // includes final Observations and Patients from partitions 1 - Arguments.of(List.of("Observation?status=final", "Patient?"), partition1, false, 2) + // 1. includes all resources + Arguments.of(List.of(), 6), + // 2. includes all resources from partition 1 + Arguments.of(List.of(new PartitionedUrl().setRequestPartitionId(partition1)), 3), + // 3. includes all resources in all partitions + Arguments.of(List.of(new PartitionedUrl().setUrl("").setRequestPartitionId(allPartitions)), 6), + // 4. includes all Observations in partition 1 and partition 2 + Arguments.of( + List.of( + new PartitionedUrl().setUrl("Observation?").setRequestPartitionId(partition1), + new PartitionedUrl().setUrl("Observation?").setRequestPartitionId(partition2) + ), 3), + // 5. includes all Observations in all partitions (partition 1, partition 2 and default partition) + Arguments.of( + List.of( + new PartitionedUrl().setUrl("Observation?").setRequestPartitionId(allPartitions)), + 3), + // 6. 
includes all Observations in partition 1 + Arguments.of( + List.of( + new PartitionedUrl().setUrl("Observation?").setRequestPartitionId(partition1)), + 2), + // 7. includes all Patients from all partitions + Arguments.of( + List.of( + new PartitionedUrl().setUrl("Patient?").setRequestPartitionId(allPartitions) + ), 3), + // 8. includes Observations in partition 1 and Patients in partition 2 + Arguments.of( + List.of( + new PartitionedUrl().setUrl("Observation?").setRequestPartitionId(partition1), + new PartitionedUrl().setUrl("Patient?").setRequestPartitionId(partition2) + ), 3), + // 9. includes Observations and Patients from partitions 1 and 2 + Arguments.of( + List.of( + new PartitionedUrl().setUrl("Observation?").setRequestPartitionId(partition1), + new PartitionedUrl().setUrl("Observation?").setRequestPartitionId(partition2), + new PartitionedUrl().setUrl("Patient?").setRequestPartitionId(partition1), + new PartitionedUrl().setUrl("Patient?").setRequestPartitionId(partition2) + ), 5), + // 10. includes final Observations from partitions 1 and 2 and Patients from partition 2 + Arguments.of( + List.of( + new PartitionedUrl().setUrl("Observation?status=final").setRequestPartitionId(partition1), + new PartitionedUrl().setUrl("Observation?status=final").setRequestPartitionId(partition2), + new PartitionedUrl().setUrl("Patient?").setRequestPartitionId(partition2) + ), 3), + // 11. 
includes final Observations and Patients from partitions 1 + Arguments.of( + List.of( + new PartitionedUrl().setUrl("Observation?status=final").setRequestPartitionId(partition1), + new PartitionedUrl().setUrl("Patient?").setRequestPartitionId(partition1) + ), 2) ); } @ParameterizedTest @MethodSource(value = "getReindexParameters") - public void testReindex_byMultipleUrlsAndPartitions_indexesMatchingResources(List theUrls, - List thePartitions, - boolean theShouldAssignPartitionToUrl, - int theExpectedIndexedResourceCount) { - - JobParameters parameters = JobParameters.from(theUrls, thePartitions, theShouldAssignPartitionToUrl); + public void testReindex_withPartitionedUrls_indexesMatchingResources(List thePartitionedUrls, + int theExpectedIndexedResourceCount) { + PartitionedUrlJobParameters parameters = new PartitionedUrlJobParameters(); + thePartitionedUrls.forEach(parameters::addPartitionedUrl); // execute JobInstanceStartRequest startRequest = new JobInstanceStartRequest(); diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/delete/job/ReindexTestHelper.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/ReindexTestHelper.java similarity index 99% rename from hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/delete/job/ReindexTestHelper.java rename to hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/ReindexTestHelper.java index f5597a0142d..29a6358115e 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/delete/job/ReindexTestHelper.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/reindex/ReindexTestHelper.java @@ -1,4 +1,4 @@ -package ca.uhn.fhir.jpa.delete.job; +package ca.uhn.fhir.jpa.reindex; import ca.uhn.fhir.context.FhirContext; import ca.uhn.fhir.jpa.api.dao.DaoRegistry; diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/stresstest/GiantTransactionPerfTest.java 
b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/stresstest/GiantTransactionPerfTest.java index f5cc4817110..ceadc2c1709 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/stresstest/GiantTransactionPerfTest.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/stresstest/GiantTransactionPerfTest.java @@ -61,6 +61,7 @@ import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Answers; import org.mockito.Mock; +import org.mockito.Spy; import org.mockito.junit.jupiter.MockitoExtension; import org.slf4j.Logger; import org.slf4j.LoggerFactory; diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/term/TerminologySvcDeltaR4Test.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/term/TerminologySvcDeltaR4Test.java index 09eeadb6796..ccb9ce4b962 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/term/TerminologySvcDeltaR4Test.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/term/TerminologySvcDeltaR4Test.java @@ -61,7 +61,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { delta.addRootConcept("RootA", "Root A"); delta.addRootConcept("RootB", "Root B"); myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo/cs", delta); - assertHierarchyContains( + assertHierarchyContainsExactly( "RootA seq=0", "RootB seq=0" ); @@ -70,7 +70,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { delta.addRootConcept("RootC", "Root C"); delta.addRootConcept("RootD", "Root D"); myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo/cs", delta); - assertHierarchyContains( + assertHierarchyContainsExactly( "RootA seq=0", "RootB seq=0", "RootC seq=0", @@ -104,7 +104,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { ourLog.info("Starting testAddHierarchyConcepts"); createNotPresentCodeSystem(); - assertHierarchyContains(); + assertHierarchyContainsExactly(); 
ourLog.info("Have created code system"); runInTransaction(() -> { @@ -117,7 +117,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { delta.addRootConcept("RootA", "Root A"); delta.addRootConcept("RootB", "Root B"); myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo/cs", delta); - assertHierarchyContains( + assertHierarchyContainsExactly( "RootA seq=0", "RootB seq=0" ); @@ -139,7 +139,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { myCaptureQueriesListener.logAllQueriesForCurrentThread(); - assertHierarchyContains( + assertHierarchyContainsExactly( "RootA seq=0", " ChildAA seq=0", " ChildAB seq=1", @@ -151,7 +151,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { @Test public void testAddMoveConceptFromOneParentToAnother() { createNotPresentCodeSystem(); - assertHierarchyContains(); + assertHierarchyContainsExactly(); UploadStatistics outcome; CustomTerminologySet delta; @@ -162,7 +162,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { .addChild(TermConceptParentChildLink.RelationshipTypeEnum.ISA).setCode("ChildAAA").setDisplay("Child AAA"); delta.addRootConcept("RootB", "Root B"); outcome = myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo/cs", delta); - assertHierarchyContains( + assertHierarchyContainsExactly( "RootA seq=0", " ChildAA seq=0", " ChildAAA seq=0", @@ -174,7 +174,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { delta.addRootConcept("RootB", "Root B") .addChild(TermConceptParentChildLink.RelationshipTypeEnum.ISA).setCode("ChildAA").setDisplay("Child AA"); outcome = myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo/cs", delta); - assertHierarchyContains( + assertHierarchyContainsExactly( "RootA seq=0", " ChildAA seq=0", " ChildAAA seq=0", @@ -195,7 +195,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { @Test public void testReAddingConceptsDoesntRecreateExistingLinks() { createNotPresentCodeSystem(); - 
assertHierarchyContains(); + assertHierarchyContainsExactly(); UploadStatistics outcome; CustomTerminologySet delta; @@ -206,7 +206,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { delta.addRootConcept("RootA", "Root A") .addChild(TermConceptParentChildLink.RelationshipTypeEnum.ISA).setCode("ChildAA").setDisplay("Child AA"); myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo/cs", delta); - assertHierarchyContains( + assertHierarchyContainsExactly( "RootA seq=0", " ChildAA seq=0" ); @@ -223,7 +223,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { .addChild(TermConceptParentChildLink.RelationshipTypeEnum.ISA).setCode("ChildAA").setDisplay("Child AA") .addChild(TermConceptParentChildLink.RelationshipTypeEnum.ISA).setCode("ChildAAA").setDisplay("Child AAA"); myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo/cs", delta); - assertHierarchyContains( + assertHierarchyContainsExactly( "RootA seq=0", " ChildAA seq=0", " ChildAAA seq=0" @@ -242,7 +242,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { .addChild(TermConceptParentChildLink.RelationshipTypeEnum.ISA).setCode("ChildAAA").setDisplay("Child AAA") .addChild(TermConceptParentChildLink.RelationshipTypeEnum.ISA).setCode("ChildAAAA").setDisplay("Child AAAA"); myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo/cs", delta); - assertHierarchyContains( + assertHierarchyContainsExactly( "RootA seq=0", " ChildAA seq=0", " ChildAAA seq=0", @@ -293,7 +293,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo", set); // Check so far - assertHierarchyContains( + assertHierarchyContainsExactly( "ParentA seq=0", " ChildA seq=0" ); @@ -306,7 +306,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo", set); // Check so far - assertHierarchyContains( + assertHierarchyContainsExactly( 
"ParentA seq=0", " ChildA seq=0", " ChildAA seq=0" @@ -331,7 +331,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo", set); // Check so far - assertHierarchyContains( + assertHierarchyContainsExactly( "ParentA seq=0", " ChildA seq=0" ); @@ -344,7 +344,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { myTermCodeSystemStorageSvc.applyDeltaCodeSystemsAdd("http://foo", set); // Check so far - assertHierarchyContains( + assertHierarchyContainsExactly( "ParentA seq=0", " ChildA seq=0", " ChildAA seq=0" @@ -416,7 +416,7 @@ public class TerminologySvcDeltaR4Test extends BaseJpaR4Test { expectedHierarchy.add(expected); } - assertHierarchyContains(expectedHierarchy.toArray(new String[0])); + assertHierarchyContainsExactly(expectedHierarchy.toArray(new String[0])); } diff --git a/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/dao/r5/FhirSystemDaoTransactionR5Test.java b/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/dao/r5/FhirSystemDaoTransactionR5Test.java index c68bdf16780..4cf1ea982c5 100644 --- a/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/dao/r5/FhirSystemDaoTransactionR5Test.java +++ b/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/dao/r5/FhirSystemDaoTransactionR5Test.java @@ -149,8 +149,8 @@ public class FhirSystemDaoTransactionR5Test extends BaseJpaR5Test { assertEquals(theMatchUrlCacheEnabled ? 
3 : 4, myCaptureQueriesListener.countSelectQueriesForCurrentThread()); assertEquals(0, myCaptureQueriesListener.countInsertQueriesForCurrentThread()); - assertEquals(0, myCaptureQueriesListener.countUpdateQueriesForCurrentThread()); - assertEquals(0, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); + assertEquals(4, myCaptureQueriesListener.countUpdateQueriesForCurrentThread()); + assertEquals(4, myCaptureQueriesListener.countDeleteQueriesForCurrentThread()); assertEquals(1, myCaptureQueriesListener.countCommits()); assertEquals(0, myCaptureQueriesListener.countRollbacks()); diff --git a/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/provider/r5/ResourceProviderR5Test.java b/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/provider/r5/ResourceProviderR5Test.java index 3870487c81c..f6f26c10fc9 100644 --- a/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/provider/r5/ResourceProviderR5Test.java +++ b/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/provider/r5/ResourceProviderR5Test.java @@ -1,5 +1,6 @@ package ca.uhn.fhir.jpa.provider.r5; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; import ca.uhn.fhir.batch2.jobs.reindex.ReindexAppCtx; import ca.uhn.fhir.batch2.jobs.reindex.ReindexJobParameters; import ca.uhn.fhir.batch2.model.JobInstanceStartRequest; @@ -371,7 +372,7 @@ public class ResourceProviderR5Test extends BaseResourceProviderR5Test { // do a reindex ReindexJobParameters jobParameters = new ReindexJobParameters(); - jobParameters.setRequestPartitionId(RequestPartitionId.allPartitions()); + jobParameters.addPartitionedUrl(new PartitionedUrl().setRequestPartitionId(RequestPartitionId.allPartitions())); JobInstanceStartRequest request = new JobInstanceStartRequest(); request.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX); request.setParameters(jobParameters); diff --git a/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/search/reindex/InstanceReindexServiceImplNarrativeR5Test.java 
b/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/search/reindex/InstanceReindexServiceImplNarrativeR5Test.java index 68fb51a390e..7ca089ac3a0 100644 --- a/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/search/reindex/InstanceReindexServiceImplNarrativeR5Test.java +++ b/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/search/reindex/InstanceReindexServiceImplNarrativeR5Test.java @@ -1,6 +1,5 @@ package ca.uhn.fhir.jpa.search.reindex; -import static org.junit.jupiter.api.Assertions.assertEquals; import ca.uhn.fhir.context.FhirContext; import ca.uhn.fhir.jpa.api.config.JpaStorageSettings; import ca.uhn.fhir.jpa.model.config.PartitionSettings; @@ -17,22 +16,22 @@ import ca.uhn.fhir.jpa.model.entity.ResourceTable; import ca.uhn.fhir.jpa.model.entity.SearchParamPresentEntity; import ca.uhn.fhir.jpa.searchparam.extractor.ResourceIndexedSearchParams; import ca.uhn.fhir.test.utilities.HtmlUtil; -import org.htmlunit.html.HtmlPage; -import org.htmlunit.html.HtmlTable; +import jakarta.annotation.Nonnull; import org.hl7.fhir.r4.model.IdType; import org.hl7.fhir.r4.model.Parameters; import org.hl7.fhir.r4.model.StringType; +import org.htmlunit.html.HtmlPage; +import org.htmlunit.html.HtmlTable; import org.junit.jupiter.api.Test; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import jakarta.annotation.Nonnull; import java.io.IOException; import java.math.BigDecimal; import java.util.Collections; import java.util.Date; -import static org.assertj.core.api.Assertions.assertThat; +import static org.junit.jupiter.api.Assertions.assertEquals; /** * Tests the narrative generation in {@link InstanceReindexServiceImpl}. 
This is a separate test @@ -135,7 +134,7 @@ public class InstanceReindexServiceImplNarrativeR5Test { public void testIndexResourceLink() throws IOException { // Setup ResourceIndexedSearchParams newParams = newParams(); - newParams.myLinks.add(ResourceLink.forLocalReference("Observation.subject", myEntity, "Patient", 123L, "123", new Date(), 555L)); + newParams.myLinks.add(getResourceLinkForLocalReference()); // Test Parameters outcome = mySvc.buildIndexResponse(newParams(), newParams, true, Collections.emptyList()); @@ -311,4 +310,19 @@ public class InstanceReindexServiceImplNarrativeR5Test { return ResourceIndexedSearchParams.withSets(); } + private ResourceLink getResourceLinkForLocalReference(){ + + ResourceLink.ResourceLinkForLocalReferenceParams params = ResourceLink.ResourceLinkForLocalReferenceParams + .instance() + .setSourcePath("Observation.subject") + .setSourceResource(myEntity) + .setTargetResourceType("Patient") + .setTargetResourcePid(123L) + .setTargetResourceId("123") + .setUpdated(new Date()) + .setTargetResourceVersion(555L); + + return ResourceLink.forLocalReference(params); + } + } diff --git a/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/search/reindex/InstanceReindexServiceImplR5Test.java b/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/search/reindex/InstanceReindexServiceImplR5Test.java index 40e68b557ca..de2be916e6a 100644 --- a/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/search/reindex/InstanceReindexServiceImplR5Test.java +++ b/hapi-fhir-jpaserver-test-r5/src/test/java/ca/uhn/fhir/jpa/search/reindex/InstanceReindexServiceImplR5Test.java @@ -71,7 +71,7 @@ public class InstanceReindexServiceImplR5Test extends BaseJpaR5Test { .map(t -> t.getName() + " " + getPartValue("Action", t) + " " + getPartValue("Type", t) + " " + getPartValue("Missing", t)) .sorted() .toList(); - assertThat(indexInstances).as(indexInstances.toString()).containsExactly("_id NO_CHANGE Token true", "active NO_CHANGE Token true", 
"address NO_CHANGE String true", "address-city NO_CHANGE String true", "address-country NO_CHANGE String true", "address-postalcode NO_CHANGE String true", "address-state NO_CHANGE String true", "address-use NO_CHANGE Token true", "birthdate NO_CHANGE Date true", "death-date NO_CHANGE Date true", "email NO_CHANGE Token true", "gender NO_CHANGE Token true", "general-practitioner NO_CHANGE Reference true", "identifier NO_CHANGE Token true", "language NO_CHANGE Token true", "link NO_CHANGE Reference true", "organization NO_CHANGE Reference true", "part-agree NO_CHANGE Reference true", "phone NO_CHANGE Token true", "telecom NO_CHANGE Token true"); + assertThat(indexInstances).as(indexInstances.toString()).containsExactly("active NO_CHANGE Token true", "address NO_CHANGE String true", "address-city NO_CHANGE String true", "address-country NO_CHANGE String true", "address-postalcode NO_CHANGE String true", "address-state NO_CHANGE String true", "address-use NO_CHANGE Token true", "birthdate NO_CHANGE Date true", "death-date NO_CHANGE Date true", "email NO_CHANGE Token true", "gender NO_CHANGE Token true", "general-practitioner NO_CHANGE Reference true", "identifier NO_CHANGE Token true", "language NO_CHANGE Token true", "link NO_CHANGE Reference true", "organization NO_CHANGE Reference true", "part-agree NO_CHANGE Reference true", "phone NO_CHANGE Token true", "telecom NO_CHANGE Token true"); } diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/HapiEmbeddedDatabasesExtension.java b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/HapiEmbeddedDatabasesExtension.java index e3ccbf1d79c..c2ca070788a 100644 --- a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/HapiEmbeddedDatabasesExtension.java +++ b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/HapiEmbeddedDatabasesExtension.java @@ -106,10 +106,15 @@ public class HapiEmbeddedDatabasesExtension implements 
AfterAllCallback { try { myDatabaseInitializerHelper.insertPersistenceTestData(getEmbeddedDatabase(theDriverType), theVersionEnum); } catch (Exception theE) { - ourLog.info( - "Could not insert persistence test data most likely because we don't have any for version {} and driver {}", - theVersionEnum, - theDriverType); + if (theE.getMessage().contains("Error loading file: migration/releases/")) { + ourLog.info( + "Could not insert persistence test data most likely because we don't have any for version {} and driver {}", + theVersionEnum, + theDriverType); + } else { + // rethrow SQL execution exceptions + throw theE; + } } } diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/search/QuantitySearchParameterTestCases.java b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/search/QuantitySearchParameterTestCases.java index 06bf44d0b7c..d4b2a506ac8 100644 --- a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/search/QuantitySearchParameterTestCases.java +++ b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/search/QuantitySearchParameterTestCases.java @@ -656,7 +656,11 @@ public abstract class QuantitySearchParameterTestCases implements ITestDataBuild .getIdPart(); // 70_000 // this search is not freetext because there is no freetext-known parameter name - List allIds = myTestDaoSearch.searchForIds("/Observation?_sort=value-quantity"); + // a value-quantity filter was added here because empty search params would cause the search to go + // through JPA search, which does not + // support normalized quantity sorting. 
+ List allIds = + myTestDaoSearch.searchForIds("/Observation?value-quantity=ge0&_sort=value-quantity"); assertThat(allIds).containsExactly(idAlpha2, idAlpha1, idAlpha3); } } diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/BaseJpaR4Test.java b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/BaseJpaR4Test.java index 9d535aa67c5..2fb5a9b87f6 100644 --- a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/BaseJpaR4Test.java +++ b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/BaseJpaR4Test.java @@ -740,7 +740,7 @@ public abstract class BaseJpaR4Test extends BaseJpaTest implements ITestDataBuil dao.update(resourceParsed); } - protected void assertHierarchyContains(String... theStrings) { + protected void assertHierarchyContainsExactly(String... theStrings) { List hierarchy = runInTransaction(() -> { List hierarchyHolder = new ArrayList<>(); TermCodeSystem codeSystem = myTermCodeSystemDao.findAll().iterator().next(); diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/BaseJpaTest.java b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/BaseJpaTest.java index 2ff5ddb261a..de0e82e76ae 100644 --- a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/BaseJpaTest.java +++ b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/BaseJpaTest.java @@ -150,6 +150,7 @@ import java.util.concurrent.atomic.AtomicBoolean; import java.util.stream.Collectors; import java.util.stream.Stream; +import static ca.uhn.fhir.rest.api.Constants.HEADER_CACHE_CONTROL; import static ca.uhn.fhir.util.TestUtil.doRandomizeLocaleAndTimezone; import static java.util.stream.Collectors.joining; import static org.awaitility.Awaitility.await; @@ -428,6 +429,7 @@ public abstract class BaseJpaTest extends BaseTest { when(mySrd.getInterceptorBroadcaster()).thenReturn(mySrdInterceptorService); 
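The comment added to QuantitySearchParameterTestCases above captures a routing subtlety: a search URL carrying no filter values is not eligible for the fulltext path, so it falls through to plain JPA search, which cannot sort on normalized quantities. Adding the no-op filter `value-quantity=ge0` keeps the query on the indexed path without changing the matched set in this test's data. A toy sketch of that routing rule follows; the class and dispatch logic are hypothetical illustrations, not HAPI's actual query planner:

```java
import java.util.Map;

public class SearchRouteDemo {

	enum Route { FULLTEXT_INDEX, JPA }

	// Hypothetical routing rule: with no filter parameters there is nothing for
	// the fulltext index to match on, so the query drops to plain JPA search.
	static Route chooseRoute(Map<String, String> theFilterParams) {
		return theFilterParams.isEmpty() ? Route.JPA : Route.FULLTEXT_INDEX;
	}

	public static void main(String[] args) {
		// "/Observation?_sort=value-quantity" carries no filter values, so it
		// routes to JPA, which does not support normalized quantity sorting.
		System.out.println(chooseRoute(Map.of()));

		// The no-op filter "value-quantity=ge0" keeps the search on the indexed
		// path while still matching every non-negative quantity.
		System.out.println(chooseRoute(Map.of("value-quantity", "ge0")));
	}
}
```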
when(mySrd.getUserData()).thenReturn(new HashMap<>()); when(mySrd.getHeaders(eq(JpaConstants.HEADER_META_SNAPSHOT_MODE))).thenReturn(new ArrayList<>()); + when(mySrd.getHeaders(eq(HEADER_CACHE_CONTROL))).thenReturn(new ArrayList<>()); // TODO enforce strict mocking everywhere lenient().when(mySrd.getServer().getDefaultPageSize()).thenReturn(null); lenient().when(mySrd.getServer().getMaximumPageSize()).thenReturn(null); diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/Batch2JobHelper.java b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/Batch2JobHelper.java index 6a6f933de24..429e3ee5e39 100644 --- a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/Batch2JobHelper.java +++ b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/Batch2JobHelper.java @@ -82,7 +82,7 @@ public class Batch2JobHelper { } public JobInstance awaitJobHasStatus(String theInstanceId, StatusEnum... theExpectedStatus) { - return awaitJobHasStatus(theInstanceId, 10, theExpectedStatus); + return awaitJobHasStatus(theInstanceId, 30, theExpectedStatus); } public JobInstance awaitJobHasStatusWithoutMaintenancePass(String theInstanceId, StatusEnum... 
theExpectedStatus) { diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/H2_EMBEDDED.sql b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/H2_EMBEDDED.sql index e84689abc35..99689b7ae08 100644 --- a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/H2_EMBEDDED.sql +++ b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/H2_EMBEDDED.sql @@ -1 +1,91 @@ -INSERT INTO TRM_CONCEPT_MAP_GRP_ELM_TGT (PID, TARGET_CODE, CONCEPT_MAP_URL, TARGET_DISPLAY, TARGET_EQUIVALENCE, SYSTEM_URL, SYSTEM_VERSION, VALUESET_URL, CONCEPT_MAP_GRP_ELM_PID) VALUES (61, NULL, NULL, 'PYRIDOXINE', 'UNMATCHED', NULL, NULL, NULL, 60); +INSERT INTO TRM_CONCEPT ( + PID, + CODEVAL, + CODESYSTEM_PID, + DISPLAY, + INDEX_STATUS, + PARENT_PIDS, + PARENT_PIDS_VC, + CODE_SEQUENCE, + CONCEPT_UPDATED +) VALUES ( + 1, + 'LA4393-0', + 54, + 'CR_1430_Reason no radiation', + 1, + '1415721', + '1415721', + 3, + '2024-05-01 17:02:39.139' + ); + +INSERT INTO TRM_VALUESET_CONCEPT ( + PID, + CODEVAL, + DISPLAY, + INDEX_STATUS, + VALUESET_ORDER, + SOURCE_DIRECT_PARENT_PIDS, + SOURCE_DIRECT_PARENT_PIDS_VC, + SOURCE_PID, + SYSTEM_URL, + SYSTEM_VER, + VALUESET_PID +) VALUES ( + 1, + 'LA4382-3', + 'CR_1550_Radiation treatment location', + 1, + 2, + '1415722', + '1415722', + 1, + 'HTTP://LOINC.ORG', + 'V2.67', + 59 +); + +INSERT INTO TRM_CONCEPT_PROPERTY ( + PID, + PROP_CODESYSTEM, + PROP_DISPLAY, + PROP_KEY, + PROP_TYPE, + PROP_VAL, + PROP_VAL_BIN, + PROP_VAL_LOB, + CS_VER_PID, + CONCEPT_PID +) VALUES ( + 154, + 'http://loinc.org', + 'code-A', + 'CODING', + 1, + 'LP98185-9', + '\x48656c6c6f20776f726c6421', + 83006307, + 54, + 1 +); + +INSERT INTO HFJ_BINARY_STORAGE_BLOB ( + BLOB_ID, + BLOB_DATA, + CONTENT_TYPE, + BLOB_HASH, + PUBLISHED_DATE, + RESOURCE_ID, + BLOB_SIZE, + STORAGE_CONTENT_BIN +) VALUES ( + '1', + 72995, + 'TEXT', + 
'dc7197cfab936698bef7818975c185a9b88b71a0a0a2493deea487706ddf20cb', + '2024-06-15 09:58:42.92', + '1678', + 16, + '\x48656c6c6f20776f726c6421' +); diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/MSSQL_2012.sql b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/MSSQL_2012.sql index e84689abc35..ae7b3014efe 100644 --- a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/MSSQL_2012.sql +++ b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/MSSQL_2012.sql @@ -1 +1,91 @@ -INSERT INTO TRM_CONCEPT_MAP_GRP_ELM_TGT (PID, TARGET_CODE, CONCEPT_MAP_URL, TARGET_DISPLAY, TARGET_EQUIVALENCE, SYSTEM_URL, SYSTEM_VERSION, VALUESET_URL, CONCEPT_MAP_GRP_ELM_PID) VALUES (61, NULL, NULL, 'PYRIDOXINE', 'UNMATCHED', NULL, NULL, NULL, 60); +INSERT INTO TRM_CONCEPT ( + PID, + CODEVAL, + CODESYSTEM_PID, + DISPLAY, + INDEX_STATUS, + PARENT_PIDS, + PARENT_PIDS_VC, + CODE_SEQUENCE, + CONCEPT_UPDATED +) VALUES ( + 1, + 'LA4393-0', + 54, + 'CR_1430_Reason no radiation', + 1, + '1415721', + '1415721', + 3, + '2024-05-01 17:02:39.139' + ); + +INSERT INTO TRM_VALUESET_CONCEPT ( + PID, + CODEVAL, + DISPLAY, + INDEX_STATUS, + VALUESET_ORDER, + SOURCE_DIRECT_PARENT_PIDS, + SOURCE_DIRECT_PARENT_PIDS_VC, + SOURCE_PID, + SYSTEM_URL, + SYSTEM_VER, + VALUESET_PID +) VALUES ( + 1, + 'LA4382-3', + 'CR_1550_Radiation treatment location', + 1, + 2, + '1415722', + '1415722', + 1, + 'HTTP://LOINC.ORG', + 'V2.67', + 59 +); + +INSERT INTO TRM_CONCEPT_PROPERTY ( + PID, + PROP_CODESYSTEM, + PROP_DISPLAY, + PROP_KEY, + PROP_TYPE, + PROP_VAL, + PROP_VAL_BIN, + PROP_VAL_LOB, + CS_VER_PID, + CONCEPT_PID +) VALUES ( + 154, + 'http://loinc.org', + 'code-A', + 'CODING', + 1, + 'LP98185-9', + 8479927, + 83006307, + 54, + 1 +); + +INSERT INTO HFJ_BINARY_STORAGE_BLOB ( + BLOB_ID, + BLOB_DATA, + CONTENT_TYPE, + BLOB_HASH, + PUBLISHED_DATE, + RESOURCE_ID, + BLOB_SIZE, + 
STORAGE_CONTENT_BIN
+) VALUES (
+   '1',
+   72995,
+   'TEXT',
+   'dc7197cfab936698bef7818975c185a9b88b71a0a0a2493deea487706ddf20cb',
+   '2024-06-15 09:58:42.92',
+   '1678',
+   16,
+   7368816
+);
diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/ORACLE_12C.sql b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/ORACLE_12C.sql
index e84689abc35..ee801e3afb9 100644
--- a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/ORACLE_12C.sql
+++ b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/ORACLE_12C.sql
@@ -1 +1,91 @@
-INSERT INTO TRM_CONCEPT_MAP_GRP_ELM_TGT (PID, TARGET_CODE, CONCEPT_MAP_URL, TARGET_DISPLAY, TARGET_EQUIVALENCE, SYSTEM_URL, SYSTEM_VERSION, VALUESET_URL, CONCEPT_MAP_GRP_ELM_PID) VALUES (61, NULL, NULL, 'PYRIDOXINE', 'UNMATCHED', NULL, NULL, NULL, 60);
+INSERT INTO TRM_CONCEPT (
+   PID,
+   CODEVAL,
+   CODESYSTEM_PID,
+   DISPLAY,
+   INDEX_STATUS,
+   PARENT_PIDS,
+   PARENT_PIDS_VC,
+   CODE_SEQUENCE,
+   CONCEPT_UPDATED
+) VALUES (
+   1,
+   'LA4393-0',
+   54,
+   'CR_1430_Reason no radiation',
+   1,
+   '1415721',
+   '1415721',
+   3,
+   SYSDATE
+   );
+
+INSERT INTO TRM_VALUESET_CONCEPT (
+   PID,
+   CODEVAL,
+   DISPLAY,
+   INDEX_STATUS,
+   VALUESET_ORDER,
+   SOURCE_DIRECT_PARENT_PIDS,
+   SOURCE_DIRECT_PARENT_PIDS_VC,
+   SOURCE_PID,
+   SYSTEM_URL,
+   SYSTEM_VER,
+   VALUESET_PID
+) VALUES (
+   1,
+   'LA4382-3',
+   'CR_1550_Radiation treatment location',
+   1,
+   2,
+   '1415722',
+   '1415722',
+   1,
+   'HTTP://LOINC.ORG',
+   'V2.67',
+   59
+);
+
+INSERT INTO TRM_CONCEPT_PROPERTY (
+   PID,
+   PROP_CODESYSTEM,
+   PROP_DISPLAY,
+   PROP_KEY,
+   PROP_TYPE,
+   PROP_VAL,
+   PROP_VAL_BIN,
+   PROP_VAL_LOB,
+   CS_VER_PID,
+   CONCEPT_PID
+) VALUES (
+   154,
+   'http://loinc.org',
+   'code-A',
+   'CODING',
+   1,
+   'LP98185-9',
+   HEXTORAW('453d7a34'),
+   HEXTORAW('8B9D5255'),
+   54,
+   1
+);
+
+INSERT INTO HFJ_BINARY_STORAGE_BLOB (
+   BLOB_ID,
+   BLOB_DATA,
+   CONTENT_TYPE,
+   BLOB_HASH,
+   PUBLISHED_DATE,
+   RESOURCE_ID,
+   BLOB_SIZE,
+   STORAGE_CONTENT_BIN
+) VALUES (
+   '1',
+   HEXTORAW('28721FB0'),
+   'TEXT',
+   'dc7197cfab936698bef7818975c185a9b88b71a0a0a2493deea487706ddf20cb',
+   SYSDATE,
+   '1678',
+   16,
+   HEXTORAW('1B25E293')
+);
diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/POSTGRES_9_4.sql b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/POSTGRES_9_4.sql
index e52db927eca..99689b7ae08 100644
--- a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/POSTGRES_9_4.sql
+++ b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_2_0/data/POSTGRES_9_4.sql
@@ -1,5 +1,91 @@
-INSERT INTO TRM_CONCEPT_MAP_GRP_ELM_TGT (PID, TARGET_CODE, CONCEPT_MAP_URL, TARGET_DISPLAY, TARGET_EQUIVALENCE, SYSTEM_URL, SYSTEM_VERSION, VALUESET_URL, CONCEPT_MAP_GRP_ELM_PID) VALUES (61, NULL, NULL, 'PYRIDOXINE', 'UNMATCHED', NULL, NULL, NULL, 60);
-INSERT INTO HFJ_BINARY_STORAGE (CONTENT_ID, RESOURCE_ID, CONTENT_TYPE, STORAGE_CONTENT_BIN, PUBLISHED_DATE ) VALUES ('1', '2', 'TEXT', '\x48656c6c6f20776f726c6421', '2023-06-15 09:58:42.92');
-INSERT INTO TRM_CONCEPT (PID, CODEVAL, PARENT_PIDS_VC ) VALUES (1, 'aCode', '1 2 3 4');
-INSERT INTO TRM_CONCEPT_PROPERTY (PID, PROP_KEY, PROP_VAL_BIN, PROP_TYPE) VALUES (1, 'key', '\x48656c6c6f20776f726c6421', 1);
-INSERT INTO TRM_VALUESET_CONCEPT (PID, VALUESET_PID, VALUESET_ORDER, SOURCE_DIRECT_PARENT_PIDS_VC, SYSTEM_URL, CODEVAL) VALUES (1, 59, 1, '1 2 3 4 5 6', 'http://systemUlr', 'codeVal');
+INSERT INTO TRM_CONCEPT (
+   PID,
+   CODEVAL,
+   CODESYSTEM_PID,
+   DISPLAY,
+   INDEX_STATUS,
+   PARENT_PIDS,
+   PARENT_PIDS_VC,
+   CODE_SEQUENCE,
+   CONCEPT_UPDATED
+) VALUES (
+   1,
+   'LA4393-0',
+   54,
+   'CR_1430_Reason no radiation',
+   1,
+   '1415721',
+   '1415721',
+   3,
+   '2024-05-01 17:02:39.139'
+   );
+
+INSERT INTO TRM_VALUESET_CONCEPT (
+   PID,
+   CODEVAL,
+   DISPLAY,
+   INDEX_STATUS,
+   VALUESET_ORDER,
+   SOURCE_DIRECT_PARENT_PIDS,
+   SOURCE_DIRECT_PARENT_PIDS_VC,
+   SOURCE_PID,
+   SYSTEM_URL,
+   SYSTEM_VER,
+   VALUESET_PID
+) VALUES (
+   1,
+   'LA4382-3',
+   'CR_1550_Radiation treatment location',
+   1,
+   2,
+   '1415722',
+   '1415722',
+   1,
+   'HTTP://LOINC.ORG',
+   'V2.67',
+   59
+);
+
+INSERT INTO TRM_CONCEPT_PROPERTY (
+   PID,
+   PROP_CODESYSTEM,
+   PROP_DISPLAY,
+   PROP_KEY,
+   PROP_TYPE,
+   PROP_VAL,
+   PROP_VAL_BIN,
+   PROP_VAL_LOB,
+   CS_VER_PID,
+   CONCEPT_PID
+) VALUES (
+   154,
+   'http://loinc.org',
+   'code-A',
+   'CODING',
+   1,
+   'LP98185-9',
+   '\x48656c6c6f20776f726c6421',
+   83006307,
+   54,
+   1
+);
+
+INSERT INTO HFJ_BINARY_STORAGE_BLOB (
+   BLOB_ID,
+   BLOB_DATA,
+   CONTENT_TYPE,
+   BLOB_HASH,
+   PUBLISHED_DATE,
+   RESOURCE_ID,
+   BLOB_SIZE,
+   STORAGE_CONTENT_BIN
+) VALUES (
+   '1',
+   72995,
+   'TEXT',
+   'dc7197cfab936698bef7818975c185a9b88b71a0a0a2493deea487706ddf20cb',
+   '2024-06-15 09:58:42.92',
+   '1678',
+   16,
+   '\x48656c6c6f20776f726c6421'
+);
diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/H2_EMBEDDED.sql b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/H2_EMBEDDED.sql
new file mode 100644
index 00000000000..fd2eb6c3c98
--- /dev/null
+++ b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/H2_EMBEDDED.sql
@@ -0,0 +1,53 @@
+INSERT INTO HFJ_RES_SEARCH_URL (
+   PARTITION_DATE,
+   PARTITION_ID,
+   RES_SEARCH_URL,
+   CREATED_TIME,
+   RES_ID
+) VALUES (
+   '2024-04-05',
+   1,
+   'https://example.com',
+   '2024-06-29 10:14:39.69',
+   1906
+);
+
+INSERT INTO HFJ_IDX_CMP_STRING_UNIQ (
+   PID,
+   PARTITION_DATE,
+   PARTITION_ID,
+   HASH_COMPLETE,
+   HASH_COMPLETE_2,
+   IDX_STRING,
+   RES_ID
+) VALUES (
+   2,
+   '2024-04-05',
+   1,
+   -8173309116900170400,
+   -7180360017667394276,
+   'Patient?birthdate=2024-07-24&family=FAM',
+   1906
+);
+
+INSERT INTO TRM_CONCEPT_DESIG (
+   PID,
+   LANG,
+   USE_CODE,
+   USE_DISPLAY,
+   USE_SYSTEM,
+   VAL,
+   VAL_VC,
+   CS_VER_PID,
+   CONCEPT_PID
+) VALUES (
+   106,
+   'NL',
+   '900000000000013009',
+   'SYNONYM',
+   'HTTP://SNOMED.INFO/SCT',
+   'SYSTOLISCHE BLOEDDRUK - EXPIRATIE',
+   'SYSTOLISCHE BLOEDDRUK - EXPIRATIE',
+   54,
+   150
+);
diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/MSSQL_2012.sql b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/MSSQL_2012.sql
new file mode 100644
index 00000000000..fd2eb6c3c98
--- /dev/null
+++ b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/MSSQL_2012.sql
@@ -0,0 +1,53 @@
+INSERT INTO HFJ_RES_SEARCH_URL (
+   PARTITION_DATE,
+   PARTITION_ID,
+   RES_SEARCH_URL,
+   CREATED_TIME,
+   RES_ID
+) VALUES (
+   '2024-04-05',
+   1,
+   'https://example.com',
+   '2024-06-29 10:14:39.69',
+   1906
+);
+
+INSERT INTO HFJ_IDX_CMP_STRING_UNIQ (
+   PID,
+   PARTITION_DATE,
+   PARTITION_ID,
+   HASH_COMPLETE,
+   HASH_COMPLETE_2,
+   IDX_STRING,
+   RES_ID
+) VALUES (
+   2,
+   '2024-04-05',
+   1,
+   -8173309116900170400,
+   -7180360017667394276,
+   'Patient?birthdate=2024-07-24&family=FAM',
+   1906
+);
+
+INSERT INTO TRM_CONCEPT_DESIG (
+   PID,
+   LANG,
+   USE_CODE,
+   USE_DISPLAY,
+   USE_SYSTEM,
+   VAL,
+   VAL_VC,
+   CS_VER_PID,
+   CONCEPT_PID
+) VALUES (
+   106,
+   'NL',
+   '900000000000013009',
+   'SYNONYM',
+   'HTTP://SNOMED.INFO/SCT',
+   'SYSTOLISCHE BLOEDDRUK - EXPIRATIE',
+   'SYSTOLISCHE BLOEDDRUK - EXPIRATIE',
+   54,
+   150
+);
diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/ORACLE_12C.sql b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/ORACLE_12C.sql
new file mode 100644
index 00000000000..442d6661919
--- /dev/null
+++ b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/ORACLE_12C.sql
@@ -0,0 +1,53 @@
+INSERT INTO HFJ_RES_SEARCH_URL (
+   PARTITION_DATE,
+   PARTITION_ID,
+   RES_SEARCH_URL,
+   CREATED_TIME,
+   RES_ID
+) VALUES (
+   SYSDATE,
+   1,
+   'https://example.com',
+   SYSDATE,
+   1906
+);
+
+INSERT INTO HFJ_IDX_CMP_STRING_UNIQ (
+   PID,
+   PARTITION_DATE,
+   PARTITION_ID,
+   HASH_COMPLETE,
+   HASH_COMPLETE_2,
+   IDX_STRING,
+   RES_ID
+) VALUES (
+   2,
+   SYSDATE,
+   1,
+   -8173309116900170400,
+   -7180360017667394276,
+   'Patient?birthdate=2024-07-24&family=FAM',
+   1906
+);
+
+INSERT INTO TRM_CONCEPT_DESIG (
+   PID,
+   LANG,
+   USE_CODE,
+   USE_DISPLAY,
+   USE_SYSTEM,
+   VAL,
+   VAL_VC,
+   CS_VER_PID,
+   CONCEPT_PID
+) VALUES (
+   106,
+   'NL',
+   '900000000000013009',
+   'SYNONYM',
+   'HTTP://SNOMED.INFO/SCT',
+   'SYSTOLISCHE BLOEDDRUK - EXPIRATIE',
+   'SYSTOLISCHE BLOEDDRUK - EXPIRATIE',
+   54,
+   150
+);
diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/POSTGRES_9_4.sql b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/POSTGRES_9_4.sql
new file mode 100644
index 00000000000..fd2eb6c3c98
--- /dev/null
+++ b/hapi-fhir-jpaserver-test-utilities/src/main/resources/migration/releases/V7_4_0/data/POSTGRES_9_4.sql
@@ -0,0 +1,53 @@
+INSERT INTO HFJ_RES_SEARCH_URL (
+   PARTITION_DATE,
+   PARTITION_ID,
+   RES_SEARCH_URL,
+   CREATED_TIME,
+   RES_ID
+) VALUES (
+   '2024-04-05',
+   1,
+   'https://example.com',
+   '2024-06-29 10:14:39.69',
+   1906
+);
+
+INSERT INTO HFJ_IDX_CMP_STRING_UNIQ (
+   PID,
+   PARTITION_DATE,
+   PARTITION_ID,
+   HASH_COMPLETE,
+   HASH_COMPLETE_2,
+   IDX_STRING,
+   RES_ID
+) VALUES (
+   2,
+   '2024-04-05',
+   1,
+   -8173309116900170400,
+   -7180360017667394276,
+   'Patient?birthdate=2024-07-24&family=FAM',
+   1906
+);
+
+INSERT INTO TRM_CONCEPT_DESIG (
+   PID,
+   LANG,
+   USE_CODE,
+   USE_DISPLAY,
+   USE_SYSTEM,
+   VAL,
+   VAL_VC,
+   CS_VER_PID,
+   CONCEPT_PID
+) VALUES (
+   106,
+   'NL',
+   '900000000000013009',
+   'SYNONYM',
+   'HTTP://SNOMED.INFO/SCT',
+   'SYSTOLISCHE BLOEDDRUK - EXPIRATIE',
+   'SYSTOLISCHE BLOEDDRUK - EXPIRATIE',
+   54,
+   150
+);
diff --git a/hapi-fhir-jpaserver-test-utilities/src/test/java/ca/uhn/fhir/jpa/embedded/HapiSchemaMigrationTest.java b/hapi-fhir-jpaserver-test-utilities/src/test/java/ca/uhn/fhir/jpa/embedded/HapiSchemaMigrationTest.java
index 0cc6156dfdd..daf26965bc8 100644
--- a/hapi-fhir-jpaserver-test-utilities/src/test/java/ca/uhn/fhir/jpa/embedded/HapiSchemaMigrationTest.java
+++ b/hapi-fhir-jpaserver-test-utilities/src/test/java/ca/uhn/fhir/jpa/embedded/HapiSchemaMigrationTest.java
@@ -171,7 +171,7 @@ public class HapiSchemaMigrationTest {
 		final Object allCountValue = allCount.get(0).values().iterator().next();
 		if (allCountValue instanceof Number allCountNumber) {
-			assertThat(allCountNumber.intValue()).isEqualTo(1);
+			assertThat(allCountNumber.intValue()).isEqualTo(2);
 		}
 
 		try (final Connection connection = theDatabase.getDataSource().getConnection()) {
@@ -230,7 +230,7 @@ public class HapiSchemaMigrationTest {
 		final Object queryResultValueVal = queryResultValuesVal.iterator().next();
 		assertThat(queryResultValueVal).isInstanceOf(Number.class);
 		if (queryResultValueVal instanceof Number queryResultNumber) {
-			assertThat(queryResultNumber.intValue()).isEqualTo(1);
+			assertThat(queryResultNumber.intValue()).isEqualTo(2);
 		}
 
 		assertThat(nullValVcCount).hasSize(1);
@@ -244,7 +244,7 @@ public class HapiSchemaMigrationTest {
 		final Object allCountValue = allCount.get(0).values().iterator().next();
 		if (allCountValue instanceof Number allCountNumber) {
-			assertThat(allCountNumber.intValue()).isEqualTo(1);
+			assertThat(allCountNumber.intValue()).isEqualTo(2);
 		}
 
 		try (final Connection connection = theDatabase.getDataSource().getConnection()) {
diff --git a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobParameters.java b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobParameters.java
index fd57f9c78f7..82d154c86f7 100644
--- a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobParameters.java
+++ b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobParameters.java
@@ -19,10 +19,10 @@
  */
 package ca.uhn.fhir.batch2.jobs.expunge;
 
-import ca.uhn.fhir.batch2.jobs.parameters.JobParameters;
+import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlJobParameters;
 import com.fasterxml.jackson.annotation.JsonProperty;
 
-public class DeleteExpungeJobParameters extends JobParameters {
+public class DeleteExpungeJobParameters extends PartitionedUrlJobParameters {
 	@JsonProperty("cascade")
 	private boolean myCascade;
 
diff --git a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobParametersValidator.java b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobParametersValidator.java
index fcb3381b6a8..f57df320e14 100644
--- a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobParametersValidator.java
+++ b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobParametersValidator.java
@@ -55,9 +55,6 @@ public class DeleteExpungeJobParametersValidator implements IJobParametersValida
 		}
 
 		// Verify that the user has access to all requested partitions
-		myRequestPartitionHelperSvc.validateHasPartitionPermissions(
-				theRequestDetails, null, theParameters.getRequestPartitionId());
-
 		for (PartitionedUrl partitionedUrl : theParameters.getPartitionedUrls()) {
 			String url = partitionedUrl.getUrl();
 			ValidateUtil.isTrueOrThrowInvalidRequest(
@@ -68,6 +65,6 @@ public class DeleteExpungeJobParametersValidator implements IJobParametersValida
 					theRequestDetails, null, partitionedUrl.getRequestPartitionId());
 			}
 		}
-		return myUrlListValidator.validatePartitionedUrls(theParameters.getPartitionedUrls());
+		return myUrlListValidator.validateUrls(theParameters.getUrls());
 	}
 }
diff --git a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobSubmitterImpl.java b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobSubmitterImpl.java
index 5fe7b69687d..526bacd71d5 100644
--- a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobSubmitterImpl.java
+++ b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/expunge/DeleteExpungeJobSubmitterImpl.java
@@ -20,9 +20,9 @@
 package ca.uhn.fhir.batch2.jobs.expunge;
 
 import ca.uhn.fhir.batch2.api.IJobCoordinator;
+import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl;
 import ca.uhn.fhir.batch2.jobs.parameters.UrlPartitioner;
 import ca.uhn.fhir.batch2.model.JobInstanceStartRequest;
-import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.i18n.Msg;
 import ca.uhn.fhir.interceptor.api.HookParams;
 import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster;
@@ -31,7 +31,6 @@ import ca.uhn.fhir.interceptor.model.RequestPartitionId;
 import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
 import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse;
 import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc;
-import ca.uhn.fhir.jpa.searchparam.MatchUrlService;
 import ca.uhn.fhir.rest.api.server.RequestDetails;
 import ca.uhn.fhir.rest.api.server.storage.IDeleteExpungeJobSubmitter;
 import ca.uhn.fhir.rest.server.exceptions.ForbiddenOperationException;
@@ -51,12 +50,6 @@ public class DeleteExpungeJobSubmitterImpl implements IDeleteExpungeJobSubmitter
 	@Autowired
 	IJobCoordinator myJobCoordinator;
 
-	@Autowired
-	FhirContext myFhirContext;
-
-	@Autowired
-	MatchUrlService myMatchUrlService;
-
 	@Autowired
 	IRequestPartitionHelperSvc myRequestPartitionHelperSvc;
 
@@ -102,11 +95,16 @@ public class DeleteExpungeJobSubmitterImpl implements IDeleteExpungeJobSubmitter
 				.forEach(deleteExpungeJobParameters::addPartitionedUrl);
 		deleteExpungeJobParameters.setBatchSize(theBatchSize);
 
+		// TODO MM: apply changes similar to ReindexProvider to compute the PartitionedUrl list using
+		// IJobPartitionProvider.
+		// so that feature https://github.com/hapifhir/hapi-fhir/issues/6008 can be implemented for this operation
 		// Also set top level partition in case there are no urls
-		RequestPartitionId requestPartition =
-				myRequestPartitionHelperSvc.determineReadPartitionForRequestForServerOperation(
-						theRequestDetails, ProviderConstants.OPERATION_DELETE_EXPUNGE);
-		deleteExpungeJobParameters.setRequestPartitionId(requestPartition);
+		if (theUrlsToDeleteExpunge.isEmpty()) { // fix for https://github.com/hapifhir/hapi-fhir/issues/6179
+			RequestPartitionId requestPartition =
+					myRequestPartitionHelperSvc.determineReadPartitionForRequestForServerOperation(
+							theRequestDetails, ProviderConstants.OPERATION_DELETE_EXPUNGE);
+			deleteExpungeJobParameters.addPartitionedUrl(new PartitionedUrl().setRequestPartitionId(requestPartition));
+		}
 
 		deleteExpungeJobParameters.setCascade(theCascade);
 		deleteExpungeJobParameters.setCascadeMaxRounds(theCascadeMaxRounds);
diff --git a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexAppCtx.java b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexAppCtx.java
index f8648250974..a49d054bf39 100644
--- a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexAppCtx.java
+++ b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexAppCtx.java
@@ -26,7 +26,6 @@ import ca.uhn.fhir.batch2.api.VoidModel;
 import ca.uhn.fhir.batch2.jobs.chunk.ChunkRangeJson;
 import ca.uhn.fhir.batch2.jobs.chunk.ResourceIdListWorkChunkJson;
 import ca.uhn.fhir.batch2.jobs.parameters.UrlListValidator;
-import ca.uhn.fhir.batch2.jobs.parameters.UrlPartitioner;
 import ca.uhn.fhir.batch2.jobs.step.GenerateRangeChunksStep;
 import ca.uhn.fhir.batch2.jobs.step.LoadIdsStep;
 import ca.uhn.fhir.batch2.model.JobDefinition;
@@ -90,8 +89,7 @@ public class ReindexAppCtx {
 	public ReindexProvider reindexProvider(
 			FhirContext theFhirContext,
 			IJobCoordinator theJobCoordinator,
-			IJobPartitionProvider theJobPartitionHandler,
-			UrlPartitioner theUrlPartitioner) {
-		return new ReindexProvider(theFhirContext, theJobCoordinator, theJobPartitionHandler, theUrlPartitioner);
+			IJobPartitionProvider theJobPartitionHandler) {
+		return new ReindexProvider(theFhirContext, theJobCoordinator, theJobPartitionHandler);
 	}
 }
diff --git a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexJobParameters.java b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexJobParameters.java
index 2a5f4131888..03913ad025f 100644
--- a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexJobParameters.java
+++ b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexJobParameters.java
@@ -19,14 +19,14 @@
  */
 package ca.uhn.fhir.batch2.jobs.reindex;
 
-import ca.uhn.fhir.batch2.jobs.parameters.JobParameters;
+import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlJobParameters;
 import ca.uhn.fhir.jpa.api.dao.ReindexParameters;
 import com.fasterxml.jackson.annotation.JsonProperty;
 import jakarta.annotation.Nullable;
 
 import static org.apache.commons.lang3.ObjectUtils.defaultIfNull;
 
-public class ReindexJobParameters extends JobParameters {
+public class ReindexJobParameters extends PartitionedUrlJobParameters {
 
 	public static final String OPTIMIZE_STORAGE = "optimizeStorage";
 	public static final String REINDEX_SEARCH_PARAMETERS = "reindexSearchParameters";
diff --git a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexJobParametersValidator.java b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexJobParametersValidator.java
index d267bde3f1a..b7560a8ef57 100644
--- a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexJobParametersValidator.java
+++ b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexJobParametersValidator.java
@@ -20,8 +20,7 @@
 package ca.uhn.fhir.batch2.jobs.reindex;
 
 import ca.uhn.fhir.batch2.api.IJobParametersValidator;
-import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl;
-import ca.uhn.fhir.batch2.jobs.parameters.UrlListValidator;
+import ca.uhn.fhir.batch2.jobs.parameters.IUrlListValidator;
 import ca.uhn.fhir.rest.api.server.RequestDetails;
 import jakarta.annotation.Nonnull;
 import jakarta.annotation.Nullable;
@@ -31,30 +30,26 @@ import java.util.List;
 
 public class ReindexJobParametersValidator implements IJobParametersValidator<ReindexJobParameters> {
 
-	private final UrlListValidator myUrlListValidator;
+	private final IUrlListValidator myUrlListValidator;
 
-	public ReindexJobParametersValidator(UrlListValidator theUrlListValidator) {
+	public ReindexJobParametersValidator(IUrlListValidator theUrlListValidator) {
 		myUrlListValidator = theUrlListValidator;
 	}
 
 	@Nullable
 	@Override
 	public List<String> validate(RequestDetails theRequestDetails, @Nonnull ReindexJobParameters theParameters) {
-		List<String> errors = myUrlListValidator.validatePartitionedUrls(theParameters.getPartitionedUrls());
+		List<String> errors = myUrlListValidator.validateUrls(theParameters.getUrls());
 		if (errors == null || errors.isEmpty()) {
 			// only check if there's no other errors (new list to fix immutable issues)
 			errors = new ArrayList<>();
-			List<PartitionedUrl> urls = theParameters.getPartitionedUrls();
-			for (PartitionedUrl purl : urls) {
-				String url = purl.getUrl();
-
+			for (String url : theParameters.getUrls()) {
 				if (url.contains(" ") || url.contains("\n") || url.contains("\t")) {
 					errors.add("Invalid URL. URL cannot contain spaces : " + url);
 				}
 			}
 		}
-
 		return errors;
 	}
 }
diff --git a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexProvider.java b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexProvider.java
index 6677c52e067..5889a75d4e9 100644
--- a/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexProvider.java
+++ b/hapi-fhir-storage-batch2-jobs/src/main/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexProvider.java
@@ -21,7 +21,6 @@ package ca.uhn.fhir.batch2.jobs.reindex;
 
 import ca.uhn.fhir.batch2.api.IJobCoordinator;
 import ca.uhn.fhir.batch2.api.IJobPartitionProvider;
-import ca.uhn.fhir.batch2.jobs.parameters.UrlPartitioner;
 import ca.uhn.fhir.batch2.model.JobInstanceStartRequest;
 import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.jpa.api.dao.ReindexParameters;
@@ -41,6 +40,7 @@ import org.hl7.fhir.instance.model.api.IBaseParameters;
 import org.hl7.fhir.instance.model.api.IPrimitiveType;
 
 import java.util.List;
+import java.util.stream.Collectors;
 
 import static ca.uhn.fhir.batch2.jobs.reindex.ReindexJobParameters.OPTIMIZE_STORAGE;
 import static ca.uhn.fhir.batch2.jobs.reindex.ReindexJobParameters.REINDEX_SEARCH_PARAMETERS;
@@ -50,7 +50,6 @@ public class ReindexProvider {
 	private final FhirContext myFhirContext;
 	private final IJobCoordinator myJobCoordinator;
 	private final IJobPartitionProvider myJobPartitionProvider;
-	private final UrlPartitioner myUrlPartitioner;
 
 	/**
 	 * Constructor
@@ -58,12 +57,10 @@ public class ReindexProvider {
 	public ReindexProvider(
 			FhirContext theFhirContext,
 			IJobCoordinator theJobCoordinator,
-			IJobPartitionProvider theJobPartitionProvider,
-			UrlPartitioner theUrlPartitioner) {
+			IJobPartitionProvider theJobPartitionProvider) {
 		myFhirContext = theFhirContext;
 		myJobCoordinator = theJobCoordinator;
 		myJobPartitionProvider = theJobPartitionProvider;
-		myUrlPartitioner = theUrlPartitioner;
 	}
 
 	@Operation(name = ProviderConstants.OPERATION_REINDEX, idempotent = false)
@@ -119,17 +116,15 @@ public class ReindexProvider {
 			params.setOptimisticLock(theOptimisticLock.getValue());
 		}
 
+		List<String> urls = List.of();
 		if (theUrlsToReindex != null) {
-			theUrlsToReindex.stream()
+			urls = theUrlsToReindex.stream()
 					.map(IPrimitiveType::getValue)
 					.filter(StringUtils::isNotBlank)
-					.map(url -> myUrlPartitioner.partitionUrl(url, theRequestDetails))
-					.forEach(params::addPartitionedUrl);
+					.collect(Collectors.toList());
 		}
 
-		myJobPartitionProvider
-				.getPartitions(theRequestDetails, ProviderConstants.OPERATION_REINDEX)
-				.forEach(params::addRequestPartitionId);
+		myJobPartitionProvider.getPartitionedUrls(theRequestDetails, urls).forEach(params::addPartitionedUrl);
 
 		JobInstanceStartRequest request = new JobInstanceStartRequest();
 		request.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX);
diff --git a/hapi-fhir-storage-batch2-jobs/src/test/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexProviderTest.java b/hapi-fhir-storage-batch2-jobs/src/test/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexProviderTest.java
index 30185eea738..66cacbd1bfe 100644
--- a/hapi-fhir-storage-batch2-jobs/src/test/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexProviderTest.java
+++ b/hapi-fhir-storage-batch2-jobs/src/test/java/ca/uhn/fhir/batch2/jobs/reindex/ReindexProviderTest.java
@@ -3,13 +3,11 @@ package ca.uhn.fhir.batch2.jobs.reindex;
 
 import ca.uhn.fhir.batch2.api.IJobCoordinator;
 import ca.uhn.fhir.batch2.api.IJobPartitionProvider;
 import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl;
-import ca.uhn.fhir.batch2.jobs.parameters.UrlPartitioner;
 import ca.uhn.fhir.batch2.model.JobInstanceStartRequest;
 import ca.uhn.fhir.context.FhirContext;
 import ca.uhn.fhir.interceptor.model.RequestPartitionId;
 import ca.uhn.fhir.jpa.api.dao.ReindexParameters;
 import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse;
-import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc;
 import ca.uhn.fhir.rest.server.provider.ProviderConstants;
 import ca.uhn.fhir.test.utilities.server.RestfulServerExtension;
 import org.hl7.fhir.r4.model.BooleanType;
@@ -22,6 +20,9 @@ import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.extension.ExtendWith;
 import org.junit.jupiter.api.extension.RegisterExtension;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.NullSource;
+import org.junit.jupiter.params.provider.ValueSource;
 import org.mockito.ArgumentCaptor;
 import org.mockito.Captor;
 import org.mockito.InjectMocks;
@@ -38,8 +39,6 @@ import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.junit.jupiter.api.Assertions.assertFalse;
 import static org.junit.jupiter.api.Assertions.assertTrue;
 import static org.mockito.ArgumentMatchers.any;
-import static org.mockito.ArgumentMatchers.anyString;
-import static org.mockito.ArgumentMatchers.eq;
 import static org.mockito.ArgumentMatchers.isNotNull;
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
@@ -54,15 +53,11 @@ public class ReindexProviderTest {
 	private final FhirContext myCtx = FhirContext.forR4Cached();
 
 	@RegisterExtension
-	private final RestfulServerExtension myServerExtension = new RestfulServerExtension(myCtx);
+	public final RestfulServerExtension myServerExtension = new RestfulServerExtension(myCtx);
 
 	@Mock
 	private IJobCoordinator myJobCoordinator;
 
-	@Mock
-	private IRequestPartitionHelperSvc myRequestPartitionHelperSvc;
-	@Mock
-	private UrlPartitioner myUrlPartitioner;
 	@Mock
 	private IJobPartitionProvider myJobPartitionProvider;
 
@@ -78,7 +73,6 @@ public class ReindexProviderTest {
 		when(myJobCoordinator.startInstance(isNotNull(), any()))
 				.thenReturn(createJobStartResponse());
-		when(myJobPartitionProvider.getPartitions(any(), any())).thenReturn(List.of(RequestPartitionId.allPartitions()));
 	}
 
 	private Batch2JobStartResponse createJobStartResponse() {
@@ -92,20 +86,26 @@ public class ReindexProviderTest {
 		myServerExtension.unregisterProvider(mySvc);
 	}
 
-	@Test
-	public void testReindex_ByUrl() {
+	@ParameterizedTest
+	@NullSource
+	@ValueSource(strings = {"Observation?status=active", ""})
+	public void testReindex_withUrlAndNonDefaultParams(String theUrl) {
 		// setup
 		Parameters input = new Parameters();
-		String url = "Observation?status=active";
 		int batchSize = 2401;
-		input.addParameter(ProviderConstants.OPERATION_REINDEX_PARAM_URL, url);
+		input.addParameter(ProviderConstants.OPERATION_REINDEX_PARAM_URL, theUrl);
 		input.addParameter(ProviderConstants.OPERATION_REINDEX_PARAM_BATCH_SIZE, new DecimalType(batchSize));
+		input.addParameter(ReindexJobParameters.REINDEX_SEARCH_PARAMETERS, new CodeType("none"));
+		input.addParameter(ReindexJobParameters.OPTIMISTIC_LOCK, new BooleanType(false));
+		input.addParameter(ReindexJobParameters.OPTIMIZE_STORAGE, new CodeType("current_version"));
+
+		RequestPartitionId partitionId = RequestPartitionId.fromPartitionId(1);
+		final PartitionedUrl partitionedUrl = new PartitionedUrl().setUrl(theUrl).setRequestPartitionId(partitionId);
+		when(myJobPartitionProvider.getPartitionedUrls(any(), any())).thenReturn(List.of(partitionedUrl));
 
 		ourLog.debug(myCtx.newJsonParser().setPrettyPrint(true).encodeResourceToString(input));
-		when(myUrlPartitioner.partitionUrl(anyString(), any())).thenReturn(new PartitionedUrl().setUrl(url).setRequestPartitionId(RequestPartitionId.defaultPartition()));
 
 		// Execute
-
 		Parameters response = myServerExtension
 				.getFhirClient()
 				.operation()
@@ -115,54 +115,47 @@ public class ReindexProviderTest {
 				.execute();
 
 		// Verify
+		ourLog.debug(myCtx.newJsonParser().setPrettyPrint(true).encodeResourceToString(response));
+		StringType jobId = (StringType) response.getParameterValue(ProviderConstants.OPERATION_REINDEX_RESPONSE_JOB_ID);
+		assertEquals(TEST_JOB_ID, jobId.getValue());
+
+		verify(myJobCoordinator, times(1)).startInstance(isNotNull(), myStartRequestCaptor.capture());
+
+		ReindexJobParameters params = myStartRequestCaptor.getValue().getParameters(ReindexJobParameters.class);
+		assertThat(params.getPartitionedUrls().iterator().next()).isEqualTo(partitionedUrl);
+
+		// Non-default values
+		assertEquals(ReindexParameters.ReindexSearchParametersEnum.NONE, params.getReindexSearchParameters());
+		assertFalse(params.getOptimisticLock());
+		assertEquals(ReindexParameters.OptimizeStorageModeEnum.CURRENT_VERSION, params.getOptimizeStorage());
+	}
+
+	@Test
+	public void testReindex_withDefaults() {
+		// setup
+		Parameters input = new Parameters();
+		ourLog.debug(myCtx.newJsonParser().setPrettyPrint(true).encodeResourceToString(input));
+
+		// Execute
+		Parameters response = myServerExtension
+				.getFhirClient()
+				.operation()
+				.onServer()
+				.named(ProviderConstants.OPERATION_REINDEX)
+				.withParameters(input)
+				.execute();
+
+		// Verify
 		ourLog.debug(myCtx.newJsonParser().setPrettyPrint(true).encodeResourceToString(response));
 		StringType jobId = (StringType) response.getParameterValue(ProviderConstants.OPERATION_REINDEX_RESPONSE_JOB_ID);
 		assertEquals(TEST_JOB_ID, jobId.getValue());
 
 		verify(myJobCoordinator, times(1)).startInstance(isNotNull(), myStartRequestCaptor.capture());
 		ReindexJobParameters params = myStartRequestCaptor.getValue().getParameters(ReindexJobParameters.class);
-		assertThat(params.getPartitionedUrls()).hasSize(1);
-		assertEquals(url, params.getPartitionedUrls().get(0).getUrl());
+
 		// Default values
 		assertEquals(ReindexParameters.ReindexSearchParametersEnum.ALL, params.getReindexSearchParameters());
 		assertTrue(params.getOptimisticLock());
 		assertEquals(ReindexParameters.OptimizeStorageModeEnum.NONE, params.getOptimizeStorage());
 	}
-
-	@Test
-	public void testReindex_NoUrl() {
-		// setup
-		Parameters input = new Parameters();
-		input.addParameter(ReindexJobParameters.REINDEX_SEARCH_PARAMETERS, new CodeType("none"));
-		input.addParameter(ReindexJobParameters.OPTIMISTIC_LOCK, new BooleanType(false));
-		input.addParameter(ReindexJobParameters.OPTIMIZE_STORAGE, new CodeType("current_version"));
-
-		ourLog.debug(myCtx.newJsonParser().setPrettyPrint(true).encodeResourceToString(input));
-
-		// Execute
-
-		Parameters response = myServerExtension
-				.getFhirClient()
-				.operation()
-				.onServer()
-				.named(ProviderConstants.OPERATION_REINDEX)
-				.withParameters(input)
-				.execute();
-
-		// Verify
-
-		ourLog.debug(myCtx.newJsonParser().setPrettyPrint(true).encodeResourceToString(response));
-		StringType jobId = (StringType) response.getParameterValue(ProviderConstants.OPERATION_REINDEX_RESPONSE_JOB_ID);
-		assertEquals(TEST_JOB_ID, jobId.getValue());
-
-		verify(myJobCoordinator, times(1)).startInstance(isNotNull(), myStartRequestCaptor.capture());
-		ReindexJobParameters params = myStartRequestCaptor.getValue().getParameters(ReindexJobParameters.class);
-		assertThat(params.getPartitionedUrls()).isEmpty();
-		// Non-default values
-		assertEquals(ReindexParameters.ReindexSearchParametersEnum.NONE, params.getReindexSearchParameters());
-		assertFalse(params.getOptimisticLock());
-		assertEquals(ReindexParameters.OptimizeStorageModeEnum.CURRENT_VERSION, params.getOptimizeStorage());
-
-	}
 }
diff --git a/hapi-fhir-storage-batch2-test-utilities/src/main/java/ca/uhn/hapi/fhir/batch2/test/IJobPartitionProviderTest.java b/hapi-fhir-storage-batch2-test-utilities/src/main/java/ca/uhn/hapi/fhir/batch2/test/IJobPartitionProviderTest.java
new file mode 100644
index 00000000000..3ded0be3bd3
--- /dev/null
+++ b/hapi-fhir-storage-batch2-test-utilities/src/main/java/ca/uhn/hapi/fhir/batch2/test/IJobPartitionProviderTest.java
@@ -0,0 +1,114 @@
+/*-
+ * #%L
+ * HAPI FHIR JPA Server - Batch2 specification tests
+ * %%
+ * Copyright (C) 2014 - 2024 Smile CDR, Inc.
+ * %%
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * #L% + */ +package ca.uhn.hapi.fhir.batch2.test; + +import ca.uhn.fhir.batch2.api.IJobPartitionProvider; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; +import ca.uhn.fhir.context.FhirContext; +import ca.uhn.fhir.interceptor.model.RequestPartitionId; +import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc; +import ca.uhn.fhir.jpa.searchparam.MatchUrlService; +import ca.uhn.fhir.jpa.searchparam.ResourceSearch; +import ca.uhn.fhir.jpa.searchparam.SearchParameterMap; +import ca.uhn.fhir.rest.api.server.RequestDetails; +import ca.uhn.fhir.rest.api.server.SystemRequestDetails; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.junit.jupiter.MockitoExtension; + +import java.util.Collection; +import java.util.LinkedHashSet; +import java.util.List; +import java.util.Set; + +import static org.assertj.core.api.Assertions.assertThat; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; + +@ExtendWith(MockitoExtension.class) +public interface IJobPartitionProviderTest { + FhirContext getFhirContext(); + IRequestPartitionHelperSvc getRequestPartitionHelper(); + IJobPartitionProvider getJobPartitionProvider(); + MatchUrlService getMatchUrlService(); + + @Test + default void getPartitionedUrls_noUrls_returnsCorrectly() { + // setup + SystemRequestDetails requestDetails = new SystemRequestDetails(); + + setupResourceNameUrlWithPartition(requestDetails, "Patient", RequestPartitionId.fromPartitionId(1)); + setupResourceNameUrlWithPartition(requestDetails, 
"Observation", RequestPartitionId.fromPartitionId(2)); + setupResourceNameUrlWithPartition(requestDetails, "Practitioner", null); + setupResourceNameUrlWithPartition(requestDetails, "SearchParameter", RequestPartitionId.defaultPartition()); + + Set<String> resourceTypes = Set.of("Patient", "Observation", "Practitioner", "SearchParameter"); + when(getFhirContext().getResourceTypes()).thenReturn(resourceTypes); + + // execute and verify + List<PartitionedUrl> partitionedUrls = List.of( + new PartitionedUrl().setUrl("Patient?").setRequestPartitionId(RequestPartitionId.fromPartitionId(1)), + new PartitionedUrl().setUrl("Observation?").setRequestPartitionId(RequestPartitionId.fromPartitionId(2)), + new PartitionedUrl().setUrl("Practitioner?"), + new PartitionedUrl().setUrl("SearchParameter?").setRequestPartitionId(RequestPartitionId.defaultPartition())); + + executeAndVerifyGetPartitionedUrls(requestDetails, List.of(), partitionedUrls); + executeAndVerifyGetPartitionedUrls(requestDetails, null, partitionedUrls); + } + + @Test + default void getPartitionedUrls_withUrls_returnsCorrectly() { + // setup + SystemRequestDetails requestDetails = new SystemRequestDetails(); + + setupResourceNameUrlWithPartition(requestDetails, "Patient", RequestPartitionId.fromPartitionId(1)); + setupResourceNameUrlWithPartition(requestDetails, "Observation", RequestPartitionId.allPartitions()); + setupResourceNameUrlWithPartition(requestDetails, "Practitioner", null); + + // execute and verify + List<String> urls = List.of("Patient?", "Observation?", "Practitioner?"); + List<PartitionedUrl> partitionedUrls = List.of( + new PartitionedUrl().setUrl("Patient?").setRequestPartitionId(RequestPartitionId.fromPartitionId(1)), + new PartitionedUrl().setUrl("Observation?").setRequestPartitionId(RequestPartitionId.allPartitions()), + new PartitionedUrl().setUrl("Practitioner?")); + executeAndVerifyGetPartitionedUrls(requestDetails, urls, partitionedUrls); + } + + default void executeAndVerifyGetPartitionedUrls(RequestDetails theRequestDetails, List<String> theUrls, Collection<PartitionedUrl> thePartitionedUrls) { + // test + List<PartitionedUrl> actualPartitionedUrls = getJobPartitionProvider().getPartitionedUrls(theRequestDetails, theUrls); + + // verify + assertThat(actualPartitionedUrls).hasSize(thePartitionedUrls.size()).containsExactlyInAnyOrder(thePartitionedUrls.toArray(new PartitionedUrl[0])); + } + + default void setupResourceNameUrlWithPartition(RequestDetails theRequestDetails, String theResourceName, RequestPartitionId thePartitionId) { + final String url = theResourceName + "?"; + ResourceSearch resourceSearch = mock(ResourceSearch.class); + when(getMatchUrlService().getResourceSearch(url)).thenReturn(resourceSearch); + when(resourceSearch.getResourceName()).thenReturn(theResourceName); + SearchParameterMap searchParameterMap = mock(SearchParameterMap.class); + when(resourceSearch.getSearchParameterMap()).thenReturn(searchParameterMap); + + when(getRequestPartitionHelper().determineReadPartitionForRequestForSearchType(theRequestDetails, theResourceName, searchParameterMap)).thenReturn(thePartitionId); + } + + void setupPartitions(List<RequestPartitionId> thePartitionIds); +} diff --git a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/api/IJobPartitionProvider.java b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/api/IJobPartitionProvider.java index d0765c8809b..2f10b57563d 100644 --- a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/api/IJobPartitionProvider.java +++ b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/api/IJobPartitionProvider.java @@ -19,19 +19,18 @@ */ package ca.uhn.fhir.batch2.api; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; import ca.uhn.fhir.interceptor.model.RequestPartitionId; import ca.uhn.fhir.rest.api.server.RequestDetails; import java.util.List; +import java.util.stream.Collectors; /** - * Provides the list of partitions that a job should run against.
- * TODO MM: Consider moving UrlPartitioner calls to this class once other batch operations need to support running - * across all partitions on a multitenant FHIR server. - * That way all partitioning related logic exists only here for batch jobs. - * After that PartitionedUrl#myRequestPartitionId can be marked as deprecated. + * Provides the list of {@link PartitionedUrl} that a job should run against. */ public interface IJobPartitionProvider { + /** * Provides the list of partitions to run job steps against, based on the request that initiates the job. * @param theRequestDetails the requestDetails @@ -40,5 +39,14 @@ public interface IJobPartitionProvider { */ List<RequestPartitionId> getPartitions(RequestDetails theRequestDetails, String theOperation); - // List getPartitions(RequestDetails theRequestDetails, String theOperation, String theUrls); + /** + * Provides the list of {@link PartitionedUrl} to run job steps against, based on the request that initiates the job + * and the urls that it is configured with. + * @param theRequestDetails the requestDetails + * @param theUrls the urls to run the job against + * @return the list of {@link PartitionedUrl} + */ + default List<PartitionedUrl> getPartitionedUrls(RequestDetails theRequestDetails, List<String> theUrls) { + return theUrls.stream().map(url -> new PartitionedUrl().setUrl(url)).collect(Collectors.toList()); + } } diff --git a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/config/BaseBatch2Config.java b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/config/BaseBatch2Config.java index 2dbc531df48..29580729b67 100644 --- a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/config/BaseBatch2Config.java +++ b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/config/BaseBatch2Config.java @@ -25,17 +25,19 @@ import ca.uhn.fhir.batch2.api.IJobPartitionProvider; import ca.uhn.fhir.batch2.api.IJobPersistence; import ca.uhn.fhir.batch2.api.IReductionStepExecutorService; import ca.uhn.fhir.batch2.channel.BatchJobSender; +import ca.uhn.fhir.batch2.coordinator.DefaultJobPartitionProvider; import ca.uhn.fhir.batch2.coordinator.JobCoordinatorImpl; import ca.uhn.fhir.batch2.coordinator.JobDefinitionRegistry; import ca.uhn.fhir.batch2.coordinator.ReductionStepExecutorServiceImpl; -import ca.uhn.fhir.batch2.coordinator.SimpleJobPartitionProvider; import ca.uhn.fhir.batch2.coordinator.WorkChunkProcessor; import ca.uhn.fhir.batch2.maintenance.JobMaintenanceServiceImpl; import ca.uhn.fhir.batch2.model.JobWorkNotificationJsonMessage; +import ca.uhn.fhir.context.FhirContext; import ca.uhn.fhir.jpa.api.config.JpaStorageSettings; import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService; import ca.uhn.fhir.jpa.model.sched.ISchedulerService; import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc; +import ca.uhn.fhir.jpa.searchparam.MatchUrlService; import ca.uhn.fhir.jpa.subscription.channel.api.ChannelConsumerSettings; import ca.uhn.fhir.jpa.subscription.channel.api.ChannelProducerSettings; import
ca.uhn.fhir.jpa.subscription.channel.api.IChannelFactory; @@ -144,7 +146,10 @@ public abstract class BaseBatch2Config { } @Bean - public IJobPartitionProvider jobPartitionProvider(IRequestPartitionHelperSvc theRequestPartitionHelperSvc) { - return new SimpleJobPartitionProvider(theRequestPartitionHelperSvc); + public IJobPartitionProvider jobPartitionProvider( + FhirContext theFhirContext, + IRequestPartitionHelperSvc theRequestPartitionHelperSvc, + MatchUrlService theMatchUrlService) { + return new DefaultJobPartitionProvider(theFhirContext, theRequestPartitionHelperSvc, theMatchUrlService); } } diff --git a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/coordinator/DefaultJobPartitionProvider.java b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/coordinator/DefaultJobPartitionProvider.java new file mode 100644 index 00000000000..f58688b4c2b --- /dev/null +++ b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/coordinator/DefaultJobPartitionProvider.java @@ -0,0 +1,107 @@ +/*- + * #%L + * HAPI FHIR JPA Server - Batch2 Task Processor + * %% + * Copyright (C) 2014 - 2024 Smile CDR, Inc. + * %% + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * #L% + */ +package ca.uhn.fhir.batch2.coordinator; + +import ca.uhn.fhir.batch2.api.IJobPartitionProvider; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; +import ca.uhn.fhir.context.FhirContext; +import ca.uhn.fhir.interceptor.model.RequestPartitionId; +import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc; +import ca.uhn.fhir.jpa.searchparam.MatchUrlService; +import ca.uhn.fhir.jpa.searchparam.ResourceSearch; +import ca.uhn.fhir.rest.api.server.RequestDetails; + +import java.util.ArrayList; +import java.util.LinkedHashSet; +import java.util.List; +import java.util.Set; +import java.util.stream.Collectors; + +/** + * Default implementation which provides the {@link PartitionedUrl} list for a certain operation request. + */ +public class DefaultJobPartitionProvider implements IJobPartitionProvider { + protected final IRequestPartitionHelperSvc myRequestPartitionHelper; + protected FhirContext myFhirContext; + private MatchUrlService myMatchUrlService; + + public DefaultJobPartitionProvider(IRequestPartitionHelperSvc theRequestPartitionHelperSvc) { + myRequestPartitionHelper = theRequestPartitionHelperSvc; + } + + public DefaultJobPartitionProvider( + FhirContext theFhirContext, + IRequestPartitionHelperSvc theRequestPartitionHelperSvc, + MatchUrlService theMatchUrlService) { + myFhirContext = theFhirContext; + myRequestPartitionHelper = theRequestPartitionHelperSvc; + myMatchUrlService = theMatchUrlService; + } + + public List<RequestPartitionId> getPartitions(RequestDetails theRequestDetails, String theOperation) { + RequestPartitionId partitionId = myRequestPartitionHelper.determineReadPartitionForRequestForServerOperation( + theRequestDetails, theOperation); + return List.of(partitionId); + } + + @Override + public List<PartitionedUrl> getPartitionedUrls(RequestDetails theRequestDetails, List<String> theUrls) { + List<String> urls = theUrls; + + // if the url list is empty, use all the supported resource types to build the url list + // we can go back to the no url scenario if all resource types point to the same partition + if (theUrls == null || theUrls.isEmpty()) { + urls = myFhirContext.getResourceTypes().stream() + .map(resourceType -> resourceType + "?") + .collect(Collectors.toList()); + } + + // determine the partition associated with each of the urls + List<PartitionedUrl> partitionedUrls = new ArrayList<>(); + for (String s : urls) { + ResourceSearch resourceSearch = myMatchUrlService.getResourceSearch(s); + RequestPartitionId partitionId = myRequestPartitionHelper.determineReadPartitionForRequestForSearchType( + theRequestDetails, resourceSearch.getResourceName(), resourceSearch.getSearchParameterMap()); + partitionedUrls.add(new PartitionedUrl().setUrl(s).setRequestPartitionId(partitionId)); + } + + // handle (bulk) system operations that are typically configured with RequestPartitionId.allPartitions() + // populate the actual list of all partitions, if that is supported + Set<RequestPartitionId> allPartitions = new LinkedHashSet<>(getAllPartitions()); + List<PartitionedUrl> retVal = new ArrayList<>(); + for (PartitionedUrl partitionedUrl : partitionedUrls) { + String url = partitionedUrl.getUrl(); + RequestPartitionId partitionId = partitionedUrl.getRequestPartitionId(); + if (partitionId != null && partitionId.isAllPartitions() && !allPartitions.isEmpty()) { + allPartitions.stream() + .map(p -> (new PartitionedUrl().setUrl(url).setRequestPartitionId(p))) + .forEach(retVal::add); + } else { + retVal.add(partitionedUrl); + } + } + + return retVal; + } + + public List<RequestPartitionId> getAllPartitions() { + return List.of(RequestPartitionId.allPartitions()); + } +} diff --git a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/coordinator/SimpleJobPartitionProvider.java b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/coordinator/SimpleJobPartitionProvider.java deleted file mode 100644 index 47e40747393..00000000000 --- a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/coordinator/SimpleJobPartitionProvider.java +++ /dev/null @@ -1,45 +0,0 @@ -/*- - * #%L - * HAPI FHIR JPA Server -
Batch2 Task Processor - * %% - * Copyright (C) 2014 - 2024 Smile CDR, Inc. - * %% - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * #L% - */ -package ca.uhn.fhir.batch2.coordinator; - -import ca.uhn.fhir.batch2.api.IJobPartitionProvider; -import ca.uhn.fhir.interceptor.model.RequestPartitionId; -import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc; -import ca.uhn.fhir.rest.api.server.RequestDetails; - -import java.util.List; - -/** - * Basic implementation which provides the partition list for a certain request which is composed of a single partition. 
- */ -public class SimpleJobPartitionProvider implements IJobPartitionProvider { - protected final IRequestPartitionHelperSvc myRequestPartitionHelperSvc; - - public SimpleJobPartitionProvider(IRequestPartitionHelperSvc theRequestPartitionHelperSvc) { - myRequestPartitionHelperSvc = theRequestPartitionHelperSvc; - } - - @Override - public List<RequestPartitionId> getPartitions(RequestDetails theRequestDetails, String theOperation) { - RequestPartitionId partitionId = myRequestPartitionHelperSvc.determineReadPartitionForRequestForServerOperation( - theRequestDetails, theOperation); - return List.of(partitionId); - } -} diff --git a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/IUrlListValidator.java b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/IUrlListValidator.java index 82f02cda664..ded4045f598 100644 --- a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/IUrlListValidator.java +++ b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/IUrlListValidator.java @@ -27,7 +27,4 @@ import java.util.List; public interface IUrlListValidator { @Nullable List<String> validateUrls(@Nonnull List<String> theUrls); - - @Nullable - List<String> validatePartitionedUrls(@Nonnull List<PartitionedUrl> thePartitionedUrls); } diff --git a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/PartitionedUrl.java b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/PartitionedUrl.java index b29f215344e..0343a2b582d 100644 --- a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/PartitionedUrl.java +++ b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/PartitionedUrl.java @@ -23,26 +23,27 @@ import ca.uhn.fhir.interceptor.model.RequestPartitionId; import ca.uhn.fhir.model.api.IModelJson; import com.fasterxml.jackson.annotation.JsonProperty; import jakarta.validation.constraints.Pattern; +import org.apache.commons.lang3.builder.EqualsBuilder; +import
org.apache.commons.lang3.builder.HashCodeBuilder; import org.apache.commons.lang3.builder.ToStringBuilder; import org.apache.commons.lang3.builder.ToStringStyle; +/** + * Represents the pair of partition and (search) url, which can be used to configure batch2 jobs. + * It will be used to determine which FHIR resources are selected for the job. + * Please note that the url is a partial url, which means it does not include server base and tenantId, + * and it starts with the resource type. + * e.g. Patient?, Observation?status=final + */ public class PartitionedUrl implements IModelJson { - @Override - public String toString() { - ToStringBuilder b = new ToStringBuilder(this, ToStringStyle.SHORT_PREFIX_STYLE); - b.append("partition", myRequestPartitionId); - b.append("myUrl", myUrl); - return b.toString(); - } - @JsonProperty("url") @Pattern( regexp = "^[A-Z][A-Za-z0-9]+\\?.*", message = "If populated, URL must be a search URL in the form '{resourceType}?[params]'") - String myUrl; + private String myUrl; @JsonProperty("requestPartitionId") - RequestPartitionId myRequestPartitionId; + private RequestPartitionId myRequestPartitionId; public String getUrl() { return myUrl; @@ -61,4 +62,35 @@ public class PartitionedUrl implements IModelJson { myRequestPartitionId = theRequestPartitionId; return this; } + + @Override + public String toString() { + ToStringBuilder b = new ToStringBuilder(this, ToStringStyle.SHORT_PREFIX_STYLE); + b.append("myUrl", myUrl); + b.append("myRequestPartitionId", myRequestPartitionId); + return b.toString(); + } + + @Override + public boolean equals(Object obj) { + if (this == obj) { + return true; + } + if (!(obj instanceof PartitionedUrl)) { + return false; + } + PartitionedUrl other = (PartitionedUrl) obj; + EqualsBuilder b = new EqualsBuilder(); + b.append(myUrl, other.myUrl); + b.append(myRequestPartitionId, other.myRequestPartitionId); + return b.isEquals(); + } + + @Override + public int hashCode() { + HashCodeBuilder b = new
HashCodeBuilder(); + b.append(myRequestPartitionId); + b.append(myUrl); + return b.hashCode(); + } } diff --git a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/JobParameters.java b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/PartitionedUrlJobParameters.java similarity index 57% rename from hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/JobParameters.java rename to hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/PartitionedUrlJobParameters.java index 6334cb4e803..3622198ddec 100644 --- a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/JobParameters.java +++ b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/PartitionedUrlJobParameters.java @@ -22,20 +22,22 @@ package ca.uhn.fhir.batch2.jobs.parameters; import ca.uhn.fhir.interceptor.model.RequestPartitionId; import ca.uhn.fhir.model.api.IModelJson; import com.fasterxml.jackson.annotation.JsonProperty; -import com.google.common.annotations.VisibleForTesting; import jakarta.annotation.Nonnull; import jakarta.annotation.Nullable; +import org.apache.commons.lang3.StringUtils; import java.util.ArrayList; import java.util.List; +import java.util.stream.Collectors; /** * Can be used to configure parameters for batch2 jobs. * Please note that these need to be backward compatible as we do not have a way to migrate them to a different structure at the moment. 
*/ -public class JobParameters implements IModelJson { +public class PartitionedUrlJobParameters implements IModelJson { @JsonProperty(value = "partitionId") - private List<RequestPartitionId> myRequestPartitionIds; + @Nullable + private RequestPartitionId myRequestPartitionId; @JsonProperty("batchSize") private Integer myBatchSize; @@ -44,31 +46,12 @@ public class JobParameters implements IModelJson { private List<PartitionedUrl> myPartitionedUrls; public void setRequestPartitionId(@Nullable RequestPartitionId theRequestPartitionId) { - if (theRequestPartitionId != null) { - myRequestPartitionIds = List.of(theRequestPartitionId); - } + myRequestPartitionId = theRequestPartitionId; } @Nullable public RequestPartitionId getRequestPartitionId() { - return getFirstRequestPartitionIdOrNull(); - } - - @Nullable - private RequestPartitionId getFirstRequestPartitionIdOrNull() { - return myRequestPartitionIds == null || myRequestPartitionIds.isEmpty() ? null : myRequestPartitionIds.get(0); - } - - @Nonnull - public List<RequestPartitionId> getRequestPartitionIds() { - if (myRequestPartitionIds == null) { - myRequestPartitionIds = new ArrayList<>(); - } - return myRequestPartitionIds; - } - - public void addRequestPartitionId(RequestPartitionId theRequestPartitionId) { - getRequestPartitionIds().add(theRequestPartitionId); + return myRequestPartitionId; } public void setBatchSize(int theBatchSize) { @@ -84,6 +67,10 @@ public class JobParameters implements IModelJson { if (myPartitionedUrls == null) { myPartitionedUrls = new ArrayList<>(); } + // TODO MM: added for backward compatibility, it can be removed once requestPartitionId is deprecated + myPartitionedUrls.stream() + .filter(thePartitionedUrl -> thePartitionedUrl.getRequestPartitionId() == null) + .forEach(thePartitionedUrl -> thePartitionedUrl.setRequestPartitionId(myRequestPartitionId)); return myPartitionedUrls; } @@ -95,22 +82,10 @@ public class JobParameters implements IModelJson { getPartitionedUrls().add(new PartitionedUrl().setUrl(theUrl)); } - @VisibleForTesting -
public static JobParameters from( - List<String> theUrls, List<RequestPartitionId> thePartitions, boolean theShouldAssignPartitionToUrl) { - JobParameters parameters = new JobParameters(); - if (theShouldAssignPartitionToUrl) { - assert theUrls.size() == thePartitions.size(); - for (int i = 0; i < theUrls.size(); i++) { - PartitionedUrl partitionedUrl = new PartitionedUrl(); - partitionedUrl.setUrl(theUrls.get(i)); - partitionedUrl.setRequestPartitionId(thePartitions.get(i)); - parameters.addPartitionedUrl(partitionedUrl); - } - } else { - theUrls.forEach(url -> parameters.addPartitionedUrl(new PartitionedUrl().setUrl(url))); - thePartitions.forEach(parameters::addRequestPartitionId); - } - return parameters; + public List<String> getUrls() { + return getPartitionedUrls().stream() + .map(PartitionedUrl::getUrl) + .filter(url -> !StringUtils.isBlank(url)) + .collect(Collectors.toList()); + } } diff --git a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/UrlListValidator.java b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/UrlListValidator.java index 61bf7018886..557018526bb 100644 --- a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/UrlListValidator.java +++ b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/parameters/UrlListValidator.java @@ -25,7 +25,6 @@ import jakarta.annotation.Nullable; import java.util.Collections; import java.util.List; -import java.util.stream.Collectors; public class UrlListValidator implements IUrlListValidator { private final String myOperationName; @@ -39,20 +38,10 @@ public class UrlListValidator implements IUrlListValidator { @Nullable @Override public List<String> validateUrls(@Nonnull List<String> theUrls) { - if (theUrls.isEmpty()) { - if (!myBatch2DaoSvc.isAllResourceTypeSupported()) { - return Collections.singletonList("At least one type-specific search URL must be provided for " - + myOperationName + " on this server"); - } + if (theUrls.isEmpty() &&
!myBatch2DaoSvc.isAllResourceTypeSupported()) { + return Collections.singletonList("At least one type-specific search URL must be provided for " + + myOperationName + " on this server"); } return Collections.emptyList(); } - - @Nullable - @Override - public List<String> validatePartitionedUrls(@Nonnull List<PartitionedUrl> thePartitionedUrls) { - List<String> urls = - thePartitionedUrls.stream().map(PartitionedUrl::getUrl).collect(Collectors.toList()); - return validateUrls(urls); - } } diff --git a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/GenerateRangeChunksStep.java b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/GenerateRangeChunksStep.java index 5448fa959b5..65f41388f1c 100644 --- a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/GenerateRangeChunksStep.java +++ b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/GenerateRangeChunksStep.java @@ -26,12 +26,10 @@ import ca.uhn.fhir.batch2.api.RunOutcome; import ca.uhn.fhir.batch2.api.StepExecutionDetails; import ca.uhn.fhir.batch2.api.VoidModel; import ca.uhn.fhir.batch2.jobs.chunk.ChunkRangeJson; -import ca.uhn.fhir.batch2.jobs.parameters.JobParameters; import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; -import ca.uhn.fhir.interceptor.model.RequestPartitionId; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlJobParameters; import ca.uhn.fhir.util.Logs; import jakarta.annotation.Nonnull; -import jakarta.annotation.Nullable; import org.slf4j.Logger; import org.thymeleaf.util.StringUtils; @@ -40,7 +38,8 @@ import java.util.List; import static ca.uhn.fhir.batch2.util.Batch2Utils.BATCH_START_DATE; -public class GenerateRangeChunksStep<PT extends JobParameters> implements IFirstJobStepWorker<PT, ChunkRangeJson> { +public class GenerateRangeChunksStep<PT extends PartitionedUrlJobParameters> + implements IFirstJobStepWorker<PT, ChunkRangeJson> { private static final Logger ourLog = Logs.getBatchTroubleshootingLog(); @Nonnull @@ -54,69 +53,26 @@ public class GenerateRangeChunksStep implements IFirst Date start = BATCH_START_DATE; Date end = new Date(); -
// there are partitions configured in either of the following lists, which are both optional - // the following code considers all use-cases - // the logic can be simplified once PartitionedUrl.myRequestPartitionId is deprecated - // @see IJobPartitionProvider - - List<RequestPartitionId> partitionIds = params.getRequestPartitionIds(); List<PartitionedUrl> partitionedUrls = params.getPartitionedUrls(); - if (partitionIds.isEmpty()) { - if (partitionedUrls.isEmpty()) { - ChunkRangeJson chunkRangeJson = new ChunkRangeJson(start, end); - sendChunk(chunkRangeJson, theDataSink); - return RunOutcome.SUCCESS; - } + if (!partitionedUrls.isEmpty()) { partitionedUrls.forEach(partitionedUrl -> { - String url = partitionedUrl.getUrl(); - RequestPartitionId partitionId = partitionedUrl.getRequestPartitionId(); - ChunkRangeJson chunkRangeJson = - new ChunkRangeJson(start, end).setUrl(url).setPartitionId(partitionId); + ChunkRangeJson chunkRangeJson = new ChunkRangeJson(start, end) + .setUrl(partitionedUrl.getUrl()) + .setPartitionId(partitionedUrl.getRequestPartitionId()); sendChunk(chunkRangeJson, theDataSink); }); return RunOutcome.SUCCESS; } - partitionIds.forEach(partitionId -> { - if (partitionedUrls.isEmpty()) { - ChunkRangeJson chunkRangeJson = new ChunkRangeJson(start, end).setPartitionId(partitionId); - sendChunk(chunkRangeJson, theDataSink); - return; - } - partitionedUrls.forEach(partitionedUrl -> { - String url = partitionedUrl.getUrl(); - RequestPartitionId urlPartitionId = partitionedUrl.getRequestPartitionId(); - RequestPartitionId narrowPartitionId = determineNarrowPartitionId(partitionId, urlPartitionId); - ChunkRangeJson chunkRangeJson = - new ChunkRangeJson(start, end).setUrl(url).setPartitionId(narrowPartitionId); - sendChunk(chunkRangeJson, theDataSink); - }); - }); - + ChunkRangeJson chunkRangeJson = new ChunkRangeJson(start, end); + sendChunk(chunkRangeJson, theDataSink); return RunOutcome.SUCCESS; } - private RequestPartitionId determineNarrowPartitionId( - @Nonnull RequestPartitionId
theRequestPartitionId, - @Nullable RequestPartitionId theOtherRequestPartitionId) { - if (theOtherRequestPartitionId == null) { - return theRequestPartitionId; - } - if (theRequestPartitionId.isAllPartitions() && !theOtherRequestPartitionId.isAllPartitions()) { - return theOtherRequestPartitionId; - } - if (theRequestPartitionId.isDefaultPartition() - && !theOtherRequestPartitionId.isDefaultPartition() - && !theOtherRequestPartitionId.isAllPartitions()) { - return theOtherRequestPartitionId; - } - return theRequestPartitionId; - } - private void sendChunk(ChunkRangeJson theData, IJobDataSink theDataSink) { String url = theData.getUrl(); - ourLog.info( + ourLog.trace( "Creating chunks for [{}] from {} to {} for partition {}", !StringUtils.isEmpty(url) ? url : "everything", theData.getStart(), diff --git a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/LoadIdsStep.java b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/LoadIdsStep.java index 7e01f6124d4..feee4e144e7 100644 --- a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/LoadIdsStep.java +++ b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/LoadIdsStep.java @@ -26,11 +26,11 @@ import ca.uhn.fhir.batch2.api.RunOutcome; import ca.uhn.fhir.batch2.api.StepExecutionDetails; import ca.uhn.fhir.batch2.jobs.chunk.ChunkRangeJson; import ca.uhn.fhir.batch2.jobs.chunk.ResourceIdListWorkChunkJson; -import ca.uhn.fhir.batch2.jobs.parameters.JobParameters; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlJobParameters; import ca.uhn.fhir.jpa.api.svc.IBatch2DaoSvc; import jakarta.annotation.Nonnull; -public class LoadIdsStep +public class LoadIdsStep implements IJobStepWorker { private final ResourceIdListStep myResourceIdListStep; diff --git a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/ResourceIdListStep.java b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/ResourceIdListStep.java index 
10349aa4e47..958fb6d8a34 100644 --- a/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/ResourceIdListStep.java +++ b/hapi-fhir-storage-batch2/src/main/java/ca/uhn/fhir/batch2/jobs/step/ResourceIdListStep.java @@ -27,7 +27,7 @@ import ca.uhn.fhir.batch2.api.StepExecutionDetails; import ca.uhn.fhir.batch2.jobs.chunk.ChunkRangeJson; import ca.uhn.fhir.batch2.jobs.chunk.ResourceIdListWorkChunkJson; import ca.uhn.fhir.batch2.jobs.chunk.TypedPidJson; -import ca.uhn.fhir.batch2.jobs.parameters.JobParameters; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlJobParameters; import ca.uhn.fhir.interceptor.model.RequestPartitionId; import ca.uhn.fhir.jpa.api.pid.IResourcePidStream; import ca.uhn.fhir.util.Logs; @@ -42,7 +42,7 @@ import java.util.stream.Stream; import static ca.uhn.fhir.util.StreamUtil.partition; import static org.apache.commons.lang3.ObjectUtils.defaultIfNull; -public class ResourceIdListStep +public class ResourceIdListStep implements IJobStepWorker { private static final Logger ourLog = Logs.getBatchTroubleshootingLog(); @@ -66,7 +66,12 @@ public class ResourceIdListStep Date end = data.getEnd(); Integer batchSize = theStepExecutionDetails.getParameters().getBatchSize(); - ourLog.info("Beginning to submit chunks in range {} to {}", start, end); + ourLog.trace( + "Beginning to submit chunks in range {} to {} for url {} and partitionId {}", + start, + end, + data.getUrl(), + data.getPartitionId()); int chunkSize = Math.min(defaultIfNull(batchSize, MAX_BATCH_OF_IDS), MAX_BATCH_OF_IDS); final IResourcePidStream searchResult = @@ -84,7 +89,12 @@ public class ResourceIdListStep chunkCount.getAndIncrement(); submitWorkChunk(idBatch, searchResult.getRequestPartitionId(), theDataSink); }); - ourLog.info("Submitted {} chunks with {} resource IDs", chunkCount, totalIdsFound); + ourLog.trace( + "Submitted {} chunks with {} resource IDs for url {} and partitionId {}", + chunkCount, + totalIdsFound, + data.getUrl(), + data.getPartitionId()); }); 
return RunOutcome.SUCCESS; @@ -97,9 +107,9 @@ public class ResourceIdListStep if (theTypedPids.isEmpty()) { return; } - ourLog.info("Submitting work chunk in partition {} with {} IDs", theRequestPartitionId, theTypedPids.size()); + ourLog.trace("Submitting work chunk in partition {} with {} IDs", theRequestPartitionId, theTypedPids.size()); ResourceIdListWorkChunkJson data = new ResourceIdListWorkChunkJson(theTypedPids, theRequestPartitionId); - ourLog.debug("IDs are: {}", data); + ourLog.trace("IDs are: {}", data); theDataSink.accept(data); } } diff --git a/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/coordinator/SimpleJobPartitionProviderTest.java b/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/coordinator/SimpleJobPartitionProviderTest.java deleted file mode 100644 index df4b716db54..00000000000 --- a/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/coordinator/SimpleJobPartitionProviderTest.java +++ /dev/null @@ -1,43 +0,0 @@ -package ca.uhn.fhir.batch2.coordinator; - -import ca.uhn.fhir.interceptor.model.RequestPartitionId; -import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc; -import ca.uhn.fhir.rest.api.server.SystemRequestDetails; -import ca.uhn.fhir.rest.server.provider.ProviderConstants; -import org.assertj.core.api.Assertions; -import org.junit.jupiter.api.Test; -import org.junit.jupiter.api.extension.ExtendWith; -import org.mockito.ArgumentMatchers; -import org.mockito.InjectMocks; -import org.mockito.Mock; -import org.mockito.junit.jupiter.MockitoExtension; - -import java.util.List; - -import static org.mockito.Mockito.when; - -@ExtendWith(MockitoExtension.class) -public class SimpleJobPartitionProviderTest { - @Mock - private IRequestPartitionHelperSvc myRequestPartitionHelperSvc; - @InjectMocks - private SimpleJobPartitionProvider myJobPartitionProvider; - - @Test - public void getPartitions_requestSpecificPartition_returnsPartition() { - // setup - SystemRequestDetails requestDetails = new 
SystemRequestDetails(); - String operation = ProviderConstants.OPERATION_EXPORT; - - RequestPartitionId partitionId = RequestPartitionId.fromPartitionId(1); - when(myRequestPartitionHelperSvc.determineReadPartitionForRequestForServerOperation(ArgumentMatchers.eq(requestDetails), ArgumentMatchers.eq(operation))).thenReturn(partitionId); - - // test - List partitionIds = myJobPartitionProvider.getPartitions(requestDetails, operation); - - // verify - Assertions.assertThat(partitionIds).hasSize(1); - Assertions.assertThat(partitionIds).containsExactlyInAnyOrder(partitionId); - } - -} \ No newline at end of file diff --git a/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/GenerateRangeChunksStepTest.java b/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/GenerateRangeChunksStepTest.java index f9a0b0b811c..b2097d9c43d 100644 --- a/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/GenerateRangeChunksStepTest.java +++ b/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/GenerateRangeChunksStepTest.java @@ -4,7 +4,7 @@ import ca.uhn.fhir.batch2.api.IJobDataSink; import ca.uhn.fhir.batch2.api.StepExecutionDetails; import ca.uhn.fhir.batch2.api.VoidModel; import ca.uhn.fhir.batch2.jobs.chunk.ChunkRangeJson; -import ca.uhn.fhir.batch2.jobs.parameters.JobParameters; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlJobParameters; import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl; import ca.uhn.fhir.interceptor.model.RequestPartitionId; import org.junit.jupiter.api.AfterEach; @@ -17,12 +17,9 @@ import org.mockito.ArgumentCaptor; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; -import java.util.ArrayList; -import java.util.Date; import java.util.List; import java.util.stream.Stream; -import static ca.uhn.fhir.batch2.util.Batch2Utils.BATCH_START_DATE; import static org.assertj.core.api.Assertions.assertThat; import static org.mockito.Mockito.times; import static 
org.mockito.Mockito.verify; @@ -31,13 +28,11 @@ import static org.mockito.Mockito.when; @ExtendWith(MockitoExtension.class) public class GenerateRangeChunksStepTest { - private final GenerateRangeChunksStep myStep = new GenerateRangeChunksStep<>(); + private final GenerateRangeChunksStep myStep = new GenerateRangeChunksStep<>(); @Mock - private StepExecutionDetails myStepExecutionDetails; + private StepExecutionDetails myStepExecutionDetails; @Mock private IJobDataSink myJobDataSink; - private static final Date START = BATCH_START_DATE; - private static final Date END = new Date(); @BeforeEach void setUp() { @@ -48,93 +43,46 @@ public class GenerateRangeChunksStepTest { } public static Stream getReindexParameters() { - List threePartitions = List.of( - RequestPartitionId.fromPartitionId(1), - RequestPartitionId.fromPartitionId(2), - RequestPartitionId.fromPartitionId(3) - ); - List partition1 = List.of(RequestPartitionId.fromPartitionId(1)); + RequestPartitionId partition1 = RequestPartitionId.fromPartitionId(1); + RequestPartitionId partition2 = RequestPartitionId.fromPartitionId(2); - // the actual values (URLs, partitionId) don't matter, but we add values similar to real hapi-fhir use-cases return Stream.of( - Arguments.of(List.of(), List.of(), false, 1), - Arguments.of(List.of(), partition1, false, 1), - Arguments.of(List.of("Observation?"), threePartitions, false, 3), - Arguments.of(List.of("Observation?"), List.of(), false, 1), - Arguments.of(List.of("Observation?"), partition1, true, 1), - Arguments.of(List.of("Observation?", "Patient?"), threePartitions, false, 6), - Arguments.of(List.of("Observation?", "Patient?", "Practitioner?"), threePartitions, true, 3), - Arguments.of(List.of("Observation?status=final", "Patient?"), partition1, false, 2), - Arguments.of(List.of("Observation?status=final"), threePartitions, false, 3) + Arguments.of(List.of()), + Arguments.of(List.of(new PartitionedUrl())), + Arguments.of(List.of(new 
PartitionedUrl().setUrl("url").setRequestPartitionId(partition1)), + Arguments.of(List.of( + new PartitionedUrl().setUrl("url1").setRequestPartitionId(partition1), + new PartitionedUrl().setUrl("url2").setRequestPartitionId(partition2))) + ) ); } @ParameterizedTest @MethodSource(value = "getReindexParameters") - public void run_withParameters_producesExpectedChunks(List theUrls, List thePartitions, - boolean theShouldAssignPartitionToUrl, int theExpectedChunkCount) { - JobParameters parameters = JobParameters.from(theUrls, thePartitions, theShouldAssignPartitionToUrl); + public void run_withParameters_producesExpectedChunks(List thePartitionedUrls) { + PartitionedUrlJobParameters parameters = new PartitionedUrlJobParameters(); + thePartitionedUrls.forEach(parameters::addPartitionedUrl); when(myStepExecutionDetails.getParameters()).thenReturn(parameters); myStep.run(myStepExecutionDetails, myJobDataSink); ArgumentCaptor captor = ArgumentCaptor.forClass(ChunkRangeJson.class); - verify(myJobDataSink, times(theExpectedChunkCount)).accept(captor.capture()); + int expectedChunkCount = !thePartitionedUrls.isEmpty() ? 
thePartitionedUrls.size() : 1; + verify(myJobDataSink, times(expectedChunkCount)).accept(captor.capture()); - List chunkRangeJsonList = getExpectedChunkList(theUrls, thePartitions, theShouldAssignPartitionToUrl, theExpectedChunkCount); - - RequestPartitionId[] actualPartitionIds = captor.getAllValues().stream().map(ChunkRangeJson::getPartitionId).toList().toArray(new RequestPartitionId[0]); - RequestPartitionId[] expectedPartitionIds = chunkRangeJsonList.stream().map(ChunkRangeJson::getPartitionId).toList().toArray(new RequestPartitionId[0]); - assertThat(actualPartitionIds).containsExactlyInAnyOrder(expectedPartitionIds); - - String[] actualUrls = captor.getAllValues().stream().map(ChunkRangeJson::getUrl).toList().toArray(new String[0]); - String[] expectedUrls = chunkRangeJsonList.stream().map(ChunkRangeJson::getUrl).toList().toArray(new String[0]); - assertThat(actualUrls).containsExactlyInAnyOrder(expectedUrls); - } - - private List getExpectedChunkList(List theUrls, List thePartitions, - boolean theShouldAssignPartitionToUrl, int theExpectedChunkCount) { - List chunkRangeJsonList = new ArrayList<>(); - if (theShouldAssignPartitionToUrl) { - for (int i = 0; i < theExpectedChunkCount; i++) { - String url = theUrls.get(i); - RequestPartitionId partition = thePartitions.get(i); - ChunkRangeJson chunkRangeJson = new ChunkRangeJson(START, END).setUrl(url).setPartitionId(partition); - chunkRangeJsonList.add(chunkRangeJson); + if (thePartitionedUrls.isEmpty()) { + ChunkRangeJson chunkRangeJson = captor.getValue(); + assertThat(chunkRangeJson.getUrl()).isNull(); + assertThat(chunkRangeJson.getPartitionId()).isNull(); + } else { + List chunks = captor.getAllValues(); + assertThat(chunks).hasSize(thePartitionedUrls.size()); + for (int i = 0; i < thePartitionedUrls.size(); i++) { + PartitionedUrl partitionedUrl = thePartitionedUrls.get(i); + ChunkRangeJson chunkRangeJson = captor.getAllValues().get(i); + 
assertThat(chunkRangeJson.getUrl()).isEqualTo(partitionedUrl.getUrl()); + assertThat(chunkRangeJson.getPartitionId()).isEqualTo(partitionedUrl.getRequestPartitionId()); } - return chunkRangeJsonList; } - - if (theUrls.isEmpty() && thePartitions.isEmpty()) { - ChunkRangeJson chunkRangeJson = new ChunkRangeJson(START, END); - chunkRangeJsonList.add(chunkRangeJson); - return chunkRangeJsonList; - } - - - if (theUrls.isEmpty()) { - for (RequestPartitionId partition : thePartitions) { - ChunkRangeJson chunkRangeJson = new ChunkRangeJson(START, END).setPartitionId(partition); - chunkRangeJsonList.add(chunkRangeJson); - } - return chunkRangeJsonList; - } - - if (thePartitions.isEmpty()) { - for (String url : theUrls) { - ChunkRangeJson chunkRangeJson = new ChunkRangeJson(START, END).setUrl(url); - chunkRangeJsonList.add(chunkRangeJson); - } - return chunkRangeJsonList; - } - - theUrls.forEach(url -> { - for (RequestPartitionId partition : thePartitions) { - ChunkRangeJson chunkRangeJson = new ChunkRangeJson(START, END).setUrl(url).setPartitionId(partition); - chunkRangeJsonList.add(chunkRangeJson); - } - }); - - return chunkRangeJsonList; } } \ No newline at end of file diff --git a/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/LoadIdsStepTest.java b/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/LoadIdsStepTest.java index 01ac420ed13..45d15bab35f 100644 --- a/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/LoadIdsStepTest.java +++ b/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/LoadIdsStepTest.java @@ -4,7 +4,7 @@ import ca.uhn.fhir.batch2.api.IJobDataSink; import ca.uhn.fhir.batch2.api.StepExecutionDetails; import ca.uhn.fhir.batch2.jobs.chunk.ChunkRangeJson; import ca.uhn.fhir.batch2.jobs.chunk.ResourceIdListWorkChunkJson; -import ca.uhn.fhir.batch2.jobs.parameters.JobParameters; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlJobParameters; import 
ca.uhn.fhir.batch2.model.JobInstance; import ca.uhn.fhir.batch2.model.WorkChunk; import ca.uhn.fhir.jpa.api.pid.HomogeneousResourcePidList; @@ -48,7 +48,7 @@ public class LoadIdsStepTest { @Mock private IJobDataSink mySink; - private LoadIdsStep mySvc; + private LoadIdsStep mySvc; @BeforeEach public void before() { @@ -60,12 +60,12 @@ public class LoadIdsStepTest { @Test public void testGenerateSteps() { - JobParameters parameters = new JobParameters(); + PartitionedUrlJobParameters parameters = new PartitionedUrlJobParameters(); ChunkRangeJson range = new ChunkRangeJson(DATE_1, DATE_END); String instanceId = "instance-id"; JobInstance jobInstance = JobInstance.fromInstanceId(instanceId); String chunkId = "chunk-id"; - StepExecutionDetails details = new StepExecutionDetails<>(parameters, range, jobInstance, new WorkChunk().setId(chunkId)); + StepExecutionDetails details = new StepExecutionDetails<>(parameters, range, jobInstance, new WorkChunk().setId(chunkId)); // First Execution diff --git a/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/ResourceIdListStepTest.java b/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/ResourceIdListStepTest.java index e76220f4ec2..0069ef71859 100644 --- a/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/ResourceIdListStepTest.java +++ b/hapi-fhir-storage-batch2/src/test/java/ca/uhn/fhir/batch2/jobs/step/ResourceIdListStepTest.java @@ -5,7 +5,7 @@ import ca.uhn.fhir.batch2.api.RunOutcome; import ca.uhn.fhir.batch2.api.StepExecutionDetails; import ca.uhn.fhir.batch2.jobs.chunk.ChunkRangeJson; import ca.uhn.fhir.batch2.jobs.chunk.ResourceIdListWorkChunkJson; -import ca.uhn.fhir.batch2.jobs.parameters.JobParameters; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlJobParameters; import ca.uhn.fhir.interceptor.model.RequestPartitionId; import ca.uhn.fhir.jpa.api.pid.HomogeneousResourcePidList; import ca.uhn.fhir.jpa.api.pid.IResourcePidStream; @@ -38,18 +38,18 @@ class 
ResourceIdListStepTest { @Mock private IIdChunkProducer myIdChunkProducer; @Mock - private StepExecutionDetails myStepExecutionDetails; + private StepExecutionDetails myStepExecutionDetails; @Mock private IJobDataSink myDataSink; @Mock private ChunkRangeJson myData; @Mock - private JobParameters myParameters; + private PartitionedUrlJobParameters myParameters; @Captor private ArgumentCaptor myDataCaptor; - private ResourceIdListStep myResourceIdListStep; + private ResourceIdListStep myResourceIdListStep; @BeforeEach void beforeEach() { diff --git a/hapi-fhir-storage-mdm/src/main/java/ca/uhn/fhir/mdm/batch2/clear/MdmClearJobParameters.java b/hapi-fhir-storage-mdm/src/main/java/ca/uhn/fhir/mdm/batch2/clear/MdmClearJobParameters.java index 8018b68679d..c9dee9b57a6 100644 --- a/hapi-fhir-storage-mdm/src/main/java/ca/uhn/fhir/mdm/batch2/clear/MdmClearJobParameters.java +++ b/hapi-fhir-storage-mdm/src/main/java/ca/uhn/fhir/mdm/batch2/clear/MdmClearJobParameters.java @@ -19,7 +19,7 @@ */ package ca.uhn.fhir.mdm.batch2.clear; -import ca.uhn.fhir.batch2.jobs.parameters.JobParameters; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlJobParameters; import com.fasterxml.jackson.annotation.JsonProperty; import jakarta.annotation.Nonnull; import jakarta.validation.constraints.Pattern; @@ -28,7 +28,7 @@ import org.apache.commons.lang3.Validate; import java.util.ArrayList; import java.util.List; -public class MdmClearJobParameters extends JobParameters { +public class MdmClearJobParameters extends PartitionedUrlJobParameters { @JsonProperty("resourceType") @Nonnull private List<@Pattern(regexp = "^[A-Z][A-Za-z]+$", message = "If populated, must be a valid resource type'") String> diff --git a/hapi-fhir-storage-mdm/src/main/java/ca/uhn/fhir/mdm/batch2/submit/MdmSubmitJobParameters.java b/hapi-fhir-storage-mdm/src/main/java/ca/uhn/fhir/mdm/batch2/submit/MdmSubmitJobParameters.java index 8dd0ec6bb2c..6a819aafb36 100644 --- 
a/hapi-fhir-storage-mdm/src/main/java/ca/uhn/fhir/mdm/batch2/submit/MdmSubmitJobParameters.java +++ b/hapi-fhir-storage-mdm/src/main/java/ca/uhn/fhir/mdm/batch2/submit/MdmSubmitJobParameters.java @@ -19,6 +19,6 @@ */ package ca.uhn.fhir.mdm.batch2.submit; -import ca.uhn.fhir.batch2.jobs.parameters.JobParameters; +import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrlJobParameters; -public class MdmSubmitJobParameters extends JobParameters {} +public class MdmSubmitJobParameters extends PartitionedUrlJobParameters {} diff --git a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/binary/api/StoredDetails.java b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/binary/api/StoredDetails.java index 14b4b583a2e..45b1ca0144f 100644 --- a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/binary/api/StoredDetails.java +++ b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/binary/api/StoredDetails.java @@ -33,9 +33,22 @@ import java.util.Date; public class StoredDetails implements IModelJson { - @JsonProperty("binaryContentId") + @JsonProperty(value = "binaryContentId") private String myBinaryContentId; + /** + * This field exists to fix a break caused by renaming this property. + * In 7.2.0 we went from blobId to binaryContentId. However, this did not consider installations using filesystem + * mode storage, in which the data on disk was not updated and needed to be serialized/deserialized at runtime. + * Existing stored details used `blobId`. This causes Jackson deserialization failures that are tough to recover + * from without manually modifying all those stored details + * on disk. + * This field is a relic to support old blobs post-upgrade to 7.2.0. It is never surfaced to the user, and is proxied + * into `myBinaryContentId` when needed. 
+ */ + @JsonProperty(value = "blobId") + private String myBlobId; + @JsonProperty("bytes") private long myBytes; @@ -77,7 +90,7 @@ public class StoredDetails implements IModelJson { @Override public String toString() { return new ToStringBuilder(this) - .append("binaryContentId", myBinaryContentId) + .append("binaryContentId", getBinaryContentId()) .append("bytes", myBytes) .append("contentType", myContentType) .append("hash", myHash) @@ -115,7 +128,11 @@ public class StoredDetails implements IModelJson { @Nonnull public String getBinaryContentId() { - return myBinaryContentId; + if (myBinaryContentId == null && myBlobId != null) { + return myBlobId; + } else { + return myBinaryContentId; + } } public StoredDetails setBinaryContentId(String theBinaryContentId) { diff --git a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/BaseTransactionProcessor.java b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/BaseTransactionProcessor.java index c8a18f9c7e3..ec421c5affb 100644 --- a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/BaseTransactionProcessor.java +++ b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/BaseTransactionProcessor.java @@ -26,6 +26,7 @@ import ca.uhn.fhir.i18n.Msg; import ca.uhn.fhir.interceptor.api.HookParams; import ca.uhn.fhir.interceptor.api.IInterceptorBroadcaster; import ca.uhn.fhir.interceptor.api.Pointcut; +import ca.uhn.fhir.interceptor.model.ReadPartitionIdRequestDetails; import ca.uhn.fhir.interceptor.model.RequestPartitionId; import ca.uhn.fhir.interceptor.model.TransactionWriteOperationsDetails; import ca.uhn.fhir.jpa.api.dao.DaoRegistry; @@ -41,15 +42,18 @@ import ca.uhn.fhir.jpa.cache.ResourcePersistentIdMap; import ca.uhn.fhir.jpa.dao.tx.HapiTransactionService; import ca.uhn.fhir.jpa.dao.tx.IHapiTransactionService; import ca.uhn.fhir.jpa.delete.DeleteConflictUtil; +import ca.uhn.fhir.jpa.model.config.PartitionSettings; import ca.uhn.fhir.jpa.model.cross.IBasePersistedResource; import 
ca.uhn.fhir.jpa.model.entity.ResourceTable; import ca.uhn.fhir.jpa.model.entity.StorageSettings; import ca.uhn.fhir.jpa.model.search.StorageProcessingMessage; +import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc; import ca.uhn.fhir.jpa.searchparam.extractor.ResourceIndexedSearchParams; import ca.uhn.fhir.jpa.searchparam.matcher.InMemoryMatchResult; import ca.uhn.fhir.jpa.searchparam.matcher.InMemoryResourceMatcher; import ca.uhn.fhir.jpa.searchparam.matcher.SearchParamMatcher; import ca.uhn.fhir.model.api.ResourceMetadataKeyEnum; +import ca.uhn.fhir.model.valueset.BundleEntryTransactionMethodEnum; import ca.uhn.fhir.parser.DataFormatException; import ca.uhn.fhir.parser.IParser; import ca.uhn.fhir.rest.api.Constants; @@ -86,6 +90,7 @@ import com.google.common.annotations.VisibleForTesting; import com.google.common.collect.ArrayListMultimap; import com.google.common.collect.ListMultimap; import jakarta.annotation.Nonnull; +import jakarta.annotation.Nullable; import org.apache.commons.lang3.StringUtils; import org.apache.commons.lang3.Validate; import org.hl7.fhir.dstu3.model.Bundle; @@ -98,6 +103,7 @@ import org.hl7.fhir.instance.model.api.IBaseReference; import org.hl7.fhir.instance.model.api.IBaseResource; import org.hl7.fhir.instance.model.api.IIdType; import org.hl7.fhir.instance.model.api.IPrimitiveType; +import org.hl7.fhir.r4.model.IdType; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; @@ -141,6 +147,9 @@ public abstract class BaseTransactionProcessor { public static final Pattern INVALID_PLACEHOLDER_PATTERN = Pattern.compile("[a-zA-Z]+:.*"); private static final Logger ourLog = LoggerFactory.getLogger(BaseTransactionProcessor.class); + @Autowired + private IRequestPartitionHelperSvc myRequestPartitionHelperService; + @Autowired private PlatformTransactionManager myTxManager; @@ -163,6 +172,9 @@ public abstract class BaseTransactionProcessor { @Autowired private StorageSettings 
myStorageSettings; + @Autowired + PartitionSettings myPartitionSettings; + @Autowired private InMemoryResourceMatcher myInMemoryResourceMatcher; @@ -375,9 +387,6 @@ public abstract class BaseTransactionProcessor { long start = System.currentTimeMillis(); - TransactionTemplate txTemplate = new TransactionTemplate(myTxManager); - txTemplate.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRES_NEW); - IBaseBundle response = myVersionAdapter.createBundle(org.hl7.fhir.r4.model.Bundle.BundleType.BATCHRESPONSE.toCode()); Map responseMap = new ConcurrentHashMap<>(); @@ -701,9 +710,13 @@ public abstract class BaseTransactionProcessor { }; EntriesToProcessMap entriesToProcess; + RequestPartitionId requestPartitionId = + determineRequestPartitionIdForWriteEntries(theRequestDetails, theEntries); + try { entriesToProcess = myHapiTransactionService .withRequest(theRequestDetails) + .withRequestPartitionId(requestPartitionId) .withTransactionDetails(theTransactionDetails) .execute(txCallback); } finally { @@ -726,6 +739,82 @@ public abstract class BaseTransactionProcessor { } } + /** + * This method looks at the FHIR actions being performed in a List of bundle entries, + * and determines the associated request partitions. 
+ */ + @Nullable + protected RequestPartitionId determineRequestPartitionIdForWriteEntries( + RequestDetails theRequestDetails, List theEntries) { + if (!myPartitionSettings.isPartitioningEnabled()) { + return RequestPartitionId.allPartitions(); + } + + RequestPartitionId retVal = null; + + for (var nextEntry : theEntries) { + RequestPartitionId nextRequestPartitionId = null; + String verb = myVersionAdapter.getEntryRequestVerb(myContext, nextEntry); + if (isNotBlank(verb)) { + BundleEntryTransactionMethodEnum verbEnum = BundleEntryTransactionMethodEnum.valueOf(verb); + switch (verbEnum) { + case GET: + continue; + case DELETE: { + String requestUrl = myVersionAdapter.getEntryRequestUrl(nextEntry); + if (isNotBlank(requestUrl)) { + IdType id = new IdType(requestUrl); + String resourceType = id.getResourceType(); + ReadPartitionIdRequestDetails details = + ReadPartitionIdRequestDetails.forDelete(resourceType, id); + nextRequestPartitionId = myRequestPartitionHelperService.determineReadPartitionForRequest( + theRequestDetails, details); + } + break; + } + case PATCH: { + String requestUrl = myVersionAdapter.getEntryRequestUrl(nextEntry); + if (isNotBlank(requestUrl)) { + IdType id = new IdType(requestUrl); + String resourceType = id.getResourceType(); + ReadPartitionIdRequestDetails details = + ReadPartitionIdRequestDetails.forPatch(resourceType, id); + nextRequestPartitionId = myRequestPartitionHelperService.determineReadPartitionForRequest( + theRequestDetails, details); + } + break; + } + case POST: + case PUT: { + IBaseResource resource = myVersionAdapter.getResource(nextEntry); + if (resource != null) { + String resourceType = myContext.getResourceType(resource); + nextRequestPartitionId = myRequestPartitionHelperService.determineCreatePartitionForRequest( + theRequestDetails, resource, resourceType); + } + } + } + } + + if (nextRequestPartitionId == null) { + // continue + } else if (retVal == null) { + retVal = nextRequestPartitionId; + } else if 
(!retVal.equals(nextRequestPartitionId)) { + if (myHapiTransactionService.isRequiresNewTransactionWhenChangingPartitions()) { + String msg = myContext + .getLocalizer() + .getMessage(BaseTransactionProcessor.class, "multiplePartitionAccesses", theEntries.size()); + throw new InvalidRequestException(Msg.code(2541) + msg); + } else { + retVal = retVal.mergeIds(nextRequestPartitionId); + } + } + } + + return retVal; + } + private boolean haveWriteOperationsHooks(RequestDetails theRequestDetails) { return CompositeInterceptorBroadcaster.hasHooks( Pointcut.STORAGE_TRANSACTION_WRITE_OPERATIONS_PRE, myInterceptorBroadcaster, theRequestDetails) @@ -2042,6 +2131,11 @@ public abstract class BaseTransactionProcessor { } } + @VisibleForTesting + public void setPartitionSettingsForUnitTest(PartitionSettings thePartitionSettings) { + myPartitionSettings = thePartitionSettings; + } + /** * Transaction Order, per the spec: *

diff --git a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/tx/HapiTransactionService.java b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/tx/HapiTransactionService.java index ac592d245e3..766d400f74c 100644 --- a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/tx/HapiTransactionService.java +++ b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/tx/HapiTransactionService.java @@ -76,6 +76,13 @@ public class HapiTransactionService implements IHapiTransactionService { private static final ThreadLocal ourRequestPartitionThreadLocal = new ThreadLocal<>(); private static final ThreadLocal ourExistingTransaction = new ThreadLocal<>(); + /** + * Default value for {@link #setTransactionPropagationWhenChangingPartitions(Propagation)} + * + * @since 7.6.0 + */ + public static final Propagation DEFAULT_TRANSACTION_PROPAGATION_WHEN_CHANGING_PARTITIONS = Propagation.REQUIRED; + @Autowired protected IInterceptorBroadcaster myInterceptorBroadcaster; @@ -88,7 +95,8 @@ public class HapiTransactionService implements IHapiTransactionService { @Autowired protected PartitionSettings myPartitionSettings; - private Propagation myTransactionPropagationWhenChangingPartitions = Propagation.REQUIRED; + private Propagation myTransactionPropagationWhenChangingPartitions = + DEFAULT_TRANSACTION_PROPAGATION_WHEN_CHANGING_PARTITIONS; private SleepUtil mySleepUtil = new SleepUtil(); @@ -264,7 +272,7 @@ public class HapiTransactionService implements IHapiTransactionService { try { ourExistingTransaction.set(this); - if (myTransactionPropagationWhenChangingPartitions == Propagation.REQUIRES_NEW) { + if (isRequiresNewTransactionWhenChangingPartitions()) { return executeInNewTransactionForPartitionChange( theExecutionBuilder, theCallback, requestPartitionId, previousRequestPartitionId); } else { @@ -276,6 +284,11 @@ public class HapiTransactionService implements IHapiTransactionService { } } + @Override + public boolean isRequiresNewTransactionWhenChangingPartitions() { + return 
myTransactionPropagationWhenChangingPartitions == Propagation.REQUIRES_NEW; + } + @Nullable private T executeInNewTransactionForPartitionChange( ExecutionBuilder theExecutionBuilder, @@ -567,7 +580,8 @@ public class HapiTransactionService implements IHapiTransactionService { return TransactionSynchronizationManager.isActualTransactionActive() && (!TransactionSynchronizationManager.isCurrentTransactionReadOnly() || theExecutionBuilder.myReadOnly) && (theExecutionBuilder.myPropagation == null - || theExecutionBuilder.myPropagation == Propagation.REQUIRED); + || theExecutionBuilder.myPropagation + == DEFAULT_TRANSACTION_PROPAGATION_WHEN_CHANGING_PARTITIONS); } @Nullable diff --git a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/tx/IHapiTransactionService.java b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/tx/IHapiTransactionService.java index a3627d16083..7d4f18a1216 100644 --- a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/tx/IHapiTransactionService.java +++ b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/dao/tx/IHapiTransactionService.java @@ -23,6 +23,7 @@ import ca.uhn.fhir.interceptor.model.RequestPartitionId; import ca.uhn.fhir.rest.api.server.RequestDetails; import ca.uhn.fhir.rest.api.server.storage.TransactionDetails; import ca.uhn.fhir.util.ICallable; +import com.google.common.annotations.Beta; import jakarta.annotation.Nonnull; import jakarta.annotation.Nullable; import org.springframework.transaction.annotation.Isolation; @@ -90,6 +91,19 @@ public interface IHapiTransactionService { @Nonnull Isolation theIsolation, @Nonnull ICallable theCallback); + /** + * Returns {@literal true} if this transaction service will open a new + * transaction when the request partition is for a different partition than + * the currently executing partition. + *

+ * This is an experimental API, subject to change in a future release. + *

+ * + * @since 7.4.0 + */ + @Beta + boolean isRequiresNewTransactionWhenChangingPartitions(); + interface IExecutionBuilder extends TransactionOperations { IExecutionBuilder withIsolation(Isolation theIsolation); diff --git a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/subscription/match/registry/SubscriptionCanonicalizer.java b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/subscription/match/registry/SubscriptionCanonicalizer.java index 03ab9fff618..2d16fb306c1 100644 --- a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/subscription/match/registry/SubscriptionCanonicalizer.java +++ b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/subscription/match/registry/SubscriptionCanonicalizer.java @@ -123,7 +123,7 @@ public class SubscriptionCanonicalizer { retVal.setIdElement(subscription.getIdElement()); retVal.setPayloadString(channel.getPayload()); retVal.setTags(extractTags(subscription)); - handleCrossPartition(theSubscription, retVal); + retVal.setCrossPartitionEnabled(handleCrossPartition(theSubscription)); retVal.setSendDeleteMessages(extractDeleteExtensionDstu2(subscription)); } catch (FHIRException theE) { throw new InternalErrorException(Msg.code(557) + theE); @@ -134,7 +134,7 @@ public class SubscriptionCanonicalizer { private boolean extractDeleteExtensionDstu2(ca.uhn.fhir.model.dstu2.resource.Subscription theSubscription) { return theSubscription.getChannel().getUndeclaredExtensionsByUrl(EX_SEND_DELETE_MESSAGES).stream() .map(ExtensionDt::getValue) - .map(value -> (BooleanDt) value) + .map(BooleanDt.class::cast) .map(BasePrimitive::getValue) .findFirst() .orElse(false); @@ -175,7 +175,7 @@ public class SubscriptionCanonicalizer { retVal.setPayloadSearchCriteria( getExtensionString(subscription, HapiExtensions.EXT_SUBSCRIPTION_PAYLOAD_SEARCH_CRITERIA)); retVal.setTags(extractTags(subscription)); - handleCrossPartition(theSubscription, retVal); + retVal.setCrossPartitionEnabled(handleCrossPartition(theSubscription)); if (retVal.getChannelType() == 
CanonicalSubscriptionChannelType.EMAIL) { String from; @@ -306,7 +306,7 @@ public class SubscriptionCanonicalizer { getExtensionString(subscription, HapiExtensions.EXT_SUBSCRIPTION_PAYLOAD_SEARCH_CRITERIA)); retVal.setTags(extractTags(subscription)); setPartitionIdOnReturnValue(theSubscription, retVal); - handleCrossPartition(theSubscription, retVal); + retVal.setCrossPartitionEnabled(handleCrossPartition(theSubscription)); List profiles = subscription.getMeta().getProfile(); @@ -401,7 +401,7 @@ public class SubscriptionCanonicalizer { } List topicExts = subscription.getExtensionsByUrl("http://hl7.org/fhir/subscription/topics"); - if (topicExts.size() > 0) { + if (!topicExts.isEmpty()) { IBaseReference ref = (IBaseReference) topicExts.get(0).getValueAsPrimitive(); if (!"EventDefinition".equals(ref.getReferenceElement().getResourceType())) { throw new PreconditionFailedException(Msg.code(563) + "Topic reference must be an EventDefinition"); @@ -499,7 +499,7 @@ public class SubscriptionCanonicalizer { List topicExts = subscription.getExtensionsByUrl("http://hl7.org/fhir/subscription/topics"); - if (topicExts.size() > 0) { + if (!topicExts.isEmpty()) { IBaseReference ref = (IBaseReference) topicExts.get(0).getValueAsPrimitive(); if (!"EventDefinition".equals(ref.getReferenceElement().getResourceType())) { throw new PreconditionFailedException(Msg.code(566) + "Topic reference must be an EventDefinition"); @@ -511,7 +511,7 @@ public class SubscriptionCanonicalizer { retVal.setSendDeleteMessages(extension.getValueBooleanType().booleanValue()); } - handleCrossPartition(theSubscription, retVal); + retVal.setCrossPartitionEnabled(handleCrossPartition(theSubscription)); return retVal; } @@ -531,7 +531,7 @@ public class SubscriptionCanonicalizer { List topicExts = subscription.getExtensionsByUrl("http://hl7.org/fhir/subscription/topics"); - if (topicExts.size() > 0) { + if (!topicExts.isEmpty()) { IBaseReference ref = (IBaseReference) topicExts.get(0).getValueAsPrimitive(); 
if (!"EventDefinition".equals(ref.getReferenceElement().getResourceType())) { throw new PreconditionFailedException(Msg.code(2325) + "Topic reference must be an EventDefinition"); @@ -568,16 +568,15 @@ public class SubscriptionCanonicalizer { retVal.getTopicSubscription().setTopic(subscription.getTopic()); retVal.setChannelType(getChannelType(subscription)); - subscription.getFilterBy().forEach(filter -> { - retVal.getTopicSubscription().addFilter(convertFilter(filter)); - }); + subscription.getFilterBy().forEach(filter -> retVal.getTopicSubscription() + .addFilter(convertFilter(filter))); retVal.getTopicSubscription().setHeartbeatPeriod(subscription.getHeartbeatPeriod()); retVal.getTopicSubscription().setMaxCount(subscription.getMaxCount()); setR5FlagsBasedOnChannelType(subscription, retVal); - handleCrossPartition(theSubscription, retVal); + retVal.setCrossPartitionEnabled(handleCrossPartition(theSubscription)); return retVal; } @@ -781,11 +780,23 @@ public class SubscriptionCanonicalizer { return status.getValueAsString(); } - private void handleCrossPartition(IBaseResource theSubscription, CanonicalSubscription retVal) { - if (mySubscriptionSettings.isCrossPartitionSubscriptionEnabled()) { - retVal.setCrossPartitionEnabled(true); - } else { - retVal.setCrossPartitionEnabled(SubscriptionUtil.isCrossPartition(theSubscription)); + private boolean handleCrossPartition(IBaseResource theSubscription) { + RequestPartitionId requestPartitionId = + (RequestPartitionId) theSubscription.getUserData(Constants.RESOURCE_PARTITION_ID); + + boolean isSubscriptionCreatedOnDefaultPartition = false; + + if (nonNull(requestPartitionId)) { + isSubscriptionCreatedOnDefaultPartition = requestPartitionId.isDefaultPartition(); } + + boolean isSubscriptionDefinedAsCrossPartitionSubscription = + SubscriptionUtil.isDefinedAsCrossPartitionSubcription(theSubscription); + boolean isGlobalSettingCrossPartitionSubscriptionEnabled = 
mySubscriptionSettings.isCrossPartitionSubscriptionEnabled(); + + return isSubscriptionCreatedOnDefaultPartition + && isSubscriptionDefinedAsCrossPartitionSubscription + && isGlobalSettingCrossPartitionSubscriptionEnabled; } } diff --git a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/subscription/model/CanonicalSubscription.java b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/subscription/model/CanonicalSubscription.java index 5b31111bfad..6cb37b7ae45 100644 --- a/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/subscription/model/CanonicalSubscription.java +++ b/hapi-fhir-storage/src/main/java/ca/uhn/fhir/jpa/subscription/model/CanonicalSubscription.java @@ -261,7 +261,7 @@ public class CanonicalSubscription implements Serializable, Cloneable, IModelJso myPartitionId = thePartitionId; } - public boolean getCrossPartitionEnabled() { + public boolean isCrossPartitionEnabled() { return myCrossPartitionEnabled; } diff --git a/hapi-fhir-structures-r4/src/test/java/ca/uhn/fhir/util/FhirTerserR4Test.java b/hapi-fhir-structures-r4/src/test/java/ca/uhn/fhir/util/FhirTerserR4Test.java index 032f807fcda..54ecc48086c 100644 --- a/hapi-fhir-structures-r4/src/test/java/ca/uhn/fhir/util/FhirTerserR4Test.java +++ b/hapi-fhir-structures-r4/src/test/java/ca/uhn/fhir/util/FhirTerserR4Test.java @@ -21,6 +21,7 @@ import org.hl7.fhir.r4.model.Enumerations; import org.hl7.fhir.r4.model.Extension; import org.hl7.fhir.r4.model.IdType; import org.hl7.fhir.r4.model.Identifier; +import org.hl7.fhir.r4.model.Library; import org.hl7.fhir.r4.model.MarkdownType; import org.hl7.fhir.r4.model.Medication; import org.hl7.fhir.r4.model.MedicationAdministration; @@ -28,6 +29,7 @@ import org.hl7.fhir.r4.model.MedicationRequest; import org.hl7.fhir.r4.model.Money; import org.hl7.fhir.r4.model.Observation; import org.hl7.fhir.r4.model.Organization; +import org.hl7.fhir.r4.model.Parameters; import org.hl7.fhir.r4.model.Patient; import org.hl7.fhir.r4.model.Patient.LinkType; import 
org.hl7.fhir.r4.model.Practitioner; @@ -1528,6 +1530,27 @@ public class FhirTerserR4Test { return retVal; } + @Test + void copyingAndParsingCreatesDuplicateContainedResources() { + var input = new Library(); + var params = new Parameters(); + var id = "#expansion-parameters-ecr"; + params.setId(id); + params.addParameter("system-version", new StringType("test2")); + var paramsExt = new Extension(); + paramsExt.setUrl("test").setValue(new Reference(id)); + input.addContained(params); + input.addExtension(paramsExt); + final var parser = FhirContext.forR4Cached().newJsonParser(); + var stringified = parser.encodeResourceToString(input); + var parsed = parser.parseResource(stringified); + var copy = ((Library) parsed).copy(); + assertEquals(1, copy.getContained().size()); + var stringifiedCopy = parser.encodeResourceToString(copy); + var parsedCopy = parser.parseResource(stringifiedCopy); + assertEquals(1, ((Library) parsedCopy).getContained().size()); + } + /** * See http://stackoverflow.com/questions/182636/how-to-determine-the-class-of-a-generic-type */ diff --git a/hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/support/RemoteTerminologyServiceValidationSupport.java b/hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/support/RemoteTerminologyServiceValidationSupport.java index a75e7c73406..b53a0c70dcd 100644 --- a/hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/support/RemoteTerminologyServiceValidationSupport.java +++ b/hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/support/RemoteTerminologyServiceValidationSupport.java @@ -12,6 +12,8 @@ import ca.uhn.fhir.i18n.Msg; import ca.uhn.fhir.rest.api.SummaryEnum; import ca.uhn.fhir.rest.client.api.IGenericClient; import ca.uhn.fhir.rest.gclient.IQuery; +import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException; +import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException; import ca.uhn.fhir.util.BundleUtil; import 
ca.uhn.fhir.util.ParametersUtil; import jakarta.annotation.Nonnull; @@ -31,7 +33,6 @@ import org.hl7.fhir.r4.model.Parameters.ParametersParameterComponent; import org.hl7.fhir.r4.model.Property; import org.hl7.fhir.r4.model.StringType; import org.hl7.fhir.r4.model.Type; -import org.hl7.fhir.r4.model.ValueSet; import org.slf4j.Logger; import org.slf4j.LoggerFactory; @@ -81,6 +82,7 @@ public class RemoteTerminologyServiceValidationSupport extends BaseValidationSup String theCode, String theDisplay, String theValueSetUrl) { + return invokeRemoteValidateCode(theCodeSystem, theCode, theDisplay, theValueSetUrl, null); } @@ -99,12 +101,7 @@ public class RemoteTerminologyServiceValidationSupport extends BaseValidationSup // so let's try to get it from the VS if is not present String codeSystem = theCodeSystem; if (isNotBlank(theCode) && isBlank(codeSystem)) { - codeSystem = extractCodeSystemForCode((ValueSet) theValueSet, theCode); - } - - // Remote terminology services shouldn't be used to validate codes with an implied system - if (isBlank(codeSystem)) { - return null; + codeSystem = ValidationSupportUtils.extractCodeSystemForCode(theValueSet, theCode); } String valueSetUrl = DefaultProfileValidationSupport.getConformanceResourceUrl(myCtx, valueSet); @@ -116,48 +113,6 @@ public class RemoteTerminologyServiceValidationSupport extends BaseValidationSup return invokeRemoteValidateCode(codeSystem, theCode, theDisplay, valueSetUrl, valueSet); } - /** - * Try to obtain the codeSystem of the received code from the received ValueSet - */ - private String extractCodeSystemForCode(ValueSet theValueSet, String theCode) { - if (theValueSet.getCompose() == null - || theValueSet.getCompose().getInclude() == null - || theValueSet.getCompose().getInclude().isEmpty()) { - return null; - } - - if (theValueSet.getCompose().getInclude().size() == 1) { - ValueSet.ConceptSetComponent include = - theValueSet.getCompose().getInclude().iterator().next(); - return getVersionedCodeSystem(include); 
- } - - // when component has more than one include, their codeSystem(s) could be different, so we need to make sure - // that we are picking up the system for the include filter to which the code corresponds - for (ValueSet.ConceptSetComponent include : theValueSet.getCompose().getInclude()) { - if (include.hasSystem()) { - for (ValueSet.ConceptReferenceComponent concept : include.getConcept()) { - if (concept.hasCodeElement() && concept.getCode().equals(theCode)) { - return getVersionedCodeSystem(include); - } - } - } - } - - // at this point codeSystem couldn't be extracted for a multi-include ValueSet. Just on case it was - // because the format was not well handled, let's allow to watch the VS by an easy logging change - ourLog.trace("CodeSystem couldn't be extracted for code: {} for ValueSet: {}", theCode, theValueSet.getId()); - return null; - } - - private String getVersionedCodeSystem(ValueSet.ConceptSetComponent theComponent) { - String codeSystem = theComponent.getSystem(); - if (!codeSystem.contains("|") && theComponent.hasVersion()) { - codeSystem += "|" + theComponent.getVersion(); - } - return codeSystem; - } - @Override public IBaseResource fetchCodeSystem(String theSystem) { // callers of this want the whole resource. 
@@ -204,43 +159,56 @@ public class RemoteTerminologyServiceValidationSupport extends BaseValidationSup FhirContext fhirContext = client.getFhirContext(); FhirVersionEnum fhirVersion = fhirContext.getVersion().getVersion(); - switch (fhirVersion) { - case DSTU3: - case R4: - IBaseParameters params = ParametersUtil.newInstance(fhirContext); - ParametersUtil.addParameterToParametersString(fhirContext, params, "code", code); - if (!StringUtils.isEmpty(system)) { - ParametersUtil.addParameterToParametersString(fhirContext, params, "system", system); - } - if (!StringUtils.isEmpty(displayLanguage)) { - ParametersUtil.addParameterToParametersString(fhirContext, params, "language", displayLanguage); - } - for (String propertyName : theLookupCodeRequest.getPropertyNames()) { - ParametersUtil.addParameterToParametersCode(fhirContext, params, "property", propertyName); - } - Class codeSystemClass = - myCtx.getResourceDefinition("CodeSystem").getImplementingClass(); - IBaseParameters outcome = client.operation() - .onType(codeSystemClass) - .named("$lookup") - .withParameters(params) - .useHttpGet() - .execute(); - if (outcome != null && !outcome.isEmpty()) { - switch (fhirVersion) { - case DSTU3: - return generateLookupCodeResultDstu3( - code, system, (org.hl7.fhir.dstu3.model.Parameters) outcome); - case R4: - return generateLookupCodeResultR4(code, system, (Parameters) outcome); - } - } - break; - default: - throw new UnsupportedOperationException(Msg.code(710) + "Unsupported FHIR version '" - + fhirVersion.getFhirVersionString() + "'. Only DSTU3 and R4 are supported."); + if (fhirVersion.isNewerThan(FhirVersionEnum.R4) || fhirVersion.isOlderThan(FhirVersionEnum.DSTU3)) { + throw new UnsupportedOperationException(Msg.code(710) + "Unsupported FHIR version '" + + fhirVersion.getFhirVersionString() + "'. 
Only DSTU3 and R4 are supported."); } - return null; + + IBaseParameters params = ParametersUtil.newInstance(fhirContext); + ParametersUtil.addParameterToParametersString(fhirContext, params, "code", code); + if (!StringUtils.isEmpty(system)) { + ParametersUtil.addParameterToParametersString(fhirContext, params, "system", system); + } + if (!StringUtils.isEmpty(displayLanguage)) { + ParametersUtil.addParameterToParametersString(fhirContext, params, "language", displayLanguage); + } + for (String propertyName : theLookupCodeRequest.getPropertyNames()) { + ParametersUtil.addParameterToParametersCode(fhirContext, params, "property", propertyName); + } + Class codeSystemClass = + myCtx.getResourceDefinition("CodeSystem").getImplementingClass(); + IBaseParameters outcome; + try { + outcome = client.operation() + .onType(codeSystemClass) + .named("$lookup") + .withParameters(params) + .useHttpGet() + .execute(); + } catch (ResourceNotFoundException | InvalidRequestException e) { + // this can potentially be moved to an interceptor and be reused in other areas + // where we call a remote server or by the client as a custom interceptor + // that interceptor would alter the status code of the response and the body into a different format + // e.g. ClientResponseInterceptorModificationTemplate + ourLog.error(e.getMessage(), e); + LookupCodeResult result = LookupCodeResult.notFound(system, code); + result.setErrorMessage( + getErrorMessage("unknownCodeInSystem", system, code, client.getServerBase(), e.getMessage())); + return result; + } + if (outcome != null && !outcome.isEmpty()) { + if (fhirVersion == FhirVersionEnum.DSTU3) { + return generateLookupCodeResultDstu3(code, system, (org.hl7.fhir.dstu3.model.Parameters) outcome); + } + if (fhirVersion == FhirVersionEnum.R4) { + return generateLookupCodeResultR4(code, system, (Parameters) outcome); + } + } + return LookupCodeResult.notFound(system, code); + } + + protected String getErrorMessage(String errorCode, Object... 
theParams) { + return getFhirContext().getLocalizer().getMessage(getClass(), errorCode, theParams); } private LookupCodeResult generateLookupCodeResultDstu3( @@ -278,6 +246,7 @@ public class RemoteTerminologyServiceValidationSupport extends BaseValidationSup case "abstract": result.setCodeIsAbstract(Boolean.parseBoolean(parameterTypeAsString)); break; + default: } } return result; @@ -384,6 +353,7 @@ public class RemoteTerminologyServiceValidationSupport extends BaseValidationSup case "value": conceptDesignation.setValue(designationComponent.getValue().toString()); break; + default: } } return conceptDesignation; @@ -422,6 +392,7 @@ public class RemoteTerminologyServiceValidationSupport extends BaseValidationSup case "abstract": result.setCodeIsAbstract(Boolean.parseBoolean(parameterTypeAsString)); break; + default: } } return result; @@ -508,6 +479,7 @@ public class RemoteTerminologyServiceValidationSupport extends BaseValidationSup case "value": conceptDesignation.setValue(designationComponentValue.toString()); break; + default: } } return conceptDesignation; @@ -591,6 +563,10 @@ public class RemoteTerminologyServiceValidationSupport extends BaseValidationSup return retVal; } + public String getBaseUrl() { + return myBaseUrl; + } + protected CodeValidationResult invokeRemoteValidateCode( String theCodeSystem, String theCode, String theDisplay, String theValueSetUrl, IBaseResource theValueSet) { if (isBlank(theCode)) { @@ -607,11 +583,22 @@ public class RemoteTerminologyServiceValidationSupport extends BaseValidationSup resourceType = "CodeSystem"; } - IBaseParameters output = client.operation() - .onType(resourceType) - .named("validate-code") - .withParameters(input) - .execute(); + IBaseParameters output; + try { + output = client.operation() + .onType(resourceType) + .named("validate-code") + .withParameters(input) + .execute(); + } catch (ResourceNotFoundException | InvalidRequestException ex) { + ourLog.error(ex.getMessage(), ex); + CodeValidationResult 
result = new CodeValidationResult(); + result.setSeverity(IssueSeverity.ERROR); + String errorMessage = buildErrorMessage( + theCodeSystem, theCode, theValueSetUrl, theValueSet, client.getServerBase(), ex.getMessage()); + result.setMessage(errorMessage); + return result; + } List resultValues = ParametersUtil.getNamedParameterValuesAsString(getFhirContext(), output, "result"); if (resultValues.isEmpty() || isBlank(resultValues.get(0))) { @@ -643,6 +630,21 @@ public class RemoteTerminologyServiceValidationSupport extends BaseValidationSup return retVal; } + private String buildErrorMessage( + String theCodeSystem, + String theCode, + String theValueSetUrl, + IBaseResource theValueSet, + String theServerUrl, + String theServerMessage) { + if (theValueSetUrl == null && theValueSet == null) { + return getErrorMessage("unknownCodeInSystem", theCodeSystem, theCode, theServerUrl, theServerMessage); + } else { + return getErrorMessage( + "unknownCodeInValueSet", theCodeSystem, theCode, theValueSetUrl, theServerUrl, theServerMessage); + } + } + protected IBaseParameters buildValidateCodeInputParameters( String theCodeSystem, String theCode, String theDisplay, String theValueSetUrl, IBaseResource theValueSet) { IBaseParameters params = ParametersUtil.newInstance(getFhirContext()); diff --git a/hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/support/ValidationSupportUtils.java b/hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/support/ValidationSupportUtils.java new file mode 100644 index 00000000000..7321f33d8c8 --- /dev/null +++ b/hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/support/ValidationSupportUtils.java @@ -0,0 +1,133 @@ +package org.hl7.fhir.common.hapi.validation.support; + +import org.hl7.fhir.instance.model.api.IBaseResource; +import org.hl7.fhir.r4.model.ValueSet; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public final class ValidationSupportUtils { + + private static final 
Logger ourLog = LoggerFactory.getLogger(ValidationSupportUtils.class); + + private ValidationSupportUtils() {} + + public static String extractCodeSystemForCode(IBaseResource theValueSet, String theCode) { + if (theValueSet instanceof org.hl7.fhir.dstu3.model.ValueSet) { + return extractCodeSystemForCodeDSTU3((org.hl7.fhir.dstu3.model.ValueSet) theValueSet, theCode); + } else if (theValueSet instanceof ValueSet) { + return extractCodeSystemForCodeR4((ValueSet) theValueSet, theCode); + } else if (theValueSet instanceof org.hl7.fhir.r5.model.ValueSet) { + return extractCodeSystemForCodeR5((org.hl7.fhir.r5.model.ValueSet) theValueSet, theCode); + } + return null; + } + + /** + * Try to obtain the codeSystem of the received code from the input DSTU3 ValueSet + */ + private static String extractCodeSystemForCodeDSTU3(org.hl7.fhir.dstu3.model.ValueSet theValueSet, String theCode) { + if (theValueSet.getCompose().getInclude().isEmpty()) { + return null; + } + + if (theValueSet.getCompose().getInclude().size() == 1) { + org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent include = + theValueSet.getCompose().getInclude().iterator().next(); + return include.hasSystem() ? getVersionedCodeSystem(include.getSystem(), include.getVersion()) : null; + } + + // when component has more than one include, their codeSystem(s) could be different, so we need to make sure + // that we are picking up the system for the include filter to which the code corresponds + for (org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent include : + theValueSet.getCompose().getInclude()) { + if (include.hasSystem()) { + for (org.hl7.fhir.dstu3.model.ValueSet.ConceptReferenceComponent concept : include.getConcept()) { + if (concept.hasCodeElement() && concept.getCode().equals(theCode)) { + return getVersionedCodeSystem(include.getSystem(), include.getVersion()); + } + } + } + } + + // at this point codeSystem couldn't be extracted for a multi-include ValueSet. 
Just on case it was + // because the format was not well handled, let's allow to watch the VS by an easy logging change + logCodeAndValueSet(theCode, theValueSet.getId()); + return null; + } + + /** + * Try to obtain the codeSystem of the received code from the input R4 ValueSet + */ + private static String extractCodeSystemForCodeR4(ValueSet theValueSet, String theCode) { + if (theValueSet.getCompose().getInclude().isEmpty()) { + return null; + } + + if (theValueSet.getCompose().getInclude().size() == 1) { + ValueSet.ConceptSetComponent include = + theValueSet.getCompose().getInclude().iterator().next(); + return include.hasSystem() ? getVersionedCodeSystem(include.getSystem(), include.getVersion()) : null; + } + + // when component has more than one include, their codeSystem(s) could be different, so we need to make sure + // that we are picking up the system for the include filter to which the code corresponds + for (ValueSet.ConceptSetComponent include : theValueSet.getCompose().getInclude()) { + if (include.hasSystem()) { + for (ValueSet.ConceptReferenceComponent concept : include.getConcept()) { + if (concept.hasCodeElement() && concept.getCode().equals(theCode)) { + return getVersionedCodeSystem(include.getSystem(), include.getVersion()); + } + } + } + } + + // at this point codeSystem couldn't be extracted for a multi-include ValueSet. 
Just on case it was + // because the format was not well handled, let's allow to watch the VS by an easy logging change + logCodeAndValueSet(theCode, theValueSet.getId()); + return null; + } + + private static String getVersionedCodeSystem(String theCodeSystem, String theVersion) { + if (!theCodeSystem.contains("|") && theVersion != null) { + return theCodeSystem + "|" + theVersion; + } + return theCodeSystem; + } + + /** + * Try to obtain the codeSystem of the received code from the input R5 ValueSet + */ + private static String extractCodeSystemForCodeR5(org.hl7.fhir.r5.model.ValueSet theValueSet, String theCode) { + if (theValueSet.getCompose().getInclude().isEmpty()) { + return null; + } + + if (theValueSet.getCompose().getInclude().size() == 1) { + org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent include = + theValueSet.getCompose().getInclude().iterator().next(); + return include.hasSystem() ? getVersionedCodeSystem(include.getSystem(), include.getVersion()) : null; + } + + // when component has more than one include, their codeSystem(s) could be different, so we need to make sure + // that we are picking up the system for the include filter to which the code corresponds + for (org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent include : + theValueSet.getCompose().getInclude()) { + if (include.hasSystem()) { + for (org.hl7.fhir.r5.model.ValueSet.ConceptReferenceComponent concept : include.getConcept()) { + if (concept.hasCodeElement() && concept.getCode().equals(theCode)) { + return getVersionedCodeSystem(include.getSystem(), include.getVersion()); + } + } + } + } + + // at this point codeSystem couldn't be extracted for a multi-include ValueSet. 
Just on case it was + // because the format was not well handled, let's allow to watch the VS by an easy logging change + logCodeAndValueSet(theCode, theValueSet.getId()); + return null; + } + + private static void logCodeAndValueSet(String theCode, String theValueSet) { + ourLog.trace("CodeSystem couldn't be extracted for code: {} for ValueSet: {}", theCode, theValueSet); + } +} diff --git a/hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/validator/VersionSpecificWorkerContextWrapper.java b/hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/validator/VersionSpecificWorkerContextWrapper.java index 46e2a142612..64d4f75519a 100644 --- a/hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/validator/VersionSpecificWorkerContextWrapper.java +++ b/hapi-fhir-validation/src/main/java/org/hl7/fhir/common/hapi/validation/validator/VersionSpecificWorkerContextWrapper.java @@ -16,6 +16,7 @@ import org.apache.commons.lang3.Validate; import org.apache.commons.lang3.builder.EqualsBuilder; import org.apache.commons.lang3.builder.HashCodeBuilder; import org.fhir.ucum.UcumService; +import org.hl7.fhir.common.hapi.validation.support.ValidationSupportUtils; import org.hl7.fhir.exceptions.FHIRException; import org.hl7.fhir.exceptions.TerminologyServiceException; import org.hl7.fhir.instance.model.api.IBaseResource; @@ -25,11 +26,13 @@ import org.hl7.fhir.r5.context.IWorkerContextManager; import org.hl7.fhir.r5.model.CodeSystem; import org.hl7.fhir.r5.model.CodeableConcept; import org.hl7.fhir.r5.model.Coding; +import org.hl7.fhir.r5.model.ElementDefinition; import org.hl7.fhir.r5.model.NamingSystem; import org.hl7.fhir.r5.model.OperationOutcome; import org.hl7.fhir.r5.model.PackageInformation; import org.hl7.fhir.r5.model.Parameters; import org.hl7.fhir.r5.model.Resource; +import org.hl7.fhir.r5.model.StringType; import org.hl7.fhir.r5.model.StructureDefinition; import org.hl7.fhir.r5.model.ValueSet; import 
org.hl7.fhir.r5.profilemodel.PEBuilder; @@ -66,7 +69,7 @@ public class VersionSpecificWorkerContextWrapper extends I18nBase implements IWo private final VersionCanonicalizer myVersionCanonicalizer; private final LoadingCache myFetchResourceCache; private volatile List myAllStructures; - private org.hl7.fhir.r5.model.Parameters myExpansionProfile; + private Parameters myExpansionProfile; public VersionSpecificWorkerContextWrapper( ValidationSupportContext theValidationSupportContext, VersionCanonicalizer theVersionCanonicalizer) { @@ -215,7 +218,7 @@ public class VersionSpecificWorkerContextWrapper extends I18nBase implements IWo } @Override - public org.hl7.fhir.r5.model.Parameters getExpansionParameters() { + public Parameters getExpansionParameters() { return myExpansionProfile; } @@ -224,7 +227,7 @@ public class VersionSpecificWorkerContextWrapper extends I18nBase implements IWo setExpansionProfile(expParameters); } - public void setExpansionProfile(org.hl7.fhir.r5.model.Parameters expParameters) { + public void setExpansionProfile(Parameters expParameters) { myExpansionProfile = expParameters; } @@ -318,7 +321,7 @@ public class VersionSpecificWorkerContextWrapper extends I18nBase implements IWo issue.getDetails().setText(codeValidationIssue.getMessage()); issue.addExtension() .setUrl("http://hl7.org/fhir/StructureDefinition/operationoutcome-message-id") - .setValue(new org.hl7.fhir.r5.model.StringType("Terminology_PassThrough_TX_Message")); + .setValue(new StringType("Terminology_PassThrough_TX_Message")); issues.add(issue); } return issues; @@ -369,8 +372,7 @@ public class VersionSpecificWorkerContextWrapper extends I18nBase implements IWo } @Override - public ValueSetExpansionOutcome expandVS( - org.hl7.fhir.r5.model.ValueSet source, boolean cacheOk, boolean Hierarchical) { + public ValueSetExpansionOutcome expandVS(ValueSet source, boolean cacheOk, boolean Hierarchical) { IBaseResource convertedSource; try { convertedSource = 
myVersionCanonicalizer.valueSetFromValidatorCanonical(source); @@ -381,7 +383,7 @@ public class VersionSpecificWorkerContextWrapper extends I18nBase implements IWo .getRootValidationSupport() .expandValueSet(myValidationSupportContext, null, convertedSource); - org.hl7.fhir.r5.model.ValueSet convertedResult = null; + ValueSet convertedResult = null; if (expanded.getValueSet() != null) { try { convertedResult = myVersionCanonicalizer.valueSetToValidatorCanonical(expanded.getValueSet()); @@ -399,7 +401,7 @@ public class VersionSpecificWorkerContextWrapper extends I18nBase implements IWo @Override public ValueSetExpansionOutcome expandVS( Resource src, - org.hl7.fhir.r5.model.ElementDefinition.ElementDefinitionBindingComponent binding, + ElementDefinition.ElementDefinitionBindingComponent binding, boolean cacheOk, boolean Hierarchical) { ValueSet valueSet = fetchResource(ValueSet.class, binding.getValueSet(), src); @@ -427,14 +429,14 @@ public class VersionSpecificWorkerContextWrapper extends I18nBase implements IWo } @Override - public org.hl7.fhir.r5.model.CodeSystem fetchCodeSystem(String system) { + public CodeSystem fetchCodeSystem(String system) { IBaseResource fetched = myValidationSupportContext.getRootValidationSupport().fetchCodeSystem(system); if (fetched == null) { return null; } try { - return (org.hl7.fhir.r5.model.CodeSystem) myVersionCanonicalizer.codeSystemToValidatorCanonical(fetched); + return myVersionCanonicalizer.codeSystemToValidatorCanonical(fetched); } catch (FHIRException e) { throw new InternalErrorException(Msg.code(665) + e); } @@ -448,7 +450,7 @@ public class VersionSpecificWorkerContextWrapper extends I18nBase implements IWo return null; } try { - return (org.hl7.fhir.r5.model.CodeSystem) myVersionCanonicalizer.codeSystemToValidatorCanonical(fetched); + return myVersionCanonicalizer.codeSystemToValidatorCanonical(fetched); } catch (FHIRException e) { throw new InternalErrorException(Msg.code(1992) + e); } @@ -746,8 +748,7 @@ public class 
VersionSpecificWorkerContextWrapper extends I18nBase implements IWo } @Override - public ValidationResult validateCode( - ValidationOptions theOptions, String code, org.hl7.fhir.r5.model.ValueSet theValueSet) { + public ValidationResult validateCode(ValidationOptions theOptions, String code, ValueSet theValueSet) { IBaseResource convertedVs = null; try { if (theValueSet != null) { @@ -757,17 +758,16 @@ public class VersionSpecificWorkerContextWrapper extends I18nBase implements IWo throw new InternalErrorException(Msg.code(690) + e); } + String system = ValidationSupportUtils.extractCodeSystemForCode(theValueSet, code); + ConceptValidationOptions validationOptions = convertConceptValidationOptions(theOptions).setInferSystem(true); - return doValidation(convertedVs, validationOptions, null, code, null); + return doValidation(convertedVs, validationOptions, system, code, null); } @Override - public ValidationResult validateCode( - ValidationOptions theOptions, - org.hl7.fhir.r5.model.Coding theCoding, - org.hl7.fhir.r5.model.ValueSet theValueSet) { + public ValidationResult validateCode(ValidationOptions theOptions, Coding theCoding, ValueSet theValueSet) { IBaseResource convertedVs = null; try { @@ -868,10 +868,7 @@ public class VersionSpecificWorkerContextWrapper extends I18nBase implements IWo } @Override - public ValidationResult validateCode( - ValidationOptions theOptions, - org.hl7.fhir.r5.model.CodeableConcept code, - org.hl7.fhir.r5.model.ValueSet theVs) { + public ValidationResult validateCode(ValidationOptions theOptions, CodeableConcept code, ValueSet theVs) { List validationResultsOk = new ArrayList<>(); List issues = new ArrayList<>(); diff --git a/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/ILookupCodeTest.java b/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/ILookupCodeTest.java index e0d693d1ea4..0b20821af34 100644 --- 
a/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/ILookupCodeTest.java +++ b/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/ILookupCodeTest.java @@ -17,11 +17,11 @@ import java.util.ArrayList; import java.util.List; import java.util.Optional; -import static org.assertj.core.api.Assertions.assertThat; import static ca.uhn.fhir.context.support.IValidationSupport.TYPE_CODING; import static ca.uhn.fhir.context.support.IValidationSupport.TYPE_GROUP; import static ca.uhn.fhir.context.support.IValidationSupport.TYPE_STRING; import static java.util.stream.IntStream.range; +import static org.assertj.core.api.Assertions.assertThat; import static org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport.createConceptProperty; import static org.junit.jupiter.api.Assertions.assertEquals; import static org.junit.jupiter.api.Assertions.assertFalse; @@ -54,7 +54,8 @@ public interface ILookupCodeTest { default void lookupCode_forCodeSystemWithBlankCode_throwsException() { try { getService().lookupCode(null, new LookupCodeRequest(CODE_SYSTEM, "")); - fail(); } catch (IllegalArgumentException e) { + fail(); + } catch (IllegalArgumentException e) { assertEquals("theCode must be provided", e.getMessage()); } } @@ -63,6 +64,7 @@ public interface ILookupCodeTest { default void lookupCode_forCodeSystemWithPropertyInvalidType_throwsException() { // test LookupCodeResult result = new LookupCodeResult(); + result.setFound(true); result.getProperties().add(new BaseConceptProperty("someProperty") { public String getType() { return "someUnsupportedType"; @@ -72,9 +74,10 @@ public interface ILookupCodeTest { // test and verify try { - getService().lookupCode(null, new LookupCodeRequest(CODE_SYSTEM, CODE, LANGUAGE, null)); - fail(); } catch (InternalErrorException e) { - assertThat(e.getMessage()).contains("HAPI-1739: Don't know how to handle "); + getService().lookupCode(null, new LookupCodeRequest(CODE_SYSTEM, CODE, 
LANGUAGE, null)); + fail(); + } catch (InternalErrorException e) { + assertThat(e.getMessage()).contains("HAPI-1739: Don't know how to handle "); } } @@ -101,6 +104,7 @@ public interface ILookupCodeTest { ConceptDesignation designation1 = new ConceptDesignation().setUseCode(code1).setUseSystem("system1").setValue("value1").setLanguage("en"); ConceptDesignation designation2 = new ConceptDesignation().setUseCode(code2).setUseSystem("system2").setValue("value2").setLanguage("es"); LookupCodeResult result = new LookupCodeResult(); + result.setFound(true); result.getDesignations().add(designation1); result.getDesignations().add(designation2); getCodeSystemProvider().setLookupCodeResult(result); @@ -184,6 +188,8 @@ public interface ILookupCodeTest { assertNotNull(outcome); assertEquals(theRequest.getCode(), getCodeSystemProvider().getCode()); assertEquals(theRequest.getSystem(), getCodeSystemProvider().getSystem()); + assertEquals(theExpectedResult.isFound(), outcome.isFound()); + assertEquals(theExpectedResult.getErrorMessage(), outcome.getErrorMessage()); assertEquals(theExpectedResult.getCodeSystemDisplayName(), outcome.getCodeSystemDisplayName()); assertEquals(theExpectedResult.getCodeDisplay(), outcome.getCodeDisplay()); assertEquals(theExpectedResult.getCodeSystemVersion(), outcome.getCodeSystemVersion()); @@ -199,6 +205,7 @@ public interface ILookupCodeTest { default void verifyLookupWithConceptDesignation(final ConceptDesignation theConceptDesignation) { // setup LookupCodeResult result = new LookupCodeResult(); + result.setFound(true); result.getDesignations().add(theConceptDesignation); getCodeSystemProvider().setLookupCodeResult(result); diff --git a/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/IRemoteTerminologyLookupCodeTest.java b/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/IRemoteTerminologyLookupCodeTest.java new file mode 100644 index 00000000000..156ecea62fa --- /dev/null +++ 
b/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/IRemoteTerminologyLookupCodeTest.java @@ -0,0 +1,57 @@ +package org.hl7.fhir.common.hapi.validation; + +import ca.uhn.fhir.context.support.IValidationSupport.LookupCodeResult; +import ca.uhn.fhir.context.support.LookupCodeRequest; +import org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport; +import org.junit.jupiter.api.Test; + +import java.text.MessageFormat; + +/** + * Additional tests specific to the Remote Terminology $lookup operation. + * See the base interface for additional, implementation-agnostic tests. + */ +public interface IRemoteTerminologyLookupCodeTest extends ILookupCodeTest { + + String MESSAGE_RESPONSE_NOT_FOUND = "Code {0} was not found"; + String MESSAGE_RESPONSE_INVALID = "Code {0} lookup is missing a system"; + + @Override + RemoteTerminologyServiceValidationSupport getService(); + + @Test + default void lookupCode_forCodeSystemWithCodeNotFound_returnsNotFound() { + String baseUrl = getService().getBaseUrl(); + final String codeNotFound = "a"; + final String system = CODE_SYSTEM; + final String codeAndSystem = system + "#" + codeNotFound; + final String exceptionMessage = MessageFormat.format(MESSAGE_RESPONSE_NOT_FOUND, codeNotFound); + LookupCodeResult result = new LookupCodeResult() + .setFound(false) + .setSearchedForCode(codeNotFound) + .setSearchedForSystem(system) + .setErrorMessage("Unknown code \"" + codeAndSystem + "\". 
The Remote Terminology server " + baseUrl + " returned HTTP 404 Not Found: " + exceptionMessage); + getCodeSystemProvider().setLookupCodeResult(result); + + LookupCodeRequest request = new LookupCodeRequest(system, codeNotFound, null, null); + verifyLookupCodeResult(request, result); + } + + @Test + default void lookupCode_forCodeSystemWithInvalidRequest_returnsNotFound() { + String baseUrl = getService().getBaseUrl(); + final String codeNotFound = "a"; + final String system = null; + final String codeAndSystem = system + "#" + codeNotFound; + final String exceptionMessage = MessageFormat.format(MESSAGE_RESPONSE_INVALID, codeNotFound); + LookupCodeResult result = new LookupCodeResult() + .setFound(false) + .setSearchedForCode(codeNotFound) + .setSearchedForSystem(system) + .setErrorMessage("Unknown code \"" + codeAndSystem + "\". The Remote Terminology server " + baseUrl + " returned HTTP 400 Bad Request: " + exceptionMessage); + getCodeSystemProvider().setLookupCodeResult(result); + + LookupCodeRequest request = new LookupCodeRequest(system, codeNotFound, null, null); + verifyLookupCodeResult(request, result); + } +} diff --git a/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/support/ValidationSupportUtilsTest.java b/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/support/ValidationSupportUtilsTest.java new file mode 100644 index 00000000000..6233a0b46a0 --- /dev/null +++ b/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/support/ValidationSupportUtilsTest.java @@ -0,0 +1,185 @@ +package org.hl7.fhir.common.hapi.validation.support; + +import com.google.common.collect.Lists; +import org.hl7.fhir.r4.model.ValueSet; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.MethodSource; + +import java.util.Collections; +import java.util.List; +import java.util.stream.Stream; + 
+import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertNull; + +public class ValidationSupportUtilsTest { + + public static final String SYSTEM_URL = "http://hl7.org/fhir/ValueSet/administrative-gender"; + public static final String SYSTEM_VERSION = "3.0.2"; + public static final String SYSTEM_URL_2 = "http://hl7.org/fhir/ValueSet/other-valueset"; + public static final String SYSTEM_VERSION_2 = "4.0.1"; + public static final String VALUE_SET_URL = "http://value.set/url"; + public static final String CODE = "CODE"; + public static final String NOT_THE_CODE = "not-the-code"; + + @Test + public void extractCodeSystemForCode_nullValueSet_returnsNull() { + String result = ValidationSupportUtils.extractCodeSystemForCode(null, CODE); + + assertNull(result); + } + + private static Stream extractCodeSystemForCodeDSTU3TestCases() { + List conceptWithCode = Lists.newArrayList( + new org.hl7.fhir.dstu3.model.ValueSet.ConceptReferenceComponent().setCode(NOT_THE_CODE), + new org.hl7.fhir.dstu3.model.ValueSet.ConceptReferenceComponent().setCode(CODE)); + + List conceptNoCode = Lists.newArrayList( + new org.hl7.fhir.dstu3.model.ValueSet.ConceptReferenceComponent().setCode(NOT_THE_CODE)); + + return Stream.of( + Arguments.of(Collections.emptyList(), null, "Empty ValueSet includes"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent(), + new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent()), + null, "ValueSet includes without system"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent()), + null, "ValueSet include without system"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL)), + SYSTEM_URL, "ValueSet include with one system and no code"), + Arguments.of(Lists.newArrayList(new 
org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setConcept(conceptWithCode)), + SYSTEM_URL, "ValueSet include with one system and code"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setVersion(SYSTEM_VERSION)), + SYSTEM_URL + "|" + SYSTEM_VERSION, "ValueSet include with one versioned system and no code"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL), + new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2)), + null, "ValueSet includes with two systems and no code"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setConcept(conceptWithCode), + new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2)), + SYSTEM_URL, "ValueSet includes with two systems and correct code"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setVersion(SYSTEM_VERSION).setConcept(conceptWithCode), + new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2).setVersion(SYSTEM_VERSION_2)), + SYSTEM_URL + "|" + SYSTEM_VERSION, "ValueSet includes with two systems with versions and correct code"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setConcept(conceptNoCode), + new org.hl7.fhir.dstu3.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2)), + null, "ValueSet includes with two systems and different code")); + } + + @ParameterizedTest + @MethodSource("extractCodeSystemForCodeDSTU3TestCases") + public void extractCodeSystemForCodeDSTU3_withDifferentValueSetIncludes_returnsCorrectResult(List theValueSetComponents, + String theExpectedCodeSystem, String theMessage) { + // setup + org.hl7.fhir.dstu3.model.ValueSet valueSet = new 
org.hl7.fhir.dstu3.model.ValueSet(); + valueSet.setUrl(VALUE_SET_URL); + valueSet.setCompose(new org.hl7.fhir.dstu3.model.ValueSet.ValueSetComposeComponent().setInclude(theValueSetComponents)); + + // execute + String result = ValidationSupportUtils.extractCodeSystemForCode(valueSet, CODE); + + // validate + assertEquals(theExpectedCodeSystem, result, theMessage); + } + + private static Stream extractCodeSystemForCodeR4TestCases() { + List conceptWithCode = Lists.newArrayList( + new ValueSet.ConceptReferenceComponent().setCode(NOT_THE_CODE), + new ValueSet.ConceptReferenceComponent().setCode(CODE)); + + List conceptNoCode = Lists.newArrayList( + new ValueSet.ConceptReferenceComponent().setCode(NOT_THE_CODE)); + + return Stream.of( + Arguments.of(Collections.emptyList(), null, "Empty ValueSet includes"), + Arguments.of(Lists.newArrayList(new ValueSet.ConceptSetComponent(), new ValueSet.ConceptSetComponent()), + null, "ValueSet includes without system"), + Arguments.of(Lists.newArrayList(new ValueSet.ConceptSetComponent()), + null, "ValueSet include without system"), + Arguments.of(Lists.newArrayList(new ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL)), + SYSTEM_URL, "ValueSet include with one system and no code"), + Arguments.of(Lists.newArrayList(new ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setConcept(conceptWithCode)), + SYSTEM_URL, "ValueSet include with one system and code"), + Arguments.of(Lists.newArrayList(new ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setVersion(SYSTEM_VERSION)), + SYSTEM_URL + "|" + SYSTEM_VERSION, "ValueSet include with one versioned system and no code"), + Arguments.of(Lists.newArrayList(new ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL), + new ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2)), + null, "ValueSet includes with two systems and no code"), + Arguments.of(Lists.newArrayList(new ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setConcept(conceptWithCode), + new 
ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2)), + SYSTEM_URL, "ValueSet includes with two systems and correct code"), + Arguments.of(Lists.newArrayList(new ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setVersion(SYSTEM_VERSION).setConcept(conceptWithCode), + new ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2).setVersion(SYSTEM_VERSION_2)), + SYSTEM_URL + "|" + SYSTEM_VERSION, "ValueSet includes with two systems with versions and correct code"), + Arguments.of(Lists.newArrayList(new ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setConcept(conceptNoCode), + new ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2)), + null, "ValueSet includes with two systems and different code")); + } + + @ParameterizedTest + @MethodSource("extractCodeSystemForCodeR4TestCases") + public void extractCodeSystemForCodeR4_withDifferentValueSetIncludes_returnsCorrectResult(List theValueSetComponents, + String theExpectedCodeSystem, String theMessage) { + // setup + ValueSet valueSet = new ValueSet(); + valueSet.setUrl(VALUE_SET_URL); + valueSet.setCompose(new ValueSet.ValueSetComposeComponent().setInclude(theValueSetComponents)); + + // execute + String result = ValidationSupportUtils.extractCodeSystemForCode(valueSet, CODE); + + // validate + assertEquals(theExpectedCodeSystem, result, theMessage); + } + + private static Stream extractCodeSystemForCodeR5TestCases() { + List conceptWithCode = Lists.newArrayList( + new org.hl7.fhir.r5.model.ValueSet.ConceptReferenceComponent().setCode(NOT_THE_CODE), + new org.hl7.fhir.r5.model.ValueSet.ConceptReferenceComponent().setCode(CODE)); + + List conceptNoCode = Lists.newArrayList( + new org.hl7.fhir.r5.model.ValueSet.ConceptReferenceComponent().setCode(NOT_THE_CODE)); + + return Stream.of( + Arguments.of(Collections.emptyList(), null, "Empty ValueSet includes"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent(), + new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent()), + 
null, "ValueSet includes without system"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent()), + null, "ValueSet include without system"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL)), + SYSTEM_URL, "ValueSet include with one system and no code"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setConcept(conceptWithCode)), + SYSTEM_URL, "ValueSet include with one system and code"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setVersion(SYSTEM_VERSION)), + SYSTEM_URL + "|" + SYSTEM_VERSION, "ValueSet include with one versioned system and no code"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL), + new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2)), + null, "ValueSet includes with two systems and no code"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setConcept(conceptWithCode), + new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2)), + SYSTEM_URL, "ValueSet includes with two systems and correct code"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setVersion(SYSTEM_VERSION).setConcept(conceptWithCode), + new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2).setVersion(SYSTEM_VERSION_2)), + SYSTEM_URL + "|" + SYSTEM_VERSION, "ValueSet includes with two systems with versions and correct code"), + Arguments.of(Lists.newArrayList(new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL).setConcept(conceptNoCode), + new org.hl7.fhir.r5.model.ValueSet.ConceptSetComponent().setSystem(SYSTEM_URL_2)), + null, "ValueSet includes 
with two systems and different code")); + } + + @ParameterizedTest + @MethodSource("extractCodeSystemForCodeR5TestCases") + public void extractCodeSystemForCodeR5_withDifferentValueSetIncludes_returnsCorrectResult(List theValueSetComponents, + String theExpectedCodeSystem, String theMessage) { + // setup + org.hl7.fhir.r5.model.ValueSet valueSet = new org.hl7.fhir.r5.model.ValueSet(); + valueSet.setUrl(VALUE_SET_URL); + valueSet.setCompose(new org.hl7.fhir.r5.model.ValueSet.ValueSetComposeComponent().setInclude(theValueSetComponents)); + + // execute + String result = ValidationSupportUtils.extractCodeSystemForCode(valueSet, CODE); + + // validate + assertEquals(theExpectedCodeSystem, result, theMessage); + } +} diff --git a/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/validator/VersionSpecificWorkerContextWrapperTest.java b/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/validator/VersionSpecificWorkerContextWrapperTest.java index 37307d603df..ceccb4b5560 100644 --- a/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/validator/VersionSpecificWorkerContextWrapperTest.java +++ b/hapi-fhir-validation/src/test/java/org/hl7/fhir/common/hapi/validation/validator/VersionSpecificWorkerContextWrapperTest.java @@ -8,11 +8,17 @@ import ca.uhn.fhir.fhirpath.BaseValidationTestWithInlineMocks; import ca.uhn.fhir.i18n.HapiLocalizer; import ca.uhn.hapi.converters.canonical.VersionCanonicalizer; import org.hl7.fhir.r5.model.Resource; +import org.hl7.fhir.r5.model.ValueSet; +import org.hl7.fhir.utilities.validation.ValidationOptions; import org.junit.jupiter.api.Test; import org.mockito.quality.Strictness; import static org.assertj.core.api.Assertions.assertThat; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.eq; import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.times; +import static org.mockito.Mockito.verify; import static 
org.mockito.Mockito.when; import static org.mockito.Mockito.withSettings; @@ -68,6 +74,28 @@ public class VersionSpecificWorkerContextWrapperTest extends BaseValidationTestW wrapper.cacheResource(mock(Resource.class)); } + @Test + public void validateCode_normally_resolvesCodeSystemFromValueSet() { + // setup + IValidationSupport validationSupport = mockValidationSupport(); + ValidationSupportContext mockContext = mockValidationSupportContext(validationSupport); + VersionCanonicalizer versionCanonicalizer = new VersionCanonicalizer(FhirContext.forR5Cached()); + VersionSpecificWorkerContextWrapper wrapper = new VersionSpecificWorkerContextWrapper(mockContext, versionCanonicalizer); + + ValueSet valueSet = new ValueSet(); + valueSet.getCompose().addInclude().setSystem("http://codesystems.com/system").addConcept().setCode("code0"); + valueSet.getCompose().addInclude().setSystem("http://codesystems.com/system2").addConcept().setCode("code2"); + when(validationSupport.fetchResource(eq(ValueSet.class), eq("http://somevalueset"))).thenReturn(valueSet); + when(validationSupport.validateCodeInValueSet(any(), any(), any(), any(), any(), any())).thenReturn(new IValidationSupport.CodeValidationResult()); + + // execute + wrapper.validateCode(new ValidationOptions(), "code0", valueSet); + + // verify + verify(validationSupport, times(1)).validateCodeInValueSet(any(), any(), eq("http://codesystems.com/system"), eq("code0"), any(), any()); + verify(validationSupport, times(1)).validateCode(any(), any(), eq("http://codesystems.com/system"), eq("code0"), any(), any()); + } + private IValidationSupport mockValidationSupportWithTwoBinaries() { IValidationSupport validationSupport; validationSupport = mockValidationSupport(); diff --git a/hapi-fhir-validation/src/test/java/org/hl7/fhir/dstu3/hapi/validation/FhirInstanceValidatorDstu3Test.java b/hapi-fhir-validation/src/test/java/org/hl7/fhir/dstu3/hapi/validation/FhirInstanceValidatorDstu3Test.java index c809f8b200c..6817fd74dbd 
100644 --- a/hapi-fhir-validation/src/test/java/org/hl7/fhir/dstu3/hapi/validation/FhirInstanceValidatorDstu3Test.java +++ b/hapi-fhir-validation/src/test/java/org/hl7/fhir/dstu3/hapi/validation/FhirInstanceValidatorDstu3Test.java @@ -1266,7 +1266,8 @@ public class FhirInstanceValidatorDstu3Test extends BaseValidationTestWithInline ValidationResult output = myVal.validateWithResult(input); logResultsAndReturnAll(output); - assertThat(output.getMessages().get(0).getMessage()).contains("The value provided ('notvalidcode') was not found in the value set 'ObservationStatus'"); + assertThat(output.getMessages().get(0).getMessage()).contains("Unknown code 'http://hl7.org/fhir/observation-status#notvalidcode'"); + assertThat(output.getMessages().get(1).getMessage()).contains("The value provided ('notvalidcode') was not found in the value set 'ObservationStatus'"); } @Test diff --git a/hapi-fhir-validation/src/test/java/org/hl7/fhir/dstu3/hapi/validation/RemoteTerminologyLookupCodeDstu3Test.java b/hapi-fhir-validation/src/test/java/org/hl7/fhir/dstu3/hapi/validation/RemoteTerminologyLookupCodeDstu3Test.java index 6269733654e..7bdf0c42518 100644 --- a/hapi-fhir-validation/src/test/java/org/hl7/fhir/dstu3/hapi/validation/RemoteTerminologyLookupCodeDstu3Test.java +++ b/hapi-fhir-validation/src/test/java/org/hl7/fhir/dstu3/hapi/validation/RemoteTerminologyLookupCodeDstu3Test.java @@ -9,9 +9,11 @@ import ca.uhn.fhir.rest.annotation.Operation; import ca.uhn.fhir.rest.annotation.OperationParam; import ca.uhn.fhir.rest.api.server.RequestDetails; import ca.uhn.fhir.rest.client.interceptor.LoggingInterceptor; +import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException; +import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException; import ca.uhn.fhir.test.utilities.server.RestfulServerExtension; import jakarta.servlet.http.HttpServletRequest; -import org.hl7.fhir.common.hapi.validation.ILookupCodeTest; +import 
org.hl7.fhir.common.hapi.validation.IRemoteTerminologyLookupCodeTest; import org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport; import org.hl7.fhir.dstu3.model.BooleanType; import org.hl7.fhir.dstu3.model.CodeSystem; @@ -33,6 +35,7 @@ import org.junit.jupiter.params.ParameterizedTest; import org.junit.jupiter.params.provider.Arguments; import org.junit.jupiter.params.provider.MethodSource; +import java.text.MessageFormat; import java.util.Calendar; import java.util.List; import java.util.stream.Stream; @@ -41,7 +44,7 @@ import java.util.stream.Stream; * Version specific tests for CodeSystem $lookup against RemoteTerminologyValidationSupport. * @see RemoteTerminologyServiceValidationSupport */ -public class RemoteTerminologyLookupCodeDstu3Test implements ILookupCodeTest { +public class RemoteTerminologyLookupCodeDstu3Test implements IRemoteTerminologyLookupCodeTest { private static final FhirContext ourCtx = FhirContext.forDstu3Cached(); @RegisterExtension public static RestfulServerExtension ourRestfulServerExtension = new RestfulServerExtension(ourCtx); @@ -181,6 +184,12 @@ public class RemoteTerminologyLookupCodeDstu3Test implements ILookupCodeTest { ) { myCode = theCode; mySystemUrl = theSystem; + if (theSystem == null) { + throw new InvalidRequestException(MessageFormat.format(MESSAGE_RESPONSE_INVALID, theCode)); + } + if (!myLookupCodeResult.isFound()) { + throw new ResourceNotFoundException(MessageFormat.format(MESSAGE_RESPONSE_NOT_FOUND, theCode)); + } return myLookupCodeResult.toParameters(theRequestDetails.getFhirContext(), thePropertyNames); } diff --git a/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/FhirInstanceValidatorR4Test.java b/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/FhirInstanceValidatorR4Test.java index 30d6c4dfd47..5b9d9fc6ad5 100644 --- a/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/FhirInstanceValidatorR4Test.java +++ 
b/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/FhirInstanceValidatorR4Test.java @@ -45,6 +45,7 @@ import org.hl7.fhir.r4.model.CodeType; import org.hl7.fhir.r4.model.Consent; import org.hl7.fhir.r4.model.ContactPoint; import org.hl7.fhir.r4.model.DateTimeType; +import org.hl7.fhir.r4.model.Enumerations; import org.hl7.fhir.r4.model.Extension; import org.hl7.fhir.r4.model.IntegerType; import org.hl7.fhir.r4.model.Media; @@ -107,6 +108,7 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import static org.mockito.ArgumentMatchers.any; import static org.mockito.ArgumentMatchers.anyBoolean; import static org.mockito.ArgumentMatchers.anyString; +import static org.mockito.ArgumentMatchers.eq; import static org.mockito.ArgumentMatchers.nullable; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.times; @@ -126,6 +128,7 @@ public class FhirInstanceValidatorR4Test extends BaseValidationTestWithInlineMoc private FhirValidator myFhirValidator; private ArrayList myValidConcepts; private Set myValidSystems = new HashSet<>(); + private Set myValidValueSets = new HashSet<>(); private Map myStructureDefinitionMap = new HashMap<>(); private CachingValidationSupport myValidationSupport; private IValidationSupport myMockSupport; @@ -135,6 +138,10 @@ public class FhirInstanceValidatorR4Test extends BaseValidationTestWithInlineMoc myValidConcepts.add(theSystem + "___" + theCode); } + private void addValidValueSet(String theValueSetUrl) { + myValidValueSets.add(theValueSetUrl); + } + /** * An invalid local reference should not cause a ServiceException. 
*/ @@ -266,6 +273,16 @@ public class FhirInstanceValidatorR4Test extends BaseValidationTestWithInlineMoc } }); + when(myMockSupport.isValueSetSupported(any(), nullable(String.class))).thenAnswer(new Answer() { + @Override + public Boolean answer(InvocationOnMock theInvocation) { + String argument = theInvocation.getArgument(1, String.class); + boolean retVal = myValidValueSets.contains(argument); + ourLog.debug("isValueSetSupported({}) : {}", argument, retVal); + return retVal; + } + }); + when(myMockSupport.validateCode(any(), any(), nullable(String.class), nullable(String.class), nullable(String.class), nullable(String.class))).thenAnswer(new Answer() { @Override public IValidationSupport.CodeValidationResult answer(InvocationOnMock theInvocation) { @@ -855,6 +872,25 @@ public class FhirInstanceValidatorR4Test extends BaseValidationTestWithInlineMoc assertTrue(output.isSuccessful()); } + @Test + public void testValidate_patientWithGenderCode_resolvesCodeSystemFromValueSet() { + // setup + Patient patient = new Patient(); + patient.setGender(Enumerations.AdministrativeGender.MALE); + + addValidConcept("http://hl7.org/fhir/administrative-gender", "male"); + addValidValueSet("http://hl7.org/fhir/ValueSet/administrative-gender"); + + // execute + ValidationResult output = myFhirValidator.validateWithResult(patient); + logResultsAndReturnNonInformationalOnes(output); + + // verify + assertTrue(output.isSuccessful()); + verify(myMockSupport, times(1)).validateCodeInValueSet(any(), any(), eq("http://hl7.org/fhir/administrative-gender"), eq("male"), any(), any()); + verify(myMockSupport, times(1)).validateCode(any(), any(), eq("http://hl7.org/fhir/administrative-gender"), eq("male"), any(), any()); + } + @Test public void testValidateProfileWithExtension() throws IOException, FHIRException { PrePopulatedValidationSupport valSupport = new PrePopulatedValidationSupport(ourCtx); @@ -1298,7 +1334,8 @@ public class FhirInstanceValidatorR4Test extends 
BaseValidationTestWithInlineMoc "
"; ValidationResult output = myFhirValidator.validateWithResult(input); logResultsAndReturnAll(output); - assertEquals("The value provided ('notvalidcode') was not found in the value set 'ObservationStatus' (http://hl7.org/fhir/ValueSet/observation-status|4.0.1), and a code is required from this value set (error message = Unknown code 'notvalidcode' for in-memory expansion of ValueSet 'http://hl7.org/fhir/ValueSet/observation-status')", output.getMessages().get(0).getMessage()); + assertThat(output.getMessages().get(0).getMessage()).contains("Unknown code 'http://hl7.org/fhir/observation-status#notvalidcode'"); + assertThat(output.getMessages().get(1).getMessage()).contains("The value provided ('notvalidcode') was not found in the value set 'ObservationStatus' (http://hl7.org/fhir/ValueSet/observation-status|4.0.1), and a code is required from this value set (error message = Unknown code 'http://hl7.org/fhir/observation-status#notvalidcode' for in-memory expansion of ValueSet 'http://hl7.org/fhir/ValueSet/observation-status')"); } @Test @@ -1488,10 +1525,18 @@ public class FhirInstanceValidatorR4Test extends BaseValidationTestWithInlineMoc input.getValueQuantity().setCode("Heck"); output = myFhirValidator.validateWithResult(input); all = logResultsAndReturnNonInformationalOnes(output); - assertThat(all).hasSize(2); + assertThat(all).hasSize(3); + // validate first error, it has similar message as second error, but location is different + // as in R4 (as opposed to R4B/R5) Observation.value.ofType(Quantity) has no ValueSet binding, + // there is no `Unknown code for ValueSet` error message assertThat(all.get(0).getMessage()).contains("Error processing unit 'Heck': The unit 'Heck' is unknown' at position 0 (for 'http://unitsofmeasure.org#Heck')"); - assertThat(all.get(1).getMessage()).contains("The value provided ('Heck') was not found in the value set 'Body Temperature Units'"); - + 
assertThat(all.get(0).getLocationString()).contains("Observation.value.ofType(Quantity)"); + // validate second error, it has similar message as first error, but location is different + assertThat(all.get(1).getMessage()).contains("Error processing unit 'Heck': The unit 'Heck' is unknown' at position 0 (for 'http://unitsofmeasure.org#Heck')"); + assertThat(all.get(1).getLocationString()).contains("Observation.value.ofType(Quantity).code"); + // validate third error + assertThat(all.get(2).getMessage()).contains("The value provided ('Heck') was not found in the value set 'Body Temperature Units'"); + assertThat(all.get(2).getLocationString()).contains("Observation.value.ofType(Quantity).code"); } @Test @@ -1600,9 +1645,10 @@ public class FhirInstanceValidatorR4Test extends BaseValidationTestWithInlineMoc "}"; ValidationResult output = myFhirValidator.validateWithResult(input); List errors = logResultsAndReturnNonInformationalOnes(output); - assertThat(errors.size()).as(errors.toString()).isEqualTo(2); - assertThat(errors.get(1).getMessage()).contains("The value provided ('BLAH') was not found in the value set 'CurrencyCode' (http://hl7.org/fhir/ValueSet/currencies|4.0.1)"); - assertThat(errors.get(1).getMessage()).contains("error message = Unknown code \"urn:iso:std:iso:4217#BLAH\""); + assertThat(errors.size()).as(errors.toString()).isEqualTo(3); + assertThat(errors.get(1).getMessage()).contains("Unknown code 'urn:iso:std:iso:4217#BLAH'"); + assertThat(errors.get(2).getMessage()).contains("The value provided ('BLAH') was not found in the value set 'CurrencyCode' (http://hl7.org/fhir/ValueSet/currencies|4.0.1)"); + assertThat(errors.get(2).getMessage()).contains("error message = Unknown code \"urn:iso:std:iso:4217#BLAH\""); } diff --git a/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/RemoteTerminologyLookupCodeR4Test.java b/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/RemoteTerminologyLookupCodeR4Test.java index 
ec6c018bb5f..3fc83b2042a 100644 --- a/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/RemoteTerminologyLookupCodeR4Test.java +++ b/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/RemoteTerminologyLookupCodeR4Test.java @@ -7,9 +7,11 @@ import ca.uhn.fhir.rest.annotation.Operation; import ca.uhn.fhir.rest.annotation.OperationParam; import ca.uhn.fhir.rest.api.server.RequestDetails; import ca.uhn.fhir.rest.client.interceptor.LoggingInterceptor; +import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException; +import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException; import ca.uhn.fhir.test.utilities.server.RestfulServerExtension; import jakarta.servlet.http.HttpServletRequest; -import org.hl7.fhir.common.hapi.validation.ILookupCodeTest; +import org.hl7.fhir.common.hapi.validation.IRemoteTerminologyLookupCodeTest; import org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport; import org.hl7.fhir.instance.model.api.IBaseDatatype; import org.hl7.fhir.instance.model.api.IBaseParameters; @@ -31,6 +33,7 @@ import org.junit.jupiter.params.ParameterizedTest; import org.junit.jupiter.params.provider.Arguments; import org.junit.jupiter.params.provider.MethodSource; +import java.text.MessageFormat; import java.util.Calendar; import java.util.List; import java.util.stream.Stream; @@ -42,7 +45,7 @@ import static ca.uhn.fhir.context.support.IValidationSupport.LookupCodeResult; * Version specific tests for CodeSystem $lookup against RemoteTerminologyValidationSupport. 
* @see RemoteTerminologyServiceValidationSupport */ -public class RemoteTerminologyLookupCodeR4Test implements ILookupCodeTest { +public class RemoteTerminologyLookupCodeR4Test implements IRemoteTerminologyLookupCodeTest { private static final FhirContext ourCtx = FhirContext.forR4Cached(); @RegisterExtension public static RestfulServerExtension ourRestfulServerExtension = new RestfulServerExtension(ourCtx); @@ -181,6 +184,12 @@ public class RemoteTerminologyLookupCodeR4Test implements ILookupCodeTest { ) { myCode = theCode; mySystemUrl = theSystem; + if (theSystem == null) { + throw new InvalidRequestException(MessageFormat.format(MESSAGE_RESPONSE_INVALID, theCode)); + } + if (!myLookupCodeResult.isFound()) { + throw new ResourceNotFoundException(MessageFormat.format(MESSAGE_RESPONSE_NOT_FOUND, theCode)); + } return myLookupCodeResult.toParameters(theRequestDetails.getFhirContext(), thePropertyNames); } @Override diff --git a/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/RemoteTerminologyServiceValidationSupportR4Test.java b/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/RemoteTerminologyServiceValidationSupportR4Test.java index a21b3da89c2..bfc82a10978 100644 --- a/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/RemoteTerminologyServiceValidationSupportR4Test.java +++ b/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4/validation/RemoteTerminologyServiceValidationSupportR4Test.java @@ -21,6 +21,8 @@ import ca.uhn.fhir.rest.client.api.IHttpResponse; import ca.uhn.fhir.rest.client.interceptor.LoggingInterceptor; import ca.uhn.fhir.rest.param.UriParam; import ca.uhn.fhir.rest.server.IResourceProvider; +import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException; +import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException; import ca.uhn.fhir.test.utilities.server.RestfulServerExtension; import ca.uhn.fhir.util.ParametersUtil; import com.google.common.collect.Lists; @@ -45,11 +47,15 @@ import 
org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Nested; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.RegisterExtension; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.MethodSource; import java.io.IOException; import java.util.ArrayList; import java.util.Collections; import java.util.List; +import java.util.stream.Stream; import static org.assertj.core.api.Assertions.assertThat; import static org.junit.jupiter.api.Assertions.assertEquals; @@ -119,6 +125,52 @@ public class RemoteTerminologyServiceValidationSupportR4Test extends BaseValidat assertNull(outcome); } + public static Stream getRemoteTerminologyServerResponses() { + return Stream.of( + Arguments.of(new ResourceNotFoundException("System Not Present"), "404 Not Found: System Not Present", + "Unknown code \"null#CODE\". The Remote Terminology server", null, null), + Arguments.of(new InvalidRequestException("Invalid Request"), "400 Bad Request: Invalid Request", + "Unknown code \"null#CODE\". The Remote Terminology server", null, null), + Arguments.of(new ResourceNotFoundException("System Not Present"), "404 Not Found: System Not Present", + "Unknown code \"NotFoundSystem#CODE\". The Remote Terminology server", "NotFoundSystem", null), + Arguments.of(new InvalidRequestException("Invalid Request"), "400 Bad Request: Invalid Request", + "Unknown code \"InvalidSystem#CODE\". The Remote Terminology server", "InvalidSystem", null), + Arguments.of(new ResourceNotFoundException("System Not Present"), "404 Not Found: System Not Present", + "Unknown code \"null#CODE\" for ValueSet with URL \"NotFoundValueSetUrl\". The Remote Terminology server", + null, "NotFoundValueSetUrl"), + Arguments.of(new InvalidRequestException("Invalid Request"), "400 Bad Request: Invalid Request", + "Unknown code \"null#CODE\" for ValueSet with URL \"InvalidValueSetUrl\". 
The Remote Terminology server", null, "InvalidValueSetUrl"), + Arguments.of(new ResourceNotFoundException("System Not Present"), "404 Not Found: System Not Present", + "Unknown code \"NotFoundSystem#CODE\" for ValueSet with URL \"NotFoundValueSetUrl\". The Remote Terminology server", + "NotFoundSystem", "NotFoundValueSetUrl"), + Arguments.of(new InvalidRequestException("Invalid Request"), "400 Bad Request: Invalid Request", + "Unknown code \"InvalidSystem#CODE\" for ValueSet with URL \"InvalidValueSetUrl\". The Remote Terminology server", "InvalidSystem", "InvalidValueSetUrl") + ); + } + + @ParameterizedTest + @MethodSource(value = "getRemoteTerminologyServerResponses") + public void testValidateCode_codeSystemAndValueSetUrlAreIncorrect_returnsValidationResultWithError(Exception theException, + String theServerMessage, + String theValidationMessage, + String theCodeSystem, + String theValueSetUrl) { + myCodeSystemProvider.myNextValidateCodeException = theException; + myValueSetProvider.myNextValidateCodeException = theException; + IValidationSupport.CodeValidationResult outcome = mySvc.validateCode(null, null, theCodeSystem, CODE, DISPLAY, theValueSetUrl); + + validateValidationErrorResult(outcome, theValidationMessage, theServerMessage); + } + + private static void validateValidationErrorResult(IValidationSupport.CodeValidationResult outcome, String... 
theMessages) { + assertNotNull(outcome); + assertEquals(IValidationSupport.IssueSeverity.ERROR, outcome.getSeverity()); + assertNotNull(outcome.getMessage()); + for (String message : theMessages) { + assertTrue(outcome.getMessage().contains(message)); + } + } + @Test public void testValidateCode_forValueSet_returnsCorrectly() { createNextValueSetReturnParameters(true, DISPLAY, null); @@ -209,20 +261,6 @@ public class RemoteTerminologyServiceValidationSupportR4Test extends BaseValidat assertNull(myValueSetProvider.myLastValueSet); } - /** - * Remote terminology services shouldn't be used to validate codes with an implied system - */ - @Test - public void testValidateCodeInValueSet_InferSystem() { - createNextValueSetReturnParameters(true, DISPLAY, null); - - ValueSet valueSet = new ValueSet(); - valueSet.setUrl(VALUE_SET_URL); - - IValidationSupport.CodeValidationResult outcome = mySvc.validateCodeInValueSet(null, new ConceptValidationOptions().setInferSystem(true), null, CODE, DISPLAY, valueSet); - assertNull(outcome); - } - @Test public void testTranslateCode_AllInParams_AllOutParams() { myConceptMapProvider.myNextReturnParams = new Parameters(); @@ -342,32 +380,46 @@ public class RemoteTerminologyServiceValidationSupportR4Test extends BaseValidat IValidationSupport.CodeValidationResult outcome = mySvc.validateCodeInValueSet(null, new ConceptValidationOptions().setInferSystem(true), null, CODE, DISPLAY, valueSet); - // validate service doesn't do early return (as when no code system is present) + // validate service doesn't return error message (as when no code system is present) assertNotNull(outcome); + assertNull(outcome.getMessage()); + assertTrue(outcome.isOk()); } @Nested public class MultiComposeIncludeValueSet { - @Test - public void SystemNotPresentReturnsNull() { + public static Stream getRemoteTerminologyServerExceptions() { + return Stream.of( + Arguments.of(new ResourceNotFoundException("System Not Present"), "404 Not Found: System Not Present"), + 
Arguments.of(new InvalidRequestException("Invalid Request"), "400 Bad Request: Invalid Request") + ); + } + + @ParameterizedTest + @MethodSource(value = "getRemoteTerminologyServerExceptions") + public void systemNotPresent_returnsValidationResultWithError(Exception theException, String theServerMessage) { + myValueSetProvider.myNextValidateCodeException = theException; createNextValueSetReturnParameters(true, DISPLAY, null); ValueSet valueSet = new ValueSet(); valueSet.setUrl(VALUE_SET_URL); valueSet.setCompose(new ValueSet.ValueSetComposeComponent().setInclude( - Lists.newArrayList(new ValueSet.ConceptSetComponent(), new ValueSet.ConceptSetComponent()) )); + Lists.newArrayList(new ValueSet.ConceptSetComponent(), new ValueSet.ConceptSetComponent()))); IValidationSupport.CodeValidationResult outcome = mySvc.validateCodeInValueSet(null, new ConceptValidationOptions().setInferSystem(true), null, CODE, DISPLAY, valueSet); - assertNull(outcome); + String unknownCodeForValueSetError = "Unknown code \"null#CODE\" for ValueSet with URL \"http://value.set/url\". 
The Remote Terminology server http://"; + validateValidationErrorResult(outcome, unknownCodeForValueSetError, theServerMessage); } - @Test - public void SystemPresentCodeNotPresentReturnsNull() { + @ParameterizedTest + @MethodSource(value = "getRemoteTerminologyServerExceptions") + public void systemPresentCodeNotPresent_returnsValidationResultWithError(Exception theException, String theServerMessage) { + myValueSetProvider.myNextValidateCodeException = theException; createNextValueSetReturnParameters(true, DISPLAY, null); ValueSet valueSet = new ValueSet(); @@ -377,12 +429,13 @@ public class RemoteTerminologyServiceValidationSupportR4Test extends BaseValidat valueSet.setCompose(new ValueSet.ValueSetComposeComponent().setInclude( Lists.newArrayList( new ValueSet.ConceptSetComponent().setSystem(systemUrl), - new ValueSet.ConceptSetComponent().setSystem(systemUrl2)) )); + new ValueSet.ConceptSetComponent().setSystem(systemUrl2)))); IValidationSupport.CodeValidationResult outcome = mySvc.validateCodeInValueSet(null, new ConceptValidationOptions().setInferSystem(true), null, CODE, DISPLAY, valueSet); - assertNull(outcome); + String unknownCodeForValueSetError = "Unknown code \"null#CODE\" for ValueSet with URL \"http://value.set/url\". 
The Remote Terminology server http://"; + validateValidationErrorResult(outcome, unknownCodeForValueSetError, theServerMessage); } @@ -526,6 +579,7 @@ public class RemoteTerminologyServiceValidationSupportR4Test extends BaseValidat static private class MyCodeSystemProvider implements IResourceProvider { private SummaryEnum myLastSummaryParam; + private Exception myNextValidateCodeException; private UriParam myLastUrlParam; private List myNextReturnCodeSystems; private UriType mySystemUrl; @@ -548,9 +602,12 @@ public class RemoteTerminologyServiceValidationSupportR4Test extends BaseValidat @OperationParam(name = "url", min = 0, max = 1) UriType theSystem, @OperationParam(name = "code", min = 0, max = 1) CodeType theCode, @OperationParam(name = "display", min = 0, max = 1) StringType theDisplay - ) { + ) throws Exception { myCode = theCode; mySystemUrl = theSystem; + if (myNextValidateCodeException != null) { + throw myNextValidateCodeException; + } return myNextValidationResult.toParameters(ourCtx); } @@ -566,6 +623,7 @@ public class RemoteTerminologyServiceValidationSupportR4Test extends BaseValidat private static class MyValueSetProvider implements IResourceProvider { private Parameters myNextReturnParams; + private Exception myNextValidateCodeException; private List myNextReturnValueSets; private UriType myLastUrl; private CodeType myLastCode; @@ -589,13 +647,16 @@ public class RemoteTerminologyServiceValidationSupportR4Test extends BaseValidat @OperationParam(name = "system", min = 0, max = 1) UriType theSystem, @OperationParam(name = "display", min = 0, max = 1) StringType theDisplay, @OperationParam(name = "valueSet") ValueSet theValueSet - ) { + ) throws Exception { myInvocationCount++; myLastUrl = theValueSetUrl; myLastCode = theCode; myLastSystem = theSystem; myLastDisplay = theDisplay; myLastValueSet = theValueSet; + if (myNextValidateCodeException != null) { + throw myNextValidateCodeException; + } return myNextReturnParams; } diff --git 
a/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4b/validation/FhirInstanceValidatorR4BTest.java b/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4b/validation/FhirInstanceValidatorR4BTest.java index 1b49f72075b..ca55a41bf4c 100644 --- a/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4b/validation/FhirInstanceValidatorR4BTest.java +++ b/hapi-fhir-validation/src/test/java/org/hl7/fhir/r4b/validation/FhirInstanceValidatorR4BTest.java @@ -1216,7 +1216,8 @@ public class FhirInstanceValidatorR4BTest extends BaseValidationTestWithInlineMo "
"; ValidationResult output = myFhirValidator.validateWithResult(input); logResultsAndReturnAll(output); - assertEquals("The value provided ('notvalidcode') was not found in the value set 'ObservationStatus' (http://hl7.org/fhir/ValueSet/observation-status|4.3.0), and a code is required from this value set (error message = Unknown code 'notvalidcode' for in-memory expansion of ValueSet 'http://hl7.org/fhir/ValueSet/observation-status')", output.getMessages().get(0).getMessage()); + assertThat(output.getMessages().get(0).getMessage()).contains("Unknown code 'http://hl7.org/fhir/observation-status#notvalidcode'"); + assertThat(output.getMessages().get(1).getMessage()).contains("The value provided ('notvalidcode') was not found in the value set 'ObservationStatus' (http://hl7.org/fhir/ValueSet/observation-status|4.3.0), and a code is required from this value set (error message = Unknown code 'http://hl7.org/fhir/observation-status#notvalidcode' for in-memory expansion of ValueSet 'http://hl7.org/fhir/ValueSet/observation-status')"); } @Test @@ -1366,10 +1367,19 @@ public class FhirInstanceValidatorR4BTest extends BaseValidationTestWithInlineMo input.getValueQuantity().setCode("Heck"); output = myFhirValidator.validateWithResult(input); all = logResultsAndReturnNonInformationalOnes(output); - assertThat(all).hasSize(2); - assertThat(all.get(0).getMessage()).contains("The Coding provided (http://unitsofmeasure.org#Heck) was not found in the value set 'Vital Signs Units' (http://hl7.org/fhir/ValueSet/ucum-vitals-common|4.3.0)"); - assertThat(all.get(1).getMessage()).contains("The value provided ('Heck') was not found in the value set 'Body Temperature Units'"); - + assertThat(all).hasSize(3); + // validate first error, in R4B (as opposed to R4) Observation.value.ofType(Quantity) has ValueSet binding, + // so first error has `Unknown code for ValueSet` error message + assertThat(all.get(0).getMessage()).contains("The Coding provided (http://unitsofmeasure.org#Heck) was not 
found in the value set 'Vital Signs Units' " + + "(http://hl7.org/fhir/ValueSet/ucum-vitals-common|4.3.0), and a code should come from this value set unless it has no suitable code (note that the validator cannot judge what is suitable). " + + " (error message = Unknown code 'http://unitsofmeasure.org#Heck' for in-memory expansion of ValueSet 'http://hl7.org/fhir/ValueSet/ucum-vitals-common')"); + assertThat(all.get(0).getLocationString()).contains("Observation.value.ofType(Quantity)"); + // validate second error + assertThat(all.get(1).getMessage()).contains("Error processing unit 'Heck': The unit 'Heck' is unknown' at position 0 (for 'http://unitsofmeasure.org#Heck')"); + assertThat(all.get(1).getLocationString()).contains("Observation.value.ofType(Quantity).code"); + // validate third error + assertThat(all.get(2).getMessage()).contains("The value provided ('Heck') was not found in the value set 'Body Temperature Units'"); + assertThat(all.get(2).getLocationString()).contains("Observation.value.ofType(Quantity).code"); } @Test @@ -1480,8 +1490,9 @@ public class FhirInstanceValidatorR4BTest extends BaseValidationTestWithInlineMo }"""; ValidationResult output = myFhirValidator.validateWithResult(input); List errors = logResultsAndReturnNonInformationalOnes(output); - assertThat(errors.size()).as(errors.toString()).isEqualTo(2); - assertThat(errors.get(1).getMessage()).contains("The value provided ('BLAH') was not found in the value set 'CurrencyCode' (http://hl7.org/fhir/ValueSet/currencies|4.3.0), and a code is required from this value set"); + assertThat(errors.size()).as(errors.toString()).isEqualTo(3); + assertThat(errors.get(1).getMessage()).contains("Unknown code 'urn:iso:std:iso:4217#BLAH'"); + assertThat(errors.get(2).getMessage()).contains("The value provided ('BLAH') was not found in the value set 'CurrencyCode' (http://hl7.org/fhir/ValueSet/currencies|4.3.0), and a code is required from this value set"); } diff --git 
a/hapi-fhir-validation/src/test/java/org/hl7/fhir/r5/validation/FhirInstanceValidatorR5Test.java b/hapi-fhir-validation/src/test/java/org/hl7/fhir/r5/validation/FhirInstanceValidatorR5Test.java index 7068b120d29..f47af042ce1 100644 --- a/hapi-fhir-validation/src/test/java/org/hl7/fhir/r5/validation/FhirInstanceValidatorR5Test.java +++ b/hapi-fhir-validation/src/test/java/org/hl7/fhir/r5/validation/FhirInstanceValidatorR5Test.java @@ -894,7 +894,8 @@ public class FhirInstanceValidatorR5Test extends BaseValidationTestWithInlineMoc ""; ValidationResult output = myVal.validateWithResult(input); logResultsAndReturnAll(output); - assertThat(output.getMessages().get(0).getMessage()).contains("The value provided ('notvalidcode') was not found in the value set 'Observation Status' (http://hl7.org/fhir/ValueSet/observation-status|5.0.0), and a code is required from this value set (error message = Unknown code 'notvalidcode' for in-memory expansion of ValueSet 'http://hl7.org/fhir/ValueSet/observation-status')"); + assertThat(output.getMessages().get(0).getMessage()).contains("Unknown code 'http://hl7.org/fhir/observation-status#notvalidcode'"); + assertThat(output.getMessages().get(1).getMessage()).contains("The value provided ('notvalidcode') was not found in the value set 'Observation Status' (http://hl7.org/fhir/ValueSet/observation-status|5.0.0), and a code is required from this value set (error message = Unknown code 'http://hl7.org/fhir/observation-status#notvalidcode' for in-memory expansion of ValueSet 'http://hl7.org/fhir/ValueSet/observation-status')"); } diff --git a/pom.xml b/pom.xml index 57fd5637a87..fd5c167993d 100644 --- a/pom.xml +++ b/pom.xml @@ -1144,7 +1144,7 @@ com.graphql-java graphql-java - 21.0 + 21.5 From 3dd576bc86a2ce471781aa7805196ebbee0ae9f0 Mon Sep 17 00:00:00 2001 From: Michael Buckley Date: Fri, 16 Aug 2024 14:55:58 -0400 Subject: [PATCH 2/3] Extend default batch2 test deadline (#6221) Extend default batch2 test deadline 10s is very quick on 
slow runners. Fix bad test. --- .../fhir/jpa/batch2/Batch2CoordinatorIT.java | 2 +- .../ca/uhn/fhir/jpa/test/Batch2JobHelper.java | 41 ++++++++++--------- 2 files changed, 23 insertions(+), 20 deletions(-) diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2CoordinatorIT.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2CoordinatorIT.java index 0c40caa80c6..c2ccbf7e5a0 100644 --- a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2CoordinatorIT.java +++ b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/batch2/Batch2CoordinatorIT.java @@ -763,7 +763,7 @@ public class Batch2CoordinatorIT extends BaseJpaR4Test { myFirstStepLatch.awaitExpected(); // validate - myBatch2JobHelper.awaitJobHasStatusWithForcedMaintenanceRuns(instanceId, StatusEnum.IN_PROGRESS); + myBatch2JobHelper.awaitJobHasStatusWithForcedMaintenanceRuns(instanceId, StatusEnum.ERRORED); // execute ourLog.info("Cancel job {}", instanceId); diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/Batch2JobHelper.java b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/Batch2JobHelper.java index 429e3ee5e39..f6758f8f444 100644 --- a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/Batch2JobHelper.java +++ b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/test/Batch2JobHelper.java @@ -25,6 +25,7 @@ import ca.uhn.fhir.batch2.api.IJobPersistence; import ca.uhn.fhir.batch2.model.JobInstance; import ca.uhn.fhir.batch2.model.StatusEnum; import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse; +import com.google.common.annotations.VisibleForTesting; import org.awaitility.Awaitility; import org.awaitility.core.ConditionTimeoutException; import org.slf4j.Logger; @@ -50,6 +51,8 @@ public class Batch2JobHelper { private static final Logger ourLog = LoggerFactory.getLogger(Batch2JobHelper.class); private static final int BATCH_SIZE = 100; + 
public static final int DEFAULT_WAIT_DEADLINE = 30; + public static final Duration DEFAULT_WAIT_DURATION = Duration.of(DEFAULT_WAIT_DEADLINE, ChronoUnit.SECONDS); private final IJobMaintenanceService myJobMaintenanceService; private final IJobCoordinator myJobCoordinator; @@ -82,11 +85,11 @@ public class Batch2JobHelper { } public JobInstance awaitJobHasStatus(String theInstanceId, StatusEnum... theExpectedStatus) { - return awaitJobHasStatus(theInstanceId, 30, theExpectedStatus); + return awaitJobHasStatus(theInstanceId, DEFAULT_WAIT_DEADLINE, theExpectedStatus); } public JobInstance awaitJobHasStatusWithoutMaintenancePass(String theInstanceId, StatusEnum... theExpectedStatus) { - return awaitJobawaitJobHasStatusWithoutMaintenancePass(theInstanceId, 10, theExpectedStatus); + return awaitJobawaitJobHasStatusWithoutMaintenancePass(theInstanceId, DEFAULT_WAIT_DEADLINE, theExpectedStatus); } public JobInstance awaitJobHasStatus(String theInstanceId, int theSecondsToWait, StatusEnum... theExpectedStatus) { @@ -144,7 +147,7 @@ public class Batch2JobHelper { return myJobCoordinator.getInstance(theBatchJobId); } - private boolean checkStatusWithMaintenancePass(String theInstanceId, StatusEnum... theExpectedStatuses) throws InterruptedException { + private boolean checkStatusWithMaintenancePass(String theInstanceId, StatusEnum... theExpectedStatuses) { if (hasStatus(theInstanceId, theExpectedStatuses)) { return true; } @@ -170,37 +173,41 @@ public class Batch2JobHelper { return awaitJobHasStatus(theInstanceId, StatusEnum.ERRORED, StatusEnum.FAILED); } - public void awaitJobHasStatusWithForcedMaintenanceRuns(String theInstanceId, StatusEnum theStatusEnum) { + public void awaitJobHasStatusWithForcedMaintenanceRuns(String theInstanceId, StatusEnum... 
theStatusEnums) { AtomicInteger counter = new AtomicInteger(); + Duration waitDuration = DEFAULT_WAIT_DURATION; try { await() - .atMost(Duration.of(10, ChronoUnit.SECONDS)) + .atMost(waitDuration) .until(() -> { counter.getAndIncrement(); forceRunMaintenancePass(); - return hasStatus(theInstanceId, theStatusEnum); + return hasStatus(theInstanceId, theStatusEnums); }); } catch (ConditionTimeoutException ex) { StatusEnum status = getStatus(theInstanceId); - String msg = String.format( - "Job %s has state %s after 10s timeout and %d checks", + fail(String.format( + "Job %s has state %s after %s timeout and %d checks", theInstanceId, status.name(), + waitDuration, counter.get() - ); + ), ex); } } public void awaitJobInProgress(String theInstanceId) { + Duration waitDuration = DEFAULT_WAIT_DURATION; try { await() - .atMost(Duration.of(10, ChronoUnit.SECONDS)) + .atMost(waitDuration) .until(() -> checkStatusWithMaintenancePass(theInstanceId, StatusEnum.IN_PROGRESS)); } catch (ConditionTimeoutException ex) { StatusEnum statusEnum = getStatus(theInstanceId); - String msg = String.format("Job %s still has status %s after 10 seconds.", + String msg = String.format("Job %s still has status %s after %s seconds.", theInstanceId, - statusEnum.name()); + statusEnum.name(), + waitDuration); fail(msg); } } @@ -291,12 +298,7 @@ public class Batch2JobHelper { } if (!map.isEmpty()) { - ourLog.error( - "Found Running Jobs " - + map.keySet().stream() - .map(k -> k + " : " + map.get(k)) - .collect(Collectors.joining("\n")) - ); + ourLog.error("Found Running Jobs {}",map.keySet().stream().map(k -> k + " : " + map.get(k)).collect(Collectors.joining("\n"))); return true; } @@ -305,7 +307,7 @@ public class Batch2JobHelper { public void awaitNoJobsRunning(boolean theExpectAtLeastOneJobToExist) { HashMap map = new HashMap<>(); - Awaitility.await().atMost(10, TimeUnit.SECONDS) + Awaitility.await().atMost(DEFAULT_WAIT_DURATION) .until(() -> { myJobMaintenanceService.runMaintenancePass(); @@ 
-335,6 +337,7 @@ public class Batch2JobHelper { myJobMaintenanceService.runMaintenancePass(); } + @VisibleForTesting public void enableMaintenanceRunner(boolean theEnabled) { myJobMaintenanceService.enableMaintenancePass(theEnabled); } From 3e2d84d0f7b4d9279d26c46ee518e17e8ce0178f Mon Sep 17 00:00:00 2001 From: Etienne Poirier <33007955+epeartree@users.noreply.github.com> Date: Mon, 19 Aug 2024 14:53:50 -0400 Subject: [PATCH 3/3] Query with FullText searching (_text) does not return expected result (#6217) * sketch of scroll solution * implementation of a scrolled search to handle Lucene large resultset. * legacy code clean up * adding changelog * addressing comments from first code review * adding more test * correcting implementation after failing test * no-op change to trigger pipeline * no-op change to trigger pipeline again --------- Co-authored-by: Michael Buckley Co-authored-by: peartree --- .../java/ca/uhn/fhir/util/TaskChunker.java | 7 ++ .../ca/uhn/fhir/util/TaskChunkerTest.java | 35 +++++- ...arching-not-returning-expected-results.yml | 8 ++ .../fhir/jpa/dao/FulltextSearchSvcImpl.java | 15 +++ .../uhn/fhir/jpa/dao/IFulltextSearchSvc.java | 11 ++ .../jpa/search/builder/SearchBuilder.java | 73 ++++-------- .../uhn/fhir/jpa/util/InClauseNormalizer.java | 65 ++++++++++ .../uhn/fhir/util/InClauseNormalizerTest.java | 72 +++++++++++ .../r4/FhirResourceDaoR4SearchLastNIT.java | 3 - .../fhir/jpa/dao/r4/FhirSearchDaoR4Test.java | 112 +++++++++++++++++- .../fhir/jpa/embedded/OracleCondition.java | 19 +++ .../jpa/embedded/annotation/OracleTest.java | 19 +++ 12 files changed, 380 insertions(+), 59 deletions(-) create mode 100644 hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_6_0/6216-fulltext-searching-not-returning-expected-results.yml create mode 100644 hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/util/InClauseNormalizer.java create mode 100644 hapi-fhir-jpaserver-base/src/test/java/ca/uhn/fhir/util/InClauseNormalizerTest.java diff --git 
a/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/TaskChunker.java b/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/TaskChunker.java index 7d4b80f3d08..9514e059740 100644 --- a/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/TaskChunker.java +++ b/hapi-fhir-base/src/main/java/ca/uhn/fhir/util/TaskChunker.java @@ -20,10 +20,12 @@ package ca.uhn.fhir.util; * #L% */ +import com.google.common.collect.Streams; import jakarta.annotation.Nonnull; import java.util.ArrayList; import java.util.Collection; +import java.util.Iterator; import java.util.List; import java.util.function.Consumer; import java.util.stream.Stream; @@ -57,4 +59,9 @@ public class TaskChunker { public Stream> chunk(Stream theStream, int theChunkSize) { return StreamUtil.partition(theStream, theChunkSize); } + + @Nonnull + public void chunk(Iterator theIterator, int theChunkSize, Consumer> theListConsumer) { + chunk(Streams.stream(theIterator), theChunkSize).forEach(theListConsumer); + } } diff --git a/hapi-fhir-base/src/test/java/ca/uhn/fhir/util/TaskChunkerTest.java b/hapi-fhir-base/src/test/java/ca/uhn/fhir/util/TaskChunkerTest.java index d815fde6b31..aff2b3b9b89 100644 --- a/hapi-fhir-base/src/test/java/ca/uhn/fhir/util/TaskChunkerTest.java +++ b/hapi-fhir-base/src/test/java/ca/uhn/fhir/util/TaskChunkerTest.java @@ -3,14 +3,21 @@ package ca.uhn.fhir.util; import jakarta.annotation.Nonnull; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.MethodSource; import org.mockito.ArgumentCaptor; import org.mockito.Captor; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import java.util.ArrayList; +import java.util.Collections; +import java.util.Iterator; import java.util.List; import java.util.function.Consumer; import java.util.stream.IntStream; +import java.util.stream.Stream; import static 
org.junit.jupiter.api.Assertions.assertEquals; import static org.mockito.Mockito.times; @@ -43,8 +50,32 @@ public class TaskChunkerTest { @Nonnull private static List newIntRangeList(int startInclusive, int endExclusive) { - List input = IntStream.range(startInclusive, endExclusive).boxed().toList(); - return input; + return IntStream.range(startInclusive, endExclusive).boxed().toList(); + } + + @ParameterizedTest + @MethodSource("testIteratorChunkArguments") + void testIteratorChunk(List theListToChunk, List> theExpectedChunks) { + // given + Iterator iter = theListToChunk.iterator(); + ArrayList> result = new ArrayList<>(); + + // when + new TaskChunker().chunk(iter, 3, result::add); + + // then + assertEquals(theExpectedChunks, result); + } + + public static Stream testIteratorChunkArguments() { + return Stream.of( + Arguments.of(Collections.emptyList(), Collections.emptyList()), + Arguments.of(List.of(1), List.of(List.of(1))), + Arguments.of(List.of(1,2), List.of(List.of(1,2))), + Arguments.of(List.of(1,2,3), List.of(List.of(1,2,3))), + Arguments.of(List.of(1,2,3,4), List.of(List.of(1,2,3), List.of(4))), + Arguments.of(List.of(1,2,3,4,5,6,7,8,9), List.of(List.of(1,2,3), List.of(4,5,6), List.of(7,8,9))) + ); } } diff --git a/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_6_0/6216-fulltext-searching-not-returning-expected-results.yml b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_6_0/6216-fulltext-searching-not-returning-expected-results.yml new file mode 100644 index 00000000000..f1c3c8080e0 --- /dev/null +++ b/hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/changelog/7_6_0/6216-fulltext-searching-not-returning-expected-results.yml @@ -0,0 +1,8 @@ +--- +type: fix +issue: 6216 +jira: SMILE-8806 +title: "Previously, searches combining the `_text` query parameter (using Lucene/Elasticsearch) with query parameters +using the database (e.g. 
`identifier` or `date`) could miss matches when more than 500 results match the `_text` query +parameter. This has been fixed, but may be slow if many results match the `_text` query and must be checked against the +database parameters." diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/FulltextSearchSvcImpl.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/FulltextSearchSvcImpl.java index 0e436fccf10..5c45b2ca875 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/FulltextSearchSvcImpl.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/FulltextSearchSvcImpl.java @@ -32,6 +32,7 @@ import ca.uhn.fhir.jpa.dao.search.ExtendedHSearchResourceProjection; import ca.uhn.fhir.jpa.dao.search.ExtendedHSearchSearchBuilder; import ca.uhn.fhir.jpa.dao.search.IHSearchSortHelper; import ca.uhn.fhir.jpa.dao.search.LastNOperation; +import ca.uhn.fhir.jpa.dao.search.SearchScrollQueryExecutorAdaptor; import ca.uhn.fhir.jpa.model.dao.JpaPid; import ca.uhn.fhir.jpa.model.entity.ResourceTable; import ca.uhn.fhir.jpa.model.search.ExtendedHSearchBuilderConsumeAdvancedQueryClausesParams; @@ -40,6 +41,7 @@ import ca.uhn.fhir.jpa.model.search.StorageProcessingMessage; import ca.uhn.fhir.jpa.search.autocomplete.ValueSetAutocompleteOptions; import ca.uhn.fhir.jpa.search.autocomplete.ValueSetAutocompleteSearch; import ca.uhn.fhir.jpa.search.builder.ISearchQueryExecutor; +import ca.uhn.fhir.jpa.search.builder.SearchBuilder; import ca.uhn.fhir.jpa.search.builder.SearchQueryExecutors; import ca.uhn.fhir.jpa.searchparam.SearchParameterMap; import ca.uhn.fhir.jpa.searchparam.extractor.ISearchParamExtractor; @@ -183,6 +185,19 @@ public class FulltextSearchSvcImpl implements IFulltextSearchSvc { return doSearch(theResourceName, theParams, null, theMaxResultsToFetch, theRequestDetails); } + @Transactional + @Override + public ISearchQueryExecutor searchScrolled( + String theResourceType, SearchParameterMap theParams, 
RequestDetails theRequestDetails) { + validateHibernateSearchIsEnabled(); + + SearchQueryOptionsStep searchQueryOptionsStep = + getSearchQueryOptionsStep(theResourceType, theParams, null); + logQuery(searchQueryOptionsStep, theRequestDetails); + + return new SearchScrollQueryExecutorAdaptor(searchQueryOptionsStep.scroll(SearchBuilder.getMaximumPageSize())); + } + // keep this in sync with supportsSomeOf(); @SuppressWarnings("rawtypes") private ISearchQueryExecutor doSearch( diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/IFulltextSearchSvc.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/IFulltextSearchSvc.java index 6890b9bc26f..52dd7589947 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/IFulltextSearchSvc.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/dao/IFulltextSearchSvc.java @@ -62,6 +62,17 @@ public interface IFulltextSearchSvc { Integer theMaxResultsToFetch, RequestDetails theRequestDetails); + /** + * Query the index for a complete iterator of ALL results. (scrollable search result). + * + * @param theResourceName e.g. 
Patient + * @param theParams The search query + * @param theRequestDetails The request details + * @return Iterator of result PIDs + */ + ISearchQueryExecutor searchScrolled( + String theResourceName, SearchParameterMap theParams, RequestDetails theRequestDetails); + /** * Autocomplete search for NIH $expand contextDirection=existing * @param theOptions operation options diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/search/builder/SearchBuilder.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/search/builder/SearchBuilder.java index 943d3f9abb9..bb50e0dd613 100644 --- a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/search/builder/SearchBuilder.java +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/search/builder/SearchBuilder.java @@ -101,7 +101,6 @@ import ca.uhn.fhir.util.StringUtil; import ca.uhn.fhir.util.UrlUtil; import com.google.common.annotations.VisibleForTesting; import com.google.common.collect.Lists; -import com.google.common.collect.Streams; import com.healthmarketscience.sqlbuilder.Condition; import jakarta.annotation.Nonnull; import jakarta.annotation.Nullable; @@ -141,7 +140,9 @@ import java.util.stream.Collectors; import static ca.uhn.fhir.jpa.model.util.JpaConstants.UNDESIRED_RESOURCE_LINKAGES_FOR_EVERYTHING_ON_PATIENT_INSTANCE; import static ca.uhn.fhir.jpa.search.builder.QueryStack.LOCATION_POSITION; import static ca.uhn.fhir.jpa.search.builder.QueryStack.SearchForIdsParams.with; +import static ca.uhn.fhir.jpa.util.InClauseNormalizer.*; import static java.util.Objects.requireNonNull; +import static org.apache.commons.collections4.CollectionUtils.isNotEmpty; import static org.apache.commons.lang3.StringUtils.defaultString; import static org.apache.commons.lang3.StringUtils.isBlank; import static org.apache.commons.lang3.StringUtils.isNotBlank; @@ -205,9 +206,6 @@ public class SearchBuilder implements ISearchBuilder { @Autowired(required = false) private IElasticsearchSvc myIElasticsearchSvc; - 
@Autowired - private FhirContext myCtx; - @Autowired private IJpaStorageResourceParser myJpaStorageResourceParser; @@ -332,8 +330,7 @@ public class SearchBuilder implements ISearchBuilder { init(theParams, theSearchUuid, theRequestPartitionId); if (checkUseHibernateSearch()) { - long count = myFulltextSearchSvc.count(myResourceName, theParams.clone()); - return count; + return myFulltextSearchSvc.count(myResourceName, theParams.clone()); } List queries = createQuery(theParams.clone(), null, null, null, true, theRequest, null); @@ -404,8 +401,16 @@ public class SearchBuilder implements ISearchBuilder { fulltextMatchIds = queryHibernateSearchForEverythingPids(theRequest); resultCount = fulltextMatchIds.size(); } else { - fulltextExecutor = myFulltextSearchSvc.searchNotScrolled( - myResourceName, myParams, myMaxResultsToFetch, theRequest); + // todo performance MB - some queries must intersect with JPA (e.g. they have a chain, or we haven't + // enabled SP indexing). + // and some queries don't need JPA. We only need the scroll when we need to intersect with JPA. + // It would be faster to have a non-scrolled search in this case, since creating the scroll requires + // extra work in Elastic. + // if (eligibleToSkipJPAQuery) fulltextExecutor = myFulltextSearchSvc.searchNotScrolled( ... + + // we might need to intersect with JPA. So we might need to traverse ALL results from lucene, not just + // a page. + fulltextExecutor = myFulltextSearchSvc.searchScrolled(myResourceName, myParams, theRequest); } if (fulltextExecutor == null) { @@ -457,7 +462,8 @@ public class SearchBuilder implements ISearchBuilder { // We break the pids into chunks that fit in the 1k limit for jdbc bind params. 
new QueryChunker() .chunk( - Streams.stream(fulltextExecutor).collect(Collectors.toList()), + fulltextExecutor, + SearchBuilder.getMaximumPageSize(), t -> doCreateChunkedQueries( theParams, t, theOffset, sort, theCountOnlyFlag, theRequest, queries)); } @@ -560,8 +566,9 @@ public class SearchBuilder implements ISearchBuilder { boolean theCount, RequestDetails theRequest, ArrayList theQueries) { + if (thePids.size() < getMaximumPageSize()) { - normalizeIdListForLastNInClause(thePids); + thePids = normalizeIdListForInClause(thePids); } createChunkedQuery(theParams, sort, theOffset, thePids.size(), theCount, theRequest, thePids, theQueries); } @@ -885,41 +892,7 @@ public class SearchBuilder implements ISearchBuilder { && theParams.values().stream() .flatMap(Collection::stream) .flatMap(Collection::stream) - .anyMatch(t -> t instanceof ReferenceParam); - } - - private List<Long> normalizeIdListForLastNInClause(List<Long> lastnResourceIds) { - /* - The following is a workaround to a known issue involving Hibernate. If queries are used with "in" clauses with large and varying - numbers of parameters, this can overwhelm Hibernate's QueryPlanCache and deplete heap space. See the following link for more info: - https://stackoverflow.com/questions/31557076/spring-hibernate-query-plan-cache-memory-usage. - - Normalizing the number of parameters in the "in" clause stabilizes the size of the QueryPlanCache, so long as the number of - arguments never exceeds the maximum specified below.
- */ - int listSize = lastnResourceIds.size(); - - if (listSize > 1 && listSize < 10) { - padIdListWithPlaceholders(lastnResourceIds, 10); - } else if (listSize > 10 && listSize < 50) { - padIdListWithPlaceholders(lastnResourceIds, 50); - } else if (listSize > 50 && listSize < 100) { - padIdListWithPlaceholders(lastnResourceIds, 100); - } else if (listSize > 100 && listSize < 200) { - padIdListWithPlaceholders(lastnResourceIds, 200); - } else if (listSize > 200 && listSize < 500) { - padIdListWithPlaceholders(lastnResourceIds, 500); - } else if (listSize > 500 && listSize < 800) { - padIdListWithPlaceholders(lastnResourceIds, 800); - } - - return lastnResourceIds; - } - - private void padIdListWithPlaceholders(List<Long> theIdList, int preferredListSize) { - while (theIdList.size() < preferredListSize) { - theIdList.add(-1L); - } + .anyMatch(ReferenceParam.class::isInstance); } private void createSort(QueryStack theQueryStack, SortSpec theSort, SearchParameterMap theParams) { @@ -1154,7 +1127,7 @@ public class SearchBuilder implements ISearchBuilder { List<Long> versionlessPids = JpaPid.toLongList(thePids); if (versionlessPids.size() < getMaximumPageSize()) { - versionlessPids = normalizeIdListForLastNInClause(versionlessPids); + versionlessPids = normalizeIdListForInClause(versionlessPids); } // -- get the resource from the searchView @@ -1243,7 +1216,7 @@ public class SearchBuilder implements ISearchBuilder { Map> tagMap = new HashMap<>(); // -- no tags - if (thePidList.size() == 0) return tagMap; + if (thePidList.isEmpty()) return tagMap; // -- get all tags for the idList Collection tagList = myResourceTagDao.findByResourceIds(thePidList); @@ -1383,7 +1356,6 @@ public class SearchBuilder implements ISearchBuilder { EntityManager entityManager = theParameters.getEntityManager(); Integer maxCount = theParameters.getMaxCount(); FhirContext fhirContext = theParameters.getFhirContext(); - DateRangeParam lastUpdated = theParameters.getLastUpdated(); RequestDetails request =
theParameters.getRequestDetails(); String searchIdOrDescription = theParameters.getSearchIdOrDescription(); List<String> desiredResourceTypes = theParameters.getDesiredResourceTypes(); @@ -1922,11 +1894,10 @@ public class SearchBuilder implements ISearchBuilder { } assert !targetResourceTypes.isEmpty(); - Set<Long> identityHashesForTypes = targetResourceTypes.stream() + return targetResourceTypes.stream() .map(type -> BaseResourceIndexedSearchParam.calculateHashIdentity( myPartitionSettings, myRequestPartitionId, type, "url")) .collect(Collectors.toSet()); - return identityHashesForTypes; } private List> partition(Collection theNextRoundMatches, int theMaxLoad) { @@ -2506,7 +2477,7 @@ public class SearchBuilder implements ISearchBuilder { private void retrieveNextIteratorQuery() { close(); - if (myQueryList != null && myQueryList.size() > 0) { + if (isNotEmpty(myQueryList)) { myResultsIterator = myQueryList.remove(0); myHasNextIteratorQuery = true; } else { diff --git a/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/util/InClauseNormalizer.java b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/util/InClauseNormalizer.java new file mode 100644 index 00000000000..b1dcc8500b2 --- /dev/null +++ b/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/util/InClauseNormalizer.java @@ -0,0 +1,65 @@ +package ca.uhn.fhir.jpa.util; + +import java.util.ArrayList; +import java.util.Collections; +import java.util.List; + +/* + This class encapsulates the implementation providing a workaround to a known issue involving Hibernate. If queries are used with "in" clauses with large and varying + numbers of parameters, this can overwhelm Hibernate's QueryPlanCache and deplete heap space. See the following link for more info: + https://stackoverflow.com/questions/31557076/spring-hibernate-query-plan-cache-memory-usage.
+ + Normalizing the number of parameters in the "in" clause stabilizes the size of the QueryPlanCache, so long as the number of + arguments never exceeds the maximum specified below. +*/ +public class InClauseNormalizer { + + public static List<Long> normalizeIdListForInClause(List<Long> theResourceIds) { + + List<Long> retVal = theResourceIds; + + int listSize = theResourceIds.size(); + + if (listSize > 1 && listSize < 10) { + retVal = padIdListWithPlaceholders(theResourceIds, 10); + } else if (listSize > 10 && listSize < 50) { + retVal = padIdListWithPlaceholders(theResourceIds, 50); + } else if (listSize > 50 && listSize < 100) { + retVal = padIdListWithPlaceholders(theResourceIds, 100); + } else if (listSize > 100 && listSize < 200) { + retVal = padIdListWithPlaceholders(theResourceIds, 200); + } else if (listSize > 200 && listSize < 500) { + retVal = padIdListWithPlaceholders(theResourceIds, 500); + } else if (listSize > 500 && listSize < 800) { + retVal = padIdListWithPlaceholders(theResourceIds, 800); + } + + return retVal; + } + + private static List<Long> padIdListWithPlaceholders(List<Long> theIdList, int preferredListSize) { + List<Long> retVal = theIdList; + + if (isUnmodifiableList(theIdList)) { + retVal = new ArrayList<>(preferredListSize); + retVal.addAll(theIdList); + } + + while (retVal.size() < preferredListSize) { + retVal.add(-1L); + } + + return retVal; + } + + private static boolean isUnmodifiableList(List<Long> theList) { + try { + theList.addAll(Collections.emptyList()); + } catch (Exception e) { + return true; + } + return false; + } + + private InClauseNormalizer() {} +} diff --git a/hapi-fhir-jpaserver-base/src/test/java/ca/uhn/fhir/util/InClauseNormalizerTest.java b/hapi-fhir-jpaserver-base/src/test/java/ca/uhn/fhir/util/InClauseNormalizerTest.java new file mode 100644 index 00000000000..bd2ed0603ee --- /dev/null +++ b/hapi-fhir-jpaserver-base/src/test/java/ca/uhn/fhir/util/InClauseNormalizerTest.java @@ -0,0 +1,72 @@ +package ca.uhn.fhir.util; + +import
ca.uhn.fhir.jpa.util.InClauseNormalizer; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.MethodSource; + +import java.util.ArrayList; +import java.util.List; +import java.util.stream.Stream; + +import static java.util.Collections.nCopies; +import static java.util.Collections.unmodifiableList; +import static org.assertj.core.api.Assertions.assertThat; + +public class InClauseNormalizerTest { + private static final Long ourResourceId = 1L; + private static final Long ourPaddingValue = -1L; + + @ParameterizedTest + @MethodSource("arguments") + public void testNormalizeUnmodifiableList_willCreateNewListAndPadToSize(int theInitialListSize, int theExpectedNormalizedListSize) { + List<Long> initialList = new ArrayList<>(nCopies(theInitialListSize, ourResourceId)); + initialList = unmodifiableList(initialList); + + List<Long> normalizedList = InClauseNormalizer.normalizeIdListForInClause(initialList); + + assertNormalizedList(initialList, normalizedList, theInitialListSize, theExpectedNormalizedListSize); + } + + @ParameterizedTest + @MethodSource("arguments") + public void testNormalizeListToSizeAndPad(int theInitialListSize, int theExpectedNormalizedListSize) { + List<Long> initialList = new ArrayList<>(nCopies(theInitialListSize, ourResourceId)); + + List<Long> normalizedList = InClauseNormalizer.normalizeIdListForInClause(initialList); + + assertNormalizedList(initialList, normalizedList, theInitialListSize, theExpectedNormalizedListSize); + } + + private void assertNormalizedList(List<Long> theInitialList, List<Long> theNormalizedList, int theInitialListSize, int theExpectedNormalizedListSize) { + List<Long> expectedPaddedSubList = new ArrayList<>(nCopies(theExpectedNormalizedListSize - theInitialListSize, ourPaddingValue)); + + assertThat(theNormalizedList).startsWith(listToArray(theInitialList)); + assertThat(theNormalizedList).hasSize(theExpectedNormalizedListSize); +
assertThat(theNormalizedList).endsWith(listToArray(expectedPaddedSubList)); + } + + static Long[] listToArray(List<Long> theList) { + return theList.toArray(new Long[0]); + } + + private static Stream<Arguments> arguments() { + return Stream.of( + Arguments.of(0, 0), + Arguments.of(1, 1), + Arguments.of(2, 10), + Arguments.of(10, 10), + Arguments.of(12, 50), + Arguments.of(50, 50), + Arguments.of(51, 100), + Arguments.of(100, 100), + Arguments.of(150, 200), + Arguments.of(300, 500), + Arguments.of(500, 500), + Arguments.of(700, 800), + Arguments.of(800, 800), + Arguments.of(801, 801) + ); + } + +} diff --git a/hapi-fhir-jpaserver-elastic-test-utilities/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4SearchLastNIT.java b/hapi-fhir-jpaserver-elastic-test-utilities/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4SearchLastNIT.java index 970a016c16c..50c269f56b5 100644 --- a/hapi-fhir-jpaserver-elastic-test-utilities/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4SearchLastNIT.java +++ b/hapi-fhir-jpaserver-elastic-test-utilities/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirResourceDaoR4SearchLastNIT.java @@ -56,8 +56,6 @@ public class FhirResourceDaoR4SearchLastNIT extends BaseR4SearchLastN { @Mock private IHSearchEventListener mySearchEventListener; - - @Test public void testLastNChunking() { @@ -108,7 +106,6 @@ public class FhirResourceDaoR4SearchLastNIT extends BaseR4SearchLastN { secondQueryPattern.append("\\).*"); assertThat(queries.get(1).toUpperCase().replaceAll(" , ", ",")).matches(secondQueryPattern.toString()); assertThat(queries.get(3).toUpperCase().replaceAll(" , ", ",")).matches(secondQueryPattern.toString()); - } @Test diff --git a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirSearchDaoR4Test.java b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirSearchDaoR4Test.java index 875475b3d08..0a4211234fd 100644 ---  a/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirSearchDaoR4Test.java +++
b/hapi-fhir-jpaserver-test-r4/src/test/java/ca/uhn/fhir/jpa/dao/r4/FhirSearchDaoR4Test.java @@ -1,8 +1,8 @@ package ca.uhn.fhir.jpa.dao.r4; -import static org.junit.jupiter.api.Assertions.assertEquals; import ca.uhn.fhir.jpa.dao.IFulltextSearchSvc; import ca.uhn.fhir.jpa.model.dao.JpaPid; +import ca.uhn.fhir.jpa.search.builder.SearchBuilder; import ca.uhn.fhir.jpa.searchparam.SearchParameterMap; import ca.uhn.fhir.jpa.test.BaseJpaR4Test; import ca.uhn.fhir.rest.api.Constants; @@ -12,6 +12,9 @@ import ca.uhn.fhir.rest.api.server.SystemRequestDetails; import ca.uhn.fhir.rest.param.StringAndListParam; import ca.uhn.fhir.rest.param.StringOrListParam; import ca.uhn.fhir.rest.param.StringParam; +import ca.uhn.fhir.rest.param.TokenAndListParam; +import ca.uhn.fhir.rest.param.TokenOrListParam; +import ca.uhn.fhir.rest.param.TokenParam; import org.hl7.fhir.r4.model.Organization; import org.hl7.fhir.r4.model.Patient; import org.junit.jupiter.api.Test; @@ -19,9 +22,11 @@ import org.springframework.beans.factory.annotation.Autowired; import org.springframework.dao.InvalidDataAccessApiUsageException; import org.springframework.transaction.support.TransactionSynchronizationManager; +import java.util.ArrayList; import java.util.List; import static org.assertj.core.api.Assertions.assertThat; +import static org.junit.jupiter.api.Assertions.assertEquals; public class FhirSearchDaoR4Test extends BaseJpaR4Test { @@ -51,11 +56,11 @@ public class FhirSearchDaoR4Test extends BaseJpaR4Test { patient.addName().addGiven(content).setFamily("hirasawa"); id1 = myPatientDao.create(patient, mySrd).getId().toUnqualifiedVersionless().getIdPartAsLong(); } - Long id2; + { Patient patient = new Patient(); patient.addName().addGiven("mio").setFamily("akiyama"); - id2 = myPatientDao.create(patient, mySrd).getId().toUnqualifiedVersionless().getIdPartAsLong(); + myPatientDao.create(patient, mySrd).getId().toUnqualifiedVersionless().getIdPartAsLong(); } SearchParameterMap params = new 
SearchParameterMap(); @@ -257,4 +262,105 @@ public class FhirSearchDaoR4Test extends BaseJpaR4Test { } } + @Test + public void testSearchNarrativeWithLuceneSearch() { + final int numberOfPatientsToCreate = SearchBuilder.getMaximumPageSize() + 10; + List<String> expectedActivePatientIds = new ArrayList<>(numberOfPatientsToCreate); + + for (int i = 0; i < numberOfPatientsToCreate; i++) + { + Patient patient = new Patient(); + patient.getText().setDivAsString("<div>AAAS<p>FOO</p><p>CCC</p></div>"); + expectedActivePatientIds.add(myPatientDao.create(patient, mySrd).getId().toUnqualifiedVersionless().getIdPart()); + } + + { + Patient patient = new Patient(); + patient.getText().setDivAsString("<div>AAAB<p>FOO</p><p>CCC</p></div>"); + myPatientDao.create(patient, mySrd); + } + { + Patient patient = new Patient(); + patient.getText().setDivAsString("<div>ZZYZXY</div>"); + myPatientDao.create(patient, mySrd); + } + + SearchParameterMap map = new SearchParameterMap().setLoadSynchronous(true); + map.add(Constants.PARAM_TEXT, new StringParam("AAAS")); + + IBundleProvider searchResultBundle = myPatientDao.search(map, mySrd); + List<String> resourceIdsFromSearchResult = searchResultBundle.getAllResourceIds(); + + assertThat(resourceIdsFromSearchResult).containsExactlyInAnyOrderElementsOf(expectedActivePatientIds); + } + + @Test + public void testLuceneNarrativeSearchQueryIntersectingJpaQuery() { + final int numberOfPatientsToCreate = SearchBuilder.getMaximumPageSize() + 10; + List<String> expectedActivePatientIds = new ArrayList<>(numberOfPatientsToCreate); + + // create active and non-active patients with the same narrative + for (int i = 0; i < numberOfPatientsToCreate; i++) + { + Patient activePatient = new Patient(); + activePatient.getText().setDivAsString("<div>AAAS<p>FOO</p><p>CCC</p></div>"); + activePatient.setActive(true); + String patientId = myPatientDao.create(activePatient, mySrd).getId().toUnqualifiedVersionless().getIdPart(); + expectedActivePatientIds.add(patientId); + + Patient nonActivePatient = new Patient(); + nonActivePatient.getText().setDivAsString("<div>AAAS<p>FOO</p><p>CCC</p></div>"); + nonActivePatient.setActive(false); + myPatientDao.create(nonActivePatient, mySrd); + } + + SearchParameterMap map = new SearchParameterMap().setLoadSynchronous(true); + + TokenAndListParam tokenAndListParam = new TokenAndListParam(); + tokenAndListParam.addAnd(new TokenOrListParam().addOr(new TokenParam().setValue("true"))); + + map.add("active", tokenAndListParam); + map.add(Constants.PARAM_TEXT, new StringParam("AAAS")); + + IBundleProvider searchResultBundle = myPatientDao.search(map, mySrd); + List<String> resourceIdsFromSearchResult = searchResultBundle.getAllResourceIds(); + + assertThat(resourceIdsFromSearchResult).containsExactlyInAnyOrderElementsOf(expectedActivePatientIds); + } + + @Test + public void testLuceneContentSearchQueryIntersectingJpaQuery() { + final int numberOfPatientsToCreate = SearchBuilder.getMaximumPageSize() + 10; + final String patientFamilyName = "Flanders"; + List<String> expectedActivePatientIds = new ArrayList<>(numberOfPatientsToCreate); + + // create active and non-active patients with the same narrative + for (int i = 0; i < numberOfPatientsToCreate; i++) + { + Patient activePatient = new Patient(); + activePatient.addName().setFamily(patientFamilyName); + activePatient.setActive(true); + String patientId = myPatientDao.create(activePatient, mySrd).getId().toUnqualifiedVersionless().getIdPart(); + expectedActivePatientIds.add(patientId); + + Patient nonActivePatient = new Patient(); + nonActivePatient.addName().setFamily(patientFamilyName); + nonActivePatient.setActive(false); + myPatientDao.create(nonActivePatient, mySrd); + } + + SearchParameterMap map = new SearchParameterMap().setLoadSynchronous(true); + + TokenAndListParam tokenAndListParam = new TokenAndListParam(); + tokenAndListParam.addAnd(new TokenOrListParam().addOr(new TokenParam().setValue("true"))); + + map.add("active", tokenAndListParam); + map.add(Constants.PARAM_CONTENT, new StringParam(patientFamilyName)); + + IBundleProvider searchResultBundle = myPatientDao.search(map,
mySrd); + List resourceIdsFromSearchResult = searchResultBundle.getAllResourceIds(); + + assertThat(resourceIdsFromSearchResult).containsExactlyInAnyOrderElementsOf(expectedActivePatientIds); + } + } diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/OracleCondition.java b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/OracleCondition.java index f01b3871ee7..ddefa1a127c 100644 --- a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/OracleCondition.java +++ b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/OracleCondition.java @@ -1,3 +1,22 @@ +/*- + * #%L + * HAPI FHIR JPA Server Test Utilities + * %% + * Copyright (C) 2014 - 2024 Smile CDR, Inc. + * %% + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * #L% + */ package ca.uhn.fhir.jpa.embedded; import org.apache.commons.lang3.StringUtils; diff --git a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/annotation/OracleTest.java b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/annotation/OracleTest.java index 8abb78f2db7..3420c16f52f 100644 --- a/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/annotation/OracleTest.java +++ b/hapi-fhir-jpaserver-test-utilities/src/main/java/ca/uhn/fhir/jpa/embedded/annotation/OracleTest.java @@ -1,3 +1,22 @@ +/*- + * #%L + * HAPI FHIR JPA Server Test Utilities + * %% + * Copyright (C) 2014 - 2024 Smile CDR, Inc. + * %% + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * #L% + */ package ca.uhn.fhir.jpa.embedded.annotation; import ca.uhn.fhir.jpa.embedded.OracleCondition;
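Editor's note: to make the Hibernate QueryPlanCache workaround in this patch concrete, here is a minimal standalone sketch of the padding scheme that the new `InClauseNormalizer` applies. The bucket thresholds (10, 50, 100, 200, 500, 800) and the `-1L` placeholder come from the patch itself; the class and method names (`InClausePaddingSketch`, `pad`) are illustrative only, not the HAPI FHIR API.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Self-contained sketch (not part of the patch): pad an id list up to the next
// fixed bucket size so that only a handful of distinct "in (...)" SQL shapes
// ever reach Hibernate, keeping its QueryPlanCache bounded.
public class InClausePaddingSketch {

	// Bucket thresholds mirror the patch: 10, 50, 100, 200, 500, 800.
	private static final int[] BUCKETS = {10, 50, 100, 200, 500, 800};

	// Placeholder id that can never match a real resource pid.
	private static final Long PLACEHOLDER = -1L;

	public static List<Long> pad(List<Long> theIds) {
		int size = theIds.size();
		int previousBucket = 1;
		for (int bucket : BUCKETS) {
			// Lists sized exactly at a bucket boundary are left alone,
			// matching the strict inequalities in the patch.
			if (size > previousBucket && size < bucket) {
				List<Long> padded = new ArrayList<>(bucket);
				padded.addAll(theIds);
				while (padded.size() < bucket) {
					padded.add(PLACEHOLDER);
				}
				return padded;
			}
			previousBucket = bucket;
		}
		return theIds; // empty, singleton, or 800+: unchanged
	}

	public static void main(String[] args) {
		List<Long> padded = pad(new ArrayList<>(Collections.nCopies(3, 42L)));
		System.out.println(padded.size()); // 3 ids land in the 10-element bucket
	}
}
```

Because the padding value can never match a real pid, the extra bind parameters are harmless to query results; they only stabilize the SQL shape that Hibernate caches.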