* Let RemoteTerminologyServiceValidationSupport.validateCodeInValueSet work when code system is present, even when theOptions.isInferSystem() is true
* Add test for specific use case added
* Consider ValueSets with multiple component includes
* Fix typo
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
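For context on the validateCodeInValueSet commits above, here is a minimal sketch of the call path being exercised, assuming HAPI's IValidationSupport signature; the terminology server URL, ValueSet URL, and code are placeholders, not values from this change:

```java
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.support.ConceptValidationOptions;
import ca.uhn.fhir.context.support.IValidationSupport;
import ca.uhn.fhir.context.support.ValidationSupportContext;
import org.hl7.fhir.common.hapi.validation.support.RemoteTerminologyServiceValidationSupport;
import org.hl7.fhir.r4.model.ValueSet;

public class RemoteValidateCodeSketch {
    public static void main(String[] args) {
        FhirContext ctx = FhirContext.forR4();

        RemoteTerminologyServiceValidationSupport remote =
                new RemoteTerminologyServiceValidationSupport(ctx);
        remote.setBaseUrl("http://tx.example.org/fhir"); // placeholder terminology server

        // inferSystem is enabled, but an explicit code system is still supplied;
        // after this change the explicit system is used instead of the call bailing out.
        ConceptValidationOptions options = new ConceptValidationOptions();
        options.setInferSystem(true);

        ValueSet valueSet = new ValueSet();
        valueSet.setUrl("http://example.org/fhir/ValueSet/example-codes"); // placeholder

        IValidationSupport.CodeValidationResult result = remote.validateCodeInValueSet(
                new ValidationSupportContext(remote),
                options,
                "http://loinc.org",   // explicit code system
                "1234-5",             // placeholder code
                null,                 // display (optional)
                valueSet);

        System.out.println(result != null && result.isOk());
    }
}
```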
* Add tests. Add implementation in R4. Add changelog
* Wip partial fixup of other STU versions
* Updated usages of new patient type everything api
* Partial commit of incomplete documentation
* Remove docs
* Cut to token instead of string
* Bump hapi version due to breaking api changes
* Migration finished
* Work on LOBs
* Add to ClobMigrated annotation
* Deal with LOBs
* Work on new approach
* HAPI FHIR version bump
* Add changelog
* Add license header
* Fix intermittent test failure
* Cleanup
* Cleanup
* Add test
* Test fixes
* Work on expansion errors
* Fixes
* Test fixes
* Test fixes
* Test fixes
* Test fixes
* Test fixes
* Test fixes
* Test fixes
* Work on fix
* Test fixes
* Test fixes
* Test fix
* Cleanup
* Test fixes
* Remove accidental commit
* Test fixes
* Version bump
* Add changelog
* Fixes
* Test fix
* Fix conflicts
* Try commenting out clearAllStaticFieldsForUnitTest to see what happens
* Rename clearAllStaticFieldsForUnitTest to randomizeLocaleAndTimezone and only do that
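A hypothetical sketch of what the renamed helper boils down to (this is an illustration of the intent, not the actual HAPI test utility): pick a random default Locale and TimeZone so locale- and timezone-sensitive bugs surface across test runs.

```java
import java.util.Locale;
import java.util.Random;
import java.util.TimeZone;

// Hypothetical illustration only; the real helper lives in HAPI's test utilities
// and may differ in detail.
public final class RandomizeLocaleAndTimezoneSketch {
    private static final Random RANDOM = new Random();

    public static void randomizeLocaleAndTimezone() {
        Locale[] locales = Locale.getAvailableLocales();
        String[] zoneIds = TimeZone.getAvailableIDs();

        // Run the test JVM under a randomly chosen default locale and timezone so
        // locale/timezone-sensitive bugs show up from build to build.
        Locale.setDefault(locales[RANDOM.nextInt(locales.length)]);
        TimeZone.setDefault(TimeZone.getTimeZone(zoneIds[RANDOM.nextInt(zoneIds.length)]));
    }
}
```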
* first pass moving core storage classes to storage-api
* move term service apis
* nearly done
* rename hapi-fhir-storage-api to hapi-fhir-storage and move transaction processor
* create new SearchConstants class to store platform independent search constants
* move a couple of subscription services
* move transaction processor adapter to storage
* version bump
* move searchparam
* fix test
* fix compile includes
* fix text
* fix docs
* failed attempt to get it to work
* Use simplified FhirContext.forCached methods
* Update hapi-fhir-base/src/main/java/ca/uhn/fhir/context/FhirContext.java
Co-authored-by: James Agnew <jamesagnew@gmail.com>
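A small sketch of the forCached usage referenced above, assuming the per-version convenience methods (e.g. forR4Cached()) share the same cache as forCached(FhirVersionEnum):

```java
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.FhirVersionEnum;

public class CachedContextSketch {
    public static void main(String[] args) {
        // Original style: look the cached context up by version enum.
        FhirContext byEnum = FhirContext.forCached(FhirVersionEnum.R4);

        // Simplified style referenced above: one convenience method per version.
        FhirContext r4 = FhirContext.forR4Cached();

        // Both are expected to hand back the shared, cached instance, so callers
        // avoid re-scanning the model classes every time a context is needed.
        System.out.println(byEnum == r4);
    }
}
```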
* Avoid dead SearchParameter links in CapabilityStatement
* Resolve fixmes
* Build fix
* Test fix
* Test fix
* Test fix
* Work on metadata
* Fixes
* Test fixes
* Test fix
* Test fix
* Test fixes
* Test fix
* Work on test fixes
* Test fixes
* Test fixes
* Test fixes
* Test fix
* Fix params
* Resolve fixme
* Adjust changelogs
* Version bump to 5.6.0-PRE7
* Add license header
* Bump version to 5.6.0-PRE5-SNAPSHOT
* initial roughout of mdm clear batch job
* still roughing out classes
* finished first draft of reader
* most tests passing now
* all tests pass. now FIXMEs
* FIXMEs done. Time for regression.
* fix test
* changelog and docs
* fix test
* pre-review cleanup
* version bump
* move spring autowire deps for cdr
* finally got beans working phew
* rearrange method calls so persistence module can perform clear operation on its own
* Update hapi-fhir-docs/src/main/resources/ca/uhn/hapi/fhir/docs/server_jpa_mdm/mdm_operations.md
Co-authored-by: Tadgh <tadgh@cs.toronto.edu>
* review feedback
* review feedback
* Remove inheritance
* fix beans
Co-authored-by: Tadgh <tadgh@cs.toronto.edu>
Co-authored-by: Tadgh <garygrantgraham@gmail.com>
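For the mdm clear batch job described above, a hedged sketch of how the operation might be invoked through the generic client; the operation and parameter names ("$mdm-clear", "resourceType", "batchSize") are assumptions here, with the canonical names defined in ProviderConstants:

```java
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.DecimalType;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.StringType;

public class MdmClearSketch {
    public static void main(String[] args) {
        IGenericClient client = FhirContext.forR4()
                .newRestfulGenericClient("http://localhost:8000/fhir"); // placeholder server

        // Kick off the batch-based clear; parameter names are assumptions --
        // check ProviderConstants for the authoritative operation/parameter names.
        Parameters outcome = client.operation()
                .onServer()
                .named("$mdm-clear")
                .withParameter(Parameters.class, "resourceType", new StringType("Patient"))
                .andParameter("batchSize", new DecimalType(1000))
                .execute();
    }
}
```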
* Impl displayLanguage for $lookup - first cut
* Added original lookupCode method
* Added original lookupCode method to IFhirResourceDaoCodeSystem
* Added more test cases and changelog
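A short sketch of the new displayLanguage support on $lookup, using the standard FHIR CodeSystem/$lookup parameters (system, code, displayLanguage); the server URL and code are placeholders:

```java
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.CodeSystem;
import org.hl7.fhir.r4.model.CodeType;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.UriType;

public class LookupDisplayLanguageSketch {
    public static void main(String[] args) {
        IGenericClient client = FhirContext.forR4()
                .newRestfulGenericClient("http://localhost:8000/fhir"); // placeholder server

        // CodeSystem/$lookup, asking for the display/designations in a specific language
        Parameters result = client.operation()
                .onType(CodeSystem.class)
                .named("$lookup")
                .withParameter(Parameters.class, "system", new UriType("http://loinc.org"))
                .andParameter("code", new CodeType("1234-5"))            // placeholder code
                .andParameter("displayLanguage", new CodeType("fr-FR"))  // the new parameter
                .execute();

        System.out.println(result.getParameter().size());
    }
}
```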
* prepare to add $delete-expunge operation that will create a spring batch job
* Add operation
* Wire up jpa provider. Begin with failing test.
* Copy/paste bulk import job as a starting point.
FIXME with proposed design
* delete expunge job parameter validation with test
* implemented reader
stubbed processor, writer
* wip for master merge
* started implementing reader
* started implementing reader
* working with stubs
* happy path batch delete expunge is done
* Provider done but test not passing. Guessing batch infrastructure not running in that test.
* IT test works now
* add reader test
* Converted delete _expunge=true to use new batch job
* DeleteExpungeDaoTest passes
* Fix test
* Change batch size to integer
* rename search count to batch size
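A hedged sketch of submitting the batch-backed delete-expunge via the generic client; the "url" and "batchSize" parameter names are assumptions, with the canonical names defined in ProviderConstants:

```java
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.DecimalType;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.StringType;

public class DeleteExpungeSketch {
    public static void main(String[] args) {
        IGenericClient client = FhirContext.forR4()
                .newRestfulGenericClient("http://localhost:8000/fhir"); // placeholder server

        // Everything matching the search URL is deleted and then expunged by the
        // Spring Batch job; "url" and "batchSize" are assumed parameter names.
        Parameters outcome = client.operation()
                .onServer()
                .named("$delete-expunge")
                .withParameter(Parameters.class, "url", new StringType("Patient?active=false"))
                .andParameter("batchSize", new DecimalType(500))
                .execute();
    }
}
```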
* Make delete expunge partition aware
* updated docs
* pre-review cleanup
* change log
* add partition id to SystemRequestDetails
* Make RequestPartitionId serializable
* Change delete expunge provider to use partition id instead of tenant name
* fix tests
* test pointcut gets called
* assert on pointcut calls
* Add resource type to STORAGE_PARTITION_SELECTED pointcut
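A sketch of an interceptor consuming that pointcut, assuming the newly added resource type is surfaced as a RuntimeResourceDefinition hook parameter:

```java
import ca.uhn.fhir.context.RuntimeResourceDefinition;
import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Interceptor;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;

// Sketch of an interceptor reacting to partition selection; assumes the resource
// type added by the commit above arrives as a RuntimeResourceDefinition parameter.
@Interceptor
public class PartitionSelectedLogger {

    @Hook(Pointcut.STORAGE_PARTITION_SELECTED)
    public void partitionSelected(RequestPartitionId thePartitionId,
                                  RuntimeResourceDefinition theResourceDef) {
        String resourceName = theResourceDef != null ? theResourceDef.getName() : "(unknown)";
        // With the resource type available, hooks can act per resource type,
        // not just per partition.
        System.out.println("Partition " + thePartitionId + " selected for " + resourceName);
    }
}
```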
* bump hapi-fhir version
move expunge provider parameters from JpaConstants to ProviderConstants
* bump hapi-fhir version
* copyrights
* restore deleteexpungeservice for mdm
* fix test
* public constants
* convert instant to date
* Moved expunge constants to ProviderConstants
* final review
* disabling InMemoryResourceMatcherR5Test.testNowNextMinute() to see if I can get a clean test run
* fix tests
* fix tests
* fix tests
* fix tests
* review feedback
* review feedback
* review feedback
* review feedback
* review feedback
* review feedback
* improve logging
* bump version
* version bump
* recovering from failed merge
* unzip RequestListJson per Gary's suggestion. I didn't want to do it at first, but as usual Gary was right.
* fix serialization